Is Artificial Intelligence Dangerous for People with OCD?


OCD and AI-ssurance Seeking

Obsessive-Compulsive Disorder (OCD), often referred to as "the doubting disease," is a condition that affects around 3% of Australians, subjecting individuals to relentless intrusive thoughts that cause severe distress. To alleviate these thoughts, individuals resort to compulsions, which typically involve repetitive actions, avoidance behaviours, and assurance seeking. In this blog article, we delve into the question: "Is Artificial Intelligence Dangerous for People with OCD?" We explore the potential risks and benefits of AI in the lives of those with OCD, its impact on daily living and treatment, and the importance of responsible AI implementation in supporting mental health.

OCD Intrusive Thought Example and Assurance-Seeking Compulsion

Steve is bombarded with intrusive thoughts that he is a terrible human being and feels severe guilt as a result. Steve hates these thoughts and the idea that he is a bad person. So, over time, he has begun asking his friends "Am I a bad person?" and "Have I done anything to upset you?" Now, Steve calls his friends multiple times a day to ask the same questions and frequently searches Google for a definitive answer.

In this example, Steve is using the compulsion of assurance seeking to try to negate his intrusive thoughts, to prove them wrong. The problem is that when the brain receives this assurance, it tends to strengthen the neural pathway linking intrusive thought and compulsion. So the next time Steve has similar intrusive thoughts, he will likely feel even more compelled to seek assurance.

ChatGPT, Google Bard and OCD

AI programs like ChatGPT and Google Bard have exploded in popularity, with ChatGPT boasting 100 million users, a figure reportedly growing by 10% each month. These programs offer people the chance to generate highly intelligent responses in seconds. For those with OCD, this could present an opportunity for unlimited and instant reassurance.

Whilst ChatGPT has a fairly plain interface, which might not compare to assurance from a friend, there are many AI alternatives that can provide other forms of stimulation. For instance, Character.AI allows users to select a fictional, historical, or contemporary figure with whom they want to talk. So, people with OCD could choose to get reassurance from Brad Pitt.ai, Doja Cat.ai, or even Cleopatra.ai. Furthermore, AI technology now allows users to generate videos of these figures speaking to them. All of this may make AI-ssurance seeking more dangerous for someone with OCD, especially considering they are not actually talking to a friend who could encourage them to seek professional help.

Research on AI and OCD

Research into this specific area is non-existent due to the rapid rise of AI technology. However, there are signs from adjacent research that AI may be able to help those with OCD. Research into machine learning (AI applied to data) has already shown that it can detect depression from someone's tweets or YouTube videos with over 80% accuracy. Furthermore, platforms like ChatGPT appear to be working hard to make conscientious choices, such as refusing to engage with hate speech and terrorism-related requests. Whilst the future of AI and how it may relate to OCD is uncertain (something people with OCD struggle with), in the short term it might be wise to use AI with caution. I have high hopes that AI developers will integrate psychological knowledge that could detect, and even help, those with OCD.

Monique Jones
