Using ChatGPT too much can create emotional dependency, study finds

OpenAI continues to unveil new AI models regularly in an effort to enhance ChatGPT for its user base of more than 400 million people. However, the tool's convenience raises concerns about excessive reliance on the technology.

The artificial intelligence company has now turned its attention to exploring the potential psychological impact that ChatGPT may have on its users. OpenAI recently released the findings of a collaborative study with MIT Media Lab, which revealed a link between increased usage of the ChatGPT chatbot and heightened feelings of loneliness among users.

Both organizations conducted separate studies and combined their results to arrive at a unified conclusion. OpenAI's study analyzed over 40 million ChatGPT interactions spanning a month, processed automatically without human review to safeguard user privacy. Meanwhile, MIT observed around 1,000 participants engaging with ChatGPT over a 28-day period. It's important to note that these studies have not yet undergone peer review.

MIT’s study explored various functions of ChatGPT that could influence users’ emotional experiences, such as using text or voice input. The results indicated that both modes had the potential to evoke feelings of loneliness or impact users’ social interactions during the study period. Factors like voice tone and conversation topics were also significant points of comparison.

The study found that a neutral tone in ChatGPT's voice mode was less likely to produce negative emotional outcomes for participants. There was also a correlation between engaging in personal conversations with ChatGPT and an increased sense of loneliness, although these effects were temporary. Even users who held text conversations about general topics showed higher levels of emotional dependence on the chatbot.

The study also observed that individuals who viewed ChatGPT as a friend and those with a predisposition towards strong emotional attachments in relationships were more likely to feel lonelier and emotionally reliant on the chatbot during the study period.

OpenAI’s study provided further insights, highlighting that emotional interactions with ChatGPT were uncommon. Moreover, heavy users who utilized the Advanced Voice Mode feature and considered ChatGPT a friend reported low emotional reactions during interactions with the chatbot.

OpenAI emphasized that these studies aim to understand the potential challenges associated with its technology and establish guidelines for the appropriate use of its models.

While OpenAI suggests that its interaction-based study mirrors real-life behavior, some users have admitted on platforms like Reddit to using ChatGPT as a substitute for therapy when working through their emotions.





