Study Explores ChatGPT’s Potential Role in Reducing Mental Health Stigma

New Delhi — While artificial intelligence cannot replace professional mental health care, chatbots such as ChatGPT may help reduce stigma around mental health, particularly among individuals who are reluctant to seek face-to-face support, according to a new study.

Researchers from Edith Cowan University (ECU) in Australia surveyed 73 individuals who had used ChatGPT for personal mental health support. The study examined how people use the chatbot and how effective they perceive it to be, especially in relation to mental health stigma.

Scott Hannah, a Master of Clinical Psychology student at ECU, said the findings indicate that users who believe the tool is effective are less concerned about being judged by others. “The results suggest that perceived effectiveness plays an important role in reducing fears of external judgment,” he noted.

Stigma remains a significant barrier to seeking mental health support, often worsening symptoms and preventing individuals from accessing timely care. The study focused on two forms of stigma: anticipated stigma, which involves fear of being judged or discriminated against, and self-stigma, where individuals internalise negative stereotypes that undermine confidence and willingness to seek help.

Participants who viewed ChatGPT as helpful were more likely to continue using it and reported lower levels of anticipated stigma, meaning they felt less afraid of being judged for their mental health concerns.

With AI tools becoming increasingly accessible, many people are turning to chatbots for private and anonymous discussions about mental health. “Although these tools were not designed for therapeutic use, AI systems like ChatGPT are increasingly being used for mental health-related purposes,” Hannah said.

However, the researchers cautioned against uncritical reliance on such tools. While AI may feel easier to talk to, anonymous digital platforms raise important ethical concerns. “ChatGPT was not created as a therapeutic tool, and research shows its responses can sometimes be inaccurate or inappropriate. Users should therefore engage with AI-based mental health tools thoughtfully and responsibly,” Hannah added.

The study underscores the need for further research to determine how AI technologies can safely and effectively complement existing mental health services, rather than replace professional care.

With inputs from IANS
