Petaling Jaya: Stigma and cultural beliefs already deter some Malaysians from seeking help for mental health issues.
Compounding the problem is that people in need of treatment are increasingly turning to chatbot platforms for guidance.
Dr Adnan Omar, a psychologist and vice-president of the Malaysian Society of Mind, Spiritual Wellbeing, said the growing trend of using platforms such as ChatGPT for mental health inquiries is a cause for concern.
“The conversational capabilities and round-the-clock availability of these platforms make them go-to sources of information. They also offer convenience and anonymity, and while they may provide some insight, their diagnoses may be inaccurate and dangerous.”
Kelly Tan, 31, an accountant, experienced sleep disturbances for several nights in a row, which affected her work performance.
Frustrated by not getting enough sleep for almost four months, she turned to ChatGPT for a solution.
“It suggested, based on the information I provided, that my symptoms were caused by insomnia, so I bought some insomnia medication,” she said.
However, as time went on, Tan realized that she had become dependent on the medication to fall asleep.
“At first it seemed like a temporary solution, but it turned into an addiction. I felt like I couldn’t sleep without the medication, my normal sleep-wake cycle was disrupted, and I started feeling drowsy and tired during the day.
“I finally consulted a psychologist who told me that I didn’t have insomnia and that my sleep problems were caused by other factors.”
Adnan, who is also a suicidologist, said that while older people may prefer traditional methods such as seeking help from family members or healers, younger people are shaped by social and cultural factors that make them more likely to trust and depend on chatbot platforms.
He said psychologists and psychiatrists are essential when it comes to mental illness because mental health is a multifaceted aspect of human existence, encompassing biological, psychological, spiritual and social dimensions.
“This is because some patients may appear to have a mental health problem when their symptoms are actually the result of another biological or physiological condition, and some people have two or more health conditions at the same time.
“Attempts to use ChatGPT to summarize symptoms, diagnoses, and treatments are dangerous because important nuances can be overlooked, leading to misdiagnosis and inappropriate treatment.”
Adnan said some mental illnesses, such as bipolar disorder, schizophrenia and depression, require medication as part of their treatment.
The dosage of such drugs varies from person to person, something ChatGPT cannot determine.
He said that since mental health concerns an individual’s cognitive and emotional functioning, inaccurate diagnosis and treatment provided by ChatGPT can cause significant psychological harm to individuals.
Adnan said that while chatbot platforms are contributing positively by making mental health knowledge accessible to more people, it is not a good idea to rely on them alone.
“Such platforms lack accuracy, so the advice they provide cannot be trusted. Without proper regulation and oversight, misinformation and inappropriate guidance can spread, potentially worsening patients’ mental illnesses.”
Adnan said that while Malaysians have made commendable progress in understanding and treating mental health, continued efforts are needed to improve education and literacy, strengthen support systems, and address systemic barriers to accessing care.
“By working together to tackle these challenges, we can work towards building a society where mental health is prioritized, supported and accessible to everyone.”