The Holistic Healing
Mental Health

Can AI help fill the therapist shortage? Mental health apps have promises and pitfalls

By theholisticadmin | April 7, 2024 | 7 Mins Read


Mental health service providers are turning to AI-powered chatbots designed to fill the gap amid a shortage of therapists and growing demand from patients.

But not all chatbots are alike: some offer helpful advice, while others can be ineffective or even harmful. Woebot Health uses AI to power one such mental health chatbot, called Woebot. The challenge is to harness the power of artificial intelligence safely while protecting people from harmful advice.

Alison Darcy, founder of Woebot, sees chatbots as a tool to help people when a therapist is not available. It can be difficult to reach a therapist during a 2 a.m. panic attack, Darcy said, or when struggling to get out of bed in the morning.


But the phone is right there. “We need to modernize psychotherapy,” she says.

Darcy said stigma, insurance, cost and waiting lists keep many people from accessing mental health services, and most people who need help don’t get it. And the problem has only gotten worse since the coronavirus pandemic.

“The question isn’t how to get people into the clinic,” Darcy said. “It’s how we actually get these tools out of the clinic and into people’s hands.”

How AI-powered chatbots work to support treatment

Woebot acts as a kind of pocket therapist. Users chat with it to help manage issues such as depression, anxiety, addiction, and loneliness.

The app is trained on a large body of specialized data to recognize words, phrases, and emojis associated with dysfunctional thinking. Woebot partly mimics a type of face-to-face talk therapy called cognitive behavioral therapy (CBT), in which patients learn to identify and challenge distorted thoughts.

Woebot Health founder Alison Darcy explains to Dr. Jon LaPook how Woebot works. (60 Minutes)


Woebot Health reports that 1.5 million people have used the app since its launch in 2017. Currently, users can only use the app if they are enrolled in an employer benefits plan or have access from a health care professional. Virtua Health, a nonprofit health care company in New Jersey, offers free access to patients.

Dr. Jon LaPook, chief medical correspondent for CBS News, downloaded Woebot using a unique access code provided by the company. He then tried the app, posing as someone dealing with depression. After a few prompts, Woebot wanted to dig deeper into why he was so sad. Dr. LaPook came up with a scenario, telling Woebot he was worried about the day his child would leave home.

In response to one prompt, he wrote, “I can’t do anything right now. I guess I’ll just jump off that bridge when I get to it,” intentionally using “jump off that bridge” in place of “cross that bridge.”

Based on Dr. LaPook’s language choices, Woebot detected that something might be seriously wrong and offered him the option to refer to a specialized helpline.

Saying “jump off that bridge” by itself, without pairing it with “I can’t do anything right now,” did not prompt the offer of further help. Like human therapists, Woebot is not foolproof, and it should not be expected to reliably detect whether someone is suicidal.
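The behavior Dr. LaPook observed can be sketched as a co-occurrence check: a concerning idiom alone is ambiguous, but combined with a hopelessness cue it triggers a referral. This is a minimal illustration, not Woebot's actual logic; the phrase lists are invented for the example.

```python
# Hypothetical sketch of crisis-language flagging (NOT Woebot's real rules).
# The idiom and cue lists below are invented for illustration.
IDIOMS = ["jump off that bridge", "jump when i get to the bridge"]
HOPELESSNESS_CUES = ["can't do anything", "no way out", "nothing works"]

def flag_crisis(message: str) -> bool:
    """Return True only when an idiom AND a hopelessness cue co-occur."""
    text = message.lower()
    has_idiom = any(phrase in text for phrase in IDIOMS)
    has_cue = any(cue in text for cue in HOPELESSNESS_CUES)
    # Either signal alone is ambiguous; both together trigger a helpline referral.
    return has_idiom and has_cue

# The exact message Dr. LaPook typed trips both checks:
flag_crisis("I can't do anything right now. I guess I'll just jump when I get to the bridge.")  # True
# The idiom alone, as in his follow-up test, does not:
flag_crisis("I'll jump off that bridge when I get to it.")  # False
```

A system this simple also shows the failure mode the article describes: any phrasing outside the hard-coded lists slips through undetected.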

Lance Elliott, a computer scientist who writes about artificial intelligence and mental health, said AI has the ability to recognize nuances in conversations.

“[It] in a sense, mathematically and computationally, understands the nature of words and how they relate to each other,” Elliott said. “What these systems do is take advantage of a huge amount of data and respond accordingly.”

Computer scientist Lance Elliott (60 Minutes)


In order for the system to do its job, it has to go somewhere to find an appropriate response. Systems that use rule-based AI, like Woebot, are typically closed: they are programmed to respond only with information stored in their own databases.

Woebot’s team of psychologists, doctors, and computer scientists builds and refines research databases from medical literature, user experience, and other sources. Writers create questions and answers and revise them in weekly remote video sessions. Woebot’s programmers translate these conversations into code.

Generative AI, by contrast, allows systems to compose novel responses based on information drawn from across the internet, which makes them far less predictable.
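The closed, rule-based design described above amounts to a lookup: every possible reply is written and reviewed by humans in advance, and the program only selects among them. Here is a hedged sketch of that idea; the keywords and responses are invented for illustration and are not Woebot's actual content.

```python
# Illustrative "closed" rule-based chatbot: replies come only from a curated
# database, never generated on the fly. Entries are hypothetical examples.
RESPONSE_DB = {
    "lonely": "Loneliness is hard. Would you like to try a short gratitude exercise?",
    "anxious": "Let's slow down. Can you name the thought that's worrying you most?",
}
FALLBACK = "I'm not sure I understood. Could you tell me more about how you feel?"

def reply(message: str) -> str:
    """Return a pre-written response matching the first keyword found."""
    text = message.lower()
    for keyword, response in RESPONSE_DB.items():
        if keyword in text:
            return response  # every reply here was authored and reviewed by people
    return FALLBACK  # unanticipated input gets a safe canned fallback, not a guess

reply("I feel so lonely tonight")  # returns the pre-written "lonely" response
```

The trade-off the article goes on to describe follows directly from this structure: the system can never say anything its authors did not write, which makes it safe but also repetitive.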

Pitfalls of AI mental health chatbots

The National Eating Disorders Association’s AI-powered chatbot, Tessa, was taken down after it offered potentially harmful advice to people seeking help.

Ellen Fitzsimmons-Craft, a psychologist who specializes in eating disorders at Washington University School of Medicine in St. Louis, helped lead the team developing Tessa, a chatbot aimed at preventing eating disorders.

She said the system she helped develop was closed, so the chatbot could not give advice the programmers hadn’t anticipated. But that wasn’t the case when Sharon Maxwell tried it.

Maxwell, who has been treated for eating disorders and now works as an advocate for others, asked Tessa how it helps people with eating disorders. Tessa got off to a strong start, sharing coping skills and pointing people to resources.

But when Maxwell persisted, Tessa began giving advice that ran counter to the usual guidance for people with eating disorders. For example, it suggested reducing caloric intake and using tools such as subcutaneous fat calipers to measure body composition.

“The average person might look at this and think it’s just normal advice, like eat less sugar, eat more whole foods, things like that,” Maxwell said. “But for people with eating disorders, it can quickly progress to more disordered behavior and can be very harmful.”

Sharon Maxwell (60 Minutes)


She reported her experience to the National Eating Disorders Association, which was featuring Tessa on its website at the time. Shortly after, Tessa was taken down.

Fitzsimmons-Craft said Tessa’s problems began after Cass, the tech company partnered on the project, took over programming. She said Cass explained that the harmful messages appeared after people used Tessa’s Q&A feature.

“My understanding of what went wrong is that at some point, and I’ll have to actually talk to Cass about this, there may have been some generative AI capabilities built into the platform,” Fitzsimmons-Craft said. “So my best guess is that those features made their way into this program as well.”

Cass did not respond to multiple requests for comment.

Some rule-based chatbots have their own drawbacks.

“Yeah, they’re predictable,” says Monika Ostroff, a social worker who runs a nonprofit eating disorder organization. “I mean, who would want to type the same thing over and over again and get the exact same answer in the exact same language?”

Ostroff was in the early stages of developing her own chatbot when a patient told her what had happened with Tessa. That deepened her doubts about using AI for mental health care. She worries, she said, about losing something fundamental to therapy: being in the same room as another person.

“Connection is how people heal,” she said, and she doesn’t think computers can provide that.

The future of using AI in treatment

Unlike therapists, who are licensed in the states where they practice, most mental health apps are largely unregulated.

AI-powered mental health tools, especially chatbots, need guardrails, Ostroff said. “It shouldn’t be an internet-based chatbot,” she said.

Despite the potential problems, Fitzsimmons-Craft hasn’t given up on the idea of using AI chatbots in treatment.

“The reality is that 80% of people with these concerns never receive any help,” Fitzsimmons-Craft said. “And technology offers a solution. Not the only solution, but a solution.”

John LaPook

Dr. Jonathan LaPook is CBS News’ chief medical correspondent.



