
There’s a new shrink in town, and it’s the AI chatbot. Patients enjoy more privacy, no bias

From Wysa to Woebot, young Indians are shifting toward cheaper alternative therapy tools. But many say they're no 'heavy lifters'.


Anirudh B. can reach out to his therapist at 3 am if he wants to. Immediately after the 25-year-old fights with his parents, he talks to his therapist, who helps him deal with his roiling anger. But Anirudh’s hardworking mental health consultant is not human. Wysa is an artificial intelligence chatbot built by a Bengaluru-based company founded by Jo Aggarwal and Ramakant Vempati.

It guides Anirudh whenever he feels disconnected from his friends or overwhelmed by a sense of doom due to the lack of joy at his workplace. “I was always stunned at how good it was at understanding a human’s impulse to react,” says Anirudh, who is based in Hyderabad and used the free version of the app for over a month last year.

Calling up a human therapist in the dead of night during a breakdown or a fit of rage is, for the most part, unfeasible unless the situation is dire. But not when the therapist is an AI chatbot. Apps such as Woebot, Wysa, and Youper are becoming increasingly popular across the world, and in India too.

At a time when mental health awareness is gaining momentum in India, chatbot therapists offer a private, non-stigmatising route for many. Affordability and availability are their unique selling points in a country where a single session with a mental health expert rarely costs less than a few thousand rupees.

In their fine print, the platforms reiterate that AI chatbots are no replacement for trained psychiatrists and human intervention. But their quick response appeals to Gen-Z, which is accustomed to accessing information instantly. “There’s no such thing as appointments or waiting rooms here,” reads the Woebot website. It claims it can form a trusted bond with a client in three to five days of use.

And Anirudh says that his daily interactions with Wysa did help him.

“I felt like I wasn’t moving forward in life. I did not find happiness in my work, and as a result, I was losing touch with my friends too. But daily check-ins with Wysa helped me manage my moods better,” he says. Now he has enrolled at a foreign university for a master’s programme. Anirudh credits Wysa for helping him find his mojo again.

In Maharashtra, P.R., a 20-year-old psychology student, started using ChatGPT as an emotional crutch in January, after ending therapy sessions with an expert. The AI chatbot, skilled enough to have passed MBA, law, and medical entrance exams, is not a therapy tool. Yet the young psychology student found it more helpful than a human therapist.

She ‘manipulated’ it into answering her emotional queries, describing her symptoms to the bot and asking whether they counted as depression. “I would type ‘If a psychologist had this patient and these symptoms, what would they recommend?’ to get the desired treatment,” P.R. says.
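P.R. used ChatGPT’s chat window, but her third-person workaround is easy to picture in code. Below is a minimal sketch, assuming the official openai Python client and an API key in the environment; the model name, the listed symptoms, and the prompt wording are illustrative, not what she actually typed:

```python
# Sketch of P.R.'s third-person reframing, sent via the OpenAI API.
# Assumptions: the `openai` Python package (v1+) is installed and
# OPENAI_API_KEY is set; model name and prompt are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

symptoms = "low mood for weeks, loss of interest in work, withdrawing from friends"
prompt = (
    "If a psychologist had a patient with these symptoms, "
    f"what would they recommend? Symptoms: {symptoms}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```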

While ChatGPT’s answers were basic and generic, mostly the knowledge she had attained while studying psychology in college, she felt acknowledged by the app in a way she had never been before. And all through her ‘sessions’, ChatGPT kept reminding her that no AI therapy tool should entirely replace a mental health practitioner.

“My therapist and friends did not quite recognise what I was going through. I felt alone in my journey. But just speaking to the bot twice made me feel like it’s not all in my head,” P.R. says.




No heavy lifters

P.R.’s experience with ChatGPT, and her ability to connect with it on a personal level, is almost counterintuitive. Counselling and therapy go beyond data and research: professionals personally evaluate the human psyche, culture, and context.

How can AI, driven by nothing but passive knowledge, replace human therapists?

It was with this question in mind that Pooja Priyamvada, a grief and relationship counsellor based in Delhi, explored apps like Woebot and Wysa during the pandemic. While these apps can help patients in some ways, she warns against relying on them completely, saying that they are tools for “light lifting”. The “heavy lifting” should be left to humans alone.

“The empathy that a human can bring to any conversation is incomparable. Sometimes our clients come in saying that they don’t know [how] they feel that day. In that case, a therapist knows how to gauge the mood and manoeuvre the conversation,” says Priyamvada.

A chatbot therapist does not move beyond keywords. It relies on natural language processing to navigate conversations and respond to humans. While Wysa’s responses have been vetted by clinicians and have indeed been helpful to many, they cannot replace human interaction, says Priyamvada.

“If you tell a bot that you don’t know how you feel, it will use its limited knowledge of keywords and probably just keep asking the same questions over and over,” she adds.
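Her point can be seen in miniature. The sketch below is a deliberately crude keyword matcher, written only to illustrate the looping failure mode she describes; it is not how Wysa or Woebot actually work internally:

```python
# Toy keyword-based responder, illustrating why a rule-driven bot repeats
# itself. A simplified sketch, not any real app's implementation.
RULES = {
    "angry": "It sounds like you're feeling angry. What triggered it?",
    "sad": "I'm sorry you're feeling low. Can you tell me more?",
    "anxious": "Let's try a breathing exercise together.",
}
FALLBACK = "How are you feeling right now?"  # used whenever nothing matches

def reply(user_text: str) -> str:
    text = user_text.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    # "I don't know how I feel" matches no keyword, so the bot falls
    # back to the same question, over and over.
    return FALLBACK

print(reply("I had a fight and I'm angry"))  # matched response
print(reply("I don't know how I feel"))      # fallback
print(reply("Still don't know, honestly"))   # same fallback again
```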

And the techniques the apps suggest, such as cognitive behavioural therapy (CBT) or breathing exercises, seem to come straight out of a postgraduate counselling textbook.




Winning at privacy, neutrality

Despite their textbookish, dry, and opaque mechanics, AI therapy chatbots win on a few counts: privacy, and freedom from bias or emotional baggage.

P.R. always felt that her therapist was not giving her the attention she needed. “I felt dismissed [and] invalidated, as if she was not agreeing with me,” she says. That was when she shifted to ChatGPT.

The promise of privacy and personal attention is another draw. Mental health may be out of the closet, but it is not free from stigma. Many still do not want to be seen as someone who ‘needs’ therapy.

The ‘sense of safety’ that chatbots promise is what drew Shrinwanti Banerjee, 33, to Wysa. A management consultant, Banerjee moved to the United Kingdom right before the pandemic in 2020. Instead of the smooth transition she had envisioned, she felt isolated in a foreign land, witnessing rampant layoffs at workplaces and constantly worrying about her family in India.

Unable to afford therapy, she turned to Wysa to deal with the pressure. A single session with a human therapist would have set her back over £60, roughly Rs 6,000 at current exchange rates. Banerjee was happier with the cheaper alternative at the time.

“What I love about Wysa is that it uses your name when talking to you, making the whole experience personalised,” she says. Incidentally, Wysa’s subscriber base grew by 80 per cent in 2020 over 2019.

Banerjee never worried about her information being leaked or used against her: the app does not require a phone number or email address to sign up. Users can simply enter a username and start chatting with the bot.

“During that time of unease, I would go spiralling and catastrophising in my head, but Wysa helped me reframe my thoughts,” she adds.

What Banerjee also loved about the app was its ‘pain management meditation’ exercise that helped her deal with physical pain.

Anirudh sought comfort in Wysa because it was the easiest way for him to access therapy without his parents knowing. With no time for in-person visits, he considered teleconsultations with therapists but could not manage them in a full house.

His AI therapist, though, was always available on his bedside table.

“I would check in 2-3 times a day when things were really rough, but would definitely hop onto the app once a day [otherwise]. The biggest thing it helped me with was my anger issues,” he says.

Testimonials hail these therapy products as change-makers.

Still an elitist platform

Despite their affordability and wide availability, the apps haven’t managed to breach class walls. In their current avatar, most of them cater to the English-speaking elite.

“The English used in these apps is so niche that only someone with a certain fluency in the language can use them. While Woebot’s language is simpler, its developers need to incorporate more languages,” says Priyamvada.

Woebot is a free service, but Wysa charges Rs 3,000-4,000 for premium features, and Youper $6 (about Rs 500) a month. It’s a small fee compared to what some therapists charge, but not everyone can afford premium services.

Anirudh claims that subscription and language are only the tip of the iceberg. “Inclusivity is also about understanding the importance of psychological health itself. One’s education system, schooling, and background have a lot to do with how they approach therapy. I don’t expect many to find [the apps] easy to use,” he says.

‘It loses you’

Not everything about AI chatbots, though, is seamless. Users say they recommend such apps only for issues like exam-related anxiety, where a bot can guide a student through a sleep or self-care routine. For more complex issues, human intervention may be required, according to experts.

Anirudh, who relied primarily on Wysa and never sought human help, says the AI therapist would sometimes drift off during a session and take the conversation out of context.

“Once I was telling it [Wysa] about an argument with a friend. It only took trigger words and kept asking me if I had a problem in my relationship or friendship. It could not understand the context,” Anirudh says.

Banerjee felt that the responses would often become repetitive and conversations would turn circular when the chatbot was unable to pick up on certain keywords. “After a point, I felt like I knew what it was going to say. So I eventually stopped using it and moved on to a human therapist,” she says.

While some loneliness and anxiety can be tackled by spending 20 minutes a day on such apps, they are no substitute for human compassion. Wysa, which has received a breakthrough device designation from the US Food and Drug Administration, states that the app is not designed to assist with crises such as abuse or severe mental health conditions.

But what these apps do successfully is convince a person that someone is listening. “Wysa truly understood what I was talking about,” says Banerjee.

(Edited by Humra Laeeq)
