
Photo By BSIP/UIG Via Getty Images, logo illustration by Mateusz Slodkowski/SOPA Images/LightRocket via Getty Images
Is your therapist using ChatGPT to diagnose you?
Therapists have historically seen patients in intimate, in-person settings. Since the COVID shutdowns, however, remote, impersonal meetings have become more frequent and normalized, on top of what was already an increasingly digital world.
The mental health sector has been profoundly affected by these changes, spawning online therapy outlets like Talkspace and BetterHelp. Conceivably, a patient can conduct an online video call with a licensed therapist, who can diagnose the patient or talk through issues without the two ever being in the same room.
As it turns out, some of those therapists could be cheating.
'Here's a more human, heartfelt version with a gentle, conversational tone.'
A recent report by MIT Technology Review featured some eye-opening testimonies of online-therapy consumers who have caught their practitioners cutting corners in terms of their mental health care.
One patient, named Declan, was having connection trouble with his therapist during an online session, so the two decided to turn off their video feeds. In the process, the therapist accidentally started sharing his screen, revealing that he was using ChatGPT to generate his advice.
"He was taking what I was saying and putting it into ChatGPT and then summarizing or cherry-picking answers," Declan told the outlet. "I became the best patient ever," he continued, "because ChatGPT would be like, 'Well, do you consider that your way of thinking might be a little too black and white?' And I would be like, 'Huh, you know, I think my way of thinking might be too black and white,' and [my therapist would] be like, 'Exactly.' I'm sure it was his dream session."
While Declan's experience was right in his face, others noticed subtle signs that their therapists were not being completely honest with them.
MIT Technology Review writer Laurie Clark admitted in her own article that an email from her therapist set off alarm bells when she noticed it was strangely polished, validating, and lengthy.
A different font, point-by-point responses, and the use of an em dash (despite her being in the U.K.) made Clark suspect her therapist was using ChatGPT. When Clark raised her concerns, the therapist admitted to using it to draft her responses.
"My positive feelings quickly drained away, to be replaced by disappointment and mistrust," Clark wrote.
Similarly, a 25-year-old woman received a "consoling and thoughtful" direct message from a therapist over the death of her dog. This message would have been helpful to the young woman had she not seen the AI prompt at the top of the page, which was accidentally left intact by the therapist.
"Here's a more human, heartfelt version with a gentle, conversational tone," the prompt read.
More and more people are skipping the middleman and heading straight to the chatbots themselves, a practice some doctors have advocated against.
For example, the president of the Australian Psychological Society warned against using AI for therapy in an interview with ABC (Australia).
"No algorithm, no matter how intelligent or innovative we think they might be, can actually replace that sacred space that gets trudged between two people," Sara Quinn said. "Current general AI models are good at mimicking how humans communicate and reason, but it's just that — it's imitation."
The American Psychological Association has called using chatbots for therapy "a dangerous trend," while a Stanford University study found that AI therapy bots not only "lack effectiveness compared to human therapists" but can also perpetuate "harmful stigma."
Blaze News asked ChatGPT if AI chatbots, like ChatGPT, are better or worse than real-life therapists. It answered:
"AI chatbots can offer support and guidance, but they are not a substitute for real-life therapists who provide personalized, professional mental health care."
Andrew Chapados