Why You Really Shouldn’t Use ChatGPT as a Therapist

Salesforce's CEO says that he uses ChatGPT as his therapist, but that might not be the best idea in the long run...

At Dreamforce 2024, Salesforce CEO Marc Benioff admitted to using ChatGPT as a therapist during a fireside chat with NVIDIA CEO Jensen Huang.

He’s not the only one. Using ChatGPT and other popular AI chatbots as a sounding board for personal problems has become surprisingly common, not least because it costs a lot less than traditional therapy.

Still, there are some notable reasons why using ChatGPT as a therapist is not a good idea, even if it is saving you a bundle on medical bills.

ChatGPT Is Consistently Inaccurate

It’s no secret that AI chatbots like ChatGPT are less than 100% accurate. In fact, we’ve been collecting a list of all the errors and mistakes that AI has made over the last few years, and to say it’s extensive would be a grave understatement.

To be fair, a lot of the misinformation and mistakes from AI chatbots are fairly harmless. Silly portraits showing people with six fingers and laughably incorrect statistics pulled out of thin air aren’t going to dramatically impact the trajectory of someone’s life.

Therapy, on the other hand, has a significant effect on the person taking part, and given ChatGPT’s track record of medical inaccuracies, there’s an obvious risk involved. All that to say, maybe don’t put your mental health in the hands of a platform that can’t tell the difference between the Mona Lisa and Shrek.

ChatGPT Doesn’t Have Any Qualifications

Human therapists are obviously a lot more regulated than AI chatbots, which means they’re held to a higher standard of care for their patients.

ChatGPT doesn’t have a PhD, it didn’t go to medical school, and perhaps most importantly, it’s not yet subject to legal responsibility for bad medical advice.

 


“AI is not a substitute for good human judgment, and, for now, there are few options for defending against malpractice claims or holding AI providers accountable for bad AI-generated medical advice.” – Matthew Chung, Managing Editor of the Harvard Journal of Law and Technology, in a post

Not only could you be getting advice from a “therapist” that isn’t equipped to handle your problem, but you’ll also have no legal recourse should something go wrong.

No Doctor-Patient Confidentiality with ChatGPT

Getting your diagnosis wrong is one thing, but what about your privacy? It’s safe to assume that, as a person seeking help from a therapist, you don’t want your personal information made available to the world.

With ChatGPT, that could very well happen. For starters, the platform isn’t HIPAA compliant. It has also been breached in the past, and hackers are getting more and more advanced thanks to the very technology you want to use for therapy.

Given the decidedly sensitive nature of therapy and the relative lack of security in AI chatbots, the privacy risk alone should be reason enough to invest in a real therapist instead of ChatGPT.


ChatGPT Lacks Any Sense of Empathy

We’ll be the first ones to admit that AI has gotten really good over the last few years. It’s gotten so good that many users forget they’re even talking to AI chatbots in the first place.

While the line between AI and humans is more blurred than ever before, experts are quick to point out that AI is nowhere near advanced enough to handle all the nuances of the human experience, particularly when it comes to therapy.

“AI does a really good job in gathering a lot of knowledge across a continuum. At this time, it doesn’t have the capacity to know you specifically as a unique individual and what your specific, unique needs are.” – Olivia Uwamahoro Williams, PhD, co-chair of the American Counseling Association Artificial Intelligence Interest Network, to Health.com

Therapy is more than just gleaning information about mental health and applying it to your own life. It’s designed to help you build a trusting relationship with a human being who is qualified to provide valuable insight into your ongoing situation. ChatGPT, and all AI chatbots for that matter, don’t have that capability, and it’s unclear if they ever will.


Written by: Conor Cawley
Conor is the Lead Writer for Tech.co. For the last six years, he’s covered everything from tech news and product reviews to digital marketing trends and business tech innovations. He's written guest posts for the likes of Forbes, Chase, WeWork, and many others, covering tech trends, business resources, and everything in between. He's also participated in events for SXSW, Tech in Motion, and General Assembly, to name a few. He also cannot pronounce the word "colloquially" correctly. You can email Conor at conor@tech.co.