Can AI Really Help With Mental Health? A Look at the Claims and Concerns

Exploring the role of AI in mental health support and why the results aren’t as clear-cut as they seem

Lately, there’s been a lot of talk about AI mental health support, especially after some studies showed that models like GPT-4 scored impressively on psychology exams. On the surface, this sounds promising: the idea that AI could lend a hand with basic mental health issues like stress or anxiety is pretty appealing. But when you dig a little deeper, things get murkier.

The first thing to understand is how these studies measure AI’s ability to help with mental health. One study, for example, looked at ChatGPT Plus’s performance on a set of psychology and reasoning tests. The AI scored between 83% and 91% across those tests, and the researchers were optimistic, suggesting it could handle simple mental health support. But that’s where the problems start.

Testing AI Mental Health Support: Is It Reliable?

The way the AI was tested might not truly reflect its capabilities. Instead of running the tests in a controlled API environment, the researchers used ChatGPT Plus the way any regular user would. The chat interface gives you no control over settings like temperature or the system prompt, so the AI’s responses likely varied depending on how each question was phrased. If you’ve used ChatGPT, you already know that rewording a question can change the answer quite a bit.
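To make that concern concrete, here is a rough sketch of the kind of consistency check a controlled API setup makes possible. It assumes the OpenAI Python client (openai >= 1.0) and an API key in the environment; the model name, the sample question, and its paraphrases are all illustrative, not taken from the study.

```python
# A rough sketch of a consistency check: send several paraphrases of the same
# multiple-choice question with fixed decoding settings (temperature=0) and see
# whether the model's chosen answer stays stable. Assumes the OpenAI Python
# client (openai>=1.0); the question and paraphrases below are made up for
# demonstration, not drawn from the study's actual test items.
import re
from collections import Counter

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

QUESTION_VARIANTS = [
    "Which coping strategy is generally considered problem-focused? "
    "A) Avoidance  B) Making a plan of action  C) Venting  D) Denial. Answer with one letter.",
    "Answer with a single letter (A-D): which of these is a problem-focused coping strategy? "
    "A) Avoidance  B) Making a plan of action  C) Venting  D) Denial.",
    "From the options A) Avoidance, B) Making a plan of action, C) Venting, D) Denial, "
    "pick the problem-focused coping strategy. Reply with just the letter.",
]


def ask(prompt: str, model: str = "gpt-4") -> str:
    """Send one prompt with fixed settings and return the reply text."""
    response = client.chat.completions.create(
        model=model,
        temperature=0,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content or ""


def extract_choice(reply: str) -> str:
    """Pull the first standalone A-D letter out of the reply, or '?' if none is found."""
    match = re.search(r"\b([A-D])\b", reply.upper())
    return match.group(1) if match else "?"


if __name__ == "__main__":
    answers = [extract_choice(ask(variant)) for variant in QUESTION_VARIANTS]
    counts = Counter(answers)
    print("Answers per paraphrase:", answers)
    print("Consistent across paraphrases:", len(counts) == 1)
```

Running the same question through several paraphrases with fixed settings at least makes the variability measurable, which the consumer chat interface doesn’t let you do.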

This inconsistency is a big red flag when it comes to something as sensitive as mental health. People seeking mental health support need reliable, consistent help, not answers that shift with slight wording changes.

Strange Results in AI Reasoning and Math Skills

Some results were downright puzzling. For instance, ChatGPT aced the logic tests with a perfect 100% score, but the researchers themselves admitted this might be due to the AI spotting patterns in the test answers rather than genuine logical reasoning.

Even odder, ChatGPT performed well on algebra problems (about 84%) but poorly on geometry questions from the same exam (only 35%). Normally, someone who is good at one branch of math tends to be at least decent at the others. This gap suggests the AI might not understand the underlying math concepts and may instead be leaning on pattern matching to arrive at its answers.

Can AI Match Real Therapy?

Even if we give AI the benefit of the doubt on test scores, these tests miss a huge part of what mental health support really involves. Therapy isn’t just about giving logical answers or solving problems — it’s about understanding emotions, reading between the lines, and adapting to each individual’s unique personality and needs.

AI can’t pick up subtle emotional cues or build a trusting relationship like a human therapist can. As a result, relying on AI for anything beyond very basic support feels risky.

What Does This Mean for AI Mental Health Support?

While AI mental health tools might offer some help with simple issues, these studies show there are still big questions about reliability and depth. It’s definitely an area worth watching as technology improves, but for now, it’s best to approach AI mental health claims with caution.

If you’re curious about the study I mentioned, you can check it out here: Study on AI and mental health.

For more on how AI works and its limits, you might find this article from MIT Technology Review helpful. And if you’re interested in how mental health therapy actually works, consider resources from Psychology Today.

In the end, AI is a helpful tool, but when it comes to our mental health, nothing quite replaces the human touch.