WHEN is the right moment to cure mental illness with Artificial Intelligence?

“Overall, AI has the promise to provide critical resources we need to overcome our mental health crisis.” OK, but WHEN?

[Image: /img/neoliberalism-creates-loneliness.jpg]

The quote above is the conclusion of a post describing “The Incredible Ways Artificial Intelligence Is Now Used In Mental Health”. The post begins with a compact summary of the “mental health crisis”:

"…In the United States, one in five adults suffers from some form of mental illness. Every 40 seconds one person dies from suicide and for every adult who dies from suicide, there are more than 20 others who have attempted to end their life… mental health also has a tremendous economic impact for the cost of treatment as well as the loss of productivity."

Then, after mentioning AI algorithms that might predict depression, it describes apps or services already using artificial intelligence to “help screen, diagnose and treat mental illness”:

  • spot and refer patients with possible mental conditions to some therapy program
  • chat that provides direct counseling services to employees
  • an app that allows patients being treated for depression, bipolar disorder, and other conditions to create an audio log where they can talk about how they are feeling
  • a parental control phone tracker app, looking for signs of cyberbullying, depression, suicidal thoughts and sexting on a child’s phone.

These products are presented (emphasis mine) as “just a few of the innovative solutions that support mental health” and could make AI “a powerful tool to help us solve the mental health crisis”. Namely, AI can provide always-on, much cheaper tools that could:

  • suggest possible treatments to mental health professionals
  • reach patients who may feel more comfortable talking to a bot than with a human therapist

Guess which word is missing?

The earlier a (correct) diagnosis, the better. And there is no question that Artificial Intelligence can and should “remake” some parts of healthcare, including the management of mental illnesses. Good examples are here and here (Italian).

But here is the real issue I have with that post. As good and needed as they almost surely are, all the products and research mentioned there are after-the-fact patches. Some of them aim just to find more customers, earlier, for more drugs or more counseling. It feels like studies on overeating that praise the wonders of heartburn pills without mentioning that people should eat less in the first place.

OK, that post obviously has another focus. Still, its main limitation is that it does not even mention prevention. It is only about AI used to fix health, not about what damages health in the first place. It would be great if Artificial Intelligence for mental health focused, above all, on what could make fewer people so anxious or depressed that they end up considering suicide.

“Mental health has a tremendous economic impact for the loss of productivity”? How much sense does such a concern make, at a moment when the US holds two surely unrelated primacies, one of which is just being the “most competitive country in the world”?

Maybe Artificial Intelligence could fight the mental health crisis much better if it focused on the obsession with productivity, and the rules that create it. Here are some pointers (*):

(*) which are also the sources of the screenshots in this post.