Is AI Going to Make My Therapist Obsolete?

Why would I pay for therapy when I can just tell Siri about my bad day?

We now have mental health apps for our phones and tablets, as well as assistants like Siri, Alexa, and Google Assistant that can respond to our emotional statements and questions, and even help us regulate our emotions.

A recent article in Aeon by Polina Aronson and Judith Duportail examines some of the serious implications of this technology. It opens with an anecdote highlighting the fact that Artificial Intelligence (AI) is not objective, but reflective of the values of its programmers.

Someone says, “I feel sad.” Google Assistant responds, “I wish I had arms so I could give you a hug.” Awww… AI is so sweet and loving.

But wait: the same comment to the Russian AI bot “Alisa” elicits a different response: “No one said life was about having fun.” Hmmmm.

Nonetheless, people are increasingly turning to AI assistants for emotional support. It’s easy, accessible, and cheap.

So why should we be concerned that it is so easy to interface with Artificial Emotional Intelligence? Two reasons:

1. Someone Else’s Values

Most of us would like to feel like we are sharing our deepest emotions with “someone” we can trust.

The difference between Google Assistant and Alisa highlights the fact that emotional intelligence is always a social construct. We implicitly rely on the programmers to build their AI platforms on values that we can only hope are similar to our own.

Aronson and Duportail explain:

"AI technologies do not just pick out the boundaries of different emotional regimes; they also push the people that engage with them to prioritise certain values over others. ‘Algorithms are opinions embedded in code,’ writes the data scientist Cathy O’Neil in Weapons of Math Destruction (2016). Everywhere in the world, tech elites – mostly white, mostly middle-class, and mostly male – are deciding which human feelings and forms of behaviour the algorithms should learn to replicate and promote."

We don’t have to look very far to see the Big Brother possibilities for Emotional AI:

"Might our language of feelings become more standardised and less personal after years of discussing our private affairs with Siri? After all, the more predictable our behaviour, the more easily it is monetised."

On the other hand, when you visit a therapist, you have all kinds of clues as to whether or not you share values. You can ask your therapist direct questions about how they work: what theories inform how they “do therapy,” and what techniques they will use in their work with you.

You won’t be able to determine all the nuances of your therapist’s value system, but you will at least be able to learn their answer to the question, “Is it okay for a husband to beat his wife?” Last October, users of Alisa got this answer to that question: “Of course… (and you should) be patient, love him, feed him, and never let him go.”*

Needless to say, it could be dangerous to put your most fragile emotions in the hands of an anonymous programmer.

2. Loss of Critical Human Connection

Studies show that people are more likely to disclose personal information to a bot than to a human.

This is not a huge surprise! We know that one of the hardest things to do is to share our scariest, most hidden self with another human being. Many of the people we think of as tough or strong are also emotionally distant. To share our emotions is to be vulnerable in the deepest sense of the word, and many people are afraid to let down their guard.

What I know from both my professional life as a therapist, and my personal life as a human, is that sharing your most secret feelings is the only way to connect to another human being on the deepest level. It is also the most self-affirming, satisfying, and meaningful experience we can have. Once you are connected to someone through shared emotions, your scary secrets lose much of their power to drive your feelings and behavior.

Sharing your feelings with a bot may be easier, and may even make you feel better about yourself than you did before (if the bot is kind enough to want to give you a hug), but in the end, you will miss that quintessential human-to-human connection that gives our lives meaning.

Dan Siegel, MD, Co-Director of the Mindful Awareness Research Center at UCLA and Executive Director of The Mindsight Institute, explains the science behind our brains’ need to connect with another person through attunement of our inner states in order to feel secure and develop properly as humans.

"When we attune with others we allow our own internal state to shift, to come to resonate with the inner world of another. This resonance is at the heart of the important sense of “feeling felt” that emerges in close relationships. Children need attunement to feel secure and to develop well, and throughout our lives we need attunement to feel close and connected."

Artificial Intelligence is not going to go away. But we should be wary of believing that it can replace human connection. Sharing our deepest secrets with a bot carries not only the obvious risks to privacy and security; it can also jeopardize the vital human connection that gives our lives a very special meaning.

Using AI to streamline tasks or assist with planning and executing dreaded chores makes sense. Using AI to track your mood or guide you through a mindful meditation can certainly be valuable. Nevertheless, using AI to replace human connection is a very slippery slope, and we should always be aware of the risks inherent in that use, lest we be seduced by slick advertising that, at best, promotes someone else’s value set and, at worst, skews our thinking toward predictable and marketable behavior.

If you would like to have a chat about what old-fashioned (as in old-fashioned evidence-based) compassionate human-to-human therapy can do for you, please give me a call at 323-999-1537, or shoot me an email at amy@thrivetherapyla.com, and we will set up a free 20-minute consultation either on the phone or in my office, whichever is more convenient for you.

 

* Alisa’s response to “Is it okay for a husband to beat his wife?” has been changed in response to public outcry. The response is now, “He can, although he shouldn’t.”
