Can AI be your therapist?
By Lauren Bosch UA’24 and Jim Stellar
This is our second blog post. The first was about how a single experience (e.g., taste aversion learning or PTSD) can change a person. While those examples are negative, we can think of positive examples too, including the development of a mentoring relationship between a professor and a student.
Having recently shared a reconnection lunch, we both felt that moment of connection that drives further change, and we are determined to use it to advance LB’s career even after her graduation. Spoiler alert: this feeling of connection happens to mentors too, not just to mentees. We think, not surprisingly for this blog series, that it depends on a continuous two-way exchange between our shared cognitive experience (plans) and the gut-level communication that comes from emotion, which is not always so planned.
Now we want to take that thinking to a perplexing moment in our time, when Artificial Intelligence (AI) seems to be creeping into all walks of life, including the kind of therapy that LB ultimately wants to practice after further education.
The AI therapist: Clearly, any patient seeking a therapist is looking for a connection. There has to be some positive emotion, and a resulting trust, so that the patient can share with the therapist. But does this connection require another human, or just something willing to listen? With the rise of AI, many people are turning to their computers for emotional support. Whether that means seeking out an AI designed to mimic the therapeutic relationship or telling ChatGPT about the coworker you don’t like, these relationships with AI are becoming our new normal.
There are a few questions. Is this normal? Is it healthy? Are these AI bots coded with the latest edition of the ACA’s ethical standards handbook? The answers are more complex than a simple yes or no. As someone currently in school to become a mental health counselor, I (LB) believe there is a disconnect between the ideal and the reality of the situation. I would argue that AI therapy is not very beneficial. The therapeutic relationship is sacred: two people who are equals for the hour they spend together. AI is there for you whenever you want, which is not good practice for healthy boundaries.
Worse yet, with continued use, AI often acts as a tool for confirmation bias; essentially, it mirrors your beliefs back to you in order to please you. That does not allow for growth or accountability. Ideally, then, I would not recommend using AI as your therapist. But this is reality, and in a world where many lack access to basic healthcare, let alone mental healthcare, it may be the only option for some individuals. I will also admit there is a big difference between using AI as an outlet for your frustrations and using AI to heal from a traumatic experience. The context of both our current world and an individual’s personal situation will ultimately determine whether they should be engaging with AI in this way.
What the AI therapist lacks: What we think people have that AI currently lacks is a gut sense, which we would argue comes from the limbic system. Its processing adds a value component to the cognitive plans that come out of our crowning evolutionary achievement, the neocortex, and its information-processing functions. A past blog argued that the interacting node structure of the artificial neural network underlying AI bears a similarity to the natural network of interacting cortical columns (nodes) in the neocortex. The neocortex allows for planning in advance of action and the extrapolation of current trends into the future; it is therefore the essence of the cognitive operations that allow our species to succeed so well. But does it have a gut sense of value? Again, JS’s group would argue, and did in another past blog, that the limbic system, with its value judgments on actions (i.e., was that good or bad for me), gets at least partially integrated with the cortex’s abstract planning.
We do note that emotionally driven surprises (good and bad) can appear, and they can alter or confirm the outcomes expected by the neocortex’s plans. That is what we think happens when a student has an experience like an internship and then better sees what lies ahead on that career path if they continue with their college major. The therapist, as mentioned, has a human bond with the patient. That gut-level input stops some stupid decisions and supports others. Antonio Damasio famously wrote the book Descartes’ Error to make the point that it is not “I think, therefore I am” but more like “I feel, and therefore I can think…”
We need college students who think with their feelings and their cognitive intellect. We need therapists who connect with patients the same way. Until AI demonstrates that it can do that, we recommend that it be used cautiously and probably not in a psychologically therapeutic role.