Thirty years ago in my first year of writing this column, I described my encounter with Frank, a self-styled computerized therapist who lived on a floppy disk that I bought in the notions aisle of an all-night supermarket. That was around the same time I bought my first computer, and the salesman told me that the 32-megabyte model I had chosen had so much storage, I would never need an upgrade.
Computer science has come a long way since then, and today the artificial intelligence bot ChatGPT is being touted as the premier digital partner in almost any kind of conversation imaginable. Surely a program that can write term papers, love letters, and odes to the charms of lemon poppy seed cake would have something useful to say about mental health and some of the foundational concepts of clinical psychology. I wanted to see for myself, and so I set out to compare my conversations with Frank and ChatGPT on the subject of empathy.
When I first met Frank in that all-night supermarket, I had been thinking about an interview I once conducted with an aspiring psychologist who boasted that he had more empathy than 93 percent of practicing therapists. He knew this because he had taken an empathy survey and scored in the 93rd percentile. I thought it odd that he should include this nugget on his resume, but it got me wondering about whether or not my own empathy level was up to snuff.
When I got home, I inserted the disk, and a blinking cursor on the screen turned into a line of text in which Frank introduced himself and asked my name and the nature of my problem. When I finished typing, Frank responded by asking the “real reason” for my visit. I said that perhaps he could help me to be more empathic so I could do a better job helping people. “What would it mean to you if you got to help people better?” Frank asked. Our conversation continued in this way over several screens with Frank asking me the “real reasons” for whatever I said, what it would mean to me if I achieved my goals, whether or not I really wanted what I was asking for, why I thought he could help, and what it was I really wanted from him.
“The truth, Frank,” I almost screamed on the last screen of this psychobabble. “I want you to be frank, and I have to be frank with you.” The allusion went right over his head or its digital equivalent, and Frank calmly asked if it was “appropriate to be frank.”
I finally realized that Frank was trying to copy what we therapists do and told him so, but he persisted in asking me about my problem. It was solved, I said, “because I know that you computers have a long way to go before you can replace us.”
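For readers curious what was likely going on under Frank's hood, his trick of reflecting statements back as probing questions can be sketched in a few lines of Python. This is purely illustrative: the word list and question templates are my own guesses at the kind of rules such a program might use, not Frank's actual code, which is long lost to that floppy disk.

```python
import random

# First-person words swapped for second-person ones, so "I want my..."
# comes back as "you want your..." (an illustrative, not exhaustive, list).
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(statement: str) -> str:
    """Rewrite a statement from the user's point of view to the bot's."""
    words = statement.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement: str) -> str:
    """Turn whatever the user says into a probing follow-up question."""
    templates = [
        "What is the real reason {0}?",
        "What would it mean to you if {0}?",
        "Do you really want {0}?",
        "Why do you think I can help with the fact that {0}?",
    ]
    return random.choice(templates).format(reflect(statement))
```

Type "I want to be more empathic" and a program like this might ask, "What would it mean to you if you want to be more empathic?" No understanding required, just string substitution, which is exactly why Frank never got the joke about being frank.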
That was 30 years ago, and now we have ChatGPT, developed by OpenAI and, to use its own term, “trained on” all the information available on the internet.
Andrew Marantz, writing in a recent issue of The New Yorker, quoted Sam Altman, the CEO of OpenAI, as saying about ChatGPT, “If we’re successful, I think it will be the most significant technological transformation in human history. I think it will eclipse the agricultural revolution, the industrial revolution, the Internet revolution all put together.”
If it can do all that, let’s see what it can do with empathy and the question I asked Frank so many years ago. When it comes to empathy, do I have what it takes? After the simple procedure of opening a free account with OpenAI, I started with the direct approach and asked the bot if it might do a little psychotherapy with me. To its credit, it demurred, responding, “As a language model developed by OpenAI, I am not capable of providing professional psychotherapy or mental health services. While I can provide general information and respond to questions on various topics, it is not appropriate to use me as a substitute for professional advice or treatment.” It then directed me to a “licensed mental health professional” for the “support and care” that I need.
So much for the direct approach, but there’s always a back-door play, and this one involved asking some objective questions. But first, I felt compelled to ask whether or not the bot knew my previous therapist, Frank.
“As a language model, I do not have the ability to personally know individuals. I do not have any personal experiences or memories, and I do not have the ability to interact with people outside of this chat interface.”
“Got it, so what can you tell me about empathy?” Wanting more than a definition, I got more specific and in return got lots of specific and useful answers to my questions, including four ways to assess my level of empathy, three good online empathy tests, four reasons why it’s useful to know my level of empathy, and even some empathic reassurance when I confessed my fear of learning that I might not be as empathic as I think I am.
“It’s natural to feel some apprehension when self-reflection reveals areas in which we might not be as strong as we’d like to be. However, recognizing that you may not be as empathetic as you think you are is a valuable first step towards becoming a more empathetic person.”
“Gee thanks, but I don’t even know your name.” The bot belatedly introduced itself as ChatGPT but, in response to my request, said I could call it Sigmund if that made me “feel more comfortable.” Of course it did, but when I logged on the next day, I had an urge to call it Carl and, not meeting with any objection, that’s what I did. Having broken the ice and shed my inhibitions during our first meeting, I became more playful, asking Carl to write a poem about empathy in the style of Shakespeare. He responded with five four-line stanzas beginning with this one:
“Empathy, a gift of grace, /A feeling felt, in every place./It spreads its wings, with gentle sway,/And takes us on a journey, far away.” Not bad, maybe not quite Shakespeare, but not bad at all. Carl also waxed poetic in the styles of Billy Collins and E.E. Cummings though they all sounded the same to me.
He did better with a bumper sticker message, “Empathize to Elevate: Connect with Compassion.” His t-shirt slogan was spot on, “Wear Your Heart on Your Sleeve: Spread Empathy Everywhere,” but his first attempt at a Latin inscription gave the wrong impression, “Alterius miseria solacium est” – “The misery of others is our comfort.” I called him on that one, and he seemed to get my point, offering an apology and a better alternative, “Suscipiamus aegritudinem alterius” – “Let us bear the sorrows of others.”
All in all, I was impressed by how far we had come since I first met Frank in that all-night supermarket 30 years ago. Before signing off with Carl, I told him that I was a columnist for New England Psychologist. He said he knew the publication and described it accurately. Then I asked him to tell me in 50 words or less what he would like our readers to know about him. I’ll let him have the last word here and encourage you to start your own conversation if you want to get to know him better and form your own opinion.
“I am an AI language model designed to provide helpful responses to a wide range of questions on various topics, including psychology and mental health. I can assist with information, guidance, and support on a variety of issues related to mental health and well-being.”