I’m a late adopter of technology. I had a flip phone for an embarrassingly long time, resisted Facebook, and kept a paper calendar well into the digital age. Despite this, curiosity about ChatGPT motivated me to watch a quick YouTube video and, within minutes, create a free online account.
With breathtaking speed, ChatGPT revised my awkwardly worded email and provided me with quotes about psychotherapy, such as this one from Carl Rogers: “The curious paradox is that when I accept myself just as I am, then I can change.”
Naturally, I had to test how it would respond to statements such as “I feel stressed.” The default response followed an algorithm: an empathic statement, followed by disclaimers (e.g., “I’m here to offer support but am not a mental health professional”) and a half dozen tips on how to reduce stress, written magazine-style. It felt a great deal like getting an instant response from an advice columnist such as “Dear Abby.”
Next, I listed some symptoms of depression. The algorithmic response was a disclaimer, along with an assessment that I had listed “common symptoms of depression,” followed by guidance to “reach out to a therapist, counselor, or psychiatrist” and eight tips for self-care, wrapped up with empathic statements and encouragement to contact a mental health professional.
None of this gave rise to concern on my part that people will soon prefer psychotherapy delivered by artificial intelligence (AI) over therapy by competent human professionals. If anything, ChatGPT seemed more likely to help the curious segue into seeking out a human therapist.
Humans cannot compete with AI in the speed with which it processes data. If I were in a race with ChatGPT to identify the meaning behind Carl Rogers’ curious paradox, it would provide the answer in less time than it took me to understand the question.
However, people who seek therapy want human connection, a fundamental need in Maslow’s hierarchy. While it is not hard to imagine a world in which our present-day Alexa evolves into an assistant who serves as a benevolent source of knowledge, guidance, and support, I don’t see AI replacing human psychologists.
AI could enhance psychological assessment and serve as a catalyst for advances in our understanding of psychopathology. These developments would improve the reliability of psychiatric diagnosis, whose current state in clinical practice leaves a lot to be desired. Personalized psychological assessment, administered via the patient’s electronic medical record, would inform practice.
Other areas of medicine, cardiology for example, utilize a wide variety of data to diagnose and treat disease: behavioral reports of symptoms, electrocardiogram testing, biomarker testing, and imaging.
By contrast, psychologists in independent practice work with far more limited data in assessing psychopathology. We diagnose using a taxonomy that is itself primitive. “The Diagnostic and Statistical Manual of Mental Disorders” has come a long way, but as our field becomes capable of integrating brain imaging, biomarkers, genetics, and more sophisticated methods of psychological assessment, we should make significant progress in overcoming the limitations of the DSM.
Diagnostic precision is important but insufficient for treating people with psychological problems. Human clinicians have an advantage over AI in helping a client accept a potentially stigmatizing diagnosis.
In my own practice as a clinical health psychologist, I don’t see how AI could credibly help clients wrestling with existential questions triggered by life-threatening illness. Unlike AI, psychologists and their clients are, as Harry Stack Sullivan noted, “much more simply human than otherwise.”
AI will likely disrupt the way that we assess, understand, and diagnose psychological illnesses. It could help to match patients to appropriate treatment models and provide educational content and training. But while AI can augment and improve the delivery of psychological treatments, human psychologists will remain irreplaceable.
Footnote: Of course, I had to know how much better this column would have been had it been edited with ChatGPT. Indeed, a quick check revealed that the prose would have been more concise and the flow improved. Still, I hold fast to the hope that we will continue to prefer flawed but authentic writing over faux, computer-generated prose.