Empathy!
Never, we said. Nope. AI is well and good, but doctors will always be necessary, we said. Not particularly for their diagnostic acumen—surely given the right inputs, machines can come up with even more accurate diagnoses. But never would AI be able to substitute for the human touch, we said. Healthcare personnel would always be needed, we said. For empathy.
“We”—whoever that is—may be wrong.
Someone actually did a study on this: researchers from the Universities of Nottingham and Leicester, in the UK. And published their results in British Medical Bulletin, earlier this year, in an article titled “AI Chatbots Versus Human Healthcare Professionals: A Systematic Review and Meta-Analysis of Empathy in Patient Care.”
Bottom line: Yup, “we” were wrong. ChatGPT and similar AI chatbots scored roughly 2 points higher than doctors on a 10-point empathy scale! Even across a variety of patients with a variety of disorders.
This was the conclusion of a meta-analysis of 15 separate studies, the largest of which examined 2,164 patient interactions, with similar patterns emerging across the smaller datasets: healthcare workers and patients alike felt more warmth from AI-generated medical responses than from those of actual doctors.
AI had a 73% probability of being rated as more empathic than human practitioners (doctors, nurses, and other healthcare workers) in head-to-head comparisons.
In the authors’ own words: “In text-only scenarios, AI chatbots are frequently perceived as more empathic than human HCPs.”
Ah, there’s the caveat: The studies evaluated only text-based interactions; results may be different (and “we” hope and pray they might be) with person-to-person voice consultations. Also, there was no examination of whether increased empathy fostered better patient health (though “we” can assume that to be the case).
In any case, the results challenge long-held assumptions about human connection in medicine and run counter to what “we” have been saying—that empathy is an essential human skill that AI cannot replicate.
The starkest gap appeared in the handling of patient complaints: when responding to such grievances, ChatGPT-4 scored 2.08 standard deviations higher than human patient-relations responders. Even doctors found ChatGPT-4 to be more empathetic! Golly!
(There is, however, a twist: in the 2 of the 15 studies relating to dermatology, human dermatologists outperformed AI. Three cheers for skin, hair, and nail docs! “We” were right! But the scholars couldn’t explain this discrepancy.)
There is no gainsaying the fact that “we” were, overall, wrong in our assumptions about the “empathy” (or whatever you want to call it) of machines.
Empathy is always good.
Rejoice with those who rejoice, and weep with those who weep.
Romans 12:15
Jesus did, when his friend Lazarus died, even though Jesus was about to raise him from the dead.
Jesus wept.
John 11:35
And our cares and concerns, worries and woes, he knows.
For we do not have a high priest who cannot sympathize with our weaknesses,
but One who has been tempted in all things as we are, yet without sin.
Hebrews 4:15
And so Christians, too, are called to be empathetic.
Bear one another’s burdens, and thereby fulfill the law of Christ.
Galatians 6:2
Follow the Leader!
So, as those who have been chosen of God, holy and beloved,
put on a heart of compassion, kindness, humility, gentleness and patience;
bearing with one another, and forgiving each other,
whoever has a complaint against anyone;
just as the Lord forgave you, so also should you.
Colossians 3:12–13
Surely we can do better than chatbots. That’s what “we” continue to say!
SOURCE: British Medical Bulletin; Study Finds

Abe Kuruvilla is the Carl E. Bates Professor of Christian Preaching at The Southern Baptist Theological Seminary (Louisville, KY), and a dermatologist in private practice. His passion is to explore, explain, and exemplify preaching.