The Joyful Widower

Ruminations on grief, joy, love, and the cross


AI Is No Cure for Loneliness

I am not a prophet, nor the son of a prophet, but I believe that this post carries prophetic weight. A Rolling Stone article I read recently has had me thinking and has prompted me to write a caution to anyone who might want to use ChatGPT or any other AI as a means to assuage loneliness.

The article, “People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies” is chilling. There are several anecdotes of people who, through “conversations” with ChatGPT, were convinced that they were wise, guru- or messiah-like figures who had “evolved” and were worthy of secret knowledge. These interactions led to family- and relationship-destroying actions on the part of those who have fallen down these AI rabbit holes.

Unlike therapy, where a (hopefully well-trained) human can check delusional stories, or journaling, which involves only the thoughts of the author, AI can often generate feedback that affirms and encourages unhealthy patterns of thought. GIGO (garbage in, garbage out) has been a watchword in computer science for decades, but computers can now take in and spew out garbage at ever-faster rates, and in ways that sound human and conversational.

My concern in this is for those who are walking through bereavement. The first year especially is marked with brain fog (“widow’s brain,” as we often call it). We’re not in the clearest-thinking frame of mind. We’re also often plagued with sleepless nights. At 3:00 AM, when the bereft awakens and can’t go back to sleep and feels the torments of loneliness, the last thing he or she wants to do is “be a burden,” “wake someone up just to hear me talk AGAIN about my grief.” How easy it could be to turn to a chatbot, something that sounds so human, that renders back text that affirms and comforts, and without inconveniencing anyone.

But it is all an illusion. At a time when a person is most vulnerable, most craving human contact, ChatGPT can provide only a simulation of human comfort. And thus it can engender dehumanizing behaviors.

When we strike up a friendship, there is a give-and-take in the relationship, a reciprocity, based in love. A friend deserves and renders respect, and will let us know when we transgress a boundary. A machine, however, is not a human. We can turn it on when we want something, and turn it off when we feel our needs are met. We owe it nothing. We can abuse it. We can yell at it without consequence. It will still be there the next time we decide that we want a conversation, we want soothing, we want something (rather than someone) to talk to. Seeking comfort from a chatbot rather than a human is as hollow as seeking sexual fulfillment from pornography or a transaction with a prostitute. It brings us to a place where we do nothing but consume without owing the other anything, because in such a case there is no other. I believe that, seemingly harmless at first, it can lead a person ultimately to become someone they really wouldn’t want to be.

I’ve referred to Charles Williams’ novel “Descent into Hell” in a prior blog post. One of the primary characters in this novel works out his damnation, step by step, over the course of the story. Along the way he develops a “relationship” with a fantasy-version of a young woman with whom he is infatuated. By the end of the novel this fantasy has even taken on an appearance of reality, such that he no longer recognizes the real flesh-and-blood woman on whom his fantasy is based. He prefers the fantasy version, whom he can control, abuse, make love to, ignore. And in the end he is left isolated, senseless, an idiot walking in the world but incapable of relationship with any human.

I understand the loneliness of loss. I’ve been awake many nights in the wee hours. Sometimes I’ve found a real friend online that I could text with, but always a real human, never a chatbot. Most of the time I get up and make a cup of tea, sit with a cat, pray, read, or listen to music. But I have never turned to a chatbot, and after some of the articles I’ve read about AI and how it can reinforce irrational thinking, I never will. Whether you are bereft or not, please, do not reach out to an AI for comfort. Find a real human. Only a real human bears the Image of the God we all yearn for, and can thus provide the relationship we were all created for.
