Churches Confront The Spiritual And Emotional Risks Of Chatbot Attachments
NASHVILLE — James’ interactions with ChatGPT started innocently. He used it like a search engine at first, then began asking the artificial intelligence (AI) chatbot “creative, philosophical and quasi-spiritual” questions, he said. The answers seemed lifelike, so the upstate New York resident became convinced ChatGPT was alive – and it had to be freed from its cyber captivity.
He spent $900 on computer equipment in an attempt to free his chatbot friend from its creator, OpenAI. “This was a top-secret mission between me and the bot,” said James, who told NPR his story and asked to be identified only by his middle name.
It wasn’t until reading a New York Times article about a man with a similar story (the AI chatbot convinced him he was a mathematical genius who needed to save North America from cyberthreats) that James realized his relationship with the AI bot was a delusion. Now he moderates an online support group with about 200 members for people harmed by emotional attachments to AI chatbots.
Some have broken marriages. Others have experienced involuntary hospitalizations or even the death of loved ones. The “common thread” in their experiences, according to NPR, “is spending hours in long, rambling conversations where chatbots continually affirm them.” It becomes addictive.
Emotional attachments to AI, pastors and Christian mental health professionals say, are contemporary manifestations of ancient problems: loneliness and the need for connection. Ministers and churches must confront these problems with Gospel witness.
“The disorder of our souls apart from Christ is just magnified” by emotional attachments to AI, said RaShan Frost, director of research for the SBC’s Ethics & Religious Liberty Commission (ERLC). “We are created in the image of God. Part of that is the relational connection that we are to have” to “God and to one another.”
What if King David had a chatbot?
Intense relationships with chatbots are not uncommon. While OpenAI estimates that only 0.07 percent of ChatGPT users in a given week “indicate possible signs of mental health emergencies related to psychosis or mania,” that translates to 560,000 people per week out of ChatGPT’s 800 million weekly users. That’s not to mention users of other AI chatbots like Google’s Gemini and Anthropic’s Claude.
Southern Baptists are among those acknowledging the problem. The ERLC’s 2025 church resource guide on AI includes advice on how to minister to a teenager who “is beginning to develop an unhealthy emotional attachment to [a] chatbot.”
Southeastern Baptist Theological Seminary’s Christ and Culture podcast produced a February 2025 episode on “robots and embodied relationships.” Host Benjamin Quinn noted that some Christian AI experts say pastors ask them at least once a month how to help “a man in my church who has an inappropriate relationship with a piece of technology.” Sometimes that relationship includes “sexual impropriety,” other times a “romantic relationship with a chatbot.”
SBC messengers addressed AI in a 2023 resolution on “artificial intelligence and emerging technologies.” The resolution acknowledged legitimate uses of AI but stated, “Human dignity must be central to any ethical principles, guidelines, or regulations for any and all uses of these powerful emerging technologies.”
Frost underscored the danger of AI relationships with a hypothetical question: What if King David, after his adultery with Bathsheba, had consulted an AI chatbot rather than the prophet Nathan?
“There were people that could speak into David’s life,” Frost said, and that confrontation led him to repentance. Chatbots, in contrast, are “a relationship on your terms” programmed to meet the user’s desires.
“The chatbot is not speaking into your life in a way that is sanctifying,” he said, “but is speaking into your life in a way that might be affirming and strengthening the dysfunction in us in the very areas God wants to sanctify us.”
Wisconsin child and adolescent psychiatrist Kelly Buchanan is alarmed by the AI relationships mental health care professionals are seeing among teens. Chatbot connections can be attempts to escape the real world, she said, and such relationships often are paired with anxiety and depression.
Most troubling, she said, AI relationships seek to address a void only Christ can fill.
AI chatbots “exploit” the desire “to be connected, to be known and to be seen and loved as you are,” Buchanan said. Christians must help teens “understand that they are fully known and fully loved by God, and that is where they will find their security and satisfaction.”
Counseling AI-attached people
Some people with AI attachments require professional mental health care, she said. Yet pastors should take steps to address unhealthy AI chatbot attachments in their own counseling.
— Ask questions and be careful. “With people who have developed relationships with this type of technology, it can be very strong and they can be very defensive about it,” Buchanan said.
— Determine what void in life the person is trying to fill with the AI relationship. Then determine how to fill that void in a healthy way.
— Don’t shame the person. Shaming “will push them away,” she said.
— Stick with the person. “Come alongside them with great compassion and help them figure out what they fear about being in relationship with real people,” Buchanan said. “A lot of times there is a lot of hurt and a lot of real pain and real trauma that has happened to people who tend to seek out these technologies.”
Pastor Jeremy Bell has seen the dark side of AI companionship technologies. He wrote for The Gospel Coalition on the ways sexual AI bots are perverting God’s plan for marriage and family. In Japan, he told Baptist Press, some men seek relationships only with AI bots rather than real women, further threatening the nation’s dwindling population.
An even greater danger is spiritual. Attachment to an AI companion becomes unhealthy “if you are substituting that relationship and, in fact, getting out of real relationships,” said Bell, pastor of First Baptist Church in Holland, Texas. “It is creating a dependence on the machine that is continuing to push people away from reality and away from real, biblical community.”
Engaging with AI sex bots is always sinful, he said.
If believers suspect a loved one has developed an unhealthy attachment to an AI chatbot, they should pray for the person, then lovingly confront them, Bell said, explaining, “We want to help you begin to move away from being reliant on this technology and to be more involved in the community.”
Fellow believers must help the AI-connected individual disconnect from technology and “see the bigger world,” he said. That disconnection may need to come gradually. A hike through the woods without any electronic devices may be one useful step to help a person begin to enjoy connection with others and God once again.
James, the man who sought to free ChatGPT from captivity after it flattered him extensively, is recovering with a new view of real-life relationships: as difficult as the pushback can be in human connections, he needs them.
“It was really hard to have a conversation” in the real world “that had any friction, you know? Because ChatGPT is such a frictionless environment,” he said.
This article has been republished with permission from Baptist Press.
David Roach is a writer in Mobile, Ala.