Can AI Robots Offer Advice That Heals Souls?
(ANALYSIS) I have been keeping an Artificial Intelligence folder for several weeks now, with a focus — naturally — on topics that should interest religious leaders.
When it came to nightmare scenarios affecting congregational life, I tended to think about the growing market for AI sermon programs, which I see as an updated version of the old problem of plagiarism in pulpits (see this On Religion column from 2003).
However, the Big Tech marketplace is full of shocking twists and turns. During COVID-tide, I certainly didn't see this topic on the horizon: "Anglican debate in 2020 crisis — Can clergy consecrate bread and wine over the Internet?" That added a whole new meaning to the classic Episcopal Church Ad Project poster: "With all due regard to TV Christianity, have you ever seen a Sony that gives Holy Communion?"
Maybe I am getting naive in my old age, because I also didn't have the "Father Justin" drama — yes, a chatbot confessor — on my ecclesiastical bingo card. If you missed that, here is a chunk of my sad column on that:
The penitent crafted the perfect sin to confess to a virtual priest: "Bless me father, for I have sinned. … I have had anger in my heart about the deployment of AI chatbots in inappropriate places."
"Father Justin," a 3D AI character created by the San Diego-based Catholic Answers network, offered biblical advice for wrestling with anger.
"God is merciful and loving, my child," the bot concluded. "For your penance, I ask you to pray the Our Father three times, reflecting on God's infinite mercy and love. And now, I absolve you of your sins in the name of the Father, and of the Son, and of the Holy Spirit."
Legions of cyberspace believers pounced. One tweeted this cry: "HAIEEEEEEE."
Maybe you are wondering: Where is tmatt going with this?
Well, the other day I ran into a New York Times headline that forced me to start thinking about AI in a completely different way. Yes, I saw a larger Rational Sheep "signal" that needs to be heeded by parents, pastors, teachers and counselors. That headline:
Can A.I. Be Blamed for a Teen’s Suicide?
The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.
The story focused on a legal case surrounding the life and death of Sewell Setzer III, a young teen who had been diagnosed with mild Asperger's syndrome as a child. He developed strong ties — verging on addiction — to an AI chatbot that he had named after Daenerys Targaryen of "Game of Thrones."
To read the rest of Terry Mattingly’s post, visit his Substack page.
Terry Mattingly is Senior Fellow on Communications and Culture at Saint Constantine College in Houston. He lives in Elizabethton, Tennessee, and writes Rational Sheep, a Substack newsletter on faith and mass media.