Can artificial intelligence pray? Tech entrepreneur Yossi Tsuria wanted to find out.
He asked an AI chatbot to generate a prayer.
If Joe were praying for his son’s health, Mr. Tsuria asked in 2023, how should he pray? The machine responded, “Heavenly Father, In this trying time, I come before you with a heavy heart.”
Why We Wrote This
Could your next spiritual guide be artificial intelligence? AI is offering Christians, Jews, and others an alternative to priests, rabbis, and other faith leaders.
But Mr. Tsuria was thinking of Orthodox Jews, not Catholics. He revised his question.
Within seconds, the AI generated a new prayer: “Amen. Dear God, In this challenging time, I turn to you with a heavy heart. My beloved Perry, my young one, is facing [a] battle … I stand by his side, feeling the weight of worry and fear.”
It was an indication that, even in their early days, AI chatbots had captured the language of at least some major religions. And since then, people have begun turning to chatbots as therapists, spiritual advisers, and even companions.
Last fall, a Catholic church in Switzerland temporarily installed an AI Jesus – a hologram animated by AI – in a confessional. Faith leaders have delivered sermons written by AI. And numerous apps have been developed that offer AI-generated guidance through prayer and meditation.
Still, many researchers and faith leaders are skeptical about the depth and veracity of religious guidance from a chatbot and see limitations in the technology, known as generative AI. For one, many chatbots don’t handle complex moral or religious issues very well. They also tend to stereotype non-Western religious traditions.
But the advancements present seemingly endless possibilities for exploring and practicing one’s faith. With that come any number of questions about what it means to consider spiritual and moral quandaries with the aid of a machine that doesn’t have a conscience.
“I’m excited in some ways about this,” says David Brenner, board chair of AI and Faith, an organization that convenes discussions around AI, religion, and ethics. “But I really believe that we need to be careful in how we apply it and how we can continue to bring to bear our human understanding in the ways we work best to interact with this technology.”
Many of the questions AI developers consider are also explored in faith, he says: “Who are we vis-à-vis animals in our creation? What is the meaning and purpose of life? How do you preserve truth and justice in life? How do you preserve agency?”
A number of researchers agree that, used correctly, large language models (LLMs) like ChatGPT can be tools for answering those questions and prompting deeper spiritual reflection. Models trained on the Torah, for example, can synthesize what the sacred Jewish text says about forgiveness. But people are divided on how insightful answers from LLMs really are, as well as their proper uses in a religious context.
“The God that I do believe in is one that embodies truth and understanding,” says Johnny Flynn, who just graduated with a degree in religious studies and philosophy from the University of North Carolina at Charlotte. “If I’m ever trying to engage with the spiritual … then I’d want to go to a source that also has understanding and can grasp truth.”
While chatbots deliver words with empathy and emotion, they don’t feel or even know what emotions are. To be angry, for example, is “to have the belief that you have been wronged in some unjustified way,” says Alba Curry, a lecturer in philosophy at the University of Leeds in the United Kingdom.
An LLM can’t make that kind of judgment. Instead, it uses prediction. Trained on more written words than a person could read in a lifetime, chatbots predict, with high accuracy, which word comes next in a sentence and craft responses that mimic conversation. This makes them sound human and easy to talk to.
Metaphysics and AI
That can be a problem for someone who is searching for real spiritual advice. “Large language models right now are sycophants. They really want to give you what you want,” says Dr. Curry. That isn’t the same as “the strength and the grit” that a priest or rabbi might offer someone facing a question of religious duty, for example.
The models are not well attuned to emotional vulnerabilities, either. And people who have worked as AI researchers and served as spiritual advisers say the two aren’t interchangeable.
Marcus Schwarting, an AI researcher pursuing his doctorate at the University of Chicago, is a commissioned Stephen Minister, a Christian layperson who’s a trained caregiver. In that capacity, Mr. Schwarting has met weekly with a person seeking support. That experience gave him a way to compare how such conversations might have gone with a chatbot instead of with him.
“I don’t really think an AI model is capable of having that sense of presence,” he says.
Still, he’s not ruling out all ways that an AI could be a useful tool for spiritual exploration. If someone is talking with an AI, they’re doing 90% of the work, he says. “I don’t really think there’s anything metaphysical happening with the AI model, but there could be something metaphysical happening to you.”
Other researchers say chatbots could help someone think through how to confess a sin to a pastor or other religious authority, or could provide companionship while reading a holy text.
“The Christian community might start realizing these are the sort of uses that would be helpful for our community,” says Dr. Curry of the University of Leeds. But “we will never use large language models for these really deep moral debates.”
So far, the vast majority of material chatbots are trained on is Western, resulting in a bias against religions from other parts of the world, says Flor Plaza, a computer science professor at Leiden University in the Netherlands. The LLMs demonstrate nuance when discussing major religions in the United States and Europe. However, in a study, she and others found that Eastern religions, such as Hinduism and Buddhism, are strongly stereotyped, and that Judaism and Islam are stigmatized.
Chatbots generally encourage positive things, such as respect for various faiths, and guard against certain ideas, such as religious violence or self-harm, like suicide. (However, many companies have yet to develop reliable safeguards against machines suggesting the latter.) Those values are determined by the companies developing and training AI – which means that their workers will influence how religion is portrayed.
That’s all the more reason to bring faith leaders into the conversation, says Elias Kruger, a data scientist who started a blog called AI Theology in 2016. There’s potential in using theological thinking to explore AI from an ethical perspective, he says.
“Ethics has to do with our relationships not just to each other as humans, but to our whole universe,” he says. “We used to treat machines and human-built things as things, and now we’re switching into treating them as beings.”
That shift could present issues when it comes to holding true to some individual faiths’ values. There’s a risk of what many Abrahamic traditions would call idolatry, since AI seems to share some attributes with different faiths’ conceptions of God, such as omniscience, omnipotence, and omnipresence, says Mr. Brenner, from AI and Faith. But it lacks love, concern, care, truth, and other qualities that create “the full dimension of God.”
AI will only change if the people engineering it do, says Mr. Kruger. Most of the models are developed and maintained by a Silicon Valley workforce that skews male. “How do we start addressing the problem, empowering people of many faiths and of diverse backgrounds to become builders?” says Mr. Kruger, who has a master’s in theology from Fuller Seminary. “I think that is really what is going to change the arc of development of AI.”
The purpose of religion
Many faith groups are concerned about the risks of thinking that AI is omniscient. “It’s made to give us a response, whether or not that response is true,” says Meredith Gardner, media literacy director for Mormon Women for Ethical Government. The group signed on to a recent letter asking Congress to reject a proposed moratorium on AI regulation.
AI can play the role of a spiritual director in asking questions and offering prompts, says the Rev. David Kim, CEO of Goldenwood, who developed a bot and runs workshops with faith groups interested in exploring AI tools. For him, it comes back to an idea of “hopeful intelligence.” Imagination has always been a key part of his own faith journey, he says, and he sees AI as a creative tool.
“We’re certainly aware of all that can go wrong with it, but given theological commitments, we have this mandate to move forward with things that we cultivate to a very hopeful orientation,” he says.
Even though AI doesn’t have consciousness, Mr. Kruger says that he has no doubt that people can use it to explore their faith. But it’s important to keep a sense of perspective and not seek spiritual guidance only from ChatGPT, he says.
“Religion has to be about, Does it bring us closer together or isolate us more?”