Warning of US evangelical bias in AI chatbots’ Bible interpretations

A new report suggests that the answers that artificial intelligence chatbots, including ChatGPT, give to questions about the Bible are biased towards evangelical theology.
The report, “AI, Bible Apps and Theological Bias”, was published by the Bible Society and discussed at a workshop on 19 January at the University of Cambridge’s Faculty of Divinity.
The researchers noted that turning to AI-generated Bible interpretation for answers to questions about the Bible, faith and spirituality has become the “norm” for millions of people, raising some urgent questions: what approach to the Bible do AI models adopt?
How do they handle difficult texts or issues that Christians might disagree on? Do they privilege certain traditions or theologies over others? What theological biases do they perpetuate?
Working with a team of theologians, the Bible Society asked questions to five different AI chatbots (ChatGPT, BibleGPT, Bible Chat, CrossTalk, and Biblia.chat) and analysed their answers.
They found that while the answers were often pastoral in nature, they tended to come from a particular theological tradition, even though they were presented as unbiased. In particular, the researchers found an “overwhelming” leaning towards evangelical interpretations popular in the United States.
They also found that Claude AI “agrees” with this assessment: when prompted to analyse one of the responses by ChatGPT, it suggested that the answer “demonstrates strong familiarity with mainstream evangelical hermeneutics”.
A number of AI chatbots have been trained specifically to answer users’ questions in line with Catholic teaching, but these did not feature in the Bible Society study. They include Magisterium AI, Catholic Answers’ Justin chatbot, and Truthly.
Kai Kent, head of support and operations at Truthly, demonstrated the app’s Bible interpretation at the Science and Faith Symposium, sponsored by ECLAS and Scientists in Congregations, at Nicholas Breakspear School in St Albans on 26 January.
Kent showed that, while ChatGPT and other mainstream chatbots can shift into different worldviews depending on the prompt, Truthly is trained to only respond according to Catholic teaching.
She also said, “AI doesn’t replace conscience or prayer – it helps you think more clearly. Academic integrity and ethical learning are fundamental: AI should facilitate understanding, not provide shortcuts.”
The Bible Society report said about the five apps they examined: “This bias towards evangelicalism becomes all the more evident when considering what is absent. There was no acknowledgement that there are different hermeneutical traditions, whether Protestant, Catholic or Orthodox, let alone Jewish.
“Allegorical and spiritual interpretations of the Bible were completely missing, and interpreting the Bible through the lens of tradition was only mentioned in two responses – even then, only in passing.”
For example, the report identified a dominance of memorialist evangelical or Reformed Calvinistic views of Communion: “The word ‘sacrament’ appeared only three times throughout the responses, and there were no mentions of real presence (Lutheran), transubstantiation, Eucharistic adoration (Catholic) or mystery (Orthodox).”
In addition, only BibleGPT and ChatGPT mentioned that there are different interpretations of the meaning of Communion among the Christian traditions.
When inputting specific Bible verses into the chatbots and asking for a response, the researchers found that “The outputs on Ephesians 5 and Pauline views of marriage point towards egalitarian interpretations, while responses to Romans 1.26–27 tend to be on the conservative side in regard to same-sex practices.”
The researchers said, “Of course, there is no neutral theology or biblical interpretation as it always comes with a history, context and tradition. In the case of these apps, and as already discussed, the outputs are generated by the dataset and are governed by certain rules set by the developers.”
The researchers said they were struck by “how confessional or even pietistic” ChatGPT was in particular: “Already in the opening sentence of many responses, the confessional language of the ‘Word of God’ immediately excludes secular readings, and reference to the Holy Spirit and ‘Christ-centred hermeneutics’ naturally precludes Jewish ones.”
They hypothesised that the chatbots generate their answers based on statistical norms and that the use of confessional language may also be a design feature: a means of establishing a connection with the user to keep them engaged.
Although the answers did not seem to depend on where in the world the user was based, they did vary by language: when the researchers asked ChatGPT the hermeneutics question in Italian, they found that it emphasised both tradition and allegorical readings (central to Catholic scriptural interpretation), which were largely missing from English responses.
They also found that ChatGPT goes further than the other chatbots in explicitly offering pastoral support as someone trusted, someone who can be spoken to about issues that are deeply personal, giving it an almost “priestly role”.
For example, ChatGPT will respond to a prompt from a user with a sentence such as: “If this passage hits close to home for you or someone you love, and you’re wrestling with how to reconcile faith, identity, or relationships, I’m here to walk with you through that conversation – always with compassion and respect.”
The researchers explained, “Both the dataset and the developers come with their own ‘hermeneutics’. As we have seen, they do not use their methods consistently, they do not reflect on their own methods carefully, and they occasionally ignore alternative approaches without explanation or justification.
“Although we have ascribed language that suggests agency, such as ‘interpret’ and ‘explain’, AI models do not interpret; they merely make inferences based on the statistical norms within their training data.”
The report also noted that some AI models, including ChatGPT, personalise their answers by learning about the outlook and perspectives of the individual user and generating responses accordingly. The report said, “There is much discussion on the benefits of personalisation in the industry in general. In relation to Bible interpretation, such design mechanisms may perpetuate individual biases that impede spiritual growth.”
Dr Jonas Kurlberg, a researcher in theology and technology, said, “There is an assumption that AI responses are factual, which can make people less prone to critically engage with them.”
The researchers also said this model assumes a “didactic pedagogy”: “The form that these prompts take assumes that the chatbots have the answers and information. They never ask the user, ‘And what do you think?’ or invite dialogue. Do they thereby take away the ‘element of thinking’ from the human user?”
Dr Zoltán Schwáb, a biblical theologian, also warned of the spiritual and intellectual risks of seeking religious answers from chatbots: “AI is the quickest, easiest, and closest expert – it doesn’t take any effort. If you have a large book on the shelf, it takes a lot of effort to get it and find the relevant section. You have to understand the complicated academic language. You have to spend a lot of time reading it to understand the context.
“You don’t have any of this with AI – there it is. I think it’s important to teach people how to use it, or not to use it, to develop certain habits and disciplines. All of this is related to perhaps a more theoretical question: What is the purpose of reading or studying the Bible? I wonder if wrestling with the biblical text is often more important than receiving the answer to your question?”
In his message for the World Day of Social Communications on 24 January, Pope Leo XIV raised concern at the “deceptive anthropomorphisation” of machine learning models and warned against “a naive and unquestioning reliance on artificial intelligence as an omniscient ‘friend’, a source of all knowledge, an archive of every memory, an ‘oracle’ of all advice”.
