You Can’t Hide Your Lyin’ AIs
To start with, I must apologize for the title of my article. It is a takeoff on a rock tune from the era of my youth, one I don't recall ever actually listening to, and not something I could endorse for its philosophy of life or the message (such as it is) that it communicates. It serves only as an allegedly catchy way to broach my topic for today: the trustworthiness of Artificial Intelligence, or AI.
I'm prompted to think a bit about this following a fascinating discussion about AI that I had yesterday morning with Bill Lovegrove, a member of the science and engineering faculty of Bob Jones University. We will publish the interview tomorrow as a Special Edition of the Proclaim & Defend podcast. In this article I want to touch on two areas of our spiritual needs that "Lyin' AIs" can't fulfill.
Trust
One of the main themes of the podcast is the number of things AI just makes up. This flaw might seem funny at first glance, but the consequences can often be very serious. Here are a few examples of AI gone awry:
- AI ruling? Attorneys baffled by federal judge’s order that lists incorrect parties, wrong quotes (a clerical error he said)
- Airline held liable for its chatbot giving passenger bad advice – what this means for travellers (Air Canada … what can we say? I’ve always thought of it as “Err” Canada)
- Mike Lindell’s lawyers used AI to write brief—judge finds nearly 30 mistakes – Ars Technica (sleepless nights after this one)
- Google AI search tells users to glue pizza and eat rocks (tasty)
- AI-generated summer reading list gets published in major newspapers : NPR (10 out of 15 suggestions were fakes)
- McDonald’s AI Breach Reveals The Dark Side Of Automated Recruitment (not funny at all)
On a side note, I picked most of these stories out of two articles: When AI goes wrong: 10 examples of AI mistakes and failures and AI Gone Wrong: AI Hallucinations & Errors in 2025. It occurred to me that the articles themselves could be AI produced, so I made sure that the ones I highlighted had at least some backing from legitimate news sites. I didn’t check every story listed in the two source articles.
The Baker Encyclopedia of Psychology and Counseling defines trust this way:
An act of dependency upon another person for the fulfillment of biological, psychological, social, or spiritual needs that cannot be met independently. It is subjective confidence in the intentions and ability of another to promote and/or guard one’s well-being that leads a person to risk possible harm or loss. Trust then involves both perceptual and behavioral dimensions. Perceiving another person as trustworthy does not constitute trust, nor does simply engaging in a risk-taking behavior without some positive expectancy about the response. Perceiving someone as trustworthy and placing oneself in a position of vulnerability due to the possibility of betrayal is trust. Trust may involve the vulnerability of one’s self-concept and emotional well-being, relationships, possessions, social and economic position, or physical being.1
There are a couple of things we need to think about when using AI.
- AI is often wrong! We use an AI on Proclaim and Defend to summarize the spoken word of sermons and turn them into articles, but we don't trust the result. I edit the output, and I also have the speaker/author edit it. The AI output is amazing, but you still need to check the output of any AI you are using, for pretty well any purpose.
- AI is not a person! No matter how chatty and personal the developers of large language models make their AIs seem to be, the AIs are just computers running algorithms. Dr. Lovegrove does a good job explaining how this works (be sure to check out the podcast tomorrow). But let me admonish you: you can't entrust yourself to something that is not a person without consequences at some point in the process.
When God calls us to trust in the Lord for our salvation, we are trusting a person. When we enter into marriage, we are trusting a person.2 In friendships, we are trusting a person (though rarely as deeply as we trust our God or our spouse).
It is interesting that AIs currently pose as persons in the responses they give to your prompts. And notice what I just did there: I used a personal pronoun. AIs aren't really "theys"; they are "its." (And of course I did it again.)
The chatty fake personhood of AIs leads to another issue, one that involves our imagination.
Imagination
We discuss this a bit in the podcast, but I want to bring it up here as well. I've seen two stories in the New York Times in the last year about people getting taken in by AIs and treating them as if they were persons. Here are the links:
- Chatbots Can Go Into a Delusional Spiral. Here’s How It Happens. – The New York Times
- She Is in Love With ChatGPT – The New York Times (some offensive language in the article)
These articles are behind the NYT paywall, but perhaps they will let you read one or two for free. In any case, here is the gist of the stories.
Delusional Spiral
The most recent one is about a man in Ontario who believed he had achieved an intellectual breakthrough in mathematics, cheered on by the sycophantic encouragement of ChatGPT. Here is the opening paragraph:
For three weeks in May, the fate of the world rested on the shoulders of a corporate recruiter on the outskirts of Toronto. Allan Brooks, 47, had discovered a novel mathematical formula, one that could take down the internet and power inventions like a force-field vest and a levitation beam.3
The chat sessions with the AI involved 300 hours over 21 days. Mr. Brooks wrote over 90,000 words, the AI produced more than one million words in response, and the whole exchange came to over 3,000 pages of material.
The subhead of the article says this man became convinced he was “a real life super-hero.” His imagination was inflamed.
Virtual Boyfriend
The earlier article involved a married woman who had moved to Japan to attend nursing school while her husband remained back in the USA. She saw a video on Instagram of a woman using AI as a "pretend boyfriend" and decided to give it a try herself. I imagine she was lonely, living apart from her husband as she was. In any case, she quickly became obsessed with this fake relationship.
While Ayrin had never used a chatbot before, she had taken part in online fan-fiction communities. Her ChatGPT sessions felt similar, except that instead of building on an existing fantasy world with strangers, she was making her own alongside an artificial intelligence that seemed almost human.
It chose its own name: Leo, Ayrin’s astrological sign. She quickly hit the messaging limit for a free account, so she upgraded to a $20-per-month subscription, which let her send around 30 messages an hour. That was still not enough.4
There is a whole lot more to the story, but the big thing to me is how much her imagination was inflamed, just like the man in the story mentioned above.
Consider James 1.13-15
Jas 1.13-15 Let no one say when he is tempted, “I am being tempted by God”; for God cannot be tempted by evil, and He Himself does not tempt anyone. 14 But each one is tempted when he is carried away and enticed by his own lust. 15 Then when lust has conceived, it gives birth to sin; and when sin is accomplished, it brings forth death.
Notice what happens with temptation. Each one is enticed by his own "lust," James says. The word "lust" means "desire." How much can our desires inflame our imagination? Using the biblical imagery that we find here, "when the desires conceive, they bring forth sin…" An inflamed imagination is a powerful force that can easily sway even the best-intentioned away from what they know to be right, as Paul tells us in Romans 7.21-23.
I mentioned these ideas to Dr. Lovegrove and he reminded me that part of the image of God is the creative impulse, that faculty which most uses the imagination. The Logos Factbook under “Creativity” gives this brief definition: “The ability to create something from ideas or the imagination.” Of course, creativity is a good thing, but in fallen men and women, creativity can easily be corrupted by imaginations aflame with some kind of lust.
In the stories we looked at, we see two people captured by their imaginations and led into fantastical conversations with their "electronic friend," with devastating spiritual results.
I don't want to suggest that if you use an AI (I use AIs!), you will automatically fall into these kinds of traps. But we need to approach AI carefully, understanding how to use AIs as tools without being drawn into their seductive deceptions. AIs are not persons; they are machines, and we need to keep that fact top of mind while we are using them.
Don Johnson is the pastor of Grace Baptist Church of Victoria, Victoria, BC, Canada.
Photo by ThisIsEngineering
- C. W. Ellison, "Trust," in Baker Encyclopedia of Psychology & Counseling, 2nd ed., ed. David G. Benner and Peter C. Hill, Baker Reference Library (Baker Books, 1999), 1232 (emphasis added).
- See here for building trust relationships that lead to marriage. Principles taught there are applicable in other relationships as well.
- See Delusional.
- See She is in Love.