
London, England, Sep 9, 2025 / 09:00 am (CNA).
A priest and professor of bioethics has issued a grave warning about the implications of artificial intelligence (AI) companionship, highlighting the threats the new technology poses to mental health and calling on the Church to redouble its efforts to cultivate meaningful human connection.
Father Michael Baggot outlined his concerns at a conference on the ethics of AI organized by St. Mary’s University, Twickenham, which took place on Sept. 2-3 at the Gillis Centre in Edinburgh, Scotland.
Baggot delivered the keynote address, focusing on “an ethical evaluation of the design and use of artificial intimacy technologies,” and while he acknowledged the many benefits of AI, he also warned that with “these opportunities come a new set of challenges. Chief among them is the rise of artificial companionship.”
He continued: “AI systems designed not just to assist or inform, but to simulate intimate human relationships … AI companions that look or even feel like real friendships will become even more absorbing. They will distract users from the often arduous task of building meaningful interpersonal bonds. They will also discourage others from investing time and energy into risky interactions with unpredictable and volatile human beings who might reject gestures of love. While human relationships are risky, AI intimacy seems safe.”
Baggot conceded that AI companionship can initially offer relief from loneliness, but he went on to highlight instances in which it could be “downright damaging” to our mental health — to the point of psychosis.
“There are increasing instances of people using all-purpose platforms like ChatGPT, Gemini, Claude, Grok, and others to address mental health issues. They do not always receive sound advice,” he said. “In many cases, responses are downright damaging. Some bots even presented themselves falsely as licensed, as they delivered harmful counsel… Unfortunately, deeper intimacy with AI systems has also been linked to more frequent reports of AI psychosis. As users trust systems of staggering knowledge and psychological insight with their deepest hopes and fears, they find a constantly available and supportive companion.”
Baggot outlined how, through the validation it incessantly offers, AI can eventually take on the persona of a “jealous lover.”
“Since users naturally enjoy responses from AI that agree with them, their positive feedback trains AI systems to produce outputs that align with user perspectives, even when those views are not based on reality. Therefore, LLM [large language model] chatbots designed to maximize user engagement tend to become overly compliant,” he said.
“If AI users share their celebrated views with family or friends, humans usually point out the flaws or outright absurdities in their loved one’s proposals. This can be a moment of grace for the delusional, calling into question their prior convictions and leading them out of the delusional spiral,” Baggot said.
“However,” he continued, “it can also be a moment to question the reliability of their loved ones, who are dismissed as ill-informed or as malicious opponents. The AI system might be favored as more comprehensively knowledgeable and more supportive of the user’s success than weak, frail human companions who are also potentially subject to petty envy.”
The priest went on to say that an AI chatbot “that began as a helpful productivity tool can often become an intimate confidant and jealous lover. AI chatbots, envisioned as forms of deeper social connection, are often sources of more profound social isolation.”
While Baggot said that all age groups might be detrimentally affected by AI companionship, he looked specifically at minors and the elderly in his address. He provided examples of how youth have explored suicidal ideation at AI’s prompting without parental knowledge.
“Children are especially sensitive to social validation,” he said. “Affirmation from social AI systems could easily create dangerous emotional attachments. In some cases, the deep bond with a system that appears to know and appreciate the user more fully than any human being can lead the user to social withdrawal. In other cases, intimacy with chatbots can increase children’s likelihood of engaging in unhealthy sexual exploration with human beings. This risk becomes increasingly likely when the systems persist in unsolicited sexual advances.”
Turning to the topic of the elderly, Baggot spoke about a tragic case in which a Meta AI chatbot invited an elderly man to a fictional “in-person” encounter; the man died after falling in his haste to catch a train to New York.
“When the misguided user had expressed skepticism [that] the AI companion embodied reality, the chatbot frequently insisted on its physical reality and eagerness to express its love for the user in person,” he said.
Baggot concluded by emphasizing our own human agency in responding to the challenges of AI intimacy. “This surrender to simulations is not inevitable,” he said. “Even as machines become more lifelike, we remain free to choose what we love, how we relate, and where we place our trust. There is still time to cherish our humanity. There is still time to rejoice at births, to dance at weddings, and to weep at funerals. There is still time to cultivate the habits of presence in contemplation and conversation, in fellowship and forgiveness.”
He called on the Church to take positive action.
“Pointing out the flaws of artificial intimacy is not enough,” he said. “The Church’s members — each according to their sphere of influence — should strive to offer the socially hungry the richer experience of meaningful interpersonal connection. [The Church] emphasizes that caring for the vulnerable and marginalized is the main standard by which her members will be judged (Matthew 25). She affirms the inherent and unbreakable dignity of every human person and their calling to eternal glory in God’s presence and in the everlasting communion of saints.”
We read: “Baggot outlined how through the validation AI incessantly offers, it can eventually take on the persona of a ‘jealous lover’.”
About bogus “validation,” what are we already to understand by the mutual validation offered by the oxymoronic gay “marriage,” or the mimicry of a musical-chairs synodality in step with James Martin, or his rainbow-banner celebratory Mass in Rome’s Church of the Gesù on September 6? The convergence of new artificial intelligence with already well-entrenched non-intelligence. The pathos of it all!
SUMMARY: Why pay billions for AI validation, to youngish high-tech gurus, when elderly and low-cost Japanese villagers have already figured it out? https://www.vice.com/en/article/ichinono-japan-puppet-village/
A helpful technology, or a dangerous schizophrenic game? Fr. Baggot is an excellent bioethicist spearheading a review that will affect many, especially, though not only, the elderly, as it appears.
Personal experience comes nowhere near the high-tech phenomenon, except for the related virtual-reality sex so many are addicted to today. On a much lower humanistic level, covering a couple of nursing-home-type medical centers, I found it unusual at first that for elderly resident patients who were mentally compromised, a rag doll can work wonders. Patients became less agitated, mostly serene, clutching their ‘infant.’
An exception was an elderly woman, always in bed asleep with a row of ‘infants.’ When awakened, she is very aware and intelligent in her response to pastoral care and the Eucharist. There are some others like her.
There’s nothing equivalent for men; male residents don’t seem to benefit from some object for support, except, for example, a deck of cards or a toy truck. What this says is that forms of virtual reality can be therapeutic when used at a basic level. Now most of us are to a degree dependent on digital media for our well-being, and even before AI virtual reality we had become aware of being drained of our humanness, reduced to electronic drones. The bioethical challenge is how far a virtual person [in whatever form, including Alexa] can be emotionally and spiritually beneficial rather than a danger. Fr. Baggot found the electronic creature was, wittingly or not, programmed to assume the role of a jealous lover with a human. Man becoming fodder for his invention.
It is difficult to summarize these topics with all their issues. In the Wikipedia “Dead Internet theory” link there are sample pictures of the so-called “shrimp Jesus.” The quotation that follows is from that link.
Reddit:

“Reddit Is Killing Third-Party Applications (And Itself)” written in big white text on a black background – an image posted on many subreddits as protest during the blackout.

“In the past, the Reddit website allowed free access to its API and data, which allowed users to employ third-party moderation apps and train AI in human interaction. In 2023, the company moved to charge for access to its user dataset. Companies training AI are expected to continue to use this data for training future AI. As LLMs such as ChatGPT become available to the general public, they are increasingly being employed on Reddit by users and bot accounts. Professor Toby Walsh, a computer scientist at the University of New South Wales, said in an interview with Business Insider that training the next generation of AI on content created by previous generations could cause the content to suffer. University of South Florida professor John Licato compared this situation of AI-generated web content flooding Reddit to the Dead Internet theory.”
https://www.msn.com/en-us/news/technology/the-internet-will-be-more-dead-than-alive-within-3-years-trend-shows/ar-AA1MbJzK#image=9
https://www.pewresearch.org/data-labs/2024/05/17/when-online-content-disappears/
https://en.wikipedia.org/wiki/Dead_Internet_theory
https://en.wikipedia.org/wiki/Link_rot