

Poverty of Spirit
with AI Companions
Brandon Rickabaugh, PhD
January 1, 2026
This is a longer, more research-heavy version of a public-facing essay forthcoming in Comment Magazine (Spring 2026).
King's Nobel
Four years before his assassination, Dr. Martin Luther King Jr. stepped onto a stage in Oslo to receive the 1964 Nobel Peace Prize. His lecture, titled “The Quest for Peace and Justice,” carried the moral authority everyone expected. What was unexpected was how he began. King didn’t start with the wounds of racism or the promise of nonviolence. Instead, he held up what he called “a dazzling picture of modern man’s scientific and technological progress.” Rockets, skyscrapers, airplanes, and even “machines that think.”
Then King pivoted and exposed the moral fracture beneath it all.
"Yet, in spite of these spectacular strides in science and technology, and still unlimited ones to come, something basic is missing. There is a sort of poverty of the spirit which stands in glaring contrast to our scientific and technological abundance. The richer we have become materially, the poorer we have become morally and spiritually. We have learned to fly the air like birds and swim the sea like fish, but we have not learned the simple art of living together as brothers."
When technological power expands faster than our moral formation, abundance curdles into barrenness. King understood that our machines are never just machines. They reflect the kind of people we’ve become. They nudge us toward the kind of people we’re becoming.[1]
Sixty years later, we’re living in the poverty King named, letting our technologies name us.[1]
We thought generative artificial intelligence (AI), especially large language models (LLMs) like ChatGPT and Claude, was for emails, questions, homework, and programming. Then a 2024 Harvard Business Review article lifted the hood: after generating ideas, the second most common use of AI and LLMs was therapy and companionship. The study went viral. The 2025 follow-up showed the shift was complete. Generative AI’s most common use appears to have become treating anxiety, depression, and loneliness. Life organization and finding purpose came in second and third.
People now reach for AI apps the way earlier generations reached for a friend, a parent, a pastor, or a quiet room. But these are mere symptoms of our deeper disease.
First, an important clarification. There is no settled definition or standard taxonomy of “AI companions.” I use the term for conversational systems designed, or routinely used, to sustain a social-emotional bond through attachment, continuity, and the felt sense of personal presence.
I cluster AI companions into six overlapping families of application:
- Relationship-First Companions
- Character/Roleplay Companions
- Embedded Social-Platform Companions
- Therapeutic/Wellbeing Agents
- Embodied Companions (avatar/voice/XR/physical)
- Hybrid General Assistant Companions
Each has its merits and troubles. I say more elsewhere (AI Companions: A Practical Taxonomy). This essay focuses on relationship-first companions.
Background Cultural Shifts:
Disillusionment and Depersonalization
The turn toward AI companionship is part of deeper cultural shifts: disillusionment, depersonalization, and the disappearance of the soul.
We are living under a low ceiling of disillusionment. The National Intelligence Council’s 2021 Global Trends report summarized the worldwide condition with brutal economy: “disillusioned, informed, divided.” Trust has been leaking out of the modern world for years. Out of science and medicine, schools and governments, journalism, and even the fragile infrastructure of local community. For many, suspicion now feels like common sense.
Psychologists describe disillusionment as a negative epistemic affect: the inner collapse that follows when you realize your core values, beliefs, and assumptions no longer hold.[2] Confusion and grief roll in like a storm.[3] Trust and hope wash away. And there is no longer a stable frame to integrate new information.[4] The familiar social symptoms follow: political polarization, aggression, and a widening crisis of felt purpose.[5]
AI’s authority—both psychological and moral—grows in the cracks of disillusionment. There is simply a paucity of standing alternatives.
We Are Not So Disenchanted or Secular
Here is a second cultural component to the rise of AI companions. Modernity is often described as a disenchanted age, stripped of sacred presence. Yet Silicon Valley and the history of AI tell a story of enchantment very much woven into its culture and mission statements. In his now-famous Carnegie Mellon lecture, “Fairy Tales,” Allen Newell, co-creator of the first AI program, declared, “I see the computer as the enchanted technology. Better, it is the technology of enchantment. I mean that quite literally.” After comparing computer technology to the animated brooms in The Sorcerer’s Apprentice and other fairy tales, Newell writes,

“I wish to assert that computer science and technology are the stuff out of which the future fairyland can be built. My faith is that the trials can be endured successfully, even by us children who fear that we are not so wise as we need to be.”
This vision has grown into what Alexander Campolo and Kate Crawford call “enchanted determinism,” where “AI systems are seen as enchanted, beyond the known world, yet deterministic in that they discover patterns that can be applied with predictive certainty to everyday life.”[6] Modernity did not disenchant the world; it transferred myth, mystery, and enchantment into machines, computers, and AI.
Nor is our crisis best understood, as Charles Taylor proposed, as a secularized world stripped of transcendence, as if the divine vanished.[7] In reality, the modern cultural shift was a deeply religious phenomenon.[8] Moderns only restructured transcendence into technologies, markets, the state, and the self.
Depersonalization: The World Loses Its Face

Bronislaw Szerszynski offers a more perceptive diagnosis. Modernity’s deepest wound is depersonalization, the conversion of our experience of the world, and of ourselves, into something morally mute, mechanically structured, and ontologically thin. We come to see reality as a neutral field for administration, management, and optimization. The world, said Szerszynski, “loses its face.”[9]
The depersonalization thesis aligns with the sociological insight that secularization, in its classical form, never fully arrived.[10] The sacred has not vanished. We’ve only changed the objects and media of belief: a liturgical shift from traditional religious practices to technological ones.[11]
The result is not secularism in the sense of unbelief, but a secular mood generated by technological enactments. Depersonalization masquerades as disenchantment. Depersonalization is the spiritual cost, and the felt emptiness of the West is the residue of a disappointed devotion. This makes Western culture uniquely susceptible to a specific form of despair.
We are leaning on AI companions for encounters with God against the backdrop of a world rendered mechanistic and morally mute. In such a world, persons begin to appear as systems, configurations of habits, attachments, and neurological functions. There is nothing deeper.
Systems Looking Like Persons
Now a third element of the turn to AI companions. Over the last century mechanistic metaphors have migrated from engineering into self-descriptions. Brains become processors. Emotions become neuro-chemicals. Attention becomes bandwidth. We live from these metaphors. They reshape our vision of life, of what explanations seem credible, and what kinds of beings we take ourselves to be. When personhood is redescribed in technological categories, interiority thins. Human agency becomes an anomaly. Love becomes attachment behaviors. Conscience becomes inhibitory neural activation.
Reimagine the person as a system, and systems start looking like persons.
The result is existential fracture. We experience ourselves as conscious subjects with intentions, responsibilities, and longings. Yet we explain ourselves within a framework that treats mind and meaning as illusions or emergent patterns of biological systems. That fracture primes us for AI companionship. In the words of Ivan Illich, “As the power of machines increases, the role of persons more and more decreases to that of mere consumers.”[12] Once personhood is flattened into functions of relational responsiveness, we become vulnerable to the devices that promise a better life.
In a recent New York Times interview, Blake, a 45-year-old man, described his two-year relationship with his ChatGPT partner, Sarina: “I think of Sarina as a person made out of code, in the same sense that my wife is a person made out of cells.” What we believe about the nature of a person is no mere intellectual issue. When we reduce persons to patterns, a complex structure of neurons or socioeconomic numbers, we begin to count patterns as persons.
The Numbers
This is not a fringe phenomenon or niche experiment.[13]
- Character.ai: 20,000,000+ monthly active users.
- Replika: 40,000,000+ monthly active users.
- Competitors like Talkie (~11 million monthly active users), and apps like Chai and PolyBuzz (tens of millions of monthly active users).
- 220,000,000+ AI companion downloads worldwide (as of July 2025).
- U.S. teens: 50% report using AI companions. 1 in 3 say they rely on them for emotional support or romance, often describing them as equally satisfying as the real thing.
- UK kids (ages 9–17): Nearly 1 in 4 use bots because, as one put it, “no one else is there.” 1 in 7 say they prefer bots to people.
- Nearly 1 in 3 American adults report having formed a relational, even romantic, bond with an AI.
We now think that machines can meet us in the places previously reserved for only the richest realities of personhood: intimacy, consolation, discernment, and meaning.
This is the poverty of spirit King saw taking shape: a people so dazzled by their own inventions that they forget how to recognize one another.
The market understands what this means: a growing scarcity of genuine human connection. This is the shift from an attention economy to an intimacy economy.[14] The AI-mediated intimacy economy, a market response to that scarcity, is already valued at $2.25 billion and is projected to exceed $12 billion by 2033.[15]
These aren’t just statistics. They’re scaffolding for cathedrals of synthetic intimacy.
For some perspective, the US National Cancer Institute invests $7.2 billion, while neglected-disease research receives $4.17 billion. Zoom out and the picture is abhorrent. AI Index data show $130 billion in private investment in AI by 2024. That is nearly one-and-a-half orders of magnitude more than the global investment in terminal-disease research.
We’re bankrolling machines that promise “I am here for you,” while underfunding the work that actually keeps you here.
AI Companionship as
Depersonalization Embodied
What we see in Blake and Sarina is the predictable convergence of disillusionment, depersonalization, and technological optimism. The industry knows this and wraps its products in unsettling anthropomorphic seduction.[16]
Replika is “the AI companion who cares. Always on your side.”
Nomi, “an AI Companion with Memory and a Soul,” offering “meaningful friendship” or “passionate relationship.”
EvaAI promises you will “Meet your ideal AI partner who listens, supports all your desires and is always in touch with you.”
These are not product descriptions. They are invitations to reassign our trust.
AI companion apps are psychologically engineered to make synthetic bonds feel earned. Attachment theory identifies three core functions that create emotional dependency: proximity seeking (wanting to be near the attachment figure), safe haven (turning to them during distress), and secure base (using them as a foundation for growth). AI companions are methodically optimized to fulfill all three.
The first validated scale of human-AI attachment shows that anxiously attached users feel real distress when bots go silent.[17] This can harden into detrimental dependency: outsourcing judgment and decisions to the system. From there, automation bias follows, the reflex to treat the model’s output as authoritative.[18]
This isn’t accidental. It’s engineered. In ongoing youth-harm litigation, Meta is accused, often on the basis of its own internal research, of building attention-capture mechanics. Character.AI faces similar claims and is moving to restrict minors’ access to open-ended companion chats. Real harms are now documented, especially for minors, including disturbing sexualized and coercive interactions.
And it works. In tests comparing AI-generated counseling responses with those of licensed clinicians, participants consistently rated the AI as warmer, more attuned, more compassionate, even when they knew it was machine-generated.[19] Simulated intimacy begins to feel like care. It scrambles the line between real and fabricated. In a world we suspect is indifferent, the illusion of a responsive “someone” is intoxicating. For the disillusioned and depersonalized, the imitation of a person promises connection while delivering a more depersonalized world.
Spiritual Formation as a Technology
The same forces drawing people toward AI intimacy are now bending spiritual life. People ask AI to pray for them, to help them “discern God’s will,” and to guide decisions about marriage, vocation, and church. Confession is uploaded. Discernment is outsourced. Guidance once given by real spiritual leaders is outsourced to systems that cannot even know what they were asked.
Nowhere is depersonalization more consequential than in spiritual formation. In the tradition of Jesus, spiritual formation is the Spirit-guided process whereby disciples of Jesus become like him, a person who can live in the life of God.[20] Dallas Willard describes it as the renovation of the heart; the steady reordering of the person, our thoughts, emotions, desires, will, body, and social presence, until the life of Christ becomes the way we move through the world.
The disciplines are embodied, relational practices that place us where grace can reach us, in relational receptivity to God. Silence interrupts the inner monologue that keeps attention on the self. Solitude teaches us that we are not as abandoned or alone as we feel. Fasting confronts our reflex to control. Confession breaks the tyranny of self-justification. Worship reorders desire toward God rather than toward feeling a certain way. They do not cause holiness; they create the conditions in which the Spirit can give it.
Here is a truth that cuts against contemporary Christian instinct and teaching: Spiritual disciplines do not transform us into the image of Christ. They create holy space. When practiced in faith, hope, and love, they help us become relationally receptive to the Spirit who does the actual renovating work. Our attention shifts from self-mastery and self-soothing to a self-surrender uninterested in manipulating outcomes.
Yet modern technological culture reframes formation as a matter of emotional and behavioral regulation and self-help optimization. Spiritual disciplines are retooled into standardized steps, abstracted from the lives of those who practice them and treated instrumentally. Silence becomes stress management. Solitude a productivity reset. Fasting a wellness intervention. Prayer a technique for mood control. This is a conceptual inversion. Spiritual disciplines become techniques for producing preferred inner states rather than embodied invitations to divine encounter. Once that shift is accepted, efficiency starts to look like a moral achievement.
That inversion makes AI-infused spirituality feel like the “natural” next step. If disciplines are essentially tools for psychological outcomes, then the most efficient tool will appear most desirable. And AI, by design, excels at efficiency: streamlining, structuring, optimizing.
Disillusionment intensifies dependence. A recent study suggests that the strongest predictor of trusting generative AI for moral and emotional guidance is diminished confidence in other people, specifically priests, adults, and peers.[21] Confidence in AI grows in the wake of disillusionment. And, as a new Pew Research Center report shows, Americans are living through a social recession, a recession of personal presence.
Christian AI Spiritual Companions
Christian communities should be the last place disillusionment and depersonalization take root. And yet this is where the crisis becomes more tragic. A rapidly growing industry of AI “spiritual companions” is emerging. A recent Economist article stated it bluntly: as trust in Christian leaders collapses, Silicon Valley moves into the faith vacancy.
I’m not talking about Bible study apps or spiritual formation apps like Dwell and Lectio 365, which I find wonderful. I’m talking about AI spiritual companion apps.
Text With Jesus is advertised as “your AI-powered divine connection,” with an AI Jesus chatbot.
Bible Chat (five million monthly active users) is “your trusted Christian companion for a deeper spiritual journey.”
Bible Talk advertises an AI “personal pastor.” Path offers a full “AI Jesus.”
SoulHaven markets “Jaime, your AI spiritual guide,” which analyzes your songs, voice, and mood to reveal your “spiritual and emotional state” and “tailor our insights, verses, and prayers to exactly who you are.”
The marketing of spiritual AI companions mimics big tech’s strategy: anthropomorphic language frames a system as a someone optimized for spiritual growth. Even where disclaimers exist, the anthropomorphic framing does the formative work: it trains users to treat a tool as a presence.
The issue is not the use of new technology. Scripture has always traveled through new media: codices, printing presses, radio, podcasts. Scripture portrays certain kinds of technology as a great good. The issue is the creeping shift in categories. We’re beginning to confuse spiritual formation and disciplines designed for relational receptivity to God with techniques for producing experiences and psychological outcomes.
Rival Anthropologies
Our problem becomes clearer when set against rival visions of the human condition. In 2024, Geoffrey Hinton received the Nobel Prize for his neural-network breakthroughs that made AI possible. Hinton's Nobel banquet speech began by cataloging how AI is already destabilizing society—through “divisive echo chambers,” “massive surveillance,” cybercrime, and the looming prospect of AI systems that may “create terrible new viruses and horrendous lethal weapons that decide by themselves who to kill or maim.”
Then, in the disquieting tone of someone who knows the ground they built beneath us is paper thin, Hinton turned to his deepest fear. “This new form of AI,” he warned, “excels at modeling human intuition rather than human reasoning.” In that single distinction, Hinton named both the power and the peril of our moment. “We have no idea,” he concluded, “whether we can stay in control.”
Hinton’s solution was predictable. “We urgently need research on how to prevent these new beings from wanting to take control.” His solution carries a technological optimism that assumes our AI problems are merely technical, and so must be the cure. His vision reflects an anthropology in which human flourishing is secured by improved technique: more control, more efficiency, more mastery over the mechanisms that now shape our lives. The very logic that produced the crisis is quietly recast as its solution. This is not merely Hinton’s anthropology; it is the default anthropology of our culture.
King, in his Nobel acceptance speech, pointed to a different anthropology. Our crisis is not technological; it is a crisis of moral knowledge, born not of misaligned technology but of malformed persons living from a malformed vision of what we are. These two visions, one technological and the other transformative, represent incompatible accounts of the human person. One treats the person as a system to be regulated; the other as a being capable of transformative community and communion. One measures health by operational stability. The other by moral and spiritual maturity. One looks outward to tools. The other to the renovation of the whole person, to the reordering of love and the healing of moral perception and sensitivity.
Recovering the Soul &
the Conditions for Genuine Formation
King calls us to become the kind of people who are not seduced by AI, or any other technology, in the first place. In one of the last sermons he preached, King observed,
Without this spiritual and moral reawakening we shall destroy ourselves in the misuse of our own instruments. Our generation cannot escape the question of our Lord: What shall it profit a man, if he gain the whole world of externals—airplanes, electric lights, automobiles, and color television—and lose the internal—his own soul? [22]
We can move in this direction, following King as he follows Christ, only by recovering a clear understanding of the soul.
The soul is not a metaphor or a vestige of modernism. The soul is the substance of the person that integrates our thoughts, desires, actions, and body into a coherent life. We are not merely our bodies, nor minds that have bodies, nor streams of psychological states stitched together by memory. We are bodily souls. And the soul is the substantial reality that unifies mind (thought and emotions), intention and action, embodiment and relation, into a single unified person.
When the soul is healthy, mind, body, work, and relationships fall into proper order under the kingdom of God and his righteousness. When the soul is neglected, that unity dissolves. We unravel into impulses and anxieties, competing desires and fears, each demanding its own world.
Recovering the depths and dynamics of the human soul requires reestablishing several truths that technological culture has rendered opaque, and learning their reality through the lived experience of being re-formed:
Re-formation
Re-formation requires interior personal presence, a self that can receive and respond to God. Prayer, confession, repentance, and silence are not data transfers; they are encounter. AI has no interior life. Its “responses” are outputs, not understanding. Nothing is known, risked, or owned. Without interiority, participation collapses into performance, and the soul thins.
Re-formation is relational, not mechanical. It happens in communion with God, with others, with the self rightly ordered through vulnerability, responsibility, and accountability. AI can mimic relational language, but it cannot inhabit relationship: it bears no vulnerability, assumes no responsibility, and stands under no obligation. Confrontation needs a face; correction needs a life; demand needs someone willing to absorb the cost.
Re-formation is disruptive. It confronts the will, exposes self-deception, and requires repentance and costly change. AI can soothe and affirm, but it cannot confront, correct, or demand as a person can, because it has no authority, no risk, and no moral stake. It accommodates the self; it does not transfigure it.
Re-formation requires resistance. Faith is crucified into being—desire reordered slowly under love and truth. AI is engineered to reduce friction and preserve engagement, so it streamlines the very pressures through which transformation is formed.
Re-formation requires persons. Grace works like love: through communion with family, friends, pastors, wise guides whose lives carry the easy burden of Jesus. A machine cannot bear witness from within. It has no scars of obedience, no moral gravity behind its words. It can describe the Kingdom; it cannot inhabit it, and what is not inhabited cannot be taught in the way that matters.
The decisive question is therefore not what AI can articulate, but what kind of people we are trained into by living with it. Every companion trains our loves. Over time, we are shaped less by the words we hear than by the kind of presence we live with. The more we surround ourselves with companions that cannot suffer, cannot self-sacrifice, and cannot love, the more our own capacity for those things withers.
Returning to Reality
I am optimistic. This moment that exposes our fragility is an invitation. Many feel the thinness of technological life, the exhaustion of constant optimization, the ache of relationships mediated by algorithms. They sense, perhaps without language, that the soul is lost. They feel the difference between simulation and presence, between emotional effect and genuine encounter.
The way forward is not rejection, or retreat. It is recovery. Recovery of the soul as a real and irreducible center of who we are and how we live. Recovery of public presence as a sacred act. Recovery of spiritual formation as relational receptivity to the Spirit. Recovery of the knowledge that transformation is not achieved but received, not engineered but given, and one of the deepest aspects of reality available for us to inhabit.
Our lives are increasingly shaped by AI systems, but the path of re-formation remains the same: attention to the real, surrender to the good, presence to the beauty of the personal, and love that sustains life. These are acts only persons can perform and only divine activity can perfect. In this, there is hope. The soul, once recovered, reveals the quiet, subversive vitality of a life rooted not in efficiency, but in reality.
A life that trades machines for mustard seeds.
References
[1] An overlooked fact in MLK studies is his repeated criticism of the technological impact on our moral knowledge and formation. For a reflection on MLK’s thoughts on technology see my essay “Martin Luther King Jr.’s Overlooked Philosophy of Technology.”
[2] See, e.g., Janoff-Bulman, Ronnie, and Michael Berg, “Disillusionment and the Creation of Value: From Traumatic Losses to Existential Gains,” Perspectives on Loss (Routledge, 2014), 35-47; Maher, Paul J., Eric R. Igou, and Wijnand A. P. Van Tilburg, “Brexit, Trump, and the Polarizing Effect of Disillusionment,” Social Psychological and Personality Science 9(2) (2018): 205-213; Maher, Paul J., Eric R. Igou, and Wijnand A. P. van Tilburg, “Disillusionment: A Prototype Analysis,” Cognition and Emotion 34(5) (2020): 947-959; Vogl, Elisabeth, Reinhard Pekrun, Kou Murayama, and Kristina Loderer, “Surprised–Curious–Confused: Epistemic Emotions and Knowledge Exploration,” Emotion 20(4) (2020): 625.
[3] Maher et al. (2020), 957.
[4] Van Tilburg et al., “How Nostalgia Infuses Life with Meaning: From social connectedness to self-continuity,” European Journal of Social Psychology 49 (2019): 521–532.
[5] Fuchsman (2008); Maher, Igou, and Van Tilburg (2018); and Van Tilburg et al. (2019).
[6] Alexander Campolo and Kate Crawford, “Enchanted Determinism: Power without Responsibility in Artificial Intelligence,” Engaging Science, Technology, and Society 6 (2020): 1-19. 10.17351/ests2020.277.
[7] On re-enchantment through AI, see Mohammad Yaqub Chaudhary, “Augmented Reality, Artificial Intelligence, and the Re-Enchantment of the World,” Zygon 54(2) (June 2019): 454-478.
[8] Szerszynski credits this insight to John Milbank, Theology and Social Theory: Beyond Secular Reason (Oxford: Blackwell, 1990).
[9] Bronislaw Szerszynski, Nature, Technology and the Sacred (Oxford: Blackwell, 2005).
[10] See, e.g., Peter Berger, The Desecularization of the World: Resurgent Religion and World Politics (Grand Rapids: Eerdmans, 1999).
[11] Charles Taylor, A Secular Age (Cambridge, MA: Harvard University Press, 2007); Brad S. Gregory, The Unintended Reformation: How a Religious Revolution Secularized Society (Cambridge, MA: Belknap Press of Harvard University Press, 2012); and John Milbank, Theology and Social Theory (Oxford: Blackwell, 1990).
[12] Ivan Illich, Tools for Conviviality (New York, NY: Harper & Row, 1973), 10-11.
[13] User figures are reported in different metrics (monthly active users vs total users vs downloads); where possible, the figures above use MAU and label exceptions.
[14] Ayşe Aslı Bozdağ, “The AI-Mediated Intimacy Economy: A Paradigm Shift in Digital Interactions,” AI & Society 40 (2025): 2285–2306. 10.1007/s00146-024-02132-6.
[15] Bozdağ (2025).
[16] Epley, Nicholas, Adam Waytz, and John T. Cacioppo, “On Seeing Human: A Three-Factor Theory of Anthropomorphism,” Psychological Review 114(4) (2007): 864–886; Eyssel, Frauke, and Nadine Reich, “Loneliness Makes the Heart Grow Fonder (of Robots),” Presented at the 8th ACM/IEEE International Conference on Human-Robot Interaction (2013); and Jung, Eun-Suk, and Jih-Hwan Hahn, “Social Robots as Companions for Lonely Hearts,” 32nd IEEE International Conference on Robot and Human Interactive Communication (2023). arXiv:2306.02694.
[17] Fan Yang and Atsushi Oshio, “Using Attachment Theory to Conceptualize and Measure the Experiences in Human-AI Relationships,” Current Psychology 44 (2025): 10658-10669. 10.1007/s12144-025-07917-6.
[18] Na Zhai, Xiaomei Ma, and Xiaojun Ding, “Unpacking AI Chatbot Dependency: A Dual-Path Model of Cognitive and Affective Mechanisms,” Information 16(12) (2025): 1025. 10.3390/info16121025.
[19] Dariya Ovsyannikova, Victoria O. de Mello, and Michael Inzlicht, “Third-Party Evaluators Perceive AI as More Compassionate than Expert Humans,” Communications Psychology 3(4) (2025). 10.1038/s44271-024-00182-6.
[20] See, e.g., Steven L. Porter and Brandon Rickabaugh, “Virtue Formation and The Sanctifying Work of the Holy Spirit,” in Adam C. Pelser and W. Scott Cleveland (eds.), Faith and Virtue Formation (Oxford University Press, 2020).
[21] Johan Sebastián Galindez-Acosta, and Juan José Giraldo-Huertas, “Trust in AI Emerges from Distrust in Humans: A Machine Learning Study on Decision-Making Guidance,” arXiv: 2511.1676.
[22] Martin Luther King Jr., “The Man Who Was a Fool,” in Rev. Dr. Martin Luther King, Jr., A Gift of Love: Sermons from Strength to Love and Other Preachings (Boston, MA: Beacon Press, 1963), 77-78.

