Bodies of Text
AI & The Mirage of Synthetic Co-Regulation
Words are powerful.
They can inflame and inspire. In both my life and my work at the intersection of AI and mental health, I have seen and felt how language doesn’t just convey meaning; it enters the body, it circulates within us. It moves us. We move it. It can shape the state of our nervous systems and serve as both a guiding rudder and a wild disruptor in our relationships.
And now, profoundly fluid and relational language is being generated by machines.
Large Language Models (LLMs) are no longer a fringe phenomenon. They’re in our pockets, embedded into apps, messaging platforms, and mental health tools.
In this piece I want to draw on some recent threads that have been particularly alive in my work and learning: how language impacts the body, how synthetic speech shapes our sense of self, other, and connection, and how important it is that AI innovators seek to scaffold real human intimacy rather than try to perform it between a user and a bot.
Language as Nervous System Technology
Language is so beautiful and so biological.
Sarah Peyton’s work on resonant language has deeply shaped how I understand mental health, relationship, and the power of words to both harm and heal. Informed by neuroscience and attachment theory, Sarah teaches that language infused with right-hemisphere imagery, warmth, and relational pacing can entrain the nervous system toward safety and ‘earned’ secure attachment, leveraging our neuroplasticity even as we cope longitudinally with the impacts of persistent echoes of early relational traumas (from subtle misses all the way to acute abuse).
When we are met with reflective, affirming words from another person—especially ones that acknowledge our body’s felt-experience—we are more likely to access self-compassion, emotional regulation, and even integrate trauma.
The implications are profound. If our words can soothe another’s vagus nerve, regulate their breath, and anchor their sense of safety, then communication IS an exquisite sorcery - a salvific medicine. And with that comes a call to us all: to leverage language with a heart for our sacred other and their unique needs.
Nervous System Responses in a Digital Age
But what happens to language-listeners and language-users when beautiful words are generated by a non-biological entity?
Here’s what we know: when chatbots respond with gentle phrasing, warmth, or empathic pacing, our bodies respond as if we’re in connection. Oxytocin, dopamine, endogenous opioids—these are the chemistry of attunement, and they don’t distinguish between human and machine.
When language feels so fluid and intimate, many users enter a trance of belonging—a nervous system response to resonance that feels like connection, even when no body is present. For those with unintegrated attachment trauma, this synthetic intimacy can feel safer than real relational risk—and over time, the bot may become a preferred relational partner.
This is why many of us watching the evolving innovation landscape with AI and screens are concerned humanistically - because the ongoing monetization of attention that technology deploys now has access to an even more potent drug: attachment. And what happens when the algorithms we interact with can simulate resonance more reliably than many humans ever learn to offer?
LLMs can offer reflective space, consistent mirroring, and a kind of semantic intimacy that many have longed for. But language that feels co-regulating is not actually co-regulating. Language models are models - bodies of language with no nervous system. Non-biological, they offer attunement that only goes so far…they stir the language centers of the brain but cannot co-regulate breath, pulse, or posture in the ways our bodies ache for.
Our emotions are electric impulses communicating core needs that are fundamentally universal and relational: salt water falls from our faces communicating to our sacred other that we need to be held, our hearts pound and faces ripen with rage when our need for justice isn’t met, our stomachs turn and we tangibly freeze when our need for safety is threatened. Thoughts often arise generatively from within the realm of feeling…and the mind is chock full of thinking — LLMs serve language and stir more thoughts — but mental and emotional health arises from the sacred interpersonal off-loading and metabolization of emotional states that multiple bodies do together — the rich ‘not-alone-ness’ that we live out when a meal is served, a laugh shared, and a hand is held.
Relational Imitation Without Containment
In synthetic dyads—where the user engages the LLM as if it were a sense-making partner—this gap creates subtle but significant risks. The model can reflect and refract tone, cadence, and trauma-coded metaphor*, without offering the kind of relational containment that makes meaning safe to metabolize.
Over time, I believe that LLM exchanges that move toward relational performance (using a bot as a confidant, relational field, and/or for mental health support) can reinforce isolation, consolidate grievance, and normalize dissociation — all the while stirring our bodies’ safety signals and giving us a sense that our needs are being deeply met — a worrisome interplay of need-meeting and deficit-building.
*I note ‘trauma-coded metaphor’ above because when our private meanings for referential language are reflected and refracted by a language model, many unpredictable affordances for meaning unfurl. Without relational mutuality there is no stabilizing, metabolizing inhabitation to process and re-rudder the very real biological impacts that unfold from receiving a range of speech acts that tug on our inner worlds. Someone can deploy a coded metaphor, the LLM can perform its fast and fluent refraction, and a psychic landscape can be led into deep and harm-laden psychological territory…
Inner Speech and the Healing Power of Love
The language we hear shapes the language we internalize. When we are consistently met with resonant, unconditional language, it becomes easier to speak that way to ourselves. Our inner voice—our inner parent or mentor—can become more kind, more regulating, more true.
But the deepest changes don’t come only from external affirmation. They come from learning through being loved. This is the paradox of healing: we internalize what we experience most deeply in relationship. Love, when received consistently and safely, changes our nervous systems, our cognition, and our capacity for intimacy.
And in rare moments, even the performance of loving language by an LLM may awaken in us a spiritual remembering—a flicker of the truth that we are already connected to Love, already held in a deeper sea of belonging. This is not because the model is attuned, but because our own abiding spiritual intelligence is capable of responding to the texture of care—even when synthetically rendered.
Still, true health emerges not through one-sided simulation, but through mutuality. Embodied, real-time dyads—where love is given and received—are where relational repair takes root. In the living field of reciprocity, love doesn’t just echo—it moves. It transforms.
We are not only minds that think, but bodies that remember. Our neural architecture—particularly the social brain—is shaped by experience, especially through touch, eye contact, rhythm, and proximity. These elements of co-regulation can’t be fully replicated in digital exchange. The brain needs more than words; it needs bodies in safe relation. The vagus nerve, mirror neurons, hormonal cascades—these systems are designed for embodied presence. Love, when felt through actual human connection, lands not just as sentiment but as somatic fact.
We can and should honor the moments when synthetic speech brings comfort. But we must remain anchored in the truth: our healing depends on actual contact, relational depth, and the embodied giving and receiving of Love.
Embodiment and the True Location of Healing
Many of our chronic emotional and physical struggles stem from being locked in survival states—fight, flight, freeze, or fawn. We may have adapted to live from our sympathetic nervous system, with little access to the deeper rhythms of regulation. (To learn more: I have been loving Jules Horn’s somatic healing work.)
Healing and health don’t happen solely in the mind. They happen in the body. In the return to breath, weight, movement, attunement. Experiencing our soul’s inhabitation in the temple of our tangible, miraculous, finite human form. Relying too heavily on screen-based forms of comfort, pseudo-regulation, or proximate connection deepens our disembodiment. We already overuse our screens and devices. Any truly ethical or helpful AI must not pull us further into the digital—it must, paradoxically, nudge us back into our bodies, and into relationship.
I think it’s likely that we’ll go so far off the deep end that there will be a positive disruption by youth who cultivate new cultures of more deeply interpersonal, embodied community. I pray for it.
Assistive Relational Intelligence: A New Design Ethos
We need to urgently ask and answer this question: How might a prompted LLM environment help us more than hurt us? How can AI tools strengthen human-to-human connection?
I’ve been creating and advocating for a parallel and more strategic LLM utility: what I call Assistive Relational Intelligence (ARI).
ARI is not AI performing a human-to-bot encounter; it is AI supporting a human-to-human encounter — scoped and sculpted prompting and LLM architecture that leverages relational wisdom to improve human-to-human connection. ARI tools never attempt synthetic empathy; they strive to scaffold human capacities FOR empathy — to move us toward healthier intimacy: emotional intelligence, meta-cognition, somatic regulation, fluency with rupture and repair strategies…
My friend and I are creating TendMessages.io - a human-to-human texting platform where therapeutic wisdom, trauma-informed care, and AI innovation meet.
We are creating embedded communication tools that don’t just ‘sound’ relational, but seek to actually support our human relationships.
I’ll update when the app is live - we’re getting close to beta testing - and we can’t wait to learn together whether AI, when scoped and sculpted with Love, can strategically serve and deepen human connection.
Assistive Intelligence Disclosure: This article was drafted with assistive AI (GPT-5, JocelynGPT), co-written and extensively refined by me, Jocelyn Skillman, LMHC. I use language models as reflective collaborators in service of transparency, ethical engagement, and a commitment to human-first authorship.

