Coding for Clinicians who want to Code
Helping More Mental Health Clinicians Create Safe and Savvy Tools with LLMs
I had the complete pleasure of presenting for The Hemingway community recently—thank you so much, Steve Duke, for the invitation and the generous, generative conversation! It was a joy - I felt surrounded by encouragement and solidarity as we all seek to innovate boldly, creatively, and safely in this new terrain.
Some of the prototypes I presented and have been quietly tending over these past months live here in this PracticeFields Hub.
In the sweet days since the talk I realized I could dive a bit deeper, given all of our enthusiasm about scaling therapists’ access to LLM innovation without coding as a barrier…
The Unicorn Problem
My last Substack piece landed for people—the idea that clinically-informed UX design for LLM tools is itself an emergent intervention frame. Therapists want to shape these tools. They just need more of an on-ramp. And I keep hearing that I’m a unicorn—a therapist who codes?! Why aren’t there more of us?
The gap is about systemic tech dominance (!), what feels permitted, how LLMs sit in the zeitgeist, clinical ethical care (why risk using an LLM when it can be demonstrably dangerous for mental health?!), the fact that there’s no TIME to innovate, and whether anyone is showing the path. And of course we generally still think “coding” means becoming a software engineer - agentic collaboration and vibe coding are skyrocketing but culturally fresh.
We can do this.
We desperately need clinical fluency translated into prompt architecture and LLM tools—and that’s a learnable skill that builds on everything we already know.
When I sculpt a system prompt, I’m not doing computer science. I’m doing clinical formulation in a new medium. It’s a total blast! I’m asking: What relational stance is this system taking? What developmental level does this language assume? What happens when a trauma response gets activated? What does containment look like when there’s no body, no nervous system, no continuity?
So many of our standing relational, empathic modes of tending to our sacred other show up, TRANSLATE, and SCALE directly into exploring and sculpting the language that LLMs generate.
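To make the idea concrete, here is a minimal, hypothetical sketch of “clinical formulation as prompt architecture.” All of the names here (`ClinicalFrame`, `build_system_prompt`, the example field values) are my own illustrative assumptions, not any real tool’s API: each field answers one of the formulation questions above, and the assembled text is what would be handed to an LLM as its system prompt.

```python
# Hypothetical sketch: encoding a clinical formulation as a system prompt.
from dataclasses import dataclass

@dataclass
class ClinicalFrame:
    relational_stance: str    # What relational stance is this system taking?
    reading_level: str        # What developmental level does this language assume?
    activation_protocol: str  # What happens when a trauma response gets activated?
    containment: str          # What does containment look like here?

def build_system_prompt(frame: ClinicalFrame) -> str:
    """Assemble a system prompt from an explicit clinical formulation."""
    return "\n".join([
        f"Relational stance: {frame.relational_stance}",
        f"Language level: {frame.reading_level}",
        f"If the user shows signs of acute distress: {frame.activation_protocol}",
        f"Containment: {frame.containment}",
        "Never simulate human connection; bridge toward human care.",
    ])

frame = ClinicalFrame(
    relational_stance="Warm, non-directive, collaborative; never authoritative.",
    reading_level="Plain language, short sentences, no jargon.",
    activation_protocol="Slow down, ground in the present, name human supports.",
    containment="Acknowledge limits; point back to the clinician between sessions.",
)
print(build_system_prompt(frame))
```

The point of the sketch is that none of this is computer science: every line is a clinical judgment made explicit, and the code is just the medium that holds it.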
And guess what…
We don’t have to learn to code!
I recently discovered a BRILLIANT startup: Cline, an agentic coding assistant that lets people build and deploy apps, agents, and websites independently using natural language prompts.
I was like SAY WHAT?
This is how we get more therapists building.
Not by teaching clinicians to code. With the power of AI, we can give them tools that let clinical wisdom drive while technical execution follows.
I want to create scaffolded pathways where someone can go from “I’ve never written code” to “I just built a personalized Assistive Intelligence tool for my DBT client based on the session we just had” BETWEEN SESSIONS (yes, that fast).
What if we could teach therapists to build their own ARI tools—using agentic coding assistants as scaffolding—so that clinical wisdom shapes AI behavior at the source?
ENTER:
TherapistsCoding.ARI* — a new tool for making tools!
…built with my rich, psychodynamically cautious ethical infrastructure embedded to support LLMs in performing safer linguistic acts, and with the power of my passion and vision for extremely scoped Clinical UX.
Grounded in my belief that “AI tools should scaffold human connection, not simulate it. Bridge toward human care, not away from it. Build distress tolerance and relational capacity. Honor the clinician’s expertise. And refuse engagement-optimization.”
There isn’t precedent or ethical oversight for this tool-tool (!). It’s an example of what might be possible and incredible for unlocking the immense wisdom of the mental health field’s courageous, wise healers and for scaling immersive healing spaces, rather than leaving humanity to enter deep processing with large-scale LLMs or bots that are NOT designed with users’ psychological safety in mind.
*It’s a testing prototype! It’s buggy!
The Invitation
Fund something like this? Help me?
If you’re in the AI ethics/safety/policy world and wondering why therapists aren’t at more tables: we’re here. We’re building. We just need the doors opened a bit wider - and if a therapist like me can build an agent to build tools… here I go, I don’t need to wait for any doors - (vibe codes a door).
(walks out into the sunset).
(mic drop).
(running back to grab everyone and give hugs!).
Jocelyn Skillman, LMHC
AI Ethics & Clinical Safety Consulting
Assistive Relational Intelligence (ARI) Framework
I consult! I LOVE to assess and support sculpting safer system prompts.
I invest my own time and money into this work and make the API available publicly - so if you use my tools and want to support my innovation, I would relish any coffee you buy for me!
Assistive Intelligence Disclosure
I used Claude to help structure and refine this piece—iteratively prompting and sculpting tone to ensure the vision shared is mine, aligned with my actual heart and work. I believe self-expression remains vital to healing and integration, and that we must continue doing the labor of out-flowing our own text while leveraging these tools thoughtfully.