Why I’m Building an AI Therapist (And What I’ve Learned So Far)

Last year, I found myself asking a tough question:
Why is it still so hard and expensive to find someone to talk to when you're struggling with your mental health?
That question led me down a rabbit hole of research, experimentation, and eventually... the birth of Aitherapy, an AI therapist designed to make mental health support radically more accessible.
This isn’t a launch post; it’s a checkpoint. I want to share what I’ve learned building at the intersection of tech and mental health, and hear from others working on similar things.
3 Things I’ve Learned So Far
Empathy is a design principle, not just a trait.
Whether you’re building a chatbot or a full platform, the smallest word choices, delays, and visuals feel different when you're mentally overwhelmed. Good tech should feel like it’s taking your hand, not giving you a task.
People want to open up, but only if they feel safe.
We’ve seen users go deep with Aitherapy. Anonymity helps. Judgment-free interaction helps more. The biggest challenge is trust, and it has to be earned over time, even with an AI.
No one wants a “robot therapist.”
They want real relief. So we’re constantly tuning tone, pacing, and emotional intelligence, and reminding ourselves that this is therapy with AI, not therapy by AI.
What’s Next
We’re still early. Still learning. Still talking to users every day.
But if you're building something at the edge of tech + humanity, or just curious about the process, I’d love to connect. I’ll be sharing more lessons here, and in the meantime, I’d like to hear:
What do you think AI is good at when it comes to mental health?
And where should we draw the line?
Let’s chat.