
Apr 20, 2025 - 06:06
The Birth of Shala: Creating an AI Mental Health Companion for Digital Wellness

Note: This blog post is our capstone project for Kaggle's 5-Day Gen AI Intensive Course with Google.

Table of contents:

  • Let's Time Travel
  • Why the name Shala?
  • From Chatting to Prototyping
  • Making Shala Human
  • Future Plan(s)
  • A Good Experience
    • Me
    • Anjolie
    • Jordan

Let's Time Travel

Think back to a time in your life when you felt so overwhelmed that you wanted to talk to a professional, but decided not to out of fear of being seen as weak or a wimp by your friends and family. That quiet kind of suffering, the one that sits in your chest while you try to mask it with a smile, is more common than most people admit. For instance, 50% of people who suffer from depression or other mental health illnesses opt out of seeking help due to the stigma surrounding mental health issues. This is exactly why Anjolie (@pixels2bytes), Jordan (@jordancarlin), and I created SHALA (Supportive Help Agent and Lifeline Assistant), an AI mental health chatbot designed to be there when it feels like no one else can be.

Why the name Shala?

After Anjolie, Jordan, and I decided to create a mental health chatbot, we started brainstorming names in our Discord chat. Anjolie suggested "Shala," an acronym for Supportive Help Agent and Lifeline Assistant. Initially, I hesitated, fearing the name wouldn't be unique enough. However, upon discovering that "Shala" is Sanskrit for "home," I realized it perfectly reflected our goal: for users to feel a sense of home when using our chatbot. Of course, coming up with the app's name is not the only thing that made Shala awesome. Join me in reminiscing on how we brought it to life.

From Chatting to Prototyping

Since we wanted Shala to feel approachable, we went with a simple chatbot interface. Pairing a generative AI model like Gemini with Gradio turned out to be a great fit, not just for its flexibility but because the default chatbot design had a warm, inviting feel. The soft color palette and clean layout made it easier to create a space that felt safe, not clinical. From there, we thought things were going to be a smooth ride, but little did we know we were in for a bumpy one.

Making Shala Human

There were so many challenges we came across while building Shala, but if we had to pick one, it would be training Shala to give more empathetic responses. We dug around everywhere for datasets that were clean enough to work with and that actually helped her sound like a real, caring human, not an AI robot reciting facts. This was tricky because Gemini's model is great at being accurate but not so great at being emotionally aware; getting it to respond with warmth instead of just precision was a real struggle. One dataset we found on GitHub seemed promising at first, but it turned out to be all over the place: too inconsistent, too chaotic. It started making Shala's responses weird and unpredictable during testing, so we had to scrap it completely. We eventually overcame this by taking the RedditESS dataset, parsing it into a more readable format, and injecting it into the embedding model, which helped Shala produce genuinely heartwarming responses.
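The pipeline boils down to two steps: parse the dataset into (seeker post, supportive reply) pairs, then embed the posts so the most relevant supportive reply can be retrieved for a new message. Here is a self-contained sketch of that idea; the JSON field names are assumptions (the real RedditESS schema may differ), and the bag-of-words "embedding" is a toy stand-in for the actual embedding model.

```python
import json
import math
from collections import Counter


def parse_redditess(jsonl_text):
    """Parse RedditESS-style JSON lines into (seeker_post, supportive_reply) pairs.
    Field names 'post' and 'reply' are assumed for illustration."""
    pairs = []
    for line in jsonl_text.strip().splitlines():
        rec = json.loads(line)
        pairs.append((rec["post"], rec["reply"]))
    return pairs


def embed(text):
    """Toy bag-of-words vector, standing in for a real embedding model."""
    return Counter(text.lower().split())


def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def best_example(user_message, pairs):
    """Return the supportive reply whose seeker post is most similar to the message."""
    query = embed(user_message)
    return max(pairs, key=lambda p: cosine(query, embed(p[0])))[1]
```

In Shala, the retrieved examples steer the model toward the warm, validating tone of real peer support rather than a recitation of facts.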

Future Plan(s)

In the future, Jordan, Anjolie, and I have a number of enhancements in mind to make Shala even more helpful and user-friendly. For example, we'd like Shala to quickly and accurately surface resources that are relevant to the user's location. This would involve integrating geolocation services with a comprehensive database of location-based resources, so Shala can tailor its responses to where a user lives. By providing geographically relevant resources, we can help ensure that users receive the most appropriate and accessible care and support for their particular situation.
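At its simplest, the feature is a lookup from the user's detected region into a vetted resource table, with a global fallback when we have nothing region-specific. This sketch uses a tiny hypothetical table (a real version would be a maintained database behind a geolocation service); the listed hotlines are real, but the structure is illustrative.

```python
# Hypothetical resource table keyed by ISO country code; a production version
# would live in a curated, regularly reviewed database.
RESOURCES = {
    "US": ["988 Suicide & Crisis Lifeline (call or text 988)"],
    "UK": ["Samaritans (call 116 123)"],
}

# Global fallback when we have no region-specific entries.
DEFAULT = ["Befrienders Worldwide directory (find a helpline by country)"]


def local_resources(country_code):
    """Return support resources for the user's country, falling back to a global list."""
    return RESOURCES.get(country_code.upper(), DEFAULT)
```

The fallback matters: a wrong or empty answer is unacceptable in a crisis, so unknown locations should always get a safe, universally applicable pointer.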

A Good Experience

Overall, building Shala with Jordan and Anjolie was fun. Collaborating with them not only deepened my understanding of AI and machine learning but also highlighted the profound impact that datasets can have on a chatbot's personality and user interactions. Witnessing how our chatbot evolved and adapted based on the data it was trained on was truly fascinating.
If our story got you interested in learning about Gen AI, Google just released a self-guided version of this course. Also, if you want to see more of my, Jordan's, and Anjolie's tech adventures, follow us on dev.to and check the other links below:

Me

Christine Belzie: Technical Writer | Open Source Maintainer | Coding Hobbyist