From Pilot to Proof: Evaluating Sentra's AI Therapy Assistant

Real-world insights from our two-week pilot study

Building AI for mental health isn't just about algorithms and interfaces—it's about understanding whether technology can truly provide comfort when someone is struggling at 3 AM, when traditional support systems aren't available.

That's why we conducted a two-week pilot study with 22 individuals to answer a fundamental question: Can Sentra's AI therapy assistant genuinely help people dealing with trauma, PTSD, and mental health challenges?

The results surprised even us.

Why This Study Mattered

When we started building Sentra, we knew the statistics. Up to 14% of veterans receiving VA care live with PTSD. Mental health services are overwhelmed, with waiting lists stretching weeks or months. First responders often can't access support when they need it most—during sleepless nights or immediately after difficult shifts.

But we also knew that developing mental health technology requires more than good intentions. We needed evidence that our approach actually helps people feel better, not just data points showing user engagement.

So we invited 22 people from diverse backgrounds—veterans, first responders, and individuals with various mental health experiences—to use Sentra for two weeks. We wanted to understand not just if they used it, but whether it genuinely supported them.

What We Discovered

The first thing that struck us was how quickly people felt comfortable with the AI. Within minutes of setup, participants were sharing deeply personal experiences. One veteran told us, "Sentra is your very own therapist in your pocket that will always listen. I felt heard in a way I never have, even with professionals."

This wasn't just about convenience. When we analyzed the data, we found that 82% of participants rated the AI's responses as highly relevant to their specific situations. More importantly, 64% reported immediate stress relief after using Sentra, and not a single person said it increased their anxiety or stress.

Perhaps most significantly, 82% felt secure sharing sensitive information with the AI assistant. In mental health, trust isn't just important—it's everything. Without it, no amount of sophisticated technology matters.

"A safe place to trauma-dump and get thoughtful, practical suggestions."

The engagement patterns told another story. While we expected people might try Sentra once or twice, 55% had between six and ten meaningful conversations during the two weeks. These weren't quick check-ins—82% spent over thirty minutes total with the AI, with nearly half spending more than an hour.

This suggested something we hadn't fully anticipated: people weren't just testing Sentra; they were returning to it, finding genuine value in the conversations.

Standing on the Shoulders of Research

Our results align with a growing body of research showing AI's potential in mental health support. The landmark Woebot study in 2017 demonstrated that a CBT-based chatbot could significantly reduce depression symptoms. More recent research, including a comprehensive 2023 systematic review in npj Digital Medicine and a 2024 meta-analysis in the Journal of Affective Disorders, has continued to validate the effectiveness of AI conversational agents.

But what these studies often miss is the human story behind the statistics. Yes, our participants rated responses as relevant and reported stress relief. But they also described feeling "heard" and finding "practical suggestions" for real-world challenges.

"I enjoyed the chat's quick responses – the whole platform is great."

This feedback pointed to something crucial: effectiveness in mental health AI isn't just about clinical outcomes. It's about creating a space where people feel understood, supported, and empowered to take the next step in their healing journey.

What This Means Moving Forward

These results have fundamentally shaped how we're developing Sentra. We're focusing on enhancing the AI's ability to personalize responses based on individual experiences and needs. We're expanding language support to reach more communities. And we're working on integrating with existing healthcare systems to ensure Sentra complements, rather than replaces, professional mental health care.

But perhaps most importantly, we're continuing to prioritize the human element. Every algorithmic improvement is evaluated not just on technical metrics, but on whether it makes people feel more supported, more understood, more hopeful.

The Bigger Picture

This pilot study validates something we've believed from the beginning: AI can be a powerful tool for mental health support, but only when it's designed with genuine empathy and understanding of human needs.

Sentra isn't trying to replace therapists or eliminate the need for professional mental health care. Instead, we're creating a bridge—a way for people to access immediate, empathetic support when they need it most, whether that's at 3 AM or in the minutes after a difficult call.

For the veterans who've told us they finally feel heard, for the first responders who've found practical coping strategies, for anyone struggling with trauma who now has a safe space to begin processing their experiences—this is just the beginning.

The future of mental health care will likely involve both human professionals and AI companions working together. Our pilot study suggests we're on the right path toward making that future a reality.

If you're interested in being part of this journey, you can join our waitlist. Together, we're building technology that doesn't just process data—it genuinely cares.

Martin Lukac, PhD
Co-founder & CTO, Sentra
