October 6, 2025

Her 6-year-old son told her that he wanted to die. So she built an AI business to save him

The booming world of AI-powered mental health support is a minefield. From chatbots dispensing dangerously incorrect medical advice to AI companions encouraging self-harm, the headlines are full of cautionary tales.

High-profile apps like Character.AI and Replika have faced backlash for harmful and inappropriate responses, and academic studies have sounded the alarm.

Two recent studies from Stanford University and Cornell found that AI chatbots often stigmatize conditions such as alcohol dependence and schizophrenia, respond “inappropriately” to certain prompts, and can “encourage clients’ delusional thinking.” The researchers warned against the risks of relying on AI without human oversight.

Against this backdrop, Hafezah Muhammad, a Black woman, is building something different. And she is doing it for painfully personal reasons.

“In October 2020, my son, who was six years old, came to me and told me that he wanted to kill himself,” she says, her voice still carrying the weight of that moment. “My heart broke. I didn’t see it coming.”

At the time, she was an executive at a national mental health company, someone who knew the system inside and out. Even so, she couldn’t get her son, who has a disability and is on Medicaid, into care.

“Only 30% of providers, or fewer, even accept Medicaid,” she explains. “More than 50% of children in the United States now come from multicultural households, and there were no solutions for us.”

She says she was terrified, embarrassed, and worried about the stigma of having a child who was struggling. So she built the thing she couldn’t find.

Today, Muhammad is the founder and CEO of Backpack Healthcare, a Maryland-based provider that has served more than 4,000 pediatric patients, most of them on Medicaid. It is a company betting its future on the radical idea that technology can support mental health without replacing the human touch.

A practical tool, not a replacement therapist

On paper, Backpack looks like many other telehealth startups. In reality, its approach to AI is deliberately pragmatic, focused on “boring” but impactful applications that support human therapists.

An algorithm matches children with the best possible therapist on the first try (91% of patients stay with their first match). The AI also drafts treatment plans and session notes, giving clinicians back the hours they were losing to paperwork.

“Our providers were spending more than 20 hours per week on administrative tasks,” Muhammad explains. “But they are the editors.”

This human-in-the-loop approach is at the heart of Backpack’s philosophy.
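
Backpack hasn’t published the details of its matching algorithm, but as described it is, in spirit, a weighted-scoring problem: score each child–therapist pairing on a handful of fit criteria and pick the highest. The sketch below is purely illustrative; the attributes (language, specialties, Medicaid acceptance, availability) and the weights are assumptions, not Backpack’s actual criteria.

```python
# Illustrative sketch only -- Backpack's real matching criteria and weights are not public.
from dataclasses import dataclass, field

@dataclass
class Therapist:
    name: str
    languages: set = field(default_factory=set)
    specialties: set = field(default_factory=set)
    accepts_medicaid: bool = True
    open_slots: int = 0

@dataclass
class ChildProfile:
    language: str
    needs: set        # e.g. {"anxiety", "school avoidance"}
    insurance: str    # e.g. "medicaid"

def match_score(child: ChildProfile, therapist: Therapist) -> float:
    """Score one child/therapist pairing; higher means a better fit (weights are assumed)."""
    score = 0.0
    if child.language in therapist.languages:
        score += 3.0                                       # shared language weighs heavily
    score += 2.0 * len(child.needs & therapist.specialties)  # clinical fit
    if child.insurance == "medicaid" and therapist.accepts_medicaid:
        score += 1.5                                       # coverage actually works
    if therapist.open_slots > 0:
        score += 1.0                                       # can be seen soon
    return score

def best_match(child: ChildProfile, roster: list[Therapist]) -> Therapist:
    """Return the highest-scoring therapist on the roster for this child."""
    return max(roster, key=lambda t: match_score(child, t))
```

However the scoring is actually done, the key point from Backpack’s description is that the algorithm only proposes the pairing; humans remain the editors of everything the AI produces.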

Backpack’s most critical differentiator lies in its robust ethical guardrails. Its 24/7 AI care companion is represented by “Zipp,” a friendly cartoon character, a deliberate choice to avoid the dangerous “illusion of empathy” seen in other chatbots.

“We wanted to make it clear that it is a tool, not a human,” Muhammad explains.

Investor Nans Rivat of Pace Healthcare Capital calls this the “LLM empathy” trap, where users “forget that you are talking to a tool at the end of the day.” He points to cases like Character.AI, where a lack of these guardrails has led to “tragic” results.

Muhammad is also categorical about data privacy. She explains that individual patient records are never shared without explicit, signed consent. The company does, however, use aggregated, anonymized data to report trends to its partners, such as how quickly a group of patients was scheduled for care.

More importantly, Backpack uses its internal data to improve clinical outcomes. By tracking measures such as anxiety and depression levels, the system can flag a patient who may need a higher level of care, ensuring the technology is used to help children get better, faster.

Crucially, Backpack’s system also includes an immediate crisis-detection protocol. If a child types a phrase indicating suicidal thoughts, the chatbot instantly responds with crisis hotline numbers and instructions to call 911. Simultaneously, an “immediate distress message” is sent to the human crisis response team, which contacts the family directly.
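
As described, the protocol amounts to a simple branch in the message pipeline: detect, reply with scripted crisis resources, and alert humans in parallel. The sketch below is purely illustrative and is not Backpack’s code; the trigger phrases, the `notify_crisis_team` helper, and the companion-reply stub are all assumptions made for the example.

```python
# Illustrative sketch only -- not Backpack's actual implementation.
CRISIS_PHRASES = {"want to die", "kill myself", "hurt myself"}  # assumed example triggers

CRISIS_REPLY = (
    "If you are thinking about hurting yourself, please call or text 988 "
    "(Suicide & Crisis Lifeline) or call 911 right now. A person from our "
    "care team has been notified and will reach out to your family."
)

def notify_crisis_team(child_id: str, message: str) -> None:
    """Stand-in for the 'immediate distress message' sent to the human crisis team."""
    print(f"[ALERT] child={child_id} flagged message: {message!r}")

def handle_message(child_id: str, message: str) -> str:
    """Route one chat message: crisis branch first, normal companion chat otherwise."""
    text = message.lower()
    if any(phrase in text for phrase in CRISIS_PHRASES):
        notify_crisis_team(child_id, message)    # humans are looped in immediately
        return CRISIS_REPLY                      # the bot never tries to counsel alone
    return generate_companion_reply(message)     # normal Zipp conversation path

def generate_companion_reply(message: str) -> str:
    """Placeholder for the AI companion's ordinary response."""
    return "Zipp: thanks for sharing that with me!"
```

The design point the sketch is meant to capture is that the chatbot’s role in a crisis is deliberately narrow: surface the hotline numbers and hand the situation to people.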

“We are not trying to replace a therapist,” Rivat explains. “We are adding a tool that did not exist before, with built-in safety.”

Building a human backbone

Beyond its ethical technology, Backpack is also tackling the national therapist shortage. Unlike doctors, therapists have traditionally had to pay out of pocket for the expensive supervision hours required to obtain a license.

To combat this, Backpack launched its own two-year residency program that covers these costs, creating a pipeline of dedicated, well-trained therapists. More than 500 people apply each year, and the program has an impressive 75% retention rate.

In 2021, then-Surgeon General Dr. Vivek H. Murthy called youth mental health “the defining public health issue of our time,” referring to the mental health crisis raging among young people.

Muhammad doesn’t dismiss the criticism that AI could make things worse.

“Either someone else will build this technology without the right guardrails, or I can, as a mom, make sure it’s done right,” she says.

Her son is now 11 years old, thriving, and serves as Backpack’s “Child Innovator.”

“If we do our job well, they don’t need us forever,” Muhammad explains. “We give them the tools now, so they become resilient adults. It’s like teaching them to ride a bike. You learn it once, and it becomes part of who you are.”


