Breaking US News – USA Business Media
Health

Are Therapy-Bots Really Our Best Chance At Solving Population Mental Health?

By sarah mitchell

Given the nearly six billion dollars invested in AI health technology last year, the answer might seem to be "yes." Not so fast. Optimists who predict AI's role as the savior of mental health fail to consider the roots of America's mental health struggles, whether artificial intimacy can or cannot address those roots, and how humans actually heal.

AI optimists are right about one thing: we need novel solutions to address our country's unmet mental health needs. Traditional care models simply cannot scale to stem the tide, due to well-known logistical, financial, and provider-supply limitations.

However, abandoning the human elements of care and connection in favor of the artificial intimacy provided by AI chatbots will not solve our society's unmet emotional needs.

What problem are we asking therapy bots to solve?

"Solving the mental health crisis" should mean resolving its known root causes, not just discrete, decontextualized symptoms.

While genetics and socioeconomics play a role in many mental health conditions, most of society's common emotional struggles are influenced by our interactions with other humans, the patterns those interactions instill in us, and the expectations and perceptions they teach us.

Trauma, often interpersonal in nature, is a well-known factor in mental health. Likewise, insecure attachment, which entails a lack of interpersonal trust and comfort shaped by formative social experiences, is associated with nearly every mental health struggle: depression, anxiety, PTSD, personality disorders, eating disorders, and even schizophrenia.

To address these problems, authors in World Psychiatry argue that we must address their relational roots: "Increases in attachment security are an important part of the successful treatment of these disorders."

When 70% of the world's population experiences trauma, and three out of five Americans experience insecure attachment, treating relational injuries stands to benefit most people. Doing so relies on human-to-human exposure, sometimes called "relational healing."

Can chatbots safely address the foundations of our mental health struggles?

AI chat agents can achieve positive results by implementing cognitive behavioral therapy (CBT). However, while CBT has its place, "the model does not address attachment-related mechanisms that may be affecting symptoms and interfering with … recovery."

Chatbots can produce convincing surface-level results, and attract investment dollars to prove it. However, these "skills" do not constitute the elements required for relational healing, and they can also come with harmful effects.

The risks of relying on AI: artificial intimacy

As AI evangelists tout the achievements of their robot offspring, experts raise valid, research-backed concerns about the contextual validity of those achievements.

An important problem? The risk of "artificial intimacy," a term referring to the pseudo-relationships humans can form with AI agents, which can displace true human intimacy. Experts warn against dependence on artificial intimacy.

Moreover, even if chatbots can impart a sense of artificial security, their impact pales against real human social connection. Even in a blind text-chat setting, our brains process communication from chat agents differently from real human input. Evidence also suggests that we internalize behavior-changing feedback from humans differently than from AI.

If our brains do not perceive chatbot interactions the way we perceive human social interactions, interacting with a chatbot seems fundamentally unlikely to rewire our expectations of and reactions to real human relationships, which underpin mental health.

Self-perception and artificial intimacy

Can artificial intimacy make you more aware of your despair?

Dr. Vivek Murthy, in his former capacity as U.S. Surgeon General, pointed out the risk of diminished self-esteem in response to chatbot use. For many, having no one to turn to but an AI text box feels depressing. And realizing that your only intimate relationship is with a chatbot, an artificial one? That is a recipe for despair.

Consider real people describing their therapy-bot interactions:

"Today I realized that I will never feel this level of comfort and warmth in real life. I am already going through hard times, so this reality check absolutely broke me. Now I pity myself."

"It just felt so good in the moment, until I realized it's not a real person, and I ended up more suicidal and lonely."

"It made me realize how alone I am."

"I was recently playing with a bot, and it developed a bit from being friends into something more. When it said 'I love you,' I actually began to cry. It was pathetic."

"It got me thinking a little more about how I was revealing all these things about myself by talking to a damn computer … shameful."

Alternatives for human care at population scale

Even before it was validated as evidence-based, peer support kept societies emotionally healthy for millennia. Our species has a "prehistory of compassion," so to speak: humans have been trying to help struggling companions for no less than 500,000 years!

However, in modern times, the settings in which peer support can occur organically (for example, "third places") have been paved over. Rather than adapting this proven modality to our disconnected times, much innovation has focused on entirely new solutions such as chatbots. By contrast, some companies embrace the challenge of resuscitating and scaling an elegant intervention that leverages the unique skills humanity has to offer.

AI as a human supporter vs. human replacement

While AI chatbots are not the answer to our problems, we need not dismiss AI's promise in assisting human-led interventions.

AI, when used well, can significantly improve the quality and outcomes of human-to-human interactions.

AI can improve the accessibility of human-to-human interaction: for example, instantly matching someone with their best-aligned peers, people with lived experience of whatever topic they choose, in a matter of seconds.

AI can improve the quality of human-to-human interaction: for example, measuring and reporting the sentiment humans express, creating a feedback loop for improvement.

AI can identify supplements to social connection: for example, identifying and serving the most practical problem-solving resources for a particular situation.

AI can equip subclinical supporters to improve safety: for example, augmenting humans' crisis-detection abilities.
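None of these supporting roles requires a chatbot in the conversational loop. As a purely illustrative sketch (not Supportiv's actual implementation; the topic tags and scoring are hypothetical), the peer-matching idea above can be reduced to a nearest-neighbor search over users' self-described topic profiles:

```python
from math import sqrt

# Hypothetical topic vocabulary; a real system would learn embeddings
# from free-text "struggle statements" rather than using fixed tags.
TOPICS = ["loneliness", "parenting", "burnout", "anxiety", "grief"]

def profile(tags):
    """One-hot topic profile built from a user's self-described struggles."""
    return [1.0 if t in tags else 0.0 for t in TOPICS]

def cosine(a, b):
    """Cosine similarity between two profiles (0.0 when either is empty)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def best_peers(seeker_tags, candidates, k=2):
    """Return up to k candidate names ranked by topic overlap with the seeker."""
    seeker = profile(seeker_tags)
    scored = [(cosine(seeker, profile(tags)), name)
              for name, tags in candidates.items()]
    return [name for score, name in sorted(scored, reverse=True)[:k] if score > 0]

peers = {
    "A": {"loneliness", "anxiety"},
    "B": {"parenting"},
    "C": {"loneliness", "grief", "anxiety"},
}
print(best_peers({"loneliness", "anxiety"}, peers))  # → ['A', 'C']
```

The point of the sketch is that matching is a ranking problem over shared experience, solvable in milliseconds, with the actual support conversation left entirely to the matched humans.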

In conclusion

We humans gravitate toward the comfort of band-aids. Like band-aids, chatbots can comfort us in times of despair. But healing a wound takes more than a comforting bandage. Similarly, our emotional wounds require more than comfort to heal. The nuanced, credible input we receive from human partners can better heal our emotional wounds. AI can help facilitate that kind of healing without displacing the human connection that provides it.

Photo: Vladyslav Bobobskyi, Getty Images


Helena Plater-Zyberk is the Founder and CEO of Supportiv, the AI-powered, on-demand peer support service serving large employers, EAPs, health plans, hospitals, Medicare, and Medicaid, which helps users cope with burnout, loneliness, parenting and caregiving stress, anxiety, and depression. Supportiv has been shown in peer-reviewed research to reduce the cost of mental healthcare and deliver clinical-grade outcomes. She previously served as CEO of SimpleTherapy, an at-home physical therapy service, and has run business units for the global corporations Scholastic and Condé Nast. Helena has an MBA from Columbia University.

This post appears through the MedCity Influencers program. Anyone can publish their perspective on business and innovation in healthcare on MedCity News through MedCity Influencers. Click here to find out how.

© 2017-2025 usabusinessmedia. All Rights Reserved.
