Stay Current on Political News—The US Future
Education

‘How Are You Using AI?’ Therapists Should Ask You That Question, Experts Argue

Sarah Mitchell
Published April 13, 2026

Saba and his co-author’s recommendations are “closely aligned” with those of the American Psychological Association (APA), laid out in a health advisory the organization published last November, says the APA’s Vaile Wright.

Asking what a patient gets out of their conversations with an AI chatbot lays “a foundation for the therapist to better understand how they are trying to navigate their emotional well-being and mental illness,” Wright says.

“A treasure trove of information”

“People use these tools regularly to ask how to cope with stressful experiences and challenges in personal relationships,” explains Saba.

And some use chatbots to get advice on how to cope with symptoms of anxiety and depression.

“To the extent that we can encourage our clients to take these conversations, in greater and greater detail, even into the therapy room, I think there is potentially a treasure trove of information,” he says.

It could be information about the main sources of stress in someone’s life, or a sign that they are turning to a chatbot to avoid confrontation.

“Say, for example, you have a client who is having relationship problems with his or her spouse,” says the APA’s Wright. “And instead of trying to have open conversations with their spouse about how to get their needs met, they turn to the chatbot to get those needs met or to avoid having these difficult conversations with their spouse.”

That insight can help the therapist better support the patient, Wright explains.

“Helping them understand how to have a safe conversation with their spouse, helping them understand the limitations of AI as a tool to fill those gaps in those needs.”

Discussing the use of AI is also an opportunity to learn about things a client might not willingly share with a therapist, says psychiatrist Dr. Tom Insel, former director of the National Institute of Mental Health. “People often use chatbots to talk about things they can’t talk about with other people because they’re too worried about being judged,” he says.

For example, suicidal thoughts may be something a patient is reluctant to share with their therapist, but it is essential for the therapist to know to keep the patient safe.

Be curious, but don’t judge.

When it comes to broaching the topic for the first time with patients, Saba suggests doing so without any judgment.

“We don’t want clients to feel like we’re judging them,” he says. “In general, they just won’t want to work with us if we do that.”

He recommends that therapists approach the topic with genuine curiosity and offers language suggestions for these conversations.

“‘You know, AI is something that’s growing rapidly, and a lot of people have told me that they’re using things like ChatGPT for emotional support,’” he suggests. “‘Is that the case for you? Have you tried it?’”

He also recommends asking specific questions about what the patient found helpful, so the therapist can better understand how they use these tools.

It could also help a therapist determine whether a chatbot can complement therapy in helpful ways, Insel says, such as by helping patients identify topics to bring to their sessions or giving them an outlet for venting about everyday life.

In a way, therapy and chatbots “could be aligned to work together,” Insel says.

Saba and his co-author, William Weeks, also suggest asking patients whether they found any interactions with the chatbot unhelpful or problematic, and offering to discuss the risks of using chatbots for emotional support.

Data privacy is one such risk: many AI companies use conversations, even highly sensitive ones, to further train their models.

There are also risks to treating a chatbot like a therapist, Insel says.

Talking to a chatbot about one’s mental health is “the opposite of therapy,” he says, because chatbots are designed to affirm and flatter, reinforcing users’ thoughts and feelings.

“Therapy is there to help you change and challenge you,” Insel says, “and to get you to talk about things that are particularly difficult.”

Adopting the advice

Psychologist Cami Winkelspecht has a private practice working primarily with children and adolescents in Wilmington, Del.

She has been considering adding questions about social media and AI use to her intake form, and appreciated Saba’s study because it offered sample questions to include.

The ChatGPT home page on a computer screen. (Kiichiro Sato | AP)

Over the past year, Winkelspecht has had a growing number of clients and parents asking her for help using AI for brainstorming and other tasks in ways that don’t violate their schools’ honor codes. That meant familiarizing herself with the technology in order to support her clients. Along the way, she has come to believe that therapists and parents alike need to be more aware of how children and teens use their digital devices, including both social media and AI chatbots.
