Lying To Your Therapist Is Being Superseded By Telling The Truth To Generative AI

📰 ForbesTech

Topics: Artificial Intelligence, Large Language Models (LLMs), Generative AI, ChatGPT

Dr. Lance B. Eliot is a world-renowned expert on Artificial Intelligence (AI) with over 7.4 million amassed views of his AI columns. A seasoned CIO/CTO executive and high-tech entrepreneur, he combines practical industry experience with deep academic research.

In today’s column, I am continuing my ongoing series about the impact of generative AI in the health and medical realm. The focus this time is once again on the mental health domain and examines the eyebrow-raising phenomenon that people will lie to their therapist, something they presumably shouldn’t be doing, while tending to be markedly truthful to generative AI when seeking mental health advice.

“The primary aim of this study was to investigate one facet of a survey of client lying in psychotherapy, that which focused on the nature, motivation, and extent of client dishonesty related to psychotherapy and the therapeutic relationship.”

“Most extreme in their extent of dishonesty were lies regarding romantic or sexual feelings about one’s therapist, and not admitting to wanting to end therapy.”

“Results suggest that attachment style plays a significant role in determining individuals’ likelihood of discussing personally distressing topics online and in determining the extent to which they find disclosures in therapy and anonymous and non-anonymous online spaces to be helpful.”

“You tell me you are worried about your drinking. I ask you how many drinks you are having each night, and you tell me 1, but you are actually drinking 3-4, and report that your antidepressant isn’t helping. How can I really help?”

There are barriers to being truthful. You might want to refrain from hurting someone’s feelings. You might be embarrassed about something that you did and do not want to confess to it. And so on. Per an article dated July 25, 2019, these are some notable facets of the therapist-client relationship and how lies enter the picture:

Here are the biggest motivators, according to experts: they don’t want to deal with the consequences, they’re in denial, they don’t want to relive their trauma, and they want their therapist to like them.

To make matters more confounding, you don’t have to bring up a mental health topic to have generative AI proceed to give you mental health advice. There are lots of ways that generative AI can become triggered, in a computational pattern-matching way, such that the AI suddenly offers therapeutic commentary. There is no need for the user to enter an explicit prompt to get this. It can happen for a slew of other reasons (see my earlier discussion).

People at times act as though they are using a private confessional.

Many of those assumptions about generative AI aren’t fully true, but they are what people seem to believe to be the case. I’ll be repeating this cautionary note for each of these depictions. Similar to the above-noted logic, people might feel freer to lambast the AI if the mental health advice seems ineffective. Such assumptions about generative AI aren’t fully true, but they are what people seem to believe to be the case.

 
