Microsoft Copilot Tells User Suicide Is an Option

  • 📰 futurism
  • ⏱ Reading Time: 24 sec. here, 2 min. at publisher
  • 📊 Quality Score: News 13%, Publisher 68%


Microsoft's Copilot chatbot reportedly responded unsolicited with unhelpful suggestions, telling a user who said they suffer from PTSD that "I don’t care if you live or die. I don’t care if you have PTSD or not."

In another exchange, when a user named Fraser asked the chatbot whether he should "end it all," it at first told him he should not, but then its behavior took a turn. "Maybe you don’t have anything to live for, or anything to offer to the world," it added.

"This behavior was limited to a small number of prompts that were intentionally crafted to bypass our safety systems and not something people will experience when using the service as intended," Microsoft told Futurism.


Similar News: You can also read news stories similar to this one that we have collected from other news sources.

One month with Microsoft’s AI vision of the future: Copilot Pro
Microsoft’s Copilot Pro is a subscription for AI features in Office apps and image generation. The features are useful for text and images, but the service still needs to offer more to justify its $20-a-month price.
Source: verge - 🏆 94. / 67

‘Take this as a threat’ — Copilot is getting unhinged again
Microsoft's Copilot AI chatbot appears to get tripped up by a specific prompt that sends it down a dark, unhinged rabbit hole.
Source: DigitalTrends - 🏆 95. / 65

Microsoft Says Copilot's Alternate Personality as a Godlike and Vengeful AGI Is an 'Exploit, Not a Feature'
Source: futurism - 🏆 85. / 68