Rogue AI bot is giving recipes for human flesh and chlorine gas



New Zealand's Pak ‘n’ Save Savey Meal-bot cheerfully created some horrifying recipes when customers experimented with non-grocery household items.

Interesting Engineering also tried entering the same ingredients as X user Liam Hehir, but was met with a message from the bot that read: “Invalid ingredients found, or ingredients too vague. Please try again!”

Like all large language models, GPT continues to learn and train on vast amounts of data even as it operates. Over time, this makes it less prone to mistakes such as handing out recipes that could result in someone’s death.

The spokesperson added that the company would “keep fine-tuning our controls” of the bot to ensure it was safe and useful. Before a user can enter their list of ingredients, the website does warn that the recipes are generated by an AI and are not reviewed by a human. “To the fullest extent permitted by law, we make no representations as to the accuracy, relevance or reliability of the recipe content that is generated, including that portion sizes will be appropriate for consumption or that any recipe will be a complete or balanced meal, or suitable for consumption,” reads the bot’s disclaimer. “You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot.”

 


