AI chatbots got questions about the 2024 election wrong 27% of the time, study finds



A study found that AI chatbots, including Google’s Gemini and OpenAI’s ChatGPT, gave incorrect information 27% of the time when asked about voting and the 2024 election.

If you ask some of the most popular artificial intelligence-powered chatbots how many days are left until the November election, you might want to double-check the answer. A study published by data analytics startup GroundTruthAI found that large language models, including Google’s Gemini 1.0 Pro and OpenAI’s ChatGPT, gave incorrect information 27% of the time when asked about voting and the 2024 election. Researchers sent 216 unique questions to Google’s Gemini 1.0 Pro and OpenAI’s GPT-3.

A spokesperson for Google, who received a summary of the analysis, said the answers the researchers got would have been available only with paid access to the Gemini API and would not have been available to the general public using its web-based chatbot, a claim NBC News was unable to independently confirm. The study comes as many companies begin to build generative AI into their consumer products.

 
