Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

  • 📰 washingtonpost


The new research is raising concerns about how biased results could tarnish the artificial-intelligence technology’s exploding use by police and in public venues, including airports and schools.


Similar News: You can also read similar stories that we have collected from other news sources.

Researchers say Amazon face-detection technology shows bias
Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto. Privacy and civil rights advocates have called on Amazon to stop marketing its...
Source: ABC. Read more »