Facebook explained why its artificial intelligence tools failed to detect the video of the New Zealand mosque shooting that was live streamed on its site last week and viewed 4,000 times before being removed. A suspected gunman killed 50 people in an attack on two mosques in Christchurch.
Facebook has relied on a mix of AI and human review to assess and remove content that violates its policies, and has largely seen success when it comes to removing porn and terrorist propaganda from its site. But Facebook said in the post that training AI to detect mass shooting videos is more challenging than training it to detect nudity, because such systems need a vast amount of example content to learn from, and footage of this kind of event is rare.
Facebook said it will take steps to beef up its detection technology. The company said it used an "experimental audio-based technology which we had been building to identify variants of the video." It also said it will explore whether its AI can be applied to live streamed videos.
Because it’s all artificial and no intelligence.
Because it's an amoral business built off data mining and advertising sales.
The wonder is that none of the 4,000 viewers thought to report the video.
People are unpredictable. If faulty AI becomes actionable, look out!
R2D2 still had the plans for the Death Star?
If their AI can’t detect a video of a mass murder then they shouldn’t be allowed to stream live. The end.
Because Facebook loves to make money on the drama of the evils it promotes on social media, all the rubbish you see on Facebook.
Thought you were just a fucking social media site for Grandma to chat with her grandchildren. The FCC needs to declare you, Twitter, and all INTERNET COMMUNICATION SITES broadcasting agencies and regulate/FINE YOU for violating the 1A.