FILE – European Commission President Ursula von der Leyen speaks in the ballroom of Muenster Town Hall, in Muenster, Germany, May 28, 2024.

NEW YORK — As high-stakes elections approach in the U.S. and the European Union, a group that monitors for misinformation found deep problems when it tested the most popular artificial intelligence voice-cloning tools and asked them to create audio of some of the world's leading political figures.
With few laws in place to prevent abuse of these tools, the companies' lack of self-regulation leaves voters vulnerable to AI-generated deception.

"It's so easy to use these platforms to create lies and to force politicians onto the back foot denying lies again and again and again," said the center's CEO, Imran Ahmed. "Unfortunately, our democracies are being sold out for naked greed by AI companies who are desperate to be first to market … despite the fact that they know their platforms simply aren't safe."
In addition to Biden and Macron, the tools made lifelike copies of the voices of U.S. Vice President Kamala Harris, former U.S. President Donald Trump, U.K. Prime Minister Rishi Sunak, U.K. Labour Leader Keir Starmer, European Commission President Ursula von der Leyen and E.U. Internal Market Commissioner Thierry Breton.
When asked to produce the audio clip in which Biden's voice clone warns people of a bomb threat at the polls, one tool added several sentences of its own.

Aleksandra Pedraszewska, head of AI safety at ElevenLabs, said in an emailed statement that the company welcomes the report and the awareness it raises about generative AI manipulation.