The images have since been removed from the dataset, but AI models are unable to forget the material they're trained on, so it's still possible for them to reproduce elements of those images, including faces, in their outputs. The federal government is expected to unveil proposed changes to the Privacy Act next month, including specific protections for children online.
"From the sample that I looked at, children seem to be over-represented in this dataset, which is indeed quite strange," Ms Hye Jung said."That might give us a clue to how these AI models are able to then produce extremely realistic images of children." One controversial dataset called LAION-5B, contained a total of 5.85 billion images overall. This image was not discovered in the dataset.Some images also came with highly specific captions, often including children's full names, where they lived, hospitals they'd attended, and their ages when the photo was taken.
There are no known reports of actual children's images being reproduced inadvertently, but Dr Lucey said the capability was there. "Where you can't reliably point to where the data has come from, I think that's a really appropriate thing to do," he said. "There's so many examples of AI being used for good, whether it's about discovering new medicines or things that are going to help with climate change."
Images of Australian children from every state and territory were found in the dataset. LAION's spokesperson told the ABC "it's impossible to make conclusions based on tiny amounts of data analysed by HRW". "There are very, very few instances where a breach of privacy leads to regulatory action," said Professor Edward Santow, a former Human Rights Commissioner and current director at the Human Technology Institute.