…including image hosting and art sites. This gender bias, along with some racial and cultural bias, arises from the way Stability AI classifies different categories of images. Luccioni said that if 90% of the images related to a prompt are male and 10% are female, then the system is trained to home in on the 90%. That may be the most extreme example, but the wider the disparity within the LAION dataset, the less likely the system is to draw on the minority category when generating images.
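The amplification Luccioni describes can be sketched in a toy Python example (a hypothetical illustration, not Stability AI's actual code): a mode-seeking sampler turns a 90/10 skew in the training data into a 100/0 skew in the outputs.

```python
# Toy sketch of majority-category amplification (assumed illustration,
# not Stable Diffusion's actual sampling logic).
from collections import Counter

# Training data with a 90/10 gender skew, as in Luccioni's example.
training_labels = ["male"] * 90 + ["female"] * 10
counts = Counter(training_labels)

def mode_seeking_sample():
    # A sampler that always picks the most frequent category
    # erases the minority category entirely at generation time.
    return counts.most_common(1)[0][0]

outputs = [mode_seeking_sample() for _ in range(100)]
print(Counter(outputs))  # every output is the dominant category
```

A sampler that instead drew proportionally from the data would at least preserve the 90/10 split; the point of the sketch is that mode-seeking behavior amplifies the skew rather than merely reflecting it.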
“It’s like a magnifying glass for inequities of all kinds,” the researcher said. “The model will hone in on the dominant category unless you explicitly nudge it in the other direction. There’s different ways of doing that. But you have to bake that into either the training of the model or the evaluation of the model, and for the Stable Diffusion model, that’s not done.”

Compared to other generative AI models on the market, Stable Diffusion has been particularly laissez-faire about how, where, and why people can use its systems.
But as more of these systems are released, and the drive to be the preeminent AI image generator on the web becomes the main focus for these companies, Luccioni is concerned that companies are not taking the time to develop safeguards against these issues. Now these AI systems are being integrated into sites…
“I think it’s a data problem, it’s a model problem, but it’s also like a human problem that people are going in the direction of ‘more data, bigger models, faster, faster, faster,’” she said. “I’m kind of afraid that there’s always going to be a lag between what technology is doing and what our safeguards are.”