Artists and computer scientists are testing a new way to stop artificial intelligence from ripping off copyrighted images: "poisoning" the AI models with visions of cats. A tool called Nightshade, released in January by University of Chicago researchers, alters images in small ways that are nearly invisible to the human eye but look dramatically different to the AI platforms that ingest them.
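The general idea behind such tools can be illustrated with a toy sketch. This is not Nightshade's actual algorithm; the image, classifier, weights, and `epsilon` budget below are all invented for illustration. It shows only the underlying principle: a per-pixel change of at most 1% can flip a simple model's label while leaving the image essentially unchanged to a human.

```python
import numpy as np

# Hedged illustration (NOT Nightshade's real method): a tiny, nearly
# invisible pixel-level change flips a toy linear classifier's label.

# A toy 4x4 grayscale "image" with pixel values in [0, 1].
image = np.full((4, 4), 0.5)
image[0, 0] = 0.52   # nudges the score just above the decision boundary

# A toy linear classifier: positive score -> "dog", otherwise "cat".
weights = np.ones((4, 4))
bias = -8.0          # the sum of a flat 0.5-valued 4x4 image is exactly 8.0

def predict(img):
    return "dog" if float(np.sum(weights * img) + bias) > 0 else "cat"

# Perturb each pixel against the model's weights (an FGSM-style step),
# capped at 0.01 per pixel so the change is nearly invisible.
epsilon = 0.01
poisoned = np.clip(image - epsilon * np.sign(weights), 0.0, 1.0)

print(predict(image))                  # "dog"
print(predict(poisoned))               # "cat"
print(np.abs(poisoned - image).max())  # 0.01
```

Real poisoning attacks target large neural networks rather than a hand-built linear model, but the core trade-off is the same: the perturbation budget stays small enough that humans see the original picture while the model sees something else.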
"If you are a creator of any type, if you take photos, for example, and you don't necessarily want your photos to be fed into a training model — or your children's likenesses, or your own likenesses to be fed into a model — then Nightshade is something you might consider," said Ben Zhao, the University of Chicago computer science professor who led the tool's development. The tool is free to use, and Zhao said he intends to keep it that way. The companies behind models like Stable Diffusion already offer "opt-outs" that let artists ask for their content to be excluded from training datasets.