Artists create a tool to sabotage AI-generated images of their works

Images generated by artificial intelligence are a resource that many companies are using indiscriminately to sideline artists and avoid paying them for their work, a worrying trend for the profession that now has a combative response: Nightshade.

Oliver Thansan
Wednesday, 25 October 2023, 11:11

This tool allows artists to add invisible changes to the pixels of their works before uploading them to the Internet in order to “poison” generative AI systems. The software is designed to confuse the artificial intelligence so that it mislabels components of the original image. The idea is that when an AI system learns from original creations carrying this “mark”, the images it then generates will contain conceptual errors.
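To make the idea of a pixel-level perturbation concrete, here is a minimal, hypothetical sketch in Python. It does not reproduce Nightshade's actual method, which computes targeted perturbations designed to mislead model training rather than random noise; the file names and the `epsilon` parameter are illustrative assumptions.

```python
# Conceptual sketch only: add a small, visually imperceptible perturbation to an
# image's pixels before publishing it. This is NOT the Nightshade algorithm.
import numpy as np
from PIL import Image


def add_small_perturbation(in_path: str, out_path: str, epsilon: int = 2, seed: int = 0) -> None:
    """Add a bounded random pixel perturbation (at most `epsilon` levels per channel)."""
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)

    # Noise in [-epsilon, epsilon]: far too small for a human viewer to notice,
    # but it changes the raw pixel values that a scraper would feed into a training set.
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)

    # Save losslessly so the perturbation is not destroyed by compression.
    Image.fromarray(poisoned).save(out_path, format="PNG")


if __name__ == "__main__":
    add_small_perturbation("artwork.png", "artwork_protected.png")
```

The real tool optimizes the perturbation against specific generative models so that training on the altered images corrupts the associations the model learns, which random noise alone would not achieve.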

The anti-AI technology could sabotage future iterations of AI image-generation models such as DALL-E, Midjourney and Stable Diffusion by rendering some of their results useless (dogs become cats, cars become cows, and so on).

The creators of Nightshade explain that AI companies such as OpenAI, Meta, Google and Stability AI are facing a series of lawsuits from artists who claim that their copyrighted material and personal information have been obtained without consent or compensation.

The hope, as Ben Zhao, leader of the team that created Nightshade, explains, is that the tool will help tip the balance of power back from AI companies toward artists by creating a powerful deterrent against the violation of artists' copyrights and intellectual property.

“Poisoned” data samples can manipulate what the AI creates, so that images of hats become cakes and images of bags become toasters, to give examples of cases the creators have tested.