Much like the plant it’s named after, this new tool’s sole purpose is to poison and confuse AI image generators. Why? Because AI image generators are trained on enormous pools of images scraped from all over the web, and they draw on that training data to produce the image you prompt for. But wait. Where did this pool of images come from in the first place? Umm, yes, human artists. Remember them? Well, to date, the only tool real human artists had to protect their artwork was lawsuits. LOL. That’s until Ben Zhao from the University of Chicago went ‘This is Sparta’ and developed Nightshade. How does it work? It attacks the powerhouse of an image - the pixels! Nightshade manipulates images at the pixel level, making the changes really difficult to spot with the naked eye.
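To get a feel for the idea of “invisible” pixel changes, here’s a toy sketch in plain Python. This is NOT Nightshade’s actual algorithm (which crafts targeted adversarial perturbations against a model); it just shows how every pixel of an image can be nudged by an amount too small for a human to notice, while the raw numbers a model trains on still change. The function name and `epsilon` budget are made up for illustration.

```python
import random

# Toy illustration only -- NOT Nightshade's real method. Each pixel value
# (0-255) is nudged by at most `epsilon` levels, so the picture looks
# identical to a human but the underlying data a model ingests has shifted.
def perturb_pixels(pixels, epsilon=2, seed=0):
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    return [max(0, min(255, p + rng.randint(-epsilon, epsilon)))
            for p in pixels]

flat_grey = [128] * 12              # a tiny flat-grey "image"
poisoned = perturb_pixels(flat_grey)

# No pixel moved by more than epsilon levels out of 255 -- imperceptible.
print(max(abs(a - b) for a, b in zip(flat_grey, poisoned)))  # prints 2 or less
```

The real tool goes much further: instead of random noise, it optimizes the perturbation so the model learns the *wrong* association for a concept, which is why the mix-ups below happen.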
According to Zhao’s team, when the ‘poisoned’ images were fed to a platform such as Stability AI's Stable Diffusion XL (SDXL), chaos ensued. Suddenly, the model couldn't tell a "car" from a "cow," and a "dog" was mysteriously transformed into a "cat." Even a simple "hat" got a makeover, turning into a delectable "cake". So it turns out AI can be blinded after all. Established platforms like Midjourney, DALL-E 3 and SDXL have already been trained (and subsequently sued by many artists for using their copyrighted work), so Nightshade aims to take the fight to the future instead. I wonder how ancient artists fought against photographers when cameras were invented. Hopefully, no one poisoned them.