"Nightshade Tool Launched to Safeguard Works from AI Infringement"

2024-01-23

Months after it was first announced, a free software tool called Nightshade has finally been released, allowing artists to "poison" artificial intelligence models.

The tool was developed by computer scientists on the Glaze Project at the University of Chicago, led by Professor Ben Zhao. Its working principle is to pit artificial intelligence against artificial intelligence: it uses the popular open-source machine learning framework PyTorch to identify what is in a given image, then applies a tag that subtly alters the image at the pixel level, so that other AI programs see something completely different from what is actually there.
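
The team has not published its exact algorithm in this article, but the general idea of a pixel-level perturbation that changes what a model "sees" can be sketched in a few lines of PyTorch. The snippet below is purely illustrative: it assumes a pretrained ImageNet classifier as a stand-in for the model inspecting the image, a hypothetical file cow.jpg, an arbitrary target class, and an arbitrary perturbation budget. It is not the Nightshade implementation, which works against generative image models rather than a simple classifier, but the pit-one-AI-against-another structure is similar.

```python
# Illustrative sketch only: a targeted adversarial perturbation against a
# pretrained classifier, standing in for the idea of altering pixels so an
# AI model "sees" something different. Not the actual Nightshade algorithm.
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

# Pretrained classifier used as the "observer" model (an assumption).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
mean = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

# Hypothetical input image and target class (414 is "backpack" in ImageNet).
image = TF.to_tensor(Image.open("cow.jpg").convert("RGB").resize((224, 224))).unsqueeze(0)
target = torch.tensor([414])

delta = torch.zeros_like(image, requires_grad=True)
epsilon = 4 / 255  # assumed per-pixel perturbation budget

for _ in range(50):
    logits = model(((image + delta).clamp(0, 1) - mean) / std)
    loss = F.cross_entropy(logits, target)  # push the prediction toward the target class
    loss.backward()
    with torch.no_grad():
        delta -= (1 / 255) * delta.grad.sign()  # small signed gradient step
        delta.clamp_(-epsilon, epsilon)         # keep the change visually subtle
        delta.grad.zero_()

poisoned = (image + delta).clamp(0, 1)  # near-identical to the original for a human
```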

This is the team's second tool of its kind. Nearly a year ago, they released a separate program called Glaze, which alters digital artworks at the artist's request so that AI training algorithms perceive a different artistic style, for example different colors and brushstrokes, than the one actually present.

However, whereas the Chicago team designed Glaze as a defensive tool, Nightshade is designed as an "offensive" one. The team still recommends that artists use Glaze alongside Nightshade to prevent AI models from imitating their styles.

If an AI model ends up being trained on many images modified, or "shaded," by Nightshade, the model may start misclassifying objects for all of its users, even in images that have not been shaded.

The team explains further: "For example, human eyes may see a shaded image of a cow in a green field largely unchanged, but an AI model may see a large handbag lying in the grass."

As a result, when a user asks the AI model to generate an image of a cow, a model that has come to see shaded cow images as handbags will start generating handbags instead of cows.

Requirements and how Nightshade works

Artists who want to use Nightshade need a Mac with an Apple silicon chip (M1, M2, or M3) or a PC running Windows 10 or 11. The Windows version can also run on the PC's GPU, provided it is an Nvidia GPU on the tool's supported hardware list.

Due to high demand for the tool, some users have reported long download times, in some cases up to eight hours (the Mac and PC versions are 255MB and 2.6GB, respectively).

Users must also agree to the Glaze/Nightshade team's end-user license agreement (EULA), which stipulates that they may use the tool only on machines under their control, must not modify the underlying source code, and must not "copy, reproduce, distribute, resell, or otherwise use this software for any commercial purpose."

Nightshade v1.0 "transforms images into 'poisoned' samples, causing [AI] models trained on them without consent to exhibit unpredictable behavior deviating from expected norms. For example, a prompt requesting an image of a cow flying in space might yield an image of a floating handbag," the development team stated in a blog post on their website.

In other words, "shading" an image with Nightshade v1.0 transforms it into a new version with the help of open-source AI libraries. Ideally, the change is subtle enough that the shaded image looks essentially the same to the human eye, yet appears to contain an entirely different subject to any AI model trained on it.
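
As a rough illustration of the "visually similar to the human eye" half of that claim, one could compare an original and a shaded image at the pixel level, for instance with a peak signal-to-noise ratio check. The file names and the rule-of-thumb threshold below are assumptions for illustration; none of this is part of the Nightshade tool itself.

```python
# Rough illustration: quantify how close a shaded image stays to the
# original at the pixel level. File names are placeholders, and the two
# images are assumed to have the same dimensions.
import torch
import torchvision.transforms.functional as TF
from PIL import Image

original = TF.to_tensor(Image.open("cow_original.png").convert("RGB"))
shaded = TF.to_tensor(Image.open("cow_shaded.png").convert("RGB"))

# Peak signal-to-noise ratio: a crude proxy for visual similarity;
# roughly, values above ~35 dB are often hard to tell apart by eye.
mse = torch.mean((original - shaded) ** 2)
psnr = 10 * torch.log10(1.0 / mse)
print(f"PSNR between original and shaded image: {psnr.item():.1f} dB")
```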

Furthermore, the effect resists most of the typical conversions and modifications that users or viewers may apply to an image. As the team explains, "You can crop, resample, compress, smooth pixels, or add noise, and the poison effect remains. You can take screenshots or even photograph images displayed on a monitor, and the shade effect remains. This is because it is not a watermark or hidden information (steganography)."
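
One way to probe the kind of robustness the team describes is to apply everyday edits to a shaded image and check whether it still sits far from the original in a model's feature space. The sketch below is a test-harness idea built on our own assumptions, a pretrained ResNet encoder as a stand-in feature extractor and placeholder file names; it is not anything shipped with Nightshade.

```python
# Test-harness sketch: does a shaded image still "look different" to a model
# after common edits? Uses a pretrained ResNet encoder as a stand-in feature
# extractor; file names are placeholders.
import io
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # keep the embedding, drop the classifier head
encoder.eval()

def embed(img):
    x = TF.to_tensor(img.resize((224, 224))).unsqueeze(0)
    with torch.no_grad():
        return encoder(x)

def jpeg(img, quality=75):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")

original = Image.open("cow_original.png").convert("RGB")
shaded = Image.open("cow_shaded.png").convert("RGB")

edits = {
    "as-is": lambda im: im,
    "jpeg q75": jpeg,
    "crop": lambda im: im.crop((20, 20, im.width - 20, im.height - 20)),
    "downscale": lambda im: im.resize((im.width // 2, im.height // 2)),
}

ref = embed(original)
for name, fn in edits.items():
    # Lower similarity suggests the edited, shaded image still reads as a
    # different subject to the model than the original does.
    sim = F.cosine_similarity(ref, embed(fn(shaded)))
    print(f"{name:10s} cosine similarity to original: {sim.item():.3f}")
```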

Applause and condemnation

While some artists have downloaded Nightshade v1.0 and started using it, other internet users have voiced complaints, viewing it as an attack on AI models and the companies behind them.

The Glaze/Nightshade team denies any destructive intent, writing, "Nightshade's goal is not to break models but to increase the cost of training on unauthorized data, making licensing images from creators a viable alternative."

In other words, the aim is to push AI model developers to pay artists for permission to train on their unaltered works.

The team further explains its ultimate goal: "Used responsibly, Nightshade can help deter model trainers who disregard copyrights, opt-out lists, and do-not-scrape/robots.txt directives. It does not rely on the goodwill of model trainers, but instead attaches a small incremental price to each piece of data scraped and trained on without authorization."

In essence, the intention is to raise the price of unlicensed training high enough that AI model makers think twice and come to see licensing agreements with human artists as the more viable alternative.

Of course, Nightshade cannot turn back the clock: any artworks scraped before being shaded with the tool have already been used to train AI models. Shading them now can only affect future models, and only if those images are scraped again and used to train updated versions of AI image generators.

On a technical level, nothing prevents someone from using Nightshade to shade AI-generated works or works they did not create, opening the door to potential misuse.