Nightshade Tool Surpasses 250,000 Downloads in Five Days

2024-01-30

Nightshade, a new tool developed by computer science researchers at the University of Chicago, is a free download that lets artists disrupt AI models trained on artwork scraped without permission. The tool was downloaded 250,000 times in the first five days after its release.

"Nightshade has reached 250,000 downloads in the five days since its release," said Ben Zhao, the project's leader and a computer science professor. "I expected a high level of enthusiasm, but I underestimated it... The response has been beyond our imagination."

That is a strong start for a free tool, and it shows that some artists have a clear desire to protect their work from unauthorized use in AI training. According to the Bureau of Labor Statistics, there are more than 2.67 million artists in the United States alone.

"We have not yet conducted a geographical analysis of these downloads," Zhao wrote. "Based on the reactions on social media, these downloads are coming from all over the world."

How Nightshade Works and Why It's Popular

Nightshade alters artwork before it is published on the internet, "poisoning" it at the pixel level so that machine learning algorithms see something completely different: a handbag instead of a cow, for example. When AI models are trained on enough of these poisoned images scraped from the internet, they may begin generating incorrect images in response to user prompts.
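To make the idea concrete, here is a deliberately simplified sketch of pixel-level poisoning. It is not Nightshade's actual algorithm, which crafts optimized, near-imperceptible perturbations against real image models; this toy stands in a crude one-feature "classifier" (mean brightness) and shifts pixels until the label flips. All names and values here are hypothetical.

```python
# Toy illustration of pixel-level data poisoning.
# NOT Nightshade's real method: a crude brightness-based "classifier"
# stands in for a machine-learning model, and the perturbation needed
# is far larger than what a real attack against subtle features requires.

# Hypothetical class prototypes: mean brightness each label is "expected" to have.
CENTROIDS = {"cow": 80.0, "handbag": 170.0}


def classify(img):
    """Label an image (a list of 0-255 pixel values) by whichever
    class prototype its mean brightness is closest to."""
    mean = sum(img) / len(img)
    return min(CENTROIDS, key=lambda label: (mean - CENTROIDS[label]) ** 2)


def poison(img, target="handbag", step=2):
    """Nudge every pixel by `step` per pass, toward the target class's
    prototype brightness, until the classifier reports the target label.
    The image content is unchanged in structure; only the statistic the
    'model' relies on has been pushed across the decision boundary."""
    out = list(img)
    while classify(out) != target:
        direction = 1 if sum(out) / len(out) < CENTROIDS[target] else -1
        out = [min(255, max(0, p + direction * step)) for p in out]
    return out
```

For example, a dark "cow" image (mean brightness near 80) comes back labeled "handbag" after poisoning, even though every pixel moved by the same modest offset. A real poisoning tool aims for shifts small enough that humans notice nothing while the model's learned features are still misled.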

On the Nightshade project page, Professor Zhao and his colleagues—Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng—stated that they developed and released this tool to "increase the cost of unauthorized data training and make obtaining image licenses from creators a viable alternative."

Shortly after its release on January 18, 2024, concurrent demand for Nightshade downloads was so high that the University of Chicago's network servers couldn't handle it. The developers had to add mirror links so people could download copies from another location in the cloud.

Meanwhile, the team's previous tool, Glaze, which subtly alters pixels so that an artist's signature "style" appears as something else to machine learning algorithms and cannot be learned by AI models, has been downloaded 2.2 million times since its release in April 2023, according to Zhao.

What's Next for the Glaze/Nightshade Team?

Zhao and his research partners have already announced their intention to release a tool that combines Glaze (defensive) and Nightshade (offensive).

As for timing, a release is at least a month away.

"Our to-do list is long right now," Zhao wrote. "The combined version needs careful testing, or we can't guarantee that there won't be any surprises down the road. So I think it will take at least a month, maybe longer, for us to complete comprehensive testing."

At the same time, researchers from The Glaze Project advocate that artists use both tools together, disrupting AI model training with Nightshade while protecting their style with Glaze. They are encouraged to see artists doing this, even though running two separate programs is cumbersome.

Zhao explained, "We have warned people that we haven't done comprehensive testing to understand how it works with Glaze, and people should wait before using Nightshade to publish any images. The response from the artist community is, 'We will do it in two steps, using Nightshade first and then Glaze, even though it takes more time and has a more noticeable impact on the artwork.'"

An open-source version of Nightshade may also be in the plans. "We may release an open-source version at some point," Zhao said. "It just takes more time to release different versions."

The project leader noted that he and his colleagues do not expect any direct response from the companies behind AI image generation technology, such as OpenAI (DALL-E 2), Midjourney, and Stability AI (Stable Diffusion).