This tool could protect artists from AI-generated art that steals their style




Robots would come for humans’ jobs. That was guaranteed. The general assumption was that they would take over manual labor: lifting heavy pallets in a warehouse, sorting recycling.

Now significant advances in generative artificial intelligence mean robots are coming for artists, too.

AI-generated images, created with simple text prompts, are winning art contests, adorning book covers, and promoting “The Nutcracker,” leaving human artists worried about their futures.

The threat can feel highly personal. An image generator called Stable Diffusion was trained to recognize patterns, styles and relationships by analyzing billions of images collected from the public internet, alongside text describing their contents.

Among the images it trained on were works by Greg Rutkowski, a Polish artist who specializes in fantastical scenes featuring dragons and magical beings.

Seeing Rutkowski’s work alongside his name allowed the tool to learn his style effectively enough that when Stable Diffusion was released to the public last year, his name became shorthand for users who wanted to generate dreamy, fanciful images.
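To illustrate what that shorthand looked like in practice, here is a minimal sketch of the kind of prompt users wrote, using the open-source `diffusers` library. The model identifier, prompt, and file name below are illustrative assumptions, not a record of any specific user's session.

```python
# Illustrative only: how users appended an artist's name to a text prompt.
# Requires the `diffusers` library and access to a Stable Diffusion checkpoint.
import torch
from diffusers import StableDiffusionPipeline

# Model ID is an assumption for the sketch; any Stable Diffusion checkpoint works.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# The trailing artist name steers the model toward the style it absorbed
# from that artist's captioned images during training.
prompt = "a dragon soaring over a misty castle, epic fantasy, by Greg Rutkowski"
image = pipe(prompt).images[0]
image.save("dragon.png")
```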

One artist noticed that the whimsical AI selfies that came out of the viral app Lensa had ghostly signatures on them, mimicking what the AI had learned from the data it trained on: artists who make portraits sign their work. “These databases were built without any consent, any permission from artists,” Rutkowski said.

Since the generators came out, Rutkowski said he has received far fewer requests from first-time authors who need covers for their fantasy novels. Meanwhile, Stability AI, the company behind Stable Diffusion, recently raised $101 million from investors and is now valued at over $1 billion.

“Artists are afraid of posting new art,” computer science professor Ben Zhao said. Putting art online is how many artists advertise their services, but now they have a “fear of feeding this monster that becomes more and more like them,” Zhao said. “It shuts down their business model.”

That led Zhao and a team of computer science researchers at the University of Chicago to design a tool called Glaze that aims to prevent AI models from learning a particular artist’s style. To design the tool, which they plan to make available for download, the researchers surveyed more than 1,100 artists and worked closely with Karla Ortiz, an illustrator and artist based in San Francisco.

Say, for example, that Ortiz wants to post new work online but doesn’t want it fed to an AI model that could mimic her style. She can upload a digital version of her work to Glaze and choose an art style different from her own, say, abstract.
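The article doesn’t spell out how the cloaking works, but a style-cloaking tool of this kind can be thought of as an adversarial perturbation: a small, nearly invisible change is added to the artwork so that a pretrained image encoder “sees” the chosen decoy style while the piece still looks unchanged to people. The sketch below illustrates only that general idea; the file names, encoder choice, and hyperparameters are illustrative assumptions, not Glaze’s actual implementation.

```python
# Conceptual sketch of style cloaking (NOT Glaze's actual code).
# Idea: optimize a tiny perturbation so a feature extractor's view of the
# artwork drifts toward a chosen decoy style (e.g. abstract).
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# A pretrained encoder stands in for the style features a generative model learns.
encoder = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:16].to(device).eval()
for p in encoder.parameters():
    p.requires_grad_(False)

preprocess = transforms.Compose([
    transforms.Resize((512, 512)),
    transforms.ToTensor(),
])

def load(path):
    return preprocess(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

artwork = load("ortiz_original.png")          # the artist's piece (illustrative filename)
target_style = load("abstract_example.png")   # an image in the chosen decoy style

with torch.no_grad():
    target_feat = encoder(target_style)

eps = 0.03                                    # max per-pixel change (assumed budget)
delta = torch.zeros_like(artwork, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)

for step in range(200):
    cloaked = (artwork + delta).clamp(0, 1)
    # Pull the cloaked image's features toward the decoy style's features.
    loss = F.mse_loss(encoder(cloaked), target_feat)
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)               # keep the perturbation imperceptible

cloaked_image = (artwork + delta).clamp(0, 1).detach()
```

The real tool has to balance how visible the perturbation is against how strongly it shifts the perceived style; the sketch above captures only the core trade-off between a small pixel budget and a large move in feature space.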


