‘AI Art’ Companies & DeviantArt Are Being Sued By Artists

Screenshot: Futurama

Stability AI and Midjourney—two of the biggest names in the exploding field of AI-generated imagery—and portfolio site DeviantArt have become the target of a class action lawsuit, filed in California on behalf of artists.

As we’ve covered previously, AI-generated imagery is a highly contentious field, one where “artists write algorithms not to follow a set of rules, but to ‘learn’ a specific aesthetic by analyzing thousands of images. The algorithm then tries to generate new images in adherence to the aesthetics it has learned.”

The suit has been brought by three plaintiffs, all artists: Sarah Andersen (of Sarah's Scribbles), Kelly McKernan and Karla Ortiz, who on behalf of all affected artists are "seeking compensation for damages caused by Stability AI, DeviantArt, and Midjourney, and an injunction to prevent future harms".

It makes numerous and serious allegations:

The lawsuit alleges direct copyright infringement, vicarious copyright infringement related to forgeries, violations of the Digital Millennium Copyright Act (DMCA), violation of class members’ rights of publicity, breach of contract related to the DeviantArt Terms of Service, and various violations of California’s unfair competition laws.

As lawyer Matthew Butterick says in his post about the case, the three companies are being targeted by "writers, artists, programmers, and other creators" who are "concerned about AI systems being trained on vast amounts of copyrighted work with no consent, no credit, and no compensation."

Stability AI (who make Stable Diffusion) and Midjourney are the companies behind the two most popular AI-generated art platforms, while DeviantArt—most commonly known as a portfolio and community art site—is being included for its own work in fucking up massively.

For more technical details on the suit—or contact details if you’re an artist and would like to get involved—you can check out Butterick’s post here.
