Today The Verge reported that stock image behemoth Getty Images is suing Stability AI, the company behind Stable Diffusion, for illegally scraping its content:

In a press statement shared with The Verge, the stock photo company said it believes that Stability AI “unlawfully copied and processed millions of images protected by copyright” to train its software and that Getty Images has “commenced legal proceedings in the High Court of Justice in London” against the firm.

Getty Images CEO Craig Peters told The Verge in an interview that the company has issued Stability AI with a “letter before action” — a formal notification of impending litigation in the UK. (The company did not say whether legal proceedings would take place in the US, too.)

“The driver of that [letter] is Stability AI’s use of intellectual property of others — absent permission or consideration — to build a commercial offering of their own financial benefit,” said Peters. “We don’t believe this specific deployment of Stability’s commercial offering is covered by fair dealing in the UK or fair use in the US. The company made no outreach to Getty Images to utilize our or our contributors’ material so we’re taking an action to protect our and our contributors’ intellectual property rights.”

When contacted by The Verge, a press representative for Stability AI, Angela Pontarolo, said the “Stability AI team has not received information about this lawsuit, so we cannot comment.”

This comes fresh on the heels of news of a different lawsuit, filed against not just Stability AI but also Midjourney and DeviantArt's art generator DreamUp (I'm glad to know DeviantArt is still around).

This seems like the start of a new wave of lawsuits.

There has been an increasing amount of ethical concern over the fact that AI models are trained on the internet. The Verge points out that since Stable Diffusion is an open-source model, it's possible to browse the entire dataset it was trained on (shit's wild, there's even NSFW stuff in there). That makes it a relatively simple matter for lawyers to determine whether specific content was used; the model is also out in the wild, which means there's no going back for Stability.
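To get a sense of how straightforward that check could be: Stable Diffusion's training data comes from LAION, whose metadata (image URL plus caption pairs) is publicly available. A minimal sketch, using hypothetical stand-in records rather than the real multi-billion-row metadata, might look like this:

```python
# Sketch: scanning training-dataset metadata for images hosted by a
# particular source. The records below are made-up stand-ins for LAION-style
# (url, caption) metadata; the real dataset would be streamed from disk.
from urllib.parse import urlparse

sample_records = [
    {"url": "https://media.gettyimages.com/photos/example-123.jpg",
     "caption": "A stock photo of a city skyline"},
    {"url": "https://example.com/cat.png",
     "caption": "An orange cat on a couch"},
]

def from_domain(records, domain):
    """Return records whose image URL is hosted on `domain` or a subdomain of it."""
    matches = []
    for record in records:
        host = urlparse(record["url"]).hostname or ""
        if host == domain or host.endswith("." + domain):
            matches.append(record)
    return matches

getty_hits = from_domain(sample_records, "gettyimages.com")
print(len(getty_hits))  # 1
```

At real scale this is a streaming filter over metadata shards rather than an in-memory list, but the core test (does the image URL point at a Getty-owned domain?) is exactly this simple.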

There’s also the fact that it’s possible to reproduce the Getty watermark using Stable Diffusion. The Verge used the model to generate the images below.

Both of these images are as terrifying to look at in detail as they probably are for Stability AI’s legal team to see in evidence.