Getty Images is suing Stability AI for illegally using millions of images from its image bank


Stability AI is behind Stable Diffusion, an artificial intelligence capable of generating images from text descriptions. This type of AI relies on machine learning, so it must be trained on millions of existing examples before it can perform its task. Getty Images accuses its designers of illegally using an image bank to train their tool.

“Getty Images’ position is that Stability AI has unlawfully copied and processed millions of copyrighted images and associated metadata owned or represented by Getty Images, without a license, to benefit Stability AI’s commercial interests and to the detriment of content creators,” the image bank explains in its press release. The company has therefore initiated legal proceedings in the High Court of London.

Getty Images says in a statement that it supports artificial intelligence art projects, arguing that they can spur creative efforts. It has already granted licenses to “leading technology innovators” so that they can train their AI on some of its images while respecting personal and intellectual property rights. Stability AI, however, reportedly never approached Getty Images to obtain a license and used the images without permission or consent.



Thousands of files from the most famous image banks

Craig Peters, CEO of Getty Images, told The Verge that the company served Stability AI with a “letter before action” (a formal notice of pending litigation), explaining that Stability AI misused someone else’s intellectual property to build a commercial offering of which it would be the sole beneficiary. Contacted by The Verge, Stability AI did not comment beyond stating that it had received no information about this lawsuit.

The details of the lawsuit have yet to be made public, but Peters said the allegations include copyright infringement and violation of the site’s terms of service (specifically, through web scraping, which involves extracting website content with a program).
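To illustrate what web scraping involves at its simplest, here is a minimal sketch that pulls image URLs out of an HTML page. The page and URLs are invented for the example; in a real scraper, the HTML would be fetched over HTTP rather than hardcoded.

```python
from html.parser import HTMLParser


class ImageExtractor(HTMLParser):
    """Collects the src attribute of every <img> tag encountered."""

    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src":
                    self.image_urls.append(value)


# Hypothetical page content standing in for the HTTP download step.
page = """
<html><body>
  <img src="https://example.com/photo-001.jpg" alt="stock photo">
  <p>Caption text</p>
  <img src="https://example.com/photo-002.jpg" alt="stock photo">
</body></html>
"""

parser = ImageExtractor()
parser.feed(page)
print(parser.image_urls)
```

Scaled up across millions of pages, this is the kind of automated collection that image banks typically prohibit in their terms of service.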

Since Stable Diffusion’s training data is open source, Andy Baio, an American technologist and blogger, decided to obtain and analyze some of it (a sample of 12 million images out of the 2.3 billion that Stable Diffusion was initially trained on, says Baio). “It’s no surprise that a lot of the images come from stock image sites. 123RF was the largest with 497K, 171K images came from Adobe Stock’s CDN, 117K from PhotoShelter, 35K from Dreamstime, 23K from iStockPhoto, 22K from Depositphotos, 22K from Unsplash, 15K from Getty Images, 10K from VectorStock and 10K from Shutterstock, among many others,” he explains.
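A tally like Baio’s can be sketched in a few lines: given the image URLs found in the training data, count how many come from each host. The URLs below are invented stand-ins, not actual entries from the dataset.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical stand-ins for URLs found in the training data.
image_urls = [
    "https://media.gettyimages.com/photos/a.jpg",
    "https://www.123rf.com/stock/b.jpg",
    "https://www.123rf.com/stock/c.jpg",
    "https://images.unsplash.com/d.jpg",
]

# Count images per source hostname, most frequent first.
counts = Counter(urlparse(url).hostname for url in image_urls)
for host, n in counts.most_common():
    print(f"{host}: {n}")
```

Applied to the full 12-million-image sample, this kind of per-host tally is what reveals how heavily stock image sites are represented.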

Like many other AI-based tools, Stable Diffusion learns and improves from human-generated data (in this case, images). These texts and images are usually collected from the Internet, very often without the knowledge of their authors and creators, and therefore without their permission.

The need for a new legal framework for creative AI

However, companies in the sector maintain that the practice is legal; in the United States, it would notably be covered by “fair use”, a set of rules defining certain limitations and exceptions to copyright. The authors, for their part, consider it a violation of their rights. “These generative models need to respect the intellectual property rights of others; that’s the crux of the problem. And we are taking these actions to clear things up,” Peters said.

To date, legal experts are divided on this issue, but it is certain that the law will very quickly need to integrate these new aspects of the cultural and creative sectors, because these conflicts are likely to multiply. “This lawsuit marks an escalation in a growing legal battle between AI companies and content creators,” said James Vincent, a journalist with The Verge.

A clear new legal framework is all the CEO of Getty Images wants; he said the company is not seeking damages or a ban on AI-powered art tools. “I think there are ways to build generative models that respect intellectual property. I compare [this to] Napster and Spotify. Spotify negotiated with intellectual property holders (labels and artists) to create the service. You can argue about whether they receive fair compensation, but those are negotiations based on the rights of individuals and entities,” he said.

Note that Stability AI is also the subject of another lawsuit filed by three artists: Sarah Andersen, Kelly McKernan and Carla Ortiz have filed a class action against Stability AI, DeviantArt and Midjourney over their use of Stable Diffusion. For Matthew Butterick, one of the lawyers representing the three artists, the lawsuit is “another step towards fair and ethical AI for all.”

Stability AI appears confident that it is acting legally, but in the face of growing dissatisfaction from image creators, the company announced in mid-December that artists who wish can remove their images from the corpus before Stable Diffusion 3 is trained. However, the opt-out process tested and described by Ars Technica seems time-consuming and lax: for example, there is no identity check before “deactivating” images, and it is not possible to deactivate multiple copies of the same image at once.

