After art, the Stable Diffusion AI is now being used for porn

The creators of Stable Diffusion trained their model on the LAION-5B dataset knowing that it contained many obscene images. The training sample was therefore reduced to just 120 million images, and the AI was taught to rate images so as to exclude pornographic content as far as possible. Unfortunately, this did not stop Internet users from generating more or less depraved content, and it opened the way to two major abuses.

The first is the “deepfake”: a false image generated by artificial intelligence that is indistinguishable from a real one. The problem mainly affects celebrities, whose likenesses are used to create fake pornographic content. Even though such content is banned on the Unstable Diffusion Discord server, where Internet users share images created with the open-source version of Stable Diffusion, it is plentiful on 4chan, Reddit and other platforms. Simply typing “Stable Diffusion Porn” into Google Images turns up no fewer than five fakes of British actress Emma Watson in the top 30 search results…

The second abuse is child pornography. Although this content is also banned on the Discord server, “prompts”, the text descriptions used to guide the AI toward a desired image, can make it possible to create it. Already decried in many art circles, AI-assisted imagery has not finished making headlines like this.
