Unstable Diffusion enables users to generate porn by typing in text prompts

AI ethicists believe AI-generated porn will have negative consequences, including for artists and actors

When Stable Diffusion, the text-to-image AI developed by Stability AI, was open sourced earlier this year, it didn’t take long for the internet to exploit it for erotica. Communities on Reddit and 4chan used the AI system to produce illicit fake nude photos of celebrities, as well as lifelike and anime-style drawings of nude, typically female, figures.

Reddit quickly removed many of the subreddits dedicated to AI porn, and while sites like NewGrounds allow some forms of adult art, new forums immediately appeared to fill the gap.

The largest is Unstable Diffusion, whose proprietors are building a business around AI systems designed to create high-quality erotica. The server’s Patreon brings in around $2,500 per month from several hundred contributors.

Some ethicists say the manufactured porn could have negative repercussions, especially for marginalised groups. The AI-generated porn on the Unstable Diffusion server spans a wide range of distinctive artistic styles, sexual references and quirks. There are subgroups for BDSM and kink, channels for softcore and safe-for-work content, channels for men-only content, and even a channel solely for non-human nudes. There are also channels for hentai and furry art.

Around 4,375,000 images have been created on Unstable Diffusion so far. On a semi-regular basis, the organisation runs competitions in which participants are asked to recreate photographs using the bot.

To be considered an ‘ethical’ community for AI-generated porn, Unstable Diffusion bans material containing extreme gore, deepfakes or child pornography. Users of the Discord server are expected to abide by its terms of service and submit their images for moderation. The server has a full-time moderation team and uses a filter to prevent images of real users from appearing.
