AI companies were generating pornography with gamers' idle GPU time in exchange for Fortnite skins and Roblox gift cards.

If your machine has a powerful GPU, you may find it a bit wasteful to leave all the computing power sitting in the background. In that case, Salad, which offers items such as game codes and gift cards in exchange for users' GPU resources, might seem like a reasonable way to get the machine working.

However, the software's default settings opt users into generating adult content. There is an option to "manually set workload type," under which users can uncheck the "adult content workload" box (via 404 Media).

On Salad's Discord server, moderators state that the Adult Content Workload option is unchecked by default and that Stable Diffusion image-generation jobs are automatically classified as adult content. When several users asked whether turning this setting off would reduce their earnings, a moderator explained, "If there is a high demand for a particular adult content workload, there may be a little less revenue, but we do not intend to make such an enterprise a core part of our business model."

The revenue earned from lending GPU time to Salad can be redeemed at the storefront, where users can exchange it for a variety of rewards, including Roblox gift cards, Fortnite DLC skins, and Minecraft vouchers.

On Salad's website, the company explains that "some workloads may generate images, text, or videos of a mature nature," and that any adult content generated is erased from the user's system as soon as the workload is complete. The vagueness of this statement appears to stem from the fact that Salad cannot view or moderate the images created through its platform.

However, one of Salad's clients is Civitai, a platform for sharing AI-generated images that has previously been investigated by 404 Media. The service hosts image-generating AI models trained on specific real people, which, when combined with a pornographic AI model, were found to be capable of producing non-consensual sexual images. A link to this investigation is shared here, but as a content warning, the subject matter covered is highly sexually graphic.

A spokesperson for Civitai told 404 Media, "Salad handles some of the image generation at different times of the day. We are scaling our services based on the amount of demand." Given this connection, it seems likely that Salad users are generating AI pornography for the site.

The combination of generating AI porn in exchange for vouchers for games popular with children and teenagers is deeply unsettling, especially since the option to disable it is far from obvious.

The creation of non-consensual sexual images of real people is particularly deplorable, and the high level of abstraction here means that many people may be generating this content on their machines without clearly realizing what they signed up for.

Oh well, the Internet can be creepy at times. Let's leave this story at that.
