
Take It to the Spank Bank

AI comes for the skin flick

At first, I imagined all AI-generated pornography would look like a hyperbolic form of hentai. Or, in the vein of those ubiquitous internet warlock games, I envisioned low-resolution, preternaturally buxom women clad in golden bikinis. Turns out I wasn’t entirely off-base. A friend recently showed me r/AdultDiffusion, a “community for adult Stable Diffusion art, emphasizing respect and responsibly [sic].” In other words, it’s a Reddit forum bursting with photos of nude women—all AI-generated, most of them cartoonish, others strikingly photorealistic. We had entered the uncanny valley, and everyone was naked.

The moderators prohibit any representations that appear “underage,” but there was something unsettlingly pubescent about the photos. Scrolling through, I was struck by the sense that each was someone’s body. My friend protested. This was the beauty of the technology: there was no one behind the image. He no longer had to feel guilty about objectifying some real human woman. He could objectify an object! Arousal could be its own closed system, limited only by the power of your computer.

Although close inspection still reveals that AI-generated pornography is synthetic—the airbrushing and the proportions in defiance of physics are often telltale signs—users may not be deterred, as many aren’t pursuing a real person but only a patchwork of visual cues to incite arousal. For some, the “authenticity” sold by platforms like OnlyFans or Chaturbate is almost auxiliary, or even a drawback. With AI porn, no matter what degrading imagery you generate, it appears as though there is no soul at stake, no one else implicated by your desires. Amid the images and metadata, the body is only as relevant as its contortion. The person is no longer the point.

In many ways, the free-for-all fantasia enabled by AI is the logical endpoint of the layers of alienation and depersonalization endemic to the industry and its interfaces. With an infinite number of sexual scenarios available to peruse online, and free access to all of them, all the time, the contemporary porn experience has almost completely diverged from interpersonal sexual gratification. Your AI girlfriend can never reject you! She can never say no! But even if these creations appear posthuman, there are still real people implicated—in the datasets that produce them and the livelihoods of adult performers they imperil.


Many of the uploads to r/AdultDiffusion are sourced from AI pornography aggregators, on which users can generate and display explicit content produced either on personal models or through the webpages themselves. As with traditional porn aggregators, much of the content is free for users without accounts, with the option to upgrade to paid tiers. Many, like Candy.ai, Pornderful.ai, and CivitAI, implement some iteration of Stable Diffusion, an open-source text-to-image model released in 2022. Diffusion is a form of deep learning in which images are compressed into a lower-dimensional “latent” representation, progressively corrupted with noise, and a neural network is trained to predict and undo that corruption. When the user types in a prompt, the model starts from noise and de-noises it in latent space, guided by the text, before scaling the result back up into a full image. In other words, the model distills patterns from its training data and uses that information to create a new image from inputted words.
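For a sense of how thin the barrier to entry is, the entire prompt-to-image loop fits in a few lines of open-source code. What follows is a minimal sketch using Hugging Face’s diffusers library; the model checkpoint, prompt, and sampler settings are illustrative assumptions, not the configuration of any site named above.

```python
# Minimal sketch of text-to-image generation with a latent diffusion model,
# using the open-source `diffusers` library. Model ID, prompt, and settings
# are illustrative only.
import torch
from diffusers import StableDiffusionPipeline

# Pick a device; half precision is typical on a GPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Load a pretrained Stable Diffusion checkpoint: text encoder, U-Net noise
# predictor, and VAE decoder bundled into one pipeline.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

# The pipeline starts from random latent noise and iteratively de-noises it,
# steering each step toward the text prompt, then decodes the latent back
# into pixels.
result = pipe(
    "a photorealistic portrait, studio lighting",
    num_inference_steps=30,  # how many de-noising steps to take
    guidance_scale=7.5,      # how strongly to follow the prompt
)
result.images[0].save("output.png")
```

That is the whole ritual: a consumer graphics card, a prompt, and a model trained on whatever the scrapers could find.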


On Pornify.cc, a relatively new site utilizing Stable Diffusion, users can select from a wide menu of hashtagged qualities to generate the #teen, #babe, #20s, or #milf of their fantasies (#males are also available). Of course, “all characters are over the legal age,” according to the site. Should her body be #hourglass, #pear-shaped, #petite? And the breasts: #huge or #big, #asymmetric or #saggy? Should she be #caucasian, #asian, #czech, #chinese, or one of fifty-two other ethnicities? And as for bonus features, should she be #pregnant, #cum-covered, wearing a #hijab, or possibly #transgender (the last available only to VIP users)? Once a satisfactory number of characteristics have been selected, the user clicks generate and is forthwith presented with an AI-generated centerfold to be deposited in the spank bank.

Or perhaps it will be uploaded to r/AdultDiffusion, where almost anything goes—except for “underage” and “deepfake” pornographic content, the latter being the AI-abetted manipulation of someone’s image into sexual scenarios, often without their consent. Some rudimentary deepfake tools, like DeepSwap, take the face of one individual and map it onto the body of another, a virtual Frankenstein of people and desires. Sites like mrdeepfakes.com, “the largest and most user-friendly celebrity deepfake porn tube site,” collect a panoply of videos in which celebrity faces have been grafted onto existing porn.

Generative Adversarial Networks (GANs), the technology underpinning deepfakes, were first developed by the computer scientist Ian Goodfellow and his colleagues in 2014; the term deepfake emerged three years later on Reddit. By 2019, some 96 percent of deepfake content was pornographic, and 99 percent of it targeted women, according to a study by SensityAI. During the 2020 election, the public grew increasingly conscious of political deepfakes as a threat to democracy. By 2021, major outlets like the BBC were warning that deepfake pornography could become an epidemic. It probably already had: Bloomberg reports that between July 2020 and 2023, the monthly traffic to the twenty most popular deepfake sites increased by 285 percent.

The public’s concern, however, is selective and tends to arise only when celebrities, minors, and other sympathetic victims are involved. “Although there has been an increase in public awareness over the past five years, sympathies are not equally distributed,” explains media scholar Sophie Maddocks. “Cybersexual harassment is a gendered harm that disproportionately targets minoritized groups. For digital sex workers, this is not only about the right to one’s image; it’s also a question of labor and livelihood.”

Alana Evans, a performer and president of the Adult Performance Artists Guild, has been in the industry since 1998 and recognizes how artificial intelligence and adjacent technologies like deepfakes threaten to exacerbate existing inequities. “I’ve made it this long because I’ve kept up with what’s out there and adapted,” she tells me. “AI technology can do a lot for us—my body is not what it was when I started out and it would be nice to produce a gangbang without having to shoot it. But when that technology falls into the wrong hands, the actors are the ones who lose out.” Evans warns that, in addition to siphoning income streams from adult performers, deepfakes made without the artists’ consent often depict them in racist roleplay and other scenarios performers may be uncomfortable with—while at the same time diluting the value of content made by the performers themselves.

It is no great revelation that labor conditions within the porn industry are often exploitative. Feminist and conservative critics alike have lambasted the arduous, often painful, work of producing hardcore pornography as well as the downstream effects of the frequently degrading fantasies it depicts. Furthermore, women are often more vulnerable than men to manipulation and coercion by porn producers, both on set and in legal negotiations. Race, of course, also plays a role: women of color tend to make one half to three quarters of white actresses’ salaries and are often asked to undertake more extreme performances than their white counterparts. Although the recent shift toward gig work through platforms such as OnlyFans gives many performers greater autonomy over their time and income stream, artists are then at the mercy of the platform and the instability of irregular income: according to a 2020 study, the top 1 percent of earners on OnlyFans take home over a third of the total income generated by the platform. While AI-generated content may alleviate some of the immediate physical burdens of producing content in the short term, the advance of artificial intelligence will only increase the precarity of the individual worker.

Both self-employed performers and those working with traditional production companies are at risk. According to Evans, production companies will often draw up contracts that ask the artist to sign over rights to their image, and early-career performers rarely possess the leveraging power to reject such clauses. Evans shares anecdotes of contracts promising easy, upfront cash that were sent to gig workers by a host of bad actors, from companies seeking to create AI models without the signer’s consent to anti-porn organizations that use bots trained on these images to ban search results containing these performers’ work.

Victims of deepfake pornography find little recourse in the legal system. “The tools we have to regulate deepfakes now are tools that were designed for other problems,” legal scholar Matthew Kugler tells me. “We have laws that protect your right of publicity, laws to protect against defamation, laws against revenge porn and the sharing of genuine pornographic images without your consent. But none of these address the fundamental harms of deepfake pornography because this technology wasn’t around when they were written.” In Virginia, one of only a few states that have passed legislation explicitly targeting deepfake pornography, victims must prove malicious intent in court, such as an attack on the victim’s reputation, to secure a penalty (up to a year in jail and/or a fine of up to $2,500). But if the court rules that there was nothing malicious in the deepfake creator’s intention, there is no path to justice.

Already, removing images and videos uploaded to sites like Pornhub without permission is a game of whack-a-mole, and the proliferation of these deepfakes is only exacerbating this challenge. “Pretty soon, civilian women are going to have to learn how to do [Digital Millennium Copyright Act] takedowns,” adult performer and writer Rachel Oyster Kim warns. According to Wired, over one hundred thousand deepfake pornographic films were uploaded to the top thirty-five deepfake sites in the first nine months of this year—a 54 percent increase over all of 2022. There’s no sign things are slowing down.


Even when one’s likeness is not explicitly replicated, its constituent parts may become part of an artificial pornographic schema in the form of training data—like the model powering Pornify.cc. Stable Diffusion was trained on LAION-5B, an uncurated dataset that contains over five billion image-text pairs scraped from every corner of the internet. To produce variations on, say, “busty redheads,” users rely on specialized models that absorb stills from pornography videos, files downloaded from OnlyFans, and photos uploaded to Reddit forums like r/nsfw or r/gonewild. Anywhere nudity exists online, it can be scraped to train artificial intelligence models. (Pornify.cc tries to sidestep legal trouble by claiming it’s “pure coincidence” that “characters may sometimes bear resemblances to real-life people.”)


The bodies that so disturbed me on r/AdultDiffusion, then, were indirectly stolen from real people, or at least their attributes were. Most of those people have no idea that their likeness is being appropriated, nor do they have any recourse to prevent it from happening. While the SAG-AFTRA strike has drawn public attention to the issue of digital reproduction—with workers demanding “that the right to digitally replicate a performer’s voice or likeness to substantially manipulate a performance, or to create a new digital performance, is a mandatory subject of bargaining”—sex workers are left out (SAG has denied adult performers admittance to the union since 1974). And because it is so difficult to ascertain where your image is being used online, performers struggle under current law to make a legal case against involuntarily becoming part of a dataset, no matter what threat it might pose to their livelihood. “I’m glad people are finally paying attention to these issues,” Evans adds. “But sex workers also deserve the right to their image.”

Still, not all adult performers are concerned by the uptick in AI pornography; they’re of the mind there will always be a critical mass of users looking for a real person behind the screen. After all, the promise of platforms like OnlyFans—which hosts some 120 million registered users—is access to personalized explicit content, a parasocial relationship with a real, live performer. While your AI girlfriend may never say no, she also can never provide this kind of intimacy.

“I haven’t found a technology yet that can do what I do,” says Evans. Since beginning her own management company, which handles communication and promotion for other performers, she has been screening chatbots and video services for herself and clients, but none quite stack up. Often, the most lucrative, and tedious, part of the job for self-employed content creators is responding to the thousands of chats and emails sent by fans—sorting through the spam, encouraging users to purchase original content, charging even to chat. Many management companies will outsource this labor to low-wage click workers overseas via sites like Upwork or Clickworker. Because Evans refuses to go down this rabbit hole (“It stinks of exploitation and corruption”), she manages her messages herself. Hypothetically, text generators like Botly, which profess to assist performers in scaling their business, should be an asset, but she hasn’t found technology that can strike the balance between flirting and running a business. For now, when users pay for Evans’s page, they’re getting the real deal.

If the articulation of desire is a constant negotiation of fantasy, an exploration of the self through reaction to the other, then AI pornography and its self-referential imagery is more of an outsourced spank bank than porn as we know it. Sex has always been about the particulars: the stray hairs, the creases, the sweat, the folds. At the same time that artificial pornography took off, Pornhub reported that its “reality” category grew 169 percent. Perhaps there is hope yet for the body, for our want of it and the bodies of sex workers who depend on sharing their image.