A professional headshot of a man with his face carefully poised in a neutral expression, the kind that might be used in an acting portfolio. A teenage girl with red hair and glasses, pouting at the camera against an outdoor backdrop. These photographs are the sort that saturate our online world, ones that you might find on a Facebook profile or LinkedIn page. The only difference? These people don’t exist. They are the product of an algorithm, a pair of neural networks competing against each other to create convincing fakes – and experts believe that they could soon replace pictures of real people in everything from the profiles that we match with on dating apps to the bodies that we watch in porn.
Designed by former Uber engineer Phil Wang, thispersondoesnotexist.com makes use of StyleGAN, a style-based generative adversarial network (GAN) developed by researchers at Nvidia. Wang has used this code to create a seemingly endless stream of faces.
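The adversarial idea behind a GAN can be sketched far more simply than StyleGAN itself: a generator learns to produce samples that fool a discriminator, while the discriminator learns to tell generated samples from real ones. The toy below is purely illustrative – a one-dimensional stand-in for face images, with made-up numbers throughout – and bears no relation to Wang’s code or to the StyleGAN architecture beyond the basic two-network game.

```python
import numpy as np

# Toy 1-D GAN: the "real data" are numbers drawn from N(4, 1.25).
# Generator g(z) = a*z + b tries to mimic them; discriminator
# D(x) = sigmoid(w*x + c) tries to tell real from fake. Each loop
# iteration is one round of the adversarial game.

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

a, b = 1.0, 0.0   # generator parameters (starts far from the real data)
w, c = 0.1, 0.0   # discriminator parameters

lr, batch = 0.05, 64
for step in range(5000):
    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    real = rng.normal(4.0, 1.25, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    p_real, p_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - p_real) * real + p_fake * fake)
    grad_c = np.mean(-(1 - p_real) + p_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update: push D(fake) toward 1 (non-saturating loss),
    # i.e. learn to produce samples the discriminator accepts as real.
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    p_fake = sigmoid(w * fake + c)
    a -= lr * np.mean(-(1 - p_fake) * w * z)
    b -= lr * np.mean(-(1 - p_fake) * w)

samples = a * rng.normal(0.0, 1.0, 10_000) + b
print(f"generated mean={samples.mean():.2f} (real mean is 4.00)")
```

After training, the generator’s output clusters around the real distribution even though it never sees a real sample directly – it only ever receives the discriminator’s verdicts. StyleGAN plays the same game at vastly greater scale, with deep convolutional networks and millions of face photographs in place of this scalar toy.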
“Our sensitivity to faces, when you really think about it, is a product of evolution for successful mating,” he told me. “What the site really demonstrates is that even for a data distribution that we are so well crafted to understand as human beings, the machine can pick apart all the relevant features and recompose them in a way that’s coherent.”
Wang’s innovation is fascinating and seemingly innocuous, yet it shares the same technological basis as much more sinister creations. Over the past few years, GANs have been widely misused to create malicious content, most notoriously by mapping the faces of celebrities on to existing, often pornographic, footage – so-called deepfakes.
“A recent study found that 96% of all deepfake videos were pornographic and in many cases are being used to harass and terrorize women,” says Rachel Thomas, founder of fast.ai and an expert in applied data ethics. “In general, our legal system has been slow to catch up with addressing [this kind of] sexual imagery and the use of AI is deepening and accelerating this problem.”
With this technology already inextricably and often problematically intertwined with sexual content, some are predicting that fully fabricated videos built on the same fake-face technology could be the next avenue the adult industry explores.
A growing body of research into creating full-body deepfakes also raises questions about the potential uses and misuses of GANs. In April 2019, a Japanese artificial intelligence company developed an AI that can automatically generate entire bodies in motion. Although they envisaged this being used for advertising, the already disproportionate use of GANs for pornographic purposes means that it isn’t much of a stretch to imagine how this new algorithm might be used in a similar way.
The expense and effort involved in creating such footage also means that full-body deepfakes, with real faces or otherwise, are unlikely to replace videos of actual adult performers on a significant scale any time soon, but industry insiders are nevertheless conscious of the potential implications of this technology.
“Many of the characters in our experiences are CGI generated so [full-body generated deepfakes] are similar to what we do,” says Ela Darling, an adult performer and founder of virtual reality company Viro Club, a platform that syncs adult toys to avatars (usually created using real models) to create a hyperrealistic pornographic experience.
“Some people are concerned that we’re going to reach a place where we don’t even need performers any more, because you can create AI humanoids, and I think that that’s something that could be damaging to performers in the industry.”
From a producer’s point of view, there are many potential benefits to full-body, AI-generated pornography. It opens up the possibility of experimenting with more innovative and interactive content in an era when free videos have undercut a previously profitable industry. On the other hand, it makes it easier to create the extreme content that some consumers seek but that performers might not be prepared to participate in. And as with many new technologies, the possibility of creating sexual content featuring people who have never existed throws up a host of ethical considerations.
Debate is intensifying over how adult content can warp our perception of consensual and enjoyable sex by showing scenes that objectify women and feature problematic sexual activity, and concerns about exploitation in the industry are growing. Against that backdrop, the idea of introducing lifelike images of people who can be bent to the viewer’s or producer’s will is somewhat worrying. By allowing artificial intelligence into the equation, we could be opening up our screens to ever more extreme content, and perhaps making real-life performers feel that they have to compete with their cyber counterparts.
“The whole deepfake situation is a deeply unsettling concept because it’s mostly men using women to harm other women,” says Ela Darling, whose site has put strict guidelines in place to ensure that nonconsensual deepfake technology is not used in any of their videos. “We really focus on the impact of the person being victimized and we don’t consider the deeper power structures being drawn on to create these experiences. As we move into the future and nascent technology becomes more widespread and typical, we need, at the end of the day, to make sure that performers are stakeholders.”
The situation is nuanced, and there is no clear answer on the ethics of, or appetite for, AI-generated pornography – nor on how much closer the technology that Wang showcases on his website brings us to it. But Wang himself remains positive:
“I certainly have concerns, but I also think it is better for the majority to be informed of this technology than it is to only have a handful of experts and bad actors make use of it,” he says. “My goal with the site is modest. I want to share with the world and have them understand instantaneously, what is going on, with just a couple of refreshes.”