Diving Deep into Deepfake Porn (Part 1 of 3)
Trigger Warning: Mentions of sexual violence, sexual assault, sexual exploitation, and rape
This post is part one of three in a series exploring the dark world of deepfake pornography. In this first part, I’ll be giving a basic overview of what deepfake pornography is and how it perpetuates sexual violence against women. Part two will detail how developments in AI deepfake technology are being used to harm children. The final part will go over the current legal landscape surrounding deepfakes, and why there is no simple solution to stop this kind of content from being created and circulated online.
With AI dominating mainstream media coverage, you’ve probably heard discussions of deepfakes and may have encountered some online. Deepfakes are synthetic images or videos that use deep learning to fabricate a convincing likeness of a real person. Much of the coverage has focused on the dangers of deepfake technology being used to spread large-scale political disinformation or to commit fraud. However, I want to focus on another application of this technology, one that I find more horrifying than fake news and financial scams: deepfake pornography.
Deepfake pornography is exactly what it sounds like: pornographic material created with AI, in which anybody’s likeness can be synthesized into sexually explicit images and videos. This type of content ranges from superimposing somebody’s face onto an existing adult creator’s body to generating a completely original image with AI photo manipulation software. This is sickening, and saying that it “violates consent” is an understatement when we look at the scope and severity of the exploitation occurring.
A 2019 report by Sensity, a company that detects and monitors deepfakes, found that 96% of deepfakes online were nonconsensual pornography and that the top four websites dedicated to deepfake pornography had received more than 134 million views. These websites are unfortunately easy to come across: in an article for NBC, Kat Tenbarge found that two of the largest websites hosting deepfake videos could be easily reached through Google.
The vast majority of this content targets female celebrities, but as AI technology develops, the deepfake pornography market is rapidly expanding to target women who are not famous. Creators can make and sell customized deepfake porn of anyone whose likeness they can obtain. Tenbarge’s NBC piece uncovered a creator on Discord who, for just $65 (roughly £52), would make a five-minute video of a “personal girl”: anyone with fewer than 2 million Instagram followers. Further, research by Henry Ajder and Giorgio Patrini found a Telegram channel selling nude images of more than 680,000 women for just $1.25 (£1) each. Let that sink in. For £1, over half a million women were sexually exploited without their knowledge or consent. This technology has created a new fear for women, with many speaking out online about the trauma of seeing nonconsensual deepfake images of themselves.
While researching this blog post, I came across a creator (whose name and likeness I’ve obscured to preserve privacy) who made a TikTok detailing her experience with deepfake pornography. Through tears, she describes how a random faceless account sent her an Instagram message containing AI-edited nudes, which were later distributed online without her consent. She explains that the deepfakes were made from fully clothed pictures she had posted to her Instagram account, and describes how repulsed she is by the comments she received once the fake nudes began circulating. She goes on to say that she feels “disgusted and violated in every way” and that the only reason anyone would want these pictures of her is because they like that the photos were nonconsensual.
Another high-profile story that shone a light on the issue broke earlier this year. Back in January, Twitch streamer Brandon Ewing (known online as Atrioc) was caught viewing deepfake pornography of popular female Twitch streamers, some of whom were his friends and coworkers. One of the streamers whose likeness was featured on the site, QTCinderella, knew Ewing personally and was quick to voice her horror and anger when the story broke. In an emotional livestream she responded: “This is what it looks like to feel violated, this is what it looks like to feel taken advantage of. This is what it looks like to see yourself naked against your will being spread all over the internet…Fuck Atrioc for showing it to thousands of people. Fuck the people DM’ing me pictures of myself from my website. Fuck you all. This is what it looks like, this is what the pain looks like.”
What these two stories emphasize is the core issue within deepfake pornography: systemic rape culture and the pain it inflicts upon women. In the digital age, it has never been easier for individuals to access a wide variety of consensual pornography made by adult creators who willingly provide this content to their audiences. However, what the booming deepfake pornography market tells us is that people seek out and pay for this material because of its nonconsensual nature. Either they don’t care about the harm this content causes women, or, more insidiously, they choose this content because of the violence it perpetuates.
Furthermore, this content is dangerous because generative AI can produce realistic images of nearly anything imaginable, and it may soon be impossible to tell the difference between real and artificially generated images. This means deepfake porn can realistically depict women in increasingly violent sexual situations, which risks normalizing aggressive sexual behavior and sexual assault.
An article by Arwa Mahdawi for The Guardian calls the rise of deepfake pornography “an emergency,” and I agree. What’s happening here is mass exploitation: any woman can potentially be edited into sexually explicit content she did not consent to. In fact, she may not even be aware that her likeness has been sexualized. Deepfake porn is a form of sexual violence that has the potential to ruin women’s lives, and it’s a problem that will only get worse as AI technology continues to develop.
In my next post, I will look at an even more horrifying dimension of this crisis: how deepfake technology is being used to exploit children through the creation of child sexual abuse material.