‘Deepfake’ Could Be The New Revenge Porn

As if upskirting and revenge porn weren’t enough to contend with, along comes another form of abuse that’s completely out of our control. ‘Deepfake’ porn is on the rise, and while celebrities are the main target right now, regular women could soon find themselves falling victim…

What is deepfake?

Deepfake is a type of pornography in which a victim’s face is digitally superimposed onto the body of a porn actor. As Motherboard reports, there’s a growing community of Redditors creating this kind of porn using the faces of celebrities. They take existing video footage of both the celebrity and the performer, then use a machine learning algorithm to superimpose the celebrity’s face onto the explicit video. “This algorithm is able to take the face of a celebrity from a publicly available video and seamlessly paste it onto the body of a porn performer,” they explain. “Often, the resulting videos are nearly indecipherable from reality. It’s done through a free, user-friendly app called FakeApp.”
 

How and why is it happening?

As technology continues to develop, face-swapping is no longer confined to the expertise of a special-effects studio – as anyone who uses Snapchat knows, even the biggest technophobes among us can already produce a convincing face swap. Plenty of phone apps make it easy to put one person’s face on another person’s body – and that technology could just as easily be in the hands of a jealous ex.

The trend began with celebrities – one of the most widely shared videos features Star Wars actor Daisy Ridley’s face manipulated into an explicit clip, and Meghan Markle, Emma Watson and Gal Gadot have also been targets. With the appetite for these videos clearly growing, it’s entirely possible that anyone’s image – particularly anyone with plenty of good-quality Instagram or Facebook photos – could be used without their consent, putting it on a par with revenge porn.

In the deep, dark recesses of the internet, users of a deepfake chatroom have already started swapping tips on how to make a convincing video, with many saying they’ve been creating them using pictures of girls from their school. Motherboard revealed that one user claimed to have made a “pretty good” video of a girl he went to school with, using around 380 pictures scraped from her Instagram and Facebook accounts, while another said that making a deepfake was a good way to “blackmail” someone.

Female-focused pornography director Erika Lust, who has spoken out against deepfakes, believes the dangerous trend represents a “new frontier” for non-consensual pornography, revenge porn and fake news. “Deepfakes are incredibly easy to make and people are already starting to use them to make fake porn videos using images of co-workers, friends and ex-girlfriends taken from their social media accounts,” she said.
 

What if you’re a victim?

For the unsuspecting people who find themselves the victim of a deepfake, the results could be catastrophic. Online pornography is a thriving business: in 2017, Pornhub averaged 81 million visitors each day – 28.5 billion visits over the year – and more than 4 million videos were added, totalling 595,492 hours of content. It wouldn’t take long for a video in which your face has been digitally inserted to circulate.

It’s terrifying – and there may be nothing you can do about it. While the UK now has revenge porn laws in place, they don’t currently clarify whether fake images are covered, and there are no specific laws targeting deepfake porn as yet. Victims must instead rely on a patchwork of laws covering harassment, malicious communications, data protection and copyright – so there’s no straightforward route to justice, or even a guarantee of it.

It’s also important to note that every deepfake has two victims – the other being the porn performer. As Lust explains, the actress who stars in the original video is denied proper credit for her work. The adult industry is already a hard one to work in: constant copyright issues and free sites like YouPorn and Pornhub mean performers often aren’t paid sufficiently for their time. Covering up the face of the actress who did the work on a video that goes ‘viral’ means she could miss out on the wave of work its popularity might otherwise have brought her.
 

What’s being done?

In February, Twitter banned deepfake videos, stating: “We will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject’s consent. We will also suspend any account dedicated to posting this type of content.”

Pornhub followed suit, describing deepfakes as “non-consensual porn”, and Reddit shut down its main subreddit dedicated to them around the same time, saying it had “breached the rules against non-consensual content or revenge porn”.

But experts have warned the technology could become so advanced that it could soon become impossible to tell what’s real and what’s not. As such, legal boundaries need to be implemented. It’s hoped that, after the upskirting scandal, deepfake pornography will be the next form of abuse to be made a criminal offence. As Clare McGlynn, Professor of Law at Durham University, who specialises in the legal regulation of pornography and image-based sexual abuse, put it, the upskirting bill was “a welcome first step towards a more comprehensive response to image-based sexual abuse”.

Speaking to the Guardian, McGlynn seemed positive that the upskirting law could cover “images which have been altered too, and clearly criminalise a practice that victims say they find incredibly distressing.”
 
