Now We Have to Worry About Our Exes Making Fake Porn With Our Faces

Deepfakes, fake pornographic videos created with face-swapping technology, first made the news at the end of 2017, when a Reddit user released videos featuring the faces of several celebrities and subsequently made an app widely available for anyone to use.

Unsurprisingly, more and more people are now creating deepfakes featuring everyone from their classmates, coworkers, and exes to complete strangers whose photos they find online. From the Washington Post:

A growing number of deepfakes target women far from the public eye, with anonymous users on deepfakes discussion boards and private chats calling them co-workers, classmates and friends. Several users who make videos by request said there’s even a going rate: about $20 per fake.

The Post shares the story of one anonymous woman featured in a deepfake porn video, whom they found by running images from the video through a reverse-image search.

“I feel violated—this icky kind of violation,” she told the Post, after being contacted by a reporter and alerted to the video. “It’s this weird feeling, like you want to tear everything off the Internet. But you know you can’t.”

While Reddit and Pornhub have both banned deepfakes from their websites, with Reddit noting that they violate its policy against “involuntary pornography” and Pornhub describing them as “nonconsensual content or revenge porn,” people have simply moved to other platforms, and the videos are still widely circulated. The Post described how the video of the woman it interviewed came to be created:

The requester of the video with the woman’s face atop the body with the pink off-the-shoulder top had included 491 photos of her face, many taken from her Facebook account, and told other members of the deepfake site that he was “willing to pay for good work :-).”
It had taken two days after the request for a team of self-labeled “creators” to deliver. A faceless online audience celebrated the effort. “Nice start!” the requester wrote.

While there’s a lot of (very valid) handwringing over the potential of the technology behind deepfakes to contribute to the proliferation of fake news, to date it has largely been used as yet another tool to harass and intimidate women. The videos, the Post notes, have “been weaponized disproportionately against women, representing a new and degrading means of humiliation, harassment and abuse.”

As Mary Anne Franks, a law professor and president of the Cyber Civil Rights Initiative, told the Post: “If you were the worst misogynist in the world, this technology would allow you to accomplish whatever you wanted.”
