Last year I encountered the horror that is deepfake porn videos, in which celebrities' faces are slapped onto porn stars' bodies. The potential abuse of deepfake technology, which at that point was not quite readily accessible to every casual internet user, was clear. I started thinking about trolls making deepfake porn of women they hated, or vengeful exes making them of their ex-girlfriends (now a reality, by the way). And then there are the possibilities for videos imitating politicians and public figures, their fakery undetected by the internet-unsavvy like my grandparents.
Deepfakes, to put it bluntly, scare the shit out of me! Which makes it hard to feel anything but terror when "fun" deepfake videos of Cardi B morphing into Will Smith, or Bill Hader turning into Seth Rogen turning into Tom Cruise, go viral. Wow, cool, I want to die inside? Recently a video made the rounds from Zao, a new Chinese deepfake app that lets people swap their faces with celebrities, like a user who inserted himself as Leonardo DiCaprio into several movies. It was a peek into a future in which making a deepfake is as easy as pulling up a song on Spotify. Around the same time came reports of an app concept that could use deepfake technology to approximate what any woman looks like naked; that app has since been scrapped. Lucky us, I guess?
Slowly, platforms are dealing with deepfakes. Reddit has banned deepfake porn, and Tumblr has started to crack down on it. On Thursday, Reuters reported that Facebook and Microsoft are teaming up for a "deepfake detection contest," commissioning researchers to create deepfakes so that better detection tools can be built, complete with a very helpful video explaining what a deepfake is.
But I just can't watch another deepfake video, guys. I can't! If you see a deepfake video online, in the streets, wherever, don't show it to me. I don't want to see this shit, ever. *Taylor Swift voice* Like, EVER.