Using the latest AI tech, people are making celebrity sex tapes that aren’t real. Through apps that allow for ‘face transplanting,’ it has become startlingly easy to swap a new face onto a porn video, using just about any face pulled from social media. Even yours!

It’s no secret that we’re living in the future. From Apple’s facial recognition tools to the prevalence of virtual reality, many of the things we once thought were far-off sci-fi fantasies are now the norm. And with these new technologies come more ways to exploit them. According to Motherboard, that could be what we’re seeing in the world of artificial intelligence.

Motherboard reports that people have figured out how to make AI porn using celebrity faces. Of course, superimposing celebrities’ faces on pornographic images is nothing new. But this reportedly takes things a step further by making it seem like a celebrity is appearing in a porn video, doing and saying things they never actually did. According to Motherboard, one Redditor called ‘deepfakes’ seems to be leading the way in AI celebrity porn.

“I just found a clever way to do face-swap,” deepfakes told Motherboard. “With hundreds of face images, I can easily generate millions of distorted images to train the network. After that if I feed the network someone else’s face, the network will think it’s just another distorted image and try to make it look like the training face.”
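
For readers curious about what that quote actually describes: it maps onto a standard autoencoder setup, with a shared encoder and one decoder per identity. The sketch below is a minimal illustration of that idea in PyTorch, not deepfakes’ actual code, which the article does not publish; the architecture, the 64×64 input size, and the noise-based “distortion” are all simplifying assumptions for illustration.

```python
# Minimal sketch of the autoencoder face-swap idea deepfakes describes
# (illustrative, not his actual code): one shared encoder learns general
# face structure, and a per-identity decoder learns to rebuild one face.
# Training feeds the network distorted crops and asks it to recover the
# clean original; at swap time, feeding it a *different* person's face
# makes the decoder "correct" it toward the identity it was trained on.
import torch
import torch.nn as nn

class Encoder(nn.Module):  # shared across both identities
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Conv2d(64, 128, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 512),  # assumes 64x64 input crops
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):  # one per identity
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 16, 16))

def distort(batch):
    # Stand-in for the "millions of distorted images": jitter each crop
    # with noise so the network must learn to restore a clean face.
    return (batch + 0.1 * torch.randn_like(batch)).clamp(0, 1)

encoder, dec_a, dec_b = Encoder(), Decoder(), Decoder()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(dec_a.parameters()) + list(dec_b.parameters()),
    lr=5e-5,
)
loss_fn = nn.L1Loss()

faces_a = torch.rand(8, 3, 64, 64)  # placeholder batches; real use needs
faces_b = torch.rand(8, 3, 64, 64)  # aligned face crops of each person

for step in range(200):  # toy training loop
    opt.zero_grad()
    loss = loss_fn(dec_a(encoder(distort(faces_a))), faces_a) \
         + loss_fn(dec_b(encoder(distort(faces_b))), faces_b)
    loss.backward()
    opt.step()

# The swap: encode person A's face, decode with person B's decoder.
swapped = dec_b(encoder(faces_a))
```

The trick is in the split: because both identities share one encoder, the network learns a common representation of pose and expression, so decoding person A’s encoding with person B’s decoder produces B’s face wearing A’s expression. That is the “face-swap” deepfakes is describing.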

This, of course, raises some issues. As Motherboard points out, it could be an easy(ish) vehicle for harassment, if someone makes what seems to be a sex tape of someone else and spreads it around. Having sex is not shameful, but having images — or what seem like images — of yourself spread without your consent is a violation of your privacy. And this is where things get murky. If it’s not actually your body being shown, where do the lines of consent, and violation of consent, fall?

Porn performer Grace Evangeline told Motherboard that spreading any intimate image of someone without their consent, even if it’s not actually their body, is wrong. She’s right. Using nude or intimate photos to shame someone is a violation of privacy, and that remains true no matter what technology is used.

Read the full story on Motherboard