“Deep Fake” Porn Videos Getting More Common

As the technology for altering videos improves and becomes more readily available and easier to operate, women across the country are facing a new type of “Revenge Porn” – deep fake videos and images.

“Deep fake” is the name given to pornographic video clips or still images in which the actors’ faces have been replaced by another person’s face. Trolls have been doing this with celebrities for the last few years because money can be made off the deep fakes; numerous porn sites are devoted to fake clips featuring many of Hollywood’s most famous actresses. The issue recently came into public focus when the conservative website The Daily Caller posted a deep fake image of US Representative Alexandria Ocasio-Cortez that was billed as a nude selfie. The picture was actually one taken by one of former US Representative Anthony Weiner’s online mistresses, with Rep. Ocasio-Cortez’s face photoshopped in.

Once used mainly to victimize and make money off celebrities, “deep fakes” are now being used to victimize non-famous women as well, as the technology has improved and become more widely available.

Now that the technology has been improved and made easier to use, deep fake videos are being turned against women who are not in the spotlight. Many of the deepfake tools, built on Google’s artificial-intelligence library, are publicly available and free to use. While the deepfake process demands some technical know-how, an anonymous online community of creators has in recent months removed many of the hurdles for interested beginners, crafting how-to guides, offering tips and troubleshooting advice, and fulfilling fake-porn requests on their own. All they need is about 50-100 photos of the person, which are relatively easy to obtain given how many selfies and pictures are available across people’s social media accounts.

To simplify the task, deepfake creators often compile vast bundles of facial images, called “facesets,” and sex-scene videos of women they call “donor bodies.” Some creators use software to automatically extract a woman’s face from her videos and social-media posts. Others have experimented with voice-cloning software to generate potentially convincing audio.

“Revenge Porn,” or as I prefer to call it, non-consensual pornography, has long been defined as one partner’s sharing of an actual video or image of the other person, without that person’s consent, to shame, ridicule or extort them. Normally the offended party shared the video or image with a former lover with the intent that no other person would ever see it. Through the Cyber Civil Rights Initiative (ccri.org) I have represented dozens of women victimized in this fashion. While they did nothing wrong, they often feel the shame and guilt of having taken the videos or images, or of having allowed them to be taken. Victims of deep fake pornography, however, never consented to anyone photographing or videotaping them in intimate moments. That amps up the shock and horror when they find out they have become victims.

To combat this, folks should take a couple of simple steps. In September, Google added “involuntary synthetic pornographic imagery” to its ban list, allowing anyone to request that the search engine block results that falsely depict them as “nude or in a sexually explicit situation.” Anyone victimized this way should submit such a removal request. Secondly, make sure your images are set to private so they can be seen only by those you choose. While this won’t prevent a former friend or partner from abusing the privilege, it at least cuts down on the imagery available to the public.

According to a Washington Post article on the subject, the Defense Advanced Research Projects Agency, the Pentagon’s high-tech research arm, is funding researchers in hopes of designing an automated system that could identify the kinds of fakes that might be used in propaganda campaigns or political blackmail. Military officials have advertised the contracts – code-named “MediFor,” for “media forensics” – by saying they want “to level the digital imagery playing field, which currently favors the manipulator.” Like much technology, it will likely be developed by or for the government and then filter down for use by the public.

Until then, if you find yourself the victim of deep fake porn, contact http://CCRI.org and ask for help. The site has helped countless victims of revenge porn and lists attorneys in many states who have agreed to work for free or at a greatly reduced cost to help victims scrub these invidious and insidious images and videos off the Net.

Follow me on Twitter @oscarmichelen
