AI technology is evolving at a scary pace. A Reddit user has developed an AI-driven desktop app that can take photos of anyone and swap that person's face onto someone else's in any video.
It’s like Snapchat’s face swap filter, but just really good.
FakeApp is a desktop program designed by /u/deepfakeapp that creates neural-network-generated face-swap videos from a series of images (taken from another video or an image library), using a machine learning algorithm developed by /u/deepfakes.
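To illustrate the core idea (this is a toy sketch, not the actual FakeApp or /u/deepfakes code): the widely reported approach uses an autoencoder with one shared encoder and a separate decoder per identity. After training both decoders to reconstruct faces from the same latent space, you perform the swap by encoding person A's face and decoding it with person B's decoder. All dimensions and weights below are made-up toy values.

```python
import numpy as np

rng = np.random.default_rng(0)
IMG = 64 * 64   # a flattened toy "face" image
LATENT = 32     # size of the shared latent code

# Shared encoder weights, plus one decoder per identity (random, untrained —
# in the real technique these are learned from hundreds of face images).
W_enc = rng.standard_normal((LATENT, IMG)) * 0.01
W_dec_a = rng.standard_normal((IMG, LATENT)) * 0.01
W_dec_b = rng.standard_normal((IMG, LATENT)) * 0.01

def encode(face):
    # Map any face into the shared latent space.
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    # Reconstruct a face in a specific identity's style.
    return W_dec @ latent

face_a = rng.standard_normal(IMG)  # one frame of person A's face

# The "swap": encode A's face, decode it with B's decoder.
swapped = decode(encode(face_a), W_dec_b)
print(swapped.shape)  # (4096,)
```

In practice the encoder and decoders are deep convolutional networks trained frame by frame, which is why the app needs a large library of source images to produce a convincing result.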
The subreddit (NSFW warning) was initially created as a discussion forum for users to discuss deepfakes's work, but since the public release of the app, all hell has broken loose.
In short, almost all the face swaps have been of celebrity faces being slapped onto porn stars. Face swapped videos include the faces of celebs like Emma Watson, Scarlett Johansson, Gal Gadot, Taylor Swift, and Natalie Portman. Even crowd-favorite Nicolas Cage made a few appearances, but not in any pornos though (thankfully).
Even though celebrity/porn star face swaps are nothing new, apps like these blur the line between what's real and what isn't. Celebrities may not give these apps much attention just yet, but the bigger problem is that generated deepfakes could be used as revenge porn.
Initially, the algorithm was created simply for the pleasure of deepfakes and his small following. However, with an app now available to the public, it opens a door that current laws may not be able to close.
For instance, the regular Joe could turn to defamation laws, but they would have to prove that the deepfake video was made with the intent of tarnishing someone's name. If the accused can show that the video was created purely for personal pleasure, it becomes hard to argue that it was meant to cause emotional distress or harm a person's character.
According to Wired, the only real solution to a problem like this would be to develop AI software that can scan these faked videos and detect whether they were altered.
At this point, there is no legislation properly covering the repercussions of technology like this, but it's definitely interesting to see the direction in which software like this will evolve.
In the meantime, it's always a good idea to take stock of what content you post online. The FakeApp developer recommends using at least 200 images to produce a pretty good deepfake. If you're trigger-happy with the social media upload button, this might be a good time to rethink what photos you put up.