FakeApp uses AI to take face swapping to a new level. Cue the celeb fake porn movies

Photo: nuttynutter6969/reddit

AI technology is evolving at a scary pace. A Reddit user has developed an AI-driven desktop app that can take photos of anyone and swap their face onto someone else’s in any video.

It’s like Snapchat’s face swap filter, but just really good.

FakeApp is a desktop program built by /u/deepfakeapp that creates neural network-generated face swap videos from a series of images (pulled from another video or a photo library), using a machine learning algorithm developed by /u/deepfakes.
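The article doesn’t detail FakeApp’s internals, but deepfake-style swaps are commonly described as an autoencoder with a single shared encoder and one decoder per identity: each decoder learns to reconstruct its own person’s face from the shared latent space, and the swap comes from decoding person A’s frames with person B’s decoder. The sketch below illustrates that idea in PyTorch; the layer sizes, image resolution, and training loop are illustrative assumptions, not FakeApp’s actual code.

```python
# A minimal sketch of the shared-encoder / two-decoder autoencoder idea behind
# deepfake-style face swaps. Not FakeApp's code; sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a compact latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face from the shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # trained only on faces of person A
decoder_b = Decoder()  # trained only on faces of person B

# Each decoder learns to rebuild its own person's face from the shared latent
# space, so the encoder ends up capturing pose and expression rather than identity.
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)

def train_step(faces_a, faces_b):
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

# The "swap": encode a frame of person A, decode it with person B's decoder.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)     # stand-in for a real face crop
    swapped = decoder_b(encoder(frame_a))  # B's face with A's pose and expression
```

In practice, the swapped face crop still has to be aligned and blended back into each video frame, which is where much of the visible roughness in these clips comes from.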

The subreddit (NSFW warning) was initially created as a forum for users to discuss deepfakes’s work, but since the public release of the app, all hell has broken loose.

In short, almost all of the face swaps involve celebrity faces being slapped onto porn stars. The swapped videos feature celebs like Emma Watson, Scarlett Johansson, Gal Gadot, Taylor Swift, and Natalie Portman. Even crowd favorite Nicolas Cage made a few appearances, though thankfully not in any pornos.

Photo: derpfakes/reddit

Even though celebrity/porn star face swaps are nothing new, apps like this start to blur the line between what is real and what isn’t. Celebrities may not pay apps like this much attention just yet, but the bigger problem arises when generated deepfakes are used as revenge porn.

Initially, the algorithm was created simply for the pleasure of deepfakes and his small following. However, with an app now available to the public, it opens a door that current laws may not be able to close.

For instance, a regular Joe could turn to defamation laws, but it would have to be proven that the deepfake video was made with the intention of tarnishing someone’s name. If the accused can show that the video was created purely for pleasure, it becomes hard to argue that it was meant to cause emotional distress or harm a person’s character.

According to Wired, the only real solution to a problem like this would be to develop AI software that can scan these faked videos and detect whether they have been altered.
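Wired doesn’t spell out what such a detector would look like. One purely illustrative sketch, assuming a generic convolutional classifier trained on labeled real and altered clips (not any existing detection tool), is to score every frame and flag videos whose average “altered” probability crosses a threshold:

```python
# Illustrative only: a generic per-frame "real vs. altered" classifier sketch.
# Assumes frames arrive as 64x64 RGB tensors and the model has been trained
# on labeled real/fake clips beforehand.
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 1),  # one logit: higher means "likely altered"
)

def video_is_fake(frames, threshold=0.5):
    """Average the per-frame fake probability over a clip and compare to a threshold."""
    with torch.no_grad():
        probs = torch.sigmoid(detector(frames)).squeeze(1)  # frames: (N, 3, 64, 64)
    return probs.mean().item() > threshold
```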

At this point, there is no legislation properly covering the repercussions of technology like this, but it’s definitely interesting to see the direction in which software like this will evolve.

In the meantime, it’s always a good idea to assess what content you post online. The FakeApp developer recommends using at least 200 images to produce a decent deepfake, so if you’re trigger-happy with the upload button, this might be a good time to rethink what photos you put up on social media.
