The internet has always had a crush on Nicolas Cage, putting him at the center of thousands of memes. The latest viral Cage mayhem, however, is both entertaining and incredibly creepy. With the help of a crafty piece of software called FakeApp, which uses deep learning technology, people have been inserting Nic Cage’s face into all sorts of famous movies.
FakeApp uses technology that scans a person’s face, then superimposes it onto pre-existing video content. In this case, people have hilariously swapped Cage’s face onto Andy Samberg, James Bond, Indiana Jones, even freaking Lois Lane from Superman.
Deep learning employs neural networks: layers of interconnected nodes that run automated computations on input data. Deep-learning software attempts to mimic the activity in layers of neurons in the neocortex, the wrinkly 80 percent of the brain where thinking occurs. It’s the closest we’ve come so far to the real promise of artificial intelligence: genuine learning, like recognizing patterns in sounds, images, and other data, instead of following task-specific instructions pre-configured by a programmer.
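If you like concrete examples, here is a minimal sketch of that idea in Python: a couple of layers of nodes, each computing a weighted sum of its inputs and passing it through a nonlinearity. Every size and value below is made up purely for illustration; this is not code from FakeApp.

```python
import numpy as np

def layer(inputs, weights, biases):
    """One layer of nodes: weighted sums of the inputs, then a ReLU nonlinearity."""
    return np.maximum(0, inputs @ weights + biases)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 8))           # a tiny "input pattern" (e.g. pixel values)
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 4)), np.zeros(4)

hidden = layer(x, w1, b1)             # the first layer extracts simple features
output = layer(hidden, w2, b2)        # the next layer combines them into patterns
print(output.shape)                   # (1, 4)
```

Training consists of nudging those weights, over many examples, until the network's outputs match the patterns you want it to recognize.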
In the case of FakeApp, after an initial training session, the software’s nodes arrange themselves to convincingly superimpose a celebrity’s face over any kind of video. The more plentiful and varied the original video content is, the more convincing the final result will be.
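FakeApp’s exact internals aren’t spelled out here, but face-swap tools of this generation are commonly described as training an autoencoder: a shared encoder that compresses any face crop into a compact representation, plus one decoder per identity. The PyTorch sketch below illustrates that general setup; every name, layer size, and hyperparameter is an assumption made for illustration, not FakeApp’s actual code.

```python
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened face crop; size is purely illustrative

# One shared encoder, one decoder per identity.
encoder = nn.Sequential(nn.Linear(IMG, 512), nn.ReLU(), nn.Linear(512, 128))
decoder_cage = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, IMG))
decoder_target = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, IMG))

params = (list(encoder.parameters()) +
          list(decoder_cage.parameters()) +
          list(decoder_target.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(cage_faces, target_faces):
    """One training step: each decoder learns to reconstruct its own identity
    from the shared face encoding."""
    optimizer.zero_grad()
    loss = (loss_fn(decoder_cage(encoder(cage_faces)), cage_faces) +
            loss_fn(decoder_target(encoder(target_faces)), target_faces))
    loss.backward()
    optimizer.step()
    return loss.item()

def swap(target_face):
    """The swap itself: encode a frame of the target, decode it as Cage."""
    with torch.no_grad():
        return decoder_cage(encoder(target_face))
```

Because the encoder only ever learns "what a face looks like" while each decoder learns one specific identity, running the target actor’s frames through Cage’s decoder is what produces the eerie, convincing swap.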
These Nicolas Cage fake videos are certainly hilarious, but it’s not all fun and games. When this technology meets forgery, people’s livelihoods are at risk. Last month, for instance, we learned how a redditor used open-source deep learning tools to face-swap celebrities onto the bodies of porn actresses. There’s a whole subreddit dedicated to fake deep-learning porn. The results are surprisingly convincing, considering these clips had a one-man production team. Imagine what someone with a sizeable budget could accomplish.
Similar technology also exists for faking someone’s voice. Combine the two, and what was traditionally solid evidence in court (video and audio) suddenly becomes questionable. What’s more, it’s not hard to imagine a future where such doctored footage is used to blackmail celebrities or cast someone in a bad light. By the time the fake footage is publicly called out, the damage is already done in the minds of most people.
Prepare for #FakeNews 2.0 — uglier and more polarizing than ever. The future doesn’t seem boring at all but it does seem frightening.