Trading Faces and Privacy


Hayk Rostomyan, Multimedia Editor

We all have fun using the “face swap” feature on Snapchat, an option to quite literally trade faces. Whether it’s to swap faces with our friends or look like Trump for a silly joke, the technology is often used in benign ways. However, face-swapping has become a nightmare for Hollywood with the birth of a craze known as the “deepfake,” which lets users insert a person’s likeness into videos they never appeared in. As is so often the case, the technology has been used to create fake porn videos of celebrities with a machine learning algorithm. Crazy, scary stuff, and a long way from Snapchat’s “face swap.”

In December, Motherboard, an online publication that covers technology, reported on a program developed by a team of college students that can replace the face of a celebrity and put it in a completely new video or film. As long as enough facial footage from all possible angles exists, movie magic can be made. With that footage, anyone using the program can swap out the face of, for example, Amy Adams in “Man of Steel” with Nicolas Cage’s face for some good laughs. But for every silly Nic Cage meme, there is somebody who takes things to the extreme. Since then, an app called FakeApp became available in January through a Reddit community page.

Unfortunately, some people have taken the likenesses of Emma Watson and Gal Gadot and superimposed their faces into porn. The problem is obvious: the actresses’ images are exploited, and they suffer the violation of being falsely depicted in such films. This has launched a major controversy between a large number of Hollywood actresses and the websites that host this content. The term “deepfake” came from a subreddit, a community page under the Reddit banner, and on Feb. 7, Reddit banned the deepfake subreddit from its pages. “I sincerely thought I was effectively filtering or manually removing all of that sort of content as it was being posted, but it is possible some slipped in and I didn’t catch it,” the subreddit’s moderator wrote in a statement on Reddit. “I am disappointed but I will respect the admins’ decision.”

The Verge contacted Eric Goldman of the Santa Clara University School of Law, who told them that the best option for these celebrities is to sue for copyright infringement and defamation. He stated that there is a case for defamation since the footage is altered and effectively tells a lie about the actors and actresses. However, “It’s almost impossible to erase a video once it’s been published to the internet,” said Goldman. “… If you’re looking for the magic wand that can erase that video permanently, it probably doesn’t exist.”

The program can be used to create Hollywood magic, but a majority of users aren’t putting it to true artistic purposes. Instead, they’re creating porn without the consent of those featured in it.

CGI has grown by leaps and bounds in recent years, and this new technology could be put to more benign uses. Picture bringing dead celebrities back to life, for example. We have already seen CGI used to make Jeff Bridges look like a younger version of himself in “Tron: Legacy.” Now we can take James Dean’s face and put it on a living actor so we can have the sequel to “Rebel Without a Cause” that everyone has been asking for. Or someone could make a movie where everyone shares the same face but their bodies are different. Hollywood has similar editing tactics now, but this algorithm will make it easy for even college students to make something wonderful.

Alternatively, with this software available to everyone, it now poses a danger even outside the realm of porn. Any given person can now create propaganda about their political enemies. President Trump’s face can be superimposed onto a video of someone doing cocaine, or a young Bernie Sanders can be shown at KKK rallies beating and hanging African Americans.

Ultimately, fake news will be revealed as fake, but if it takes too long to determine the legitimacy of these videos, they might cause someone to lose an election. This app is a powerful tool that can be used for good and evil.


Hayk Rostomyan can be contacted at [email protected]