Deep Fakes

When people started to realize the full potential a program like Photoshop could have, it seemed as if a revolution was on the horizon. Our trust in photography as evidence could not be rescued, and a dark age of uncertainty would emerge. Some time ago, I was talking to a judge who told me about the meetings and conferences they had back then, where they discussed the bleak implications. It seemed as if there could be no other possible outcome than to get rid of photography in the legal system altogether. But what would there be to replace it?

In retrospect, digital photography in general, and photo editing software like Photoshop in particular, did have an influence on how photography is used. But contrary to the fears, the role photography plays is bigger than ever, even in courtrooms and other legal settings.

I mention this because I had a déjà vu recently. At the moment, there is quite a lot of fuss about a piece of software called Deep Fake; since it has “Fake” in its name, there has to be a fuss about it in the media. This program relies on deep learning algorithms to automatically replace the faces of persons in videos with the faces of other people. The software manages to match the facial expressions and lighting. Some of the results are more convincing than others, so there is definitely room for improvement. But undoubtedly the software will improve over time.
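
For those wondering what such a program actually computes: the approach widely described for these face swaps is an autoencoder with one shared encoder and one decoder per person. What follows is a minimal sketch of that idea, assuming PyTorch; the layer sizes, the 64×64 face crops, and all names are my own illustrative choices, not the actual program’s code.

```python
import torch
import torch.nn as nn

LATENT = 256  # size of the shared latent code (illustrative choice)

def make_encoder():
    # Shared encoder: compresses a 64x64 RGB face crop into a latent code.
    return nn.Sequential(
        nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
        nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
        nn.Flatten(),
        nn.Linear(128 * 8 * 8, LATENT),
    )

def make_decoder():
    # One decoder per identity: paints that person's face from the code.
    return nn.Sequential(
        nn.Linear(LATENT, 128 * 8 * 8),
        nn.Unflatten(1, (128, 8, 8)),
        nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
    )

encoder = make_encoder()
decoder_a = make_decoder()  # trained only on person A's faces
decoder_b = make_decoder()  # trained only on person B's faces

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)

def train_step(faces_a, faces_b):
    # faces_a, faces_b: batches of shape (N, 3, 64, 64), values in [0, 1].
    # Each decoder learns to reconstruct its own person; because the
    # encoder is shared, it is pushed to encode pose, expression and
    # lighting rather than identity.
    opt.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()
    return loss.item()

def swap_a_to_b(face_a):
    # The actual "fake": encode a frame of person A, but decode it with
    # B's decoder. A's expression and lighting come through the latent
    # code; B's appearance comes from the decoder.
    with torch.no_grad():
        return decoder_b(encoder(face_a))
```

In a real pipeline, the faces would additionally have to be detected, cropped, and aligned in every frame, and the swapped result blended back into the video, which is one source of the artifacts that still give many of these videos away.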

Naturally, the first use people have found for this new technology is porn. It always is porn. The faces of celebrities have been used to replace the faces of performers in porn videos. As noted, some of the videos are more convincing than others.

Many of the videos disappear quite quickly, since they seem to violate the guidelines of the porn sites they are uploaded to. But I wonder why exactly. Is this really a violation of copyright, since the videos are clearly edited from the source material? Or is it a violation of privacy rights? Here, the argument gets weird. The outrage these videos cause exists precisely because they do not show the real people; rather, the faces shown are merely based on the celebrities. Sure, the title then claims that a certain video shows a certain celebrity. But this is something that is done constantly. When Sarah Palin was running as the vice presidential candidate in the 2008 US election, a whole flood of porn videos was created in which actors posed as her. On the Internet, quite a few of these videos were marketed as the real deal. But even if something was called “Sarah Palin real porn”, it still wasn’t Palin who was to be seen in the video.

These deep fake videos do not claim to be the real deal; rather, they are clearly marked as deep fakes. Sure, over time that might change, but right now even the URLs of the websites hosting them mention the fake. People seem quite proud of this new toy, and the deep fake itself seems to be something worth mentioning. So where is the real harm in these videos? Take the Melania Trump video. It is clear that this is not her. And as a matter of fact, it is not her. The body isn’t, since it belongs to an anonymous porn actress, and the face isn’t, because it is merely the result of a calculation loosely based on real video footage of Melania Trump.

As a quick side note: at the time of writing, I have yet to encounter a video where the celebrity whose appearance is being used is male.

Some actors have already begun to trademark their own faces, and we might finally have reached the point where this becomes relevant. So the legal issue might be a mere trademark violation, which should normally not spark too much outrage.

I think the outrage has a lot to do with the current fake news debate. People are simply afraid that fake news might become indistinguishable from real news, and every report that might support that fear is amplified. Technology seems quite scary in general. In 2016, for instance, Adobe (the manufacturer of Photoshop) presented a new piece of software, Adobe VoCo. The program was dubbed “Photoshop for audio” and lets users create new voice tracks from pre-recorded audio. The key point is that the software is able to create entirely new sentences in a voice that resembles the source material.

This is similar to Deep Fake. Source material is analyzed and used to create something new. Since we have become so used to photo editing software, the parallels might be a bit hard to spot, but this is exactly the kind of thing Photoshop enabled the inept layman to do in his basement. Photoshop made it possible for almost everyone to alter images more or less convincingly. The fact that Deep Fake and Adobe VoCo use deep learning algorithms to some extent is insignificant. To the normal user, all three programs are black boxes, and very few people have a clear understanding of what Photoshop actually does when its filters or tools are used. Deep Fake automates a very difficult craft, and so does Photoshop.

We, as a society, have proven extremely resilient to the dangers posed by Photoshop. I never get feedback from friends, after posting or sending an image, questioning its authenticity. Debates on the authenticity of images happen, and they happen quite prominently, but given that Photoshop exists on millions of computers and in every newsroom around the globe, these debates are quite rare.

Fake news is currently a buzzword, and everything that could support the argument that fake news is on the rise gets vastly amplified attention. But disinformation, false claims, and the denial of evidence are not new. They are at least as old as interaction between larger groups of people.

Sure, the way fake news spreads is evolving, and so are the tools used. But every tool in the media toolkit might be used that way, even pen and paper. And if the tools do not work to your liking, you can always claim that a piece of evidence is false. Denial is the most important weapon for people trying to spread fake news. And for denial, no one needs special skills.

That is why I believe the bigger impact these new tools might have on this debate will come from the mere claim that they were used. They will certainly become part of the denial game, comparable to the way people nowadays claim that a picture is photoshopped and should therefore not be seen as real evidence. This can easily taint any real piece of evidence, and that is what makes it quite damaging.

Using these tools in a fully convincing fashion remains too difficult. With a picture that has been doctored in Photoshop, people always seem to find the source material, or they spot minute irregularities that give away the fake. I am not paranoid, so I don’t believe there are many fake images out there that everyone believes in. The positive feedback someone receives for proving that a picture is doctored is just too tempting, and since these photo editing tools are so widely available, too many people know what they can do and what the results look like. Someone always spots the fake.

But the danger lies more in the doubt these tools can create. Tools like Deep Fake and VoCo might in the end become household names, just like Photoshop. And when that happens, too many people might expect these tools to be omnipresent. Everything becomes doubtful.