
This column ran in Feb. 2018, and since then the technology for deepfake videos has only gotten better – meaning worse for the people targeted by them. An article in Motherboard today points out that there’s now a version that “removes” clothing from photos of women (it doesn’t work on photos of men, surprise surprise), so I thought I’d rerun this column.

Thirteen years ago, a photographer at a New Hampshire summer camp gave his bosses some CD-ROMs containing photos of campers, not realizing that he had forgotten to remove some very unpleasant images from them.

In those images he had taken the faces of 15-year-old girls from some of his camp photos and used Photoshop to morph them onto the bodies of adult women in pornographic pictures. When camp officials found the appalling images on the CD-ROMs, they called police.

The man was eventually convicted of possessing child pornography, but he appealed the case and, in January 2008, the state Supreme Court overturned the conviction.

The state’s high court said, in part, that the images didn’t meet the legal definition of child pornography because they weren’t made with actual children and thus weren’t illegal – despite the rage and disgust they produced in the girls and their families, not to mention anybody else who learned about them. The ruling also dealt with several other legal aspects, including the fact that the photographer had kept the images to himself.

Why do I dredge up this unpleasant story, which I covered extensively at the time? Because technology has improved so wildly that the case now carries serious implications for you, me and society as a whole.

As is often the case with new tech, deepfake software – which uses machine learning to paste one person’s face onto another person’s body in video – is first being applied in pornography. In some scrofulous corners of the internet, notably the discussion site Reddit, people I will think of as sad, lonely males living in their parents’ basements have been using it to replace participants in porn videos with famous actresses.

A recent spate of news about deepfakes, including belated moves by Reddit, Pornhub and other sites to end the sharing of such videos, reminded me of the New Hampshire case from years ago. So I called up Ted Lothstein, a Concord attorney whose oral arguments before the state Supreme Court convinced it to overturn the conviction of the camp photographer mentioned above, and sought out his opinion.

First of all, Lothstein pointed out a major legal difference between the New Hampshire Supreme Court decision and the deepfake sites, where videos are being openly shared. The camp photographer meant to keep his images private, which was one of the key points in his defense.

The images were, in other words, a sexual fantasy in digital form – and Lothstein noted that the law has nothing to say about our fantasies, even the most repellent ones.

“If you criminalize it, you’re criminalizing the person’s thoughts,” he said. “That would be a violation of the person’s substantive due process rights … the concept that some things can’t be criminalized, period.”

Lothstein thinks deepfake porn videos are perfectly legal if they are kept private, just as the photoshopped images were legal.

“Without a publication, I can’t see how any law is violated – copyright, trademark, right to privacy, the right to have your image be your own property – it’s hard to see how doing this in your home would violate the law,” Lothstein said.

But deepfake porn usually isn’t private. It is designed to be shared as widely as possible in order to titillate many and humiliate some. That complicates things from a legal perspective.

Lothstein said that if a shared deepfake video involves a non-famous person like you or me, it would almost certainly violate privacy laws that deny others the right to use your image in certain ways.

However, he said, it’s more complicated when deepfake videos involve famous people, who have much less expectation of privacy. Further muddying the legal waters is the fact that courts have upheld our First Amendment right to parody or lampoon famous people, even when it’s done in vulgar forms (look up Hustler Magazine v. Jerry Falwell to learn more).

All this is interesting, but why should I care about a slimy practice that affects only famous actors and actresses? Nobody, after all, is ever going to morph my face onto the studly pool boy in Horny Housewives Vol. MCXVII unless they want to make a farce.

But what if somebody morphs me onto a person marching in a white supremacist rally and shares the video, or morphs me onto a person participating in a discussion about the violent overthrow of the U.S. government and shares the video, or morphs me onto an adult participating in a child pornography video and says that if I don’t pay $10,000 they’ll show it to our friends and family and the police?

I’d sure care then.

Right now these scenarios aren’t feasible, because deepfake software depends on examining many images of the person whose face is being morphed into the video. That’s why it currently involves famous actors and actresses, who appear in lots of freely available video that can be analyzed.

That won’t last, however. As the software improves, artificial intelligence will soon need only a handful of photos to create an impossible-to-detect video likeness. At that point, deepfake videos featuring just about anybody doing just about anything will become possible.
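For readers curious why so many source images matter: the face-swap tools described in the early deepfake coverage reportedly train a single shared encoder alongside one decoder per person, so the software only learns a convincing likeness after it has seen a face from many angles and in many lighting conditions. The sketch below is purely illustrative – a toy PyTorch version of that shared-encoder idea, with random tensors standing in for cropped face photos – not a working tool, and the details are my assumptions rather than any particular program’s design.

```python
# Toy sketch of the shared-encoder / two-decoder idea behind early face-swap
# ("deepfake") tools. Purely illustrative: random tensors stand in for aligned
# face crops, and the network is far too small to produce a believable video.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 256), nn.ReLU())
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(256, 3 * 64 * 64), nn.Sigmoid())
    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct person A's face
decoder_b = Decoder()  # learns to reconstruct person B's face

faces_a = torch.rand(32, 3, 64, 64)  # stand-in for many photos of person A
faces_b = torch.rand(32, 3, 64, 64)  # stand-in for many photos of person B

params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    opt.zero_grad()
    # Both decoders share one encoder, so the encoder learns a generic "face"
    # representation while each decoder specializes in one identity.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# The "swap": encode a frame showing person A, then decode it with B's decoder.
swapped = decoder_b(encoder(faces_a[:1]))
```

The point of the sketch is the data requirement: without many varied photos of a specific person, the decoder for that person has nothing to learn from – which is exactly why the early targets were celebrities.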

The fallout will be sweeping.

Here’s one likely outcome: Video evidence obtained by police, such as hidden-camera shots of drug sales from a sting operation or a robbery filmed by surveillance cameras, will become useless in court because nobody can be sure it hasn’t been faked. Prosecutors, you’d better be ready.

And another, which I bet is already in the works: Political attacks.

Don’t tell me there aren’t activists working with this technology right now, creating fake videos showing (insert some politician here) in the act of doing (insert something bad here), and waiting for an opportune time to release them.

You think the debate about what’s real and what’s fake is toxic now? Wait until deepfake videos of Trump or Clinton or Obama or others start cropping up, feeding paranoia and suspicion on all sides. Our uncivil civil discourse will become even more uncivil, if that’s possible.

In the long run, I think debate will revert to the 1830s, before the daguerreotype came along. We already can’t trust photos or audio recordings – both are too easy to fake. Soon we won’t be able to trust video, either.

If nobody believes any objective evidence about what happened in the recent past, disagreements will be reduced to “he said / she said” arguments, making it harder than ever to change people’s minds. Public discourse will further ossify into warring camps.

In other words, what started out being repulsive on the individual level is becoming destructive on the societal level. Like so many things on the internet.
