Technology is making it very easy to create “deepfake” videos that look real, and, as you’d expect if you know anything about human behavior, a growing number of people online are using it to create pornography that appears to involve real people, almost always women. MIT Technology Review has an article about it (here) detailing how it can ruin women’s lives by creating virtual “revenge porn.” This is prompting lawmakers to write laws against it, although that’s hard to do without infringing on legitimate free speech.
New Hampshire faced a variation of this issue 16 years ago in a story I covered extensively. It involved a summer camp for girls whose official photographer used Photoshop to put the faces of teenage campers younger than 16 onto the bodies of adult women in pornographic pictures. These “morphed” pictures, which he called “personal fantasies,” were discovered by accident when he failed to remove them from CD-ROMs of official camp photos. He was arrested and convicted of possessing child pornography.
The case eventually made it to the New Hampshire Supreme Court, which overturned the conviction. (That’s why I’m not giving names and details, although they’re easy to find with a little searching. I believe the photographer has since died.)
The legal issue involved child pornography rather than using technology to make fake pornography of identifiable adults, so it’s a bit different from deepfake videos. But testimony at the trials and elsewhere centered, understandably, on the outrage, disgust and shame of the girls involved and their families, and on the harm that was done – exactly the issues at the center of today’s deepfakes.
Child pornography is illegal not just because we find it disgusting (“violates community standards” is the usual legal phrase) but because it harms children, either when the pictures are made or when they’re distributed, which the law says can make it more acceptable for others to force sexual acts on children. The state Supreme Court overturned the conviction because it said this case met neither test: no actual children were involved in making the pictures, and the photographer never distributed them.
The Supreme Court ruling discussed past court decisions about child pornography involving wholly “virtual” bodies that depicted no real people, like characters in the Sims game, including a 2002 U.S. Supreme Court case, Ashcroft v. Free Speech Coalition, which found laws against such material unconstitutional. Court rulings have changed since then, and I suspect deepfake videos will force them to change further.