Deepfakes are a problem — So what is the solution?

By Carl Szabo | Originally published in The Hill

“Deepfakes” is the latest scary buzzword circling Capitol Hill. It’s basically a fake video made to look real.

But “deepfakes” is really just a new word for the “photoshopping” of digital images. It can be putting a face on someone else’s body with the intent to deceive, or it might be an obvious attempt at satire (like Jon Snow apologizing for Season 8 of Game of Thrones).

Photoshopping and misreporting have been around for decades. The famous radio broadcast of War of the Worlds was a deepfake. The Blair Witch Project was a deepfake.

Politicians worry that social media now allows deepfakes to spread more quickly and broadly than ever. Some are using the deep concern over deepfakes to do an end-run around the First Amendment, aiming to stop the spread of misinformation they consider unflattering.

Viewers can’t always tell whether a video is fake, but the good news is that technology can identify when an image or video has been edited. The real problem is not deepfakes themselves but the spread of misinformation.

Continue reading in The Hill.