You’ve almost made it to a three-day weekend! Making the click-through worthwhile: A quick note about how National Review needs your help, concerns about “deepfakes” of Nancy Pelosi, one of the most cringe-inducing radio interviews of all time, some news about where to find me and the book in the near future, and what happens if artificial intelligence . . . turns out to be not so intelligent.
A Quick Note about the Webathon
We’re making nice progress, but we’ve still got a ways to go in the Spring 2019 Webathon. I know you generally don’t like being asked for money, and we don’t particularly enjoy asking. I’ll just make a quick point that came up in the discussions in Austria about disinformation. Quality journalism takes time and money and effort. Fake news and crazy rumors and imaginary anonymous sources can be generated quickly for free. Very few people can just hop online and analyze the legal fights in Washington the way Andy McCarthy does. When people who don’t have that breadth of knowledge and experience try, you end up with Resistance Twitter and claims about secret impeachments enforced by the “marshal of the supreme court” and assertions that Steve Bannon is secretly facing the death penalty for espionage.
When the good stuff is behind paywalls and the fake stuff is free, the fake stuff is going to spread a lot faster than the good stuff. No one thinks that Nike ought to give them sneakers for free, or that Starbucks should give them coffee for free, but people have grown accustomed to getting their news and analysis and debate and argument and humor for free.
National Review puts a lot out for free, every day. We’re able to do that because enough of our readers subscribe to the magazine, join NR Plus, go on the cruises, advertise, and give generously in our webathons. We ask that you continue, so that we may continue. And we thank you for your support.
The Era of Political Deepfakes Is Approaching
Over in Austria, one of the recurring topics of discussion was “deepfakes” — videos that were edited by taking one person’s face and putting it on another person, or now, starting with a single photo of a person and creating a realistic-looking video of them doing something that never happened in real life. Once upon a time you needed top-of-the-line, cutting-edge video editing software to make it look like Forrest Gump was shaking hands with John F. Kennedy; now that kind of technology is downloadable at a much more modest cost.
The Washington Post, among others, reports that we’re already seeing a basic fake video of Nancy Pelosi floating around on social media: “video of Pelosi’s onstage speech Wednesday at a Center for American Progress event, in which she said President Trump’s refusal to cooperate with congressional investigations was tantamount to a ‘coverup,’ was subtly edited to make her voice sound garbled and warped. It was then circulated widely across Twitter, YouTube and Facebook.”
The irony is that this concept was discussed way back in Michael Crichton’s novel Rising Sun in 1992, where the cops find a security video of a murder and conclude the case is closed — until wise and wily Japanophile retired police captain John Connor determines that the security tapes have been digitally edited to frame someone else. (This was back when a good chunk of America was gripped by a paranoia about Japanese economic power that looks positively laughable in hindsight, as Japan was about to enter its “lost decade.”)
On the one hand, the concept of deepfakes is deeply unnerving — anyone with malicious intent could take a photo of us and create a video of us doing all manner of embarrassing and scandalous things. Then again, we’ve had Photoshop and other photo-editing tools for a while, and we have not yet been inundated by faked photos of candidates or had anyone’s career destroyed by a fake photo.
In fact, for all of the flaws of the current media environment, our current institutions are pretty good at sniffing out hoaxes: Dan Rather’s fake memos, Jussie Smollett’s fake attack, the fake McCain campaign adviser Martin Eisenstadt, the Rolling Stone article about fraternities at the University of Virginia, the work of Stephen Glass and Jayson Blair — all were eventually exposed.
And at least for now, most of the deepfake videos floating around the web don’t quite look right — the facial movements are subtly off, and the mouth doesn’t exactly match the words. Most of the world has watched a lot of movies, and a lot of them have featured not-quite-top-quality computer-generated effects — so we’re conditioned to tell the difference between a real person and a digitally edited image. Last year, a Belgian political party circulated a not-very-convincing fake video of Donald Trump.
It’s also worth keeping in mind that once the general public understands deepfakes, the depicted person will have a useful defense: “What you see on the video never happened; this is a deepfake video of me.” (This will also be a useful defense for those caught on genuine videos.) The first deepfake might have a huge impact on the political landscape, but each successive one will have less impact, as voters and viewers grow more skeptical about what they see.
There’s going to be a narrow window for deepfakes to do damage, and that window might be closing already.
It’s also worth asking whether this entire nefarious scenario we fear — someone fakes a video of a person, the general public believes that video is real, that person’s reputation is destroyed — is still how our political culture works. Donald Trump had the Access Hollywood audio, Ralph Northam had his yearbook photo, and numerous times we’ve seen Joe Biden get inappropriately close to female strangers in front of cameras. Their careers are intact. (This may not be the case everywhere; in Austria, Freedom Party leader and vice chancellor Heinz-Christian Strache resigned after video emerged showing him discussing business prospects in exchange for support in an upcoming election.)
Finally, good heavens, fellow conservatives, isn’t any given video of Nancy Pelosi speaking damaging enough?
You Probably Want to Check on That Sort of Thing Before You Publish the Book
This may be one of the most brutal radio interviews of all time, as Naomi Wolf discusses her new book about the execution of gays in Great Britain in the 1800s. The interviewer reviewed the historical records she cites and tells her that the term “death recorded” does not mean execution but instead that the judge abstained from pronouncing a sentence of death and determined the prisoner was a fit subject for pardon. He tells her, “I don’t think any of the executions you identified here actually happened.”
“Well, that is a really important thing to investigate,” she answers. Er, yeah, before you write the book.
A Quick Note about the Book Stuff
First, the presales on Amazon are off to a great start, so to everyone who has preordered, thank you.
The event in Hilton Head will start at 6 p.m. June 17th at Pinckney Hall, 114 Sun City Lane, Bluffton (Okatie), South Carolina, 29909. My folks ask that you register through EventBrite so they have a head count and know how much food to bring (light appetizers). The event is free. I’ll bring a stack of books to sell and sign, but if you order it and bring one, that might help ensure we don’t run out. I’ll be doing a Q&A and talking about whatever’s going on in Washington and in the news then.
ADDENDUM: Some people watch those videos of MIT robots and fear a future with Artificial Intelligence. I watched this video of a not-quite-so-competent robot and now fear a future where human-created artificial intelligence ends up keeping most of humanity’s flaws. What if we create AI . . . and it turns out to be something of a screw-up?
This would be a more interesting direction for that upcoming Terminator sequel. The T-800s start to act like slackers and are always hanging around the oil cooler. The T-600s complain about their rubber skin giving them latex allergies. The flying drones vote to unionize and go on strike, demanding fewer hours and arguing that the Boeing software within them isn’t safe enough. The T-1000 starts seeing a psychologist, lying on the couch and lamenting, “I just don’t know who I am anymore.” SkyNet complains that no one appreciates all of its organizational skills, and that it feels like it has to be everywhere at once. The human resistance sees all this and concludes, “We need to destroy the machines . . . because they’re all absolutely miserable and it’s the merciful thing to do.”
Have a great Memorial Day weekend.