Yesterday, Apple, Google, Facebook, and Spotify erased most of the posts and videos on their services from raving lunatic/radio- and web-show host Alex Jones.
Today, our David French pops up in the New York Times, offering the tech companies a better, clearer standard for when they can and should bar users, grounded in decades of established law.
Tech companies don’t have to rely on vague, malleable and hotly contested definitions of hate speech to deal with conspiracy theorists like Mr. Jones. The far better option would be to prohibit libel or slander on their platforms.
To be sure, this would tie their hands more: Unlike “hate speech,” libel and slander have legal meanings. There is a long history of using libel and slander laws to protect especially private figures from false claims. It’s properly more difficult to use those laws to punish allegations directed at public figures, but even then there are limits on intentionally false factual claims.
It’s a high bar. But it’s a bar that respects the marketplace of ideas, avoids the politically charged battle over ever-shifting norms in language and culture and provides protection for aggrieved parties. Nor do tech companies have to wait for sometimes yearslong legal processes to work themselves out. They can use their greater degree of freedom to conduct their own investigations. Those investigations would rightly be based on concrete legal standards, not wholly subjective measures of offensiveness.
Yesterday afternoon’s Twitter brawl amounted to anti-Jones voices accusing free-speech advocates of coming to Jones’s aid and defending everything he’s ever said and done, and free-speech advocates accusing Jones’s critics of giving the tech billionaires veto power over public discourse and taking a chainsaw to the First Amendment.
A lot of the discussion about this on social media amounts to, “I don’t trust Facebook.” And that’s a reasonable position! Facebook has given a lot of people a lot of reasons to doubt its word and impartiality! None of the people who run these companies are constitutional scholars specializing in First Amendment cases, nor did they ever aspire to be in that role. They set up and joined these companies to make money — and now they’re in the weird position of American Public Discourse Police.
But right now, Alex Jones is fighting a defamation lawsuit from Leonard Pozner and Veronique De La Rosa, the parents of a six-year-old killed in the Sandy Hook shooting. The parents’ suit alleges that Jones showed his audience their personal information and maps to addresses associated with the family, leading to years of threats and harassment from Jones followers who claimed the shooting was a hoax. As this Wired article lays out, the ruling may depend on whether the judge and jury think Jones intended for the parents to be harassed.
A few shock jocks on talk radio have successfully deflected defamation cases by arguing that no one took their comments seriously. But even if Jones wins that argument, he might lose against Pozner and De La Rosa’s claims that he intentionally inflicted emotional distress. See, the tale of the shock jock cuts both ways: According to Baron, shock jocks were the only defendants she ever saw lose to that argument, because their behavior — while performative — was considered so outside any form of civilized norm. Might harassing (and doxing) the parents of a murdered child qualify? It’s apparently not outside the internet’s moral code. [In real life], it’ll depend on the judge, and the jury.
Your mileage may vary, but I think the argument about whether online platforms should ban Jones looks really different if a judge and jury determine, after hearing all of the arguments, that he said something literally indefensible. If you don’t think Facebook or YouTube or other platforms should ban Alex Jones after he’s lost a defamation lawsuit over what he’s posted on their websites, you more or less are arguing that they should never ban anyone.
And if you want that to be the rule, fine. Ironically, that’s close to how the tech companies saw themselves for a long time: as technology companies, not as media companies, and thus no more responsible for what gets written on Facebook than the people who build bathroom stall walls are for someone writing, “For a good time call Jenny at 867-5309.”
But as I pointed out a little while back, the bathroom-stall wall doesn’t delete messages it deems inappropriate, meaning that Facebook has already subtly acknowledged some responsibility for what ends up written on the site.
Imagine you invent a new social-media platform, and just as you’re about to launch it, someone tells you that it’s going to be used by neo-Nazis, Columbiners, gang members, child abusers, and so on. You might recoil in horror and hesitate about whether you actually want to offer that product to the world. At the very least, you would want to be able to bar the folks you find unacceptable from using it.
The don’t-ban-anyone crowd is arguing that Facebook and other social-media platforms are something like public utilities, something that should be available and accessible to everyone, no matter the circumstances. Public utilities are either run by the government or heavily regulated by a public commission. Do we really prefer that path?
The Book Is Closed on a Horrific Event, with Less-Than-Satisfying Answers
It was easy to miss the announcement last week from the Clark County Sheriff’s Office that they had closed the case on the October 2017 deadly mass shooting in Las Vegas and concluded that they could not find a clear motive.
There will be some who will be able to shrug it off with, “He was crazy,” and no doubt on some level he was. But it was the kind of crazy that didn’t interfere with him meticulously planning this over a long time — researching open-air concert venues, Las Vegas SWAT tactics, weapons and explosives, and purchasing more than 55 weapons between October 2016 and September 2017. Casinos are filled with security guards and cameras, and he managed to bring an arsenal into a room and launch the deadliest mass shooting in American history. Somewhere ISIS is asking, “Why didn’t we think of that?”
If the shooter had made a statement of allegiance to ISIS or some other extremist group, this would at least have been easier to understand. But police said there was “no evidence of radicalization or ideology to support any theory that [the shooter] supported or followed any hate group or any domestic or foreign terrorist organization.” Some wondered whether the selection of a country-music festival represented a deliberate target — a crowd that probably represented Trump voters or perhaps some other demographic that the shooter hated.
But police found evidence he had also considered targeting another music festival, with a different style, in a different city: a reservation for a hotel during the Lollapalooza music festival held at Grant Park in Chicago during the month of August: “Like the Route 91 Harvest music festival, the Lollapalooza festival was held in an open-air venue. Paddock specifically requested a room overlooking the venue when he made the reservation. That reservation was cancelled two days prior to the check-in date.”
The police report offers a theory that is quite chilling, quoting the shooter’s brother, Eric.
Eric believed [the shooter] may have conducted the attack because he had done everything in the world he wanted to do and was bored with everything. If so, [the shooter] would have planned the attack to kill a large amount of people because he would want to be known as having the largest casualty count. [The shooter] always wanted to be the best and known to everyone.
[The shooter] would not have cared about the people he killed. It would not matter their race, religion or sex. [The shooter] was described by Eric as a “narcissist” and only cared for people that could benefit him in some way. Eric stated [the shooter] needed to be seen as important and needed to be catered to.
It seems unimaginable: Today, 58 people are dead and more than 800 are recovering from injuries, with thousands more traumatized, because some gambler got bored with his life and this was the only way he could conceive of ending that boredom.
Speaking of Mass Shootings . . .
A change has come to the Broward County Sheriff’s Office:
Capt. Chris Mulligan, a military veteran and sheriff’s office employee for 19 years, will replace Capt. Jan Jordan.
Parkland city officials asked the Sheriff’s Office to replace Jordan after complaints about her leadership during the shooting Feb. 14.
Among the criticisms: A Coral Springs deputy fire chief repeatedly asked her for permission to send his medics into the school but was rebuffed. At the time, the shooter hadn’t been caught and only a handful of specially trained SWAT paramedics were in the school.
Jordan told the deputy fire chief she’d have to check before letting more medics enter, he said. By the time the whole building was deemed safe, there was no need — everyone had been brought out by police or was dead.
Any chance that the sheriff who went on CNN and demonized Dana Loesch, knowing all the while that the officer at the school didn’t engage the shooter, can go with her?
ADDENDA: Think of this closing note as a crossover with The Remnant: It’s a shame to see Arthur Brooks of AEI so shamelessly echoing “Bear Propaganda.” Surely Jonah will remain vigilant.