Don’t Hide Extremism, Counter It

Kyle Smith wrote an interesting piece on Thursday. He argues that kicking white supremacists off the Internet, as tech companies from Google to GoDaddy seem determined to do, will make it harder for the police to track extremists and catch them before they turn to violence.

I suspect he’s right: in just the second half of 2016, Google complied with over 26,000 government requests for its users’ data. That data helps law enforcement agencies stop attacks before they happen. But if extremists are kicked off Google, there won’t be any data to hand over.

This gets us to the heart of the problem with the tech companies’ approach: Our goal should not be to hide white supremacists from view but rather to counter their extremism.

It is thus quite useful to have extremists on third-party platforms. They are going to find ways to communicate and congregate regardless; as long as the platforms they use are controlled by us and not by them, those platforms can be used to moderate or even convert extremists through counter-messaging.

Jigsaw, Google’s in-house think tank and technology incubator, has developed a program called the “Redirect Method” for exactly this purpose. Currently deployed on Google’s and YouTube’s search engines, it uses an algorithm to identify ISIS or white-supremacist sympathizers from their search terms. Then, instead of supplying advertising corresponding to their interests, it advertises Arabic- and English-language YouTube playlists designed to moderate their views. These can include testimonials from former extremists, imams denouncing ISIS’s corruption of Islam, or embarrassing clips of neo-Nazi groups’ incompetence, all advertised under headlines such as “Is ISIS legitimate?” or “Want to Join ISIS?” rather than explicitly anti-extremist titles.
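
Jigsaw has not published the Redirect Method’s internals, so the sketch below is only an illustration of the general idea described above: match flagged search terms and swap in a counter-messaging playlist ad under a neutral headline. The flagged terms, headlines, playlist URLs, and function names are hypothetical placeholders, not anything Google actually uses.

```python
# Illustrative sketch only: Jigsaw has not released the Redirect Method's code.
# The flagged terms, headlines, and playlist URLs below are hypothetical placeholders.

# Hypothetical mapping from risk-indicating search terms to counter-messaging playlist ads.
COUNTER_MESSAGING_ADS = {
    "how to join isis": (
        "Want to Join ISIS?",  # neutral, curiosity-driven headline, not explicitly anti-extremist
        "https://youtube.com/playlist?list=HYPOTHETICAL_COUNTER_ISIS",
    ),
    "white power movement": (
        "Is the Movement What It Claims?",
        "https://youtube.com/playlist?list=HYPOTHETICAL_COUNTER_WS",
    ),
}

def choose_ad(search_query: str, default_ad: str) -> tuple[str, str]:
    """Return (headline, url): a counter-messaging playlist if the query
    matches a flagged term, otherwise the ordinary interest-based ad."""
    normalized = search_query.lower()
    for term, (headline, playlist_url) in COUNTER_MESSAGING_ADS.items():
        if term in normalized:
            return headline, playlist_url
    return "Sponsored result", default_ad
```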

In use on Google since 2016 and on YouTube since July, the Redirect Method seems to be effective. A pilot program found that users were three times more likely to click on these ads than on ordinary advertisements, and that they spent far longer watching the counter-extremism videos than they spend on normal YouTube videos.

These videos are, of course, only a first step. De-radicalization is extremely difficult, often requiring prolonged human contact with an understanding but more moderate interlocutor. It is not easy, after all, to abandon a worldview. That said, platforms like Google and YouTube are valuable tools for reaching these people and bringing them into contact with ideas that might persuade them. We can leverage the popularity of these platforms, even among the worst of the worst, to help make them a little less bad, or a little less violent.

However, by expelling extremists from these platforms, we forfeit that opportunity. The actions of large technology companies are sending extremists underground, where we cannot influence them, moderate them, or counter their messaging. This, I fear, is a mistake.
