Imagine that a group of us invents our own new language. Let’s say we’re eleven in all, the size of a football team that relies on hand signals and barks of “Omaha!” code to deceive the opponent and move the ball down the field. Could the government ban us from speaking our new language to each other or from using it until some FBI linguists mastered it? And all on the off chance that we might use our new language to carry out a terrorist attack, execute a massive fraud, conduct a child-porn enterprise, or commit some other heinous offense?
Could the government tell us that literature, poetry, and innovation in our new language could evolve only as ploddingly as government agents could keep up? Our collective creativity would be stifled. We’d be presumed innocent in court if we ever actually committed a crime. Could the government nevertheless decide that it must — for our own good, of course — presume us guilty in our daily lives?
Do we still believe that? Do we have freedom of speech, or freedom to speak only what law enforcement can monitor? Does the Fourth Amendment guarantee freedom from unreasonable searches, or afford only whatever expectation of privacy the government, not society, decides is reasonable — and cabined by what the government is technologically capable of searching?
Within the four corners of the case, there are many good reasons to rip Apple. The government has overwhelming probable cause to search the phone after the mass-murder attack. There is a compelling public interest in identifying other jihadists and terror plots about which the phone data may provide evidence. And in the narrow confines of this case, Apple is protecting nobody’s privacy. Farook is dead, and the phone wasn’t even his: It belongs to the municipal agency that employed him, the San Bernardino County Department of Public Health, which assigned the phone to him for work purposes. Farook waived any conceivable privacy interest by signing an acknowledgment that the SBCDPH could search the phone at any time. And the SBCDPH, which is cooperating with the FBI, consents to the search of its phone.
Apple should not be fighting on this one. It should be working with the FBI to try to catch the terrorists before we have yet another San Bernardino, or worse. Yet, much as I want to rip the company’s execs, I’m inclined to cut them some slack.
That’s because cases like this are not tidy four-corner disputes that arise in a vacuum. They have a context. This case occurs in the midst of a government campaign to accomplish something as dangerous as it is preposterous: the straitjacketing of Internet communications technology by the limits of government’s surveillance capabilities. Because those capabilities are woefully finite, the government wants companies like Apple either to cease to innovate or to accept the government as its business partner.
Apple, like most private companies, does not want to be in business with the government, does not want its intellect stifled and its intellectual property usurped. And in our system, it is supposed to have that right. Indeed, the economic vitality that enhances our national security and fills our profligate government’s coffers depends on its having that right.
The focus of the broader battle Apple is fighting against the government is encryption. The revolution in Internet applications, which has mainstreamed encryption on both ends of routine communications, has created the equivalent of languages the FBI is unable to translate, of code it cannot crack. Director James Comey’s eloquent insistence to the contrary notwithstanding, the Bureau is seeking “backdoor” access to the encryption technology that companies like Apple create. Nor is it just the FBI. President Obama and British prime minister David Cameron both advocate the prohibition of communication methods the government is incapable of tracking.
Their rationale is deceptively straightforward: Government has been able to penetrate communications technology in the past if it had probable cause, a warrant, and physical access to the relevant device. Encryption upends this arrangement; therefore, encrypting must be prohibited or licensed only insofar as the government is competent to decrypt.
Unfortunately, the rationale gets it exactly backward.
You get the point here? We have a constitutional right to communicate. We have a constitutional guarantee against unreasonable searches — generally, searches without warrants based on probable cause — which Congress long ago extended to our telephone calls and subsequent technological advances in communication. Our rights pre-exist and are independent of law enforcement’s capacity to intrude on them. It is law enforcement’s burden to evolve technological surveillance capabilities that can be deployed in a manner consistent with our rights; our rights are not burdened by the confines of law enforcement’s capabilities. The point of the Constitution is to limit government’s ability to intrude on liberty, not to limit the scope of liberty to government’s capacity for intrusion.
These core assumptions veered off the rails when, in 1977, the Supreme Court saw fit to commandeer the New York Telephone Company into the FBI’s wiretapping operations. The decision rested on the dubious theory that, as a highly regulated, monopolistic public utility, the phone company was a virtual arm of the government. In fact, the Court stressed that New York Tel had “a duty to serve the public”; that it had a “substantial interest” in helping the FBI, since the government’s objectives were not “offensive” to the company’s; and that the assistance it was being forced to provide was “meager” and not “in any way burdensome.”
Of course, the ratchet of big government is such that this theory has expanded over the ensuing four decades. In the 1990s, when digital and wireless cellular technologies began to supersede traditional analog, government could not keep up. It thus proposed a radical solution: namely, that telecoms be required to build wiretapping capability into their systems — i.e., that advances in communications technology beneficial to hundreds of millions of American citizens and businesses be stymied until government’s capacity to monitor a few thousand criminal suspects was assured.
The result was CALEA — the Communications Assistance for Law Enforcement Act of 1994. CALEA enabled the government to eavesdrop on many bad actors — to make cases and even prevent crimes. It also had predictable downsides. A “wiretap interface” to give the U.S. government built-in monitoring capability cannot be limited to the U.S. government. The network security weakness it necessarily creates is routinely exploited by criminals and spies.
The increasingly obsolete CALEA principles are what the government now wants to transpose to the Internet, a very different animal from telephony. We are no longer talking about a few big companies providing the single basic service of communications; the Internet is a Wild West of information and innovation, in which apps with communications capability are created not just by Apple, Google, and Facebook but by hundreds of thousands of ordinary people with a bit of basic knowledge. Even if the toolkit for regulating telecoms, as it evolved, were a comfortable fit with constitutional principles, it is a mismatch for regulating the web.
Moreover, encryption did not develop as a means for bad actors to evade surveillance. That is a side effect, one that is comparatively small, albeit horrifically consequential when terrorists strike or innocent people are victimized by crime. Encryption evolved because it is vital to protect information in the information age. Even if government had the technological capacity to keep up with advances in encryption (it doesn’t), making technological advance the slave of government capabilities would create catastrophic vulnerabilities. It would risk our data storehouses, intelligence, business records, financial assets, credit transactions, power grids, protections against rogue states (see, e.g., China’s recent theft of over 20 million U.S.-government personnel files), etc.
Back to Apple and the FBI. For what it’s worth, I believe San Bernardino is the wrong battleground for Apple to mount its defense. Its claim that by seeking access to Farook’s data the government is effectively seeking backdoor access to the encrypted data of all Apple customers is overblown. In the narrow case, the FBI is not asking Apple to decrypt anything. It is asking Apple to remove the security block on the phone so that the FBI, using its own algorithms, can attempt to gather the data. As programmed, the block renders the data indecipherable if a user enters the wrong personal identification number (PIN) ten times in a row; thus, if the FBI tries to figure out the PIN on its own, it could inadvertently destroy the data, defeating the purpose of the search.
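The lockout policy described above can be sketched in a few lines. To be clear, this is a simplified illustration of the general mechanism, not Apple’s actual iOS implementation — the real system destroys an encryption key in hardware, and every name below is invented for the example:

```python
# Simplified sketch of the passcode lockout described above.
# NOT Apple's actual implementation; names and logic are illustrative only.

MAX_ATTEMPTS = 10

class LockedPhone:
    def __init__(self, correct_pin: str):
        self._pin = correct_pin
        self._failed = 0        # consecutive wrong guesses
        self.wiped = False      # True once the data is rendered indecipherable

    def try_pin(self, guess: str) -> bool:
        """Return True on a correct entry; wipe after ten straight misses."""
        if self.wiped:
            return False        # nothing left to unlock
        if guess == self._pin:
            self._failed = 0    # a correct entry resets the counter
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.wiped = True   # the ten-strike limit destroys the data
        return False

# A naive brute-force search hits the ten-strike limit long before it
# can cover the 10,000 possible four-digit PINs:
phone = LockedPhone("7413")
for candidate in range(10000):
    if phone.try_pin(f"{candidate:04d}"):
        break
print(phone.wiped)  # True -- the data is gone after the tenth miss
```

This is why the FBI wanted the block removed before running its own guessing algorithms: with the wipe-on-failure policy in place, the search itself destroys the evidence.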
By its own questionable reasoning, the Supreme Court’s ruling in New York Telephone should not apply to Apple: The company is not a public utility, it does not have a duty to serve the public, its interests in enhancing privacy do not align with government interests in enabling intrusion, and the demands the government is making would be burdensome. Nevertheless, the trajectory of jurisprudence since that 1977 case — the aforementioned big-government ratchet — is decidedly against Apple. Under current law, the company’s resistance is not just futile, it is nigh frivolous. In this skirmish, Apple should do the right thing and help the government root out the terrorists.
But can we really blame the company for keeping an eye on the wider war as it deals, ham-handedly, with the current skirmish? On encryption, Apple figures resistance is a language the government understands.
— Andrew C. McCarthy is a senior policy fellow at the National Review Institute and a contributing editor of National Review.