There is a new custom among humans: clicking a box that says “OK” when you haven’t read the tens of thousands of words of microprint in the software use agreement you are entering. Out of this understandable absence of mind, an enormous industry arises. This industry trades in our personal data, the passively generated facts about where, and with whom, we spend our time, what stories we read, whom we talk to, and what we say.
This enormous trade in data is not, as it currently exists, reconcilable with our society’s inherited notions of privacy. Often enough, the new data industry runs up against the Fourth Amendment’s protection of our “papers, and effects.”
We can tame or modify this personal-data industry with our laws, or we can surrender our traditional notions of privacy as a common inheritance. Not just privacy, but all that goes with it: intimacy, a sense of shelter, the ability to coordinate without the world checking in on you. Privacy could become like good schools, clean neighborhoods, and access to great amenities: a privilege for those with the means to find them.
Conservatives need to begin thinking about this choice.
The technology writer Kevin Kelly has argued that the choice before us is narrower: “We’re expanding the data sphere to sci-fi levels and there’s no stopping it. Too many of the benefits we covet derive from it. So our central choice now is whether this surveillance is a secret, one-way panopticon — or a mutual, transparent kind of ‘coveillance’ that involves watching the watchers. The first option is hell, the second redeemable.”
But Kelly is wrong. There is nothing inevitable about this, and we do not have to accept a transparent panopticon, whether one-way or mutual. Conservatives need to make the case that there is no such thing as a truly transparent society; in practice, the watchers always see more than the watched.
European governments have jumped ahead of the United States, passing regulations that force tech companies to disclose more thoroughly what kind of spying users are permitting. The difference in laws already produces differences in operation. In member states of the European Union, Facebook must ask whether you will allow it to use its facial-recognition technology to tag you, explicitly or internally, across all user photos. In the United States, it does so by default.
In fact, Facebook probably has a profile of you even if you’ve never used Facebook or its subsidiary services such as Instagram. You need only appear in the contacts lists your real-life friends uploaded.
These giant reserves of personal data are almost unregulated. Think of all the questions you can’t answer about your data right now.
If a major data business leaks information about your debts, your loans, and your home address, do you know whom to sue? Or what to sue for? Do you even have a sense of whether you’d have standing? In the case of the infamous Equifax leak, your data may have been put out into public view, but you were neither an Equifax user nor a customer. The company broke no contract with you. You can still sue, but it sure makes things trickier.

The U.S. has standard practices for distributing personal property after you die, the product of centuries of moral reflection, religious conviction, and legal practice. But who is allowed to profit from your likeness, found in the photos you or your friends uploaded to social media? And for how long?
There are other social questions, such as whether it is good for social networks to create, automatically, the equivalent of digital cemeteries: terabytes upon terabytes of accessible profile information about the dead, with active walls on which mourners continue to post their bereavement.
Do you know the quality of the information you are broadcasting? Users of CPAP machines recently discovered that the devices doctors prescribed to help them breathe were also sending usage data to insurance companies, which used it to avoid financial liability.
To be alarmist for a moment, there is a national-security angle as well. While social-media networks have been hammered for hosting a few poorly designed Russian memes, the real problem is these giant troves of passively created data on what people do, whom they speak to, and what their vices might be. Foreign governments once spent years, great expense, and painstaking human intelligence trying to assemble such dossiers as potential intelligence assets. Now they need only exploit an insecure database, or get a few assets hired into Silicon Valley firms.
These data industries are like vice industries. Ambitious and unscrupulous people make a business out of human vices — in this case, our laziness about reading contracts on computer screens and our idle curiosity about ourselves. So far, our political class has only hammered Silicon Valley for its supposed role in the disruption of the post–Cold War status quo. One wishes a politician or judge might hold them to account for the threat to privacy itself.