A Digital-Surveillance State Won’t Make Us Any Safer

Visitors experience facial-recognition technology at the China Public Security Expo in Shenzhen, China, in 2017. (Bobby Yip/Reuters)

In the liberty-vs.-safety cage match, we should be rooting for liberty.

On April 18, 2018, Joseph James DeAngelo visited a Hobby Lobby store near his home in Roseville, Calif. As he shopped inside, Sacramento investigators swabbed his car door handle, obtaining a sample of his DNA.

Months later, police arrested DeAngelo on suspicion of being the “Golden State Killer,” a serial murderer and rapist who had evaded capture for 40 years. The swab taken from his car door, coupled with DNA collected from a tissue he had discarded at his home, was run through a publicly available DNA database, allowing the cops to construct a family tree of the perpetrator. From there, they narrowed down potential suspects to men of a certain age who lived in the area at the time of the crimes.

There was much celebration when DeAngelo, who, beginning in the mid-1970s, is believed to have committed at least 13 murders and 50 rapes, was finally caught. But his capture came with one unsettling footnote: You mean you can just access a person’s entire life history if their hand brushes up against a door handle?

This use of DNA wasn’t like the time-honored process of lifting someone’s fingerprints and comparing them with those of known offenders. You leave traces of your genetic makeup everywhere you go, and with the help of forensic genealogy, you can be identified by them. And it’s not just your own genetic makeup: your DNA carries markers shared with everyone to whom you are related, which can aid in identifying them whether they know it or not.

Unlike the fingerprinting process, the results aren’t just known by law enforcement. Again, the Sacramento cops found DeAngelo by accessing a publicly available DNA database used by people trying to find biological relatives or solve family mysteries.

The Orwellian end result is that we might all be on the grid at all times.

And DNA identification is only a small part of the way technology is making us perpetually trackable. Everyone wants to be safe, of course, but the ubiquity of cameras, GPS tracking devices, the internet, and AI is also lulling Americans into a false sense of personal privacy.

With technology that currently exists, we could reduce criminal activity significantly if citizens allowed themselves to be monitored. Just last month, the National Transportation Safety Board recommended that every car be fitted with a device that would prevent it from exceeding the posted speed limit. In Washington, D.C., auto theft was so rampant that the police threw up their hands and gave residents free Apple tracking devices to hide in their cars so the cars could be found when they were inevitably stolen. When Democrats passed a massive infrastructure bill in 2021, it included a provision requiring auto manufacturers to install a “kill switch” that would either disable the car automatically if it detected erratic driving or allow a third party to disable it remotely.

If safety at any cost is the goal, why shouldn’t every car be equipped with an ignition lock that requires the driver to pass a breathalyzer before starting the engine? Or with a camera so police can monitor the goings-on inside the car as it is being driven?

And our money isn’t safe, either. The Treasury Department is currently debating whether to create a central bank digital currency, or CBDC, which is effectively a “digital dollar” — completely trackable and revocable by bureaucratic fiat. Unlike with cash, if the Treasury secretary thinks you’re spending too much money on whiskey or gambling or entertaining amorous partners (or worse yet, actually wasting your money), he or she could set limits to curtail your spending habits. And if you are accused of a crime, the federal government could wipe out your savings completely with the press of a button.

And, of course, there is the possibility that the government could build a giant DNA database so everyone is instantly identifiable and trackable. For decades, New Jersey has kept DNA samples from every child born in the state (through the blood drawn from the heel pricks given to newborns) without parents’ informed consent. So the state now has DNA samples from millions of its residents who were unaware their genetic blueprint was being held in a state database. (New Jersey isn’t alone: Lawsuits against such databases were filed in Texas, Minnesota, and Michigan, where the samples were actually sold to for-profit companies for research.)

But perhaps the biggest battle in the liberty-vs.-safety cage match is found in the rapid development of facial-recognition technology. In 2023, just about every American carries a 4K camera in their pocket; if algorithms keep developing at this pace, soon a camera will be able to tell you everything about a person at whom it is aimed.

This, naturally, is different from the facial-recognition software that allows you to, say, unlock your phone or access your bank-account app. When facial-recognition AI is able to immediately compare a live image to billions of images online and make a positive identification, it will be a gold mine for stalkers who see a girl and want to know her name and where she lives. Or a business that wants to know all of your buying habits before you even walk up to the cash register.

Want to start a new life in a town where nobody knows that you, say, recorded some adult videos in your younger days? Too bad, facial recognition will make it impossible to escape. Are you a right-leaning reporter who wants to attend a meeting of local Democrats to see what they’re planning? Sorry, facial-screening software will connect your face to your social-media accounts, and the organizers can keep you out based on what you’ve written and perhaps the accounts you follow. (Reportedly, Ukrainian soldiers used facial-recognition software to identify dead Russian soldiers on the battlefield. Those photos were then sent to the families of those soldiers in order to persuade them to oppose the war.)

Still, you may say a robust facial-recognition algorithm is valuable; it has, after all, aided police in capturing criminals spotted on surveillance cameras and pedophiles whose faces end up in photos.

But facial-recognition software is too monumental a power to trust government with. The government, after all, is made up of actual people, who can use AI software to stalk or harass political opponents. If you think Donald Trump should have access to facial-recognition software, imagine how you would feel if Elizabeth Warren had the same power. (Such software is, for example, used in China to identify political dissidents walking down the street — and it has gotten so precise, it can find people who are wearing sunglasses and masks.)

Further, there is no way such powerful software would remain quarantined within government, especially when it is being developed by tech entrepreneurs. In her excellent book Your Face Belongs to Us: A Secretive Startup’s Quest to End Privacy as We Know It, the New York Times reporter Kashmir Hill describes the development of Clearview AI, a shady facial-recognition company that initially used its software to keep liberal activists from infiltrating right-wing political events. (One of Clearview’s initial creators was Chuck Johnson, an alt-right Holocaust denier who was once the guest of Congressman Matt Gaetz at the State of the Union. These cretins would have first pass at finding out everything about you.)

The problem is, Americans have grown perfectly comfortable being surveilled at all times. Cameras on street corners and in businesses record our every move. Colleges encourage students to inform on one another in case someone in a private conversation says something that annoys or offends the listener. People happily post details of their personal lives, such as when they are on vacation or what they are spending money on, hoping it gets them more attention, not less. The curve is bending toward less privacy, not more.

A recent survey taken by the Cato Institute (my other employer) found that 30 percent of Gen Zers (people under the age of 30) support allowing the government to install video cameras in people’s homes to “reduce domestic violence, abuse, and other illegal activity.”

Do any of the surveillance possibilities mentioned earlier in this column make me sound a touch hysterical? Well, the young people are fine with them. In fact, as evidenced by the above poll result, a good number of them think these surveillance techniques might not be intrusive enough.

The whole issue of technology watching over us like the eyes of Doctor T. J. Eckleburg in The Great Gatsby has libertarians and other small-government types in a bit of a pickle. On the one hand, we support all the innovation and progress brought to us by the free market. Freedom is the surest way to prosperity for all.

But what if that entrepreneurship also leads to the loss of our freedom to move about, or to make private transactions without government snooping? What if freedom strangles us?

Nearly a century ago, Supreme Court justice Louis Brandeis, who was instrumental in carving out a “right to privacy,” said the “greatest dangers to liberty lurk in insidious encroachment by men of zeal, well-meaning without understanding.”

A technological surveillance state could be a much safer one. It may be a large reason why you rarely hear of serial killers anymore — DNA and tracking devices make it almost impossible to roam the country, Ted Bundy–style, and get away with it.

But the trade-off would be far, far worse. A society that picks safety over privacy will end up with neither, and the effects will be irreversible.
