From time to time, after I’ve expressed my view on a given topic, a reader will write with a singular denunciation. “Good luck running on that platform,” he will scoff, and then he will make a dismissive allusion to the decline of the American Whigs in the 19th century.
This criticism is an extraordinarily irritating one, carrying with it the dual presumptions that what the incumbent majority wants is the most important of questions and that, if one refers an issue to the transient desires of the crowd, one is thereby relieved of the responsibility to argue that issue on its merits, neither of which is true.
Whether or not my interlocutors are aware of it, reactions such as theirs reflect the increasing fetishization of democracy for its own sake and, by extension, the perilous elevation of procedural-democratic activities — such as picking candidates, casting ballots, and subjecting views to majority scrutiny — above fundamental liberal-democratic attributes such as protecting individuals from the whims of the crowd and ensuring that plebiscites are constrained in their effect by a predictable timetable and a stable rule of law.
The differences between these two ideas are pronounced. In American life, we vote for almost everything: legislators, judges, commissioners — in some parts of Texas, citizens even elect the person in charge of weights and measures. And yet, although we are happy to accept the results of our elections, we do not regard them as the end of the matter. In a pure representative democracy, our politicians would be accorded almost free rein, their power tempered only by the understanding that they will be removed if they push their luck. In the United States, by contrast, we demand hard limitations. Consider how inappropriate it sounds to suggest that being elected affords one carte blanche. Electoral tampering? “Of course he can cancel the election: A majority wants him to.” Or, perhaps, how odd it feels to hear established individual rights being subjected to the democratic test: A murderer deserves a fair trial in a court of law, with a good chance of getting off on a technicality? “Good luck getting that past the people.”
Rightly, most of us would balk at such objections. And yet, for some reason, this does not temper our ardor for the vote. On the contrary: It is nowadays common to hear it claimed that the opportunity to cast ballots represents the most important of a person’s fundamental rights — the franchise serving as a barometer of civic equality and individual safety. This, I’d propose, is a significant misunderstanding of what it is that has made Anglo-American society so great and so powerful. Democracy is an important component of liberty and of civil society, certainly. But it is just one component — a tool, really. And, as with any tool, it can be used for good and used for ill. Slavery, remember, was abolished not by democratic means, but in part by executive order in the midst of a bloody war, and in part by a guarantee of manumission that was successfully inserted into the Constitution only because those who objected the most strenuously had been excluded from the process. Segregation, likewise, was destroyed by the judgment of a court that privileged the counter-majoritarian words of the national charter above the ugly desires of local majorities, and by a president who was willing to send in men with guns to back up the decision.
Pace Lyndon Johnson, who posited famously that “a man without a vote is a man without protection,” I remain, as a non-citizen, as free as any man who is able to choose his representatives. Like others, I may speak in sharp and harsh terms without interference or censure from the state — and I do. I may own firearms for my defense and carry them with me should I so wish — and I do. I may expect to secure my person, house, papers, and effects against unreasonable searches and seizures — and I do. With no meaningful exceptions, I enjoy equal protection under the law and the right to due process should I be accused of having violated that law. Given how hard it would be to repeal these safeguards, it seems fair for me to conclude that I may do these things not primarily because others vote to permit me to, but because the questions have been deliberately set outside of the standard democratic process and engraved deep into the highest law in the land: the Constitution.
That Constitution, thank goodness, is a largely proscriptive document. For all the misplaced and saccharine focus on the first three words, “We the people,” all of the most necessary and beautiful elements of the thing are anti-democratic in nature — or, at least, anti-majoritarian. The Bill of Rights strictly prohibits the state from limiting essential individual liberties, whatever the zeitgeist may think of them or those who exercise them. The Reconstruction-era amendments make it abundantly clear that such protections apply to all citizens, regardless of their race. The inclusion of the Senate within the legislative branch permits the individual states that make up the federation to slow the will of a simple numerical national majority, thereby ensuring that what are constructed as semi-sovereign entities are not relegated to being mere departments. Changes in personnel are strictly governed, to avoid the possibility of a government’s using its power to declare itself a dictatorship. And, although it has taken a beating in the last 70 years, the structure of the document does still forbid the national government to exercise powers that it does not possess, and prohibits each branch from expanding beyond its remit. Thus does the system restrict those who are elected to the halls of power to the confines of their job descriptions. Certainly, the Constitution leaves a role for democratic input and policy experimentation within its firm shell, presupposes sensibly that with too little democracy there will be no feedback mechanism by which those who might abridge the rights of the people can be held to account, and provides a tortuous democratic mechanism for its own alteration. (Infuriatingly enough, it is in these few legitimate areas that progressives tend to want to deprive the people of their votes.) Nevertheless, its purpose is primarily prohibitory: That is, it is intended to act as a straitjacket on the ambitions of men and on the size of the state.
“Americans,” George Will suggests, “accept judicial supervision of their democracy — judicial review of popular but possibly unconstitutional statutes — because they know that if the Constitution is truly to constitute the nation, it must trump some majority preferences.” Or, put another way, because they understand that in a significant number of cases, liberty trumps the vote.
To look back at the convention that produced our current legal order — or, for that matter, at the wider debates that raged in America in the late 18th century — is to see a consistent focus not on what electoral system the new country should boast but on how individual freedom might be most effectively conserved. In Federalist No. 10, James Madison rejected the idea that the United States should become a pure democracy, judging caustically that “democracies have ever been spectacles of turbulence and contention; have ever been found incompatible with personal security or the rights of property; and have in general been as short in their lives as they have been violent in their deaths.” Alexis de Tocqueville, whose admiration of the scope of Americans’ political participation was checked by a fear of the “tyranny of the majority,” noted just under half a century later that it is “an impious and an execrable maxim that, politically speaking, a people has a right to do whatsoever it pleases.” That the franchise had grown so quickly, Tocqueville thought a remarkable, praiseworthy thing. But that the will of this newly active bloc might come into conflict with the rights of the minority worried him.
The concern was well founded. Whether power lies with a monarch, an aristocracy, or the people at large, that there is a tension between the individual and the state has been understood for centuries. It must not be forgotten now that we have transferred more and more authority to a majority of the people. “You can,” Roger Scruton has observed, “have liberty without democracy, but not democracy without liberty: Such is the lesson of European history.” To illustrate his point, one need look no further than seminal British constitutional documents such as the Magna Carta, the 1628 Petition of Right, and the 1689 Bill of Rights — all of which had nothing to say about the franchise and a great deal to say apropos the structure of the state and the rights that could not be taken away from individuals by the king or by anybody else besides. Among the protections included in the Magna Carta were the right of all people to own and inherit property, limitations on the scale of taxation, restrictions on state interference with religious institutions, and an embryonic framework for what we would now regard as due process. For its part, the 1689 Bill of Rights, which, along with the English common law, served as the basis for much of the American Revolution’s restorative radicalism, included guarantees of freedom of speech and of the right to keep and bear arms, confirmed that no excessive bail or “cruel and unusual” punishments might be imposed upon those accused of crime, and established in law that pre-conviction “grants and promises of fines or forfeitures” were verboten. These bulwarks against despotism had little to do with the onset of elections. Britons would not be uniformly included in the democratic process until the Representation of the People Act of 1918.
This confusion of process with substance leads us to some strange places. Pro-voting outfits such as “Rock the Vote” and AIGA’s “Get Out the Vote” presuppose that low public participation in a country with democratic input mechanisms is a problem in and of itself — an indictment, perhaps, of an unhealthy political culture. The New York Times’s Charles Blow goes one further, arguing acidly that “voter apathy is a civic abdication.” In his post-midterm press conference, President Obama took this instinct to its extreme, taking the utterly extraordinary step of attempting to divine the intentions of the two-thirds of the country that did not vote at all. Expressing concern that so many had stayed at home, the president first informed the reticent that he had “heard” their silence and then appeared to interpret that reluctance as a form of quasi-supportive criticism. It seems that we are hooked on participation — whether the people participate or not.
One might ask, “Why?” The United States was established on the principle that just power is derived from the consent of the governed. Can an unwillingness to involve oneself in public affairs not be interpreted as a sign of contentment with the status quo — qui tacet consentire, and all that — or, at least, as an indication that one is happy to watch from afar as things play out? Is it not virtuous, too, for Americans who have no interest in matters political to stay away from the realm? The more that the state does, the harder it can be to avoid that realm, certainly. But, as far as remedies go, it is just as rational to propose that the size of government be reduced as to attempt to prompt into action those who have no idea what they’re doing.
Increasingly, modern American “democracy” represents less a mixed system in which the public is asked for its input into the questions that are up for debate, and more a holistic cult-of-the-collective in which the state is accorded a say in almost all public and private decisions and in which, having handed over their autonomy and their treasure, citizens are asked to vote for the person they believe will manage their lives and property in the manner most agreeable to them. Those among us who are more excited by our own autonomy than by the chance to engage in this process might begin to ask aloud whether we would be better served by having less to do with the affairs of our neighbors and more to do with our own. Who’s with me — can I get a show of hands?
— Charles Cooke is a staff writer at National Review. This piece is adapted from an article that appeared in the November 17, 2014, issue of NR.