Michigan — the cradle of the American labor movement — is about to become a “right to work” state. And it’s something of a return to America’s economic roots.
Before the New Deal, nearly every state was a “right to work” state. The law assumed that competent parties were free to make employment contracts on any terms they found mutually advantageous. Employer and employee were formally equal before the law, under the doctrine of “employment at will”: either party could terminate the contract for any reason whatsoever; employers could not force employees to work, nor could employees compel employers to retain them. The state would not interfere as individuals bargained over wages, hours, and working conditions.
The free labor market was an important factor in the rapid economic growth of the United States after the Civil War. Critics, reformers, and radicals in the 19th century denounced the whole philosophy of employment at will, arguing that the formal equality of employer and employee was a sham, and that the overwhelming power of corporations permitted them to impose “wage slavery” on their workers. Roscoe Pound, the dean of Harvard Law School and one of the fathers of progressive jurisprudence, averred that it was absurd for judges to pretend that a billion-dollar corporation and a penniless immigrant really bargained about the terms and conditions of employment “as if [they] were farmers haggling over the sale of a horse.”
Labor unions were among the more controversial and legally contentious ways in which workers tried to reform the industrial labor-relations system. Simply put, unions were voluntary associations of workers that tried to use their members’ aggregate power to bargain with employers. Legally, they were perfectly free to do so. The problems came when unions attempted to compel employers to bargain with them, since employers had the right to refuse to recognize unions, and indeed to fire employees who joined unions.
The next step for union members who faced recalcitrant employers was to begin a “strike”: to withhold their labor in an effort to bring economic pressure on their employer. In most cases, workers were perfectly free to do so; they could not be compelled to work. But in most cases, industrial workers could also be easily replaced, and business would continue as usual. Thus, unionists adopted tactics such as picketing and boycotting.
Employers responded by hiring professional strikebreaking firms, using detectives and spies, and blacklisting union organizers. Strikes had a tendency to degenerate into violence, with threats and assaults used against the “scabs” and “finks” who would replace striking workers. Near industrial warfare accompanied the Great Railroad Strike of 1877, the Homestead Strike of 1892, and the Pullman Strike of 1894.
Public opinion usually turned against the unionists when violence broke out, at which point the power of the state, and courts in particular, broke most strikes. Employers were able to get court orders to prohibit strikers from interfering with employers and non-striking workers who wanted to continue the business. With one exception, every injunction was issued after strikers had engaged in violent or intimidating activity.
The American Federation of Labor demanded that courts refrain from issuing injunctions in labor disputes. Unions also demanded exemption from the antitrust laws. Congress enacted the Sherman Antitrust Act in 1890, which outlawed every “combination in restraint of trade.” While Congress’s principal concern was monopolistic producers, labor unions were every bit as monopolistic as their employers. Indeed, the antitrust laws were more effective against union conspiracies, because unions engaged in more overtly cartel-like behavior, and because union combinations could rarely be shown, like many producer combinations, to have any public or consumer benefit. So even in 1911, when the Supreme Court announced the “rule of reason,” applying the Sherman Act only to harmful combinations, unions remained in violation of the law.
Given the antitrust laws, what unions sought was immunity for what would otherwise be criminal activity. The AFL found the Democratic Party more receptive to its demands, and began to support that party. But even when unionists got the chance to amend the antitrust laws, all they managed to pass was the Clayton Act of 1914, a law so watered down as to change nothing.
Eventually, the New Deal brought unions the privileges they sought. The 1935 National Labor Relations Act (the “Wagner Act”) is widely seen as the key to union power. But the real breakthrough had come under Herbert Hoover, in the Norris–La Guardia Act of 1932, which strengthened the Clayton Act’s language exempting unions from the antitrust laws and protecting them from injunctions. The Wagner Act went further, compelling employers to bargain exclusively with whomever a majority of their employees chose. The resulting contracts could include requirements that companies hire only union members (the “closed shop”), or that they force new employees to join the union (the “union shop”).
The Supreme Court was the last obstacle; almost everyone assumed it would declare the Wagner Act unconstitutional. But the Court backed down after President Franklin D. Roosevelt threatened to “pack” it, and his new appointees interpreted the law in a pro-union fashion. In 1940 the Court held that even “a lawless invasion of [a] plant and destruction of property by force and violence of the most brutal and wanton character, under [union] leadership and direction” did not amount to a conspiracy in restraint of trade. Two years later, the Court held that federal labor law did not forbid the New York City Teamsters to hijack trucks entering the city or extort a day’s union wage from such trucks as a toll, calling it “traditional union activity.” When Congress amended the law to prohibit such mayhem, the Court effectively nullified the adjustments.
Such union tactics, and the strike wave that followed World War II, led to demands that Congress limit unions’ power. The 1947 Taft-Hartley Act preserved the fundamentals of the Wagner Act (majority and compulsory unionism), but did allow states to outlaw the “union shop” — the requirement that employees join the union or pay dues as a condition of employment. Most states in the South and West exercised this option and became “right to work” states.
Since then, new investment and jobs have been migrating from union-shop to right-to-work states — or abroad to countries with lower labor costs. Michigan alone lost 600,000 union jobs between 2000 and 2010. The number of private-sector union members has declined by half since 1973, and their percentage of the work force is about where it was before the New Deal. Michigan’s move is, in a way, a return to that freer status quo.
— Paul Moreno is the director of academic programs at Hillsdale College’s Kirby Center for Constitutional Studies and Citizenship.