Market Forces, Not Ideology, Drive Colleges to Cut American-History Requirements

Bust of Washington on the GWU campus (Photo: Tea/Dreamstime)
George Washington University made a pragmatic decision. Charges of political correctness are off base.

The history department at my alma mater, George Washington University (from which, full disclosure, I received my B.A. and Ph.D.), recently implemented a new curriculum. As part of the reform, the requirement that GW history majors take courses in American history has been dropped. The change was received coolly by some on the right. Over the festive season, the new requirements became the subject of several negative commentaries, including one by Ian Tuttle in these pages and one at PJ Media, as well as a critical segment on Fox News and stories on right-leaning websites dedicated to academic news.

This criticism is unfounded. GW’s history department had legitimate reasons for adopting the new regulations — reasons that many observers of American higher education, including those on the right, have long supported.

Under the previous standards, majors had to take at least two upper-division courses each on American history, European history, and the rest of the world. Under the new requirements, they don’t have to take any classes in American or European history, not even at the introductory level. They can take the introductory survey on American or European history if they choose, but they can take the one on world history instead. Students can skip the surveys entirely by receiving a high-enough score on the equivalent Advanced Placement exam in high school. At the upper level, they must now take at least one course in three of seven areas: Europe, North America, Asia, Africa, the Middle East, Latin America, and theory and methods. In addition, for their capstone, majors can opt to do a digital project instead of the typical thesis.

Careful examination of the new curriculum reveals that majors can pass through GW’s history department studying almost nothing but European or U.S. history, but attention has dwelled on the news that they no longer have to take such classes. National Review’s Ian Tuttle portrayed the change as a consequence of elite American universities’ post–World War II shift of focus from America to the world. According to Tuttle, for the faculty and administrators at such institutions, “the nation-state has had its day.” Universities are more interested in improving the world than in educating citizens. Yet only by first being grounded in their own history can students obtain a genuine sense of their place in the world. By eliminating any obligation to study that history, GW’s new history curriculum is, in Tuttle’s words, emblematic of “how the elite American university is failing its students.”

Tyler O’Neil of PJ Media indicted a familiar culprit for the alteration: political correctness. “This move seems to fit with the trend of rejecting the study of Western heritage as somehow oppressive and close-minded,” he contended. The decision to no longer require American history courses is just the latest manifestation of the campus Left’s campaign against Western civilization and dead white men. The radio host Larry Elder made this argument as fatuously as possible, protesting against “indoctrination” and leftist professors who teach that American history is nothing but a record of “oppression.” Elder even accused Hillary Clinton and Bernie Sanders of pushing for free college so more young Americans can be reprogrammed to become progressive drones.

Contra O’Neil, it’s not “kowtow[ing]” to attempt to “make history more popular.” As he himself noted, the number of GW history majors declined from 153 in 2011 to 83 in 2016 — a drop of nearly half in five years. When funding is tied to the number of students enrolled per class, it’s imperative for any department to recruit more bodies to fill more seats. Yes, the new curriculum will “better reflect a globalizing world” by allowing students to concentrate their studies on regions outside North America and Europe. (This merely codifies a now quarter-century-long transformation in the teaching and study of history.) But these changes were driven by “enrollment pressures.” Simply put, GW’s history department changed its requirements because society demanded it.

Statements by Katrin Schultheiss, chairwoman of the GW history department, explaining the changes prove it. “Whatever they [students] want to do, there’s a way to make the history department work for them,” she told GW’s student newspaper, the Hatchet. “Work for them.” These are the words not of an academic but of a consultant or administrator. Young Americans (and their moms and dads) have been encouraged to approach the college experience as consumers, to spend their money on what they like and not on what they don’t. When a product loses market share, you try to make it more attractive. GW’s history major lost market share. Solution? Make it more attractive. Auf Wiedersehen, foreign-language requirement, dumped to entice wary students to reconsider Clio’s wares.

Why did Clio lose customers? That’s the real question that must be addressed. The answer, to put it bluntly, is that they were urged to stay away by the same people now decrying GW’s new history requirements. I don’t mean Ian Tuttle or Tyler O’Neil personally. I mean, rather, the writers, pundits, commentators, even political figures they represent. The country is presently in the midst of a protracted debate about the purpose and role of higher education. One of the most hotly contested salients in this battle is the one between the STEM (science, technology, engineering, and mathematics) fields and the humanities.

President Obama deprecated art history’s employment potential. Mitt Romney did the same for English. In one of the early GOP presidential debates last cycle, Marco Rubio avowed that welders earn more than philosophers. Despite the increasing emphasis on career preparation, employers continue to find students unprepared for the jobs they’ll fill. A vociferous smattering of heretics and dissidents notwithstanding, an orthodoxy has taken shape and become entrenched: Liberal-arts degrees are useless and their funding should be reduced in favor of the STEM fields that will permit students to find gainful employment after graduating.

Such is the message American teenagers have received from parents, politicians, pedagogues, the media, their peers, from culture and society. They’ve heard it loud and clear, and heeded it. The population of history majors began declining in 2013, a slide that has only become steeper since. The trend isn’t confined to history, either. The number of English majors has also plummeted. Indeed, between 2012 and 2014, the number of bachelor’s degrees awarded in the liberal arts fell a precipitous 8.7 percent. The proportion of B.A.s granted in core humanities disciplines in 2014 was 6.1 percent, the lowest figure since collection of such data began in 1948.

The drop in humanities enrollments is not confined to one or two departments. It is a secular decline. But what else, given current attitudes about liberal-arts degrees, did anyone expect? Students are told that the purpose of attending university is career preparation. College exists to impart the knowledge and skills that will meet the demands of the 21st-century economy. Art history, philosophy, French literature, and their humanistic brethren have been written off as antiquated and obsolete. To study them is to declare to prospective employers that so are you. At the wizened old age of 22.

In these circumstances the burden is on those who pursue the liberal arts to prove to the world that they have what it takes to compete in the marketplace. And here the onus shifts to humanities departments, which must convince prospective majors that they won’t fall behind their peers who majored in STEM fields. Hence new major requirements that let students concentrate on geographic areas outside the West or even on thematic issues that transcend geographic boundaries. To prove their mastery, students can dispense with the traditional written thesis in favor of the sort of digital project they might routinely produce in their 21st-century workplace.

GW’s history department shouldn’t be decried for trying to make history more popular. It should be congratulated for the attempt and for proving that it can be competitive and adapt to the needs of contemporary society. Digital humanities is a growing field of history. There is probably greater demand for what students can learn from it than for what they’ll learn from more traditional forms of study. Humanities departments have been told forever to stop living in the past. Of course GW changed its history requirements. In this climate it would have been malpractice not to.

GW’s history department is neither placating the forces of political correctness nor dumbing down its standards nor letting down its students. Rather, it is bringing its curriculum in line with the latest developments in the discipline (and the requirements of most other history departments) and positioning itself so that its graduates will be on an equal footing with those of other programs when they enter the real world. One can’t but wonder if this controversy would have flared up at all if George Washington University were still called “Columbian College,” as it was at its founding. A school named after the Father of the Country no longer requiring the study of American history makes for a much juicier story than Columbian College’s doing the same.

GW was established to educate citizens of the new nation in the arts and sciences, and it will continue to do so. It isn’t betraying its heritage. Yet if it had, the question to ask would have been not how but why it felt obliged to do so. GW’s history department altered its requirements for history majors because, frankly, we insisted it alter them. We told humanities programs that they were outdated, and we told students that if they undertook liberal-arts degrees they’d fall behind in the race for a better job, a better income, a better life. Get with the times! society proclaimed. Pupil and professor both listened. If, then, we want our children to study American history, perhaps we should stop telling them they’ll be bad citizens if they do.
