The Corner

Yes, Teach for America Works

A new study, released yesterday, confirms something unsurprising: Teach for America, a highly competitive program that hires enthusiastic college graduates, gives them minimal training, and sends them to work as teachers for a couple of years in low-income school districts, produces better outcomes than students would otherwise get from comparable teachers. Specifically, a Department of Education study of about 4,500 students at 45 schools, conducted between 2009 and 2011, found that students who have a TFA teacher for one year do so much better on standardized math tests at the end of the year that it’s as if they had spent about two and a half months more in the classroom.

This isn’t quite news: The consensus for a while has been that TFA, which has been around for a couple decades now, helps students, especially in math achievement, but there has been plenty of criticism from the indefatigable education-labor establishment. One of their objections has been that in some of the flattering evaluations, TFA teachers aren’t being compared with experienced teachers, who’d outdo the rookies easily, or teachers who’ve gone through proper teacher training. But this new study examines those issues with about the best social-science methods you can get — randomized trials — and it turns out TFA-ers trump both experienced teachers and traditionally certified ones:

TFA teachers do better than both traditionally trained teachers and those from alternative programs, addressing one of the skeptics’ arguments against previous evidence of the program’s success. They also outperformed experienced teachers (defined by this study as those with more than three years of experience), by the same margin they beat the overall competition. That isn’t to say experience doesn’t matter: looking across all the teachers at which factors mattered, the study found that after five years of experience, each additional year of teaching was correlated with a 0.005-standard-deviation increase in student achievement, which isn’t nothing, but it pales in comparison to the TFA advantage, which is, hypothetically, equivalent to about 14 years of experience (there were no benefits from experience before the five-year mark). TFA’s advantage, by the way, was even bigger in high school than in middle school, amounting to 0.13 standard deviations, or almost twice as large as the overall figure.
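For a rough sense of where that 14-year equivalence comes from, here is a back-of-the-envelope sketch using the overall TFA advantage of roughly 0.07 standard deviations discussed below; the study does not necessarily present the calculation this way:

\[
\frac{0.07\ \text{SD (TFA advantage)}}{0.005\ \text{SD per additional year of experience}} = 14\ \text{years of experience}
\]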

Why, exactly, are TFA teachers better than traditional teachers? The study actually has basically no answer for that (you can read the discussion starting on page 101 of the study). It looks at which characteristics of teachers predict success among their students (a messier business than the TFA comparison, it should be noted) and doesn’t find much to write home about. Attending a selective college, taking more college math, scoring better on professional math exams, and getting additional teacher training all showed no sign of improving effectiveness (being enrolled in teacher training in a given year of teaching actually had a noticeable negative effect, maybe unsurprisingly). TFA-ers differed from normal teachers in a number of ways, but the researchers didn’t find any particular differences that explain TFA’s greater effectiveness. In fact, since TFA-ers have a lot less experience than the comparison teachers, and experience helps, the study suggests TFA teachers should actually have done 0.028 standard deviations worse than the competition, rather than 0.07 better. So why are TFA teachers that much better than normal ones, and what does this say about education policy more generally? We still don’t know, but we can be pretty much positive the program is helping thousands of students across the country, no matter what teachers’ unions say.

Dana Goldstein, an education writer for The Atlantic, argues we need a little more perspective on what these numbers mean: The researchers themselves translate them into months of progress that sound pretty impressive, but Goldstein says the TFA advantage “represents a relatively modest improvement in student achievement,” amounting to the average student shifting up three percentiles, from, say, the 27th percentile to the 30th. That might be a long way from proficiency for some, but I’m not sure it’s so modest: these kinds of improvements are hard to find, and moving the average student up three percentiles across a whole population seems remarkable.
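To see roughly where that three-percentile figure comes from, here is a quick conversion sketch. It assumes test scores are approximately normally distributed and uses the overall TFA effect of about 0.07 standard deviations (both assumptions of this sketch, not details Goldstein spells out): a student at the 27th percentile sits about 0.61 standard deviations below the mean, and nudging her score up by 0.07 standard deviations lands her at roughly the 29th or 30th percentile:

\[
\Phi^{-1}(0.27) \approx -0.61, \qquad \Phi(-0.61 + 0.07) = \Phi(-0.54) \approx 0.29,
\]

where \(\Phi\) is the standard normal cumulative distribution function.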

The study also examined another program, Teaching Fellows, which runs something like TFA but takes professionals looking to switch careers rather than fresh college graduates. The results there were a lot less impressive, but not actually bad: Teaching Fellows significantly outperformed inexperienced normal teachers but somewhat underperformed the veteran comparison teachers.

One of the major issues in social science is always controlling for other variables: How can we be sure the difference between kids’ performances comes from whether they had a TFA teacher or not, rather than from some other difference? Recent TFA studies have managed to defuse that issue by randomly sorting students into classrooms, rather than trying to adjust for differences after the fact. It would, of course, be great if we had more opportunities, or even more controlled ones, to look at what works. Wouldn’t it be nice if someone wrote a book arguing that?

Patrick Brennan was a senior communications official at the Department of Health and Human Services during the Trump administration and is former opinion editor of National Review Online.