Politics & Policy

Failure to Measure Up

The U.N.'s abuse of science.

At the end of September, The Lancet, a British medical journal, published papers demonstrating the United Nations' misuse of scientific information on child mortality, and especially on malaria. It is pleasing that the Lancet has exposed this misuse, because such exposure is a rare event; the misuse of data is not. More alarming, however, is that the U.S. media chose not to report on this significant abuse. One wonders whether data manipulation by the U.S. government, on, say, climate change, would likewise go unreported.

Yesterday the great and the good of the malaria world descended on Seattle for a Gates Foundation-sponsored malaria summit, and the World Bank and UNICEF (the U.N.'s Children's Fund) released malaria-control progress reports. Those experienced at wading through aid-agency documents will know that they contain little useful data. The latest reports are no exception. To be sure, each agency has some good work to showcase, but the pages and pages of high-gloss photos are merely a distraction. These agencies have little new information to report and have made meager progress in measuring changes in malaria cases and deaths.

No measurement, no failure

As recently as 2005, the international health community was not particularly keen on the idea of providing useful measurements of its performance. It didn't have a vast amount of funding to spend, and it hadn't tried very hard to measure whether the money it had spent was having much of an impact. A positive summary of its accomplishments would be as follows: it largely got on with the job of running health experiments, it supported local health systems, and, in a limited and piecemeal fashion, it tried to reduce disease. Operational research was an afterthought.

Stung by criticism from a few academics and conservative voices in Congress about its inability to demonstrate performance, and creditably supported within a few of the multilateral and bilateral aid agencies (notably the President's Malaria Initiative), the international health community has pushed for better measurement of performance. The focus has grown more intense in part because the United States government has increased funding, both bilateral and multilateral, from under $2 billion a year to over $6 billion a year in the recent past.

Some of the efforts have undoubtedly worked, as an Africa Fighting Malaria report on malaria in Uganda points out. But international health agencies still have a problem. Since they haven't bothered to measure much in the past, there is little compiled data with which to show off recent success. Thus, the latest reports largely rely on 2005 data; indeed, time and again they fall back on the phrase "data are not yet available." It is therefore understandable that agencies want to grab hold of any useful data, massage them to maximum advantage, and then aggressively promote themselves. And that is what UNICEF, in particular, has just done. But it has overstepped the bounds of acceptable behavior.

Chris Murray is a respected health economist at the University of Washington's Institute for Health Metrics and Evaluation. He leads a team assessing child mortality, working in collaboration with the World Health Organization (WHO), Harvard University, and the University of Queensland. The Lancet editorial states that Murray's team "report[s] disappointing progress in efforts to reduce child mortality. Although work to accelerate child survival has been scaled up in recent years, it is too soon to be sure of its success." Murray and his fellow researchers sent UNICEF a copy of the report expecting some strong discussion and potential disagreement about the data. It is widely known that UNICEF and other agencies have been concerned about 'competition' in performance monitoring; in other words, agencies don't like having independent researchers undermine their nicely spun figures. UNICEF apparently urged the Lancet not to publish Murray's report. Fortunately, the Lancet did not heed its urging. Here is what the Lancet says:

In December each year UNICEF publishes its State of the World's Children report. That publication regularly carries with it an estimate of global child mortality. But on Sept 10, six days after we informed UNICEF of the publication date of the paper by Murray and colleagues, and in a break with its usual practice, UNICEF contacted selected journalists about "a major public health success". For the first time UNICEF strongly publicised its claim that annual under-5 child deaths had fallen below 10 million.

Several journalists were puzzled. The sudden UNICEF contact was unexpected. It was unusually dissociated from UNICEF’s annual report. There were no detailed data for journalists to examine in order to interpret UNICEF’s claim. UNICEF denies that it released the positive 9.7 million figure to pre-empt the more critical tone of the paper by Murray and colleagues.

UNICEF received a lot of positive publicity from its statement that its programs were effective, but it appears that the data are not what they seem. The alleged success of its programs is a mirage.

Cleaning up their act

First, the good news. Agencies are desperate to show success because they know that donors, especially the U.S. government, are watching, and they know that the old approach of letting non-measurement pass for success is over. This means that performance should improve. However, there is bad news too. If U.N. agencies get away with such blatant disregard for scientific process, is it safe to rely on any of the data they publish (on climate change, for example)? Also, agency leaders will have to fund programs in which they can show success, such as specific disease reductions in malaria and HIV. At a certain stage, this becomes problematic for poor countries, which often need help across entire health systems. In these circumstances, focusing on a single disease in order to measure progress can skew priorities. It is up to commentators (myself included) to remember this when we demand success from aid-agency programs. We don't want malaria rates to drop only to discover that TB or another respiratory disease has increased.

Hopefully, media attention to (if not coverage of) the Lancet reports will force improved performance at global health agencies. For exposing this abuse, the Lancet deserves enormous credit.

On a final note, it is important to recognize that The Lancet isn't perfect either. Its newer sister journal, Lancet Infectious Diseases, recently published an article on DDT in which the authors disparaged conservative groups, a U.S. senator, and the group Africa Fighting Malaria (which I founded), claiming they advocated DDT as a panacea for malaria, as though all conservatives were simplistically promoting only one technology. What is most worrying is that the article contained not a single reference to support its assertions about the groups it disparaged, nor any references to support the authors' own pet solution to malaria, from which they may benefit financially. Such unsubstantiated bias means that the Lancet needs to be watched too.

The take-home message surely must be that more time and money should be spent measuring and evaluating progress, and that the data collected can and should be analyzed independently. Just as press freedom shines a bright light on politicians and their questionable behavior, so should a bright and enduring light be shined on public-health programs and the way they spend taxpayers' money.

Roger Bate is a resident fellow at the American Enterprise Institute. He cofounded Africa Fighting Malaria in 1999 and today serves as a board member.
