Lancet surveys of Iraq War casualties
The Lancet, one of the oldest scientific medical journals in the world, published two peer-reviewed studies on the effect of the 2003 invasion of Iraq and subsequent occupation on the Iraqi mortality rate. The first was published in 2004; the second (by many of the same authors) in 2006. The studies estimate the number of excess deaths caused by the occupation, both direct (combatants plus non-combatants) and indirect (due to increased lawlessness, degraded infrastructure, poor healthcare, etc.).
The first survey,[1] published on 29 October 2004, estimated 98,000 excess Iraqi deaths (95% confidence interval (CI) 8,000 to 194,000) from the 2003 invasion and subsequent occupation of Iraq up to that time, corresponding to a death rate about 50% higher than before the invasion. The authors described this as a conservative estimate, because it excluded the extreme statistical outlier data from Fallujah. If the Fallujah cluster were included, the estimate rose to about 2.5 times the pre-invasion mortality rate (95% CI 1.6 to 4.2), i.e. 150% above it.
The second survey,[2][3][4] published on 11 October 2006, estimated 654,965 excess deaths related to the war, or 2.5% of the population, through the end of June 2006. The new study applied similar methods and involved surveys between May 20 and July 10, 2006.[4] More households were surveyed, allowing for a 95% confidence interval of 392,979 to 942,636 excess Iraqi deaths. 601,027 deaths (95% CI 426,369 to 793,663) were due to violence. Of these, 31% (186,318) were attributed to the US-led Coalition, 24% (144,246) to others, and 46% (276,472) to unknown perpetrators. The causes of violent death were gunshot (56% or 336,575), car bomb (13% or 78,133), other explosion/ordnance (14%), air strike (13% or 78,133), accident (2% or 12,020), and unknown (2%).
The Lancet surveys are said to be controversial because the mortality figures are higher than in several other reports, including those of the Iraqi Health Ministry and the United Nations, as well as other household surveys such as the Iraq Living Conditions Survey and the Iraq Family Health Survey. The 2007 ORB survey of Iraq War casualties estimated more deaths than the Lancet, though it covered a longer period of the conflict.[5][6] It has also been argued that the controversy results from an incompatibility between the survey results and the comparatively positive image various media outlets have of the invasion of Iraq.[7][8] The Lancet surveys have triggered criticism and disbelief from some journalists, governments, the Iraq Body Count project, some epidemiologists and statisticians and others, but have also been supported by some journalists, governments, epidemiologists and statisticians.[9]
First study (2004)
The survey was sponsored by the Center for International Emergency Disaster and Refugee Studies, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, United States (authors L Roberts PhD, G Burnham MD) and the Department of Community Medicine, College of Medicine, Al-Mustansiriya University, Baghdad, Iraq. Roberts' team was chosen for its experience in estimating total mortality in war zones, for example his estimate of 1.7 million deaths due to the war in the Congo,[10] which not only met with widespread acceptance and no challenge when published in 2000,[11] but was cited in, and helped prompt, a U.N. Security Council resolution that all foreign armies must leave Congo, a United Nations request for $140 million in aid, and a US State Department pledge of an additional $10 million in aid. Similar studies have been accepted uncritically as estimates of wartime mortality in Darfur[12] and Bosnia.
Roberts' usual technique is to estimate total mortality through in-person surveys of a sample of households in the area under study. This method is chosen to avoid the under-counting inherent in relying only on reported deaths in areas so chaotic that many deaths go unreported, and to capture deaths not directly attributable to violence but nevertheless caused by the conflict through indirect means, such as contamination of the water supply or unavailability of medical care. The baseline mortality rate calculated from the interviewees' reports for the period before the conflict is subtracted from the rate reported during the conflict, to estimate the excess mortality attributable, directly or indirectly, to the conflict. This technique was accepted without criticism in the earlier mortality surveys discussed above.
Because of the impracticality of carrying out an evenly distributed survey, particularly during a war, Roberts' surveys use "cluster sampling", dividing the area into a number of randomly selected, approximately equally populated regions; a random point is chosen within each region, and a fixed number of the households closest to that point are surveyed as a "cluster". While not as accurate as an evenly distributed survey of the same number of households, this technique is more accurate than merely surveying one household for each selected point.
In his study of Iraq, Roberts divided the country into 33 regions, attempting to sample 30 households for each cluster, and selecting 988 households, with 7868 residents. In September 2004, each surveyed household was interviewed about household composition, births, and deaths since January, 2002. Of 78 households where members were asked to show documentation to confirm their claims after the interview was finished, 63 were able to present death certificates. According to the authors, 5 (0.5%) of the 988 households that were randomly chosen to be surveyed refused to be interviewed.
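The selection step described above can be illustrated with a small simulation. The sketch below is purely illustrative: the household coordinates are invented and the uniform unit-square regions are a simplification (the actual study selected clusters within governorates in proportion to population); only the counts of 33 clusters and 30 households per cluster come from the study.

```python
import math
import random

# Toy illustration of the cluster-selection step described above. The household
# coordinates are invented; only the cluster counts mirror the 2004 design.

random.seed(0)

def make_region(n_households=1000):
    """A pretend region: households scattered over a unit square."""
    return [(random.random(), random.random()) for _ in range(n_households)]

REGIONS = 33          # clusters in the 2004 design
CLUSTER_SIZE = 30     # households interviewed per cluster

clusters = []
for households in (make_region() for _ in range(REGIONS)):
    start = (random.random(), random.random())                        # random starting point in the region
    by_distance = sorted(households, key=lambda h: math.dist(h, start))
    clusters.append(by_distance[:CLUSTER_SIZE])                       # interview the 30 closest households

print(len(clusters), "clusters,", sum(map(len, clusters)), "households surveyed")
# -> 33 clusters, 990 households (the published survey reports 988)
```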
The relative risk of death due to the 2003 invasion and occupation was estimated by comparing mortality in the 17.8 months after the invasion with the 14.6 months preceding it. The authors stated, "Making conservative assumptions, we think that about 100,000 excess deaths, or more have happened since the 2003 invasion of Iraq." Among such "conservative assumptions" was the exclusion of the Fallujah data from many of the findings. Because interpretation of the results would have been complicated by the inclusion of an outlier cluster in Fallujah, where heavy fighting caused far more casualties than elsewhere in Iraq, the study focused mainly on results that excluded that cluster. The authors argued that including the Fallujah cluster could be justified as a normal part of the sampling strategy (noting that other "hotspots" such as Najaf had not ended up being surveyed), and in some cases they presented two sets of results, one with and one without the Fallujah data; nevertheless the article, and most press coverage of it, stressed the figures that excluded Fallujah.
The main debate in the media in the U.S. and UK focused on whether 98,000 (95% CI 8,000–194,000) more Iraqis died as a result of coalition intervention, a figure calculated from the study's estimate that mortality had risen to 1.5 times (95% CI 1.1–2.3) the prewar rate (excluding the Fallujah data). Had the Fallujah sample been included, the estimate that mortality rates had increased about 2.5 times since the invasion (95% CI 1.6–4.2) would have implied an excess of about 298,000 deaths (95% CI ?-?), with some 200,000 of them concentrated in the 3% of Iraq around Fallujah (Roberts et al. p. 5).
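The arithmetic that turns a relative-risk estimate into an excess-death figure can be sketched roughly as follows. The pre-war baseline (about 5 deaths per 1,000 per year, discussed later in this article) and the population figure of roughly 24.4 million are approximations supplied here for illustration, not the authors' exact inputs, so the sketch reproduces only the order of magnitude of the published figures.

```python
# Rough reconstruction of how a relative-risk estimate converts into excess deaths.
# The baseline rate and population below are approximations, not the study's exact inputs.

PREWAR_RATE = 5.0 / 1000      # deaths per person per year (approximate pre-war baseline)
POPULATION  = 24_400_000      # assumed population of Iraq at the time
PERIOD_YRS  = 17.8 / 12       # post-invasion observation window

def excess_deaths(relative_risk):
    excess_rate = PREWAR_RATE * (relative_risk - 1.0)   # extra deaths per person per year
    return excess_rate * POPULATION * PERIOD_YRS

print(round(excess_deaths(1.5)))   # ~90,000: same order as the published 98,000 (Fallujah excluded)
print(round(excess_deaths(2.5)))   # ~271,000: same order as the ~298,000 with Fallujah included
```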
According to the article, violence was responsible for most of the excess deaths whether or not the Fallujah data were excluded, and coalition airstrikes would be the main cause of those violent deaths if the Fallujah data were included. The study draws the controversial conclusions that "Violent deaths were widespread, reported in 15 of 33 clusters, and were mainly attributed to coalition forces" and that "Violence accounted for most of the excess deaths and air strikes from coalition forces accounted for most violent deaths." The study estimates that the risk of death specifically from violence was approximately 58 times higher in the period after the invasion than in the period before the war (95% CI 8.1–419); even the lower bound of that interval corresponds to an 8.1-fold increase in the risk of violent death. Newsday reported:
- "The most common causes of death before the invasion of Iraq were heart attacks, strokes and other chronic diseases. However, after the invasion, violence was recorded as the primary cause of death and was mainly attributed to coalition forces—with about 95 percent of those deaths caused by bombs or fire from helicopter gunships".
It was noted that the large estimate of excess deaths is all the more striking in view of the widely held belief that mortality in Iraq was already very high before the invasion, at about 0.5% per year, particularly among children, which many attributed to the UN sanctions against Iraq.[13]
Criticisms and countercriticisms
Some criticisms have focused on the relatively broad 95% confidence intervals (CI95), which result from the modest sample size and the difficulty of gathering reliable data in a war zone.[14]
Lila Guterman, after writing a long article[15] in January 2005 in The Chronicle of Higher Education, wrote a short article in the Columbia Journalism Review that stated: "I called about ten biostatisticians and mortality experts. Not one of them took issue with the study’s methods or its conclusions. If anything, the scientists told me, the authors had been cautious in their estimates. With a quick call to a statistician, reporters would have found that the probability forms a bell curve — the likelihood is very small that the number of deaths fell at either extreme of the range. It was very likely to fall near the middle."[16]
A Ministerial Statement by the UK government, written on 17 November 2004, stated that "the Government does not accept its [the study's] central conclusion", because the findings were apparently inconsistent with figures published by the Iraqi Ministry of Health, based on data collected by hospitals, which said that "between 5 April 2004 and 5 October 2004, 3,853 civilians were killed and 15,517 were injured".[17]
Some critics have said that the Lancet study authors were unable to visit certain randomly selected sample areas. In an interview on the radio program "This American Life", however, the authors of the study said that they never substituted different, more accessible areas, and that every place randomly selected at the beginning of the study was surveyed in full, despite the risk of death to the surveyors.[18]
Critics of the Lancet study have pointed out other difficulties in obtaining accurate statistics in a war zone. The authors of the study readily acknowledge this point and note the problems in the paper; for example they state that "there can be a dramatic clustering of deaths in wars where many die from bombings". They also said that the data their projections were based on were of "limited precision" because the quality of the information depended on the accuracy of the household interviews used for the study.[19][20]
The results of the study were politically sensitive, since a heavy death toll could raise questions about the humanitarian justifications for the war on the eve of a contested US presidential election. Critics objected to the timing of the report, claiming it was hastily prepared and published, despite what they perceived as its poor quality, in order to sway the U.S. electorate. On this topic, Les Roberts stated, "I emailed it in on Sept. 30 under the condition that it came out before the election. My motive in doing that was not to skew the election. My motive was that if this came out during the campaign, both candidates would be forced to pledge to protect civilian lives in Iraq. I was opposed to the war and I still think that the war was a bad idea, but I think that our science has transcended our perspectives."[19][20] He replied to criticism from Professor John Allen Paulos of the Temple University mathematics department, who alleged "an expedient rush to publish", with the following:
- Dear Dr. Paulos,
- I read your note below with some sadness. FYI, there was a rush to publish as I have said in every major interview I have given.
- A) I have done over 20 mortality surveys in recent years and have never taken more than a week to produce and release a report (because people dying is important) until this article. Thus, this was the least rushed mortality result I have ever produced.
- B) We finished the survey on 20 September. If this had not come out until mid-November or later, in the politicized lens of Baghdad (where the chief of police does not allow his name to be made public and where all the newly trained Iraqi soldiers I saw had bandannas to hide their faces to avoid their families being murdered…) this would have been seen as the researchers covering up for the Bush White House until after the election and I am convinced my Iraqi co-investigators would have been killed. Given that Kerry and Bush had the same attitude about invading and similar plans for how to proceed, I never thought it would influence the election and the investigators never discussed it with each other or briefed any political player.
- C) if you have information about how and why people in New Orleans were dying today, would you rush to release it? The Falluja downfall happened just one week after the study came out and whether you believe the 500 or the 1600 or the 3600 estimates of associated Iraqi deaths, that alone was probably more than will occur from this moment on due to Katrina.
- So, we rushed to get it out, I do not understand why the ‘study's scientific neutrality’ is influenced or the likelihood that the sample was valid, the analysis fair… What does neutrality mean? Do people who publish about malaria deaths need to be neutral about malaria?
- Yours in confusion and disgust,
- Les Roberts[21]
For his part, Roberts views critics of his study as motivated more by politics than by science: "It is odd that the logic of epidemiology embraced by the press every day regarding new drugs or health risks somehow changes when the mechanism of death is their armed forces."[22]
Lancet publications related to criticisms
- November 20, 2004. Criticism and suggestions by peer reviewer Professor Sheila Bird, MRC Biostatistics Unit, Cambridge, UK, chair of the Royal Statistical Society's Working Party on Performance Monitoring in the Public Services. Calls the scientific method "generally well described and readily repeatable", but says "[p]articular attention is needed to the methodology for randomly selecting the location(s) of cluster(s) within governorates. Roberts and colleagues describe this rather too succinctly". Suggests that additional information be included so that more precise multipliers (used to obtain the final estimate) can be applied, and discusses a hypothetical example, involving airstrike deaths and collateral damage, in which over-counting could occur because of differences in population density among the areas the clusters represent.[23]
- March 26, 2005. Criticism by Stephen Apfelroth, Department of Pathology, Albert Einstein College of Medicine. Criticizes "several questionable sampling techniques that should have been more thoroughly examined before publication" and lists several flaws, including a "fatal" one, that "In such a situation, multiple random sample points are required within each geographic region, not one per 739000 individuals."[24]
- March 26, 2005. Response by Les Roberts et al. to Apfelroth. Acknowledges flaws, but says "the key public-health findings of this study are robust despite this imprecision. These findings include: a higher death rate after the invasion; a 58-fold increase in death from violence, making it the main cause of death; and most violent deaths being caused by air-strikes from Coalition Forces. Whether the true death toll is 90000 or 150000, these three findings give ample guidance towards understanding what must happen to reduce civilian deaths. ... Before publication, the article was critically reviewed by many leading authorities in statistics and public health and their suggestions were incorporated into the paper. The death toll estimated by our study is indeed imprecise, and those interested in international law and historical records should not be content with our study. We encourage Apfelroth and others to improve on our efforts. In the interim, we feel this study, as well as the only other published sample survey we know of on the subject, point to violence from the Coalition Forces as the main cause of death and remind us that the number of Iraqi deaths is certainly many times higher than reported by passive surveillance methods or in press accounts."[25]
Other responses to criticism
The Chronicle of Higher Education also published an article discussing the difference between the survey's reception in the popular press and its reception in the scientific community.[15]
Epidemiologist Klim McPherson writes in the March 12, 2005 British Medical Journal:[26] "The government rejected this survey and its estimates as unreliable; in part absurdly because statistical extrapolation from samples was thought invalid. Imprecise they are, but to a known extent. These are unique estimates from a dispassionate survey conducted in the most dangerous of epidemiological conditions. Hence the estimates, as far as they can go, are unlikely to be biased, even allowing for the reinstatement of Falluja. To confuse imprecision with bias is unjustified."
Second study (2006)
A second study by some of the same authors was published in October, 2006, in The Lancet.[2][27][28]
- "We estimate that between March 18, 2003, and June, 2006, an additional 654,965 (392,979–942,636) Iraqis have died above what would have been expected on the basis of the pre-invasion crude mortality rate as a consequence of the coalition invasion. Of these deaths, we estimate that 601,027 (426,369–793,663) were due to violence."[2]
If accurate, these figures would imply the deaths of an average of roughly 500 people per day, or about 2.5% of Iraq's population over the period.[29]
An October 11, 2006 Washington Post article[4] reports:
- "The survey was conducted between May 20 and July 10 [2006] by eight Iraqi physicians organized through Mustansiriya University in Baghdad. They visited 1,849 randomly selected households that had an average of seven members each. One person in each household was asked about deaths in the 14 months before the invasion and in the period after. The interviewers asked for death certificates 87 percent of the time; when they did, more than 90 percent of households produced certificates."
Lancet:[2] "Only 47 of the sought 50 clusters were included in this analysis. On two occasions, miscommunication resulted in clusters not being visited in Muthanna and Dahuk, and instead being included in other Governorates. In Wassit, insecurity caused the team to choose the next nearest population area, in accordance with the study protocol. Later it was discovered that this second site was actually across the boundary in Baghdad Governorate. These three misattributed clusters were therefore excluded, leaving a final sample of 1849 households in 47 randomly selected clusters."
The Lancet authors based their calculations on an overall, post-invasion, excess mortality rate of 7.8/1000/year. "Pre-invasion mortality rates were 5.5 per 1000 people per year (95% CI 4.3–7.1), compared with 13.3 per 1000 people per year (10.9–16.1) in the 40 months post-invasion."[2] See Table 3 in the Lancet article.[2] The population number used in the calculation is reported in the Lancet supplement:[3] "Mortality projections were applied to the 2004 mid-year population estimates (26,112,353) of the surveyed areas (which exclude the governorates of Muthanna and Dahuk, which had been omitted through misattribution) to establish the mortality projections."
Of 629 deaths verified and recorded among a sample of 1,849 households incorporating some 12,801 people at the time of the survey, 13% took place in the 14 months before the invasion and 87% in the 40 months afterwards. "The study population at the beginning of the recall period (January 1, 2002) was calculated to be 11 956, and a total of 1474 births and 629 deaths were reported during the study period."[2]
The study concluded that the mortality rate per 1,000 population per year was 5.5 in the pre-invasion period (95% CI 4.3–7.1) and 13.3 in the post-invasion period (95% CI 10.9–16.1). The excess mortality rate over the pre-invasion period was therefore 7.8 per 1,000 population per year, with violent death accounting for 92% of the increase.
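The headline total can be roughly checked from the quantities quoted above: the 7.8 per 1,000 per year excess rate, the 26,112,353 population figure given in the study supplement, and the 40-month post-invasion window. The single-rate sketch below overshoots the published 654,965 somewhat, presumably because the authors built their estimate from period-specific rates and population figures rather than one overall rate.

```python
# Back-of-the-envelope check of the 2006 headline estimate using a single overall
# excess rate; the published figure was assembled from period-specific rates.

EXCESS_RATE = 7.8 / 1000      # excess deaths per person per year (post- minus pre-invasion)
POPULATION  = 26_112_353      # mid-2004 population of the surveyed areas (study supplement)
PERIOD_YRS  = 40 / 12         # March 2003 to June 2006

print(round(EXCESS_RATE * POPULATION * PERIOD_YRS))   # ~679,000, versus the published 654,965
```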
Washington Post:[4] "Gunshot wounds caused 56 percent of violent deaths, with car bombs and other explosions causing 14 percent, according to the survey results. Of the violent deaths that occurred after the invasion, 31 percent were caused by coalition forces or airstrikes, the respondents said."
The study results show an increasing mortality rate throughout the post-invasion periods, with the excess mortality rate for June 2005-June 2006 of 14.2 (95% CI, 8.6-21.5) being nearly 5.5 times the excess mortality rate for March 2003-April 2004 of 2.6 (95% CI, 0.6-4.7). The 2006 study also provides an estimate for the 18-month period following the invasion (March 2003 through September 2004) of 112,000 deaths (95% CI, 69,000-155,000). The authors conclude, "Thus, the data presented here validates our 2004 study, which conservatively estimated an excess mortality of nearly 100,000 as of September, 2004."
The authors described the fact that their estimate is over ten times higher than other estimates, such as the Iraq Body Count project (IBC) estimate and U.S. Department of Defence estimates, as "not unexpected", stating that this is a common occurrence in conflict situations. They stated, "Aside from Bosnia, we can find no conflict situation where passive surveillance recorded more than 20% of the deaths measured by population-based methods. In several outbreaks, disease and death recorded by facility-based methods underestimated events by a factor of ten or more when compared with population-based estimates. Between 1960 and 1990, newspaper accounts of political deaths in Guatemala correctly reported over 50% of deaths in years of low violence but less than 5% in years of highest violence."[2]
Official reactions
An October 12, 2006 San Francisco Chronicle article[28] reported:
- "Six hundred thousand or whatever they guessed at is just, it's not credible," Bush said, and he dismissed the methodology as "pretty well discredited." In December [2005], Bush estimated that 30,000 Iraqis had died in the war. Asked at the news conference what he thinks the number is now, Bush said: "I stand by the figure a lot of innocent people have lost their life." At a separate Pentagon briefing, Gen. George Casey, the top U.S. commander in Iraq, said that the figure "seems way, way beyond any number that I have seen. I've not seen a number higher than 50,000. And so I don't give it that much credibility at all."
The UK government, too, rejected the researchers' conclusions. In doing so, it did not mention the advice of the Ministry of Defence's Chief Scientific Adviser, Sir Roy Anderson, who, in an internal memo dated 13 October 2006, shortly after the study was published, had called the study "robust" and its claimed methods "close to 'best practice' in this area, given the difficulties of data collection and verification in the present circumstances in Iraq".[29][30]
Criticisms
The Iraq Body Count project (IBC), which compiles a database of reported civilian deaths, has criticised the Lancet estimate of 601,000 violent deaths[31] out of the 654,965 total excess deaths related to the war. An October 2006 article by IBC argues that the Lancet estimate is suspect "because of a very different conclusion reached by another random household survey, the Iraq Living Conditions Survey 2004 (ILCS), using a comparable method but a considerably better-distributed and much larger sample." IBC also enumerates several "shocking implications" which would have to be true if the Lancet report were accurate, e.g. "Half a million death certificates were received by families which were never officially recorded as having been issued", and says that these "extreme and improbable implications" and the "utter failure of local or external agencies to notice and respond to a decimation of the adult male population in key urban areas" are among the reasons it doubts the study's estimates; IBC states that these consequences would constitute "extreme notions".[32] In a 2010 article, IBC went further, saying that the "hugely exaggerated death toll figures" of the 2006 Lancet report had "been comprehensively discredited" by recently published research.[33]
Jon Pedersen of the Fafo Institute[34] and research director for the ILCS survey, which estimated approximately 24,000 (95% CI 18,000-29,000) war-related deaths in Iraq up to April 2004, expressed reservations about the low pre-war mortality rate used in the Lancet study and about the ability of its authors to oversee the interviews properly as they were conducted throughout Iraq. Pedersen has been quoted saying he thinks the Lancet numbers are "high, and probably way too high. I would accept something in the vicinity of 100,000 but 600,000 is too much."[35]
Debarati Guha-Sapir, director of the Centre for Research on the Epidemiology of Disasters in Brussels, was quoted in an interview for Nature.com saying that Burnham's team have published "inflated" numbers that "discredit" the process of estimating death counts. "Why are they doing this?" she asks. "It's because of the elections.".[36] However, another interviewer a week later paints a more measured picture of her criticisms: "She has some methodological concerns about the paper, including the use of local people — who might have opposed the occupation — as interviewers. She also points out that the result does not fit with any she has recorded in 15 years of studying conflict zones. Even in Darfur, where armed groups have wiped out whole villages, she says that researchers have not recorded the 500 predominately [sic] violent deaths per day that the Johns Hopkins team estimates are occurring in Iraq. But overall Guha-Sapir says the paper contains the best data yet on the mortality rate in Iraq."[37] A subsequent article co-authored by Guha-Sapir and Olivier Degomme for CRED reviews the Lancet data in detail. It concludes that The Lancet overestimated deaths and that the war-related death toll was most likely to be around 125,000 for the period covered by the Lancet study, reaching its conclusions by correcting errors in the 2006 Lancet estimate and triangulating with data from IBC and ILCS.[38]
Beth Osborne Daponte, a demographer known for producing death estimates for the first Gulf War, evaluates the Lancet survey and other sources in a paper for the International Review of the Red Cross.[39] Among other criticisms, Daponte questions the reliability of pre-war estimates used in the Lancet study to derive its "excess deaths" estimate, and the ethical approval for the survey. She concludes that the most reliable information available to date is provided by the Iraq Family Health Survey, the Iraq Living Conditions Survey and Iraq Body Count.
Mark van der Laan, professor of biostatistics and statistics at UC Berkeley, disputes the estimates of both Lancet studies on several grounds in a paper co-authored with writer Leon de Winter.[40] The authors argue that the confidence intervals in the Lancet study are too narrow, saying, "our statistical analysis could at most conclude that the total number of violent deaths is more than 100.000 with a 0.95 confidence — but this takes not into account various other potential biases in the original data." Among the main conclusions of their evaluation are that "the estimates based upon these data are extremely unreliable and cannot stand a decent scientific evaluation. It may be that the number of violent deaths is much higher than previously reported, but this specific report, just like the October 2004 report, cannot support the estimates that have been flying around the world on October 29, 2006. It is not science. It is propaganda."
Fred Kaplan of Slate criticized the first Lancet study and has again raised concerns about the second.[41][42] Kaplan argues that the second study made some improvements over the first, such as "a larger sample, more fastidious attention to data-gathering procedures, a narrower range of uncertainty", and writes that "this methodology is entirely proper if the sample was truly representative of the entire population—i.e., as long as those households were really randomly selected." He cites the low pre-war mortality estimate and the "main street bias" critique as two reasons for doubting that the sample in this study was truly random, and concludes that the question of the war's human toll is "a question that the Lancet study doesn't really answer".
Dr. Madelyn Hsiao-Rei Hicks published a paper on the 2006 Lancet survey expressing concern about the rapid interviewing rate reported by the survey. Her paper concluded that, "In view of the significant questions that remain unanswered about the feasibility of their study’s methods as practiced at the level of field interviews, it is necessary that Burnham and his co-authors provide detailed, data-based evidence that all reported interviews were indeed carried out, and how this was done in a valid manner. In addition, they need to explain and to demonstrate to what degree their published methodology was adhered to or departed from across interviews, and to demonstrate convincingly that interviews were done in accordance with the standards of ethical research."[43]
Borzou Daragahi, Iraq correspondent for the Los Angeles Times, in an interview with PBS, questioned the study based on the newspaper's earlier research in Iraq, saying, "Well, we think—the Los Angeles Times thinks these numbers are too large, depending on the extensive research we've done. Earlier this year, around June, the report was published at least in June, but the reporting was done over weeks earlier. We went to morgues, cemeteries, hospitals, health officials, and we gathered as many statistics as we could on the actual dead bodies, and the number we came up with around June was about at least 50,000. And that kind of jibed with some of the news report that were out there, the accumulation of news reports, in terms of the numbers killed. The U.N. says that there's about 3,000 a month being killed; that also fits in with our numbers and with morgue numbers. This number of 600,000 or more killed since the beginning of the war, it's way off our charts."[44][45]
The October 2006 Lancet estimate also drew criticism from the Iraqi government. Government spokesman Ali Debbagh said, "This figure, which in reality has no basis, is exaggerated".[46] Iraq's Health Minister Ali al-Shemari gave a similar view in November 2006: "Since three and a half years, since the change of the Saddam regime, some people say we have 600,000 are killed. This is an exaggerated number. I think 150 is OK."[47]
A 2008 article in the National Journal revealed for the first time that the Lancet survey was funded in part by George Soros' Open Society Institute.[48] This led to some concerns regarding the objectivity of the survey, including erroneous claims in the media that Soros may have been directly involved in it. Outspoken survey critic Michael Spagat, economics professor at Royal Holloway, University of London, stated, "The authors should have disclosed the donation and for many people that would have been a disqualifying factor in terms of publishing the research."[49][50]
John Tirman, executive director of MIT's Center for International Studies, responded to the issue of Soros funding: "My center at MIT used internal funds to underwrite the survey. More than six months after the survey was commissioned, the Open Society Institute, the charitable foundation begun by Soros, provided a grant to support public education efforts of the issue. We used that to pay for some travel for lectures, a web site, and so on. OSI (Open Society Institute), much less Soros himself (who likely was not even aware of this small grant), had nothing to do with the origination, conduct, or results of the survey. The researchers and authors did not know OSI, among other donors, had contributed." [51]
A 2010 paper by Professor Michael Spagat entitled "Ethical and Data-Integrity Problems in the Second Lancet Survey of Mortality in Iraq" was published in the peer reviewed journal Defense & Peace Economics. This paper argues that there were several "ethical violations to the survey's respondents", faults the study authors for "non-disclosure of the survey's questionnaire, data-entry form, data matching anonymised interviewer identifications with households and sample design", and presents "evidence relating to data fabrication and falsification, which falls into nine broad categories." The paper concludes that the Lancet survey, "cannot be considered a reliable or valid contribution towards knowledge about the extent of mortality in Iraq since 2003."[52]
Two articles from August 2012 by Joel Wing, who runs the blog Musings on Iraq, explore many of the major flaws that have been raised by various researchers over the years. Wing notes that, "The two Lancet studies taken at face value seemed like legitimate estimates of the number of Iraqis that might have been killed after the 2003 invasion," but finds that, "There is evidence that the authors overestimated the number of post-invasion Iraq deaths, did not include data that contradicted their findings, and misrepresented other reports about casualties during the Iraq War to bolster their argument. The Iraqi survey teams might not have followed protocol and the methodology, and some may have faked their results. The responses of the Lancet writers to their critics have been a series of convenient and contradictory statements, and their refusal to openly share all of their data only adds to the suspicions that their work is deeply flawed." Wing concludes that other credible studies "should be consulted first, while the two Lancet papers should be dismissed."[53][54]
AAPOR investigation of the 2nd Lancet survey
On February 3, 2009, the Executive Council of the American Association for Public Opinion Research (AAPOR) announced that an 8-month investigation found the author of the 2006 Lancet survey, Dr. Gilbert Burnham, had violated the Association's Code of Professional Ethics & Practices for repeatedly refusing to disclose essential facts about his research. "Dr. Burnham provided only partial information and explicitly refused to provide complete information about the basic elements of his research," said Mary Losch, chair of the association’s Standards Committee.[55][56] AAPOR's President, Richard A. Kulka, added:
- "When researchers draw important conclusions and make public statements and arguments based on survey research data, then subsequently refuse to answer even basic questions about how their research was conducted, this violates the fundamental standards of science, seriously undermines open public debate on critical issues, and undermines the credibility of all survey and public opinion research. These concerns have been at the foundation of AAPOR’s standards and professional code throughout our history, and when these principles have clearly been violated, making the public aware of these violations is in integral part of our mission and values as a professional organization."[57]
AAPOR subsequently released a more detailed list of eight specific pieces of information Burnham failed to disclose after repeated requests. These include a copy of the survey questionnaire in all languages into which it was translated, the consent statement, information of sample selection methodology and a summary of the disposition of all sample cases.[58]
Neither Dr. Burnham nor the Johns Hopkins Bloomberg School of Public Health is a member of AAPOR. Tim Parsons, public affairs director of the Bloomberg School, wrote in an official statement that the school was "not in a position to comment" on AAPOR's findings because the school is not a member of the organization and "does not know what procedures or standards were followed in reaching the decision regarding this study." Parsons also noted that the school was nearing completion of its own investigation into the study.[59]
At least one article has been written criticizing AAPOR's decision to censure Burnham. Debora MacKenzie, writing in New Scientist, said, "There is no direct evidence that the latest attack on Burnham is politically motivated," but noted that AAPOR's stated purpose, "to ensure survey-based research meets high standards," has itself "been questioned by experts", whom MacKenzie does not name.[60]
According to New Scientist's investigation, Burnham has sent his data and methods to other researchers, who found them sufficient. A spokesman for the Bloomberg School of Public Health at Johns Hopkins, where Burnham works, says the school advised him not to send his data to AAPOR, as the group has no authority to judge the research. The "correct forum", it says, is the scientific literature.
According to MacKenzie, "Burnham's complete data, including details of households, is available to bona fide researchers on request." She further noted that the AAPOR's own journal, Public Opinion Quarterly, "published an analysis of Burnham's Iraq survey by David Marker of Westat, a consultancy in Maryland that designs surveys."[60]
The American Statistical Association has subsequently written in support of the actions taken by AAPOR, saying: "We are aware that, in taking this action, you have subjected yourselves to some criticism. On behalf of the American Statistical Association, we wish to recognize AAPOR for following procedure and acting professionally on such a difficult and divisive matter. In so doing, you eloquently express by your actions the goals stated in your Code."[61]
On February 1, 2010, the Bloomberg School and Dr. Burnham were named for the "STONEWALLING/COVERUP" award in iMediaEthics' 2010 Top Ten "Dubious Polling" Awards, based largely on the AAPOR censure. The authors, David W. Moore and George F. Bishop, write that Bloomberg and Burnham received the award "for stonewalling in the face of serious questions about a flawed survey project, which reported more than 600,000 Iraqi deaths from 2003 to 2006," saying, "AAPOR asked for the kind of information that any scientist doing this type of work should release ... The Bloomberg School will not attempt to evaluate what experts believe is almost certainly a faulty methodology, saying the scientific community should make the evaluation. But then the school advises Burnham not to release details about his methods, so the scientific community can’t have the information it needs for a definitive assessment. Sounds like a cop-out and a Catch 22, all rolled into one!"[62]
Johns Hopkins Investigation of the 2nd Lancet survey
In February 2009 the Johns Hopkins Bloomberg School of Public Health published the results of an internal review of the study.[63] The review found that researchers in the field used data collection forms that differed from those approved in the original protocol. The forms used in the field contained spaces for the names of respondents or householders, and many such names were collected, in violation of the protocol. The press release said the review found no evidence that any individual was harmed as a result of these violations, and that no identifiable information was ever out of the possession of the researchers. As a result of the investigation, Hopkins suspended Dr. Burnham’s privileges to serve as a principal investigator on projects involving human subjects research.
The press release also discussed an examination of all the original data collection forms:
- "An examination was conducted of all the original data collection forms, numbering over 1,800 forms, which included review by a translator. The original forms have the appearance of authenticity in variation of handwriting, language and manner of completion. The information contained on the forms was validated against the two numerical databases used in the study analyses. These numerical databases have been available to outside researchers and provided to them upon request since April 2007. Some minor, ordinary errors in transcription were detected, but they were not of variables that affected the study’s primary mortality analysis or causes of death. The review concluded that the data files used in the study accurately reflect the information collected on the original field surveys."
Number of clusters
Steven E. Moore, who conducted survey research in Iraq for the Coalition Provisional Authority and was an advisor to Paul Bremer for the International Republican Institute, ridiculed the Lancet study in an October 18, 2006 editorial in the Wall Street Journal. In a piece entitled, "655,000 War Dead? A bogus study on Iraq casualties", Moore wrote, "I wouldn't survey a junior high school, no less an entire country, using only 47 cluster points. Neither would anyone else..."[64]
Gilbert Burnham replied on October 20, 2006:
"Mr. Moore did not question our methodology, but rather the number of clusters we used to develop a representative sample. Our study used 47 randomly selected clusters of 40 households each. In his critique, Mr. Moore did not note that our survey sample included 12,801 people living in 47 clusters, which is the equivalent to a survey of 3,700 randomly selected individuals. As a comparison, a 3,700-person survey is nearly 3 times larger than the average U.S. political survey that reports a margin of error of +/-3%."[65]
Pre-invasion death rate
Fred Kaplan, writing for Slate, has criticized the pre-invasion death rate used in both the 2004 and 2006 Lancet surveys.
In an October 29, 2004 article in Slate he wrote:
"But there are two problems with this calculation. First, Daponte (who has studied Iraqi population figures for many years) questions the finding that prewar mortality was 5 deaths per 1,000. According to quite comprehensive data collected by the United Nations, Iraq's mortality rate from 1980–85 was 8.1 per 1,000. From 1985–90, the years leading up to the 1991 Gulf War, the rate declined to 6.8 per 1,000. After '91, the numbers are murkier, but clearly they went up. Whatever they were in 2002, they were almost certainly higher than 5 per 1,000."[14]
See also a related article about Beth Daponte.[66]
In an October 20, 2006 Slate article, Kaplan wrote that the pre-invasion death rate calculated by the 2006 Lancet report authors was also too low, which he said would cause the Lancet estimate of excess deaths since the invasion to be too high. He wrote:
"Based on the household surveys, the report estimates that, just before the war, Iraq's mortality rate was 5.5 per 1,000. (That is, for every 1,000 people, 5.5 die each year.) The results also show that, in the three and a half years since the war began, this rate has shot up to 13.3 per 1,000. So, the 'excess deaths' amount to 7.8 (13.3 minus 5.5) per 1,000. They extrapolate from this figure to reach their estimate of 655,000 deaths. However, according to data from the United Nations, based on surveys taken at the time, Iraq's preinvasion mortality rate was 10 per 1,000."[41]
In a November 20, 2006 Slate article, two of the Lancet study authors, Gilbert Burnham and Les Roberts, write:
"Kaplan claims that the rate was really 10, according to U.N. figures. He wrote, '[I]f Iraq's pre-invasion rate really was 5.5 per 1,000, it was lower than almost every country in the Middle East, and many countries in Europe.' This is just wrong! If Kaplan had checked the U.N. death-rate figures, most Middle Eastern nations really do have lower death rates than most European countries, and in fact have lower death rates than 5.5. Jordan's death rate is 4.2, Iran's 5.3, and Syria's 3.5. The reason for the lower rate is simple: Most Middle Eastern nations have much younger populations compared to most Western nations."[42]
An October 19, 2006 Washington Post article[35] reported:
- "In a telephone interview, Jon Pedersen, research director for the 2004 [UNDP] study, said several factors probably account for researchers' different findings. One key issue is how researchers extrapolate from the deaths identified in their field research to a death toll for the whole country. Pedersen noted that the Lancet study is based on a pre-invasion mortality rate of 5.5 deaths per thousand people [per year]. The U.N., he said, used the figure of 9 deaths per thousand. Extrapolating from the lower pre-invasion mortality rate would yield a greater increase in post-invasion deaths, he noted."
The U.N. "pre-invasion mortality rate" of 9 deaths/1,000/year mentioned above is higher than both the 2002 and 2003 mortality rates measured by the two Lancet studies.
Even though the 2004[1] and 2006[2][3] Lancet studies interviewed different sets of households across Iraq, they came up with the same 2002 pre-war mortality rate. From the 2006 Lancet article: "The striking similarity between the 2004 and 2006 estimates of pre-war mortality diminishes concerns about people’s ability to recall deaths accurately over a 4-year period."[2]
Here is an excerpt from the supplement[3] to the 2006 Lancet study:
- "For the purpose of analysis, the 40 months of survey data were divided into three equal periods—March 2003 to April 2004; May 2004 to May 2005, and June 2005 to June 2006. Following the invasion the death rate rose each year."
- "Pre-invasion: 5.5 deaths/1,000/year
- March 2003–April 2004: 7.5 deaths/1,000/year
- May 2004–May 2005: 10.9 deaths/1,000/year
- June 2005–June 2006: 19.8 deaths/1,000/year
- Overall post-invasion: 13.2 deaths/1,000/year"
The differences between the pre-invasion mortality rate and the mortality rates for each post-invasion period are the excess mortality rates for those periods. Table 3 in the Lancet article[2] lists those rates as 2.6, 5.6, and 14.2. That the excess mortality rate for June 2005 to June 2006 is listed as 14.2 rather than 14.3 (19.8 minus 5.5) is presumably a matter of rounding. The overall excess mortality rate for the whole post-invasion survey period is listed as 7.8 deaths/1,000/year in Table 3.
The difference between the Lancet and U.N. pre-invasion mortality rates is 3.5 deaths/1,000/year. The Lancet study used 26,112,353 (from the Lancet supplement[3]), i.e. roughly 26,112 thousand people, as the population of Iraq. A rate of 3.5 deaths/1,000/year in that population therefore corresponds to about 91,400 deaths per year (3.5 times 26,112 is roughly 91,400).
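The sensitivity of the excess-death total to the choice of baseline can be illustrated directly from these figures. The sketch below holds the study's overall post-invasion rate of 13.3 per 1,000 fixed and recomputes a 40-month excess total under the Lancet baseline of 5.5 and the higher U.N.-style baseline of 9 per 1,000. It is a simplified single-rate calculation, not the study's own period-by-period method.

```python
# How the choice of pre-invasion baseline shifts a 40-month excess-death total,
# using one overall post-invasion rate (a simplification of the study's method).

POST_RATE  = 13.3 / 1000      # overall post-invasion deaths per person per year (Lancet 2006)
POPULATION = 26_112_353       # population figure from the study supplement
PERIOD_YRS = 40 / 12          # March 2003 to June 2006

for baseline_per_1000 in (5.5, 9.0):
    excess = (POST_RATE - baseline_per_1000 / 1000) * POPULATION * PERIOD_YRS
    print(f"baseline {baseline_per_1000}/1,000/year -> ~{excess:,.0f} excess deaths")
# baseline 5.5 gives roughly 679,000; baseline 9.0 gives roughly 374,000
```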
The responses below by Lancet study co-author Les Roberts (LR) to two questions are from an October 31, 2006 MediaLens article.[67]
Question 9:
- 9. Lancet 2 found a pre-invasion death rate of 5.5 per 1,000 people per year. The UN has an estimate of 10? Isn't that evidence of inaccuracy in the study?
- LR: The last census in Iraq was a decade ago and I suspect the UN number is somewhat outdated. The death rate in Jordan and Syria is about 5. Thus, I suspect that our number is valid. ...
Question 10:
- 10. The pre-invasion death rate you found for Iraq was lower than for many rich countries. Is it credible that a poor country like Iraq would have a lower death rate than a rich country like Australia?
- LR: Yes. Jordan and Syria have death rates far below that of the UK because the population in the Middle-east is so young. Over half of the population in Iraq is under 18. Elderly populations in the West are a larger part of the population profile and they die at a much higher rate.
Infant and child death rates
In a March 5, 2007 article[68] in The Times, economist Michael Spagat says there is a perplexing finding in the 2006 Lancet report that child deaths have fallen.
A May 25, 2000 BBC article[69] reported that before Iraq sanctions were imposed by the UN in 1990, infant mortality had "fallen to 47 per 1,000 live births between 1984 and 1989. This compares to approximately 7 per 1,000 in the UK." The BBC article was reporting from a study of the London School of Hygiene & Tropical Medicine, titled "Sanctions and childhood mortality in Iraq", that was published in the May 2000 Lancet medical journal.[70][71]
The 2000 BBC article reported that after the UN sanctions were imposed after Iraq's 1990 invasion of Kuwait, "They found that in south and central Iraq, infant mortality had risen to 108 per 1,000 between 1994 and 1999, while child mortality — covering those between one and five years — rocketed from 56 to 131 per 1,000."
The 2000 BBC article also reported, "However, it found that infant and child mortality in the autonomous, mainly Kurd region in the North of the country, has actually fallen, perhaps reflecting the more favourable distribution of aid in that area."
UN-sponsored studies taken after 2003 revealed that the previous childhood mortality figures for South/Central Iraq (supplied by Saddam's government) were inflated by more than a factor of two and that the childhood mortality rate in those regions was even lower than the rate in northern Iraq.[72]
The UN sanctions ended on May 22, 2003 (with certain arms-related exceptions).[73]
40 houses surveyed per day
Madelyn Hicks, a psychiatrist and public health researcher at King's College London in the U.K., says she "simply cannot believe" the paper's claim that 40 consecutive houses were surveyed in a single day. "There is simply not enough time in the day," she says, "so I have to conclude that something else is going on for at least some of these interviews." Households may have been "prepared by someone, made ready for rapid reporting," she says, which "raises the issue of bias being introduced."[74]
An October 24, 2006 The Guardian article reports this response from Lancet study author Gilbert Burnham:
"Others had suggested that it was impossible for 40 households to be surveyed in one day — but in fact the researchers were split into two teams and conducted 20 household interviews each, he said."[75]
An October 30, 2006 BBC article reports this response from Lancet study author Les Roberts:
"In Iraq in 2004, the surveys took about twice as long and it usually took a two-person team about three hours to interview a 30-house cluster. I remember one rural cluster that took about six hours and we got back after dark. Nonetheless, Dr. Hicks' concerns are not valid as many days one team interviewed two clusters in 2004."[76]
Death certificates
Of the 1849 households that completed the survey there were reports of 629 deaths during the study period from January 1, 2002 through June 2006.[2]
The Lancet study claims that, "Survey teams asked for death certificates in 545 (87%) reported deaths and these were present in 501 cases. The pattern of deaths in households without death certificates was no different from those with certificates."[2]
Thus, death certificates were produced in 92% of the cases in which they were requested (501 of 545).
In an interview in April 2007 Lancet study author Les Roberts reported that, "90 percent of the people we interviewed had death certificates. We're quite sure they didn't make these deaths up."[77]
The Iraq Body Count project questioned the Lancet study's death certificate findings saying the Lancet study authors "would imply that officials in Iraq have issued approximately 550,000 death certificates for violent deaths (92% of 601,000). Yet in June 2006, the total figure of post-war violent deaths known to the Iraqi Ministry of Health (MoH), combined with the Baghdad morgue, was approximately 50,000."[78]
The August 2006 Basrah Governorate Assessment Report[79] of the United Nations High Commissioner for Refugees described death certificate procedures of the Ministry of Health (MoH) as follows:
- Death certificates, which are needed in order to obtain retirement benefits for a person’s surviving spouse or children, as well as for inheritance purposes, are issued by the MoH Births/Deaths Administrative Offices which are located in Public Hospitals. Death certificates are usually issued the same day. The following documents are required:
- Medical report;
- Civil ID card of the deceased person;
- Food ration card of the deceased person.
- The issuance of death certificates is free.
In a November 20, 2006 Slate article, two of the Lancet study authors, Gilbert Burnham and Les Roberts, write:
"In July [2006], for example, the Ministry of Health reported exactly zero violent deaths in Anbar Province, in spite of the contradictory evidence we saw on our televisions. Is that a surveillance network on which our understanding of what is going on in Iraq can depend?"[42]
In October 2006, Middle East historian Juan Cole supported the Lancet findings, noting that Iraqis often bury their dead on the same day and thus do not require a death certificate, and also may not report a death for fear of reprisals by militias:
"Although there are benefits to registering with the government for a death certificate, there are also disadvantages. Many families who have had someone killed believe that the government or the Americans were involved, and will have wanted to avoid drawing further attention to themselves by filling out state forms and giving their address."[80]
In a peer-reviewed paper on the Lancet survey, economist Michael Spagat examined the death certificate data. He noted that the very high reported rate of death certificates by the survey "implies that the official death certificate system has issued, but failed to record the issuance of, about 500,000 death certificates", and notes that the rate of confirmations claimed by the second survey is substantially higher than the rate found in the first survey, despite covering a longer period, and calculates the odds against this to be very high. Spagat further notes several "unlikely patterns in the confirmations of violent deaths through the viewing of death certificates and in the patterns of when death certificates were requested and when they were not requested." His analysis concludes that "there is likely fabrication in the death-certificate data" and that "these data do not give reliable support to [the Lancet survey's] very high estimated death rate."[52]
Main street bias
The research team of Professors Neil Johnson, Sean Gourley and J.P. Onnela of the physics department at Oxford University, Professor Michael Spagat of the economics department of Royal Holloway, University of London, and Professor Gesine Reinert of the statistics department at Oxford University, claimed the methodology of the study was fundamentally flawed by what they term "main street bias". They claimed the sampling methods used "will result in an over-estimation of the death toll in Iraq" because "by sampling only cross streets which are more accessible, you get an over-estimation of deaths."[75][81]
These researchers have published a detailed paper, "Conflict Mortality Surveys", discussing this bias and the Lancet study.[82]
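The logic of the claimed bias can be shown with a toy simulation. Every number below (the share of households near main streets, the baseline risk of violent death, and the assumed doubling of that risk near main streets) is invented for illustration and is unrelated to the critics' actual model; the point is only that a sample restricted to higher-risk streets mechanically inflates the estimated death rate.

```python
import random

# Toy simulation of "main street bias": if violent deaths are more likely near main
# streets and the sample only reaches households near main streets, the estimated
# rate is inflated. All parameters below are invented for illustration.

random.seed(1)
N = 100_000

households = []
for _ in range(N):
    near_main = random.random() < 0.3                  # assume 30% of households are near main streets
    risk = 0.010 * (2.0 if near_main else 1.0)         # assume violent-death risk doubles near main streets
    households.append((near_main, random.random() < risk))

true_rate = sum(died for _, died in households) / N
main_street_sample = [died for near, died in households if near]
biased_rate = sum(main_street_sample) / len(main_street_sample)

print(f"true violent-death rate:    {true_rate:.4f}")    # about 0.013 with these parameters
print(f"main-street-only estimate:  {biased_rate:.4f}")  # about 0.020, inflated by roughly half
```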
An October 24, 2006 The Guardian article reported this response from a Lancet study author:
"But Prof Burnham said the researchers penetrated much further into residential areas than was clear from the Lancet paper. The notion 'that we avoided back alleys was totally untrue'. He added that 28% of households were in rural areas — which matches the population spread."[75]
An article in Science magazine by John Bohannon describes some of the criticisms, as well as some responses from the Lancet report's lead author Gilbert Burnham. According to Bohannon and Johnson, the Lancet paper indicates that the survey team avoided small back alleys for safety reasons. But this could bias the data because deaths from car bombs, street-market explosions, and shootings from vehicles should be more likely on larger streets. Burnham counters that such streets were included and that the methods section of the published Lancet paper is oversimplified.[74]
Bohannon also alleged that Burnham told Science that he does not know exactly how the Iraqi team conducted its survey; the details about neighborhoods surveyed were destroyed "in case they fell into the wrong hands and could increase the risks to residents." These explanations have infuriated the study's critics. Michael Spagat, who specializes in civil conflicts, says the scientific community should call for an in-depth investigation into the researchers' procedures. "It is almost a crime to let it go unchallenged," adds Johnson.[74]
In a 24 November 2006 letter to Science, the authors of the Lancet report claimed that Bohannon misquoted Burnham, stating that "in no place does our Lancet paper say that the survey team avoided small back alleys", and that "The methods section of the paper was modified with the suggestions of peer reviewers and the editorial staff. At no time did Burnham describe it to Bohannon as 'oversimplified'."[83]
Bohannon defended his comments as accurate, citing Burnham saying, in response to questions about why details of selecting "residential streets that did not cross the main avenues", that "in trying to shorten the paper from its original very large size, this bit got chopped, unfortunately." In addition, the details which were destroyed refer to the "scraps" of paper on which streets and addresses were written to "randomly" choose households.[83] The data set is now being selectively released.[84]
The authors of the main street bias critique published a formal paper on this idea in the Journal of Peace Research.[85] This paper subsequently won the journal's 2008 Article of the Year award.[86] The jury states that the article "provides an important advance in the methodology for estimating the number of casualties in civil wars," and that, "the authors show convincingly that previous studies which are based on a cross-street cluster-sampling algorithm (CSSA) have significantly overestimated the number of casualties in Iraq."
The authors also published a follow-up paper in Europhysics Letters which provides a generic framework that can be used to assess sampling bias in certain social and biological systems.[87] A special case of the framework can be used to derive the results presented in their Journal of Peace Research paper. The authors also investigate the sensitivity of their results to the underlying model parameter values. They reiterate their view that a more precise determination of the model parameters, and hence of the extent of sampling bias, is possible only if the actual micro-level data of the Lancet study are released.
Criticism of graph
Figure 4 of the October 2006 Lancet survey of Iraq War mortality compared three mortality estimates over time. Two letters subsequently published in The Lancet challenged this graph.[88][89]
The stated purpose of the graph in the Lancet article is "monitoring trends over time"; it shows rising deaths in three different mortality estimates, with results from other studies tracking the results from the Lancet surveys. The accompanying text states that "the similar patterns of mortality over time documented in our survey and by other sources corroborate our findings about the trends in mortality over time." The graph shows the IBC and DoD data documenting the rise in cumulative deaths over time (plotted along the "Deaths" axis on the left), while the Lancet rates are plotted independently against the "Deaths per 1,000 per year" axis on the right.
A letter by Debarati Guha-Sapir, Olivier Degomme and Jon Pedersen argues: "Burnham and colleagues' figure 4, in which cumulated Iraq Body Count deaths parallel their study's mortality rates, is misleading. Rates cannot be compared with numbers, much less with cumulative numbers." A second letter by Josh Dougherty argues that the DoD figure is misrepresented: "Burnham and colleagues' assertion that the DoD 'estimated the civilian casualty rate at 117 deaths per day' is mistaken, as is their figure 4, which repeats this error in graphic form. These data refer to Iraqi civilians and security-force personnel, not just to civilians, and to casualties (ie, deaths or injuries), not just deaths."
The Lancet authors replied, "Josh Dougherty and Debarati Guha-Sapir and colleagues all point out that figure 4 of our report mixes rates and counts, creating a confusing image. We find this criticism valid and accept this as an error on our part. Moreover, Dougherty rightly points out that the data in the US Department of Defense source were casualties, not deaths alone... We wanted to show that the three sources all similarly pointed to an escalating conflict."
More responses to criticisms
In a Democracy Now! interview, study co-author Les Roberts defended the methodology by noting that it is the standard method for estimating mortality in poor countries, and that the same method was used by the US government following the wars in Kosovo and Afghanistan. Roberts added that the US government's SMART initiative spends millions of dollars per year teaching NGOs and UN workers how to use the same cluster method for estimating mortality rates.[91]
The article's authors defended their research, arguing that theirs was the only active survey of the death toll and that active surveying is more accurate than passively counting reported deaths.[27] They cited a number of factors that could lead to smaller figures from other sources, for example the Islamic requirement that bodies be buried within 24 hours of death, and claimed that the sources of bias in their own study push the figure down.
An October 11, 2006 Washington Post article[4] reports:
- Ronald Waldman, an epidemiologist at Columbia University who worked at the Centers for Disease Control and Prevention for many years, called the survey method "tried and true," and added that "this is the best estimate of mortality we have."
In a letter to The Age, published on 21 October 2006, 27 epidemiologists and health professionals defended the methods of the study, writing that the study's "methodology is sound and its conclusions should be taken seriously."[9]
A Reuters article reports on other researchers, epidemiologists, professors, and physicians who have defended the study. For example, the article quotes:
- "Over the last 25 years, this sort of methodology has been used more and more often, especially by relief agencies in times of emergency," said Dr. David Rush, a professor and epidemiologist at Tufts University in Boston.[92]
Sir Richard Peto, Professor of Medical Statistics and Epidemiology in the University of Oxford, described the 2006 report as "statistically valid" in an interview on BBC television.[93]
Dr. Ben Coghlan, an epidemiologist in Melbourne, Australia, wrote: "The US Congress should agree: in June this year [2006] they unanimously passed a bill outlining financial and political measures to promote relief, security and democracy in the Democratic Republic of Congo. The bill was based in part on the veracity of a survey conducted by the Burnet Institute (Melbourne) and the International Rescue Committee (New York) that found 3.9 million Congolese had perished because of the conflict. This survey used the same methodology as Burnham and his associates. It also passed the scrutiny of a UK parliamentary delegation and the European Union."[94] Burnham is one of the authors of both of the Lancet studies.
An October 19, 2006, Washington Post article[35] reports:
- "The numbers do add up," said Daniel Davies, a stockbroker and blogger for the Guardian. He argued that the sample of 1,849 households interviewed by Iraqi doctors working for the JHU research team was as large as that used by political pollsters.
An article on the British media-analysis website Media Lens, dated October 16, 2006, quotes many health experts, epidemiologists, biostatisticians, polling experts, and others who approve of the Lancet study and its methodology.[95] For example:
- John Zogby, whose New York-based polling agency, Zogby International, has done several surveys in Iraq since the war began, said: "The sampling is solid. The methodology is as good as it gets. It is what people in the statistics business do." ...
- Professor Sheila Bird of the Biostatistics Unit at the Medical Research Council said: "They have enhanced the precision this time around and it is the only scientifically based estimate that we have got where proper sampling has been done and where we get a proper measure of certainty about these results."
In a 31 October 2006 article on the Media Lens website, Lancet study co-author Les Roberts responded to several questions on the report, concluding that: "Of any high profile scientific report in recent history, ours might be the easiest to verify. If we are correct, in the morgues and graveyards of Iraq, most deaths during the occupation would have been due to violence. If Mr. Bush's '30,000 more or less' figure from last December is correct, less than 1 in 10 deaths has been from violence. Let us address the discomfort of Mr. Moore and millions of other Americans, not by uninformed speculation about epidemiological techniques, but by having the press travel the country and tell us how people are dying in Iraq."[67]
A review of a variety of mortality estimates for Iraq, published in 2008 by a group of scientists in the peer-reviewed journal Conflict and Health, concluded that the Lancet "studies provided the most rigorous methodology as their primary outcome was mortality."[96]
UNDP ILCS study compared to Lancet studies
UNDP ILCS stands for the 2004 United Nations Development Programme Iraq Living Conditions Survey.[97]
The Iraq Body Count project (IBC) records civilian deaths reported by English-language media, including all civilian deaths due to coalition military action, the insurgency or increased criminal violence.[98] The IBC site states: "it should be noted that many deaths will likely go unreported or unrecorded by officials and media."[99]
The IBC death count at the time the October 2006 Lancet study was released was between 43,546 and 48,343, or roughly 7% of the Lancet estimate. Besides the acknowledged IBC undercount due to its reliance on media reports, some of the difference between the Lancet and IBC estimates is explained by the fact that the Lancet study estimated all "excess" deaths from violent and nonviolent causes alike, and included combatants as well as civilians.
However, IBC believes some of it may also be explained by the Lancet having overestimated, citing the lower estimate from the UNDP's 2004 Iraq Living Conditions Survey (ILCS).
IBC tabulated several of what it calls "the main data that are relevant to a comparative assessment of" the ILCS study and the 2004 Lancet study. It points, for example, to a much larger number of clusters (2,200 for ILCS vs. 33 for Lancet) and a higher sampling rate (about 1 in 200 households for ILCS vs. roughly 1 in 3,000 for Lancet).[100] The 2006 Lancet study was somewhat larger than the first (it used 47 clusters instead of 33), though its sampling rate remained far below that of the ILCS. The 2004 Lancet study surveyed 988 households, the 2006 Lancet study surveyed 1,849 households, and the ILCS surveyed 22,000 households.
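The significance of the cluster counts quoted above can be shown with a minimal sketch: under a simple cluster-sampling model, the width of a confidence interval shrinks roughly with the square root of the number of clusters. The spread value below is an arbitrary placeholder, not an estimate from any of the surveys discussed.

```python
# A purely illustrative sketch of why the number of clusters matters in a
# cluster survey: the standard error of a cluster-level mean shrinks with
# the square root of the number of clusters (primary sampling units).
import math

def ci_halfwidth(n_clusters, sd_between_clusters):
    """Approximate 95% confidence-interval half-width for a mean estimated
    from n_clusters primary sampling units."""
    return 1.96 * sd_between_clusters / math.sqrt(n_clusters)

for clusters in (33, 47, 2200):   # 2004 Lancet, 2006 Lancet, ILCS
    print(f"{clusters:>5} clusters -> CI half-width ~ {ci_halfwidth(clusters, 1.0):.3f} "
          "(in units of the between-cluster standard deviation)")
```

In practice the comparison is more involved (cluster sizes, design effects, and within-cluster correlation all matter), so this is only a rough intuition for why the 2,200-cluster ILCS yields much tighter intervals than the 33- and 47-cluster Lancet surveys.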
The Lancet authors draw a different kind of comparison. Appendix C of the 2006 Lancet study supplement[3] says this about the ILCS study:
- "Working for the U.N. Development Program [UNDP], the highly regarded Norwegian researcher Jon Pederson led a survey that recorded between 18,000 and 29,000 violent deaths during the first year of occupation. The survey was not focused on deaths, but asked about them over the course of lengthy interviews that focused on access to services. While this was more than twice the rate recorded by IBC [Iraq Body Count project] at the time, Pederson expressed concern for the completeness and quality of the data in a newspaper interview last year. The surveys reported in The Lancet were focused solely on recording deaths and count about two and a half times as many excess deaths from all causes over the same period."
In an October 30, 2006, BBC article, Lancet study author Les Roberts compared the number of violent deaths found in the UNDP survey and in the two Lancet surveys through the first year after the invasion (to April 2004):
"This UNDP survey covered about 13 months after the invasion. Our first survey recorded almost twice as many violent deaths from the 13th to the 18th months after the invasion as it did during the first 12. The second survey found an excess rate of 2.6/1000/year over the same period corresponding to approximately 70,000 deaths by April 2004. Thus, the rates of violent death recorded in the two survey groups are not so divergent."[76]
In a Media Lens article,[67] Les Roberts discussed the UNDP ILCS (Jon Pedersen) method of recording deaths:
- "His group conducted interviews about living conditions, which averaged about 82 minutes, and recorded many things. Questions about deaths were asked, and if there were any, there were a couple of follow-up questions. A) I suspect that Jon's mortality estimate was not complete. ... Jon sent interviewers back after the survey was over to the same interviewed houses and asked just about <5 [less than 5] year old deaths. The same houses reported ~50% more deaths the second time around. In our surveys, we sent medical doctors who asked primarily about deaths. Thus, I think we got more complete reporting."
The ILCS asked about deaths during the course of a lengthy interview on the household's living conditions. In the three main ILCS documents (PDFs), all the information on war-related deaths appears in six paragraphs on page 54 of the analytical report.[97][101] It states:
- "The ILCS data has been derived from a question posed to households concerning missing and dead persons during the two years prior to the survey. Although the date was not asked for, it is reasonable to suppose that the vast majority of deaths due to warfare occurred after the beginning of 2003."
Iraq Body Count project compared to Lancet studies
Besides the comparisons made in various publications[31][67] and in previous sections here, further comparisons and criticisms of both studies appear in the relevant sections of the above-linked articles. In particular, see the "Undercounting" section of Casualties of the conflict in Iraq since 2003, which lists many examples of how the media, hospitals, morgues, government and others miss some of the deaths caused by the war.
ORB survey compared with Lancet studies
On September 14, 2007, ORB (Opinion Research Business), an independent UK-based polling agency, published an estimate of the total deaths of the Iraq war. The figure suggested by ORB, based on survey responses from 1,499 adults, was 1,220,580 deaths, with a margin of error of 2.5%. Although conducted independently and using a different polling methodology, this estimate is consistent with the Lancet findings once the additional 14 months covered by the ORB poll are taken into account.[5]
On 28 January 2008, ORB published an update based on additional work carried out in rural areas of Iraq. Some 600 additional interviews were undertaken, and as a result the death estimate was revised to 1,033,000, with a given range of 946,000 to 1,120,000.[102]
This ORB estimate came under criticism in a peer-reviewed paper, "Conflict Deaths in Iraq: A Methodological Critique of the ORB Survey Estimate", published in the journal Survey Research Methods. The paper "finds fundamental flaws in the data underpinning ORB's estimate" and concludes that the ORB data "are not suitable for deriving any credible estimate but, given proper scrutiny, it is clear that ORB has overestimated by a wide margin".[6][103]
Iraq Family Health Survey compared with Lancet studies
The "Iraq Family Health Survey" published in the New England Journal of Medicine surveyed 9,345 households across Iraq and estimated 151,000 deaths due to violence (95% uncertainty range, 104,000 to 223,000) over the same period covered in the second Lancet survey by Burnham et al.[104] The NEJM article stated that the second Lancet survey "considerably overestimated the number of violent deaths and said the Lancet results were, "highly improbable, given the internal and external consistency of the data and the much larger sample size and quality-control measures taken in the implementation of the IFHS."
The violent-death figures from this survey are lower than the Lancet estimate by a factor of roughly four. Despite the differences, Lancet co-author Les Roberts noted some underlying similarities, such as a doubling of the mortality rate after the invasion in the IFHS, compared with the 2.4-fold increase reported by the Lancet. Roberts estimated that the "excess death" toll implied by the IFHS survey would be about 400,000, which he says puts its figures in the same league as the Lancet's. According to Roberts, the discrepancy between the two studies arises because the Lancet attributes most of the post-war excess deaths to violence, whereas only about one-third of the excess deaths would be due to violence in the IFHS.[105] See: Iraq Family Health Survey#400,000.
The authors of the IFHS report have disputed this conclusion, saying, "The excess deaths reported by Burnham et al. included only 8.2% of deaths from nonviolent causes, so inclusion of these deaths will not increase the agreement between the estimates from the IFHS and Burnham et al." They defended the results of their survey saying, "It is unlikely that a small survey with only 47 clusters has provided a more accurate estimate of violence-related mortality than a much larger survey sampling of 971 clusters."[106]
See also
- Casualties of the Iraq War. An overview of many casualty estimates.
- ORB survey of Iraq War casualties
- Iraq Family Health Survey
References
- 1 2 ""Mortality before and after the 2003 invasion of Iraq: cluster sample survey"" (PDF). (263 KB). By Les Roberts, Riyadh Lafta, Richard Garfield, Jamal Khudhairi, and Gilbert Burnham. The Lancet, 29 October 2004. There is a version of the PDF article that has a clickable table of contents. It is here: .
- 1 2 3 4 5 6 7 8 9 10 11 12 13 ""Mortality after the 2003 invasion of Iraq: a cross-sectional cluster sample survey"" (PDF). (242 KB). By Gilbert Burnham, Riyadh Lafta, Shannon Doocy, and Les Roberts. The Lancet, October 11, 2006
- 1 2 3 4 5 6 ""The Human Cost of the War in Iraq: A Mortality Study, 2002–2006"" (PDF). (603 KB). By Gilbert Burnham, Shannon Doocy, Elizabeth Dzeng, Riyadh Lafta, and Les Roberts. A supplement to the October 2006 Lancet study. It is also found here:
- 1 2 3 4 5 "Study Claims Iraq's 'Excess' Death Toll Has Reached 655,000". By David Brown. Washington Post. October 11, 2006.
- 1 2 "UK Poll Consistent with 1 Million Extrapolation of Lancet Death Toll". By Robert Naiman (Just Foreign Policy). September 14, 2007. Huffington Post.
- 1 2 "Conflict Deaths in Iraq: A Methodological Critique of the ORB Survey Estimate" By Michael Spagat and Josh Dougherty
- ↑ A Million Iraqi Dead? The U.S. press buries the evidence
- ↑ IRAQ BODY COUNT: “A VERY MISLEADING EXERCISE”
- 1 2 "The Iraq deaths study was valid and correct", The Age, 21 October 2006
- ↑ "CNN.com In-Depth Specials - Counting the dead in Congo". CNN. Archived from the original on September 22, 2006.
- ↑ IRC | Mortality Study, Eastern D.R. Congo (April-May 2000) Archived April 6, 2005, at the Wayback Machine.
- ↑ Archived June 19, 2006, at the Wayback Machine.
- ↑ "100,000 War Crimes". TomPaine.com. Retrieved 2010-08-10.
- 1 2 "100,000 Dead—or 8,000. How many Iraqi civilians have died as a result of the war?". By Fred Kaplan. October 29, 2004. Slate.
- 1 2 "Researchers Who Rushed Into Print a Study of Iraqi Civilian Deaths Now Wonder Why It Was Ignored". By Lila Guterman. The Chronicle of Higher Education. January 27, 2005.
- ↑ "Dead Iraqis. Why an Estimate was Ignored". By Lila Guterman, Columbia Journalism Review, March/April 2005.
- ↑ "News Foreign & Commonwealth Office". Fco.gov.uk. Retrieved 2010-08-10.
- ↑ "320: What's In A Number? — 2006 Edition". Nov 3, 2006. Retrieved 27 August 2013.
- 1 2 ABC News: ABC News Archived March 2, 2005, at the Wayback Machine.
- 1 2 "Scientists estimate 100,000 Iraqis may have died in war". USA Today. October 28, 2004. Retrieved May 7, 2010.
- ↑ "Burying The Lancet — Update". Medialens.org. 2005-09-12. Retrieved 2010-08-10.
- ↑ Media Alert: Burying The Lancet, Media Lens, September 5, 2005
- ↑ Bird, S (2004). "Military and public-health sciences need to ally". The Lancet. 364 (9448): 1831–1833. doi:10.1016/S0140-6736(04)17452-7. Retrieved 2010-08-10.
- ↑ Apfelroth, S (2005). "Mortality in Iraq". The Lancet. 365 (9465): 1133. doi:10.1016/S0140-6736(05)71866-3. Retrieved 2010-08-10.
- ↑ Roberts, L; Burnham, G; Garfield, R (2005). "Mortality in Iraq". The Lancet. 365 (9465): 1133–1134. doi:10.1016/S0140-6736(05)71867-5. Retrieved 2010-08-10.
- ↑ McPherson K (2005). "Counting the dead in Iraq". BMJ. 330 (7491): 550–1. doi:10.1136/bmj.330.7491.550. PMC 554014. PMID 15760972.
- 1 2 "'Huge rise' in Iraqi death tolls". BBC News. 11 October 2006.
- 1 2 "Critics say 600,000 Iraqi dead doesn't tally. But pollsters defend methods used in Johns Hopkins study". By Anna Badkhen. San Francisco Chronicle. October 12, 2006.
- 1 2 Iraqi deaths survey 'was robust'. BBC News, 26 March 2007.
- ↑
- Ministers were told not to rubbish Iraq deaths study Guardian Unlimited, March 26, 2007
- Counting the cost by Richard Horton, March 27, 2007
- 1 2 "Reality checks: some responses to the latest Lancet estimates". By Hamit Dardagan, John Sloboda, and Josh Dougherty. Iraq Body Count project. October 16, 2006.
- ↑ "Press Release 14 (16 Oct 2006)". Iraq Body Count. 2006-10-16. Retrieved 2010-08-10.
- ↑ "Exaggerated claims, substandard research, and a disservice to truth".
- ↑ "Jon Pedersen". Fafo.no. 1996-02-02. Retrieved 2010-08-10.
- 1 2 3 "Is Iraq's Civilian Death Toll 'Horrible' -- Or Worse?". By Jefferson Morley, Washington Post, October 19, 2006.
- ↑ Jim Giles. "Access : Huge Iraqi death estimate sparks controversy : Nature News". Nature.com. Retrieved 2010-08-10.
- ↑ Giles, Jim (2006-10-19). "Iraqi death toll withstands scrutiny". Nature. 443 (7113): 728–729. doi:10.1038/443728a. ISSN 0028-0836. (subscription required (help)).
- ↑ "1" (PDF). Retrieved 2010-08-10.
- ↑ "Search results - Resource centre" (PDF). International Committee of the Red Cross.
- ↑ http://socrates.berkeley.edu/~jewell/LancetNov061.pdf
- 1 2 "Number Crunching. Taking another look at the Lancet's Iraq study". By Fred Kaplan. October 20, 2006. Slate.
- 1 2 3 "Counting Corpses. The Lancet number crunchers respond to Slate's Fred Kaplan. (And Kaplan replies)". By Gilbert Burnham and Les Roberts. November 20, 2006. Slate.
- ↑ "Mortality after the 2003 invasion of Iraq: Were valid and ethical field methods used in this survey?" (PDF). Retrieved 2010-08-10.
- ↑ NewsHour with Jim Lehrer (2006-10-11). "Online NewsHour: Update | Study Finds 655,000 Iraqi Deaths | October 11, 2006". PBS. Retrieved 2010-08-10.
- ↑ "War's Iraqi Death Toll Tops 50,000". Commondreams.org. Retrieved 2010-08-10.
- ↑ "Iraq toll report 'exaggerated'". Adelaide Now. Archived from the original on May 15, 2015.
- ↑ "Iraqi official estimates at least 150,000 Iraqis killed by insurgents". November 9, 2006. Associated Press.
- ↑ http://news.nationaljournal.com/articles/databomb/index.htm
- ↑ "Anti-War Billionaire George Soros Funded Iraq Study — International News | News of the World | Middle East News | Europe News". FOXNews.com. 2008-01-13. Retrieved 2010-08-10.
- ↑ Lancet Iraq Study Authors Reply to Editorial Wall Street Journal January 14, 2008
- ↑ "Counting Iraqi Casualties -- and a Media Controversy". By John Tirman. February 14, 2008. Editor and Publisher.
- 1 2 Spagata, Michael (2010). "Ethical and Data-Integrity Problems in the Second Lancet survey of Mortality in Iraq". Defence and Peace Economics. 21 (1): 1–41. doi:10.1080/10242690802496898.
- ↑ "Major Flaws With The Lancet Reports On Iraqi Deaths, Part I". By Joel Wing. August 2, 2012. AK News.
- ↑ "Major Flaws With The Lancet Reports On Iraqi Deaths, Part II". By Joel Wing. August 4, 2012. AK News.
- ↑ "Nondisclosure Cited in Iraq Casualties Study". By Gary Langer. February 4, 2009. ABC News.
- ↑ "Author Of Shocking Iraq Study Accused Of Bad Ethics". February 4, 2009. Houston Chronicle.
- ↑ "AAPOR Finds Gilbert Burnham in Violation of Ethics Code". Aapor.org. Retrieved 2010-08-10.
- ↑ "Executive Council, American Association for Public Opinion Research February 13, 2009" (PDF). Archived from the original (PDF) on 7 October 2009.
- ↑ "Nondisclosure Cited in Iraq Casualties Study — ABC News". Abcnews.go.com. 2009-02-04. Retrieved 2010-08-10.
- 1 2 MacKenzie, Debora. What is behind criticism of Iraq deaths estimate?, New Scientist. February 9, 2009.
- ↑ "Letter to AAPOR President Kulka" (PDF). Retrieved 2010-08-10.
- ↑ "2010 Top Ten Dubious Polling Awards". February 1, 2010. "iMediaEthics."
- ↑ Review Completed of 2006 Iraq Mortality Study. Feb 23, 2009. Johns Hopkins Bloomberg School of Public Health.
- ↑ "655,000 War Dead? A bogus study on Iraq casualties". By Steven E. Moore. October 18, 2006. Wall Street Journal.
- ↑ Response to the Wall Street Journal's "655,000 War Dead?" at the Wayback Machine (archived March 8, 2008). By Gilbert Burnham. October 20, 2006. Johns Hopkins Bloomberg School of Public Health.
- ↑ "Estimates of deaths in first war still in dispute". By Jack Kelly. Pittsburgh Post-Gazette. February 16, 2003.
- ↑ "Could 650,000 Iraqis really have died because of the invasion?". By Anjana Ahuja. The Times. March 5, 2007.
- ↑ "Child death rate doubles in Iraq". BBC. May 25, 2000.
- ↑ Ali MM, Shah IH (2000). "Sanctions and childhood mortality in Iraq". Lancet. 355 (9218): 1851–7. doi:10.1016/S0140-6736(00)02289-3. PMID 10866440.
- ↑ Centre for Population Studies. DFID Reproductive Health Work Programme. Lists bibliographic details for article, "Sanctions and childhood mortality in Iraq".
- ↑ Spagat, Michael (September 2010). "Truth and death in Iraq under sanctions" (PDF). Significance. Retrieved 2016-12-01.
- ↑ "Security Council Resolution 1483. May 22, 2003". Globalpolicy.org. Retrieved 2010-08-10.
- 1 2 3 Bohannon J (2006). "Epidemiology. Iraqi death estimates called too high; methods faulted". Science. 314 (5798): 396–7. doi:10.1126/science.314.5798.396. PMID 17053114.
- 1 2 3 "UK scientists attack Lancet study over death toll". By Sarah Boseley. October 24, 2006. The Guardian.
- 1 2 Lancet author answers your questions". October 30, 2006. BBC.
- ↑ "A moment with ... Dr. Les Roberts, political scientist". By Tom Paulson. April 20, 2007. Seattle Post-Intelligencer.
- ↑ Reality checks: some responses to the latest Lancet estimates. By Hamit Dardagan, John Sloboda, and Josh Dougherty. Press Release 14, 16 Oct 2006. Iraq Body Count project
- ↑ Basrah Governorate Assessment Report. August 2006. United Nations High Commissioner for Refugees
- ↑ "655,000 Dead in Iraq since Bush Invasion". By Middle Eastern Professor Juan Cole. October 11, 2006.
- ↑ "Lancet Iraq Study Flawed: Death Toll Too High". By Sean Gourley, Neil Johnson, and Michael Spagat. October 20, 2006. Scoop
- ↑ Conflict Mortality Surveys. Department of Economics, Royal Holloway, University of London.
- 1 2 Burnham G, Roberts L (2006). "A debate over Iraqi death estimates". Science. 314 (5803): 1241; author reply 1241. doi:10.1126/science.314.5803.1241b. PMID 17124305.
- ↑ Release of Data from the 2006 Iraq Mortality Study". Johns Hopkins Bloomberg School of Public Health.
- ↑ Johnson NF, Spagat M, Gourley S, Onnela JP, Reinert G (2008). "Bias in Epidemiological Studies of Conflict Mortality". Journal of Peace Research. 45 (5): 653–663. doi:10.1177/0022343308094325.
- ↑ "Article of the Year — PRIO". Prio.no. Retrieved 2010-08-10.
- ↑ Onnela, Jukka-Pekka; Johnson, Neil F.; Gourley, Sean; Reinert, Gesine; Spagat, Michael (2008). "Sampling bias in systems with structural heterogeneity and limited internal diffusion". EPL (Europhysics Letters). 85 (2): 28001. arXiv:0807.4420. doi:10.1209/0295-5075/85/28001.
- 1 2 Guha-Sapir, Debarati; Degomme, Olivier; Pedersen, Jon (2007). "Mortality in Iraq". The Lancet. 369 (9556): 102. doi:10.1016/S0140-6736(07)60061-0. Retrieved 2010-08-10.
- 1 2 Dougherty, Josh (2007). "Mortality in Iraq". The Lancet. 369 (9556): 102–103. doi:10.1016/S0140-6736(07)60062-2. Retrieved 2010-08-10.
- ↑ Burnham, Gilbert; Lafta, Riyadh; Doocy, Shannon; Roberts, Les (2007). "Mortality in Iraq – Authors' reply". The Lancet. 369 (9556): 103–104. doi:10.1016/S0140-6736(07)60063-4. Retrieved 2010-08-10.
- ↑ "Co-Author of Medical Study Estimating 650,000 Iraqi Deaths Defends Research in the Face of White House Dismissal". Democracy Now!. Retrieved 2010-08-10.
- ↑ "Iraq death rate estimates defended by researchers". By Deena Beasley. Reuters. October 21, 2006. Article is also here .
- ↑ Peter Barron (2006-10-13). "BBC NEWS | The Editors". Bbc.co.uk. Retrieved 2010-08-10.
- ↑ Coghlan, Ben, "Gut reaction aside, those on the ground know Iraq reality", Centre for International Health, 31 October 2006
- ↑ "Democracy and Iraq — Killing debate", Media Lens, 18 October 2006
- ↑ Tapp, Christine; Frederick, M. Burkle Jr.; Wilson, Kumanan; Takaro, Tim; Guyatt, Gordon H.; Amad, Hani; Mills, Edward J. (2008). "Iraq War Mortality Estimates: A Systematic Review" (PDF). Conflict and Health. 2 (1): 9–10.
- 1 2 "Iraq Living Conditions Survey 2004". 2004 UNDP ILCS. United Nations Development Programme Iraq Living Conditions Survey.
- ↑ About the Iraq Body Count project.
- ↑ Iraq Body Count project. Source of IBC quote on undercounting by media is here .
- ↑ "Speculation is no substitute: a defence of Iraq Body Count". April 2006. By Hamit Dardagan, John Sloboda, and Josh Dougherty. Iraq Body Count project.
- ↑ Archived August 14, 2015, at the Wayback Machine.
- ↑ Update on Iraqi Casualty Data by Opinion Research Business, January 2008.
- ↑ "Study reveals fundamental flaws to 2007 estimate of one million Iraqis killed" Royal Holloway University of London
- ↑ "New study estimates 151 000 violent Iraqi deaths since 2003 invasion". January 9, 2008 news release. World Health Organization. See right sidebar for related links.
- ↑ "Lancet Study Author Assesses New Report on Iraqi Death Toll". January 11, 2008. Institute for Public Accuracy.
- ↑ Burnham GM (July 2008). "Violence-related mortality in Iraq, 2002–2006". N. Engl. J. Med. 359 (4): 431–2; author reply 434. doi:10.1056/NEJMc080419. PMID 18650523.
External links
- First study
- Elisabeth Rosenthal, "Study Puts Iraqi Deaths of Civilians at 100,000", International Herald Tribune via the New York Times, 29 October 2004.
- Sarah Boseley, "100,000 Iraqi civilians dead, says study", The Guardian, 29 October 2004.
- Stephen Soldz, "100,000 Iraqis Dead: Should We Believe It?", ZNet, November 3, 2004.
- Jamie Doward, "The Lancet and the bodies in question", column in The Guardian, 7 November 2004.
- Joseph Choonara, "Counting the dead in Iraq", interview of Les Roberts, Socialist Worker, 23 April 2005.
- Les Roberts, "100,000 deaths in Iraq: A year later", American Friends Service Committee, 26 October 2005.
- Les Roberts seminar at the London School of Hygiene & Tropical Medicine, 13 April 2005; uploaded to YouTube on 25 October 2006: https://www.youtube.com/watch?v=o5vD_Ub2K_c.
- Second study
- "Co-Author of Medical Study Estimating 650,000 Iraqi Deaths Defends Research in the Face of White House Dismissal" (contains link to audio), interview with Les Roberts, Democracy Now, 12 October 2006.
- "War on Iraq: Is the United States Killing 10,000 Iraqis Every Month? Or Is It More?". By Michael Schwartz. AlterNet. July 6, 2007.
- "Data Bomb". By Neil Munro, National Journal, January 4, 2008.
- Researchers Respond to National Journal Article. By Gilbert Burnham, MD, PhD, Professor and Director, Center for Refugee and Disaster Response, Johns Hopkins Bloomberg School of Public Health. And by Les Roberts, PhD, Associate Professor, Mailman School of Public Health, Columbia University.
- "Right-Wingers Can't Cover Up Iraq's Death Toll Catastrophe". By John Tirman. January 21, 2008. AlterNet.
- Answers to Questions About Iraq Mortality Surveys. The Center for Refugee and Disaster Response, of the Johns Hopkins Bloomberg School of Public Health.