Lancet author answers your questions

A recent report published in the medical journal The Lancet estimated that around 655,000 people have died in Iraq as a result of the 2003 invasion.
This figure, far higher than others reported for Iraq, prompted claims that the survey's findings were exaggerated.
Les Roberts, one of the report's authors, answered some of your questions on the methodology and findings.
-------------------------------
How do you know that you are not reporting the same fatality multiple times? For example, if you were to ask people in the UK if they know anyone who has been involved in a traffic accident most would say they do. Applying your logic that means there are 60 million accidents every year.
Andrew M, London, UK

To be recorded as a death in a household, the decedent had to have spent most of the nights during the three months before their death "sleeping under the same roof" with the household being interviewed. This may have led us to undercount some deaths, but it addressed your main concern: no two households could claim the same death.
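As a rough illustration of that rule (a sketch only; the function name and the exact threshold framing are mine, not the study's instrument), the three-month window amounts to a simple membership test:

# Illustrative sketch of the household-membership rule described above:
# a death counts for a household only if the decedent spent most of the
# roughly 90 nights before dying under that roof, so no two households
# can report the same death.
def qualifies_as_household_death(nights_slept_here, nights_in_period=90):
    return nights_slept_here > nights_in_period / 2

print(qualifies_as_household_death(20))   # False: a visiting relative's death is not counted here
print(qualifies_as_household_death(75))   # True: a resident's death is counted once, by this household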
It seems The Lancet has been overrun by left-wing sixth formers. The report has a flawed methodology and the counting process shows signs of deceit.
Ian, Whitwick, UK

This study used the standard approach for measuring mortality in times of war. It went through a rigorous peer-review process and could probably have been accepted by any of the journals that cover war and public health.
Can you explain, if your figures are correct, why 920 more people were dying each day than officially recorded by the Iraqi Ministry of Health - implying huge fraud and/or incompetence on their behalf?
Dan, Scotland

It is really difficult to collect death information in a war zone! In 2002, there was a terrible meningitis outbreak in Katana Health Zone in the eastern Democratic Republic of Congo (DRC). The zone was supported by the Belgian Government and had perhaps the best disease surveillance network in the entire country. A survey by the NGO International Rescue Committee showed that only 7% of those meningitis deaths had been recorded by the clinics, hospitals and government officials.
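As a rough arithmetic illustration (the 1,000-a-day figure below is a hypothetical round number; only the 7% detection rate comes from the Katana example above):

# If passive surveillance records only a small fraction of deaths, official
# tallies will sit far below a survey-based estimate without any fraud.
true_deaths_per_day = 1000         # hypothetical round number for illustration
detection_fraction = 0.07          # the Katana meningitis detection rate cited above

recorded = true_deaths_per_day * detection_fraction
print(f"recorded: {recorded:.0f}/day, unrecorded: {true_deaths_per_day - recorded:.0f}/day")
# With a 7% detection rate, about 930 of every 1,000 deaths would go unrecorded.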
You and your colleagues claim to have used the same method to estimate deaths in Iraq as is used to estimate deaths in natural disasters. Is there any evidence that the method is accurate?
Rickard Loe, Stockholm, Sweden

That is a good question. In 1999, again in Katana Health Zone in the Congo, I led a mortality survey where we walked a grid over the health zone and interviewed 41 clusters of five houses at 1km spacings. In that survey, we estimated that 1,600 children had died of measles in the preceding half year. A couple of weeks later we did a standard immunization coverage survey that asked about measles deaths, and we found an identical result.
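To show how such a cluster survey scales a small sample up to a zone-wide total, here is a minimal sketch; the cluster design matches the Katana example, but the household size, observed deaths and zone population are placeholders, not the survey's actual data:

# Extrapolation from a cluster sample to a population total.
clusters = 41
houses_per_cluster = 5
avg_household_size = 6              # assumed for illustration
people_sampled = clusters * houses_per_cluster * avg_household_size   # 1,230

measles_deaths_observed = 4         # hypothetical count in the sample
zone_population = 500_000           # hypothetical health-zone population

estimated_deaths = measles_deaths_observed / people_sampled * zone_population
print(f"estimated measles deaths in the zone: {estimated_deaths:,.0f}")
# Real surveys also widen the confidence interval to account for the cluster
# design, since deaths within one cluster are correlated.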
Why is it so hard for people to believe The Lancet report? I am an Iraqi and can assure you that the figure given is nearer to the truth than any given before or since.
S Kazwini, London, UK

I think it is hard to accept these results for a couple of reasons. First, people do not see the bodies. Secondly, people feel that all those government officials and all those reporters must be detecting a large portion of the deaths, when in actuality, during times of war, it is rare for even 20% of deaths to be detected.
It seems to me that the timing of the publication of the 2004 and 2006 reports - in both cases shortly before a U.S. election - was a mistake.
Mik Ado, London, UK

Both were unfortunately timed. As I said at the time of the first study, we finished the survey in mid-September, and I lived in fear that our Iraqi colleagues and interviewers would be killed if it took two months for the results to get out. I think that in Iraq, a post-election publication in 2004 would have been seen as my colleagues knowing something but keeping it hidden.
Joe Emersberger, who follows this issue closely, collected some of the expert criticisms of the report and a selection was put to Mr Roberts.
A research team have asserted in an article in Science that the second Lancet study is seriously flawed due to "main street bias."
We worked hard in Iraq to have every street segment have an equal chance of being selected. We worked hard to have each separate house have an equal chance of being selected. Realize, there would have to be both a systematic selection of one kind of street by our process and a radically different rate of death on that kind of street in order to skew our results. We see no evidence of either.
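A toy simulation, with invented rates, illustrates the point that both conditions must hold before the estimate drifts:

import random

def estimate_rate(p_sample_main, rate_main, rate_side, n=100_000):
    # Average death rate over a sample drawn with the given street-type mix.
    hits = sum(
        random.random() < (rate_main if random.random() < p_sample_main else rate_side)
        for _ in range(n)
    )
    return hits / n

random.seed(0)
# Suppose the true population is half main-street, half side-street, so the
# true overall rate here is 0.015. All rates are invented for illustration.
print(estimate_rate(0.5, 0.02, 0.01))    # fair sampling: ~0.015, unbiased
print(estimate_rate(0.8, 0.015, 0.015))  # tilted sampling, equal rates: still ~0.015
print(estimate_rate(0.8, 0.02, 0.01))    # tilt AND a rate gap: ~0.018, biased upward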
The second report found a pre-invasion death rate of 5.5 per 1,000 people per year. The UN estimate is 10. Isn't that evidence of inaccuracy in the study?
The last census in Iraq was a decade ago and I suspect the UN number is somewhat outdated. The death rate in Jordan and Syria is about 5. Thus, I suspect that our number is valid.
Madelyn Hicks, a psychiatrist and public health researcher at King's College London in the UK, says she "simply cannot believe" the paper's claim that 40 consecutive houses were surveyed in a single day.
In Iraq in 2004, the interviews took about twice as long as in 2006, and it usually took a two-person team about three hours to interview a 30-house cluster. I remember one rural cluster that took about six hours, and we got back after dark. Nonetheless, Dr Hicks' concern is not valid: on many days in 2004, a single team interviewed two clusters.
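A back-of-envelope check of that pace, taking the roughly three hours per 30-house cluster at face value:

minutes_per_house = 3 * 60 / 30      # about 6 minutes of interviewing per household
hours_for_40_houses = 40 * minutes_per_house / 60
print(hours_for_40_houses)           # about 4 hours, well within a single day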
A UNDP survey, 13 months after the war, had a much higher sample size than both Lancet studies and found about one-third the number of deaths that your team has found. Given the much higher sample size, shouldn't we assume the UNDP study was more accurate and therefore your numbers are way too high?
The UNDP study was much larger and was led by the highly revered Jon Pederson in Norway, but it was not focused on mortality. I suspect that Jon's mortality estimate was not complete, and I think we got more complete reporting.
This UNDP survey covered about 13 months after the invasion. Our first survey recorded almost twice as many violent deaths from the 13th to the 18th months after the invasion as it did during the first 12. The second survey found an excess rate of 2.6/1000/year over the same period, corresponding to approximately 70,000 deaths by April 2004. Thus, the rates of violent death recorded in the two survey groups are not so divergent.
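The arithmetic behind that figure, assuming an Iraqi population of roughly 26 million (an approximation not given in the article):

population = 26_000_000
excess_rate = 2.6 / 1000            # excess deaths per person per year
period_years = 13 / 12              # invasion (March 2003) to April 2004

excess_deaths = population * excess_rate * period_years
print(f"{excess_deaths:,.0f}")      # about 73,000, in line with "approximately 70,000"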