The Atlantic Monthly online is running an interesting story in its April issue on the ongoing - and increasingly nasty - debate over the grim question of how many Iraqis have died as a result of the March 2003 invasion.
Estimates of these deaths vary widely, from around 80,000 to as high as a million, according to the article. They include a 2006 study in the Lancet, a British medical journal; research by the World Health Organisation published in the New England Journal of Medicine; and (not mentioned in the article) ongoing documentation work by Iraq Body Count, an independent research group based in the UK.
Debate about methodologies and numbers is useful in the name of thorough research (although over Iraq it seems to have become bitterly personal). But whichever estimate proves nearest the mark, we should always remember the shattered human lives and the misery and insecurity behind the statistics; otherwise such concerns take on a tinge of ghoulishness. Even the lowest estimates demonstrate that civilian casualties in Iraq are considerable, as they are in a number of other conflicts around the world that receive far less attention.
In addition, Megan McArdle, the Atlantic Monthly article's author, puts an interesting spin on the debate by introducing the notion of "anchoring effects" - well known thanks to cognitive scientists like Daniel Kahneman and the late Amos Tversky, whose work on mental biases we've discussed before on this blog. Their empirical research, and that of others, has shown that human beings tend to fixate on numbers we've heard, even if they're arbitrary or wrong - and these effects persist even after the number has been discredited.
"We anchor most strongly on the first number we hear, particularly when it is shocking and precise—like, say, 601,027 violent deaths in Iraq. And even when such a number is presented only as a central estimate in a wide range of statistical possibilities (as the Lancet study’s figure was), we tend to ignore the range, focusing instead on the lovely, hard number in the middle. Human beings are terrible at dealing with uncertainty, and besides, headlines seldom highlight margins of error."

Collecting data on conflict is difficult and, by necessity, rather imprecise, which is why methodology matters so much in statistics. Margins of error can dramatically change how a figure should be read, as can the embedding of spurious "facts" in the minds of the public and of policy-makers. Something we all have to bear in mind.

The article continues:

"When information supports positions we already hold, we of course tend to accept it less critically; when the opposite is true, we can be quite good at shutting the information out. ‘Motivated reasoning’ is a mighty force, as anyone who has argued politics in a bar at 2 a.m. can attest. Scientists have observed the process, using a functional MRI machine to peer into the brain while it processes political statements, and their report is unsurprising. When we are assessing neutral statements, activity is concentrated in the areas that control higher reasoning. But when we process statements with political valence, suddenly our emotional cortices light up as well. Indeed, some research indicates that the emotion precedes, and governs, the higher cognition—that logic is, literally, an afterthought.

"But cognitive bias is not limited to partisans; we all anchor on the numbers we hear. The Lancet article’s central estimate exerts a gravitational pull on even its harshest critics, who seem to be mentally benchmarking their estimates by how much they differ from that 601,027. Others who are not motivated to disprove that number tend to orbit even closer."
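The point about ignoring ranges can be made concrete with a quick sketch. The bounds below are hypothetical round numbers chosen purely for illustration - the quoted passage gives only the 601,027 central estimate, not the study's actual confidence interval:

```python
# Illustration: how much information is lost when a wide interval
# is collapsed to its central point estimate.
# The bounds (400,000 and 800,000) are hypothetical round numbers,
# not the Lancet study's actual confidence interval.

def interval_summary(low, high):
    """Return the midpoint of an interval and its relative half-width."""
    mid = (low + high) / 2
    half_width = (high - low) / 2
    return mid, half_width / mid  # midpoint, relative uncertainty

mid, rel = interval_summary(400_000, 800_000)
print(f"central estimate: {mid:,.0f}")          # 600,000
print(f"relative uncertainty: +/- {rel:.0%}")   # +/- 33%
```

A headline reporting only the midpoint silently drops a third of the picture in either direction - which is exactly the anchoring trap the article describes.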
Picture by mattsmith569, retrieved from Flickr.com.