Iraq and the Awesome Powers of Ideological Insulation — Much of my reading over the past year or so has centered on quirks of human reasoning. Though it is not exactly surprising to find out that our computational powers are limited, and that, instead of living up to Enlightenment ideals of capital ‘R’ Reason, we resort to a kludgey toolkit of quick and dirty rules of thumb that work just well enough for Darwinian purposes, it is nevertheless humbling to understand in detail just how bad we are at thinking. This literature (bounded rationality, naturalized rationality, heuristics and biases, and so forth), once you get into it, will really erode your confidence in your ideological commitments.
The phenomenon of confirmation bias, for instance, is very robust, and very unsettling. Confirmation bias has to do with the way we seek and process information, and it works like a one-way ratchet, pushing us ever deeper into our intellectual commitments. We seek and relish every bit of data that seems to support our views, and avoid and rationalize every bit of data that undermines them. We don’t do it on purpose. We just do it. Which is what makes it troubling. And what makes our ideological opponents seem so willfully blind.
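The ratchet can be made concrete with a toy simulation (my own illustration, not anything from the literature above): two observers with mildly opposed priors watch the exact same, perfectly balanced stream of evidence, but each gives far more weight to evidence that agrees with the lean they already have. The weights and starting beliefs here are arbitrary assumptions chosen just to show the effect.

```python
def biased_update(belief, evidence_supports,
                  w_confirm=0.9, w_disconfirm=0.1):
    """Nudge `belief` (the believed probability that some proposition
    is true) toward the evidence, but weight evidence that agrees with
    the belief's current lean far more heavily than evidence against it.
    The weights are illustrative assumptions, not empirical values."""
    leans_true = belief >= 0.5
    confirms = evidence_supports == leans_true
    step = (w_confirm if confirms else w_disconfirm) * 0.1
    target = 1.0 if evidence_supports else 0.0
    return belief + step * (target - belief)

# A perfectly balanced evidence stream: strict alternation, 50 of each.
stream = [True, False] * 50

hawk, dove = 0.6, 0.4  # mildly opposed priors about the same proposition
for e in stream:
    hawk = biased_update(hawk, e)
    dove = biased_update(dove, e)

print(round(hawk, 2), round(dove, 2))  # the same data drives them apart
```

Despite seeing identical, evenly mixed evidence, the hawk ratchets up toward certainty and the dove ratchets down toward it, which is the one-way ratchet in miniature: the asymmetric weighting alone is enough to push both parties deeper into their prior commitments.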
So, believe it or not, this is a post about what I take to be the impending war on Iraq. The point is that after living in the literature of human fallibility for so long, I find that I cannot muster a position on the war that leaves my sense of intellectual honesty intact. In principle, I oppose war, since it involves killing people, consuming vast amounts of resources that could be put to use in the service of less grim satisfactions, and tends to erode our liberties. However, I also don’t care to perish suddenly in a nuclear conflagration, or something equally horrible, precipitated by an ideology-mad, power-drunk, desperate dictator. What to do?
Here are the main arguments, for and against.
For: Hussein has aided and is aiding the Al Qaeda murderers, and is developing WMD that he may attempt to use against us. We need to punish him for his complicity, and secure our safety by effecting a “regime change” in Iraq.
Against: All the accusations against Hussein are mere rumors. And attacking Iraq may “destabilize” the whole region, and bring about even worse terroristic reprisals.
My problem: I have NO idea which is more likely to be true. My gut says prefer “against,” but I don’t know that the Administration is feeding us sheer bullshit. Maybe they know something that they can’t safely spell out. Or maybe not. I DON’T KNOW. And I don’t know whether invading Iraq will or won’t have bad consequences. It could turn out that we end up in a winning war against the sundry forces of Muslim evil and wind up liberating millions, and ushering in a new era of peace on earth. Or we could fuck things up horribly and end up with a decimated Manhattan or, God forbid, a high-radiation zone inside the Beltway. It seems to me that NO ONE has enough information to pin reliable probabilities on either of these or alternative scenarios. The people who don’t like war will find what seem to them very persuasive reasons to oppose it, and those who are keen to kick ass will find deep cogency in arguments to the effect that we must do so. Myself, I can only maintain a very anxious agnosticism.
Such a stance is anxious because most of us prefer “cognitive closure,” that is, a sense of having a firm grasp on why things have happened and will happen, over “integrative complexity,” that is, a willingness to juggle and weigh contradictory explanations and arguments. In his paper, “Close-Call Counterfactuals and Belief-System Defenses: I Was Not Almost Wrong But I Was Almost Right,” political psychologist Phil Tetlock shows that experts in world politics, and especially those who preferred closure, were likely to reject close-call counterfactuals that challenged their explanations for the past, but would happily embrace them when they protected their predictions from refutation.
A close-call counterfactual is a “what if,” like “The Nazis would have conquered the USSR had they invaded two months earlier.” This is all to say that when we get it wrong, we’re going to say, “Well, I WOULD have been right, if only so and so had done such and such,” and when we get it right, we won’t accept “Well, I would have been WRONG, except for the fact that so and so did such and such.” Which is precisely the kind of asymmetry in the way we process information that locks us into our prior commitments. We are almost impervious to refutation by events.
My guess is that, whatever the outcome in Iraq (good I hope!), almost no one’s prior commitments will be dashed. Everyone will have been at least almost right, or, if right, so clearly right that they obviously weren’t almost wrong.