I spent January listening to the first half of Barbara Tuchman’s The Guns of August, in which she dissects the run-up to World War I. Tuchman describes conversations taking place across Europe in which generals and politicians alike are all, “We’re absolutely going to be home by Christmas. There is no possible way that couldn’t happen. The other side? Pushovers. Probably won’t even show up to fight. Also this is totally a great idea and we will get everything we want out of this war. EVERYTHING.”
We all know how well that worked out.
Reading the heated op-eds about the necessity of war with Iran and/or Syria, it strikes me that they’re nothing new. The strange overconfidence on display in the 1910s – that war would be quick, easy, and end favorably – was echoed in the run-up to Iraq and is being rehashed today. This reminded me of the Rubicon Theory of War, a barely-noted article from last summer’s issue of International Security that offers valuable food for thought, particularly for those charged with thinking or writing about war. The authors address the overconfidence conundrum, namely, that people who should know better than to think war will be quick and easy often act like this is their first rodeo. The authors conclude:
When people believe they have crossed a psychological Rubicon and perceive war to be imminent, they switch from what psychologists call a “deliberative” to an “implemental” mind-set, triggering a number of psychological biases, most notably overconfidence. These biases can cause an increase in aggressive or risky military planning. Furthermore, if actors believe that war is imminent when it is not in fact certain to occur, the switch to implemental mind-sets can be a causal factor in the outbreak of war, by raising the perceived probability of military victory and encouraging hawkish and provocative policies.
Their research suggests humans are only rational actors until we make a decision – cross the Rubicon – at which point our mental apparatus will go through whatever logical leaps are necessary to avoid questioning that decision. The authors frame this idea in terms of mind-sets – deliberative vs. implemental – to account for the full range of attendant biases, which they’ve laid out in a helpful table.
Essentially, when we’ve crossed the Rubicon, we are less likely to accept information that does not support our decision, and we’re more likely to believe we will be successful regardless of evidence to the contrary. This overconfidence leads to riskier war plans and a higher likelihood of going to war. As for the standard rational actor model, the authors suggest that rationality goes out the window once a decision is taken:
Early on in the decisionmaking process, a leader is more likely to be in a deliberative mind-set and may approximate a rational actor. Later during the crisis, the same leader is more likely to be in an implemental mind-set, and may display a range of biases that deviate from rationality.
This phenomenon affects the general public as well. Take Iraq:
For example, in 2003, regime change in Iraq might have been relatively straightforward, but postwar stabilization was likely to be difficult and protracted. Nevertheless, as the invasion drew near, Americans concluded that success in both of these objectives would be swift. … In the months leading up to the conflict, a majority expected “a long and costly involvement” in Iraq. But judgments switched immediately before the war, such that a majority now expected “a fairly quick and successful effort.”
Again, we know how well that turned out.
It should be noted that this decision needn’t be a conscious one, nor is it necessarily predicated upon a rational cost/benefit analysis. However, when one writes that the alternatives are narrowing, as Elliott Abrams did, and that some action must be taken, and then concludes that action must be military in nature, we can assume the die’s been cast:
If success were made of speeches and sanctions the Obama policy would be marvelous — and adequate. The problem is that Syria is at war, and one side or the other will win that war. It will be the Assad/Russia/Iran/Hezbollah side, or the popular uprising with its European, American, and Arab support. A deus ex machina ending is possible, wherein some Syrian Army generals push Assad out and agree to a transition away from Assad and Alawite rule. But such a step by the generals is far more likely if they conclude that Assad’s war is lost.
So we must make sure he loses. Directly or indirectly, the next step is to provide plenty of money and arms, training, and intelligence to the Free Syrian Army and other opponents of the Assads.
Abrams notes that there could be problems down the road, but dismisses them with a handwave: “All those questions will come with victory against the bad guys — but only with victory.” As though the path to victory will have no bearing on the eventual outcomes. As though arming the opposition is a surefire way to win this war. As though there’s no way it won’t be over in days, not weeks or years.
An attack on Iran’s nuclear sites would also be challenging – which hasn’t hampered calls to go ahead and get on with it already. Polling suggests that Americans favor military strikes if that’s what it would take to prevent a nuclear Iran. Troublingly, the repeated expectation that strikes are imminent makes us more likely to believe that it’s true (psychological biases again), which sets up a feedback loop in which we perceive war as imminent – and thus cross the Rubicon.
Whether we should get into a war with/in Iran/Syria is outside the scope of this blog post. Rather, I want to make clear that there are unconscious psychological biases that come along with the acceptance of war that make it difficult to maintain objectivity and rationality – and that we must be on our guard against sloppy thinking. Once we’ve committed to the idea, we begin to assume things will go our way, and we avoid thinking about – and planning for – negative outcomes. If the actual decision about going to war is a determinant of our ideas about how that war will play out – and not, say, intelligence about an opponent’s military preparedness, or the potential negative consequences of war, or even the difficulty of executing the war – it’s crucial that we guard against overconfidence. And it’s not like we can’t fight against that inclination; it’s just that we often don’t.
At the end of every war, somebody says, “This. This is the end of war. Now, finally, it’s too expensive/too stupid/too wasteful/too destructive.” And indeed, it seems like the costs of war are rising and the benefits shrinking. But we seem incapable of the necessary in-the-moment questioning of our cognitive processes to determine whether this war, just this one, will actually be easy, cheap, and rewarding, or if we just really want it to be.
It’s critical for leaders, intellectuals, the media, and the general public alike to consciously understand which mind-set we are in and the attendant cognitive biases it brings. These sorts of metacognitive tasks are admittedly difficult – our knowledge about how and what we think is limited, and gaining greater control over those processes is challenging (read Thinking, Fast and Slow for some great – and disturbing – examples of this). But it’s not impossible, and given the stakes, I’d argue that we are all responsible for knowing when we’ve cast our lots. Without the self-awareness and intellectual honesty to recognize when we’ve switched to an implemental mind-set – and to then guard against the resultant surge of overconfidence – we’re doomed to the same debates and the same outcomes.
Post script: It was while chewing over all that that I made those sarcastic Go The Fuck To War prints. I’ve never been good at artist statements, so I’m going to assume y’all understand what they mean (to wit: once you start thinking war is an okay idea, you’re probably gonna be a little too enthusiastic about it). Anyway, I forgot that I was supposed to give two of them away last week, so! You get another chance: head over to this post and comment and you’ll be entered to win. Manage your expectations.