Thinking About Thinking About War

I spent January listening to the first half of Barbara Tuchman’s The Guns of August, in which she dissects the run-up to World War I. Tuchman describes conversations taking place across Europe in which generals and politicians alike are all, “We’re absolutely going to be home by Christmas. There is no possible way that couldn’t happen. The other side? Pushovers. Probably won’t even show up to fight. Also this is totally a great idea and we will get everything we want out of this war. EVERYTHING.”

We all know how well that worked out.

Reading the heated op-eds about the necessity of war with Iran and/or Syria, I’m struck that they’re nothing new. The strange overconfidence on display in the 1910s – that war would be quick and easy and would end favorably – was echoed in the run-up to Iraq and is being rehashed today. This reminded me of the Rubicon Theory of War, laid out in a barely noted article in last summer’s issue of International Security that offers valuable food for thought, particularly for those charged with thinking or writing about war. The authors address the overconfidence conundrum – namely, that people who should know better than to think war will be quick and easy often act like this is their first rodeo. They conclude:

When people believe they have crossed a psychological Rubicon and perceive war to be imminent, they switch from what psychologists call a “deliberative” to an “implemental” mind-set, triggering a number of psychological biases, most notably overconfidence. These biases can cause an increase in aggressive or risky military planning. Furthermore, if actors believe that war is imminent when it is not in fact certain to occur, the switch to implemental mind-sets can be a causal factor in the outbreak of war, by raising the perceived probability of military victory and encouraging hawkish and provocative policies.

Their research suggests that we are rational actors only until we make a decision – cross the Rubicon – at which point our mental apparatus will go through whatever logical leaps are necessary to avoid questioning that decision. The authors frame this idea in terms of mind-sets – deliberative vs. implemental – to account for the full range of attendant biases, which they lay out in a helpful table.

Essentially, when we’ve crossed the Rubicon, we are less likely to accept information that does not support our decision, and we’re more likely to believe we will be successful regardless of evidence to the contrary. This overconfidence leads to riskier war plans and a higher likelihood of going to war. As for the standard rational actor model, the authors suggest that rationality goes out the window once a decision is taken:

Early on in the decisionmaking process, a leader is more likely to be in a deliberative mind-set and may approximate a rational actor. Later during the crisis, the same leader is more likely to be in an implemental mind-set, and may display a range of biases that deviate from rationality.

This phenomenon affects the general public as well. Take Iraq:

For example, in 2003, regime change in Iraq might have been relatively straightforward, but postwar stabilization was likely to be difficult and protracted. Nevertheless, as the invasion drew near, Americans concluded that success in both of these objectives would be swift. … In the months leading up to the conflict, a majority expected “a long and costly involvement” in Iraq. But judgments switched immediately before the war, such that a majority now expected “a fairly quick and successful effort.”

Again, we know how well that turned out.

It should be noted that this decision needn’t be a conscious one, nor is it necessarily predicated upon a rational cost-benefit analysis. However, when one writes, as Elliott Abrams did, that the alternatives are narrowing and that some action must be taken, and then concludes that the action must be military in nature, we can assume the die’s been cast:

If success were made of speeches and sanctions the Obama policy would be marvelous — and adequate. The problem is that Syria is at war, and one side or the other will win that war. It will be the Assad/Russia/Iran/Hezbollah side, or the popular uprising with its European, American, and Arab support. A deus ex machina ending is possible, wherein some Syrian Army generals push Assad out and agree to a transition away from Assad and Alawite rule. But such a step by the generals is far more likely if they conclude that Assad’s war is lost.

So we must make sure he loses. Directly or indirectly, the next step is to provide plenty of money and arms, training, and intelligence to the Free Syrian Army and other opponents of the Assads.

Abrams notes that there could be problems down the road, but dismisses them with a handwave: “All those questions will come with victory against the bad guys — but only with victory.” As though the path to victory will have no bearing on the eventual outcomes. As though arming the opposition is a surefire way to win this war. As though it will surely be over in days, not weeks or years.

An attack on Iran’s nuclear sites would also be challenging – which hasn’t hampered calls to go ahead and get on with it already. Polling suggests that Americans favor military strikes if they would prevent a nuclear Iran. Troublingly, the repeated assertion that strikes are imminent makes us more likely to believe they are (psychological biases again), which sets up a feedback loop in which we perceive war as imminent – and thus cross the Rubicon.

Whether we should get into a war with/in Iran/Syria is outside the scope of this blog post. Rather, I want to make clear that accepting the idea of war brings with it unconscious psychological biases that make it difficult to maintain objectivity and rationality – and that we must be on our guard against sloppy thinking. Once we’ve committed to the idea, we begin to assume things will go our way, and we avoid thinking about – and planning for – negative outcomes. If the decision to go to war is itself a determinant of our ideas about how that war will play out – and not, say, intelligence about an opponent’s military preparedness, or the potential negative consequences of war, or even the difficulty of executing the war – it’s crucial that we guard against overconfidence. And it’s not that we can’t fight that inclination; it’s just that we often don’t.

At the end of every war, somebody says, “This. This is the end of war. Now, finally, it’s too expensive/too stupid/too wasteful/too destructive.” And indeed, it seems like the costs of war are rising and the benefits shrinking. But we seem incapable of the necessary in-the-moment questioning of our cognitive processes to determine whether this war, just this one, will actually be easy, cheap, and rewarding, or whether we just really want it to be.

It’s critical for leaders, intellectuals, the media, and the general public alike to understand consciously what mind-set we are in and the attendant cognitive biases it brings. This sort of metacognitive task is admittedly difficult – our knowledge of how and what we think is limited, and gaining greater control over those processes is challenging (read Thinking, Fast and Slow for some great – and disturbing – examples of this). But it’s not impossible, and given the stakes, I’d argue that we are all responsible for knowing when we’ve cast our lot. Without the self-awareness and intellectual honesty to recognize when we’ve switched to an implemental mind-set – and then to guard against the resultant surge of overconfidence – we’re doomed to the same debates and the same outcomes.

****

Postscript: It was while chewing all of this over that I made those sarcastic Go The Fuck To War prints. I’ve never been good at artist statements, so I’m going to assume y’all understand what they mean (to wit: once you start thinking war is an okay idea, you’re probably gonna be a little too enthusiastic about it). Anyway, I forgot that I was supposed to give two of them away last week, so! You get another chance: head over to this post and comment and you’ll be entered to win. Manage your expectations.


12 Responses to Thinking About Thinking About War

  1. Pingback: Iran, Syria, and The Rubicon Theory of War | pompousandprolix

  2. Pingback: The Rubicon Theory of War | IMSL Insights

  3. theod says:

    Elliott Abrams has never suffered a negative consequence* for being wrong about practically everything under his purview…including lying under oath to Congress. So why would anybody expect him to change his ways and be cautious and circumspect now? Political America is perhaps the last accountability-free zone. Being wrong pays pretty well (Bush, Cheney, Wolfowitz, Feith, Powell, Rice, etc.). Being right makes you a pariah (Shinseki, McClatchy, Scott Ritter, Phil Donahue, protesting hippies, etc.).

    * Yet thousands of people have died and been maimed because of his politics and decisions. Negative consequences are for little people.

  4. Dorothy Childs says:

    I am sure you know of Plato’s essay on “Philosopher/Kings”. (Not the title).

    To see that our next generation is engaging in rational discourse about the macrocosmic state of the world stage is immeasurably reassuring. All of us would do well to listen to intelligent thinkers who are not yet deluded into the sense of war as an attractive and immediate recourse.

  5. Well worth saving yesterday and reading it with a clear mind today.

  6. seydlitz89 says:

    This is interesting, but of only limited use in terms of strategic theory. Aggressors always think they are going to win, and the groupthink surrounding the political leadership usually reflects the Weltanschauung they all share. Bush discounted any intelligence that indicated what he didn’t want to hear, and his “policy shaped intelligence more than vice versa,” as Paul Pillar (and others) have documented.

  7. Pingback: Weekend Reading « Backslash Scott Thoughts

  8. Pingback: Guilt, Inaction, and Trying to Pay it Backwards | Plastic Manzikert

  9. Pingback: From safe zones to where? » Gunpowder & Lead

  10. Pingback: Links I Liked « Hands Wide Open

  11. Pingback: Thinking About Thinking About War | dianawueger.com
