Wednesday, June 13, 2012

Truly Understanding What Combat Mortality Statistics Mean

I find it interesting sometimes to imagine how my worldview might change if I experienced different events.

It seems elementary that if you've made the best use of the data available, you should only change your mind based on new information. Merely experiencing an event without finding out anything you didn't know before ought not change your perception of things.

So it's funny to read about how the average person's views change with a particular experience, and try to hypothesize where your current views fit along the claimed evolution.

What prompted this (and continuing with the 'All-Fussell-All-The-Time' theme of the blog of late) was Paul Fussell's description of how the average soldier's views on the chances of death change over time.
In war it is not just the weak soldiers, or the sensitive ones, or the highly imaginative or cowardly ones, who will break down. All will break down if in combat long enough. "Long enough" is now defined by physicians and psychiatrists as between 200 and 240 days. For every frontline soldier in the Second World War, according to John Ellis, there was the "slowly dawning and dreadful realisation that there was no way out, that . . . it was only a matter of time before they got killed or maimed or broke down completely." As one British officer put it, "You go in, you come out, you go in again and you keep doing it until they break you or you are dead." This "slowly dawning and dreadful realisation" usually occurs as a result of two stages of rationalization and one of accurate perception:
1. It can't happen to me. I am too clever / agile / well-trained / good-looking / beloved / tightly laced / etc.
Personally, I can't imagine ever thinking this. Death is always certain, and there's always a chance that you're going to draw the unlucky number even in much safer events than combat. So while this might be a subconscious starting point, I doubt it. What about the second stage?
This persuasion gradually erodes into
2. It can happen to me, and I'd better be more careful. I can avoid the danger by keeping extra alert at all times / watching more prudently the way I take cover or dig in or expose my position by firing my weapon / etc.
This conviction attenuates in turn to the perception that death and injury are matters more of bad luck than lack of skill...
At a minimum, I think I'd start at this stage (or the first half of it, anyway) - it definitely can happen to you. The question is how much agency you have over the matter. Note that the description above tends not to frame things in probabilities: it can happen, but if I do X, then it can't. I think this is empirically a good description of the world - most people don't think in probabilities.

But to those that do, it's obvious that your dying in warfare can be both a) largely determined by chance, and b) something you can still shift a bit at the margin by not doing stupid things.

In essence, you're spinning a roulette wheel where any number above 3 means you're dead, or something equivalent. You can have crummy odds and still understand exactly what the odds are.

So that, in short, would be where I think I'd view World War 2 combat probabilities.

But I don't think I would have gotten to the conclusion that makes up Fussell's stage 3:
...making inevitable the third stage of awareness:
3. It is going to happen to me, and only my not being there is going to prevent it.

On a number of dimensions, that is actually incredibly clear-sighted. Granted, it still makes the mistake of not thinking in the probabilistic way (a probability of 99% is not the same thing as a probability of 100%).

But which bias are you more likely to be succumbing to? Being overly optimistic that you will somehow be different and escape it all, or ignoring the tiny chance that you might actually make it? To ask the question is to know the answer. The bias is all on the side of optimism. If you round your estimated survival probability down to zero, it won't change the answer by much, in the same way that assuming you'll never win the lottery leads to almost certainly better choices than assuming any non-trivial probability of winning.

And indeed, it only takes a minor modification to the premise to make it technically correct as well: begin the sentence with the phrase 'Given long enough, ...'. This is expressed most memorably in the motto of Zero Hedge - on a long enough timeline, the survival rate for everyone drops to zero.
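The compounding here is easy to underestimate, so a minimal sketch may help. The per-day casualty probability below is purely illustrative (Fussell gives no such number); the point is only that even a modest daily risk, repeated over the 200-240 days physicians cite, erodes survival odds toward zero:

```python
# Illustrative only: the 2% per-day figure is an assumption,
# chosen to show how repeated exposure compounds.
P_DEATH_PER_DAY = 0.02

def survival_probability(days, p=P_DEATH_PER_DAY):
    """Probability of surviving `days` consecutive days of combat,
    treating each day as an independent draw with death probability p."""
    return (1 - p) ** days

for days in (30, 100, 200, 240):
    print(f"{days:>3} days: {survival_probability(days):.1%} chance of survival")
```

Under this (assumed) 2% daily risk, surviving a month is roughly a coin flip, and by day 240 the odds are under one percent - which is exactly the gap between 'a probability of 99%' and 'a probability of 100%' narrowing to the point where rounding to certainty costs you almost nothing.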

In wartime, you don't even need the timeline to be that long.

Which makes the second half of the sentence all the more powerful - the only way out is to not be there.

That is something that I wouldn't have figured out with equivalent clarity.

In the middle of combat, there are also very few ways out. Desert and you run a good chance of getting shot.

I can imagine that goes a fair way to explaining why people go insane in war - you figure out that it is now inevitable that you'll die a horrible, gruesome death at some random (but imminent) point, and until then you're going to be surrounded by horror and brutality.

The phrase 'only my not being there is going to prevent it' can also be paraphrased as 'the only winning move is not to play.'
