For readers in a hurry, here’s the answer: There is no probability of a nuclear war. Nor of a conventional one. Nor even of you being victor in your next King Of The Hill Battle.
Thanks to Ken Fitch for pointing us to an article with the same name, by an author who disagrees with me and says there is a probability of a nuclear war.
The author is Alex Tabarrok, who provides a table from another author who compiled various sources, each providing its own probability of a nuclear war. These are “annualized”, meaning you can use the numbers and say “There is an X% chance of a nuclear war this year.” And the same next year.
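As an aside on the arithmetic of “annualized” figures: if one did accept a fixed annual probability (which, as argued below, one shouldn’t), the implied chance of at least one war over many years compounds quickly. A minimal sketch, assuming independence across years, which is itself a large unstated assumption:

```python
def cumulative_chance(p: float, n: int) -> float:
    """Chance of at least one event in n years, given a fixed
    annual probability p and (the big assumption) independence."""
    return 1 - (1 - p) ** n

# Example with invented numbers: a 1% annual figure over 50 years
# implies nearly a 40% chance of at least one war.
print(round(cumulative_chance(0.01, 50), 3))  # about 0.395
```

This is how the “annualized” convention is usually unpacked; note how much certainty the tidy formula smuggles in.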
If there was a probability of a nuclear war. Which there isn’t. Not unconditionally.
The compiler meant a nuclear war with Russia, and so that is the proposition to which her experts put their minds. Will there be a nuclear war? I take it to mean the use in anger of a nuclear weapon of any kind. The answer is yes or no. The answer is yes if the war is caused to happen by some entity, and no if it is caused not to happen for any number of reasons.
If we knew the cause, then we’d know the answer. God knows what will happen. Could anybody besides God know? Well, it’s not logically impossible. There might be somebody out there who has thought through every possible contingency (note that word!), and judged correctly every action every actor would take, up to and including whether the button is pushed or not, for every day between now and the crack of doom. There are such things as prophets.
Barring direct revelation, we only have our wits to go on. And our wits have fits. The best we can do is gather all those bits of information which we believe are probative of the question and think through all possibilities relative to this.
Which everybody already knows. It’s the same thing people who bet on sports, horses, or stocks do.
Now everybody also knows even the best sports bettors don’t bat a thousand, to sportily mix a metaphor. Surely who will win tonight’s game is much easier to predict than whether the bombs will start flying. This means the obvious: (1) even experts don’t know exactly what to look for, and (2) the causes of a win or a war are too many to grasp.
Same thing is true even for rolls of dice! The causes aren’t hard to know in principle; it’s just some bouncing around using the equations of motion. If you knew, precisely knew, the initial conditions, and the causes, then you’d know exactly what would happen on the toss.
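The point about the dice can be sketched with a toy. Here a made-up deterministic function stands in for the real equations of motion (an assumption purely for illustration, not physics): the same initial condition always yields the same face, and “randomness” appears only because we don’t know the initial condition precisely.

```python
import math

# Toy stand-in for the equations of motion (invented, not real
# physics): a deterministic map from initial condition to die face.
def toss(speed: float) -> int:
    return int(abs(math.sin(speed)) * 1e6) % 6 + 1

# Same initial condition, same outcome, every single time:
assert toss(2.0) == toss(2.0)

# The appearance of chance comes only from our ignorance of `speed`:
# varying the initial condition varies the face.
faces = {toss(s / 10) for s in range(1, 100)}
```

The die is not “random”; we are merely uncertain about its causes, which is the whole argument in miniature.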
If you knew, precisely knew, what was on Vlad’s mind, and will be on Vlad’s mind, and the minds of all his generals, and the minds of all our generals, and the minds of all our goading neocon let’s-have-another-war armchair generals, and the minds of all peoples everywhere who will influence these other minds, and all conditions of the world that will influence the minds of those peoples everywhere, and you knew the calculus of free will, why, then you’d know whether or not there’d be a nuclear war. Simple!
Barring that level of precision, you can instead gather what little you do or can know. You’ll be left with a set of propositions you believe are probative of “Nuclear war is launched”. Best thing you could do is then say, “Given what I know, the chance of a war is slim”, and leave it at that. Without quantifying “slim.” Why?
Quantifying “slim” adds to the list of propositions you already hold about “Nuclear war is launched”. Some of those propositions will be wrong, some right. Saying you know how to get to an exact number means saying you know the exact mathematical way the evidential propositions you’re using relate to “Nuclear war is launched”. Well, you might be right about that, but you’re probably wrong. It’s just too complex, and there’s no way to verify your suppositions.
Still, the question of war is important. Decisions must be made regarding it. Decisions require knowing the uncertainty of the question. This is where the temptation to quantify the probability becomes, to many, overwhelming. Some of the decisions will be monetary, and money is a number; therefore, the probability must be made into a number, too, so that some fancy equations will work.
That leads to over-certainty. You’re cramming a quantified model into a box because you want numbers. That means ignoring all the other non-quantified bits of the problem. The equations become bloated in esteem.
There will be some who say you need quantified probabilities to make decisions, which is false. We make decisions all the time without quantifying anything. The outcome of a mistake about the war is not quantifiable, whether in saying there will be a war when there won’t be, or the opposite. It is to guarantee over-certainty to put artificial numbers on outcomes, just so some formal “decision analysis” model, fed with artificial probabilities, will work.
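The fragility of such formal machinery can be shown with a toy expected-value calculation, every number in it invented for illustration: tiny shifts in the artificial probability flip the recommended decision, so the apparent precision of the output is an illusion.

```python
# Toy "decision analysis" with invented numbers: act (at cost `cost`)
# against a disaster of size `loss`. The textbook rule says act when
# p * loss > cost. Watch a 0.2-percentage-point wiggle in the made-up
# probability flip the answer.
def should_act(p: float, loss: float = 1000.0, cost: float = 10.0) -> bool:
    return p * loss > cost

print(should_act(0.009))  # False: expected loss 9 < cost 10
print(should_act(0.011))  # True: expected loss 11 > cost 10
```

The formula works fine; the trouble is that nothing in the world supplies the 0.009 or the 0.011, and the decision hinges entirely on which artificial number you feed it.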