The prize will not be awarded, because the task is impossible. Of course, there are no unconditional truths, either, since meta-logic encompasses both ordinary Aristotelian two-valued logic and probability. But let that pass for this contest.
It is necessary to describe the deadly sin of reification, most recently seen in the post On The Probability God Exists. There some took to writing things like “P(G)” where G was taken as the proposition “God Exists.” Because there was no “given bar”, i.e. no “|”, the probability seemed unconditional and therefore in need of quantification. (“Priors”—I shudder to recall—were even mentioned.)
Well, any probability can be written in such a shorthand way as to imply lack of conditions, but the written equation is not alive, it is not a real thing. In particular it is statisticians (and mathematicians) who fall under the spell of their—let’s admit it—beautiful scratchings and come to see them as having a life apart from their own minds.
Notation is a great facilitator and allows easy manipulation, but just because a thing can be written, or derived, does not mean the thing has any bearing on reality. Applied mathematicians constantly point this out to their pure mathematician brothers, and it’s about time philosophers made this known to statisticians.
Now it’s true that in just about any introductory textbook on probability you will see the “equation” “P(H)”, meant to indicate “the probability of a Head in a ‘fair’ coin flip.” This is always assigned the value 1/2. Never mind (as in never mind) how that assignment comes about and the complications of the word “fair”. Instead look at the notation: it is written to suggest that, lo, here we have an unconditional probability.
We do not. The conditions are all there in the text and are what allow the quantification. In keeping with sanity, the equation should be written P(H|E) where E is the evidence (list of premises, observations, or other things taken for granted) from which we derive or assign the probability. The E is always (as in always) there.
And that is the point: the ever-present E. E may be simple, i.e. nothing more than our intuition or faith, or it may be complex, i.e. a compound proposition mixing premises with observations and inferences, but it always exists in every single probability ever.
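The point can be made concrete in a few lines of code. Below is a minimal sketch (the `prob` helper is my own hypothetical illustration, not anything from the post): the "probability of H" simply cannot be computed until the evidence E, here a list of premises about the possible outcomes, is handed over. Change E and the "same" P(H) changes, which is all the proof one needs that the number was never unconditional.

```python
from fractions import Fraction

def prob(outcome, evidence):
    """P(outcome | evidence): evidence is an explicit list of outcomes
    the premises say are possible and equally likely. Without this
    argument there is literally nothing to compute."""
    return Fraction(evidence.count(outcome), len(evidence))

# E1: a two-sided coin, one side H, one side T, one side must show
print(prob("H", ["H", "T"]))       # 1/2

# E2: different premises -- a three-faced object with two H faces
print(prob("H", ["H", "H", "T"]))  # 2/3
```

Notice the function has no sensible default for `evidence`: omitting it is not "no conditions", it is a missing argument.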
The reason textbooks write things like P(H) as if it were unconditional is because they want to introduce “conditional” probability later, as if it were a different thing. Because why? Because then they get to show off new mathematical techniques to manipulate these new symbols. And this is so because statisticians are under the mistaken impression that probability is a branch of mathematics, which it surely is not. At least, not when the probability is used to quantify uncertainty in anything of interest to human beings.
In short, it does no good to write P(X) and hide the conditions outside the equation, as if the equation itself were the probability. The equation is just shorthand for the real thing.
Nevertheless, I am willing to be proved wrong. If you think you can demonstrate a probability, even a probability of 1, which is conditioned on nothing, then put it in the comments below and I will personally congratulate you, award you the BIP, and admit my error.
Rules: Don’t forget the very rigorous definition of nothing, which means, just as you might suspect, no thing. Anything is something and thus not nothing. Nothing is the lack of every thing. Just because you don’t write a thing (as in “P(G)”, “P(H)”, etc.) does not mean the thing isn’t there: it merely means the thing is not written. It is still there penned in invisible ink. Intuition, i.e. faith, is a thing and is therefore not nothing.
(Incidentally, you will quickly notice we are on solid ground if there appear comments carping about the rules instead of engaging the problem, or by the absence of those regulars who ordinarily argue such matters but who were somehow busy this week and couldn’t attend.)
Update My above prediction turned out pretty well (the parenthetical one). But as a service to readers who were unaware that a problem existed, and who were therefore curiously anxious to deny it, I pulled some quotes.
I got these from typing “unconditional probability” in scholar.google.com. Try it yourself. It’s fun.
“Secondly, there must be at least as many distinct conditional probability values as there are distinct unconditional probability values – to any unconditional probability P(Z), there corresponds the conditional probability P(Z/K) which has the same value.” Probabilities of conditionals — Revisited.
“Thus the unconditional probability that he will still be in B in period t + i given that he starts at t is 3′. The unconditional probability that he will not be in B is (1 – 3′).” Occupational Choice under Uncertainty.
“The unconditional probability that at the last stage of an n + 1 stage search we will find a new species is equal to…” On Estimating the Probability of Discovering a New Species.
I got these from typing the same into books.google.com. This is funnerer (yes, funnerer).
“However, there are some considerations that seem to favor the primacy of conditional probability…” Yes, amen, there are. “…On the other hand, given an unconditional probability, there is always a corresponding conditional probability lurking in the background.” Yes, there is. Philosophy of Statistics
“There are indications in Cohen’s paper that he does not use the unconditional probability of an event A to represent the probability of A prior to the receipt of a body of evidence relevant to A, but rather that he means to indicate the probability of A conditioned on all conceivable relevant evidence about A…” Probability and Inference in the Law of Evidence: The Uses and Limits of…
“On this reading, therefore, the apparently unconditional probability P(A), that the second toss lands heads, is really the probability EP(A|K), of A conditional on some background evidence K, e.g. about how the coin is tossed.” Probability: A Philosophical Introduction (Hey, JH, does this one sound familiar?)
And it goes on and on. Texts with familiarity or a basis in philosophy at least acknowledge the discussion. Those in math-stats or Stats 101 do not.
Now the reason this subject is so important is that by disguising or not acknowledging the full conditions, and therefore reifying the equations, probability does not seem like a matter of logic, which it is. It is therefore of fundamental importance, given the areas which statistics touches.