Broken Science Epistemological Camp Report

There was another Broken Science event this past weekend in the Phoenix area. Yours Truly was there.

I gather the videos will be online at some point, at the Broken Science YouTube channel, among other places. Meanwhile, here’s a summary.

Co-founder Emily Kaplan (who organized everything) opened the festivities by giving us the state of affairs. Incidentally, she has many new short info videos on the YouTube channel.

Up next was co-founder Greg Glassman. He started with a lovely blown-up photograph of the 1927 Solvay Conference, which is interesting in the history of science for many reasons. For us, because it was there that the idea that probability could be ontic—that it was a real physical thing—was solidified.

I followed up with a short speech on why I thought this was nuts, and on the harms that have come from it, both in physics and in all fields that use statistical models.

The first featured speaker, physicist Anthony (Anton) Garrett, gave us all background on these ideas. Anton had the distinct advantage of knowing and working with both ET Jaynes and David Stove. And if my memory is right (I am a terrible note taker) he also knew Richard Threlkeld Cox.

Cox was the physicist—and it seems to almost always be physicists who make the leaps in our understanding of probability—who derived probability from some simple, intuitive axioms. He showed that measures of belief that fit these axioms naturally lead to probability.

If you’ve ever seen Kolmogorov’s axiomatic approach to probability, you’ll get the same answer, excepting some technicalities not important to us. But it’s much clearer under Cox that all probabilities of propositions are conditional on the assumptions we make. As I show in Uncertainty, you get that with Kolmogorov, too, but because he was much more concerned with mathematics, it’s harder to spot.

If you want to follow Cox’s proof, you can find a link to it at the BSI Critical Thinking Camp site. If you’ve seen any kind of functional analysis, you’ll find Cox’s treatment a breeze. If not, and for fun, here are his three axioms, modified (or rather abbreviated) from the Wokepedia entry (which is linked at BSI site):

  1. “The plausibility of a proposition is a real number and is dependent on information we have related to the proposition.”
  2. “Plausibilities should vary sensibly with the assessment of plausibilities in the model.”
  3. “If the plausibility of a proposition can be derived in many ways, all the results must be equal.”
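
Axiom 3 is the consistency requirement that does the heavy lifting. Here is a minimal sketch of what it demands, using an invented six-card deck and exact fractions; the deck and all the names are my own illustrative assumptions, not anything from Cox’s paper:

```python
from fractions import Fraction

# A toy finite setup: a hypothetical 6-card deck, invented purely for
# illustration: 2 red face cards, 1 red number card, 1 black face card,
# 2 black number cards.
cards = [("red", "face")] * 2 + [("red", "number")] + \
        [("black", "face")] + [("black", "number")] * 2

def prob(pred):
    # Axiom 1: plausibility as a real number, here a ratio of counts.
    return Fraction(sum(1 for c in cards if pred(c)), len(cards))

def cond(pred, given):
    # Conditional plausibility: restrict attention to the cards
    # satisfying the condition, i.e. the assumed information.
    sub = [c for c in cards if given(c)]
    return Fraction(sum(1 for c in sub if pred(c)), len(sub))

red = lambda c: c[0] == "red"
face = lambda c: c[1] == "face"

# Axiom 3: deriving the plausibility of "red and face" two different
# ways must give the same answer -- and it does.
way1 = prob(red) * cond(face, red)    # P(red) * P(face | red)
way2 = prob(face) * cond(red, face)   # P(face) * P(red | face)
direct = prob(lambda c: red(c) and face(c))
assert way1 == way2 == direct == Fraction(1, 3)
```

Every probability above is conditional on the assumed deck; change the deck, and every number changes with it.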

Naturally, there’s a lot of ankle biting over Cox’s proof, with many pointing out this or that condition, like the bit about “real” (meaning unreal) numbers, which makes it seem like it doesn’t work. Countering these are many others showing that, yes, indeed, it does work. I’m in this camp, of course, as was Jaynes and as is Anton. He has a New & Improved proof which fixes the complaints.

Incidentally, none of the complaints are valid if we start and stay with finite and discrete measures. It’s only going out to limits that things get weird. As they always do when infinities are involved. Because Infinity is a bizarre place. We can scarcely grasp the infinity of counting numbers (1,2,3,…) let alone “real” ones. So it’s no wonder that when we say we can quantify our uncertainty over numbers we can’t even fully grasp, disputes arise.
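
A sketch of the contrast, with an even-numbers example of my own devising: on a finite set everything is exact and uncontroversial, and the trouble only appears when you reach for the limit:

```python
from fractions import Fraction

# Finite and discrete: everything is exact and uncontroversial.
# E.g., the chance a number drawn uniformly from {1, ..., N} is even.
def p_even(N):
    return Fraction(sum(1 for k in range(1, N + 1) if k % 2 == 0), N)

assert p_even(10) == Fraction(1, 2)
assert p_even(11) == Fraction(5, 11)   # exact, no rounding, no dispute

# The weirdness only arrives in the limit: there is no uniform
# distribution over ALL the counting numbers. Each particular number
# would get probability 1/N, which shrinks to 0 as N grows, yet the
# probabilities must still sum to 1. No assignment satisfies both.
for N in (10, 10_000, 10_000_000):
    print(N, float(Fraction(1, N)))   # heads to 0 as N grows
```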

Next up was Gerd Gigerenzer, who is not only always worth reading, but is an absolute sweetheart. Several of his papers are at the BSI link, and must be read.

Like “Statistical Rituals: The Replication Delusion and How We Got There”. A couple of sentences from the Abstract will give you the flavor:

The “replication crisis” has been attributed to misguided external incentives gamed by researchers (the strategic-game hypothesis). Here, I want to draw attention to a complementary internal factor, namely, researchers’ widespread faith in a statistical ritual and associated delusions (the statistical-ritual hypothesis). The “null ritual,” unknown in statistics proper, eliminates judgment precisely at points where statistical theories demand it.

Greg put up this poster, which fleshes out the blush-inducing cringe of scientists chasing after wee Ps.

Now even though there is argument after argument, and demonstrations galore over a long, oh a very long, period, showing P-values should be thrown on the fire of failed ideas, they are still used.

In large part, as Gigerenzer showed us, because of ritual. Wee Ps are magic. They remove the need to think, a function which is always welcome.
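
To see why the ritual misleads: even when nothing whatever is going on, about one in twenty experiments will still produce a wee P. A minimal simulation sketch, mine alone, using a large-sample z test of equal means (the function names and sample sizes are illustrative assumptions):

```python
import math
import random

random.seed(1)

def z_p_value(a, b):
    """Two-sided p-value from a large-sample z test of equal means."""
    na, nb = len(a), len(b)
    ma = sum(a) / na
    mb = sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

# 2000 "experiments" in which the null is TRUE: both groups are pure
# noise drawn from the same normal distribution.
trials = 2000
wee = 0
for _ in range(trials):
    a = [random.gauss(0, 1) for _ in range(50)]
    b = [random.gauss(0, 1) for _ in range(50)]
    if z_p_value(a, b) < 0.05:
        wee += 1

print(wee / trials)  # roughly 0.05: wee Ps from nothing at all
```

Run the ritual enough times on noise and “findings” arrive on schedule, which is the replication crisis in miniature.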

In addition to the talks and very welcome camaraderie, we had many discussions, great food, better wine (too much!), a Mariachi band, and lovely warm sunshine.

More to come.

Subscribe or donate to support this site and its wholly independent host using credit card: click here. Or use the paid subscription at Substack. Cash App: $WilliamMBriggs. For Zelle, use my email, and please include yours so I know who to thank.


  1. Robin

    Great work!

    That the conclusions of the book by Nunnally date to 1975 is not surprising. By 1981, my stat professors were warning us about the use of p-values. The American Statistical Association has tried to put a few nails in the P-value coffin as well, but, due to the pharmaceutical industry, this zombie keeps escaping.

    Unfortunately, without the use of P-values, big Pharma has no way to prove their products are “safe and effective”. They would have to abandon 80% of their products, and they’d have little to show from their massive research budgets.

  2. McChuck

    There have been far too many “scientists” in the last several decades, forcing their average intellect down from midwit to merely average, or to below average in the case of the “social sciences”.

  3. Kevin

    During the 6 weeks prior to lockdown, I had access to Google Scholar, and spent much of my time reading Gigerenzer’s research articles, having already read his “popular science” books. I particularly loved his takedown of the Kahneman & Tversky “Linda” problem, and of course the statistical rituals you refer to in your post. However, other than a post from him in March 2020 whose title was a play on the Nietzschean phrase “What does not kill me makes me stronger”, he seemed to have disappeared from view during the entire lockdown. And in my opinion, that was when his expertise was most needed. Why was he silent, or did I just not have access to his comments?

  4. Cary D Cotterman

    There’s nothing like a good mariachi band.

  5. I didn’t know there was a BSI con this past weekend!

  6. It seems what you call “ritual” is really a convention. Aren’t conventions necessary for practicing science?

  7. Dieter Kief

    “Incidentally, none of the complaints are valid if we start and stay with finite and discrete measures. It’s only going out to limits that things get weird.”


    This is a crossing with all kinds of lanes.
    That things get complicated when we go out to the limits fools people in many ways. One of the common delusions: that simple things are less true than things that can only be approached via lots of (pre)suppositions and operations.

    (In a way, you could read the late Wittgenstein’s Philosophical Remarks etc. as proof that yes, there is a problem, and no: complicated (that is, (pre)supposition-rich) does not necessarily mean truer, nor True, on the basis of formal distinctions; complexity alone is empty, but slippery when wet.)

    Put differently: The sorcerer’s part of our thinking kicks in the more complex problems seem to be. But that does not speak for complexity. It is rather a warning sign to be careful as the suppositions and pre-suppositions and abstract operations pile up.

    The linguist George Lakoff has had an eye on this very problem and came up with a proposition concerning language use: beware, because most terms have a sweet spot; beware not least because these sweet spots (the usual uses of the terms) are loaded with lots of practical use, which means they are proven to work from experience. And the meaning of words gets fuzzy the more you touch the outer regions of the semantic realm they encompass.
    In a way, his thesis in Women, Fire, and Dangerous Things is a variation of the late Wittgenstein’s Gebrauchstheorie der Bedeutung (use theory of meaning): the idea that how we use words constitutes what they mean. Words do not contain more than what the ways in which we make use of them put into them; words have no transcendental properties (which does not stand in the way of making transcendental/spiritual/religious/sci-fi… use of them).

  8. I mean, there has to be some consensus in science (consensus on methodology, terminology, etc.).

  9. Dieter Kief

    Kevin – Gerd Gigerenzer seemed mostly to avoid conflict during Covid. He did write some ho-hum articles in the left-wing German daily Frankfurter Rundschau, and he gave a long interview to the schoolmarms’ weekly Die Zeit, which was in panic mode all the time, and he hardly contradicted them. Also: he is an old man now.
