I recently upgraded to Ubuntu 9.10, which includes a new version of Audacity, 1.3.9, a persnickety package that keeps locking up. Word on the ‘net is that this happens to the 64-bit version, but I have both a 32-bit and a 64-bit machine and it locks up on both. I tried Ardour, a professional soundboard, but the thing is so complicated that I haven’t figured it out in time to record this week’s lecture.
R is coming!
There have been requests for a series of podcasts on R. These are coming, probably in late January or February. R forms Chapter 5 of the class notes.
Central to logical probability is the recognition that all probability is conditional. Most books don’t get around to defining “conditional probability” until many chapters in, but it’s best to understand from the start that all probability is with respect to, or given, or assuming that, certain information is true. Think of the die example we have been using all along: the probability of seeing a 6 is conditional on our premises.
It was not a good habit of the old books to write probability so that it looks unconditional: this led to many misunderstandings. For a good example, grab any old book and look at the notation used for gambling problems. If ever there was an argument for logical probability, it’s gambling. They will write, for example, the probability of dealing an “Ace” as Pr(Ace). The premises are given separately: there is a deck of 52 cards, just 4 of which are labeled “Ace”, and just one card will be drawn. All that evidence should have been listed explicitly, to avoid confusion: say, Pr(Ace | E).
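To see the arithmetic the evidence E licenses, here is a minimal sketch in Python (the premises and the Pr(Ace | E) notation are from the text; the variable names are my own):

```python
from fractions import Fraction

# Evidence E: a deck of 52 cards, just 4 of which are labeled "Ace",
# and just one card will be drawn.
cards_in_deck = 52
aces_in_deck = 4

# Pr(Ace | E): kept as an exact rational fraction, not a decimal.
pr_ace_given_e = Fraction(aces_in_deck, cards_in_deck)

print(pr_ace_given_e)  # 1/13
```

The point of writing `pr_ace_given_e` rather than plain `pr_ace` is the same as the point of the notation: the number 1/13 only follows once E is stated.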
Good news! All philosophies of probability agree on the following rules, as long as the events, propositions, and statements in which we have an interest are discrete and finite. Discrete means non-continuous: we’re not (yet) going to use real numbers, only integers and rational fractions. And we are going to limit our objects of interest to be of a finite nature. This is acceptable because (so far) all events of interest to us are discrete and finite.
Probability Rule #1
Conditional on some evidence, if some thing of interest can be broken into parts, and one of those parts must be true, then the probabilities of those parts (each conditional on our evidence) sum to 1. Take our die example: it can be broken into the parts “a 1 shows”, “a 2 shows”, …, “a 6 shows.” The probability that some number (from 1–6) shows is 1.
Another way to state this is “Either a 1 shows or a 2 shows or a 3 shows…” The probability of that statement, conditional on our standard die premises, is 1. Something must show!
Probability Rule #1: ORs turn to +’s.
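Rule #1 can be checked with exact fractions. This is a sketch under the standard die premises from the text (one fair six-sided die, one face must show); the dictionary layout is my own:

```python
from fractions import Fraction

# Premises: a six-sided die, one face from 1..6 must show,
# and no face is favored over another.
pr = {face: Fraction(1, 6) for face in range(1, 7)}

# Rule #1: the probabilities of the exhaustive parts sum to 1.
assert sum(pr.values()) == 1

# ORs turn to +'s:
# Pr("a 1 shows or a 2 shows" | premises) = Pr(1) + Pr(2)
pr_1_or_2 = pr[1] + pr[2]
print(pr_1_or_2)  # 1/3
```

Using `Fraction` instead of floating point keeps us inside the “integers and rational fractions” world the chapter promised: the sum is exactly 1, not 0.9999….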
Probability Rule #2
Conditional on some evidence, if we have two (or more) events (or propositions, etc.) that can occur, and knowledge of the first event is irrelevant to knowing anything about the second event, then the probability of the first event and the second event occurring is the probability of the first event (given the evidence) times the probability of the second event (given the evidence).
Take throwing two dice and our standard premises. What is the probability (conditional on that evidence) of seeing two 6s? Knowing the result of the first throw tells us nothing about the second. Knowing the result of the second throw tells us nothing about the first. That is, knowledge of either event is irrelevant to knowledge of the other. Then the probability of the statement “a 6 shows on the first throw and a 6 shows on the second,” given our premises, is the probability of a 6 on the first throw times the probability of a 6 on the second throw.
Probability Rule #2: ANDs turn to x’s.
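Rule #2 can also be checked two ways: once by the multiplication the rule prescribes, and once by brute-force enumeration of the 36 equally likely pairs of throws. A sketch, again using the two-dice premises from the text:

```python
from fractions import Fraction
from itertools import product

pr_six = Fraction(1, 6)  # Pr(a 6 shows | standard die premises)

# Rule #2: knowledge of one throw is irrelevant to the other,
# so ANDs turn to x's.
pr_two_sixes = pr_six * pr_six

# Brute-force check: enumerate all 36 equally likely pairs of throws
# and count those where both dice show a 6.
pairs = list(product(range(1, 7), repeat=2))
count = sum(1 for a, b in pairs if a == 6 and b == 6)
assert Fraction(count, len(pairs)) == pr_two_sixes

print(pr_two_sixes)  # 1/36
```

The enumeration agreeing with the multiplication is exactly what irrelevance buys us: we may multiply instead of listing every case.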
The classical way to state Rule #2 is that the events are independent. I prefer Keynes’s term irrelevant because it keeps everything in terms of knowledge. This can be very important in some situations; mistakes in reasoning are easier to make with the classical terminology.