PH231 Evidence and Policy

Class 3: Probability Theory

Alexandru Marcoci

"Statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write!"

From the 1951 presidential address of mathematical statistician Samuel S. Wilks (1906–1964) to the American Statistical Association, published in JASA, Vol. 46, No. 253, pp. 1–18.

"The great body of physical science, a great deal of the essential fact of financial science, and endless social and political problems are only accessible and only thinkable to those who have had a sound training in mathematical analysis, and the time may not be very remote when it will be understood that for complete initiation as an efficient citizen of one of the new great complex worldwide States that are now developing, it is as necessary to be able to compute, to think in averages and maxima and minima, as it is now to be able to read and write." (H. G. Wells, 1903)

Probability Theory

"The theory of probability, as a mathematical discipline, can and should be developed from axioms in exactly the same way as Geometry and Algebra. This means that after we have defined the elements to be studied and their basic relations, and have stated the axioms by which these relations are to be governed, all further exposition must be based exclusively on these axioms, independent of the usual concrete meaning of these elements and their relations." (Kolmogorov 1950, 1)

Probability Spaces

$\langle X,\mathcal{S}, p\rangle$

Each component has a type, an interpretation, and a concrete instance for the tossing of a fair coin:

$X$: a set. Interpretation: the elementary events. Fair coin: $\{H,T\}$.

$\mathcal{S}$: a collection of subsets of $X$ ($\mathcal{S}\subseteq 2^X$) closed under $\cup$, $\cap$ and $\setminus$. Interpretation: the general events. Fair coin: $\{\{H\},\{T\},\{H,T\},\emptyset\}$.

$p$: a function $p:\mathcal{S}\to[0,1]$ with $p(\emptyset)=0$, $p(X)=1$, and $p(A_1\cup\ldots\cup A_n)=p(A_1)+\ldots+p(A_n)$ whenever $A_i\cap A_j=\emptyset$ for $i\not=j$. Interpretation: ? Fair coin: $p(\{H\})=p(\{T\})=\frac{1}{2}$, $p(\emptyset)=0$, $p(\{H,T\})=1$.
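As a quick check that the fair-coin assignment satisfies the additivity axiom: since $\{H\}\cap\{T\}=\emptyset$,

$p(\{H,T\})=p(\{H\}\cup\{T\})=p(\{H\})+p(\{T\})=\frac{1}{2}+\frac{1}{2}=1=p(X)$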

Example 1: Probability of Two Heads

$X=\{(H,H),(H,T),(T,H),(T,T)\}$
$\mathcal{S}=2^X=\{\{(H,H)\},\ldots,\{(H,T),(T,H),(T,T)\},\ldots\}$
$p(\{(i,j)\})=\frac{1}{4};\ i,j\in\{H,T\}$
  1. Probability of two heads: $p(\{(H,H)\})=\frac{1}{4}$
  2. Probability that the first toss is heads: $p(\{(H,T),(H,H)\})=p(\{(H,T)\})+p(\{(H,H)\})=\frac{1}{2}$
  3. Probability that the second toss is heads: $p(\{(T,H),(H,H)\})=p(\{(T,H)\})+p(\{(H,H)\})=\frac{1}{2}$

Example 2: Probability of Throwing an Even Number

$X=\{1,2,3,4,5,6\}$
$\mathcal{S}=2^X=\{\{1\},\{2\},\ldots,\{1,5,6\},\ldots\}$
$p(\{i\})=\frac{1}{6};\ i\in\{1,2,3,4,5,6\}$
  1. Probability of throwing $4$: $p(\{4\})=\frac{1}{6}$
  2. Probability of throwing an even number: $p(\{2,4,6\})=p(\{2\})+p(\{4\})+p(\{6\})=\frac{1}{2}$
  3. Probability of throwing a number $\geq$ 5: $p(\{5,6\})=p(\{5\})+p(\{6\})=\frac{1}{3}$

Independence

Events $A$ and $B$ are independent if and only if:

$p(A\cap B)=p(A)\times p(B)$

Example

Is tossing heads first independent of tossing tails second, when flipping a fair coin twice?

$X=\{(H,H),(H,T),(T,H),(T,T)\}$
$\mathcal{S}=2^X=\{\{(H,H)\},\ldots,\{(H,T),(T,H),(T,T)\},\ldots\}$
$p(\{(i,j)\})=\frac{1}{4};\ i,j\in\{H,T\}$
  1. Probability that the first toss is heads: $p(\{(H,T),(H,H)\})=p(\{(H,T)\})+p(\{(H,H)\})=\frac{1}{2}$
  2. Probability that the second toss is tails: $p(\{(H,T),(T,T)\})=p(\{(H,T)\})+p(\{(T,T)\})=\frac{1}{2}$
  3. Probability of heads first and tails second: $p(\{(H,T),(H,H)\}\cap\{(H,T),(T,T)\})=p(\{(H,T)\})=\frac{1}{4}=\frac{1}{2}\times\frac{1}{2}$

The product rule holds, so the two events are independent.

Independence (II)

$X=\{$♥2$,\ldots,$♥A$,$♦2$,\ldots,$♦A$,$♣2$,\ldots,$♣A$,$♠2$,\ldots,$♠A$\}$ (a standard 52-card deck)
$\mathcal{S}=2^X=\{\{$♥2$\},\ldots,\{$♦2$,$♣7$,$♠Q$\},\ldots\}$
$p(\{s\,i\})=\frac{1}{52};\ s\in\{$♥$,$♦$,$♣$,$♠$\},\ i\in\{2,\ldots,A\}$
  1. Probability of drawing a red card:
    $p(\{$♥2$,\ldots,$♥A$,$♦2$,\ldots,$♦A$\})=26\times \frac{1}{52}=\frac{1}{2}$
  2. Probability of drawing a heart:
    $p(\{$♥2$,\ldots,$♥A$\})=13\times \frac{1}{52}=\frac{1}{4}$
  3. Probability of drawing a card that is both red and a heart:
    $p(\{$♥2$,\ldots,$♥A$\})=13\times \frac{1}{52}=\frac{1}{4}\not =\frac{1}{2}\times\frac{1}{4} $

The product rule fails, so drawing a red card and drawing a heart are not independent: every heart is red.

In general, whether or not $A$ and $B$ are independent:

$p(A\cap B)=p(A)\times p(B|A)$

Or

$p(B|A)=\frac{p(A\cap B)}{p(A)}$ (provided $p(A)>0$)
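Applied to the card example above, this says how learning that the drawn card is red should change the probability that it is a heart:

$p(\text{hearts}|\text{red})=\frac{p(\text{red}\cap\text{hearts})}{p(\text{red})}=\frac{1/4}{1/2}=\frac{1}{2}\not=\frac{1}{4}=p(\text{hearts})$

Conditioning on "red" raises the probability of "hearts", which is another way of seeing that the two events are not independent.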

Total Probability

If $B_1,\ldots,B_n$ partition $X$, i.e. $\bigcup_{i=1}^n B_i=X$ and $B_i\cap B_j=\emptyset$ whenever $i\not=j$, then:

$p(A)=\sum_{i=1}^n{p(A\cap B_i)}$

Or

$p(A)=\sum_{i=1}^n{p(A| B_i)\times p(B_i)}$

Therefore

$p(B_k|A)=\frac{p(A|B_k)\times p(B_k)}{\sum_{i=1}^n{p(A| B_i)\times p(B_i)}}$
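This is Bayes' theorem, and it needs no new axiom: the left-hand side is $\frac{p(A\cap B_k)}{p(A)}$ by the definition of conditional probability, the numerator becomes $p(A|B_k)\times p(B_k)$ by the multiplication rule, and the denominator becomes $\sum_{i=1}^n p(A|B_i)\times p(B_i)$ by total probability:

$p(B_k|A)=\frac{p(A\cap B_k)}{p(A)}=\frac{p(A|B_k)\times p(B_k)}{\sum_{i=1}^n{p(A| B_i)\times p(B_i)}}$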

Putting Probability to Work

Casscells, Schoenberger and Graboys (1978)

"Suppose we are testing a particular population of people for a particular disease D. The incidence of D is 1 in 1000. The test T we are using is a good one - it has a 100% sensitivity: that is the false negative rate is zero (no one who has the disease D ever tests negative); and 95% specificity: that is of those who test positive 95% will actually have the disease (so the rate of false positives, that is of those who do not have the disease even though they tested positive is only 5%). A patient of yours has tested positive, what is the chance that s/he has the disease D?" (Howson, 2000: 52-3)

Monty Hall Problem

"Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, 'Do you want to pick door No. 2?'" (vos Savant 1990)

Is it to your advantage to switch?
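On the standard assumptions that the host always opens a door you did not pick that hides a goat, choosing at random when two such doors are available, Bayes' theorem answers the question. Let $C_i$ be "the car is behind door No. $i$", with $p(C_1)=p(C_2)=p(C_3)=\frac{1}{3}$, and let $O_3$ be "the host opens door No. 3". Given that you picked door No. 1, $p(O_3|C_1)=\frac{1}{2}$, $p(O_3|C_2)=1$ and $p(O_3|C_3)=0$, so:

$p(C_2|O_3)=\frac{p(O_3|C_2)\times p(C_2)}{\sum_{i=1}^3{p(O_3|C_i)\times p(C_i)}}=\frac{1\times\frac{1}{3}}{\frac{1}{2}\times\frac{1}{3}+1\times\frac{1}{3}+0\times\frac{1}{3}}=\frac{2}{3}$

Sticking wins with probability $\frac{1}{3}$; switching wins with probability $\frac{2}{3}$.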


References

  1. Casscells, W., A. Schoenberger and T. B. Graboys. 1978. Interpretation by physicians of clinical laboratory results. New England Journal of Medicine 299(18): 999–1001.
  2. Howson, Colin. 2000. Hume's Problem: Induction and the Justification of Belief. Oxford: Oxford University Press.
  3. Kolmogorov, A. N. 1950. Foundations of the Theory of Probability. New York: Chelsea Publishing Company.
  4. vos Savant, Marilyn. 1990. Ask Marilyn. Parade Magazine, 9 September: 16.
  5. Wells, H. G. 1903. Mankind in the Making. London: Chapman & Hall.