Category Archives: Basic probability

Which deck is rigged?

Two decks of cards are sitting on a table. One deck is a standard deck of 52 cards. The other deck (called the rigged deck) also has 52 cards, but 4 of its 13 Hearts have been replaced by Diamonds. (Recall that a standard deck has 4 suits: Diamonds, Hearts, Spades, and Clubs. Normally there are 13 cards of each suit.)

  1. What is the probability that one draws 4 cards from the rigged deck and gets exactly 2 diamonds and no hearts?
  2. What is the probability that one draws 4 cards from the standard deck and gets exactly 2 diamonds and no hearts?
  3. You randomly choose one of the decks and draw 4 cards. You obtain exactly 2 diamonds and no hearts.
    1. What is the probability you drew the cards from the rigged deck?
    2. What is the probability you drew the cards from the standard deck?
    3. If you had to guess which deck was used, which would you guess: the standard or the rigged?
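A quick way to check the answers is to count directly: the rigged deck has 17 diamonds and 9 hearts, and both decks have 26 spades and clubs. The sketch below (plain Python) computes the hypergeometric probabilities for parts 1 and 2 and then applies Bayes' rule for part 3, assuming each deck is chosen with probability 1/2.

```python
from math import comb

def p_two_diamonds_no_hearts(diamonds, hearts, total=52, draws=4):
    # exactly 2 diamonds and the other 2 cards from spades/clubs (hypergeometric count)
    others = total - diamonds - hearts
    return comb(diamonds, 2) * comb(others, draws - 2) / comb(total, draws)

p_rigged = p_two_diamonds_no_hearts(17, 9)     # rigged deck: 17 diamonds, 9 hearts
p_standard = p_two_diamonds_no_hearts(13, 13)  # standard deck: 13 of each suit
# Bayes' rule with a uniform prior over the two decks (part 3)
post_rigged = p_rigged / (p_rigged + p_standard)
post_standard = 1 - post_rigged
print(p_rigged, p_standard, post_rigged)  # ≈ 0.163, 0.094, 0.636
```

Since the posterior for the rigged deck exceeds 1/2, the rigged deck is the better guess in part 3.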

Match play golf problem

This problem is motivated by the new format for the PGA match play tournament. The 64 golfers are divided into 16 pools of four players. Over the first three days, each golfer plays one 18-hole match against each of the other three in his pool. If a match is tied after 18 holes, play continues until there is a winner, so every match produces a winner. At the end of these three days, the possible win/loss records of the golfers in a pool are: (a) 3-0, 2-1, 1-2, 0-3; (b) 3-0, 1-2, 1-2, 1-2; (c) 2-1, 2-1, 1-2, 1-2; (d) 2-1, 2-1, 2-1, 0-3. Assuming each match is equally likely to be won by either player, what are the probabilities of each of these four possibilities?
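With the fair-coin assumption, a pool is just six independent coin flips, so the four record profiles are easy to estimate by simulation. This sketch (plain Python) tallies the sorted win counts of one pool:

```python
import random
from collections import Counter

def pool_records(rng):
    # six matches in a pool of four players; each match a fair coin flip
    wins = [0] * 4
    for i in range(4):
        for j in range(i + 1, 4):
            winner = i if rng.random() < 0.5 else j
            wins[winner] += 1
    return tuple(sorted(wins, reverse=True))

rng = random.Random(0)
trials = 100_000
counts = Counter(pool_records(rng) for _ in range(trials))
for record, c in sorted(counts.items(), reverse=True):
    print(record, c / trials)
```

You can check the simulation against the exact count: of the \(2^6=64\) equally likely outcomes, 24, 8, 24, and 8 give profiles (a), (b), (c), and (d) respectively.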

Handing back tests

A professor randomly hands back tests in a class of \(n\) people, paying no attention to the names on the papers. Let \(N\) denote the number of people who got their own test back. Let \(D\) denote the number of pairs of people who got each other's tests. Let \(T\) denote the number of groups of three people, none of whom got their own test back, but who among the three of them hold each other's tests. Find:

  1. \(\mathbf{E} (N)\)
  2. \(\mathbf{E} (D)\)
  3. \(\mathbf{E} (T)\)
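Handing back tests uniformly at random is a uniformly random permutation, so \(N\), \(D\), and \(T\) count its fixed points, 2-cycles, and 3-cycles. A Monte Carlo sketch (plain Python; the choices \(n=10\) and the trial count are arbitrary) is a handy sanity check for the indicator/linearity-of-expectation answers \(1\), \(1/2\), and \(1/3\):

```python
import random

def cycle_counts(perm):
    # count cycles of length 1, 2, and 3 in a permutation of {0, ..., n-1}
    seen = [False] * len(perm)
    counts = {1: 0, 2: 0, 3: 0}
    for i in range(len(perm)):
        if not seen[i]:
            length, j = 0, i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
                length += 1
            if length in counts:
                counts[length] += 1
    return counts

rng = random.Random(1)
n, trials = 10, 200_000
totals = {1: 0, 2: 0, 3: 0}
for _ in range(trials):
    perm = list(range(n))
    rng.shuffle(perm)
    for k, c in cycle_counts(perm).items():
        totals[k] += c
averages = {k: v / trials for k, v in totals.items()}
print(averages)  # ≈ {1: 1, 2: 0.5, 3: 0.333}
```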

Up by two

Suppose two teams play a series of games, each producing a winner and a loser, until one team has won two more games than the other. Let \(G\) be the number of games played until this happens. Assuming your favorite team wins each game with probability \(p\), independently of the results of all previous games, find:

  1. \(P(G=n) \) for \(n=2,3,\dots\)
  2. \(\mathbf{E}(G)\)
  3. \(\mathrm{Var}(G)\)



[Pitman p220, #18]
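A useful observation here: \(G\) is always even, and grouping the games into consecutive pairs shows that each pair ends the series with probability \(p^2+q^2\) (where \(q=1-p\)), which gives \(\mathbf{E}(G)=2/(p^2+q^2)\). The simulation sketch below (plain Python; \(p=0.6\) is an arbitrary example) checks this:

```python
import random

def games_until_up_by_two(p, rng):
    # play until one team leads the other by two games
    diff, g = 0, 0
    while abs(diff) < 2:
        diff += 1 if rng.random() < p else -1
        g += 1
    return g

rng = random.Random(2)
p, q = 0.6, 0.4
samples = [games_until_up_by_two(p, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # ≈ 2 / (p*p + q*q) ≈ 3.85
```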


A population contains \(X_n\) individuals at time \(n=0,1,2,\dots\) . Suppose that \(X_0\) is distributed as \(\mathrm{Poisson}(\mu)\). Between time \(n\) and \(n+1\), each of the \(X_n\) individuals dies with probability \(p\), independently of the others. The population at time \(n+1\) consists of the survivors together with a random number of new immigrants, who arrive independently in numbers distributed according to \(\mathrm{Poisson}(\mu)\).

  1. What is the distribution of \(X_n\) ?
  2. What happens to this distribution as \(n \rightarrow \infty\)? Your answer should depend on \(p\) and \(\mu\). In particular, what is \( \mathbf{E} X_n\) as \(n \rightarrow \infty\)?




[Pitman p236, #18]
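One way to see what is going on: if \(X_n \sim \mathrm{Poisson}(\lambda_n)\), then thinning the population and adding an independent \(\mathrm{Poisson}(\mu)\) batch of immigrants gives \(X_{n+1} \sim \mathrm{Poisson}(\lambda_n(1-p)+\mu)\), and this recursion converges to \(\mu/p\). The sketch below (plain Python; \(\mu=2\) and \(p=0.5\) are arbitrary choices) checks the limiting mean by simulation:

```python
import math, random

def poisson_sample(mu, rng):
    # Knuth's multiplication method (fine for small mu)
    limit, k, prod = math.exp(-mu), 0, rng.random()
    while prod >= limit:
        prod *= rng.random()
        k += 1
    return k

def run_chain(mu, p, steps, rng):
    x = poisson_sample(mu, rng)                              # X_0 ~ Poisson(mu)
    for _ in range(steps):
        survivors = sum(rng.random() > p for _ in range(x))  # each dies w.p. p
        x = survivors + poisson_sample(mu, rng)              # plus new immigrants
    return x

rng = random.Random(3)
mu, p = 2.0, 0.5
final = [run_chain(mu, p, 30, rng) for _ in range(20_000)]
m = sum(final) / len(final)
print(m)  # ≈ mu / p = 4
```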

Benford’s Law

Assume that the population in a city grows exponentially at rate \(r\). In other words, the number of people in the city, \(N(t)\), grows as \(N(t)=C e^{rt}\), where \(C<10^6\) is a constant.

1. Determine the time interval \(\Delta t_1\) during which \(N(t)\)  will be between 1 and 2 million people.

2. For \(k=1,\dots,9\), determine the time interval \(\Delta t_k\) during which \(N(t)\) will be between \(k\) and \(k+1\) million people.

3. Calculate the total time \(T\) it takes for \(N(t)\) to grow from 1 to 10 million people.

4. Now pick a time \(\hat t \in [0,T]\) uniformly at random, and use the above results to derive the following formula (also known as Benford’s law): $$p_k=\mathbb P\big(N(\hat t) \in [k, k+1]\ \text{million}\big)=\log_{10}(k+1)-\log_{10}(k).$$
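Since \(N(t)\) reaches \(k\) million at \(t=\ln(k\cdot 10^6/C)/r\), the interval lengths are \(\Delta t_k = \ln((k+1)/k)/r\) and \(T=\ln(10)/r\), so the ratio \(\Delta t_k/T\) is exactly \(\log_{10}(k+1)-\log_{10}(k)\), independent of \(r\) and \(C\). A quick numeric check (plain Python; the value of \(r\) is arbitrary):

```python
import math

r = 0.03  # arbitrary illustrative growth rate; p_k will not depend on it
dt = [math.log((k + 1) / k) / r for k in range(1, 10)]  # Delta t_k for k = 1..9
T = math.log(10) / r                                    # total time from 1 to 10 million
p = [d / T for d in dt]                                 # p_k = Delta t_k / T
print([round(x, 4) for x in p])
```

Note that the nine intervals tile \([0,T]\), so the \(p_k\) sum to 1, and \(p_1=\log_{10}2\approx 0.301\): leading digit 1 is by far the most likely.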

Using the Cauchy–Schwarz inequality

Recall that the Cauchy–Schwarz inequality states that for any two random variables \(X\) and \(Y\) one has
\[ \mathbf E |XY| \leq \sqrt{\mathbf E [X^2]}\,\sqrt{  \mathbf E [Y^2]}\]

  1. Use it to show that
    \[ \mathbf E |X| \leq \sqrt{\mathbf E [X^2]}\]
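A hint for this part: apply the inequality to \(X\) and the constant random variable \(Y \equiv 1\), noting that \(\mathbf E[1^2]=1\):

```latex
\mathbf E |X| = \mathbf E |X \cdot 1|
  \leq \sqrt{\mathbf E [X^2]}\;\sqrt{\mathbf E [1^2]}
  = \sqrt{\mathbf E [X^2]}
```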

Conditioning a Poisson Arrival Process

Consider a Poisson process with parameter \(\lambda\). What is the conditional probability that \(N(1) = n\) given that \(N(3) = n\)? (Here, \(N(t)\) is the number of calls which arrive between time 0 and time \(t\).) Do you understand why this probability does not depend on \(\lambda\)?


[Meester ex 7.5.4]
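A direct simulation makes the \(\lambda\)-independence visible: condition on \(N(3)=n\) by rejection sampling and estimate \(\mathbb P(N(1)=n \mid N(3)=n)\) for two different intensities. This sketch is plain Python; the values \(n=3\), \(\lambda\in\{0.5, 1.5\}\), and the trial counts are arbitrary choices.

```python
import random

def counts_on_interval(lam, rng):
    # arrival times on [0, 3] via exponential gaps; return (N(1), N(3))
    t, n1, n3 = 0.0, 0, 0
    while True:
        t += rng.expovariate(lam)
        if t > 3:
            return n1, n3
        n3 += 1
        if t <= 1:
            n1 += 1

def cond_prob(lam, n, trials, rng):
    # rejection sampling: keep only runs with N(3) = n
    hits, accepted = 0, 0
    while accepted < trials:
        n1, n3 = counts_on_interval(lam, rng)
        if n3 == n:
            accepted += 1
            hits += (n1 == n)
    return hits / trials

rng = random.Random(4)
est_small = cond_prob(0.5, 3, 10_000, rng)
est_large = cond_prob(1.5, 3, 10_000, rng)
print(est_small, est_large)  # both ≈ (1/3)**3 ≈ 0.037, whatever lambda is
```

Both estimates hover near \((1/3)^3\): given \(N(3)=n\), the arrival times behave like \(n\) i.i.d. uniform points on \([0,3]\), which is where \(\lambda\) drops out.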

Poisson Thinning

Let \(N(t)\) be a Poisson process with intensity \(\lambda\). For each occurrence, we flip a coin: if heads comes up we label the occurrence green, if tails comes up we label it red. The coin flips are independent, and heads comes up with probability \(p\).

  1. Show that the green occurrences form a Poisson process with intensity \(\lambda p\).
  2. Connect this with Example 2.2.5 from Meester.
  3. We claim that the red occurrences on the one hand, and the green occurrences on the other hand, form independent Poisson processes. Can you formulate this formally, and prove it, using Example 2.2.5 from Meester once more?

[Meester ex. 7.5.7]
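A simulation sketch for part 1 (with a peek at part 3): generate the process on a window \([0,t]\), color each point, and compare the green count to \(\lambda p t\) and the sample covariance of the green and red counts to 0. Plain Python; the values \(\lambda=2\), \(p=0.3\), \(t=5\) are arbitrary.

```python
import random

def thin_window(lam, p, t, rng):
    # Poisson(lam) points on [0, t] via exponential gaps; color each with a p-coin
    s, green, red = 0.0, 0, 0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return green, red
        if rng.random() < p:
            green += 1
        else:
            red += 1

rng = random.Random(5)
lam, p, t, trials = 2.0, 0.3, 5.0, 50_000
data = [thin_window(lam, p, t, rng) for _ in range(trials)]
mean_green = sum(g for g, _ in data) / trials
mean_red = sum(r for _, r in data) / trials
cov = sum((g - mean_green) * (r - mean_red) for g, r in data) / trials
print(mean_green, mean_red, cov)  # ≈ lam*p*t = 3, lam*(1-p)*t = 7, ≈ 0
```

The near-zero covariance is consistent with (though of course weaker than) the independence claim in part 3.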

Geometric Branching Process

Consider a branching process with a geometric offspring distribution \( P(X=k) = (1-p)p^k\), for \(k=0,1,2,\dots\) . Show that ultimate extinction is certain if \(p \leq \frac12\), and that the probability of extinction is \((1-p)/p \) if \(p > \frac12\).


[Meester ex. 6.6.5]
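A simulation makes the phase transition concrete: sample offspring by inverse transform and declare extinction if the population hits 0, or treat it as escaped if it exceeds a cap (a large supercritical population essentially never returns to 0). Plain Python; the choices \(p=0.7\), the cap, and the trial count are arbitrary.

```python
import math, random

def geom_offspring(p, rng):
    # inverse transform for P(X = k) = (1-p) p^k, k = 0, 1, 2, ...
    return int(math.log(1.0 - rng.random()) / math.log(p))

def goes_extinct(p, rng, max_gen=100, cap=1_000):
    z = 1
    for _ in range(max_gen):
        if z == 0:
            return True
        if z > cap:
            return False  # effectively escaped to infinity
        z = sum(geom_offspring(p, rng) for _ in range(z))
    return z == 0

rng = random.Random(6)
p, trials = 0.7, 4_000
q = sum(goes_extinct(p, rng) for _ in range(trials)) / trials
print(q)  # ≈ (1-p)/p = 3/7 ≈ 0.43 for p = 0.7
```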