Category Archives: Basic probability
Classroom Surveys
A researcher is collecting data from 10 high school classrooms. Each classroom contains 30 students. The researcher asks each student to fill out a survey. Suppose each student has about a 40% chance of completing the survey (independently of the other students). What is the probability that at least 4 classrooms have at least 15 students who complete the survey?
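Under the natural reading that the 30 completions in a classroom are independent Bernoulli(0.4) trials and that the 10 classrooms are independent of one another, the answer is a binomial probability nested inside another binomial. A minimal Python sketch of that computation (a numerical sanity check, not the requested derivation):

```python
from math import comb

# P(a single classroom has at least 15 of 30 students complete the survey)
q = sum(comb(30, j) * 0.4**j * 0.6**(30 - j) for j in range(15, 31))

# P(at least 4 of the 10 classrooms reach that threshold)
answer = sum(comb(10, k) * q**k * (1 - q)**(10 - k) for k in range(4, 11))

print(q, answer)
```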
Coin flipping game
Your friend challenges you to a game in which you flip a fair coin until you get heads. If you flip an even number of times, you win. Let \(A\) be the event that you win. Let \(B\) be the event that you flip the coin 3 or more times. Let \(C\) be the event that you flip the coin 4 or more times.
- Compute \(\mathbb{P}(A)\).
- Are \(A\) and \(B\) independent?
- Are \(A\) and \(C\) independent?
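A quick Monte Carlo sketch can be used to check the value of \(\mathbb{P}(A)\) and the two independence claims numerically (estimates only; the exercise still asks for exact arguments):

```python
import random

def flips_until_heads():
    """Number of fair-coin flips up to and including the first heads."""
    n = 1
    while random.random() < 0.5:   # flip came up tails, keep going
        n += 1
    return n

trials = 500_000
counts = [flips_until_heads() for _ in range(trials)]

p_A  = sum(n % 2 == 0 for n in counts) / trials    # win on an even flip count
p_B  = sum(n >= 3 for n in counts) / trials
p_C  = sum(n >= 4 for n in counts) / trials
p_AB = sum(n % 2 == 0 and n >= 3 for n in counts) / trials
p_AC = sum(n % 2 == 0 and n >= 4 for n in counts) / trials

print(p_A)                      # estimate of P(A)
print(p_AB, p_A * p_B)          # close if A and B are independent
print(p_AC, p_A * p_C)          # close if A and C are independent
```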
Repeated Quiz Questions
Each week you get multiple attempts to take a two-question quiz. For each attempt, two questions are pulled at random from a bank of 100 questions. For a single attempt, the two questions are distinct.
- If you attempt the quiz 5 times, what is the probability that within those 5 attempts, you’ve seen at least one question two or more times?
- How many times do you need to attempt the quiz to have a greater than 50% chance of seeing at least one question two or more times?
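Here is a rough simulation sketch for estimating both parts, assuming each attempt draws its two distinct questions uniformly from the bank, independently of the other attempts:

```python
import random

def repeat_seen(attempts, bank=100, per_quiz=2, trials=100_000):
    """Estimate P(some question appears at least twice across the attempts)."""
    hits = 0
    for _ in range(trials):
        seen = []
        for _ in range(attempts):
            seen.extend(random.sample(range(bank), per_quiz))
        if len(set(seen)) < len(seen):
            hits += 1
    return hits / trials

print(repeat_seen(5))                      # part (a)
for m in range(2, 15):                     # part (b): smallest m with estimate > 0.5
    print(m, repeat_seen(m, trials=20_000))
```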
Dice Rolling Events
Consider rolling a fair 6-sided die twice. Let \(A\) be the event that the first roll is less than or equal to 3. Let \(B\) be the event that the second roll is less than or equal to 3. Find an event \(C\) in the same outcome space as \(A\) and \(B\) with \(0<\mathbb{P}(C)<1\) and such that \(A\), \(B\) and \(C\) are mutually independent, or show that no such event exists.
Cognitive Dissonance Among Monkeys
Assume that each monkey has a strong preference among red, green, and blue M&M’s. Further, assume that the possible orderings of the preferences are equally distributed in the population. That is to say, each of the 6 possible orderings (R>G>B, R>B>G, B>R>G, B>G>R, G>B>R, G>R>B) is found with equal frequency in the population. Lastly, assume that when presented with two M&M’s of different colors, a monkey always eats the M&M with the color it prefers.
In an experiment, a random monkey is chosen from the population and presented with a Red and a Green M&M. In the first round, the monkey eats the one based on their personal preference between the colors. The remaining M&M is left on the table and a Blue M&M is added so that there are again two M&M’s on the table. In the second round, the monkey again chooses to eat one of the M&M’s based on their color preference.
- What is the chance that the red M&M is not eaten in the first round?
- What is the chance that the green M&M is not eaten in the first round?
- What is the chance that the Blue M&M is not eaten in the second round?
[Mattingly 2022]
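A short simulation, encoding the six equally likely preference orderings directly, gives a numerical check of the three probabilities asked for above:

```python
import random
from itertools import permutations

orderings = list(permutations("RGB"))   # rank[0] is the most preferred colour
trials = 100_000
red_survives_1 = green_survives_1 = blue_survives_2 = 0

for _ in range(trials):
    rank = random.choice(orderings)
    prefer = {c: rank.index(c) for c in "RGB"}   # smaller index = more preferred

    # Round 1: red vs green; the preferred colour is eaten.
    eaten1 = "R" if prefer["R"] < prefer["G"] else "G"
    left = "G" if eaten1 == "R" else "R"
    red_survives_1 += (eaten1 != "R")
    green_survives_1 += (eaten1 != "G")

    # Round 2: the leftover colour vs blue.
    eaten2 = left if prefer[left] < prefer["B"] else "B"
    blue_survives_2 += (eaten2 != "B")

print(red_survives_1 / trials, green_survives_1 / trials, blue_survives_2 / trials)
```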
Which deck is rigged?
Two decks of cards are sitting on a table. One deck is a standard deck of 52 cards. The other deck (called the rigged deck) also has 52 cards but has had 4 of the 13 Hearts replaced by Diamonds. (Recall that a standard deck has 4 suits: Diamonds, Hearts, Spades, and Clubs. Normally there are 13 cards of each suit.)
- What is the probability one chooses 4 cards from the rigged deck and gets exactly 2 diamonds and no hearts?
- What is the probability one chooses 4 cards from the standard deck and gets exactly 2 diamonds and no hearts?
- You randomly choose one of the decks and draw 4 cards. You obtain exactly 2 diamonds and no hearts.
- What is the probability you chose the cards from the rigged deck?
- What is the probability you chose the cards from the standard deck?
- If you had to guess which deck was used, which would you guess: the standard or the rigged deck?
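Both per-deck probabilities are hypergeometric-style counts, and the last three parts follow from Bayes’ rule with the two decks taken as equally likely a priori (that prior is my reading of "randomly choose one of the decks"). A minimal Python check:

```python
from math import comb

def p_two_diamonds_no_hearts(diamonds, hearts):
    """P(exactly 2 diamonds and 0 hearts in 4 cards drawn without replacement)."""
    others = 52 - diamonds - hearts
    return comb(diamonds, 2) * comb(others, 2) / comb(52, 4)

p_rigged = p_two_diamonds_no_hearts(17, 9)     # 4 hearts replaced by diamonds
p_standard = p_two_diamonds_no_hearts(13, 13)

# Bayes' rule with a uniform prior over the two decks
posterior_rigged = p_rigged / (p_rigged + p_standard)
print(p_rigged, p_standard, posterior_rigged)
```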
Match play golf problem
This problem is motivated by the new format for the PGA match play tournament. The 64 golfers are divided into 16 pools of four players. Over the first three days, each golfer plays one 18-hole match against each of the other three in his pool. If a match is tied after 18 holes, play continues until there is a winner. At the end of these three days the possible records of the golfers in a pool could be: (a) 3-0, 2-1, 1-2, 0-3; (b) 3-0, 1-2, 1-2, 1-2; (c) 2-1, 2-1, 1-2, 1-2; (d) 2-1, 2-1, 2-1, 0-3. What are the probabilities of each of these four possibilities?
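The sketch below tallies the four record patterns by simulation, under the implicit assumption that every match is won by either player with probability \(\frac12\), independently of the other matches:

```python
import random
from collections import Counter
from itertools import combinations

trials = 200_000
patterns = Counter()

for _ in range(trials):
    wins = [0, 0, 0, 0]
    for i, j in combinations(range(4), 2):   # each pair in the pool plays once
        winner = i if random.random() < 0.5 else j
        wins[winner] += 1
    patterns[tuple(sorted(wins, reverse=True))] += 1

for pattern, count in sorted(patterns.items(), reverse=True):
    print(pattern, count / trials)
```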
Handing back tests
A professor randomly hands back tests in a class of \(n\) people, paying no attention to the names on the papers. Let \(N\) denote the number of people who got their own test back. Let \(D\) denote the number of pairs of people who got each other’s tests. Let \(T\) denote the number of groups of three people, none of whom got their own test, but who among the three of them have each other’s tests. Find:
- \(\mathbf{E} (N)\)
- \(\mathbf{E} (D)\)
- \(\mathbf{E} (T)\)
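Since the tests are returned according to a uniformly random permutation, \(N\), \(D\) and \(T\) count the 1-, 2- and 3-cycles of that permutation. A simulation sketch for checking the three expectations (the class size \(n=20\) below is an arbitrary choice):

```python
import random

def cycle_counts(n):
    """Return (# fixed points, # 2-cycles, # 3-cycles) of a uniform random permutation."""
    perm = list(range(n))
    random.shuffle(perm)
    seen = [False] * n
    counts = {1: 0, 2: 0, 3: 0}
    for start in range(n):
        if seen[start]:
            continue
        length, k = 0, start
        while not seen[k]:
            seen[k] = True
            k = perm[k]
            length += 1
        if length in counts:
            counts[length] += 1
    return counts[1], counts[2], counts[3]

n, trials = 20, 100_000
totals = [0, 0, 0]
for _ in range(trials):
    for i, c in enumerate(cycle_counts(n)):
        totals[i] += c
print([t / trials for t in totals])   # Monte Carlo estimates of E(N), E(D), E(T)
```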
Up by two
Suppose two teams play a series of games, each producing a winner and a loser, until one team has won two more games than the other. Let \(G\) be the number of games played until this happens. Assuming your favorite team wins each game with probability \(p\), independently of the results of all previous games, find:
- \(P(G=n) \) for \(n=2,3,\dots\)
- \(\mathbf{E}(G)\)
- \(\mathrm{Var}(G)\)
[Pitman p220, #18]
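A simulation sketch for checking answers at a particular value of \(p\) (the value \(p=0.6\) below is arbitrary):

```python
import random
from statistics import mean, variance

def games_until_up_by_two(p):
    """Play until one team leads by two games; return the number of games played."""
    diff, games = 0, 0
    while abs(diff) < 2:
        diff += 1 if random.random() < p else -1
        games += 1
    return games

p, trials = 0.6, 100_000
samples = [games_until_up_by_two(p) for _ in range(trials)]
print(mean(samples), variance(samples))
print({n: samples.count(n) / trials for n in (2, 4, 6, 8)})   # G is always even
```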
Population
A population contains \(X_n\) individuals at time \(n=0,1,2,\dots\) . Suppose that \(X_0\) is distributed as \(\mathrm{Poisson}(\mu)\). Between time \(n\) and \(n+1\) each of the \(X_n\) individuals dies with probability \(p\) independent of the others. The population at time \(n+1\) is comprised of the survivors together with a random number of new immigrants who arrive independently in numbers distributed according to \(\mathrm{Poisson}(\mu)\).
- What is the distribution of \(X_n\) ?
- What happens to this distribution as \(n \rightarrow \infty\)? Your answer should depend on \(p\) and \(\mu\). In particular, what is \( \mathbf{E} X_n\) as \(n \rightarrow \infty\)?
[Pitman p236, #18]
Benford’s Law
Assume that the population in a city grows exponentially at rate \(r\). In other words, the number of people in the city, \(N(t)\), grows as \(N(t)=C e^{rt}\), where \(C<10^6\) is a constant.
1. Determine the time interval \(\Delta t_1\) during which \(N(t)\) will be between 1 and 2 million people.
2. For \(k=1,\dots,9\), determine the time interval \(\Delta t_k\) during which \(N(t)\) will be between \(k\) and \(k+1\) million people.
3. Calculate the total time \(T\) it takes for \(N(t)\) to grow from 1 to 10 million people.
4. Now pick a time \(\hat t \in [0,T]\) uniformly at random, and use the above results to derive the following formula (also known as Benford’s law) $$p_k=\mathbb P\big(N(\hat t) \in [k, k+1]\ \text{million}\big)=\log_{10}(k+1)-\log_{10}(k).$$
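The claimed formula can be checked numerically by simulating \(\hat t\); the sketch below takes an arbitrary growth rate \(r\) and measures time from the moment the population reaches 1 million:

```python
import random
from math import exp, log, log10

r = 0.03                     # growth rate (arbitrary choice for the check)
T = log(10) / r              # time to grow from 1 million to 10 million people
trials = 200_000
counts = [0] * 11

for _ in range(trials):
    t = random.uniform(0, T)
    n_millions = exp(r * t)          # population in millions, starting from 1 million
    counts[int(n_millions)] += 1

for k in range(1, 10):
    print(k, counts[k] / trials, log10(k + 1) - log10(k))
```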
Using the Cauchy–Schwarz inequality
Recall that the Cauchy–Schwarz inequality states that for any two random variables \(X\) and \(Y\) one has
\[ \mathbf E |XY| \leq \sqrt{\mathbf E [X^2]}\,\sqrt{ \mathbf E [Y^2]}\]
- Use it to show that
\[ \mathbf E |X| \leq \sqrt{\mathbf E [X^2]}\]
Conditioning a Poisson Arrival Process
Consider a Poisson process with parameter \(\lambda\). What is the conditional probability that \(N(1) = n\) given that \(N(3) = n\)? (Here, \(N(t) \) is the number of calls which arrive between time 0 and time \(t\). ) Do you understand why this probability does not depend on \(\lambda\)?
[Meester ex 7.5.4]
Poisson Thinning
Let \(N(t)\) be a Poisson process with intensity \(\lambda\). For each occurrence, we flip a coin: if heads comes up we label the occurrence green, if tails comes up we label it red. The coin flips are independent and \(p\) is the probability of heads.
- Show that the green occurrences form a Poisson process with intensity \(\lambda p\).
- Connect this with Example 2.2.5 from Meester.
- We claim that the red occurrences on the one hand, and the green occurrences on the other hand, form independent Poisson processes. Can you formulate this formally, and prove it, using Example 2.2.5 from Meester once more?
[Meester ex. 7.5.7]
Geometric Branching Process
Consider a branching process with a geometric offspring distribution \( P(X=k) = (1-p)p^k\), for \(k=0,1,2,\dots\) . Show that the ultimate extinction is certain if \(p \leq \frac12\) and that the probability of extinction is \((1-p)/p \) if \(p > \frac12\).
[Meester ex. 6.6.5]
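The extinction probabilities can be checked by direct simulation; the sketch below uses a finite generation horizon and a population cap, treating any process that outlives them as surviving:

```python
import random

def offspring(p):
    """Sample from P(X = k) = (1 - p) p^k, k = 0, 1, 2, ..."""
    k = 0
    while random.random() < p:
        k += 1
    return k

def goes_extinct(p, max_generations=200, cap=1_000):
    """Run one branching process; survival past the horizon or cap counts as non-extinction."""
    population = 1
    for _ in range(max_generations):
        if population == 0:
            return True
        if population > cap:          # essentially certain to survive
            return False
        population = sum(offspring(p) for _ in range(population))
    return population == 0

for p in (0.4, 0.5, 0.7):
    trials = 2_000
    est = sum(goes_extinct(p) for _ in range(trials)) / trials
    print(p, est)
```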
Joint, Marginal and Conditioning
Let \( (X,Y)\) have joint density \(f(x,y) = e^{-y}\), for \(0<x<y\), and \(f(x,y)=0\) elsewhere.
- Are \(X\) and \(Y\) independent?
- Compute the marginal density of \(Y\).
- Show that \(f_{X|Y}(x\,|\,y)=\frac1y \), for \(0<x<y\).
- Compute \(E(X|Y=y)\)
- Use the previous result to find \(E(X)\).
Three Random Variables
Let \(X\), \(Y\), and \(Z\) be independent uniform \( (0,1)\).
- Find the joint density of \(XY\) and \(Z^2\).
- Show that \(P(XY < Z^2) = \frac59\).
[Meester ex. 5.12.25]
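The value \(\frac59\) in the second part is easy to check by Monte Carlo:

```python
import random

# Estimate P(XY < Z^2) for independent uniform(0,1) variables X, Y, Z
trials = 1_000_000
hits = sum(random.random() * random.random() < random.random() ** 2
           for _ in range(trials))
print(hits / trials, 5 / 9)
```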
A joint density example I
Let \( (X,Y) \) have joint density \(f(x,y)=x e^{-x-y}\) when \(x,y>0\) and \(f(x,y)=0\) elsewhere. Are \(X\) and \(Y\) independent?
[Meester ex 5.12.30]
Memory and the Exponential
Let \(X\) have an exponential distribution with parameter \(\lambda\). Show that
\[ P( X> t+ s \,|\, X>s) = P(X>t) \]
for all \(s,t >0\). Explain why one might call this property of the exponential “the lack of memory”.
Hitting Zero
Consider a simple symmetric random walk \((S_k)\) started at zero and making \(2n\) steps, and let \(T\) be the first return to its starting point, that is
\[T = \min\{1 \leq k \leq 2n : S_k = 0\},\]
and \(T = 0\) if the walk does not return to zero in the first \(2n\) steps.
Show that for all \(1 \leq k \leq n\) we have
\[P(T = 2k) =\frac{1}{2k - 1} \binom{2k}{k}\, 2^{-2k}.\]
[Meester ex 3.3.2]
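The formula can be compared against an empirical first-return distribution; a short sketch (with the arbitrary choice \(n=10\)):

```python
import random
from math import comb

n, trials = 10, 200_000
counts = [0] * (n + 1)           # counts[k] estimates P(T = 2k)

for _ in range(trials):
    position = 0
    for step in range(1, 2 * n + 1):
        position += 1 if random.random() < 0.5 else -1
        if position == 0:
            counts[step // 2] += 1
            break

for k in range(1, n + 1):
    exact = comb(2 * k, k) * 2 ** (-2 * k) / (2 * k - 1)
    print(2 * k, counts[k] / trials, exact)
```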
Geometric probability
In each case, consider a point picked uniformly at random from the interior of the region. Find the probability density function for the \(x\)-coordinate.
- The square with corners: \( (-2,0), (0,2), (2,0), (0,-2) \)
- The triangle with corners: \( (-2,0), (1,0), (0,2) \)
- The polygon with corners: \( (0,2),(2,1), (1,-1), (-1,0)\)
[Pitman p277, #12]
Expected value of Random Walk
Consider a simple symmetric random walk \(S_i\), which begins at zero and makes \(2n\) steps.
Let \(k < n\). Show that
- \(\mathbf{E}\Big( |S_{2k}| \ \Big|\ |S_{2k-1}| = r\Big) = r\);
- \( \mathbf{E}\Big(|S_{2k+1}| \ \Big|\ |S_{2k}| = r\Big) = 1 \text{ if } r = 0, \text{ and } \mathbf{E}\Big(|S_{2k+1}| \ \Big|\ |S_{2k}| = r\Big) = r\text{ otherwise}.\)
A p.m.f. and expectation example
Let \(X\) be a random variable with probability mass function
\[p(n) = \frac{c}{n!}\quad \text{for } n=0,1,2,\dots\]
and \(p(n)=0\) otherwise.
- Find \(c\). Hint: use the Taylor series expansion of \(e^x\).
- Compute the probability that \(X\) is even.
- Compute the expected value of \(X\).
[Meester ex 2.7.14]
Prime Dice
Suppose that we have a very special die which has exactly \(k\) faces where \(k\) is prime. The faces are numbered \(1,\dots,k\). We throw the die once and see which number comes up.
- What would be an appropriate outcome space and probability measure for this random experiment?
- Suppose that the events \(A\) and \(B\) are independent. Show that \(\mathbf{P}(A)\) or \(\mathbf{P}(B)\) is always either 0 or 1. In other words, \(A\) or \(B\) is always either the full outcome space or the empty set.
[from Meester, ex 1.7.32]
The chance a coin is fair
Suppose that I have two coins in my pocket. One ordinary, fair coin and one coin which has heads on both sides. I pick a random coin out of my pocket, throw it, and it comes up heads.
- What is the probability that I have thrown the fair coin?
- If I throw the same coin again, and heads comes up again, what is the probability that I have thrown the fair coin?
- Suppose that instead of throwing the same coin again, I reach into my pocket and throw the second coin. If it also comes up heads, what is the chance that the first coin I threw was the fair coin?
[Modified version of Meester, ex 1.7.35]
Independence of two hearts?
Consider a deck of 52 cards. Let \(A\) be the event that the first card is a heart. Let \(B\) be the event that the 51st card is a heart.
What is \(\mathbf{P}(A)\)? What is \(\mathbf{P}(B)\)? Are \(A\) and \(B\) independent?
Conditionally Equally Likely
Let \(A\) and \(B\) be two events with positive probability. When does \(\mathbf{P}(A|B)=\mathbf{P}(B|A)\)?
Making a biased coin fair
Jack and Jill want to use a coin to decide who gets the remaining piece of cake. However, since the coin is Jack’s, Jill is suspicious that the coin is a trick coin which produces heads with a probability \(p\) which is not \(\frac12\). Can you devise a way to use this coin to come to a fair decision as to who gets the cake?
Conditional densities
Let \(X\) and \(Y\) have the following joint density:
\[ f(x,y)=\begin{cases}2x+2y -4xy & \text{for } 0 \leq x\leq 1 \ \text{and}\ 0 \leq y \leq 1\\ 0& \text{otherwise}\end{cases}\]
- Find the marginal densities of \(X\) and \(Y\)
- Find \(f_{Y|X}( y \,|\, X=\frac14)\)
- Find \( \mathbf{E}(Y \,|\, X=\frac14)\)
[Pitman p426, #2]
Expected max/min given min/max
Let \(X_1\) and \(X_2\) be the numbers on two independent fair-die rolls. Let \(M\) be the maximum and \(N\) the minimum of \(X_1\) and \(X_2\). Calculate:
- \(\mathbf{E}( M| N=x) \)
- \(\mathbf{E}( N| M=x) \)
- \(\mathbf{E}( M| N) \)
- \(\mathbf{E}( N| M) \)
Difference between max and min
Let \(U_1,U_2,U_3,U_4,U_5\) be independent, each with uniform distribution on \((0,1)\). Let \(R\) be the distance between the max and the min of the \(U_i\)’s. Find
- \(\mathbf{E} R\)
- the joint density of the max and the min of the \(U_i\)’s.
- \(\mathbf{P}(R > 0.5)\)
[Pitman p355, #14]
A Joint density example II
If \(X\) and \(Y\) have joint density function
\[f(x,y)=\frac{1}{x^2y^2} \quad; \quad x \geq 1, y\geq 1\]
- Compute the joint density function of \(U=XY\), \(V=X/Y\).
- What are the marginal densities of \(U\) and \(V\)?
[Ross p295, #54]
Closest Point
Consider a Poisson random scatter of points in a plane with mean intensity \(\lambda\) per unit area. Let \(R\) be the distance from zero to the closest point of the scatter.
- Find a formula for the c.d.f. and the density of \(R\) and sketch their graphs.
- Show that \(\sqrt{2 \lambda \pi} R\) has the Rayleigh distribution.
- Find the mean and mode of \(R\).
[Pitman p389, #21]
Joint Density of Arrival Times
Let \(T_1 < T_2<\cdots\) be the arrival times in a Poisson arrival process with rate \(\lambda\). What is the joint distribution of \((T_1,T_2,T_5)\)?
Point of increase
Suppose \(U_1,U_2, \dots\) are independent uniform \( (0,1) \) random variables. Let \(N\) be the first point of increase, that is to say, the first \(n \geq 2\) such that \(U_n > U_{n-1}\). Show that for \(u \in (0,1)\):
- \[\mathbf{P}(U_1 \leq u \ \text{ and } \ N=n)= \frac{u^{n-1}}{(n-1)!}-\frac{u^{n}}{n!} \quad;\quad n \geq 2\]
- \( \mathbf{E}(N)=e \)
Some useful observations:
- \[\mathbf{P}(U_1 \leq u \ \text{ and } \ N=n) = \mathbf{P}(U_1 \leq u \ \text{ and } \ N \geq n) -\mathbf{P}(U_1 \leq u \ \text{ and } \ N \geq n+1)\]
- The following events are equal
\[ \{U_1 \leq u \ \text{ and } \ N \geq n\} = \{U_{n-1}\leq U_{n-2} \leq \cdots \leq U_2\leq U_{1} \leq u \}\]
- \[ \mathbf{P}\{U_2 \leq U_1 \leq u \}= \int_0^u \int_0^{u_1} du_2\, du_1 \]
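The second claim, \(\mathbf{E}(N)=e\), is easy to test numerically:

```python
import random

def first_point_of_increase():
    """Return the first n >= 2 with U_n > U_{n-1} for i.i.d. uniform(0,1) draws."""
    previous, n = random.random(), 1
    while True:
        n += 1
        current = random.random()
        if current > previous:
            return n
        previous = current

trials = 500_000
print(sum(first_point_of_increase() for _ in range(trials)) / trials)  # should be close to e
```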