Category Archives: Basic probability
High Card Wins
Bob and Alice each have a box containing 3 numbered cards. Bob’s box has cards numbered 2, 5, and 9. Alice’s box has cards numbered 3, 4, and 8, so the average card value is nearly the same in each box. If each draws a card uniformly at random from their own box, find the probability that Alice wins (draws the higher card). What is the probability that Bob wins?
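Since each player draws uniformly, all nine card pairings are equally likely, so the answer can be checked by exact enumeration. A minimal sketch (card values taken from the problem statement):

```python
from itertools import product

bob = [2, 5, 9]     # Bob's cards
alice = [3, 4, 8]   # Alice's cards

# Enumerate all 9 equally likely (Bob, Alice) pairings.
alice_wins = sum(a > b for b, a in product(bob, alice))
bob_wins = sum(b > a for b, a in product(bob, alice))

p_alice = alice_wins / 9   # 4/9
p_bob = bob_wins / 9       # 5/9
print(p_alice, p_bob)
```

There are no ties since all six card values are distinct, so the two probabilities sum to 1.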
Algebras and Conditioning
Consider a deck with three cards numbered 1, 2, and 3, where the 1 and 2 cards are colored red and the 3 card is colored black. Two of the cards are drawn without replacement. Let \(D_1\) be the first card drawn and \(D_2\) the second card drawn. Let \(T\) be the sum of the two cards drawn and let \(N\) be the number of red cards drawn.
- Write down the algebra of all possible events on this probability space.
- What is the algebra of events generated by \(T\), which we will denote \(\mathcal{A}(T)\)?
- What is the algebra of events generated by \(N\), which we will denote \(\mathcal{A}(N)\)?
- Is \(T\) adapted to \(\mathcal{A}(N)\)? Explain in terms of the above algebras.
- Is \(N\) adapted to \(\mathcal{A}(T)\)? Explain in terms of the above algebras.
- What is \[ \mathbf{E} [ N \,|\, \mathcal{A}(T)] ? \]
- What is \[ \mathbf{E} [ T \,|\, \mathcal{A}(N)] ? \]
Binomial with a random parameter
Let \(X\) be binomial with parameters \(n\), which is constant, and \(\Theta\), which is distributed uniformly on \( (0,1)\).
- Find \(\mathbf{E}(s^X | \Theta)\) for any \(s\).
- Show that for any \(s\)
\[ \mathbf{E} ( s^X) = \frac{1}{n+1} \big(\frac{1-s^{n+1}}{1-s} \big)\]
Use this to conclude that \(X\) is distributed uniformly on the set \(\{0,1,2, \cdots, n\}\).
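The uniform marginal can also be verified directly from the Beta integral \(\int_0^1 \theta^k (1-\theta)^{n-k}\,d\theta = \frac{k!\,(n-k)!}{(n+1)!}\), which makes \(\mathbf{P}(X=k) = \binom{n}{k}\frac{k!\,(n-k)!}{(n+1)!} = \frac{1}{n+1}\). A quick exact check (the choice \(n=7\) is arbitrary):

```python
from math import comb, factorial
from fractions import Fraction

n = 7  # illustrative choice of the binomial parameter

def p_x(k, n):
    # P(X=k) = C(n,k) * ∫_0^1 θ^k (1-θ)^{n-k} dθ
    #        = C(n,k) * k!(n-k)!/(n+1)!   (Beta integral)
    return Fraction(comb(n, k) * factorial(k) * factorial(n - k),
                    factorial(n + 1))

probs = [p_x(k, n) for k in range(n + 1)]
print(probs)  # each entry equals 1/(n+1)
```

Using `Fraction` keeps the arithmetic exact, so every probability comes out as precisely \(1/(n+1)\).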
Linear regression
Consider the following model:
\(X_1,…,X_n \stackrel{iid}{\sim} f(x), \quad Y_i = \theta X_i + \varepsilon_i, \quad \varepsilon_i \stackrel{iid}{\sim} \mbox{N}(0,\sigma^2).\)
- Compute \({\mathbf E }(Y \mid X)\)
- Compute \({\mathbf E }(\varepsilon \mid X)\)
- Compute \({\mathbf E }( \varepsilon)\)
- Show \( \theta = \frac{{\mathbf E}(XY)}{{\mathbf E}(X^2)}\)
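The identity \(\theta = \mathbf{E}(XY)/\mathbf{E}(X^2)\) in the last part can be checked by simulation. A sketch under assumed parameter values (\(\theta = 2.5\), \(\sigma = 1\), and \(f\) taken to be standard normal, none of which are specified in the problem):

```python
import random

random.seed(0)
theta, sigma = 2.5, 1.0   # hypothetical parameter values
N = 200_000

sxy = sxx = 0.0
for _ in range(N):
    x = random.gauss(0, 1)               # assume f(x) is standard normal
    y = theta * x + random.gauss(0, sigma)
    sxy += x * y
    sxx += x * x

theta_hat = sxy / sxx   # empirical E(XY)/E(X^2)
print(theta_hat)        # close to theta
```

The ratio of sample moments converges to \(\mathbf{E}(XY)/\mathbf{E}(X^2) = \theta\) by the law of large numbers.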
Clinical trial
Let \(X\) be the number of patients in a clinical trial with a successful outcome. Let \(P\) be the probability of success for an individual patient. We assume before the trial begins that \(P\) is uniform on \([0,1]\). Compute
- \(f(P \mid X)\)
- \( {\mathbf E}( P \mid X)\)
- \( \mathbf{Var}( P \mid X)\)
Order statistics II
Suppose \(X_1, \dots , X_{17}\) are iid uniform on \( (0.5, 0.8) \). What is \({\mathbf{E}} [X_{(k)}] \)?
Order statistics I
Suppose \(X_1, \dots , X_n \stackrel{iid}{\sim} U(0,1) \). How large must \(n\) be so that \({\mathbf{P}}(X_{(n)} \geq 0.95) \geq 1/2\)?
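Since \(\mathbf{P}(X_{(n)} \geq 0.95) = 1 - 0.95^n\), the condition reduces to \(0.95^n \leq 1/2\), and the smallest such \(n\) can be computed directly:

```python
from math import log, ceil

# P(X_(n) >= 0.95) = 1 - 0.95**n, so we need 0.95**n <= 1/2.
n = ceil(log(0.5) / log(0.95))
assert 1 - 0.95**n >= 0.5 and 1 - 0.95**(n - 1) < 0.5
print(n)  # 14
```

So fourteen observations suffice, while thirteen do not.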
Beta-binomial
You observe a sequence of coin flips \(X_1,\dots,X_n\) drawn iid from a Bernoulli distribution with unknown parameter \(p\); \(n\) is known and fixed. Assume a priori that the parameter \(p\) follows a Beta distribution with parameters \(\alpha,\beta\).
- Given the sequence \(X_1,\dots,X_n\), what is the posterior pdf of \(p\)?
- For what value of \(p\) is the maximum of the posterior pdf attained?
Hint: If \(X\) is distributed Bernoulli(p) then for \(x=1,0\) one has \(P(X=x)=p^x(1-p)^{(1-x)}\). Furthermore, if \(X_1,X_2\) are i.i.d. Bernoulli(p) then
\[P(X_1=x_1, X_2=x_2 )=P(X_1=x_1)P(X_2=x_2 )=p^{x_1}(1-p)^{(1-x_1)}p^{x_2}(1-p)^{(1-x_2)}\]
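The posterior works out to \(\text{Beta}(\alpha + s,\ \beta + n - s)\) with \(s = \sum x_i\), whose mode (when both parameters exceed 1) is \((\alpha + s - 1)/(\alpha + \beta + n - 2)\). A minimal sketch of that formula, with hypothetical data and prior:

```python
def beta_binomial_map(xs, alpha, beta):
    """MAP estimate of p under a Beta(alpha, beta) prior and Bernoulli data xs.

    The posterior is Beta(alpha + s, beta + n - s) with s = sum(xs);
    its mode is (alpha + s - 1) / (alpha + beta + n - 2)
    (valid when both posterior parameters exceed 1).
    """
    n, s = len(xs), sum(xs)
    return (alpha + s - 1) / (alpha + beta + n - 2)

# Hypothetical data: 7 heads in 10 flips, with a Beta(2, 2) prior.
print(beta_binomial_map([1] * 7 + [0] * 3, 2, 2))  # (2+7-1)/(2+2+10-2) = 8/12
```

Note that with the flat prior \(\alpha = \beta = 1\) the MAP reduces to the usual MLE \(s/n\).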
Conditioning and Polya’s urn
An urn contains 1 black and 2 white balls. One ball is drawn at random and its color noted. The ball is replaced in the urn, together with an additional ball of its color. There are now four balls in the urn. Again, one ball is drawn at random from the urn, then replaced along with an additional ball of its color. The process continues in this way.
- Let \(B_n\) be the number of black balls in the urn just before the \(n\)th ball is drawn. (Thus \(B_1= 1\).) For \(n \geq 1\), find \(\mathbf{E} (B_{n+1} | B_{n}) \).
- For \(n \geq 1\), find \(\mathbf{E} (B_{n}) \). [Hint: Use induction based on the previous answer and the fact that \(\mathbf{E}(B_1) =1\)]
- For \(n \geq 1\), what is the expected proportion of black balls in the urn just before the \(n\)th ball is drawn?
[Pitman p. 408, #6]
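The recursion gives \(\mathbf{E}(B_{n+1} \mid B_n) = B_n \frac{n+3}{n+2}\) and hence \(\mathbf{E}(B_n) = (n+2)/3\), so the expected proportion of black balls stays at \(1/3\). A Monte Carlo sketch of the \(n = 5\) case:

```python
import random

random.seed(1)

def polya_black(n_draws):
    """Black-ball count after n_draws Polya draws, starting from 1 black / 2 white."""
    black, total = 1, 3
    for _ in range(n_draws):
        if random.random() < black / total:
            black += 1   # drew black: return it plus one more black
        total += 1       # urn gains one ball either way
    return black

# Just before the 5th draw, 4 draws have occurred; theory: E(B_5) = (5+2)/3 = 7/3.
trials = 100_000
est = sum(polya_black(4) for _ in range(trials)) / trials
print(est)
```

The estimate lands near \(7/3 \approx 2.333\), matching the induction in the hint.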
Car tires
The air pressure in the left and right front tires of a car are random variables \(X\) and \(Y\), respectively. Tires should be filled to 26psi. The joint pdf is
\( f(x,y) = K(x^2+y^2), \quad 20 \leq x,y \leq 30 \)
- What is \(K\)?
- Are the random variables independent?
- What is the probability that both tires are underfilled?
- What is the probability that \( |X-Y| \leq 3 \)?
- What are the marginal densities?
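The normalization and the underfilled probability both reduce to \(\int_a^b x^2\,dx = (b^3 - a^3)/3\) by symmetry in \(x\) and \(y\). An exact-arithmetic sketch of those two parts:

```python
from fractions import Fraction

def int_x2(a, b):
    # ∫_a^b x^2 dx = (b^3 - a^3) / 3, kept exact with Fraction
    return Fraction(b**3 - a**3, 3)

# ∫∫_{[20,30]^2} (x^2 + y^2) dx dy = 2 * (30-20) * ∫_20^30 x^2 dx
total = 2 * 10 * int_x2(20, 30)   # = 380000/3
K = 1 / total                     # = 3/380000

# P(both underfilled) = K * ∫_20^26 ∫_20^26 (x^2 + y^2) dx dy
under = K * 2 * 6 * int_x2(20, 26)
print(K, float(under))            # 3/380000 and about 0.302
```

The same factor-and-symmetry trick handles the marginal densities.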
Joint of min and max
Let \(X_1,\dots,X_n \stackrel{iid}{\sim} \mbox{Exp}(\lambda) \).
Let \(V = \min(X_1,\dots,X_n)\) and \(W = \max(X_1,\dots,X_n)\).
What is the joint distribution of \(V\) and \(W\)? Are they independent?
Joint density part 1
Let \(X\) and \(Y\) have joint density
\(f(x,y) = 90(y-x)^8, \quad 0<x<y<1\)
- State the marginal distribution for \(X\)
- State the marginal distribution for \(Y\)
- Are these two random variables independent?
- What is \(\mathbf{P}(Y > 2X)\)?
- Fill in the blanks “The density \(f(x,y)\) above is the joint density of the _________ and __________ of ten independent uniform \((0,1)\) random variables.”
[Adapted from Pitman pg 354]
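As a numerical sanity check (not required by the exercise), a midpoint Riemann sum over the triangle \(0 < x < y < 1\) confirms that \(90(y-x)^8\) integrates to 1 and estimates \(\mathbf{P}(Y > 2X)\), which works out analytically to \(1 - 2^{-9}\):

```python
# Midpoint Riemann sum over 0 < x < y < 1 for f(x, y) = 90 (y - x)^8.
m = 400
h = 1.0 / m
total = p_y_gt_2x = 0.0
for i in range(m):
    x = (i + 0.5) * h
    for j in range(m):
        y = (j + 0.5) * h
        if x < y:
            w = 90 * (y - x) ** 8 * h * h
            total += w
            if y > 2 * x:
                p_y_gt_2x += w

print(total, p_y_gt_2x)  # ≈ 1 and ≈ 1 - 2**-9 ≈ 0.998
```

The misclassification error along the lines \(y = x\) and \(y = 2x\) is small because the density vanishes on the diagonal and the grid is fine.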
Box-Muller I
Let \(U_1\) and \(U_2\) be independent random variables distributed uniformly on \( (0,1) \).
Define \((Z_1,Z_2)\) by
\[Z_1=\sqrt{ -2 \log(U_1) }\cos( 2 \pi U_2) \]
\[Z_2=\sqrt{ -2 \log(U_1) }\sin( 2 \pi U_2) \]
- Find the joint density of \((Z_1, Z_2)\).
- Are \(Z_1\) and \(Z_2\) independent? Why?
- What is the marginal density of \(Z_1\) and of \(Z_2\)? Do you recognize it?
- Reflect on the implications of the previous answer for generating an often needed class of random variable on a computer.
Hint: To eliminate \(U_1\) write the formula for \(Z_1^2 + Z_2^2\).
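The transformation above is the Box-Muller method for generating standard normals from uniforms, and it can be implemented in a few lines. A sketch that checks the sample mean and variance:

```python
import math
import random

random.seed(42)

def box_muller():
    """Return a pair of independent N(0,1) samples from two U(0,1) draws."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

samples = [z for _ in range(50_000) for z in box_muller()]
n = len(samples)
mean = sum(samples) / n
var = sum((z - mean) ** 2 for z in samples) / n
print(mean, var)  # close to 0 and 1
```

This is exactly the computational payoff of the last part: two calls to a uniform generator yield two independent Gaussian samples.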
Simple Joint density
Let \(X\) and \(Y\) have joint density
\[ f(x,y) = c e^{-2x -3 y} \quad (x,y>0)\]
for some \(c>0\) and \(f(x,y)=0\) otherwise. Find:
- the correct value of \(c\).
- \(P( X \leq x, Y \leq y)\)
- \(f_X(x)\)
- \(f_Y(y)\)
- Are \(X\) and \(Y\) independent? Explain your reasoning.
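Because the density factors as \(c\,e^{-2x}e^{-3y}\), the constant is \(c = 2 \cdot 3 = 6\) and the marginals are Exponential(2) and Exponential(3). A numerical check of the normalizing constant via fine midpoint sums (truncating the tails at 20, where they are negligible):

```python
from math import exp

# ∫_0^∞ ∫_0^∞ e^{-2x - 3y} dx dy factors as (1/2)(1/3) = 1/6, so c = 6.
h, upper = 0.001, 20.0
ix = sum(exp(-2 * (k + 0.5) * h) * h for k in range(int(upper / h)))  # ≈ 1/2
iy = sum(exp(-3 * (k + 0.5) * h) * h for k in range(int(upper / h)))  # ≈ 1/3
c = 1 / (ix * iy)
print(c)  # ≈ 6
```

The factorization of \(f\) into a function of \(x\) times a function of \(y\) is also the reason \(X\) and \(Y\) are independent.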
Joint arrival times
Let \(T_1\) and \(T_5\) be the times of the first and fifth arrival in a Poisson process with rate \(\lambda\). Find the joint density of \(T_1\) and \(T_5\).
[Pitman p355 #12]
Maxes and Mins
Let \(X_1,\cdots, X_n\) be random variables which are i.i.d. \(\text{uniform}(0,1)\). Let \(X_{(1)},\cdots, X_{(n)}\) be the associated order statistics.
- Find the distribution of \(X_{(n/2)}\) when \(n\) is even.
- Find \(\mathbf{E} [ X_{(n)} - X_{(1)} ]\).
- Find the distribution of \(R=X_{(n)} - X_{(1)}\).
Change of Variable: Gaussian
Let \(Z\) be a standard normal random variable (i.e., with distribution \(N(0,1)\)). Find the formula for the density of each of the following random variables.
- \(3Z+5\)
- \(|Z|\)
- \(Z^2\)
- \(\frac1Z\)
- \(\frac1{Z^2}\)
[based on Pitman p. 310, #10]
Change of variable: Weibull distribution
A random variable \(T\) has the \(\text{Weibull}(\lambda,\alpha)\) distribution if it has probability density function
\[f(t)=\lambda \alpha t^{\alpha-1} e^{-\lambda t^\alpha} \qquad (t>0)\]
where \(\lambda >0\) and \(\alpha>0\).
- Show that \(T^\alpha\) has an \(\text{exponential}(\lambda)\) distribution.
- Show that if \(U\) is a \(\text{uniform}(0,1)\) random variable, then
\[ T=\Big( - \frac{\log(U)}{\lambda}\Big)^{\frac1\alpha}\]
has a \(\text{Weibull}(\lambda,\alpha)\) distribution.
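The second part is the inverse-transform method in action: plugging a uniform \(U\) into the formula produces Weibull samples. A sketch with hypothetical parameters, checked against the known mean \(\lambda^{-1/\alpha}\,\Gamma(1 + 1/\alpha)\):

```python
import math
import random

random.seed(7)
lam, alpha = 2.0, 1.5   # hypothetical parameter values

def weibull_sample():
    # Inverse transform: if U ~ uniform(0,1), then (-log(U)/λ)^(1/α) ~ Weibull(λ, α)
    u = random.random()
    return (-math.log(u) / lam) ** (1 / alpha)

n = 100_000
mean = sum(weibull_sample() for _ in range(n)) / n
theory = lam ** (-1 / alpha) * math.gamma(1 + 1 / alpha)
print(mean, theory)  # should agree to a few decimal places
```

The theoretical mean follows from the first part: \(T = E^{1/\alpha}\) for \(E \sim \text{exponential}(\lambda)\), and \(\mathbf{E}[E^{1/\alpha}] = \lambda^{-1/\alpha}\Gamma(1 + 1/\alpha)\).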
Change of Variable: Uniform
Find the density of:
- \(U^2\) if \(U\) is uniform(0,1).
- \(U^2\) if \(U\) is uniform(-1,1).
- \(U^2\) if \(U\) is uniform(-2,1).
Simple Poisson Calculations
Let \(X\) have Poisson\((\lambda)\) distribution. Calculate:
- \(\mathbf{E}(3 X +5)\)
- \(\mathbf{Var}(3X +5)\)
- \(\mathbf{E}\big[\frac1{1+X} \big]\)
Mixing Poisson Random Variables 1
Assume that \(X\), \(Y\), and \(Z\) are independent Poisson random variables, each with mean 1. Find
- \(\mathbf{P}(X+Y = 4) \)
- \(\mathbf{E}[(X+Y)^2]\)
- \(\mathbf{P}(X+Y + Z= 4) \)
Random Errors in a Book
A book has 200 pages. The number of mistakes on each page is a Poisson random variable with mean 0.01, and is independent of the number of mistakes on all other pages.
- What is the expected number of pages with no mistakes? What is the variance of the number of pages with no mistakes?
- A person proofreading the book finds a given mistake with probability 0.9. What is the expected number of pages where this person will find a mistake?
- What, approximately, is the probability that the book has two or more pages with mistakes?
[Pitman p235, #15]
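A page is mistake-free with probability \(e^{-0.01}\), found mistakes per page are Poisson\((0.9 \times 0.01)\) by thinning, and the count of pages with mistakes is approximately Poisson with mean \(200(1 - e^{-0.01})\). A sketch assembling the three answers numerically:

```python
from math import exp

pages, lam, detect = 200, 0.01, 0.9

p_clean = exp(-lam)                       # P(a given page has no mistakes)
e_clean = pages * p_clean                 # expected clean pages (Binomial mean)
v_clean = pages * p_clean * (1 - p_clean) # Binomial variance

# Thinning: found mistakes on a page ~ Poisson(0.9 * 0.01)
p_found = 1 - exp(-detect * lam)
e_found_pages = pages * p_found

# Pages with >= 1 mistake ≈ Poisson(mu) with mu = 200 * (1 - e^{-0.01})
mu = pages * (1 - p_clean)
p_two_or_more = 1 - exp(-mu) * (1 + mu)

print(e_clean, v_clean, e_found_pages, p_two_or_more)
```

The final Poisson approximation is justified because each page contributes a rare, independent event.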
Cards again
Given a well shuffled standard deck of 52 cards, what is the probability of each of the following events? (Think before you jump.)
- The 1st card is an ace.
- The 15th card is an ace.
- The 9th card is a diamond.
- The last 5 cards are hearts.
- The 17th card is the ace of diamonds and the 14th card is the king of spades.
- The 5th card is a diamond given that the 50th card is a diamond.
Expectation of min of exponentials
There are \(15\) stock brokers. The return (in thousands of dollars) for each broker is modeled as an independent exponential random variable, \(X_1 \sim \mbox{Exp}(\lambda_1),\dots,X_{15} \sim \mbox{Exp}(\lambda_{15})\). Define \(Z = \min\{X_1,\dots,X_{15}\}\).
What is \(\mathbf{E}(Z)\)?
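The minimum of independent exponentials is again exponential with rate \(\sum_i \lambda_i\), so \(\mathbf{E}(Z) = 1/\sum_i \lambda_i\). A Monte Carlo sketch with hypothetical rates:

```python
import random

random.seed(3)

# Hypothetical rates for the 15 brokers: λ_i = 0.5 + 0.1 * i, summing to 18.
rates = [0.5 + 0.1 * i for i in range(15)]
theory = 1 / sum(rates)   # min of independent Exp(λ_i) is Exp(Σλ_i)

trials = 100_000
est = sum(min(random.expovariate(l) for l in rates) for _ in range(trials)) / trials
print(est, theory)
```

With these rates the theoretical value is \(1/18 \approx 0.0556\), and the simulation agrees closely.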
Two normals
A sequence \(X_1,\dots,X_n\) is drawn iid from either \(\mbox{N}(0,1)\) or \(\mbox{N}(0,10)\) with equal prior probability.
- State the formulas for the posterior probabilities that the sequence came from the normal with variance \(1\) or variance \(10\).
- If you know the variance of the normal is \(1\), what are the variances of \(S = \sum_i X_i\) and \( \hat{\mu} = \frac{1}{n} \sum_i X_i\)?
- What is \(\mbox{Pr}(Z > \max\{x_1,\dots,x_n\})\) if the variance is \(1\) and if it is \(10\)?
Limit for mixtures
Consider the following mixture distribution.
- Draw \(X \sim \mbox{Be}(p=.3)\)
- If \(X=1\) then \(Y \sim \mbox{Geo}(p_1)\)
- If \(X= 0\) then \(Y \sim \mbox{Bin}(n,p_2)\)
Consider the sequence of random variables \(Y_1,…,Y_{200}\) drawn iid from the above random experiment.
Use the central limit theorem to state the approximate distribution of \(S = \frac{1}{200} \sum_{i=1}^{200} Y_i\).
(Here \(\mbox{Be}(p)\) is the Bernoulli distribution with parameter \(p\) and \(\mbox{Geo}(p)\) is the geometric distribution with the parameter \(p\). )
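The CLT statement needs only the mixture mean and variance, computed from the component moments. A sketch with assumed illustrative parameters (\(p_1 = 0.4\), \(n = 10\), \(p_2 = 0.6\)) and the convention that \(\mbox{Geo}(p)\) is supported on \(\{1, 2, \dots\}\) with mean \(1/p\) and variance \((1-p)/p^2\):

```python
# Assumed illustrative parameters; the problem leaves p1, n, p2 unspecified.
p1, n, p2 = 0.4, 10, 0.6

m1, v1 = 1 / p1, (1 - p1) / p1**2   # Geometric component: mean, variance
m2, v2 = n * p2, n * p2 * (1 - p2)  # Binomial component: mean, variance

# Mixture moments: E[Y] and Var(Y) via E[Y^2] = Σ w_i (v_i + m_i^2).
mu = 0.3 * m1 + 0.7 * m2
second = 0.3 * (v1 + m1**2) + 0.7 * (v2 + m2**2)
var = second - mu**2

# CLT: S = (1/200) Σ Y_i is approximately N(mu, var / 200)
print(mu, var / 200)
```

Note that the mixture variance is not the weighted average of the component variances; the spread between the two component means contributes as well.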