
Category Archives: Basic probability

High Card Wins

Bob and Alice each have a box containing 3 numbered cards. Bob’s box has cards numbered 2, 5, and 9. Alice’s box has cards numbered 3, 4, and 8. Notice that the average value of the cards in each box is the same. If each draws a card uniformly at random from their own box, find the probability that Alice wins (draws the higher card). What is the probability that Bob wins?
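Since there are only nine equally likely outcomes, the hand calculation can be checked by brute-force enumeration. A minimal sketch in Python:

```python
from fractions import Fraction

# Enumerate the 3 x 3 equally likely (Bob, Alice) draws.
bob = [2, 5, 9]
alice = [3, 4, 8]

outcomes = [(b, a) for b in bob for a in alice]
p_alice = Fraction(sum(a > b for b, a in outcomes), len(outcomes))
p_bob = Fraction(sum(b > a for b, a in outcomes), len(outcomes))

print(p_alice, p_bob)  # no ties are possible here, so the two probabilities sum to 1
```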

Algebras and Conditioning

 

Consider a deck with three cards numbered 1, 2, and 3. Furthermore, assume that the 1 and 2 cards are colored red and the 3 card is colored black. Two of the cards are drawn without replacement. Let \(D_1\) be the first card drawn and \(D_2\) be the second card drawn. Let \(T\) be the sum of the two cards drawn and let \(N\) be the number of red cards drawn.

  1. Write down the algebra of all possible events on this probability space.
  2. What is the algebra of events generated by \(T\), which we will denote \(\mathcal{A}(T)\)?
  3. What is the algebra of events generated by \(N\), which we will denote \(\mathcal{A}(N)\)?
  4. Is \(T\) adapted to \(\mathcal{A}(N)\)? Explain in terms of the above algebras.
  5. Is \(N\) adapted to \(\mathcal{A}(T)\)? Explain in terms of the above algebras.
  6. What is \[ \mathbf{E} [ N \,|\, \mathcal{A}(T)] ? \]
  7. What is \[ \mathbf{E} [ T \,|\, \mathcal{A}(N)] ? \]
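Because the sample space has only six equally likely ordered draws, parts 6 and 7 can be checked by direct enumeration: the atoms of \(\mathcal{A}(T)\) are the level sets of \(T\), and the conditional expectation is constant on each atom. A sketch for part 6:

```python
from fractions import Fraction
from itertools import permutations
from collections import defaultdict

red = {1, 2}
# All ordered draws of two distinct cards from {1, 2, 3}, each with probability 1/6.
draws = list(permutations([1, 2, 3], 2))

# Group outcomes by the value of T; E[N | A(T)] is constant on each group.
by_T = defaultdict(list)
for d1, d2 in draws:
    by_T[d1 + d2].append(sum(c in red for c in (d1, d2)))

cond_exp = {t: Fraction(sum(ns), len(ns)) for t, ns in by_T.items()}
for t in sorted(cond_exp):
    print("E[N | T = %d] = %s" % (t, cond_exp[t]))
```

The same grouping with the roles of \(T\) and \(N\) exchanged handles part 7.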

 

 

Binomial with a random parameter

Let \(X\) be binomial with parameters \(n\), which is constant, and \(\Theta\), which is distributed uniformly on \( (0,1)\).

  1. Find \(\mathbf{E}(s^X | \Theta)\)  for any \(s\).
  2. Show that for any \(s\)
    \[ \mathbf{E} ( s^X) = \frac{1}{n+1} \big(\frac{1-s^{n+1}}{1-s} \big)\]
    Use this to conclude that \(X\) is distributed uniformly on the set \(\{0,1,2, \cdots, n\}\).
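The conclusion of part 2 lends itself to a quick Monte Carlo sanity check (here \(n=5\) is an arbitrary choice):

```python
import random

random.seed(0)
n, trials = 5, 200_000

counts = [0] * (n + 1)
for _ in range(trials):
    theta = random.random()                             # Theta ~ Uniform(0, 1)
    x = sum(random.random() < theta for _ in range(n))  # X | Theta ~ Binomial(n, Theta)
    counts[x] += 1

freqs = [c / trials for c in counts]
print([round(f, 3) for f in freqs])  # each entry should be near 1/(n+1)
```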

 

Limit theorems via generating functions

Use the results on generating functions and limit theorems to answer the following questions.

  1. Let \(Y_n\) be uniform on the set \(\{1,2,3,\cdots,n\}\). Find the moment generating function of \(\frac1n Y_n\), which we will call \(M_n(t)\). Then show that as \(n \rightarrow \infty\),
    \[ M_n(t) \rightarrow \frac{e^t -1}{t}.\]
    Lastly, identify this limiting moment generating function as that of a known random variable. Comment on why this makes sense.
  2. Let \(X_n\) be distributed as a binomial with parameters \(n\) and \(p_n=\lambda/n\).   By using the probability generating function for \(X_n\), show that \(X_n\) converges to a Poisson random variable with parameter \(\lambda\) as \(n \rightarrow \infty\).

[Adapted from Stirzaker, p. 318]
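For part 1 above, the limit can also be seen numerically: \(M_n(t) = \frac1n \sum_{k=1}^n e^{tk/n}\) is a Riemann sum for \(\int_0^1 e^{tx}\,dx\). A sketch, with an arbitrary value of \(t\):

```python
import math

def M(n, t):
    # m.g.f. of Y_n / n where Y_n is uniform on {1, ..., n}
    return sum(math.exp(t * k / n) for k in range(1, n + 1)) / n

t = 1.5
limit = (math.exp(t) - 1) / t
for n in (10, 100, 10_000):
    print(n, round(M(n, t), 6), round(limit, 6))
```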

Basic generating functions

In each example below, find the probability generating function (p.g.f.) or moment generating function (m.g.f.) of the random variable \(X\). Show your work! (When both are asked for, recall the relationship between the m.g.f. and p.g.f.: you only need to do the calculation for one to find both.)

  1. \(X\) is normal with mean \(\mu\) and variance \(\sigma^2\). Find the m.g.f.
  2. \(X\) is uniform on \( (0,a)\). Find the m.g.f.
  3. \(X\) is Bernoulli with parameter \(p\). Find the p.g.f. and m.g.f.
  4. \(X\) is exponential with parameter \(\lambda\). Find the m.g.f.
  5. \(X\) is geometric with parameter \(p\). Find the p.g.f. and m.g.f.
  6. \(X=a+bY\) where \(Y\) has probability generating function \(G(s)\). Find the m.g.f.

 

Random Sum of Random Variables

Let \(\{X_r : r=1,2,3,\cdots\}\) be a collection of i.i.d. random variables. Let \(G(s)\) be the generating function of \(X_1\) (i.e. \(G(s)=\mathbf{E} (s^{X_1})\)), and hence of each of the \(X_r\). Let \(N\) be an additional random variable taking values in the non-negative integers which is independent of all of the \(X_r\). Let \(H(s)\) be the generating function of \(N\).

  1. Define the random variable \[ T=\sum_{k=1}^N X_k\] where \(T=0\) if \(N=0\). For any fixed \(s>0\), calculate \( \mathbf{E}[ s^T | N]\). Show that the generating function of \(T\) is \(H(G(s))\).
  2. Assume that each claim a given insurance company pays is independent and distributed as an exponential random variable with parameter \(\lambda\). Let the number of claims in a given year be distributed as a geometric random variable with parameter \(p\). What is the moment generating function of the total amount of money paid out in a given year? Use your answer to identify the distribution of the total money paid out in a given year.
  3. Looking back at the previous part of the question, contrast your answer with the result of adding a non-random (fixed) number of exponentials together.
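Part 2 can be sanity-checked by simulation. The sketch below takes the geometric on \(\{1,2,\dots\}\) (conventions differ) and illustrative values of \(\lambda\) and \(p\); compare the sample mean of the total payout with the mean of the distribution you identified.

```python
import random

random.seed(1)
lam, p, trials = 2.0, 0.25, 100_000

total = 0.0
for _ in range(trials):
    n = 1
    while random.random() > p:       # N ~ Geometric(p) on {1, 2, ...}
        n += 1
    total += sum(random.expovariate(lam) for _ in range(n))

print(round(total / trials, 3))  # compare with the mean of your identified distribution
```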

Linear regression

Consider the following model:

\(X_1,…,X_n \stackrel{iid}{\sim} f(x), \quad Y_i = \theta X_i + \varepsilon_i, \quad \varepsilon_i \stackrel{iid}{\sim} \mbox{N}(0,\sigma^2).\)

  1. Compute \({\mathbf E }(Y \mid X)\)
  2. Compute \({\mathbf E }(\varepsilon \mid X)\)
  3. Compute \({\mathbf E }( \varepsilon)\)
  4. Show \( \theta = \frac{{\mathbf E}(XY)}{{\mathbf E}(X^2)}\)

Clinical trial

Let \(X\) be the number of patients in a clinical trial with a successful outcome. Let \(P\) be the probability of success for an individual patient. We assume before the trial begins that \(P\) is uniform on \([0,1]\). Compute

  1. \(f(P \mid X)\)
  2. \( {\mathbf E}( P \mid X)\)
  3. \( {\mathbf Var}( P \mid X)\)

Order statistics II

Suppose \(X_1, … , X_{17}\) are iid uniform on \( (0.5, 0.8) \). What is \({\mathbf{E}} [X_{(k)}] \)?

Order statistics I

Suppose \(X_1, … , X_n \stackrel{iid}{\sim} U(0,1) \). How large must \(n\) be so that \({\mathbf{P}}(X_{(n)} \geq 0.95) \geq 1/2\)?

Beta-binomial

You have a sequence of coin flips \(X_1,…,X_n\) drawn iid from a Bernoulli distribution with unknown parameter \(p\) and known fixed \(n\). Assume a priori that the coin’s parameter \(p\) follows a Beta distribution with parameters \(\alpha,\beta\).

  1. Given the sequence \(X_1,…,X_n\), what is the posterior pdf of \(p\)?
  2. For what value of \(p\) is the maximum of the posterior pdf attained?

Hint: If \(X\) is distributed Bernoulli(p) then for \(x=1,0\) one has \(P(X=x)=p^x(1-p)^{(1-x)}\). Furthermore, if \(X_1,X_2\) are i.i.d. Bernoulli(p) then
\[P(X_1=x_1, X_2=x_2 )=P(X_1=x_1)P(X_2=x_2 )=p^{x_1}(1-p)^{(1-x_1)}p^{x_2}(1-p)^{(1-x_2)}\]

Conditioning and Polya’s urn

An urn contains 1 black and 2 white balls. One ball is drawn at random and its color noted. The ball is replaced in the urn, together with an additional ball of its color. There are now four balls in the urn. Again, one ball is drawn at random from the urn, then replaced along with an additional ball of its color. The process continues in this way.

  1. Let \(B_n\) be the number of black balls in the urn just before the \(n\)th ball is drawn. (Thus \(B_1= 1\).) For \(n \geq 1\), find \(\mathbf{E} (B_{n+1} | B_{n}) \).
  2. For \(n \geq 1\),  find \(\mathbf{E} (B_{n}) \). [Hint: Use induction based on the previous answer and the fact that \(\mathbf{E}(B_1) =1\)]
  3. For \(n \geq 1\), what is the expected proportion of black balls in the urn just before the \(n\)th ball is drawn?

 

[From Pitman p. 408, #6]
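The recursion in part 1 can be checked against a direct simulation of the urn; the sketch below estimates \(\mathbf{E}(B_n)\) for an arbitrary choice of \(n\), to be compared with your formula from part 2.

```python
import random

random.seed(2)
n, trials = 10, 100_000

total = 0
for _ in range(trials):
    black, white = 1, 2                      # urn just before the 1st draw
    for _ in range(n - 1):                   # perform draws 1, ..., n-1
        if random.random() < black / (black + white):
            black += 1
        else:
            white += 1
    total += black                           # this is B_n

print(round(total / trials, 3))  # compare with your formula for E[B_n] at n = 10
```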

Car tires

The air pressures in the left and right front tires of a car are random variables \(X\) and \(Y\), respectively. Tires should be filled to 26 psi. The joint pdf is

\( f(x,y) = K(x^2+y^2), \quad 20 \leq x,y \leq 30 \)

  1. What is \(K\)?
  2. Are the random variables independent?
  3. What is the probability that both tires are underfilled?
  4. What is the probability that \( |X-Y| \leq 3 \)?
  5. What are the marginal densities?
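Part 1 can be cross-checked numerically: approximate \(\int_{20}^{30}\int_{20}^{30}(x^2+y^2)\,dx\,dy\) with a midpoint rule and invert it. A sketch:

```python
# Midpoint-rule approximation of the normalizing integral for f(x, y) = K(x^2 + y^2).
m = 400
h = 10 / m
total = 0.0
for i in range(m):
    x = 20 + (i + 0.5) * h
    for j in range(m):
        y = 20 + (j + 0.5) * h
        total += (x * x + y * y) * h * h

K = 1 / total
print(K)  # compare with the exact value from the hand calculation
```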

Joint of min and max

Let \(X_1,…,X_n \stackrel{iid}{\sim} \mbox{Exp}(\lambda) \)

Let \(V = \mbox{min}(X_1,…,X_n)\) and  \(W = \mbox{max}(X_1,…,X_n)\).

What is the joint distribution of \(V,W\)? Are they independent?

Joint density part 1

Let \(X\) and \(Y\) have joint density

\(f(x,y) = 90(y-x)^8, \quad 0<x<y<1\)

  1. State the marginal distribution for \(X\)
  2. State the marginal distribution for \(Y\)
  3. Are these two random variables independent?
  4. What is \(\mathbf{P}(Y > 2X)\)?
  5. Fill in the blanks “The density \(f(x,y)\) above   is the joint density of the  _________ and __________ of ten independent uniform \((0,1)\) random variables.”

[Adapted from Pitman p. 354]

 

Box-Muller I

Let \(U_1\) and \(U_2\) be independent random variables distributed uniformly on \( (0,1) \).

Define \((Z_1,Z_2)\) by

\[Z_1=\sqrt{ -2 \log(U_1) }\cos( 2 \pi U_2) \]

\[Z_2=\sqrt{ -2 \log(U_1) }\sin( 2 \pi U_2) \]

  1. Find the joint density of \((Z_1, Z_2)\).
  2. Are \(Z_1\) and \(Z_2\) independent? Why?
  3. What is the marginal density of \(Z_1\) and \(Z_2\)? Do you recognize it?
  4. Reflect on the implications of the previous answer for generating an often-needed class of random variables on a computer.

Hint: To eliminate \(U_1\) write the formula for  \(Z_1^2 + Z_2^2\).
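The transformation is easy to implement directly. The sketch below generates pairs and sanity-checks the sample mean and variance against the distribution you should recognize in part 3:

```python
import math
import random

random.seed(3)

def box_muller():
    # One draw of the pair (Z_1, Z_2) from two independent uniforms.
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

samples = [z for _ in range(50_000) for z in box_muller()]
n = len(samples)
mean = sum(samples) / n
var = sum((z - mean) ** 2 for z in samples) / n
print(round(mean, 3), round(var, 3))
```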

 

Simple Joint density

Let \(X\) and \(Y\) have joint density

\[ f(x,y) = c e^{-2x -3 y} \quad (x,y>0)\]

for some \(c>0\) and \(f(x,y)=0\) otherwise. Find:

  1. the correct value of \(c\).
  2. \(P( X \leq x, Y \leq y)\)
  3. \(f_X(x)\)
  4. \(f_Y(y)\)
  5. Are \(X\) and \(Y\) independent? Explain your reasoning.

Joint arrival times

Let \(T_1\) and \(T_5\) be the times of the first and fifth arrival in a Poisson process with rate \(\lambda\). Find the joint density of \(T_1\) and \(T_5\).

 

[Pitman p. 355, #12]

Maxes and Mins

Let \(X_1,\cdots, X_n\) be random variables which are i.i.d. \(\text{uniform}(0,1)\). Let \(X_{(1)},\cdots, X_{(n)}\) be the associated order statistics.

  1. Find the distribution of \(X_{(n/2)}\) when \(n\) is even.
  2. Find \(\mathbf{E} [ X_{(n)} - X_{(1)} ]\).
  3. Find the distribution of \(R=X_{(n)} - X_{(1)}\).

Moment Generating Functions: Bernoulli and More

  1. Find the moment generating function for a \(\text{Bernoulli}(p)\) random variable.
  2. Recalling that if \(X\) is distributed as \(\text{Binomial}(n,p)\) it can be written as the sum of appropriate Bernoulli random variables, find the moment generating function for \(X\).
  3. Use the solution to the previous question to find the variance of \(X\). Show your work!

 

Change of Variable: Gaussian

Let \(Z\) be a standard normal random variable (i.e. with distribution \(N(0,1)\)). Find the formula for the density of each of the following random variables.

  1. \(3Z+5\)
  2. \(|Z|\)
  3. \(Z^2\)
  4. \(\frac1Z\)
  5. \(\frac1{Z^2}\)

[based on Pitman p. 310, #10]

Change of variable: Weibull distribution

A random variable \(T\) has the \(\text{Weibull}(\lambda,\alpha)\) distribution if it has probability density function

\[f(t)=\lambda \alpha t^{\alpha-1} e^{-\lambda t^\alpha} \qquad (t>0)\]

where \(\lambda >0\) and \(\alpha>0\).

  1. Show that \(T^\alpha\) has an \(\text{exponential}(\lambda)\) distribution.
  2. Show that if \(U\) is a \(\text{uniform}(0,1)\) random variable, then
    \[ T=\Big( - \frac{\log(U)}{\lambda}\Big)^{\frac1\alpha}\]
    has a \(\text{Weibull}(\lambda,\alpha)\)  distribution.
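Part 2 is the basis of inverse-transform sampling for the Weibull. A Monte Carlo sketch (with illustrative \(\lambda\) and \(\alpha\)) comparing the empirical CDF of the transformed uniforms against \(F(t)=1-e^{-\lambda t^\alpha}\):

```python
import math
import random

random.seed(4)
lam, alpha, trials = 2.0, 1.5, 200_000

# T = (-log(U) / lam) ** (1 / alpha) with U ~ Uniform(0, 1)
samples = [(-math.log(random.random()) / lam) ** (1 / alpha) for _ in range(trials)]

for t in (0.25, 0.5, 1.0):
    empirical = sum(s <= t for s in samples) / trials
    exact = 1 - math.exp(-lam * t ** alpha)
    print(t, round(empirical, 3), round(exact, 3))
```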

Change of Variable: Uniform

Find the density of :

  1. \(U^2\) if \(U\) is uniform(0,1).
  2. \(U^2\) if \(U\) is uniform(-1,1).
  3. \(U^2\) if \(U\) is uniform(-2,1).

Simple Poisson Calculations

Let \(X\) have a Poisson\((\lambda)\) distribution. Calculate:

  1. \(\mathbf{E}(3 X +5)\)
  2. \(\mathbf{Var}(3X +5)\)
  3. \(\mathbf{E}\big[\frac1{1+X} \big]\)
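Parts 1 and 2 follow from linearity and scaling; part 3 takes a short calculation with the Poisson pmf. A Monte Carlo sketch for checking part 3 (Poisson sampling via Knuth's product-of-uniforms method, with an arbitrary \(\lambda\)):

```python
import math
import random

random.seed(5)

def poisson(lam):
    # Knuth's method: count uniforms until their running product drops below e^{-lam}.
    limit, k, prod = math.exp(-lam), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

lam, trials = 2.0, 200_000
est = sum(1 / (1 + poisson(lam)) for _ in range(trials)) / trials
print(round(est, 4))  # compare with your closed-form answer at lambda = 2
```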

Mixing Poisson Random Variables 1

Assume that  \(X\), \(Y\), and \(Z\) are independent Poisson random variables, each with mean 1. Find

  1. \(\mathbf{P}(X+Y = 4) \)
  2. \(\mathbf{E}[(X+Y)^2]\)
  3. \(\mathbf{P}(X+Y + Z= 4) \)

Random Errors in a Book

A book has 200 pages. The number of mistakes on each page is a Poisson random variable with mean 0.01, and is independent of the number of mistakes on all other pages.

  1. What is the expected number of pages with no mistakes? What is the variance of the number of pages with no mistakes?
  2. A person proofreading the book finds a given mistake with probability 0.9. What is the expected number of pages where this person will find a mistake?
  3. What, approximately, is the probability that the book has two or more pages with mistakes?

 

[Pitman p. 235, #15]

Cards again

Given a well-shuffled standard deck of 52 cards, what is the probability of each of the following events? (Think before you jump.)

  1. The 1st card is an ace.
  2. The 15th card is an ace.
  3. The 9th card is a diamond.
  4. The last 5 cards are hearts.
  5. The 17th card is the ace of diamonds and the 14th card is the king of spades.
  6. The 5th card is a diamond given that the 50th card is a diamond.
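Several of these parts turn on the symmetry of a well-shuffled deck: every position is equally likely to hold any given card. A simulation sketch for part 2 makes this vivid:

```python
import random

random.seed(9)
deck = ["A"] * 4 + ["x"] * 48       # 4 aces in a 52-card deck
trials = 200_000

hits = 0
for _ in range(trials):
    random.shuffle(deck)
    hits += deck[14] == "A"          # the 15th card (index 14)

print(round(hits / trials, 4))  # compare with P(the 1st card is an ace)
```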

 

Expectation of min of exponentials

There are \(15\) stock brokers. The return (in thousands of dollars) of each broker is modeled as an independent exponential random variable: \(X_1 \sim \mbox{Exp}(\lambda_1),…,X_{15} \sim \mbox{Exp}(\lambda_{15})\). Define \(Z = \min\{X_1,…,X_{15}\}\).

What is \(\mathbf{E}(Z)\)?
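A Monte Carlo sketch with illustrative (hypothetical) rates \(\lambda_i\); compare the estimate with your closed-form expression for \(\mathbf{E}(Z)\) in terms of the rates:

```python
import random

random.seed(6)
rates = [0.5 + 0.1 * i for i in range(15)]   # illustrative lambda_1, ..., lambda_15
trials = 100_000

# Z is the minimum of 15 independent exponentials with the rates above.
est = sum(min(random.expovariate(r) for r in rates) for _ in range(trials)) / trials
print(round(est, 4))
```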

Two normals

A sequence \(X_1,…,X_n\) is drawn iid from either \(\mbox{N}(0,1)\) or \(\mbox{N}(0,10)\) with equal prior probability.

  1. State the formulae for the posterior probabilities that the sequence came from the normal with variance \(1\) or variance \(10\).
  2. If you know the variance of the normal is \(1\), then what are the variances of \(S = \sum_i X_i\) and \( \hat{\mu} = \frac{1}{n} \sum_i X_i\)?
  3. What is \(\mbox{Pr}(Z > \max\{x_1,…,x_n\})\) if \(\sigma^2 =1\) and if \(\sigma^2 =10\)?

Limit for mixtures

Consider the following mixture distribution.

  1. Draw \(X \sim \mbox{Be}(p=.3)\)
  2. If \(X=1\) then \(Y \sim \mbox{Geo}(p_1)\)
  3. If \(X= 0\) then  \(Y \sim \mbox{Bin}(n,p_2)\)

Consider the sequence of random variables \(Y_1,…,Y_{200}\) drawn iid from the above random experiment.

Use the central limit theorem to state the approximate distribution of \(S = \frac{1}{200} \sum_{i=1}^{200} Y_i\).

(Here \(\mbox{Be}(p)\) is the Bernoulli distribution with parameter \(p\) and  \(\mbox{Geo}(p)\) is the geometric distribution with the parameter \(p\). )
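A simulation sketch of the experiment with illustrative (hypothetical) parameter choices, taking the geometric on \(\{1,2,\dots\}\): the histogram of the sample means should look approximately normal, and their average should be near \(\mathbf{E}(Y)\).

```python
import random

random.seed(7)
p, p1, n, p2 = 0.3, 0.5, 10, 0.4   # illustrative parameter values

def draw_Y():
    if random.random() < p:                      # X = 1: geometric branch
        k = 1
        while random.random() > p1:
            k += 1
        return k
    return sum(random.random() < p2 for _ in range(n))  # X = 0: binomial branch

reps = 5_000
means = [sum(draw_Y() for _ in range(200)) / 200 for _ in range(reps)]
print(round(sum(means) / reps, 3))  # should be near E[Y] for these parameters
```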
