
Category Archives: Basic probability

An example of min and change of variable

Suppose \(R_1\) and \(R_2\) are two independent random variables with the same density function

\[f(x)=x\exp(-{\textstyle \frac12 }x^2)\]

for \(x\geq 0\). Find

  1. the density of \(Y=\min(R_1,R_2)\);
  2. the density of \(Y^2\)

[Pitman p. 336 #21]
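A quick way to sanity-check a derived density is simulation. The following Python sketch (my addition, not part of the original exercise) samples \(R\) by inverse transform from the c.d.f. \(F(x)=1-\exp(-\frac12 x^2)\) implied by the given density, and reports the empirical mean of \(Y^2\) for comparison with whatever density you derive; the seed and sample size are arbitrary.

```python
import math
import random

random.seed(42)

def sample_r():
    # Inverse-transform sample from the c.d.f. F(x) = 1 - exp(-x^2/2)
    u = random.random()
    return math.sqrt(-2.0 * math.log(1.0 - u))

n = 200_000
ys = [min(sample_r(), sample_r()) for _ in range(n)]

mean_y2 = sum(y * y for y in ys) / n
print(f"empirical E[Y^2] = {mean_y2:.3f}")
```

Any other statistic of your derived densities can be checked the same way (e.g. an empirical histogram against the density).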

Calls arriving

Assume that calls arrive at a call centre according to a Poisson arrival process  with a rate of  15 calls per hour. For \(0 \leq s < t\), let \(N(s,t)\) denote the number of calls which arrive between time \(s\) and \(t\) where time is measured in hours.

  1. What is \( \mathbf{E}\big(\,N(3,5)\,\big)\) ?
  2. What is the second moment of \(N(2,4) \) ?
  3. What is \( \mathbf{E}\big(\,N(1,4)\,N(2,6)\,\big)\) ?
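If you want to check your answers numerically, here is a Monte Carlo sketch (mine, not part of the problem; it assumes NumPy is available). It builds the overlapping counts \(N(1,4)\) and \(N(2,6)\) from counts on disjoint subintervals, which are independent Poisson random variables.

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 15.0   # calls per hour
n = 500_000

# Counts on disjoint intervals are independent Poissons with mean rate * length,
# so N(1,4) and N(2,6) can be assembled from the pieces N(1,2), N(2,4), N(4,6).
a = rng.poisson(rate * 1, n)   # N(1,2)
b = rng.poisson(rate * 2, n)   # N(2,4)
c = rng.poisson(rate * 2, n)   # N(4,6)

n35 = rng.poisson(rate * 2, n)                    # N(3,5)
m1 = n35.mean()                                   # E[N(3,5)]
m2 = (b.astype(float) ** 2).mean()                # E[N(2,4)^2]
m3 = ((a + b) * (b + c)).mean()                   # E[N(1,4) N(2,6)]
print(f"E[N(3,5)] = {m1:.1f}   E[N(2,4)^2] = {m2:.1f}   E[N(1,4)N(2,6)] = {m3:.1f}")
```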

Tail-sum formula for continuous random variable

Let \(X\) be a positive random variable with c.d.f. \(F\).

  1. Show using the representation \(X=F^{-1}(U)\), where \(U\) is \(\textrm{unif}(0,1)\), that \(\mathbf{E}(X)\) can be interpreted as the area above the graph of \(y=F(x)\) but below the line \(y=1\). Using this, deduce that
    \[\mathbf{E}(X)=\int_0^\infty [1-F(x)] dx = \int_0^\infty \mathbf{P}(X> x) dx \ .\]
  2. Deduce that if \(X\) has possible values \(0,1,2,\dots\) , then
    \[\mathbf{E}(X)=\sum_{k=1}^\infty \mathbf{P}(X\geq  k)\]
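The discrete identity in part 2 is easy to verify numerically on a concrete example. The sketch below (my addition, not part of the exercise) uses a geometric random variable, truncating both sums at a large cutoff where the remaining tail is negligible.

```python
# Verify E[X] = sum_{k>=1} P(X >= k) for X geometric with P(X=k) = p(1-p)^(k-1)
p = 1 / 3
K = 200   # truncation point; the tail beyond K is negligible here

def pmf(k):
    return p * (1 - p) ** (k - 1)

direct = sum(k * pmf(k) for k in range(1, K + 1))
tail_sum = sum(sum(pmf(j) for j in range(k, K + 1)) for k in range(1, K + 1))
print(direct, tail_sum)   # both are close to E[X] = 1/p = 3
```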

Min, Max, and Exponential

Let \(X_1\) and \(X_2\) be random variables and let \(M=\mathrm{max}(X_1,X_2)\) and \(N=\mathrm{min}(X_1,X_2)\).

  1. Argue that the event \(\{ M \leq x\}\) is the same as the event \(\{X_1 \leq x, X_2 \leq x\}\) and similarly that the event \(\{ N > x\}\) is the same as the event \(\{X_1 > x, X_2 > x\}\).
  2. Now assume that \(X_1\) and \(X_2\) are independent and distributed with c.d.f. \(F_1(x)\) and \(F_2(x)\) respectively. Find the c.d.f. of \(M\) and the c.d.f. of \(N\) using the preceding observation.
  3. Now assume that \(X_1\) and \(X_2\) are independently and exponentially  distributed with parameters \(\lambda_1\) and \(\lambda_2\) respectively. Show that \(N\) is distributed exponentially and identify the parameter  in the exponential distribution of \(N\).
  4. The route to a certain remote island contains 4 bridges. If the time to collapse of each bridge is exponentially distributed with mean 20 years and is independent of the other bridges, what is the distribution of the time until the road is impassable because one of the bridges has collapsed ?

 

[Jonathan Mattingly]
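As an optional numerical check for parts 3 and 4 (my addition, not part of the exercise), the sketch below simulates the time until the first of the 4 bridges collapses; compare the empirical mean with the parameter you derive.

```python
import random
import statistics

random.seed(1)
mean_life = 20.0   # years until collapse for each bridge, on average
n = 100_000

# Time until the road is impassable = time of the first bridge collapse
first_collapse = [
    min(random.expovariate(1 / mean_life) for _ in range(4)) for _ in range(n)
]
print("empirical mean:", statistics.fmean(first_collapse))
```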

Approximating sums of uniform random variables

Suppose \(X_1,X_2,X_3,X_4\) are independent uniform \((0,1)\) and we set \(S_4=X_1+X_2+X_3+X_4\). Use the normal approximation to estimate \(\mathbf{P}( S_4 \geq 3) \).
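A simulation makes it easy to judge how good the normal approximation is here. The sketch below (my addition, not part of the exercise) compares a Monte Carlo estimate of \(\mathbf{P}(S_4 \geq 3)\) with the normal tail probability computed via `math.erf`.

```python
import math
import random

random.seed(7)

# Monte Carlo estimate of P(S4 >= 3)
n = 400_000
hits = sum(
    1
    for _ in range(n)
    if random.random() + random.random() + random.random() + random.random() >= 3
)
mc = hits / n

# Normal approximation: S4 has mean 4 * (1/2) = 2 and variance 4 * (1/12) = 1/3
mu, sigma = 2.0, math.sqrt(1 / 3)
approx = 0.5 * (1 - math.erf((3 - mu) / (sigma * math.sqrt(2))))

print(f"simulation: {mc:.4f}   normal approximation: {approx:.4f}")
```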

geometric probability: marginal densities

Find the density of the random variable \(X\) when the pair \( (X,Y) \) is chosen uniformly from the specified region in the plane in each case below.

  1. The diamond with vertices at \( (0,2), (-2,0), (0,-2), (2,0) \).
  2. The triangle with vertices \( (-2,0), (1,0), (0,2) \).

[Pitman p 277, #12]

probability density example

Suppose \(X\) takes values in \( (0,1) \) and has a density

\[f(x)=\begin{cases}c x^2 (1-x)^2 \qquad &x\in(0,1)\\  0 & x \not \in (0,1)\end{cases}\]

for some \(c>0\).

  1. Find \( c \).
  2. Find \(\mathbf{E}(X)\).
  3. Find \(\mathrm{Var}(X) \).

 

Infinite Mean

Suppose that \(X\) is a random variable whose density is

\[f(x)=\frac{1}{2(1+|x|)^2} \quad x \in (-\infty,\infty)\]

 

  1. Draw a graph of \(f(x)\).
  2. Find \(\mathbf{P}(-1 <X<2)\).
  3. Find \(\mathbf{P}(X>1)\).
  4. Is \(\mathbf{E}(X) \) defined ? Explain.
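Parts 2 and 3 can be checked by numerically integrating the density. The sketch below (my addition, not part of the exercise) uses a crude midpoint rule; note that the truncation at \(\pm 50\) visibly matters because the tails are heavy, which is also the point of part 4.

```python
def f(x):
    return 1.0 / (2.0 * (1.0 + abs(x)) ** 2)

def integrate(a, b, steps=200_000):
    # crude midpoint rule
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

total = integrate(-50, 50)    # below 1: the mass beyond +/-50 is about 0.02
p_mid = integrate(-1, 2)      # P(-1 < X < 2)
p_tail = integrate(1, 50)     # P(X > 1), again missing the tail beyond 50
print(total, p_mid, p_tail)
```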

Raindrops are falling

Raindrops are falling at an average rate of 30 drops per square inch per minute.

  1. What is the chance that a particular square inch is not hit by any drops during a given 10-second period ?
  2. If one draws a circle of radius 2 inches on the ground, what is the chance that 4 or more drops hit inside the circle over a two-minute period?
  3. If each drop is a big drop with probability 2/3 and a small drop with probability 1/3, independent of the other drops, what is the chance that during 10 seconds a particular square inch gets hit by precisely four big drops and five small ones?

[Pitman p. 236, #17, Modified by Mattingly]

Overloading an Elevator

A new elevator in a large hotel is designed to carry about 30 people, with a total weight of up to 5000 lbs. More than 5000 lbs. overloads the elevator. The average weight of guests at the hotel is 150 lbs., with a standard deviation of 55 lbs. Suppose 30 of the hotel’s guests get into the elevator. Assuming the weights of the guests are independent random variables, what is the chance of overloading the elevator ? Give your approximate answer as a decimal.


[Pitman p 204, # 19]
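The central-limit computation can be carried out in a few lines; the sketch below (my addition, not part of the problem) uses `math.erf` for the normal c.d.f.

```python
import math

n, mu, sd = 30, 150.0, 55.0
total_mu = n * mu                 # mean of the total weight
total_sd = sd * math.sqrt(n)      # sd of the total weight
z = (5000 - total_mu) / total_sd
p_overload = 0.5 * (1 - math.erf(z / math.sqrt(2)))
print(f"z = {z:.3f}   P(overload) = {p_overload:.4f}")
```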

Indicator Functions and Expectations – II

Let \(A\) and \(B\) be two events and let \(\mathbf{1}_A\) and \(\mathbf{1}_B\) be the associated indicator functions. Answer the following questions in terms of \(\mathbf{P}(A)\), \(\mathbf{P}(B)\), \(\mathbf{P}(B \cup A)\) and \(\mathbf{P}(B \cap A)\).

  1. Describe the distribution of \( \mathbf{1}_A\).
  2. What is \(\mathbf{E} \mathbf{1}_A\) ?
  3. Describe the distribution of \(\mathbf{1}_A \mathbf{1}_B\).
  4. What is \(\mathbf{E}(\mathbf{1}_A \mathbf{1}_B)\) ?

The indicator function of an event \(A\) is the random variable which has range \(\{0,1\}\) such that

\[ \mathbf{1}_A(x) = \begin{cases} 1 & \text{if $x \in A$}\\ 0 & \text{if $x \not \in A$} \end{cases}\]

Ordered Random Variables

Suppose \(X\) and \(Y\) are two random variables such that \(X \geq Y\).

  1. For a fixed number \(T\), which would be greater, \(\mathbf{P}(X \leq T) \) or \(\mathbf{P}(Y \leq T) \) ?
  2. What if \(T\) is a random variable ? (If it helps you think about the problem, assume \(T\) takes values in \(\{1,\cdots,n\}\). )

Coin tosses: independence and sums

A fair coin is tossed three times. Let \(X\) be the number of heads on the first two tosses, \(Y\) the number of heads on the last two tosses.

  1. Make a table showing the joint distribution of \(X\) and \(Y\).
  2. Are \(X\) and \(Y\)  independent ?
  3. Find the distribution of \(X+Y\).
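Because there are only 8 equally likely outcomes, the joint table can be checked by brute-force enumeration. The sketch below (my addition, not part of the exercise) builds the table and spot-checks independence at one cell.

```python
from collections import Counter
from itertools import product

joint = Counter()
for toss in product("HT", repeat=3):
    x = toss[0:2].count("H")   # heads on the first two tosses
    y = toss[1:3].count("H")   # heads on the last two tosses
    joint[(x, y)] += 1

for (x, y), c in sorted(joint.items()):   # each of the 8 outcomes has prob 1/8
    print(f"P(X={x}, Y={y}) = {c}/8")

# Spot-check independence at one cell of the table
p_x0 = sum(c for (x, _), c in joint.items() if x == 0) / 8
p_y2 = sum(c for (_, y), c in joint.items() if y == 2) / 8
print("P(X=0,Y=2) =", joint[(0, 2)] / 8, " vs  P(X=0)P(Y=2) =", p_x0 * p_y2)
```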

Defective Machines


Suppose that the probability that an item produced by a certain machine will be defective is 0.12.

  1.  Find the probability (exactly)  that a sample of 10 items will contain at most 1 defective item.
  2. Use the Poisson to approximate the preceding probability. Compare your two answers.

 

[Inspired Ross, p. 151,  example 7b ]
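The two probabilities can be compared directly in code; the sketch below (my addition, not part of the exercise) computes the exact binomial answer and the Poisson approximation with \(\lambda = np\).

```python
import math

n, p = 10, 0.12

def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

exact = binom_pmf(0) + binom_pmf(1)       # P(at most 1 defective), exactly

lam = n * p                               # Poisson approximation with mean np
poisson_approx = math.exp(-lam) * (1 + lam)

print(f"exact: {exact:.6f}   Poisson approximation: {poisson_approx:.6f}")
```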

Boxes without toys

A cereal company advertises a prize in every box of its cereal. In fact, only about 95% of the boxes have a prize in them. If a family buys one box of this cereal every week for a year, estimate the chance that they will collect more than 45 prizes. What assumptions are you making ?

 

[Pitman p122, # 9]

Picking a box then a ball

Suppose that there are two boxes, labeled odd and even. The odd box contains three balls numbered 1,3,5 and the even box contains two balls labeled 2,4. One of the boxes is picked randomly by tossing a fair coin.

  1. What is the probability that a 3 is chosen ?
  2. What is the probability a number less than or equal to 2 is chosen ?
  3. The above procedure produces a distribution on \(\{1,2,3,4,5\}\); how does it compare to picking a number uniformly (with equal probability) ?


[Pitman p 37, example 5]

Finding a good phone

At the London station there are three pay phones which accept 20p coins. One never works, another always works, while the third works with probability 1/2. On my way to London for the day, I wish to identify the reliable phone, so that I can use it on my return. The station is empty and I have just three 20p coins. I try one phone and it doesn’t work. I try another twice in succession and it works both times. What is the probability that this second phone is the reliable one ?


[Suhov and Kelbert, p.10, problem 1.9]
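This Bayes computation can be checked by enumerating the six equally likely ways of assigning the three phones to the order in which they are tried. The sketch below (my addition, not part of the exercise) weights each assignment by the likelihood of the observed data; exact rational arithmetic avoids rounding.

```python
from fractions import Fraction
from itertools import permutations

# Probability that each phone works on a given try
phones = {"never": Fraction(0), "reliable": Fraction(1), "half": Fraction(1, 2)}

# Data: the first phone tried failed once; the second worked twice.
num = Fraction(0)   # total likelihood of assignments with second = reliable
den = Fraction(0)   # total likelihood over all six equally likely assignments
for first, second, _third in permutations(phones):
    like = (1 - phones[first]) * phones[second] ** 2
    den += like
    if second == "reliable":
        num += like

posterior = num / den
print("P(second phone is reliable | data) =", posterior)
```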

Meeting in a Tournament

A tennis tournament is organized for \(2^n\) players where each round is single elimination with \(n\) rounds. Two players are chosen at random.

  1. What is the chance that they meet in the first round or second round ?
  2. What is the chance they meet in the final or semi-final ?
  3. What is the chance they do not meet at all ?

 

[Suhov and Kelbert, p4 problem 1.2]

Betting with Coin Flips

Alice and Bob flip a coin repeatedly. Each time there is a head Bob gets a dollar and each time there is a tail Alice gets a dollar.

  1. What is the probability that Bob and Alice have exactly the same amount of money after \(2n\) flips ?
  2. What is the chance that Alice has more money after \(2n+1\) flips ?

Chance of Testing Positive

In a certain population of people 5% have a disease. Bob’s roadside clinic uses a test for the disease which has a 97% chance of (correctly) returning a positive if one has the disease and a 25% chance of (incorrectly) returning a positive if one doesn’t have the disease. If a random person is given the test, what is the chance that the result is positive ?

Now let \(\alpha\) be the chance the test returns a positive if one doesn’t have the disease. (Leave the chance that the test correctly returns a positive if one has the disease at 97%.) For what value of \(\alpha\) is the chance that the test is correct equal to 5% for a randomly chosen person ?
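The first question is a total-probability computation; here is a two-line check (mine, not part of the problem).

```python
p_disease = 0.05
p_pos_given_disease = 0.97
p_pos_given_healthy = 0.25

# Total probability of a positive result for a randomly chosen person
p_pos = p_disease * p_pos_given_disease + (1 - p_disease) * p_pos_given_healthy
print(f"P(positive) = {p_pos:.4f}")
```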

High Card Wins

Bob and Alice each have a box containing 3 numbered cards. Bob’s box has cards numbered 2,5 and 9. Alice’s box has cards numbered 3,4 and 8. Notice that the average value of the card in each box is the same. If each draws a card uniformly from their box, find the probability Alice wins. What is the probability Bob wins ?
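With only \(3 \times 3 = 9\) equally likely pairs of draws, enumeration settles the question; a short check (my addition, not part of the exercise):

```python
from itertools import product

bob, alice = [2, 5, 9], [3, 4, 8]
pairs = list(product(bob, alice))   # 9 equally likely (Bob, Alice) draws
p_alice = sum(a > b for b, a in pairs) / len(pairs)
p_bob = sum(b > a for b, a in pairs) / len(pairs)
print(f"P(Alice wins) = {p_alice:.4f}   P(Bob wins) = {p_bob:.4f}")
```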

Algebras and Conditioning


Consider a deck with three cards numbered 1, 2, and 3. Furthermore, assume that the 1 and 2 cards are colored red and the 3 card is colored black. Two of the cards are drawn without replacement. Let \(D_1\) be the first card drawn and \(D_2\) be the second card drawn. Let \(T\) be the sum of the two cards drawn and let \(N\) be the number of red cards drawn.

  1. Write down the algebra of all possible events on this probability space.
  2. What is the algebra of events generated by \(T\), which we will denote \(\mathcal{A}(T)\) ?
  3. What is the algebra of events generated by \(N\), which we will denote \(\mathcal{A}(N)\) ?
  4. Is \(T\) adapted to \(\mathcal{A}(N)\) ? Explain in terms of the above algebras.
  5. Is \(N\) adapted to \(\mathcal{A}(T)\) ? Explain in terms of the above algebras.
  6. What is \[ \mathbf{E} [ N \,|\, \mathcal{A}(T)] ? \]
  7. What is \[ \mathbf{E} [ T \,|\, \mathcal{A}(N)] ? \]


Binomial with a random parameter

Let \(X\) be binomial with parameters \(n\), which is constant, and \(\Theta\), which is distributed uniformly on \( (0,1)\).

  1. Find \(\mathbf{E}(s^X | \Theta)\)  for any \(s\).
  2. Show that for any \(s\)
    \[ \mathbf{E} ( s^X) = \frac{1}{n+1} \big(\frac{1-s^{n+1}}{1-s} \big)\]
    Use this to conclude that \(X\) is distributed uniformly on the set \(\{0,1,2, \cdots, n\}\).

 

Limit theorems via generating functions

Use the results on generating functions and limit theorems which can be found here to answer the following questions.

  1. Let \(Y_n\) be uniform on the set \(\{1,2,3,\cdots,n\}\). Find the moment generating function of \(\frac1n Y_n\), which we will call \(M_n(t)\). Then show that as \(n \rightarrow \infty\),
    \[ M_n(t) \rightarrow \frac{e^t -1}{t}\]
    Lastly, identify this limiting moment generating function as that of a known random variable. Comment on why this makes sense.
  2. Let \(X_n\) be distributed as a binomial with parameters \(n\) and \(p_n=\lambda/n\).   By using the probability generating function for \(X_n\), show that \(X_n\) converges to a Poisson random variable with parameter \(\lambda\) as \(n \rightarrow \infty\).

[Adapted from Stirzaker, p 318]
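The limit in part 1 can be watched numerically, since \(M_n(t)\) is a Riemann sum. The sketch below (my addition, not part of the exercise) evaluates it at an arbitrary point \(t\) for increasing \(n\).

```python
import math

def mgf_scaled_uniform(n, t):
    # M_n(t) = E[exp(t Y_n / n)] with Y_n uniform on {1, ..., n}
    return sum(math.exp(t * k / n) for k in range(1, n + 1)) / n

t = 1.7   # an arbitrary test point
limit = (math.exp(t) - 1) / t
for n in (10, 100, 10_000):
    print(n, mgf_scaled_uniform(n, t), "   limit:", limit)
```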

Basic generating functions

In each example below find the probability generating function (p.g.f.) or moment generating function (m.g.f.) of the random variable \(X\). (Show your work!) When both are asked for, recall the relationship between the m.g.f. and p.g.f.; you only need to do the calculation for one to find both.

  1.  \(X\) is normal mean \(\mu\) and variance \(\sigma^2\). Find m.g.f.
  2. \(X\) is uniform on \( (0,a)\). Find m.g.f.
  3. \(X\) is Bernoulli with parameter \(p\). Find p.g.f. and m.g.f.
  4. \(X\) is exponential with parameter \(\lambda\). Find m.g.f.
  5.  \(X\) is geometric with parameter \(p\). Find p.g.f. and m.g.f.
  6. \(X=a+bY\) where \(Y\) has probability generating function \(G(s)\). Find m.g.f.

 

Random Sum of Random Variables

Let \(\{X_r : r=1,2,3,\cdots\}\) be a collection of i.i.d. random variables. Let \(G(s)\) be the generating function of \(X_1\) (i.e. \(G(s)=\mathbf{E} (s^{X_1})\) ), and hence of each of the \(X_r\)’s. Let \(N\) be an additional random variable taking values in the non-negative integers which is independent of all of the \(X_r\). Let \(H(s)\) be the generating function of \(N\).

  1. Define the random variable \[ T=\sum_{k=1}^N X_k\] where \(T=0\) if \(N=0\). For any fixed \(s>0\), calculate \( \mathbf{E}[ s^T | N]\). Show that the generating function of \(T\) is \(H(G(s)) \).
  2. Assume that each claim that a given insurance company pays is independent and distributed as an exponential random variable with parameter \(\lambda\). Let the number of claims in a given year be distributed as a geometric random variable with parameter \(p\). What is the moment generating function of the total amount of money paid out in a given year ? Use your answer to identify the distribution of the total money paid out in a given year.
  3. Looking back at the previous part of the question, contrast your answer with the result of adding a nonrandom number of exponentials together.

Linear regression

Consider the following model:

\(X_1,…,X_n \stackrel{iid}{\sim} f(x), \quad Y_i = \theta X_i + \varepsilon_i, \quad \varepsilon_i \stackrel{iid}{\sim} \mbox{N}(0,\sigma^2).\)

  1. Compute \({\mathbf E }(Y \mid X)\)
  2. Compute \({\mathbf E }(\varepsilon \mid X)\)
  3. Compute \({\mathbf E }( \varepsilon)\)
  4. Show \( \theta = \frac{{\mathbf E}(XY)}{{\mathbf E}(X^2)}\)

Clinical trial

Let \(X\) be the number of patients in a clinical trial with a successful outcome. Let \(P\) be the probability of success for an individual patient. We assume before the trial begins that \(P\) is uniform on \([0,1]\). Compute

  1. \(f(P \mid X)\)
  2. \( {\mathbf E}( P \mid X)\)
  3. \( \mathrm{Var}( P \mid X)\)

Order statistics II

Suppose \(X_1, … , X_{17}\) are iid uniform on \( (.5,.8) \). What is \({\mathbf{E}} [X_{(k)}] \) ?

Order statistics I

Suppose \(X_1, … , X_n \stackrel{iid}{\sim} U(0,1) \). How large must \(n\) be to have that \({\mathbf{P}}(X_{(n)} \geq .95) \geq 1/2\) ?
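Once you have derived the c.d.f. of the maximum (for iid uniforms, \(\mathbf{P}(X_{(n)} \leq x) = x^n\)), the threshold can be found by a brute-force search; a short check (my addition, not part of the exercise):

```python
# P(X_(n) <= x) = x^n for the max of n iid U(0,1), so search for the
# smallest n with 1 - 0.95**n >= 1/2.
n = 1
while 1 - 0.95**n < 0.5:
    n += 1
print("smallest n:", n)
```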

Beta-binomial

You have a sequence of coin flips \(X_1,…,X_n\) drawn iid from a Bernoulli distribution with unknown parameter \(p\) and known fixed \(n\). Assume a priori that the coin’s parameter \(p\) follows a Beta distribution with parameters \(\alpha,\beta\).

  1. Given the sequence  \(X_1,…,X_n\) what is the posterior pdf of \(p\) ?
  2. For what value of \(p\) is the maximum of the posterior pdf attained ?

Hint: If \(X\) is distributed Bernoulli(p) then for \(x=1,0\) one has \(P(X=x)=p^x(1-p)^{(1-x)}\). Furthermore, if \(X_1,X_2\) are i.i.d. Bernoulli(p) then
\[P(X_1=x_1, X_2=x_2 )=P(X_1=x_1)P(X_2=x_2 )=p^{x_1}(1-p)^{(1-x_1)}p^{x_2}(1-p)^{(1-x_2)}\]

Conditioning and Polya’s urn

An urn contains 1 black and 2 white balls. One ball is drawn at random and its color noted. The ball is replaced in the urn, together with an additional ball of its color. There are now four balls in the urn. Again, one ball is drawn at random from the urn, then replaced along with an additional ball of its color. The process continues in this way.

  1. Let \(B_n\) be the number of black balls in the urn just before the \(n\)th ball is drawn. (Thus \(B_1= 1\).) For \(n \geq 1\), find \(\mathbf{E} (B_{n+1} | B_{n}) \).
  2. For \(n \geq 1\),  find \(\mathbf{E} (B_{n}) \). [Hint: Use induction based on the previous answer and the fact that \(\mathbf{E}(B_1) =1\)]
  3.   For \(n \geq 1\), what is the expected proportion of black balls in the urn just before the \(n\)th ball is drawn ?

 

[From Pitman p 408, #6]
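The urn scheme is easy to simulate; the sketch below (my addition, not part of the exercise) estimates \(\mathbf{E}(B_n)\) and the expected proportion for a few values of \(n\), for comparison with parts 2 and 3. Note that the urn holds \(n+2\) balls just before the \(n\)th draw.

```python
import random

random.seed(3)
n_draws, trials = 10, 50_000
totals = [0] * (n_draws + 1)   # totals[n] accumulates B_n over trials

for _ in range(trials):
    black, white = 1, 2
    for n in range(1, n_draws + 1):
        totals[n] += black   # B_n: black balls just before the n-th draw
        if random.random() < black / (black + white):
            black += 1
        else:
            white += 1

for n in (1, 5, 10):
    mean_bn = totals[n] / trials
    print(f"E[B_{n}] = {mean_bn:.3f}   proportion = {mean_bn / (n + 2):.3f}")
```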

Car tires

The air pressure in the left and right front tires of a car are random variables \(X\) and \(Y\), respectively. Tires should be filled to 26psi. The joint pdf is

\( f(x,y) = K(x^2+y^2), \quad 20 \leq x,y \leq 30 \)

  1. What is \(K\) ?
  2. Are the random variables independent ?
  3. What is the probability that both tires are underfilled ?
  4. What is the probability that \( |X-Y| \leq 3 \) ?
  5. What are the marginal densities ?
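Parts 1 and 3 can be checked with a crude two-dimensional midpoint rule; the sketch below (my addition, with an arbitrary grid size) normalizes the density numerically and then integrates over the underfilled region.

```python
# Midpoint-rule normalization of f(x,y) = K (x^2 + y^2) on [20,30] x [20,30]
steps = 400
h = 10 / steps

mass = 0.0
p_under_unnorm = 0.0
for i in range(steps):
    x = 20 + (i + 0.5) * h
    for j in range(steps):
        y = 20 + (j + 0.5) * h
        w = (x * x + y * y) * h * h
        mass += w
        if x < 26 and y < 26:      # both tires under 26 psi
            p_under_unnorm += w

K = 1 / mass
p_under = K * p_under_unnorm
print("K =", K, "   P(both underfilled) =", p_under)
```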

Joint of min and max

Let \(X_1,…,X_n \stackrel{iid}{\sim} \mbox{Exp}(\lambda) \)

Let \(V = \mbox{min}(X_1,…,X_n)\) and  \(W = \mbox{max}(X_1,…,X_n)\).

What is the joint distribution of \(V,W\) ? Are they independent ?

Joint density part 1

Let \(X\) and \(Y\) have joint density

\(f(x,y) = 90(y-x)^8, \quad 0<x<y<1\)

  1. State the marginal distribution for \(X\)
  2. State the marginal distribution for \(Y\)
  3. Are these two random variables independent?
  4. What is \(\mathbf{P}(Y > 2X)\) ?
  5. Fill in the blanks “The density \(f(x,y)\) above   is the joint density of the  _________ and __________ of ten independent uniform \((0,1)\) random variables.”

[Adapted from Pitman pg 354]
