
Author Archives: Jonathan Mattingly

Probability Density Example

Suppose \(X\) takes values in \((0,1)\) and has a density

\[f(x)=\begin{cases}c x^2 (1-x)^2 \qquad &x\in(0,1)\\  0 & x \not \in (0,1)\end{cases}\]

for some \(c>0\).

  1. Find \( c \).
  2. Find \(\mathbf{E}(X)\).
  3. Find \(\mathrm{Var}(X) \).
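After doing the three integrals by hand, a quick numerical sketch like the following can be used to check the answers (the midpoint rule and the grid size are arbitrary choices, not part of the problem):

```python
# Sketch of a numerical check for c, E(X), and Var(X).
n = 100_000
h = 1.0 / n
xs = [(i + 0.5) * h for i in range(n)]

m0 = sum(x**2 * (1 - x)**2 for x in xs) * h        # integral of x^2 (1-x)^2 over (0,1)
c = 1.0 / m0                                       # density must integrate to 1
mean = sum(c * x**3 * (1 - x)**2 for x in xs) * h  # E(X)
m2 = sum(c * x**4 * (1 - x)**2 for x in xs) * h    # E(X^2)
var = m2 - mean**2                                 # Var(X)
print(c, mean, var)
```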

 

Infinite Mean

Suppose that \(X\) is a random variable whose density is

\[f(x)=\frac{1}{2(1+|x|)^2} \quad x \in (-\infty,\infty)\]

 

  1. Draw a graph of \(f(x)\).
  2. Find \(\mathbf{P}(-1 <X<2)\).
  3. Find \(\mathbf{P}(X>1)\).
  4. Is \(\mathbf{E}(X) \) defined? Explain.
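Parts 2 and 3 can be sanity-checked numerically (midpoint rule; grid size is an arbitrary choice). Part 4 concerns the divergence of \(\int |x| f(x)\,dx\), which no finite computation settles, so it is left to hand analysis:

```python
def f(x):
    # the given density
    return 1.0 / (2.0 * (1.0 + abs(x))**2)

def midpoint_integral(a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

p_between = midpoint_integral(-1.0, 2.0)    # P(-1 < X < 2)
# f is symmetric about 0, so P(X > 0) = 1/2; subtract the mass on (0, 1)
p_tail = 0.5 - midpoint_integral(0.0, 1.0)  # P(X > 1)
print(p_between, p_tail)
```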

Raindrops are falling

Raindrops are falling at an average rate of 30 drops per square inch per minute.

  1. What is the chance that a particular square inch is not hit by any drops during a given 10-second period?
  2. If one draws a circle of radius 2 inches on the ground, what is the chance that 4 or more drops hit inside the circle over a two-minute period?
  3. If each drop is a big drop with probability 2/3 and a small drop with probability 1/3, independent of the other drops, what is the chance that during 10 seconds a particular square inch gets hit by precisely four big drops and five small ones?

[Pitman p. 236, #17, Modified by Mattingly]
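Treating drop hits as Poisson counts in space and time (as the problem intends), all three parts reduce to Poisson probabilities. A sketch of the arithmetic (the helper name `poisson_pmf` is ours):

```python
from math import exp, factorial, pi

def poisson_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

# Part 1: 30 drops per sq inch per minute -> mean 5 drops
# per square inch in a 10-second window
p_none = poisson_pmf(0, 5.0)

# Part 2: the circle has area 4*pi square inches; over two minutes
# the mean count is 30 * 4*pi * 2
mu_circle = 30 * 4 * pi * 2
p_4_or_more = 1.0 - sum(poisson_pmf(k, mu_circle) for k in range(4))

# Part 3: thinning -- big and small drops form independent Poisson
# counts with means 5*(2/3) and 5*(1/3) in the 10-second window
p_exact = poisson_pmf(4, 5 * 2 / 3) * poisson_pmf(5, 5 / 3)
print(p_none, p_4_or_more, p_exact)
```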

Overloading an Elevator

A new elevator in a large hotel is designed to carry about 30 people, with a total weight of up to 5000 lbs. More than 5000 lbs. overloads the elevator. The average weight of guests at the hotel is 150 lbs., with a standard deviation of 55 lbs. Suppose 30 of the hotel’s guests get into the elevator. Assuming the weights of the guests are independent random variables, what is the chance of overloading the elevator? Give your approximate answer as a decimal.


[Pitman p. 204, #19]
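The intended route is the central limit theorem: the total weight is approximately normal with mean \(30 \cdot 150\) and standard deviation \(55\sqrt{30}\). A sketch of that computation using the error function:

```python
from math import erf, sqrt

def normal_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

n, mean_w, sd_w, capacity = 30, 150.0, 55.0, 5000.0
total_mean = n * mean_w        # expected total weight
total_sd = sd_w * sqrt(n)      # sd of the total (independence)
z = (capacity - total_mean) / total_sd
p_overload = 1.0 - normal_cdf(z)
print(p_overload)
```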

Indicator Functions and Expectations – II

Let \(A\) and \(B\) be two events and let \(\mathbf{1}_A\) and \(\mathbf{1}_B\) be the associated indicator functions. Answer the following questions in terms of \(\mathbf{P}(A)\), \(\mathbf{P}(B)\), \(\mathbf{P}(B \cup A)\) and \(\mathbf{P}(B \cap A)\).

  1. Describe the distribution of \( \mathbf{1}_A\).
  2. What is \(\mathbf{E} \mathbf{1}_A\)?
  3. Describe the distribution of \(\mathbf{1}_A \mathbf{1}_B\).
  4. What is \(\mathbf{E}(\mathbf{1}_A \mathbf{1}_B)\)?

The indicator function of an event \(A\) is the random variable which has range \(\{0,1\}\) such that

\[ \mathbf{1}_A(x) = \begin{cases} 1 & \text{if $x \in A$}\\ 0 & \text{if $x \not\in A$} \end{cases}\]

Ordered Random Variables

Suppose \(X\) and \(Y\) are two random variables such that \(X \geq Y\).

  1. For a fixed number \(T\), which would be greater, \(\mathbf{P}(X \leq T) \) or \(\mathbf{P}(Y \leq T) \)?
  2. What if \(T\) is a random variable? (If it helps you think about the problem, assume \(T\) takes values in \(\{1,\cdots,n\}\).)

Coin tosses: independence and sums

A fair coin is tossed three times. Let \(X\) be the number of heads on the first two tosses, \(Y\) the number of heads on the last two tosses.

  1. Make a table showing the joint distribution of \(X\) and \(Y\).
  2. Are \(X\) and \(Y\) independent?
  3. Find the distribution of \(X+Y\).
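A brute-force enumeration of the eight equally likely outcomes reproduces the joint table and tests independence; a sketch (variable names are ours):

```python
from itertools import product
from collections import Counter

counts = Counter()
for tosses in product("HT", repeat=3):   # 8 equally likely outcomes
    x = tosses[:2].count("H")            # heads on first two tosses
    y = tosses[1:].count("H")            # heads on last two tosses
    counts[(x, y)] += 1

p = {xy: c / 8 for xy, c in counts.items()}  # joint distribution

# marginals
px, py = Counter(), Counter()
for (x, y), prob in p.items():
    px[x] += prob
    py[y] += prob

# X and Y are independent iff the joint factors everywhere
independent = all(
    abs(p.get((x, y), 0.0) - px[x] * py[y]) < 1e-12
    for x in px for y in py
)
print(p, independent)
```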

Defective Machines

 

Suppose that the probability that an item produced by a certain machine will be defective is 0.12.

  1. Find the probability (exactly) that a sample of 10 items will contain at most 1 defective item.
  2. Use the Poisson approximation to estimate the preceding probability. Compare your two answers.

 

[Inspired by Ross, p. 151, example 7b]
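A sketch of both computations side by side:

```python
from math import comb, exp

n, p = 10, 0.12

# exact binomial: P(0 defectives) + P(1 defective)
exact = comb(n, 0) * (1 - p)**n + comb(n, 1) * p * (1 - p)**(n - 1)

# Poisson approximation with mean mu = n * p
mu = n * p
approx = exp(-mu) * (1 + mu)
print(exact, approx)
```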

Boxes without toys

A cereal company advertises a prize in every box of its cereal. In fact, only about 95% of the boxes have a prize in them. If a family buys one box of this cereal every week for a year, estimate the chance that they will collect more than 45 prizes. What assumptions are you making?

 

[Pitman p. 122, #9]

Picking a box then a ball

Suppose that there are two boxes, labeled odd and even. The odd box contains three balls numbered 1,3,5 and the even box contains two balls labeled 2,4. One of the boxes is picked randomly by tossing a fair coin.

  1. What is the probability that a 3 is chosen?
  2. What is the probability a number less than or equal to 2 is chosen?
  3. The above procedure produces a distribution on \(\{1,2,3,4,5\}\). How does it compare to picking a number uniformly (with equal probability)?


[Pitman p. 37, example 5]

Finding a good phone

At the London station there are three pay phones which accept 20p coins. One never works, another always works, while the third works with probability 1/2. On my way to London for the day, I wish to identify the reliable phone, so that I can use it on my return. The station is empty and I have just three 20p coins. I try one phone and it doesn’t work. I try another twice in succession and it works both times. What is the probability that this second phone is the reliable one?


[Suhov and Kelbert, p. 10, problem 1.9]
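One way to organize the Bayes computation is to enumerate the six equally likely ordered choices of (first phone tried, second phone tried) and weight each by the likelihood of the observed evidence. This uniform-prior reading of the problem is an assumption, and the labels are ours:

```python
from itertools import permutations

# label each phone by its probability of working on a single try
phones = {"never": 0.0, "reliable": 1.0, "half": 0.5}

num = 0.0   # P(evidence and the second phone is the reliable one)
den = 0.0   # P(evidence)
for first, second in permutations(phones, 2):
    prior = 1.0 / 6.0                  # ordered pair chosen uniformly
    p_first_fails = 1.0 - phones[first]
    p_second_works_twice = phones[second] ** 2
    like = prior * p_first_fails * p_second_works_twice
    den += like
    if second == "reliable":
        num += like

posterior = num / den   # P(second phone reliable | evidence)
print(posterior)
```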

Meeting in a Tournament

A tennis tournament is organized for \(2^n\) players; each round is single elimination, so there are \(n\) rounds. Two players are chosen at random.

  1. What is the chance that they meet in the first round or second round?
  2. What is the chance they meet in the final or semi-final?
  3. What is the chance they do not meet at all?

 

[Suhov and Kelbert, p. 4, problem 1.2]

Betting with Coin Flips

Alice and Bob flip a coin repeatedly. Each time there is a head, Bob gets a dollar; each time there is a tail, Alice gets a dollar.

  1. What is the probability that Bob and Alice have exactly the same amount of money after \(2n\) flips?
  2. What is the chance that Alice has more money after \(2n+1\) flips?
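Both answers can be checked by brute force for small \(n\); a sketch:

```python
from math import comb

def p_equal(n):
    # P(even after 2n flips) = P(exactly n heads in 2n fair flips)
    return comb(2 * n, n) / 4**n

def p_alice_ahead(n):
    # Alice leads after 2n+1 flips iff tails outnumber heads
    flips = 2 * n + 1
    return sum(comb(flips, k) for k in range(n + 1, flips + 1)) / 2**flips

print(p_equal(2), p_alice_ahead(2))
```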

Chance of Testing Positive

In a certain population of people 5% have a disease. Bob’s roadside clinic uses a test for the disease which has a 97% chance of (correctly) returning a positive if one has the disease and a 25% chance of (incorrectly) returning a positive if one doesn’t have the disease. If a random person is given the test, what is the chance that the result is positive?

Now let \(\alpha\) be the chance the test returns a positive if one doesn’t have the disease. (Leave the chance that the test correctly returns a positive if one has the disease at 97%.) For what value of \(\alpha\) is the chance the test is correct equal to 5% for a randomly chosen person?
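A total-probability sketch for the first question; the helper `p_correct` for the second part encodes one reading of "the test is correct" (right answer for both sick and healthy people), and that interpretation is ours:

```python
# numbers from the problem statement
p_disease = 0.05
p_pos_given_disease = 0.97     # sensitivity
p_pos_given_healthy = 0.25     # false-positive rate

# total probability: P(positive)
p_positive = (p_disease * p_pos_given_disease
              + (1 - p_disease) * p_pos_given_healthy)

def p_correct(alpha):
    # P(test gives the right answer) when the false-positive rate is alpha
    return p_disease * p_pos_given_disease + (1 - p_disease) * (1 - alpha)

print(p_positive)
```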

High Card Wins

Bob and Alice each have a box containing 3 numbered cards. Bob’s box has cards numbered 2, 5, and 9. Alice’s box has cards numbered 3, 4, and 8. Notice that the average value of the cards in each box is the same. If each draws a card uniformly from their box, find the probability Alice wins. What is the probability Bob wins?
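Since each box has only three cards, there are nine equally likely pairs and the probabilities can be enumerated directly:

```python
from itertools import product

alice_cards = [3, 4, 8]
bob_cards = [2, 5, 9]

pairs = list(product(alice_cards, bob_cards))       # 9 equally likely outcomes
p_alice = sum(a > b for a, b in pairs) / len(pairs)
p_bob = sum(b > a for a, b in pairs) / len(pairs)   # no ties are possible
print(p_alice, p_bob)
```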

Taylor Series

Write the first 4 terms of the Taylor series expansion around \(x=0\) for the following functions:

  1. \(\log(1+x)\)
  2. \(e^{a x}\)
  3. \(\frac{1}{1-x}\)

Algebras and Conditioning

 

Consider a deck with three cards numbered 1, 2, and 3. Furthermore, assume that the 1 and 2 cards are colored red and the 3 card is colored black. Two of the cards are drawn without replacement. Let \(D_1\) be the first card drawn and \(D_2\) be the second card drawn. Let \(T\) be the sum of the two cards drawn and let \(N\) be the number of red cards drawn.

  1. Write down the algebra of all possible events on this probability space.
  2. What is the algebra of events generated by \(T\), which we will denote \(\mathcal{A}(T)\)?
  3. What is the algebra of events generated by \(N\), which we will denote \(\mathcal{A}(N)\)?
  4. Is \(T\) adapted to \(\mathcal{A}(N)\)? Explain in terms of the above algebras.
  5. Is \(N\) adapted to \(\mathcal{A}(T)\)? Explain in terms of the above algebras.
  6. What is \[ \mathbf{E} [ N \,|\, \mathcal{A}(T)] ? \]
  7. What is \[ \mathbf{E} [ T \,|\, \mathcal{A}(N)] ? \]


Binomial with a random parameter

Let \(X\) be binomial with parameters \(n\), which is constant, and \(\Theta\), which is distributed uniformly on \((0,1)\).

  1. Find \(\mathbf{E}(s^X | \Theta)\)  for any \(s\).
  2. Show that for any \(s\)
    \[ \mathbf{E} ( s^X) = \frac{1}{n+1} \big(\frac{1-s^{n+1}}{1-s} \big)\]
    Use this to conclude that \(X\) is distributed uniformly on the set \(\{0,1,2, \cdots, n\}\).
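The stated identity can be sanity-checked numerically: average \(\mathbf{E}(s^X \mid \Theta=\theta) = (1-\theta(1-s))^n\) over \(\theta\), then compare with the closed form and with the generating function of the uniform distribution on \(\{0,\dots,n\}\) (the particular \(n\) and \(s\) below are arbitrary test values):

```python
n, s = 7, 0.4    # arbitrary test values

# E(s^X | Theta = t) = (1 - t(1-s))^n ; average over t ~ uniform(0,1)
m = 100_000
h = 1.0 / m
lhs = sum((1 - (i + 0.5) * h * (1 - s))**n for i in range(m)) * h

# claimed closed form
rhs = (1 - s**(n + 1)) / ((n + 1) * (1 - s))

# generating function of the uniform distribution on {0,...,n}
unif = sum(s**k for k in range(n + 1)) / (n + 1)
print(lhs, rhs, unif)
```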

 

Limit theorems via generating functions

Use the results on generating functions and limit theorems which can be found here to answer the following questions.

  1. Let \(Y_n\) be uniform on the set \(\{1,2,3,\cdots,n\}\). Find the moment generating function of \(\frac1n Y_n\), which we will call \(M_n(t)\). Then show that as \(n \rightarrow \infty\),
    \[ M_n(t) \rightarrow \frac{e^t -1}{t}\]
    Lastly, identify this limiting moment generating function as that of a known random variable. Comment on why this makes sense.
  2. Let \(X_n\) be distributed as a binomial with parameters \(n\) and \(p_n=\lambda/n\).   By using the probability generating function for \(X_n\), show that \(X_n\) converges to a Poisson random variable with parameter \(\lambda\) as \(n \rightarrow \infty\).

[Adapted from Stirzaker, p. 318]
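The convergence in part 1 can be watched numerically (the particular \(t\) and \(n\) are arbitrary choices):

```python
from math import exp

def M_n(t, n):
    # mgf of Y_n / n where Y_n is uniform on {1,...,n}
    return sum(exp(t * k / n) for k in range(1, n + 1)) / n

t = 1.3                          # arbitrary test point
limit = (exp(t) - 1) / t
gap = abs(M_n(t, 100_000) - limit)
print(gap)
```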

Basic generating functions

In each example below find the probability generating function (p.g.f.) or moment generating function (m.g.f.) of the random variable \(X\). (Show your work! When both are asked for, recall the relationship between the m.g.f. and p.g.f.; you only need to do the calculation for one to find both.)

  1. \(X\) is normal with mean \(\mu\) and variance \(\sigma^2\). Find the m.g.f.
  2. \(X\) is uniform on \((0,a)\). Find the m.g.f.
  3. \(X\) is Bernoulli with parameter \(p\). Find the p.g.f. and m.g.f.
  4. \(X\) is exponential with parameter \(\lambda\). Find the m.g.f.
  5. \(X\) is geometric with parameter \(p\). Find the p.g.f. and m.g.f.
  6. \(X=a+bY\) where \(Y\) has probability generating function \(G(s)\). Find the m.g.f.

 

Random Sum of Random Variables

Let \(\{X_r : r=1,2,3,\cdots\}\) be a collection of i.i.d. random variables. Let \(G(s)\) be the generating function of \(X_1\) (i.e. \(G(s)=\mathbf{E} (s^{X_1})\)), and hence of each of the \(X_r\)’s. Let \(N\) be an additional random variable taking values in the non-negative integers which is independent of all of the \(X_r\). Let \(H(s)\) be the generating function of \(N\).

  1. Define the random variable \[ T=\sum_{k=1}^N X_k\] where \(T=0\) if \(N=0\). For any fixed \(s>0\), calculate \( \mathbf{E}[ s^T | N]\). Show that the generating function of \(T\) is \(H(G(s))\).
  2. Assume that each claim that a given insurance company pays is independent and distributed as an exponential random variable with parameter \(\lambda\). Let the number of claims in a given year be distributed as a geometric random variable with parameter \(p\). What is the moment generating function of the total amount of money paid out in a given year? Use your answer to identify the distribution of the total money paid out in a given year.
  3. Looking back at the previous part of the question, contrast your answer with the result of adding a non-random number of exponentials together.

Conditioning and Polya’s urn

An urn contains 1 black and 2 white balls. One ball is drawn at random and its color noted. The ball is replaced in the urn, together with an additional ball of its color. There are now four balls in the urn. Again, one ball is drawn at random from the urn, then replaced along with an additional ball of its color. The process continues in this way.

  1. Let \(B_n\) be the number of black balls in the urn just before the \(n\)th ball is drawn. (Thus \(B_1= 1\).) For \(n \geq 1\), find \(\mathbf{E} (B_{n+1} | B_{n}) \).
  2. For \(n \geq 1\),  find \(\mathbf{E} (B_{n}) \). [Hint: Use induction based on the previous answer and the fact that \(\mathbf{E}(B_1) =1\)]
  3. For \(n \geq 1\), what is the expected proportion of black balls in the urn just before the \(n\)th ball is drawn?

 

[From Pitman p. 408, #6]
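A simulation sketch for checking the answer to part 2 (the horizon, trial count, and seed are arbitrary choices):

```python
import random

random.seed(0)   # arbitrary seed for reproducibility

def black_before_draw(n):
    # evolve the urn through n-1 draws; return B_n
    black, total = 1, 3
    for _ in range(n - 1):
        if random.random() < black / total:
            black += 1
        total += 1
    return black

n, trials = 10, 200_000
est = sum(black_before_draw(n) for _ in range(trials)) / trials
print(est)   # compare with the formula obtained in part 2
```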

Box-Muller I

Let \(U_1\) and \(U_2\) be independent random variables distributed uniformly on \( (0,1) \).

Define \((Z_1,Z_2)\) by

\[Z_1=\sqrt{ -2 \log(U_1) }\cos( 2 \pi U_2) \]

\[Z_2=\sqrt{ -2 \log(U_1) }\sin( 2 \pi U_2) \]

  1. Find the joint density of \((Z_1, Z_2)\).
  2. Are \(Z_1\) and \(Z_2\) independent? Why?
  3. What is the marginal density of \(Z_1\) and \(Z_2\)? Do you recognize it?
  4. Reflect on the implications of the previous answer for generating an often needed class of random variables on a computer.

Hint: To eliminate \(U_1\) write the formula for  \(Z_1^2 + Z_2^2\).
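The transformation itself is easy to implement, which is the point of part 4. A simulation sketch checking the first two moments of the output (seed and sample size are arbitrary):

```python
import math
import random

random.seed(1)

def box_muller():
    # 1 - random() keeps u1 strictly positive for the log
    u1 = 1.0 - random.random()
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

samples = [z for _ in range(100_000) for z in box_muller()]
mean = sum(samples) / len(samples)
var = sum(z * z for z in samples) / len(samples) - mean**2
print(mean, var)
```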

 

Simple Joint density

Let \(X\) and \(Y\) have joint density

\[ f(x,y) = c e^{-2x -3 y} \quad (x,y>0)\]

for some \(c>0\) and \(f(x,y)=0\) otherwise. Find:

  1. the correct value of \(c\).
  2. \(\mathbf{P}( X \leq x, Y \leq y)\)
  3. \(f_X(x)\)
  4. \(f_Y(y)\)
  5. Are \(X\) and \(Y\) independent? Explain your reasoning.

Joint arrival times

Let \(T_1\) and \(T_5\) be the times of the first and fifth arrival in a Poisson process with rate \(\lambda\). Find the joint density of \(T_1\) and \(T_5\).

 

[Pitman p. 355, #12]

Maxes and Mins

Let \(X_1,\cdots, X_n\) be random variables which are i.i.d. \(\text{uniform}(0,1)\). Let \(X_{(1)},\cdots, X_{(n)}\) be the associated order statistics.

  1. Find the distribution of \(X_{(n/2)}\) when \(n\) is even.
  2. Find \(\mathbf{E} [ X_{(n)} - X_{(1)} ]\).
  3. Find the distribution of \(R=X_{(n)} - X_{(1)}\).

Moment Generating Functions: Bernoulli and More

  1. Find the moment generating function for a \(\text{Bernoulli}(p)\) random variable.
  2. Recalling that if \(X\) is distributed as \(\text{Binomial}(n,p)\) it can be written as the sum of appropriate Bernoulli random variables, find the moment generating function for \(X\).
  3. Use the solution to the previous question to find the variance of \(X\). Show your work!

 

Change of Variable: Gaussian

Let \(Z\) be a standard normal random variable (i.e. with distribution \(N(0,1)\)). Find the formula for the density of each of the following random variables.

  1. \(3Z+5\)
  2. \(|Z|\)
  3. \(Z^2\)
  4. \(\frac1Z\)
  5. \(\frac1{Z^2}\)

[Based on Pitman p. 310, #10]

Change of variable: Weibull distribution

A random variable \(T\) has the \(\text{Weibull}(\lambda,\alpha)\) distribution if it has probability density function

\[f(t)=\lambda \alpha t^{\alpha-1} e^{-\lambda t^\alpha} \qquad (t>0)\]

where \(\lambda >0\) and \(\alpha>0\).

  1. Show that \(T^\alpha\) has an \(\text{exponential}(\lambda)\) distribution.
  2. Show that if \(U\) is a \(\text{uniform}(0,1)\) random variable, then
    \[ T=\Big( - \frac{\log(U)}{\lambda}\Big)^{\frac1\alpha}\]
    has a \(\text{Weibull}(\lambda,\alpha)\)  distribution.
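Part 2 is the inverse-transform recipe for simulating a Weibull. A sketch that also checks part 1 empirically, since \(T^\alpha\) should then be exponential with mean \(1/\lambda\) (the parameters and seed below are arbitrary):

```python
import math
import random

random.seed(2)
lam, alpha = 2.0, 1.5   # arbitrary parameters

def sample_weibull():
    u = 1.0 - random.random()   # strictly positive uniform
    return (-math.log(u) / lam) ** (1.0 / alpha)

# part 1 check: T**alpha should be exponential(lam), mean 1/lam
n = 200_000
mean_t_alpha = sum(sample_weibull() ** alpha for _ in range(n)) / n
print(mean_t_alpha)
```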

Change of Variable: Uniform

Find the density of :

  1. \(U^2\) if \(U\) is uniform(0,1).
  2. \(U^2\) if \(U\) is uniform(-1,1).
  3. \(U^2\) if \(U\) is uniform(-2,1).

Simple Poisson Calculations

Let \(X\) have a \(\text{Poisson}(\lambda)\) distribution. Calculate:

  1. \(\mathbf{E}(3 X +5)\)
  2. \(\mathbf{Var}(3X +5)\)
  3. \(\mathbf{E}\big[\frac1{1+X} \big]\)

Mixing Poisson Random Variables 1

Assume that  \(X\), \(Y\), and \(Z\) are independent Poisson random variables, each with mean 1. Find

  1. \(\mathbf{P}(X+Y = 4) \)
  2. \(\mathbf{E}[(X+Y)^2]\)
  3. \(\mathbf{P}(X+Y + Z= 4) \)

Random Errors in a Book

A book has 200 pages. The number of mistakes on each page is a Poisson random variable with mean 0.01, and is independent of the number of mistakes on all other pages.

  1. What is the expected number of pages with no mistakes? What is the variance of the number of pages with no mistakes?
  2. A person proofreading the book finds a given mistake with probability 0.9. What is the expected number of pages where this person will find a mistake?
  3. What, approximately, is the probability that the book has two or more pages with mistakes?

 

[Pitman p. 235, #15]

Cards again

Given a well shuffled standard deck of 52 cards, what is the probability of each of the following events? (Think before you jump.)

  1. The 1st card is an ace.
  2. The 15th card is an ace.
  3. The 9th card is a diamond.
  4. The last 5 cards are hearts.
  5. The 17th card is the ace of diamonds and the 14th is the king of spades.
  6. The 5th card is a diamond given that the 50th card is a diamond.
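The point of "think before you jump" is exchangeability: every position in a well shuffled deck is equally likely to hold any given card. A simulation sketch comparing positions 1 and 15 (trial count and seed are arbitrary):

```python
import random

random.seed(3)
deck = ["ace"] * 4 + ["other"] * 48   # only ace / non-ace matters here

trials = 100_000
hits_1 = hits_15 = 0
for _ in range(trials):
    random.shuffle(deck)
    hits_1 += deck[0] == "ace"
    hits_15 += deck[14] == "ace"

p1, p15 = hits_1 / trials, hits_15 / trials
print(p1, p15)   # both should hover near 4/52
```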

 

2nd Moment of Shifted Random Variables

Let \(X\) be a random variable with \(\mathbf{E}(X)=\mu\) and \(\mathbf{Var}(X)=\sigma^2\). Show that for any constant \(a\)

\[\mathbf{E}\big[(X-a)^2\big]=\sigma^2+(\mu-a)^2\]
