Author Archives: Jonathan Mattingly
Wandering Umbrellas
An individual has three umbrellas, some at her office and some at her home. If she is leaving home in the morning (or leaving work at night) and it is raining, she will take an umbrella, if there is one. Otherwise, she gets wet. Assume that, independent of the past, it rains on each trip with probability 0.2.
To formulate a Markov chain, let \(X_n\) be the number of umbrellas at her current location.
- What is the state space for this Markov chain?
- Find the transition probabilities for this Markov chain.
- Calculate the limiting fraction of time she gets wet.
[Durrett, Section 4.7, ex 34]
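A quick Monte Carlo sanity check for the last part (a sketch in Python; the function name and parameters are my own, not part of the exercise):

```python
import random

def wet_fraction(n_umbrellas=3, p_rain=0.2, trips=200_000, seed=1):
    """Simulate the umbrella chain; x = umbrellas at the current location."""
    rng = random.Random(seed)
    x, wet = n_umbrellas, 0  # start with all umbrellas at one location
    for _ in range(trips):
        if rng.random() < p_rain:            # it rains on this trip
            if x == 0:
                wet += 1                     # no umbrella available: she gets wet
                x = n_umbrellas - x          # travel empty-handed
            else:
                x = n_umbrellas - x + 1      # carry one umbrella across
        else:
            x = n_umbrellas - x              # leave all umbrellas behind
    return wet / trips
```

Solving the stationary equations by hand gives a long-run wet fraction of \(p(1-p)/(4-p) = 0.16/3.8 \approx 0.042\), which the simulation can be checked against.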
Conditioning a Poisson Arrival Process
Consider a Poisson process with parameter \(\lambda\). What is the conditional probability that \(N(1) = n\) given that \(N(3) = n\)? (Here, \(N(t)\) is the number of calls which arrive between time 0 and time \(t\).) Do you understand why this probability does not depend on \(\lambda\)?
[Meester ex 7.5.4]
Poisson Thinning
Let \(N(t)\) be a Poisson process with intensity \(\lambda\). For each occurrence, we flip a coin: if heads comes up we label the occurrence green, if tails comes up we label it red. The coin flips are independent and \(p\) is the probability of seeing heads.
- Show that the green occurrences form a Poisson process with intensity \(\lambda p\).
- Connect this with example 2.2.5 from Meester.
- We claim that the red occurrences on the one hand, and the green occurrences on the other hand, form independent Poisson processes. Can you formulate this formally, and prove it, using Example 2.2.5 from Meester once more?
[Meester ex. 7.5.7]
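A simulation sketch to check the first claim empirically (Python; the function name and the particular values of \(\lambda\), \(p\), \(T\) are my own choices): if the green points form a Poisson process, the green count on \([0,T]\) should have mean and variance both equal to \(\lambda p T\).

```python
import random

def thinned_counts(lam=2.0, p=0.3, T=10.0, reps=20_000, seed=2):
    """Estimate mean and variance of the green count on [0, T]
    after thinning a Poisson(lam) process with heads-probability p."""
    rng = random.Random(seed)
    greens = []
    for _ in range(reps):
        t, g = 0.0, 0
        while True:
            t += rng.expovariate(lam)    # exponential inter-arrival times
            if t > T:
                break
            if rng.random() < p:         # independent coin flip: green
                g += 1
        greens.append(g)
    mean = sum(greens) / reps
    var = sum((g - mean) ** 2 for g in greens) / reps
    return mean, var
```

With these parameters \(\lambda p T = 6\), so both returned values should be close to 6.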
Covariance of a Branching Process
Show that for a branching process \( (Z_n)\) with expected offspring \(\mu\) one has
\[\mathbf{E}( Z_n Z_m)= \mu^{n-m} \mathbf{E}( Z_m^2)\]
for \(0\leq m\leq n\).
Basic Branching Process
Consider the branching process with offspring distribution given by \( P(X=0)=\alpha\), \( P(X=1)=\frac23-\alpha\) and \( P(X=2)=\frac13\) for some \(\alpha \in [0,\frac23]\).
- For what values of \(\alpha\) is the process certain to die out?
- For values where there is a probability of surviving forever, what is this probability as a function of \(\alpha\)?
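The extinction probability is the limit of iterating the offspring p.g.f. \(f(s) = \alpha + (\frac23-\alpha)s + \frac13 s^2\) from \(s=0\), so one can check a candidate answer numerically (a sketch in Python; the function name is my own):

```python
def extinction_prob(alpha, iters=5_000):
    """Iterate the offspring p.g.f. f(s) = alpha + (2/3 - alpha)s + s^2/3
    from s = 0; the limit is the extinction probability."""
    s = 0.0
    for _ in range(iters):
        s = alpha + (2 / 3 - alpha) * s + s * s / 3
    return s
```

For instance, solving the fixed-point equation by hand suggests extinction probability \(\min(1, 3\alpha)\), which the iteration can be used to confirm for particular values of \(\alpha\).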
Geometric Branching Process
Consider a branching process with a geometric offspring distribution \( P(X=k) = (1-p)p^k\), for \(k=0,1,2,\dots\). Show that ultimate extinction is certain if \(p \leq \frac12\) and that the probability of extinction is \((1-p)/p \) if \(p > \frac12\).
[Meester ex. 6.6.5]
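The same fixed-point iteration works here: the geometric offspring distribution has p.g.f. \(f(s) = (1-p)/(1-ps)\), and iterating from \(s=0\) converges to the extinction probability. A sketch (Python; function name my own) that can be checked against the claimed answer \((1-p)/p\) for \(p > \frac12\):

```python
def geom_extinction(p, iters=5_000):
    """Iterate the geometric-offspring p.g.f. f(s) = (1-p)/(1 - p*s) from 0."""
    s = 0.0
    for _ in range(iters):
        s = (1 - p) / (1 - p * s)
    return s
```

For \(p = 0.7\) this should return approximately \(3/7\), and for any \(p \leq \frac12\) it should return 1.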
Joint, Marginal and Conditioning
Let \( (X,Y)\) have joint density \(f(x,y) = e^{-y}\), for \(0<x<y\), and \(f(x,y)=0\) elsewhere.
- Are \(X\) and \(Y\) independent?
- Compute the marginal density of \(Y\).
- Show that \(f_{X|Y}(x\,|\,y)=\frac1y \), for \(0<x<y\).
- Compute \(E(X|Y=y)\).
- Use the previous result to find \(E(X)\).
Three Random Variables
Let \(X\), \(Y\), and \(Z\) be independent uniform \( (0,1)\).
- Find the joint density of \(XY\) and \(Z^2\).
- Show that \(P(XY < Z^2) = \frac59\).
[Meester ex. 5.12.25]
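A one-line Monte Carlo check of the second part (Python sketch; the function name is my own):

```python
import random

def prob_xy_less_z2(reps=400_000, seed=4):
    """Monte Carlo estimate of P(XY < Z^2) for independent uniform(0,1) X, Y, Z."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(reps)
        if rng.random() * rng.random() < rng.random() ** 2
    )
    return hits / reps
```

The estimate should land close to \(5/9 \approx 0.556\).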
A joint density example I
Let \( (X,Y) \) have joint density \(f(x,y)=x e^{-x-y}\) when \(x,y>0\) and \(f(x,y)=0\) elsewhere. Are \(X\) and \(Y\) independent?
[Meester ex 5.12.30]
Memory and the Exponential
Let \(X\) have an exponential distribution with parameter \(\lambda\). Show that
\[ P( X> t+ s \,|\, X>s) = P(X>t) \]
for all \(s,t >0\). Explain why one might call this property of the exponential “the lack of memory”.
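One way to see the identity (a sketch using the exponential tail \(P(X>t)=e^{-\lambda t}\)):
\[ P( X> t+ s \,|\, X>s) = \frac{P(X > t+s)}{P(X>s)} = \frac{e^{-\lambda(t+s)}}{e^{-\lambda s}} = e^{-\lambda t} = P(X>t). \]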
Hitting Zero
Consider a random walk making \(2n\) steps, and let \(T\) be the first
return to its starting point, that is
\[T = \min\{1 \leq k \leq 2n : S_k = 0\},\]
and \(T = 0\) if the walk does not return to zero in the first \(2n\) steps.
Show that for all \( 1 \leq k \leq n\) we have
\[P(T = 2k) =\frac1{2k - 1} \binom{2k}{k} 2^{-2k}\]
[Meester Ex 3.3.2]
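An empirical check of the formula for small \(k\) (Python sketch; the function name is my own). The formula gives \(P(T=2) = \frac12\) and \(P(T=4) = \frac18\), which a simulation of the first return time should reproduce:

```python
import random

def first_return_probs(n=10, reps=200_000, seed=3):
    """Estimate P(T = 2k) for a simple random walk of 2n steps,
    where T is the first return time to 0 (T = 0 if no return)."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(reps):
        s, t = 0, 0
        for step in range(1, 2 * n + 1):
            s += 1 if rng.random() < 0.5 else -1
            if s == 0:
                t = step
                break
        counts[t] = counts.get(t, 0) + 1
    return {k: c / reps for k, c in counts.items()}
```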
Geometric probability
In each case, consider a point picked uniformly at random from the interior of the region. Find the probability density function for the \(x\)-coordinate.
- The square with corners: \( (-2,0), (0,2), (2,0), (0,-2) \)
- The triangle with corners: \( (-2,0), (1,0), (0,2) \)
- The polygon with corners: \( (0,2),(2,1), (1,-1), (-1,0)\)
[Pitman p277, # 12]
Expected value of Random Walk
Consider a random walk \(S_i\), which begins at zero, and makes \(2n\) steps.
Let \(k < n\). Show that
- \(\mathbf{E}\Big( |S_{2k}| \ \Big|\ |S_{2k-1}| = r\Big) = r\);
- \( \mathbf{E}\Big(|S_{2k+1}| \ \Big|\ |S_{2k}| = r\Big) = 1 \text{ if } r = 0, \text{ and } \mathbf{E}\Big(|S_{2k+1}| \ \Big|\ |S_{2k}| = r\Big) = r\text{ otherwise}.\)
A p.m.f. and expectation example
Let \(X\) be a random variable with probability mass function
\[p(n) = \frac{c}{n!}\quad \text{for $n=0,1,2,\dots$}\]
and \(p(n)=0\) otherwise.
- Find \(c\). Hint: use the Taylor series expansion of \(e^x\).
- Compute the probability that \(X\) is even.
- Compute the expected value of \(X\).
[Meester ex 2.7.14]
Prime Dice
Suppose that we have a very special die which has exactly \(k\) faces where \(k\) is prime. The faces are numbered \(1,\dots,k\). We throw the die once and see which number comes up.
- What would be an appropriate outcome space and probability measure for this random experiment?
- Suppose that the events \(A\) and \(B\) are independent. Show that \(\mathbf{P}(A)\) or \(\mathbf{P}(B)\) is always either 0 or 1. In other words, \(A\) or \(B\) is always either the full space or the empty set.
[ from Meester, ex 1.7.32]
The chance a coin is fair
Suppose that I have two coins in my pocket. One ordinary, fair coin and one coin which has heads on both sides. I pick a random coin out of my pocket, throw it, and it comes up heads.
- What is the probability that I have thrown the fair coin?
- If I throw the same coin again, and heads comes up again, what is the probability that I have thrown the fair coin?
- Suppose that instead of throwing the same coin again, I reach into my pocket and throw the second coin. If it comes up heads, what is the chance that the first coin is the fair coin?
[ Modified version of Meester, ex 1.7.35]
Independence of two hearts?
Consider a deck of 52 cards. Let \(A\) be the event that the first card is a heart. Let \(B\) be the event that the 51st card is a heart.
What is \(\mathbf{P}(A)\)? What is \(\mathbf{P}(B)\)? Are \(A\) and \(B\) independent?
Conditionally equally Likely
Let \(A\) and \(B\) be two events with positive probability. When does \(\mathbf{P}(A|B)=\mathbf{P}(B|A)\)?
Making a biased coin fair
Jack and Jill want to use a coin to decide who gets the remaining piece of cake. However, since the coin is Jack’s, Jill is suspicious that it is a trick coin which produces heads with a probability \(p\) which is not \(\frac12\). Can you devise a way to use this coin to come to a fair decision as to who gets the cake?
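One classical solution is von Neumann's trick: flip the coin twice; call it heads on HT, tails on TH, and flip again on HH or TT. Since \(P(HT) = P(TH) = p(1-p)\), the decision is fair for any \(p \in (0,1)\). A sketch (Python; the function names are my own):

```python
import random

def fair_flip(biased_flip):
    """Von Neumann's trick: flip twice; HT -> True, TH -> False, else retry.
    The two unequal outcomes each have probability p(1-p), regardless of p."""
    while True:
        a, b = biased_flip(), biased_flip()
        if a != b:
            return a

def demo(p=0.7, reps=100_000, seed=5):
    """Fraction of fair_flip results that come up True, using a p-biased coin."""
    rng = random.Random(seed)
    coin = lambda: rng.random() < p   # the trick coin: heads with probability p
    heads = sum(fair_flip(coin) for _ in range(reps))
    return heads / reps
```

Even with \(p = 0.7\), the returned fraction should be very close to \(\frac12\).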
Selling the Farm
Two competing companies are trying to buy up all the farms in a certain area to build houses. Each year, 10% of farmers sell to company 1, 20% sell to company 2, and 70% keep farming. Neither company ever sells any of the farms that they own. Eventually all of the farms will be sold. Assuming that there are a large number of farms initially, what fraction do you expect will be owned by company 1?
[Durrett “Elementary Probability”, p 159 # 39]
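Conditioning on the year a farm is sold suggests the answer \(0.1/(0.1+0.2) = \frac13\); a simulation sketch to check this (Python; the function name is my own):

```python
import random

def company1_fraction(p1=0.10, p2=0.20, farms=100_000, seed=6):
    """Simulate each farm until it is sold; return the fraction bought by company 1."""
    rng = random.Random(seed)
    c1 = 0
    for _ in range(farms):
        while True:
            u = rng.random()
            if u < p1:
                c1 += 1          # sold to company 1
                break
            if u < p1 + p2:
                break            # sold to company 2
            # otherwise: keeps farming another year
    return c1 / farms
```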
Computers on the Blink
A university computer room has 30 terminals. Each day there is a 3% chance that a given terminal will break and a 72% chance that a given broken terminal will be repaired. Assuming that the fates of the various terminals are independent, in the long run what is the distribution of the number of terminals that are broken?
[Durrett “Elementary Probability” p. 155 # 24]
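Each terminal is a two-state chain whose stationary probability of being broken is \(0.03/(0.03+0.72) = 0.04\), which suggests the number broken is Binomial\((30, 0.04)\) in the long run. A single-terminal simulation to check the stationary probability (Python sketch; the function name is my own):

```python
import random

def broken_fraction(p_break=0.03, p_fix=0.72, days=500_000, seed=7):
    """Long-run fraction of days a single terminal is broken (two-state chain)."""
    rng = random.Random(seed)
    broken, count = False, 0
    for _ in range(days):
        if broken:
            if rng.random() < p_fix:
                broken = False    # repaired today
        else:
            if rng.random() < p_break:
                broken = True     # breaks today
        count += broken
    return count / days
```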
Basic Markov Chains
In each of the graphs pictured, assume that each arrow leaving a vertex has an equal chance of being followed. Hence if there are three arrows leaving a vertex, then there is a 1/3 chance of each being followed.
- For each of the six pictures, find the Markov transition matrix.
- State if the Markov chain given by this matrix is irreducible and if the matrix is doubly stochastic.
- If the Matrix is irreducible, state if it is aperiodic.
- When possible (given what you know), state if each chain has a unique stationary distribution. If it is obvious that the system does not possess a unique stationary distribution, please state why.
- For two of the chains it is easy to state what this unique stationary distribution is. Which two are they, and what are the two stationary distributions?
Conditional densities
Let \(X\) and \(Y\) have the following joint density:
\[ f(x,y)=\begin{cases}2x+2y -4xy & \text{for } 0 \leq x\leq 1 \ \text{and}\ 0 \leq y \leq 1\\ 0& \text{otherwise}\end{cases}\]
- Find the marginal densities of \(X\) and \(Y\).
- Find \(f_{Y|X}( y \,|\, X=\frac14)\).
- Find \( \mathbf{E}(Y \,|\, X=\frac14)\).
[Pitman p426 # 2]
Expected max/min given min/max
Let \(X_1\) and \(X_2\) be the numbers on two independent fair-die rolls. Let \(M\) be the maximum and \(N\) the minimum of \(X_1\) and \(X_2\). Calculate:
- \(\mathbf{E}( M| N=x) \)
- \(\mathbf{E}( N| M=x) \)
- \(\mathbf{E}( M| N) \)
- \(\mathbf{E}( N| M) \)
Difference between max and min
Let \(U_1,U_2,U_3,U_4,U_5\) be independent, each with uniform distribution on \((0,1)\). Let \(R\) be the distance between the max and the min of the \(U_i\)’s. Find
- \(\mathbf{E} R\)
- the joint density of the max and the min of the \(U_i\)’s.
- \(\mathbf{P}(R > 0.5)\)
[Pitman p355, #14]
A joint density example II
If \(X\) and \(Y\) have joint density function
\[f(x,y)=\frac{1}{x^2y^2} \quad; \quad x \geq 1, y\geq 1\]
- Compute the joint density function of \(U=XY\), \(V=X/Y\).
- What are the marginal densities of \(U\) and \(V\)?
[Ross p295, # 54]
Closest Point
Consider a Poisson random scatter of points in a plane with mean intensity \(\lambda\) per unit area. Let \(R\) be the distance from zero to the closest point of the scatter.
- Find a formula for the c.d.f. and the density of \(R\) and sketch their graphs.
- Show that \(\sqrt{2 \lambda \pi} R\) has the Rayleigh distribution.
- Find the mean and mode of \(R\).
[Pitman p389, #21]
Joint Density of Arrival Times
Let \(T_1 < T_2<\cdots\) be the arrival times in a Poisson arrival process with rate \(\lambda\). What is the joint distribution of \((T_1,T_2,T_5)\)?
Point of increase
Suppose \(U_1,U_2,\dots\) are independent uniform \( (0,1) \) random variables. Let \(N\) be the first point of increase, that is, the first \(n \geq 2\) such that \(U_n > U_{n-1}\). Show that for \(u \in (0,1)\):
- \[\mathbf{P}(U_1 \leq u \ \text{ and } \ N=n)= \frac{u^{n-1}}{(n-1)!}-\frac{u^{n}}{n!} \quad;\quad n \geq 2\]
- \( \mathbf{E}(N)=e \)
Some useful observations:
- \[\mathbf{P}(U_1 \leq u \ \text{ and } \ N=n) = \mathbf{P}(U_1 \leq u \ \text{ and } \ N \geq n) -\mathbf{P}(U_1 \leq u \ \text{ and } \ N \geq n+1)\]
- The following events are equal:
\[ \{U_1 \leq u \quad \text{ and } \quad N \geq n\} = \{U_{n-1}\leq U_{n-2} \leq \cdots \leq U_2\leq U_{1} \leq u \}\]
- \[ \mathbf{P}\{U_2 \leq U_1 \leq u \}= \int_0^u \int_0^{u_1} du_2\, du_1 \]
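The striking claim \(\mathbf{E}(N)=e\) is easy to check by simulation (Python sketch; the function name is my own):

```python
import random

def mean_point_of_increase(reps=300_000, seed=8):
    """Average of N, the first n >= 2 with U_n > U_{n-1}, for i.i.d. uniforms."""
    rng = random.Random(seed)
    total = 0
    for _ in range(reps):
        prev, n = rng.random(), 2
        while True:
            cur = rng.random()
            if cur > prev:        # first increase found: N = n
                break
            prev, n = cur, n + 1
        total += n
    return total / reps
```

The average should come out near \(e \approx 2.718\).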
An example of min and change of variable
Suppose \(R_1\) and \(R_2\) are two independent random variables with the same density function
\[f(x)=x\exp(-{\textstyle \frac12 }x^2)\]
for \(x\geq 0\). Find
- the density of \(Y=\min(R_1,R_2)\);
- the density of \(Y^2\).
[Pitman p. 336 #21]
Calls arriving
Assume that calls arrive at a call centre according to a Poisson arrival process with a rate of 15 calls per hour. For \(0 \leq s < t\), let \(N(s,t)\) denote the number of calls which arrive between time \(s\) and \(t\) where time is measured in hours.
- What is \( \mathbf{E}\big(\,N(3,5)\,\big)\)?
- What is the second moment of \(N(2,4) \)?
- What is \( \mathbf{E}\big(\,N(1,4)\,N(2,6)\,\big)\)?
Tail-sum formula for continuous random variable
Let \(X\) be a positive random variable with c.d.f. \(F\).
- Show, using the representation \(X=F^{-1}(U)\) where \(U\) is \(\textrm{unif}(0,1)\), that \(\mathbf{E}(X)\) can be interpreted as the area above the graph of \(y=F(x)\) but below the line \(y=1\). Using this, deduce that
\[\mathbf{E}(X)=\int_0^\infty [1-F(x)] dx = \int_0^\infty \mathbf{P}(X> x) dx \ .\]
- Deduce that if \(X\) has possible values \(0,1,2,\dots\), then
\[\mathbf{E}(X)=\sum_{k=1}^\infty \mathbf{P}(X\geq k)\]
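The discrete tail-sum formula can be checked numerically for any distribution with a known mean. For example, for \(X\) geometric on \(\{0,1,2,\dots\}\) with success probability \(q\), one has \(\mathbf{P}(X\geq k)=(1-q)^k\) and \(\mathbf{E}(X)=(1-q)/q\). A sketch (Python; the function name is my own, and the sum is truncated at a large cutoff):

```python
def tail_sum_mean(q=0.3, kmax=200):
    """Tail-sum formula for X ~ geometric on {0,1,2,...}: P(X >= k) = (1-q)^k.
    Sums P(X >= k) for k = 1..kmax; the neglected tail is negligible."""
    return sum((1 - q) ** k for k in range(1, kmax + 1))
```

For \(q=0.3\) this should agree with \((1-q)/q = 7/3\) to high precision.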
Min, Max, and Exponential
Let \(X_1\) and \(X_2\) be random variables and let \(M=\mathrm{max}(X_1,X_2)\) and \(N=\mathrm{min}(X_1,X_2)\).
- Argue that the event \(\{ M \leq x\}\) is the same as the event \(\{X_1 \leq x, X_2 \leq x\}\), and similarly that the event \(\{ N > x\}\) is the same as the event \(\{X_1 > x, X_2 > x\}\).
- Now assume that \(X_1\) and \(X_2\) are independent and distributed with c.d.f. \(F_1(x)\) and \(F_2(x)\) respectively. Find the c.d.f. of \(M\) and the c.d.f. of \(N\) using the preceding observation.
- Now assume that \(X_1\) and \(X_2\) are independently and exponentially distributed with parameters \(\lambda_1\) and \(\lambda_2\) respectively. Show that \(N\) is distributed exponentially and identify the parameter in the exponential distribution of \(N\).
- The route to a certain remote island contains 4 bridges. If the time to collapse of each bridge is exponentially distributed with mean 20 years and is independent of the other bridges, what is the distribution of the time until the road is impassable because one of the bridges has collapsed?
[Jonathan Mattingly]
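For the bridge question, the minimum of independent exponentials is exponential with the rates added, so the road should survive an exponential time with mean \(20/4 = 5\) years. A simulation sketch to check (Python; the function name is my own):

```python
import random

def road_lifetime_mean(bridges=4, mean_life=20.0, reps=200_000, seed=9):
    """Mean time until the first of several independent exponential bridge collapses."""
    rng = random.Random(seed)
    total = sum(
        min(rng.expovariate(1 / mean_life) for _ in range(bridges))
        for _ in range(reps)
    )
    return total / reps
```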
Approximating sums of uniform random variables
Suppose \(X_1,X_2,X_3,X_4\) are independent uniform \((0,1)\) and we set \(S_4=X_1+X_2+X_3+X_4\). Use the normal approximation to estimate \(\mathbf{P}( S_4 \geq 3) \).
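Here \(\mathbf{E}(S_4)=2\) and \(\mathrm{Var}(S_4)=4\cdot\frac1{12}=\frac13\), so the normal approximation is \(\mathbf{P}(S_4\geq 3) \approx 1-\Phi(\sqrt3) \approx 0.042\); the exact value (the volume of a corner simplex of the unit cube) is \(1/4! = 1/24 \approx 0.0417\). A sketch comparing the approximation to a Monte Carlo estimate (Python; the function names are my own):

```python
import math
import random

def normal_approx_tail(n=4, x=3.0):
    """Normal approximation to P(S_n >= x) for a sum of n uniform(0,1) variables."""
    mu, sigma = n / 2, math.sqrt(n / 12)
    z = (x - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))   # equals 1 - Phi(z)

def mc_tail(n=4, x=3.0, reps=400_000, seed=10):
    """Monte Carlo estimate of P(S_n >= x)."""
    rng = random.Random(seed)
    return sum(sum(rng.random() for _ in range(n)) >= x for _ in range(reps)) / reps
```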
Geometric probability: marginal densities
Find the density of the random variable \(X\) when the pair \( (X,Y) \) is chosen uniformly from the specified region in the plane in each case below.
- The diamond with vertices at \( (0,2), (-2,0), (0,-2), (2,0) \).
- The triangle with vertices \( (-2,0), (1,0), (0,2) \).
[Pitman p 277, #12]
