Category Archives: Basic Stochastic Processes
Product Chain
Let \((Z_n)_{n\geq 0}\) be a sequence of independent random variables with \(P(Z_n=1)=\frac12\) and \(P(Z_n=\frac12)=\frac12\). Define \(X_0=1\) and \(X_{n+1}=Z_n X_n\).
- What is \(E(X_n \mid X_{n-1})\)?
- What is \(E(X_n)\)?
- What is \(\mathrm{Cov}(X_n,X_{n-1})\)?
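If you want to check your answers numerically, here is a minimal simulation sketch in Python (assuming NumPy; the helper name `simulate_product_chain`, the seed, and the sample sizes are my own choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_product_chain(n_steps, n_paths, rng):
    """Simulate X_{k+1} = Z_k X_k with X_0 = 1 and Z_k uniform on {1, 1/2}."""
    X = np.ones(n_paths)
    history = [X.copy()]
    for _ in range(n_steps):
        Z = rng.choice([1.0, 0.5], size=n_paths)  # P(Z = 1) = P(Z = 1/2) = 1/2
        X = Z * X
        history.append(X.copy())
    return np.array(history)                      # shape (n_steps + 1, n_paths)

paths = simulate_product_chain(10, 200_000, rng)
n = 5
print("E[X_n] (empirical):", paths[n].mean())
print("Cov(X_n, X_{n-1}) (empirical):", np.cov(paths[n], paths[n - 1])[0, 1])
```

Compare the printed values with your hand calculations for \(n=5\).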
Basic Markov Chain I
In each of the graphs pictured, assume that each arrow leaving a vertex has an equal chance of being followed. Hence if there are three arrows leaving a vertex, then each has a 1/3 chance of being followed.
- For each of the six pictures, find the Markov transition matrix.
- State if the Markov chain given by this matrix is irreducible.
- If the matrix is irreducible, state if it is aperiodic.
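The six pictures are not reproduced here, but the rule "each arrow out of a vertex is equally likely" translates mechanically into a transition matrix. A small Python sketch of that translation (the adjacency list `arrows` below is a made-up example, not one of the six graphs):

```python
import numpy as np

# Hypothetical example graph: vertex -> list of vertices its arrows point to.
# Replace this with the arrows read off from one of the pictures.
arrows = {0: [1, 2], 1: [2], 2: [0, 1, 2]}

n = len(arrows)
P = np.zeros((n, n))
for i, targets in arrows.items():
    for j in targets:
        P[i, j] = 1.0 / len(targets)   # each outgoing arrow is equally likely

print(P)
print("rows sum to 1:", np.allclose(P.sum(axis=1), 1.0))
```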
Wandering Umbrellas
An individual has three umbrellas, some at her office and some at her home. If she is leaving home in the morning (or leaving work at night) and it is raining, she takes an umbrella if there is one; otherwise, she gets wet. Assume that, independently of the past, it rains on each trip with probability 0.2.
To formulate a Markov chain, let \(X_n\) be the number of umbrellas at her current location.
- What is the state space for this Markov chain?
- Find the transition probabilities for this Markov chain.
- Calculate the limiting fraction of time she gets wet.
[Durrett, Section 4.7, ex 34]
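A quick way to sanity-check the limiting fraction is to simulate the story directly, without writing down the chain first. A rough Python sketch (the seed, the trip count, and the initial placement of the umbrellas are arbitrary choices of mine; the initial placement does not affect the long-run answer):

```python
import numpy as np

rng = np.random.default_rng(1)
p_rain = 0.2
umbrellas_here, umbrellas_there = 3, 0   # assume all three start at one location
wet, n_trips = 0, 500_000

for _ in range(n_trips):
    if rng.random() < p_rain:                         # it rains on this trip
        if umbrellas_here > 0:
            umbrellas_here -= 1                       # carry one umbrella along
            umbrellas_there += 1
        else:
            wet += 1                                  # no umbrella available
    # the destination becomes the current location for the next trip
    umbrellas_here, umbrellas_there = umbrellas_there, umbrellas_here

print("long-run fraction of trips on which she gets wet ≈", wet / n_trips)
```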
Covariance of a Branching Process
Show that for a branching process \( (Z_n)\) with mean offspring number \(\mu\) one has
\[\mathbf{E}( Z_n Z_m)= \mu^{n-m} \mathbf{E}( Z_m^2)\]
for \(0\leq m\leq n\).
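The identity can also be checked by Monte Carlo for a concrete offspring law. A rough sketch (I am assuming a Poisson offspring distribution with mean \(\mu = 1.2\) and picking \(m=2\), \(n=4\) purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, m, n, n_runs = 1.2, 2, 4, 100_000

Zm = np.zeros(n_runs)
Zn = np.zeros(n_runs)
for r in range(n_runs):
    Z = 1                                              # Z_0 = 1
    for gen in range(1, n + 1):
        Z = rng.poisson(mu, size=Z).sum() if Z > 0 else 0
        if gen == m:
            Zm[r] = Z
    Zn[r] = Z

print("E[Z_n Z_m]        ≈", (Zn * Zm).mean())
print("mu^(n-m) E[Z_m^2] ≈", mu**(n - m) * (Zm**2).mean())
```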
Basic Branching Process
Consider the branching process with offspring distribution given by \( P(X=0)=\alpha\), \( P(X=1)=\frac23-\alpha\) and \( P(X=2)=\frac13\) for some \(\alpha \in [0,\frac23]\).
- For what values of \(\alpha\) is the process certain to die out?
- For the values of \(\alpha\) where there is a positive probability of surviving forever, what is this probability as a function of \(\alpha\)?
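One way to check an answer numerically is to iterate the offspring generating function \(G(s)=\alpha+(\frac23-\alpha)s+\frac13 s^2\) starting from \(s=0\); the iterates converge to the extinction probability. A hedged Python sketch (the function name and the number of iterations are my choices):

```python
def extinction_prob(alpha, n_iter=200_000):
    """Iterate s -> G(s) = alpha + (2/3 - alpha) s + (1/3) s^2 starting from 0.

    The limit is the smallest nonnegative fixed point of G, i.e. the
    extinction probability of the branching process."""
    s = 0.0
    for _ in range(n_iter):
        s = alpha + (2/3 - alpha) * s + (1/3) * s**2
    return s

for alpha in [0.0, 0.1, 1/3, 0.5, 2/3]:
    q = extinction_prob(alpha)
    print(f"alpha = {alpha:.3f}:  P(extinction) ≈ {q:.4f},  P(survive) ≈ {1 - q:.4f}")
```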
Geometric Branching Process
Consider a branching process with a geometric offspring distribution \( P(X=k) = (1-p)p^k\) for \(k=0,1,2,\dots\). Show that ultimate extinction is certain if \(p \leq \frac12\), and that the probability of extinction is \((1-p)/p\) if \(p > \frac12\).
[Meester ex. 6.6.5]
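The stated extinction probability can also be checked by simulation in the supercritical case. A rough Monte Carlo sketch (I picked \(p=0.6\) and stop a run once the population exceeds 10,000, at which point extinction is essentially impossible; these thresholds are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
p, n_runs = 0.6, 20_000

extinct = 0
for _ in range(n_runs):
    Z = 1
    while 0 < Z <= 10_000:        # stop once extinct or clearly exploding
        # geometric offspring on {0,1,2,...}: P(X=k) = (1-p) p^k
        Z = (rng.geometric(1 - p, size=Z) - 1).sum()
    extinct += (Z == 0)

print("P(extinction) ≈", extinct / n_runs, " vs (1-p)/p =", (1 - p) / p)
```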
Selling the Farm
Two competing companies are trying to buy up all the farms in a certain area to build houses. In each year 10% of farmers sell to company 1, 20% sell to company 2, and 70% keep farming. Neither company ever sells any of the farms that they own. Eventually all of the farms will be sold. Assuming that there are a large number of farms initially, what fraction do you expect will be owned by company 1?
[Durrett “Elementary Probability”, p 159 # 39]
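Viewing the fate of a single farm as a three-state chain with two absorbing states, the expected fraction can be checked numerically. A minimal Python sketch (the state labels are mine):

```python
import numpy as np

# States: 0 = still farming, 1 = owned by company 1, 2 = owned by company 2.
P = np.array([[0.7, 0.1, 0.2],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# Push the distribution of a single farm far forward in time.
dist = np.array([1.0, 0.0, 0.0])
for _ in range(500):
    dist = dist @ P

print("limiting distribution of a single farm:", dist.round(4))
print("expected fraction eventually owned by company 1:", dist[1].round(4))
```

With many farms, the law of large numbers says the fraction eventually owned by company 1 is approximately the probability that a single farm is absorbed in that state.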
Computers on the Blink
A university computer room has 30 terminals. Each day there is a 3% chance that a given terminal will break and a 72% chance that a given broken terminal will be repaired. Assuming that the fates of the various terminals are independent, in the long run what is the distribution of the number of terminals that are broken?
[Durrett “Elementary Probability” p. 155 # 24]
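Each terminal evolves as its own two-state chain, so the long-run count can be simulated directly. A rough Python sketch (seed, run length, and burn-in are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(4)
n_terminals, p_break, p_repair = 30, 0.03, 0.72

broken = np.zeros(n_terminals, dtype=bool)
counts = []
for day in range(100_000):
    breaks = rng.random(n_terminals) < p_break      # working terminals that break
    repairs = rng.random(n_terminals) < p_repair    # broken terminals that get fixed
    broken = np.where(broken, ~repairs, breaks)
    counts.append(int(broken.sum()))

counts = np.array(counts[1_000:])                   # discard burn-in
print("long-run average number broken ≈", counts.mean())
print("empirical P(number broken = k) for k = 0..7:",
      (np.bincount(counts, minlength=8)[:8] / counts.size).round(4))
```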
Basic Markov Chains
In each of the graphs pictured, assume that each arrow leaving a vertex has an equal chance of being followed. Hence if there are three arrows leaving a vertex, then each has a 1/3 chance of being followed.
- For each of the six pictures, find the Markov transition matrix.
- State if the Markov chain given by this matrix is irreducible and if the matrix is doubly stochastic.
- If the matrix is irreducible, state if it is aperiodic.
- When possible (given what you know), state if each chain has a unique stationary distribution. If it is obvious that the system does not possess a unique stationary distribution, please state why.
- For two of the chains it is easy to state what this unique stationary distribution is. Which two are they, and what are the two stationary distributions?
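Once a transition matrix has been read off from a picture, its properties can be checked mechanically. A hedged Python sketch (the function `chain_report` is mine, and the matrix at the bottom is a made-up 3-state example, not one of the six chains):

```python
import numpy as np

def chain_report(P):
    """Basic properties of a finite transition matrix P (rows summing to 1)."""
    n = len(P)
    reach = np.linalg.matrix_power(np.eye(n) + P, n) > 0        # j reachable from i?
    irreducible = reach.all()
    doubly_stochastic = np.allclose(P.sum(axis=0), 1.0)
    # An irreducible finite chain is aperiodic iff some power of P is strictly positive.
    aperiodic = irreducible and (np.linalg.matrix_power(P, n * n) > 0).all()
    # A stationary distribution: normalized left eigenvector for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi = pi / pi.sum()
    return {"irreducible": irreducible, "doubly stochastic": doubly_stochastic,
            "aperiodic": aperiodic, "stationary distribution": pi.round(4)}

# Hypothetical example: a deterministic cycle on 3 vertices (irreducible, periodic).
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
print(chain_report(P))
```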