Category Archives: Stochastic Calculus
Ballistic Growth
Consider the SDE
\[
dX(t)=b(X(t))dt +\sigma(X(t))dB(t)
\]
with \(b(x)\to b_0 >0\) as \(x\to\infty\) and with \(\sigma\) bounded and positive. Suppose that \(b\) and \(\sigma\) are such that
\(\lim_{t\to\infty}X(t)=\infty\) with probability one for any starting point. Show that
\[
\mathbf P_x\Big\{\lim_{t\to\infty}\frac{X(t)}{b_0 t}=1\Big\}=1 \ .
\]
From
\[
X(t)=x+\int_0^{t}b(X(s))ds +\int_0^{t}\sigma(X(s))dB(s)
\]
and the hypotheses, note that the result follows from showing that
\begin{align*}
\mathbf P_x\Big\{\lim_{t\to\infty}\frac{1}{t}\int_0^{t}\sigma(X(s))dB(s)=0\Big\}=1 \ .
\end{align*}
There are a number of ways of thinking about this. In the end they all come down to essentially the same calculations. One way is to show that for some fixed \(\delta \in(0,1)\) the following statement holds with probability one:
There exists a constant \(C(\omega)\) so that
\begin{align*}
\Big|\int_0^{t}\sigma(X(s))dB(s)\Big| \leq Ct^\delta
\end{align*}
for all \(t >0\).
To show this, partition \([0,\infty)\) into blocks and use the Doob-Kolmogorov inequality to estimate the probability that the maximum of \( \int_0^{t}\sigma(X(s))dB(s)\) on each block exceeds \(t^\delta\) on that block. Then use the Borel-Cantelli lemma to show that this happens only a finite number of times.
A different way to organize the same calculation is to estimate
\[
\mathbf P_x\Big\{\sup_{t>a}\frac{1}{t}|\int_0^t \sigma(X(s))dB(s)|>\epsilon\Big\}
\]
by breaking the region \(t>a\) into the union of intervals of the form \(a2^k <t\leq a2^{k+1}\) for \(k=0,1,\dots\) and using the Doob-Kolmogorov martingale inequality. Then let \(a\to\infty\).
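To make the block estimate concrete, here is the kind of bound the Doob-Kolmogorov inequality gives on a dyadic block (a sketch only, with \(\|\sigma\|_\infty\) the uniform bound on \(\sigma\); the constants are indicative):
\begin{align*}
\mathbf P_x\Big\{\sup_{a2^k < t \leq a2^{k+1}} \Big|\int_0^t \sigma(X(s))dB(s)\Big| > \epsilon\, a2^k \Big\}
\leq \frac{\mathbf E \int_0^{a2^{k+1}} \sigma(X(s))^2 ds}{\epsilon^2 a^2 2^{2k}}
\leq \frac{2\|\sigma\|_\infty^2}{\epsilon^2\, a\, 2^{k}} \ ,
\end{align*}
which is summable in \(k\); summing over \(k\) and then letting \(a\to\infty\) gives the claim.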
Entry and Exit through boundaries
Consider the following one dimensional SDE.
\begin{align*}
dX_t&= \cos( X_t )^\alpha dW(t)\\
X_0&=0
\end{align*}
Consider the equation for \(\alpha >0\). On what interval do you expect to find the solution at all times? Classify the behavior at the boundaries.
For what values of \(\alpha < 0\) does it seem reasonable to define the process? Any? Justify your answer.
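Before classifying the boundaries analytically, it can help to look at a sample path. Below is a minimal Euler-Maruyama sketch; the choice \(\alpha=1\), the seed, and the step size are assumptions made purely for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Euler-Maruyama for dX = cos(X)^alpha dW with X_0 = 0 (alpha = 1 assumed)
rng = np.random.default_rng(0)
alpha, n, dt = 1, 20_000, 1e-3
X = np.zeros(n)
for i in range(n - 1):
    X[i + 1] = X[i] + np.cos(X[i]) ** alpha * rng.normal(scale=np.sqrt(dt))

plt.plot(np.arange(n) * dt, X)
plt.axhline(np.pi / 2, linestyle="--")   # diffusion coefficient vanishes here
plt.axhline(-np.pi / 2, linestyle="--")
plt.xlabel("t"); plt.ylabel("X_t")
plt.show()
```

Watch whether the path ever crosses the dashed lines at \(\pm\pi/2\), and how it behaves when it gets close to them.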
Martingale Exit from an Interval – I
Let \(\tau\) be the first time that a continuous martingale \(M_t\) starting from \(x\) exits the interval \((a,b)\), with \(a<x<b\). In all of the following, we assume that \(\mathbf P(\tau < \infty)=1\). Let \(p=\mathbf P_x\{M(\tau)=a\}\).
Find an analytic expression for \(p\):
- For this part assume that \(M_t\) is the solution to a time-homogeneous SDE, that is, \[dM_t=\sigma(M_t)dB_t\] with \(\sigma\) bounded and smooth. What PDE should you solve to find \(p\), and with what boundary data? Now assume for a moment that \(M_t\) is standard Brownian Motion (\(\sigma=1\)) and solve the PDE you mentioned above in this case.
- A probabilistic way of thinking: Return to a general martingale \(M_t\). Let us assume that \(dM_t=\sigma(t,\omega)dB_t\), again with \(\sigma\) smooth and uniformly bounded from above and away from zero. Assume that \(\tau < \infty\) almost surely and notice that \[\mathbf E_x M(\tau)=a \mathbf P_x\{M_\tau=a\} + b \mathbf P_x\{M_\tau=b\}.\] Of course the process has to exit through one side or the other, so \[\mathbf P_x\{M_\tau=a\} = 1 - \mathbf P_x\{M_\tau=b\}\ .\] Use all of these facts and the Optional Stopping Theorem to derive the equation for \(p\).
- Return to the case when \(dM_t=\sigma(M_t)dB_t\) with \(\sigma\) bounded and smooth. Write down the equations that \(v(x)= \mathbf E_x\{\tau\}\), \(w(x,t)=\mathbf P_x\{ \tau >t\}\), and \(u(x)=\mathbf E_x\{e^{-\lambda\tau}\}\) with \(\lambda > 0\) satisfy. (For extra credit: solve them for \(M_t=B_t\) in this one-dimensional setting and see what happens as \(b \rightarrow \infty\). The Brownian case is written out just below as a check.)
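For the Brownian case \(\sigma=1\), the classical gambler's-ruin formulas are a useful check on whatever you derive (standard facts, stated here without proof):
\begin{align*}
\tfrac12 p''(x)=0,\ \ p(a)=1,\ p(b)=0 \quad &\Longrightarrow\quad p(x)=\frac{b-x}{b-a}\ ,\\
\tfrac12 v''(x)=-1,\ \ v(a)=v(b)=0 \quad &\Longrightarrow\quad v(x)=(x-a)(b-x)\ .
\end{align*}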
Around the Circle
Consider the equation
\begin{align*}
dX_t &= -Y_t dB_t - \frac12 X_t dt\\
dY_t &= X_t dB_t - \frac12 Y_t dt
\end{align*}
Let \((X_0,Y_0)=(x,y)\) with \(x^2+y^2=1\). Show that \(X_t^2 + Y_t^2 =1\) for all \(t\), and hence the SDE lives on the unit circle. Does this make intuitive sense?
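One way to see it is to write out Ito's formula for \(X_t^2+Y_t^2\) (a sketch, using \(d\langle X\rangle_t = Y_t^2 dt\) and \(d\langle Y\rangle_t = X_t^2 dt\)):
\begin{align*}
d(X_t^2) &= 2X_t\, dX_t + d\langle X\rangle_t = -2X_tY_t\, dB_t - X_t^2\, dt + Y_t^2\, dt \ ,\\
d(Y_t^2) &= 2Y_t\, dY_t + d\langle Y\rangle_t = 2X_tY_t\, dB_t - Y_t^2\, dt + X_t^2\, dt \ ,
\end{align*}
so the two differentials cancel and \(d(X_t^2+Y_t^2)=0\).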
Shifted Brownian Motion and a PDE
Let \(f \in C_0^2(\mathbf R^n)\) and \(\alpha(x)=(\alpha_1(x),\dots,\alpha_n(x))\) with \(\alpha_i \in C_0^2(\mathbf R^n)\) be given functions and consider the partial differential equations
\begin{align*}
\frac{\partial u}{\partial t} &= \sum_{i=1}^n \alpha_i(x)
\frac{\partial u}{\partial x_i} + \frac{1}{2} \frac{\partial^2
u}{\partial x_i^2} \ \text{ for } t >0 \text{ and }x\in \mathbf R^n \\
u(0,x)&=f(x) \ \text{ for } \ x \in \mathbf R^n
\end{align*}
Use the Girsanov theorem to show that the unique bounded solution \(u(t,x)\) of this equation can be expressed as
\begin{align*}
u(t,x) = \mathbf E_x \left[ \exp\left(\int_0^t \alpha(B(s))\cdot dB(s) -
\frac{1}{2}\int_0^t |\alpha(B(s))|^2 ds \right)f(B(t))\right]
\end{align*}
where \(\mathbf E_x\) is the expectation with respect to \(\mathbf P_x\) when the Brownian Motion starts at \(x\). (Note: there may be a sign error in the above exponential term; use whichever sign is right.) For the remainder, assume that \(\alpha\)
is a fixed constant \(\alpha_0\). Now, using what you know about the distribution of \(B_t\), write the solution to the above equation as an integral kernel integrated against \(f\). (In other words, write \(u(t,x)\) so that your friends who don’t know any probability might understand it, i.e. \(u(t,x)=\int K(x,y,t)f(y)dy\) for some \(K(x,y,t)\).)
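For orientation in the constant-drift case (a sketch of what to expect, not a derivation): with \(\alpha\equiv\alpha_0\), Girsanov turns \(B\) into a Brownian motion with drift \(\alpha_0\), so the kernel should be the shifted Gaussian
\[
K(x,y,t) = \frac{1}{(2\pi t)^{n/2}}\exp\Big(-\frac{|y-x-\alpha_0 t|^2}{2t}\Big)\ .
\]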
Probability Bridge
For fixed \(\alpha\) and \(\beta\) consider the stochastic differential equation
\[
dY(t)=\frac{\beta-Y(t)}{1-t} dt + dB(t) ~,~~ 0\leq t < 1 ~,~~Y(0)=\alpha.
\]
Verify that \(\lim_{t\to 1}Y(t)=\beta\) with probability one. (This is called the Brownian bridge from \(\alpha\) to \(\beta\).)
Hint: In the problem “Solving a class of SDEs”, you found that this equation has the solution
\begin{equation*}
Y_t = \alpha(1-t) + \beta t + (1-t)\int_0^t \frac{dB_s}{1-s} \quad 0 \leq t <1\; .
\end{equation*}
To answer the question show that
\begin{equation*}
\lim_{t \rightarrow 1^-} (1-t) \int_0^t\frac{dB_s}{1-s} =0 \quad \text{a.s.}
\end{equation*}
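To see the pinning numerically, here is a minimal Euler sketch of the bridge SDE; the endpoints \(\alpha=0\), \(\beta=1\), the seed, and the step size are assumptions for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Euler scheme for dY = (beta - Y)/(1 - t) dt + dB, Y(0) = alpha
rng = np.random.default_rng(1)
alpha, beta = 0.0, 1.0      # assumed endpoints
n = 10_000
dt = 1.0 / n
t = np.arange(n) * dt       # grid stops just short of t = 1
Y = np.empty(n)
Y[0] = alpha
for i in range(n - 1):
    drift = (beta - Y[i]) / (1.0 - t[i])
    Y[i + 1] = Y[i] + drift * dt + rng.normal(scale=np.sqrt(dt))

plt.plot(t, Y)
plt.axhline(beta, linestyle="--")   # the path should pin to beta as t -> 1
plt.xlabel("t"); plt.ylabel("Y_t")
plt.show()
```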
Making the Cube of Brownian Motion a Martingale
Let \(B_t\) be a standard one dimensional Brownian
Motion. Find the function \(F:\mathbf{R}^5 \rightarrow \mathbf R\) so that
\begin{align*}
B_t^3 – F\Big(t,B_t,B_t^2,\int_0^t B_s ds, \int_0^t B_s^2 ds\Big)
\end{align*}
is a martingale.
Hint: It might be useful to introduce the processes
\[X_t=B_t^2\qquad Y_t=\int_0^t B_s ds \qquad Z_t=\int_0^t B_s^2 ds\]
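The Ito computation that does most of the work is (a sketch; compare it with the processes in the hint):
\[
d(B_t^3) = 3B_t^2\, dB_t + 3B_t\, dt\ ,
\]
which already suggests which of the processes above must appear in \(F\).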
Correlated SDEs
Let \(B_t\) and \(W_t\) be standard Brownian motions which are
independent. Consider
\begin{align*}
dX_t&= (-X_t +1)dt + \rho dB_t + \sqrt{1-\rho^2} dW_t\\
dY_t&= -Y_t dt + dB_t \ .
\end{align*}
Find the covariance \(\mathrm{Cov}(X_t,Y_t)=\mathbf{E} (X_t Y_t) - \mathbf{E} (X_t) \mathbf{E}( Y_t)\).
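A natural starting point is the Ito product rule (a sketch; note that only the \(dB\) terms of the two equations are correlated):
\[
d(X_tY_t) = X_t\, dY_t + Y_t\, dX_t + d\langle X,Y\rangle_t\ , \qquad d\langle X,Y\rangle_t = \rho\, dt\ ,
\]
so \(m(t)=\mathbf E(X_tY_t)\) satisfies the linear ODE \(m'(t) = -2m(t) + \mathbf E(Y_t) + \rho\).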
Hyperbolic SDE
Consider
\begin{align*}
dX_t &= Y_t dB_t + \frac12 X_t dt\\
dY_t &= X_t dB_t + \frac12 Y_t dt
\end{align*}
Show that \(X_t^2-Y_t^2\) is constant for all \(t\).
Diffusion and Brownian motion
Let \(B_t\) be a standard Brownian Motion starting from zero and define
\[ p(t,x) = \frac1{\sqrt{2\pi t}}e^{-\frac{x^2}{2t} } \]
Given any \(x \in \mathbf R \), define \(X_t=x + B_t\) . Of course \(X_t\) is just a Brownian Motion starting from \(x\) at time 0. Fixing a smooth, bounded, compactly supported function \(f:\mathbf R \rightarrow \mathbf R\), we define the function \(u(x,t)\) by
\[u(x,t) = \mathbf E_x f(X_t)\]
where we have decorated the expectation with the subscript \(x\) to remind us that we are starting from the point \(x\).
- Explain why \[ u(x,t) = \int_{-\infty}^\infty f(y)p(t,x-y)dy\]
- Show by direct calculation using the formula from the previous question that for \(t>0\), \(u(x,t)\) satisfies the diffusion equation
\[ \frac{\partial u}{\partial t}= c\frac{\partial^2 u}{\partial x^2}\]
for some constant \(c\). (Find the correct \(c\)!)
- Again using the formula from part 1, show that
\[ \lim_{t \rightarrow 0} u(x,t) = f(x)\]
and hence the initial condition for the diffusion equation is \(f\).
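If you want to double-check the calculation symbolically, here is a small sympy sketch; the candidate value of \(c\) below is an assumption to test against your own answer, not the given one.

```python
import sympy as sp

x, t = sp.symbols("x t", positive=True)
p = sp.exp(-x**2 / (2 * t)) / sp.sqrt(2 * sp.pi * t)  # the Gaussian kernel p(t, x)
c = sp.Rational(1, 2)                                 # candidate constant (assumed)
# p_t - c * p_xx should simplify to 0 exactly when c is correct
print(sp.simplify(sp.diff(p, t) - c * sp.diff(p, x, 2)))
```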
Lévy’s construction of Brownian Motion
Let \( \{ \xi_k^{(n)} : n =0,1,\dots ; k =1,\dots,2^n\} \) be a collection of independent Gaussian random variables with \(\xi_k^{(n)}\) having mean zero and variance \(2^{-n}\). Define the random variable \( \eta_k^{(n)}\) recursively by
\[\eta_1^{(0)} = Z \qquad\text{with}\quad Z\sim N(0,1) \quad\text{and independent of the \(\xi\)’s}\]
\[ \eta_{2k}^{(n+1)} = \frac12\eta_{k}^{(n)} -\frac12 \xi_{k}^{(n)}\]
\[ \eta_{2k-1}^{(n+1)} = \frac12\eta_{k}^{(n)} +\frac12 \xi_{k}^{(n)}\]
For any time \(t \in [0,1]\) of the form \(t=k 2^{-n}\) define
\[W^{(n)}_t = \sum_{j=1}^k \eta_{j}^{(n)}\]
For \(t \in [0,1]\) not of this form, define \(W^{(n)}_t\) by linear interpolation between the two nearest defined points.
- Following the steps below, show that for fixed \(n\), \(W^{(n)}_t\) is a random walk on \(\mathbf R\) with Gaussian steps.
- Show \(\mathbf E \eta_{k}^{(n)} = 0\) and \(\mathbf E \big[ (\eta_{k}^{(n)})^2\big] = 2^{-n}\)
- Argue that \(\eta_{k}^{(n)} \) is Gaussian and that for any fixed \(n\),
\[ \{ \eta_{k}^{(n)} : k=1,\dots, 2^n\} \]
are a collection of mutually independent random variables. (To show independence show that they are mean zero Gaussians with correlation \(\mathbf E [\eta_{k}^{(n)}\eta_{j}^{(n)}]=0\) when \(j\neq k\).)
- To understand the relationship between \(W^{(n)}\) and \(W^{(n+1)}\), simulate a collection of random \(\xi_k^{(n)}\) and plot \[W^{(0)}, W^{(1)}, W^{(2)}, W^{(3)}, W^{(4)}\]
over the time interval \([0,1]\). Notice that as \(n\) increases the functions seem to converge. Try a few different realizations to get a feeling for how the limiting function might look. (One possible starting point is sketched after this list.)
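One possible sketch of the simulation in Python (the recursion is exactly the one defined above; the seed and the number of levels are arbitrary choices):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
N = 4  # finest refinement level to plot

eta = np.array([rng.normal()])       # level 0: eta_1^(0) = Z ~ N(0,1)
levels = [eta]
for n in range(N):
    # xi_k^(n), k = 1..2^n, each with variance 2^{-n}
    xi = rng.normal(scale=2 ** (-n / 2), size=2 ** n)
    finer = np.empty(2 ** (n + 1))
    finer[0::2] = 0.5 * eta + 0.5 * xi   # eta_{2k-1}^{(n+1)}
    finer[1::2] = 0.5 * eta - 0.5 * xi   # eta_{2k}^{(n+1)}
    eta = finer
    levels.append(eta)

for n, e in enumerate(levels):
    t = np.linspace(0, 1, 2 ** n + 1)
    W = np.concatenate([[0.0], np.cumsum(e)])  # W^{(n)} at t = k 2^{-n}
    plt.plot(t, W, label=f"n = {n}")           # plot interpolates linearly
plt.legend(); plt.xlabel("t")
plt.show()
```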
Ito Moments
Use Ito’s formula to show that if \(\sigma(t,\omega)\) is a bounded nonanticipating functional, \(|\sigma|\leq M\), then for the stochastic integral \(I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\) we have the moment estimates
\[
\mathbf E\{ |I(t)|^{2p}\}\leq 1\cdot 3\cdot 5 \cdots (2p-1) (M^2 t)^p
\]
for \(p=1,2,3,…\). Follow the steps below to achieve this result.
- First assume that
\[\mathbf E \int_0^t |I_s|^{k} \sigma_s dB_s =0\]
for all positive integers \(k\). Under this assumption, prove the result. Why don’t we know a priori that this expectation is zero?
- Now, for positive \(L\), define \(\chi^{(p)}_L(x)\) to be \(x^p\) for \(|x| < L\), to be \(0\) for \(|x| > L+1\), and connected monotonically in between so that the whole function is smooth. Then define
\[\psi^{(p)}_L(x) = p\int_0^x \chi^{(p-1)}_L(y) dy\qquad\text{and}\qquad\phi^{(p)}_L(x)= p\int_0^x \psi^{(p-1)}_L(y) dy\]
Observe that all three of these functions are globally bounded for a given \(L\). Apply Ito’s formula to \(\phi^{(p)}_L(I_t)\) and use the fact that Fatou’s lemma implies
\[\mathbf E |I_t|^{2p} \leq \lim_{L \rightarrow \infty} \mathbf E \phi^{(2p)}_L(I_t)\]
to prove the estimate stated at the start, following an induction step similar to the one used above.
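For reference, the induction step one expects from Ito's formula applied to \(I_t^{2p}\) is (a sketch, taking the expectation of the stochastic-integral term to vanish):
\begin{align*}
\mathbf E\, |I_t|^{2p} = p(2p-1)\, \mathbf E\int_0^t I_s^{2p-2}\sigma_s^2\, ds \leq p(2p-1)M^2 \int_0^t \mathbf E\, |I_s|^{2p-2}\, ds\ ;
\end{align*}
inserting the inductive bound \(\mathbf E\, |I_s|^{2p-2}\leq 1\cdot 3\cdots(2p-3)(M^2 s)^{p-1}\) and integrating \(s^{p-1}\) produces exactly the factor \((2p-1)\) needed for the claimed estimate.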
Ito to Stratonovich
Let’s think about different ways to make sense of \[\int_0^t W(s)dW(s)\] where \(W(t)\) is a standard Brownian motion. Fix any \(\alpha \in [0,1]\) and define
\begin{equation*}
I_N^\alpha(t)=\sum_{j=0}^{N-1} W(t_j^\alpha)[W(t_{j+1})-W(t_j)]
\end{equation*}
where \(t_j=\frac{j t}N\) and \(t_j^\alpha=\alpha t_j + (1-\alpha)t_{j+1}\).
Calculate
- \[\lim_{N\rightarrow \infty}\mathbf E I_N^\alpha(t) \ .\]
- * \[\lim_{N\rightarrow \infty}\mathbf E \big( I_N^\alpha(t)\big)^2\]
- * For which choice of \(\alpha\) is \(I_N^\alpha(t)\) a martingale?
What choice of \(\alpha\) gives the standard Itô integral? What choice gives the Stratonovich integral?
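The one identity that drives all of these computations is the Brownian covariance (a standard fact, stated as a reminder rather than a solution):
\[
\mathbf E\big[W(s)W(u)\big]=\min(s,u)\ ,\qquad\text{hence}\qquad \mathbf E\big[W(s)\big(W(u)-W(v)\big)\big]=\min(s,u)-\min(s,v)\ .
\]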
Calculating with Brownian Motion
Let \(W_t\) be a standard Brownian motion. Fixing an integer \(n\) and a terminal time \(T >0\), let \(\{t_i\}_{i=0}^n\) be a partition of the interval \([0,T]\) with
\[0=t_0 < t_1< \cdots< t_{n-1} < t_n=T\]
Calculate the following two expressions:
- \[ \mathbf{E} \Big(\sum_{k=1}^n W_{t_k} \big[ W_{t_{k}} - W_{t_{k-1}} \big] \Big)\]
Hint: you might want to do the second part of the problem first and then return to this question and write
\[W_{t_k} \big[ W_{t_{k}} - W_{t_{k-1}} \big]= W_{t_{k-1}} \big[ W_{t_{k}} - W_{t_{k-1}} \big]+ \big[W_{t_{k}} -W_{t_{k-1}}\big]\big[ W_{t_{k}} - W_{t_{k-1}}\big]\]
- \[ \mathbf{E} \Big(\sum_{k=1}^n W_{t_{k-1}} \big[ W_{t_{k}} - W_{t_{k-1}} \big] \Big)\]
Simple Numerical Exercise
Let \(\omega_i\), \(i=1,2,\dots\), be a collection of mutually independent random variables, each uniform on \([0,1]\). Define
\[\eta_i(\omega)= \omega_i -\frac12\]
and
\[X_n(\omega) = \sum_{i=1}^n \eta_i(\omega)\,.\]
- What is \(\mathbf{E}\,X_n\)?
- What is \(\mathrm{Var}(X_n)\)?
- What is \(\mathbf{E}(X_{n+k} \mid X_n)\) for \(n, k >0\)?
- What is \(\mathbf{E}(X_5^2 \mid X_3)\)?
- [optional] Write a computer program to simulate some realizations of this process, viewing \(n\) as time. Make some plots of \(n\) vs \(X_n\). (A minimal sketch follows this list.)
- [optional] How do your simulations agree with the first two parts?
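A minimal sketch of the optional simulation (the seed and the sizes are arbitrary choices; the printed check uses \(\mathrm{Var}(\eta_i)=1/12\) for a uniform variable on \([0,1]\)):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
n_steps, n_paths = 200, 10_000
eta = rng.uniform(size=(n_paths, n_steps)) - 0.5   # eta_i = omega_i - 1/2
X = np.cumsum(eta, axis=1)                         # X_n = eta_1 + ... + eta_n

for path in X[:5]:                                 # plot a handful of realizations
    plt.plot(np.arange(1, n_steps + 1), path)
plt.xlabel("n"); plt.ylabel("X_n")
plt.show()

# compare with the first two parts: E X_n = 0 and Var(X_n) = n/12
print("mean:", X[:, -1].mean(), " var:", X[:, -1].var(), " n/12 =", n_steps / 12)
```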
Moment Bounds on Ito Integrals
Use Ito’s formula to show the following. Let \(\sigma(t,\omega)\) be a
nonanticipating random function which is bounded; that is to say,
\[ |\sigma(t,\omega)|\leq M\]
for all \(t \geq 0\) and all \(\omega\).
- Under this assumption show that the stochastic integral
\[I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\]
satisfies the following moment estimates
\[\mathbf E\{ |I(t)|^{2p}\}\leq 1\cdot 3\cdot 5 \cdots (2p-1) (M^2 t)^p\]
for \(p=1,2,3,…\) if one assumes
\[ \mathbf E \int_0^t |I(s)|^k \sigma(s) dB(s) =0\]
for any integer \(k\).
- Prove the above result without assuming that
\[ \mathbf E \int_0^t |I(s)|^k \sigma(s) dB(s) =0\]
since this requires that
\[ \mathbf E \int_0^t |I(s)|^{2k} \sigma^2(s) ds < \infty\]
which we do not know a priori.