Category Archives: Stochastic Calculus

Gaussian Ito Integrals

In this problem, we will show that the Ito integral of a deterministic function is a Gaussian random variable.

Let \(\phi\) be a deterministic elementary function. In other words, there exist real numbers \(\{c_k : k=1,2,\dots,N\}\) so that

\[ \sum_{k=1}^N c_k^2 < \infty\]

and there exists a partition

\[0=t_0 < t_1< t_2 <\cdots<t_N=T\]

so that

\[ \phi(t) = \sum_{k=1}^N c_k \mathbf{1}_{[t_{k-1},t_k)}(t) \]


  1. Show that if \(W(t)\) is a standard Brownian motion then the Ito integral
    \[ \int_0^T \phi(t) dW(t)\]
    is a Gaussian random variable with mean zero and variance
    \[ \int_0^T \phi(t)^2 dt \]
  2. * Let \(f\colon [0,T] \rightarrow \mathbf R\) be a deterministic function such that
    \[\int_0^T f(t)^2 dt < \infty\]
    Then it can be shown that there exists a sequence of  deterministic elementary functions \(\phi_n\) as above such that
    \[\int_0^T (f(t)-\phi_n(t))^2 dt \rightarrow 0\qquad\text{as}\qquad n \rightarrow \infty\]
    Assuming this fact, let \(\psi_n\) be the characteristic function  of the random variable
    \[ \int_0^T \phi_n(t) dW(t)\]
    Show that for all \(\lambda \in \mathbf R\),
    \[ \lim_{n \rightarrow \infty} \psi_n(\lambda) = \exp \Big( -\frac{\lambda^2}2 \big( \int_0^T f(t)^2 dt \big)  \Big)\]
    Then use the convergence result here to conclude that
    \[ \int_0^T f(t) dW(t)\]
    is a Gaussian Random Variable with mean zero and variance
    \[\int_0^T f(t)^2 dt \]
    by identifying the limit of the characteristic functions above.

    Note: When probabilists say the “characteristic function” of a random variable they just mean the Fourier transform of its distribution. See here.
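
Before proving this, one can check the claim numerically. Since \(\phi\) is elementary, the Ito integral is exactly the finite sum \(\sum_k c_k [W(t_k)-W(t_{k-1})]\), so it can be sampled without discretization error. A minimal NumPy sketch, with a made-up partition and coefficients:

    import numpy as np

    rng = np.random.default_rng(0)

    # hypothetical partition of [0, T] and coefficients c_k, made up for illustration
    t = np.array([0.0, 0.3, 0.7, 1.0])
    c = np.array([2.0, -1.0, 0.5])

    # For elementary phi the Ito integral equals sum_k c_k (W(t_k) - W(t_{k-1})),
    # a finite sum of independent Gaussian increments, so we sample it exactly.
    n_samples = 100_000
    dW = rng.normal(0.0, np.sqrt(np.diff(t)), size=(n_samples, len(c)))
    I = dW @ c

    print("sample mean:       ", I.mean())                   # claim: near 0
    print("sample variance:   ", I.var())
    print("predicted variance:", np.sum(c**2 * np.diff(t)))  # = int_0^T phi(t)^2 dt
    print("sample kurtosis:   ", np.mean((I - I.mean())**4) / I.var()**2)  # near 3 for a Gaussian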

Solving a class of SDEs

Let us try a systematic procedure which works for a whole class of SDEs. Let
\begin{align*}
X(t)=a(t)\left[ x_0 + \int_0^t b(s) dB(s) \right] +c(t) \ .
\end{align*}
Assuming \(a\), \(b\), and \(c\) are differentiable, use Ito’s formula to find the equation for \(dX(t)\) of the form
\begin{align*}
dX(t)=[ F(t) X(t) + H(t)] dt + G(t)dB(t)
\end{align*}
where \(F(t)\), \(G(t)\), and \(H(t)\) are functions of time depending on \(a\), \(b\), \(c\), and possibly their derivatives. Solve the following equations by matching the coefficients. Let \(\alpha\), \(\gamma\) and \(\beta\) be fixed numbers.

Notice that
\begin{align*}
X(t)=a(t)\left[ x_0 + \int_0^t b(s) dB(s) \right] +c(t)=u(t,Y(t)) \ ,
\end{align*}
where \(dY(t)=b(t) dB(t)\) and \(u(t,y)=a(t)[x_0+y]+c(t)\). (We write \(u\) rather than \(F\), since \(F\) is already taken above.) Then you can apply Ito’s formula to this representation to find \(dX(t)\), as sketched below.
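
For concreteness, here is a sketch of that computation (assuming \(a(t)\neq 0\)). Since \(u(t,y)\) is linear in \(y\), the second-order term in Ito’s formula vanishes, and
\begin{align*}
dX(t) &= \frac{\partial u}{\partial t}(t,Y(t))\,dt + \frac{\partial u}{\partial y}(t,Y(t))\,dY(t)\\
&= \big[a'(t)\big(x_0 + Y(t)\big) + c'(t)\big]\,dt + a(t)b(t)\,dB(t)\\
&= \Big[\frac{a'(t)}{a(t)}\big(X(t)-c(t)\big) + c'(t)\Big]\,dt + a(t)b(t)\,dB(t) \ ,
\end{align*}
from which \(F\), \(G\), and \(H\) can be read off; the coefficient matching in the problems below remains to be done.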

  1. First consider
    \[dX_t = (-\alpha X_t + \gamma) dt + \beta dB_t\]
    with \(X_0 = x_0\). Solve this for \( t \geq 0\).
  2. Now consider
    \[dY(t)=\frac{\beta-Y(t)}{1-t} dt + dB(t) ~,~~ 0\leq t < 1 ~,~~Y(0)=\alpha.\]
    Solve this for \( t\in[0,1] \).
  3. \begin{align*}
    dX_t = -2 \frac{X_t}{1-t} dt + \sqrt{2 t(1-t)} dB_t ~,~~X(0)=\alpha
    \end{align*}
    Solve this for \( t\in[0,1] \).
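
One can sanity-check the first equation numerically before solving it: taking expectations in the SDE gives \(m'(t) = -\alpha m(t) + \gamma\) for \(m(t)=\mathbf E X_t\), since the \(dB\) term has mean zero. A minimal Euler-Maruyama sketch in NumPy, with made-up parameter values:

    import numpy as np

    rng = np.random.default_rng(1)

    # made-up parameters for problem 1
    alpha, gamma, beta, x0 = 1.5, 0.7, 0.4, 2.0
    T, n_steps, n_paths = 1.0, 1_000, 20_000
    dt = T / n_steps

    # Euler-Maruyama for dX = (-alpha X + gamma) dt + beta dB
    X = np.full(n_paths, x0)
    for _ in range(n_steps):
        X += (-alpha * X + gamma) * dt + beta * rng.normal(0.0, np.sqrt(dt), n_paths)

    # solution of m' = -alpha m + gamma with m(0) = x0
    m_T = gamma / alpha + (x0 - gamma / alpha) * np.exp(-alpha * T)
    print("Monte Carlo mean of X(T):", X.mean())
    print("ODE prediction:          ", m_T)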


Homogeneous Martingales and BDG Inequality

Part I

  1. Let \(f:\mathbb{R}^2 \rightarrow \mathbb{R}\) be twice differentiable in both \(x\) and \(y\). Let \(M(t)\) be defined by \[M(t)=\int_0^t \sigma(s,\omega) dB(s,\omega)\] where \(B(t)\) is standard Brownian Motion. Assume that \(\sigma(t,\omega)\) is adapted and that \(\mathbb{E}\, M(t)^2 < \infty\) for all \(t\). Let \(\langle M \rangle(t)\) be the quadratic variation process of \(M(t)\). Assuming that \(\mathbf E\int_0^t \sigma(s,\omega)^2 ds < \infty\), what equation does \(f\) have to satisfy so that \(Y(t)=f(M(t),\langle M \rangle(t))\) is again a martingale?
  2. Set
    \begin{align*}
    f_n(x,y) = \sum_{0 \leq m \leq \lfloor n/2 \rfloor} C_{n,m} x^{n-2m}y^m
    \end{align*}
    where \(\lfloor n/2 \rfloor\) is the largest integer less than or equal to \(n/2\). Set \(C_{n,0}=1\) for all \(n\). Then find a recurrence relation for \(C_{n,m+1}\) in terms of \(C_{n,m}\) so that \(Y(t)=f_n(B(t),t)\) will be a martingale. Write out explicitly \(f_1(B(t),t), \cdots, f_4(B(t),t)\) as defined in the previous item.

Part II

Now consider \(I(t)\) defined by \[I(t)=\int_0^t \sigma(s,\omega)dB(s,\omega)\] where \(\sigma\) is adapted and \(|\sigma(t,\omega)| \leq K\) for all \(t\) with probability one. In light of the above let us set
\begin{align*}Y(t,\omega)=I(t)^4 - 6 I(t)^2\langle I \rangle(t) + 3 \langle I \rangle(t)^2 \ .\end{align*}
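
Before starting, it may help to check symbolically that the polynomial above fits the pattern of Part I. A minimal sympy sketch, assuming the condition found in Part I turns out to be the heat-type equation \(f_y + \frac12 f_{xx} = 0\) (adjust if your answer differs):

    import sympy as sp

    x, y = sp.symbols("x y")
    f4 = x**4 - 6 * x**2 * y + 3 * y**2   # the combination defining Y above

    # assumed martingale condition from Part I: f_y + (1/2) f_xx = 0
    residual = sp.diff(f4, y) + sp.Rational(1, 2) * sp.diff(f4, x, 2)
    print(sp.simplify(residual))          # prints 0 if the condition holds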

  1. Quote the problem “Ito Moments” to show that \(\mathbb{E}\{ |Y(t)|^2\} < \infty\) for all \(t\). Then use the first part of this problem to conclude that \(Y\) is a martingale.
  2. Show that \[\mathbb{E}\{ I(t)^4 \} \leq 6 \mathbb{E} \big\{ I(t)^2\langle I \rangle(t) \big\}\]
  3. Recall the Cauchy-Schwarz inequality. In our language it states that
    \begin{align*}
    \mathbb{E} \{AB\} \leq (\mathbb{E}\{A^2\})^{1/2} (\mathbb{E}\{B^2\})^{1/2}
    \end{align*}
    Combine this with the previous inequality to show that
    \begin{align*}\mathbb{E}\{ I(t)^4 \} \leq 36 \mathbb{E} \big\{\langle I \rangle(t)^2 \big\} \end{align*}
  4. As discussed in class \(I^4\) is a submartingale (because \(x \mapsto x^4\) is convex). Use the Kolmogorov-Doob inequality and all that we have just derived to show that
    \begin{align*}
    \mathbb{P}\left\{ \sup_{0\leq s \leq T}|I(s)|^4 \geq \lambda \right\} \leq ( \text{const}) \frac{ \mathbb{E}\left( \int_0^T \sigma(s,\omega)^2 ds\right)^2 }{\lambda}
    \end{align*}


Associated PDE

Show the following:

  1. If \[I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\] is a stochastic integral, then \[I^2(t)-\int_0^t \sigma^2(s)ds\] is a martingale.
  2. What equation must \(u(t,x)\) satisfy so that
    \[ t \mapsto u(t,B(t))e^{\int_0^t V(B(s))ds} \]
    is a martingale? Here \(V\) is a bounded function. Hint: Set \(Y(t)=\int_0^t V(B(s))ds\) and apply Ito’s formula to \(Z(t,B(t),Y(t))=u(t,B(t))\exp(Y(t))\).

Exponential Martingale Bound

Let \(\sigma(t,\omega)\) be nonanticipating with \(|\sigma(t,\omega)| < M\) for some bound \(M\). Let \(I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\). Use the exponential martingale \[\exp\big\{\alpha I(t)-\frac{\alpha^2}{2}\int_0^t \sigma^2(s)ds \big\}\] (see the problem here) and the Kolmogorov-Doob inequality to get the estimate
\[
P\Big\{ \sup_{0\leq t\leq T}|I(t)| \geq \lambda \Big\}\leq 2
\exp\left\{\frac{-\lambda^2}{2M^2 T}\right\}
\]
First express the event of interest in terms of the exponential martingale, then use the Kolmogorov-Doob inequality and after this choose the parameter \(\alpha\) to get the best bound.
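
As a numerical sanity check, one can take \(\sigma\) identically equal to its bound \(M\) (so that \(I(t)=MB(t)\) can be simulated exactly on a grid) and compare the empirical tail probability with the claimed bound. A minimal NumPy sketch, with made-up values of \(M\), \(T\) and \(\lambda\):

    import numpy as np

    rng = np.random.default_rng(2)

    # made-up values; sigma identically M is the extreme case of the hypothesis
    M, T, lam = 1.0, 1.0, 2.0
    n_steps, n_paths = 1_000, 10_000
    dt = T / n_steps

    # with sigma constant, I(t) = M B(t); simulate B on a grid
    B = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
    sup_I = np.max(np.abs(M * B), axis=1)

    print("empirical P{ sup |I| >= lambda }:", np.mean(sup_I >= lam))
    print("claimed bound 2 exp(-lambda^2/(2 M^2 T)):", 2 * np.exp(-lam**2 / (2 * M**2 * T)))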

Ballistic Growth

Consider the SDE
\[
dX(t)=b(X(t))dt +\sigma(X(t))dB(t)
\]
with \(b(x)\to b_0 >0\) as \(x\to\infty\) and with \(\sigma\) bounded and positive. Suppose that \(b\) and \(\sigma\) are such that
\[\lim_{t\to\infty}X(t)=\infty\] with probability one for any starting point. Show that
\[
P_x\Big\{\lim_{t\to\infty}\frac{X(t)}{b_0 t}=1\Big\}=1 \ .
\]
From
\[
X(t)=x+\int_0^{t}b(X(s))ds +\int_0^{t}\sigma(X(s))dB(s)
\]
and the hypotheses, note that the result follows from showing that
\begin{align*}
\mathbf P_x\Big\{\lim_{t\to\infty}\frac{1}{t}\int_0^{t}\sigma(X(s))dB(s)=0\Big\}=1 \ .
\end{align*}

There are a number of ways of thinking about this. In the end they all come down to essentially the same calculations. One way is to show that for some fixed \(\delta \in(0,1)\) the following statement holds with probability one:

There exists a constant \(C(\omega)\) so that
\begin{align*}
\Big|\int_0^{t}\sigma(X(s))dB(s)\Big| \leq Ct^\delta
\end{align*}
for all \(t >0\).

To show this, partition \([0,\infty)\) into blocks and use the Doob-Kolmogorov inequality to estimate the probability that the maximum of \( \int_0^{t}\sigma(X(s))dB(s)\) on each block exceeds \(t^\delta\) on that block. Then use the Borel-Cantelli lemma to show that this happens only a finite number of times.

A different way to organize the same calculation is to estimate
\[
\mathbf P_x\Big\{\sup_{t>a}\frac{1}{t}|\int_0^t \sigma(X(s))dB(s)|>\epsilon\Big\}
\]
by breaking the interval \(t>a\) into the union of intervals of the form \(a2^k <t\leq a2^{k+1}\) for \(k=0,1,\dots\) and using the Doob-Kolmogorov martingale inequality. Then let \(a\to\infty\).
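
For instance, writing \(M_t=\int_0^t \sigma(X(s))dB(s)\) and letting \(\|\sigma\|_\infty\) denote the bound on \(\sigma\), the estimate on the \(k\)-th block might take the following form (a sketch using the Doob-Kolmogorov inequality and the Ito isometry):
\begin{align*}
\mathbf P_x\Big\{\sup_{a2^k < t\leq a2^{k+1}}\frac{1}{t}|M_t|>\epsilon\Big\}
\leq \mathbf P_x\Big\{\sup_{0\leq t\leq a2^{k+1}}|M_t|>\epsilon\, a2^k\Big\}
\leq \frac{\mathbf E_x M_{a2^{k+1}}^2}{\epsilon^2 a^2 2^{2k}}
\leq \frac{2\|\sigma\|_\infty^2}{\epsilon^2 a 2^{k}} \ .
\end{align*}
Summing over \(k\) gives a bound of order \(1/(\epsilon^2 a)\), which tends to zero as \(a\to\infty\).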

Entry and Exit through boundaries

Consider the following one dimensional SDE.
\begin{align*}
dX_t&= \cos( X_t )^\alpha dW(t)\\
X_0&=0
\end{align*}
Consider the equation for \(\alpha >0\). On what interval do you expect to find the solution at all times? Classify the behavior at the boundaries.

For what values of \(\alpha < 0\) does it seem reasonable to define the process? Any? Justify your answer.

Martingale Exit from an Interval – I

Let \(\tau\) be the first time that a continuous martingale \(M_t\) starting from \(x\) exits the interval \((a,b)\), with \(a<x<b\). In all of the following, we assume that \(\mathbf P(\tau < \infty)=1\). Let \(p=\mathbf P_x\{M(\tau)=a\}\).

Find an analytic expression for \(p\):

  1. For this part assume that \(M_t\) is the solution to a time homogeneous SDE. That is, \[dM_t=\sigma(M_t)dB_t\] with \(\sigma\) bounded and smooth. What PDE should you solve to find \(p\)? With what boundary data? Assume for a moment that \(M_t\) is standard Brownian motion (\(\sigma=1\)). Solve the PDE you mentioned above in this case.
  2. A probabilistic way of thinking: Return to a general martingale \(M_t\). Let us assume that \(dM_t=\sigma(t,\omega)dB_t\), again with \(\sigma\) smooth and uniformly bounded from above and away from zero. Assume that \(\tau < \infty\) almost surely and notice that \[\mathbf E_x M(\tau)=a \mathbf P_x\{M_\tau=a\} + b \mathbf P_x\{M_\tau=b\}.\] Of course the process has to exit through one side or the other, so \[\mathbf P_x\{M_\tau=a\} = 1 - \mathbf P_x\{M_\tau=b\}.\] Use all of these facts and the Optional Stopping Theorem to derive the equation for \(p\).
  3. Return to the case when \[dM_t=\sigma(M_t)dB_t\] with \(\sigma\) bounded and smooth. Write down the equations that \(v(x)= \mathbf E_x\{\tau\}\), \(w(x,t)=\mathbf P_x\{ \tau >t\}\), and \(u(x)=\mathbf E_x\{e^{-\lambda\tau}\}\) with \(\lambda > 0\) satisfy. (For extra credit: solve them for \(M_t=B_t\) in this one dimensional setting and see what happens as \(b \rightarrow \infty\).)

Around the Circle

Consider the equation
\begin{align*}
dX_t &= -Y_t dB_t - \frac12 X_t dt\\
dY_t &= X_t dB_t - \frac12 Y_t dt
\end{align*}
Let \((X_0,Y_0)=(x,y)\) with \(x^2+y^2=1\). Show that \(X_t^2 + Y_t^2 =1\) for all \(t\), and hence the SDE lives on the unit circle. Does this make intuitive sense?
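
A quick numerical check is possible even though the Euler-Maruyama scheme does not preserve the circle exactly: the defect \(|X_t^2+Y_t^2-1|\) is pure discretization error and should shrink with the step size. A minimal NumPy sketch:

    import numpy as np

    rng = np.random.default_rng(3)

    def max_circle_defect(n_steps, T=1.0):
        """Run Euler-Maruyama once; return the largest |X^2 + Y^2 - 1| observed."""
        dt = T / n_steps
        x, y, worst = 1.0, 0.0, 0.0   # start on the unit circle
        for _ in range(n_steps):
            dB = rng.normal(0.0, np.sqrt(dt))
            x, y = x - y * dB - 0.5 * x * dt, y + x * dB - 0.5 * y * dt
            worst = max(worst, abs(x * x + y * y - 1.0))
        return worst

    # the defect shrinks as the step size decreases, consistent with the claim
    for n in (100, 1_000, 10_000):
        print(f"{n:>6} steps: max |X^2 + Y^2 - 1| = {max_circle_defect(n):.4f}")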

Shifted Brownian Motion and a PDE

Let \(f \in C_0^2(\mathbf R^n)\) and \(\alpha(x)=(\alpha_1(x),\dots,\alpha_n(x))\) with \(\alpha_i \in C_0^2(\mathbf R^n)\) be given functions and consider the partial differential equations
\begin{align*}
\frac{\partial u}{\partial t} &= \sum_{i=1}^n \alpha_i(x)
\frac{\partial u}{\partial x_i} + \frac{1}{2} \frac{\partial^2
u}{\partial x_i^2} \ \text{ for } t >0 \text{ and }x\in \mathbf R^n \\
u(0,x)&=f(x) \ \text{ for } \ x \in \mathbf R^n
\end{align*}
Use the Girsanov theorem to show that the unique bounded solution \(u(t,x)\) of this equation can be expressed by
\begin{align*}
u(t,x) = \mathbf E_x \left[ \exp\left(\int_0^t \alpha(B(s))\cdot dB(s) -
\frac{1}{2}\int_0^t |\alpha(B(s))|^2 ds \right)f(B(t))\right]
\end{align*}

where \(\mathbf E_x\) is the expectation with respect to \(\mathbf P_x\), the law of Brownian Motion started at \(x\). (Note: there may be a sign error in the exponential term above. Use whichever sign is right.) For the remainder, assume that \(\alpha\)
is a fixed constant \(\alpha_0\). Now, using what you know about the distribution of \(B_t\), write the solution to the above equation as an integral kernel integrated against \(f\). (In other words, write \(u(t,x)\) so that your friends who don't know any probability might understand it, i.e. \(u(t,x)=\int K(x,y,t)f(y)dy\) for some \(K(x,y,t)\).)

Probability Bridge

For fixed \(\alpha\) and \(\beta\) consider the stochastic differential equation
\[
dY(t)=\frac{\beta-Y(t)}{1-t} dt + dB(t) ~,~~ 0\leq t < 1 ~,~~Y(0)=\alpha.
\]
Verify that \(\lim_{t\to 1}Y(t)=\beta\) with probability one. (This is called the Brownian bridge from \(\alpha\) to \(\beta\).)
Hint: In the problem “Solving a class of SDEs”, you found that this equation has the solution
\begin{equation*}
Y_t = \alpha(1-t) + \beta t + (1-t)\int_0^t \frac{dB_s}{1-s} \quad 0 \leq t <1\; .
\end{equation*}
To answer the question show that
\begin{equation*}
\lim_{t \rightarrow 1^-} (1-t) \int_0^t\frac{dB_s}{1-s} =0 \quad \text{a.s.}
\end{equation*}
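
A simulation makes the convergence plausible. A minimal NumPy sketch integrating the SDE up to just before \(t=1\), where the drift blows up, with made-up endpoints \(\alpha\) and \(\beta\):

    import numpy as np

    rng = np.random.default_rng(4)

    alpha, beta = -1.0, 2.0      # made-up endpoints for the bridge
    n_paths, n_steps = 10, 100_000
    eps = 1e-6                   # stop just short of t = 1
    dt = (1.0 - eps) / n_steps

    y, t = np.full(n_paths, alpha), 0.0
    for _ in range(n_steps):
        y += (beta - y) / (1.0 - t) * dt + rng.normal(0.0, np.sqrt(dt), n_paths)
        t += dt

    print("Y at t = 1 - 1e-6:", np.round(y, 3))   # every entry should be near beta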

Making the Cube of Brownian Motion a Martingale

Let \(B_t\) be a standard one dimensional Brownian
Motion. Find the function \(F:\mathbf{R}^5 \rightarrow \mathbf R\) so that
\begin{align*}
B_t^3 – F\Big(t,B_t,B_t^2,\int_0^t B_s ds, \int_0^t B_s^2 ds\Big)
\end{align*}
is a martingale.

Hint: It might be useful to introduce the processes
\[X_t=B_t^2\qquad Y_t=\int_0^t B_s ds \qquad Z_t=\int_0^t B_s^2 ds\]

Correlated SDEs

Let \(B_t\) and \(W_t\) be standard Brownian motions which are
independent. Consider
\begin{align*}
dX_t&= (-X_t +1)dt + \rho dB_t + \sqrt{1-\rho^2} dW_t\\
dY_t&= -Y_t dt + dB_t \ .
\end{align*}
Find the covariance \(\text{Cov}(X_t,Y_t)=\mathbf{E} (X_t Y_t) - \mathbf{E} (X_t) \mathbf{E}( Y_t)\).

Hyperbolic SDE

Consider
\begin{align*}
dX_t=& Y_t dB_t + \frac12 X_t dt\\
dY_t=& X_t dB_t + \frac12 Y_t dt
\end{align*}
Show that \(X_t^2-Y_t^2\) is constant for all \(t\).

Diffusion and Brownian motion

Let \(B_t\) be a standard Brownian Motion  starting from zero and define

\[ p(t,x) = \frac1{\sqrt{2\pi t}}e^{-\frac{x^2}{2t} } \]

Given any \(x \in \mathbf R \), define \(X_t=x + B_t\). Of course \(X_t\) is just a Brownian Motion starting from \(x\) at time 0. Fixing a smooth, bounded, compactly supported function \(f:\mathbf R \rightarrow \mathbf R\), we define the function \(u(x,t)\) by

\[u(x,t) = \mathbf E_x f(X_t)\]

where we have decorated the expectation with the subscript \(x\) to remind us that we are starting from the point \(x\).

  1. Explain why \[ u(x,t) = \int_{-\infty}^\infty f(y)p(t,x-y)dy\]
  2. Show by direct calculation using the formula from the previous question that for \(t>0\), \(u(x,t)\) satisfies the diffusion equation
    \[ \frac{\partial u}{\partial t}= c\frac{\partial^2 u}{\partial x^2}\]
    for some constant \(c\). (Find the correct \(c\) !)
  3. Again using the formula from part 1, show that
    \[ \lim_{t \rightarrow 0} u(x,t) = f(x)\]
    and hence the initial condition for the diffusion equation is \(f\).

Levy’s construction of Brownian Motion

Let \( \{ \xi_k^{(n)} : n =0,1,\dots ; k =1,\dots,2^n\} \) be a collection of independent Gaussian random variables, with \(\xi_k^{(n)}\) having mean zero and variance \(2^{-n}\). Define the random variables \( \eta_k^{(n)}\) recursively by

\[\eta_1^{(0)} = Z \qquad\text{with}\quad Z\sim N(0,1) \quad\text{and independent of the \(\xi\)’s}\]

\[ \eta_{2k}^{(n+1)} = \frac12\eta_{k}^{(n)} -\frac12 \xi_{k}^{(n)}\]

\[ \eta_{2k-1}^{(n+1)} = \frac12\eta_{k}^{(n)} +\frac12 \xi_{k}^{(n)}\]

For any time \(t \in [0,1]\) of the form \(t=k 2^{-n}\) define

\[W^{(n)}_t = \sum_{j=1}^k  \eta_{j}^{(n)}\]

For \(t \in [0,1]\) not of this form we connect the two nearest defined points with a line.

  1. Follow the given steps to show that for fixed \(n\), \(W^{(n)}_t\) is a random walk on \(\mathbf R\) with Gaussian steps.
    1. Show \(\mathbf E \eta_{k}^{(n)} = 0\) and  \(\mathbf E \big[ (\eta_{k}^{(n)})^2\big] = 2^{-n}\)
    2. Argue that \(\eta_{k}^{(n)} \) is Gaussian and that for any fixed \(n\),
      \[ \{ \eta_{k}^{(n)} : k=1,\dots, 2^n\} \]
      are a collection of mutually independent random variables. (To show independence, show that they are jointly Gaussian with mean zero and correlation \(\mathbf E [\eta_{k}^{(n)}\eta_{j}^{(n)}]=0\) when \(j\neq k\).)
  2. To understand the relationship between \(W^{(n)}\) and \(W^{(n+1)}\), simulate a collection of random \(\xi_k^{(n)}\) and plot \[W^{(0)}, W^{(1)}, W^{(2)}, W^{(3)}, W^{(4)}\]
    over the time interval \([0,1]\). Notice that as \(n\) increases the functions seem to converge. Try a few different realizations to get a feeling for how the limiting function might look. (A simulation sketch follows below.)
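
Here is a minimal NumPy/matplotlib sketch of that simulation, following the recursion above (the array index \(k-1\) plays the role of the label \(k\)):

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(5)
    n_max = 4

    # xi[n][k-1] ~ N(0, 2^{-n}) for k = 1, ..., 2^n
    xi = [rng.normal(0.0, np.sqrt(2.0 ** -n), size=2 ** n) for n in range(n_max)]

    eta = [np.array([rng.normal()])]            # eta^{(0)}_1 = Z ~ N(0, 1)
    for n in range(n_max):
        nxt = np.empty(2 ** (n + 1))
        nxt[0::2] = 0.5 * eta[n] + 0.5 * xi[n]  # eta^{(n+1)}_{2k-1}
        nxt[1::2] = 0.5 * eta[n] - 0.5 * xi[n]  # eta^{(n+1)}_{2k}
        eta.append(nxt)

    for n in range(n_max + 1):
        t = np.linspace(0.0, 1.0, 2 ** n + 1)
        W = np.concatenate([[0.0], np.cumsum(eta[n])])   # W^{(n)} at t = k 2^{-n}
        plt.plot(t, W, label=f"$W^{{({n})}}$")           # linear in between
    plt.xlabel("t")
    plt.legend()
    plt.show()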


Ito Moments

Use Ito’s formula to show that if \(\sigma(t,\omega)\) is a bounded nonanticipating functional, \(|\sigma|\leq M\), then for the stochastic integral \(I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\) we have the moment estimates
\[
\mathbf E\{ |I(t)|^{2p}\}\leq 1\cdot 3\cdot 5 \cdots (2p-1) (M^2 t)^p
\]
for \(p=1,2,3,…\). Follow the steps below to achieve this result.


  1. First assume that
    \[\mathbf E \int_0^t |I_s|^{k} \sigma_s dB_s =0\]
    for all positive integers \(k\). Under this assumption, prove the result. Why don't we know a priori that this expectation is zero?
  2. Now, for positive \(L\), define \(\chi^{(p)}_L(x)\) to be \(x^p\) for \(|x| < L\), \(0\) for \(|x| > L+1\), and connected monotonically in between so that the whole function is smooth. Then define
    \[\psi^{(p)}_L(x) = p\int_0^x \chi^{(p-1)}_L(y) dy\qquad\text{and}\qquad\phi^{(p)}_L(x)= p\int_0^x \psi^{(p-1)}_L(y) dy\]
    Observe that all three of these functions are globally bounded for a given \(L\). Apply Ito’s formula to \(\phi^{(2p)}_L(I_t)\) and use the fact that Fatou’s lemma implies
    \[\mathbf E |I_t|^{2p} \leq \liminf_{L \rightarrow \infty} \mathbf E \phi^{(2p)}_L(I_t)\]
    to prove the estimate stated at the start, following an induction similar to the one used above.


Ito to Stratonovich

Let’s think about different ways to make sense of \[\int_0^t W(s)dW(s)\] where \(W(t)\) is a standard Brownian motion. Fix any \(\alpha \in [0,1]\) and define

\begin{equation*}
I_N^\alpha(t)=\sum_{j=0}^{N-1} W(t_j^\alpha)[W(t_{j+1})-W(t_j)]
\end{equation*}
where \(t_j=\frac{j t}N\) and \(t_j^\alpha=\alpha t_j + (1-\alpha)t_{j+1}\).
Calculate

  1. \[\lim_{N\rightarrow \infty}\mathbf E I_N^\alpha(t) \ .\]
  2. * \[\lim_{N\rightarrow \infty}\mathbf E \big( I_N^\alpha(t)\big)^2\]
  3. * For which choice of \(\alpha\) is \(I_N^\alpha(t)\) a martingale?

Which choice of \(\alpha\) gives the standard Ito integral? Which choice gives the Stratonovich integral?
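
Before computing the limits, it can be instructive to estimate \(\mathbf E\, I_N^\alpha(t)\) by simulation for a few values of \(\alpha\) and compare with your answer to part 1. A minimal NumPy sketch (the grid interleaves the partition points \(t_j\) with the evaluation points \(t_j^\alpha\) so that all needed values of \(W\) are sampled jointly):

    import numpy as np

    rng = np.random.default_rng(6)

    def mean_I(alpha, t=1.0, N=100, n_paths=20_000):
        """Empirical mean of I_N^alpha(t) for the sum defined above."""
        tj = np.linspace(0.0, t, N + 1)
        ta = alpha * tj[:-1] + (1.0 - alpha) * tj[1:]   # evaluation points t_j^alpha
        grid = np.empty(2 * N + 1)
        grid[0::2], grid[1::2] = tj, ta                 # interleaved, already sorted
        dW = rng.normal(0.0, np.sqrt(np.diff(grid)), size=(n_paths, 2 * N))
        W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])
        Wj, Wa = W[:, 0::2], W[:, 1::2]                 # W(t_j) and W(t_j^alpha)
        return np.sum(Wa * (Wj[:, 1:] - Wj[:, :-1]), axis=1).mean()

    for a in (0.0, 0.5, 1.0):
        print(f"alpha = {a}: empirical mean = {mean_I(a):+.3f}")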

Calculating with Brownian Motion

Let \(W_t\) be a standard Brownian motion. Fixing an integer \(n\) and a terminal time \(T >0\), let \(\{t_i\}_{i=0}^n\) be a partition of the interval \([0,T]\) with

\[0=t_0 < t_1< \cdots< t_{n-1} < t_n=T\]

Calculate the following two expressions:

  1. \[ \mathbf{E} \Big(\sum_{k=1}^n W_{t_k} \big[ W_{t_{k}} - W_{t_{k-1}} \big] \Big)\]
    Hint: you might want to do the second part of the problem first and then return to this question and write
    \[W_{t_k} \big[ W_{t_{k}} - W_{t_{k-1}} \big]= W_{t_{k-1}} \big[ W_{t_{k}} - W_{t_{k-1}} \big]+ \big[W_{t_{k}} -W_{t_{k-1}}\big]\big[ W_{t_{k}} - W_{t_{k-1}}\big]\]
  2. \[ \mathbf{E} \Big(\sum_{k=1}^n W_{t_{k-1}} \big[ W_{t_{k}} - W_{t_{k-1}} \big] \Big)\]
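
Both expectations are easy to estimate by simulation, which gives something to check your answers against. A minimal NumPy sketch with a made-up partition:

    import numpy as np

    rng = np.random.default_rng(7)

    T, n, n_paths = 2.0, 50, 100_000
    # a made-up partition of [0, T]: n - 1 random interior points
    t = np.concatenate([[0.0], np.sort(rng.uniform(0.0, T, size=n - 1)), [T]])

    dW = rng.normal(0.0, np.sqrt(np.diff(t)), size=(n_paths, n))
    W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])

    part1 = np.sum(W[:, 1:] * dW, axis=1)    # sum_k W_{t_k} (W_{t_k} - W_{t_{k-1}})
    part2 = np.sum(W[:, :-1] * dW, axis=1)   # sum_k W_{t_{k-1}} (W_{t_k} - W_{t_{k-1}})
    print("empirical mean, part 1:", part1.mean())
    print("empirical mean, part 2:", part2.mean())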

Simple Numerical Exercise

Let \(\omega_i\), \(i=1,2,\dots\), be a collection of mutually independent random variables, each uniform on \([0,1]\). Define

\[\eta_i(\omega)= \omega_i -\frac12\]

and

\[X_n(\omega) = \sum_{i=1}^n \eta_i(\omega)\,.\]


  1. What is \(\mathbf{E}\,X_n\)?
  2. What is \(\mathrm{Var}(X_n)\)?
  3. What is \(\mathbf{E}(X_{n+k} \,|\, X_n)\) for \(n, k >0\)?
  4. What is \(\mathbf{E}(\,X_5^2 \,|\, X_3)\)?
  5. [optional] Write a computer program to simulate some realizations of this process, viewing \(n\) as time. Make some plots of \(n\) vs \(X_n\). (A sketch follows below.)
  6. [optional] How do your simulations agree with the first two parts?
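
A minimal NumPy/matplotlib sketch for the two optional parts:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(8)
    n = 100

    # a few sample paths of X_n, plotted against n
    for _ in range(5):
        X = np.cumsum(rng.uniform(size=n) - 0.5)
        plt.plot(range(1, n + 1), X)
    plt.xlabel("n")
    plt.ylabel("X_n")
    plt.show()

    # many independent copies of X_n, to compare with parts 1 and 2
    X_n = (rng.uniform(size=(50_000, n)) - 0.5).sum(axis=1)
    print("sample mean of X_n:    ", X_n.mean())
    print("sample variance of X_n:", X_n.var())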

Moment Bounds on Ito Integrals

Use Ito’s formula to show the following. Let \(\sigma(t,\omega)\) be a
nonanticipating random function which is bounded; that is to say,

\[ |\sigma(t,\omega)|\leq M\]

for all \(t \geq 0\) and all \(\omega\).

  1. Under this assumption show that the stochastic integral
    \[I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\]
    satisfies the following moment estimates
    \[\mathbf E\{ |I(t)|^{2p}\}\leq 1\cdot 3\cdot 5 \cdots (2p-1) (M^2 t)^p\]
    for \(p=1,2,3,…\) if one assumes
    \[ \mathbf E \int_0^t |I(s)|^k \sigma(s) dB(s) =0\]
    for any integer \(k\).
  2. Prove the above result without assuming that
    \[ \mathbf E \int_0^t |I(s)|^k \sigma(s) dB(s) =0\]
    since this requires that
    \[ \mathbf E \int_0^t |I(s)|^{2k} \sigma^2(s) ds  < \infty\]
    which we do not know a priori.