Category Archives: Ito Integrals

BDG Inequality

Consider \(I(t)\) defined by \[I(t)=\int_0^t \sigma(s,\omega)dB(s,\omega)\] where \(\sigma\) is adapted and \(|\sigma(t,\omega)| \leq K\) for all \(t\) with probability one. Inspired by the problem “Homogeneous Martingales and Hermite Polynomials”, let us set
\begin{align*}Y(t,\omega)=I(t)^4 - 6 I(t)^2\langle I \rangle(t) + 3 \langle I \rangle(t)^2 \ .\end{align*}

  1. Quote the problem “Ito Moments” to show that \(\mathbb{E}\{ |Y(t)|^2\} < \infty\) for all \(t\). Then verify that \(Y(t)\) is a martingale.
  2. Show that \[\mathbb{E}\{ I(t)^4 \} \leq 6 \mathbb{E} \big\{ I(t)^2\langle I \rangle(t) \big\}\]
  3. Recall the Cauchy-Schwarz inequality. In our language it states that
    \begin{align*}
    \mathbb{E} \{AB\} \leq (\mathbb{E}\{A^2\})^{1/2} (\mathbb{E}\{B^2\})^{1/2}
    \end{align*}
    Combine this with the previous inequality to show that\begin{align*}\mathbb{E}\{ I(t)^4 \} \leq 36 \mathbb{E} \big\{\langle I \rangle(t)^2 \big\} \end{align*}
  4. We know that  \(I^4\) is a submartingale (because \(x \mapsto x^4\) is convex). Use the Kolmogorov-Doob inequality and all that we have just derived to show that
    \begin{align*}
    \mathbb{P}\left\{ \sup_{0\leq s \leq T}|I(s)|^4 \geq \lambda \right\} \leq ( \text{const}) \frac{ \mathbb{E}\left( \int_0^T \sigma(s,\omega)^2 ds\right)^2 }{\lambda}
    \end{align*}
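The following Monte Carlo sketch is not part of the problem; it is a quick numerical sanity check of items 3 and 4 above under the assumed, purely illustrative choice \(\sigma(s,\omega)=\cos(B(s,\omega))\), which is adapted and bounded by \(K=1\).

```python
import numpy as np

# Illustrative check of E[I(T)^4] <= 36 E[<I>(T)^2] and the tail bound,
# with the assumed example sigma(s, omega) = cos(B_s)  (adapted, |sigma| <= 1).
rng = np.random.default_rng(0)
n_paths, n_steps, T = 20_000, 500, 1.0
dt = T / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B_left = np.cumsum(dB, axis=1) - dB          # B(t_j) at the left endpoints
sigma = np.cos(B_left)                       # adapted, bounded by K = 1
I = np.cumsum(sigma * dB, axis=1)            # Ito sums approximating I(t)
QV = np.cumsum(sigma**2 * dt, axis=1)        # <I>(t) = int_0^t sigma^2 ds

lhs = np.mean(I[:, -1]**4)                   # E[ I(T)^4 ]
rhs = 36 * np.mean(QV[:, -1]**2)             # 36 E[ <I>(T)^2 ]
print(f"E[I(T)^4] = {lhs:.4f}  <=  36 E[<I>(T)^2] = {rhs:.4f}")

lam = 1.0
tail = np.mean(np.max(np.abs(I), axis=1)**4 >= lam)
print(f"P(sup |I|^4 >= {lam}) = {tail:.4f}  <=  bound {rhs / lam:.4f}")
```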

Paley-Wiener-Zygmund Integral

Definition of stochastic integrals by integration by parts

In 1933, Paley, Wiener, and Zygmund gave a definition of the stochastic integral based on integration by parts. The resulting integral agrees with the Ito integral when both are defined; however, the Ito integral has a much larger domain of definition. We will now develop the integral as outlined by Paley, Wiener, and Zygmund:

  1. Let \(f(t)\) be a deterministic function with \(f'(t)\) continuous. Prove that \begin{align*} \int_0^1 f(t)dW(t) = f(1)W(1) - \int_0^1 f'(t)W(t) dt\end{align*}
    where the first integral is the Ito integral and the last integral is defined path-wise as the standard Riemann integral since the integrands are a.s. continuous.
  2. Now let \(f\) be as above with, in addition, \(f(1)=0\), and “define” the stochastic integral \(\int_0^1 f(t) * dW(t)\) by the relationship
    \begin{align*}
    \int_0^1 f(t) *dW(t) = - \int_0^1 f'(t) W(t) dt\;.
    \end{align*}
    where the integral on the right-hand side is the standard Riemann integral.

    If the condition \(f(1)=0\) seems unnatural to you, what this is really saying is that \(f\) is supported on \([0,1)\). In many ways it would be most natural to consider \(f\) on \([0,\infty)\) with compact support. Then \(f(\infty)=0\). We consider the unit interval for simplicity.

  3. Show by direct calculation (not by the Ito isometry) that
    \begin{align*}
    \mathbf E \left[ \left(\int_0^1 f(t)* dW(t)\right)^2\right]=\int_0^1 f^2(t) dt\;.
    \end{align*}
    Paley, Wiener, and Zygmund then used this isometry to extend the integral to any deterministic function in \(L^2[0,1]\). This can be done since for any \(f \in L^2[0,1]\), one can find a sequence of deterministic functions \(\phi_n \in C^1[0,1]\) with \(\phi_n(1)=0\) so that
    \begin{equation*}
    \int_0^1 (f(s) - \phi_n(s))^2ds \rightarrow 0 \text{ as } n \rightarrow \infty\,.
    \end{equation*}
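As a quick numerical check (not part of the problem), here is a sketch of items 2 and 3 above under the assumed illustrative choice \(f(t)=\sin(\pi t)\), so that \(f(1)=0\) and \(f'(t)=\pi\cos(\pi t)\). The sample second moment of \(-\int_0^1 f'(t)W(t)\,dt\) should be close to \(\int_0^1 f(t)^2\,dt = 1/2\).

```python
import numpy as np

# PWZ integral of the assumed example f(t) = sin(pi t), evaluated path-wise
# as a Riemann sum of  -f'(t) W(t)  over a fine grid.
rng = np.random.default_rng(1)
n_paths, n_steps = 20_000, 1_000
dt = 1.0 / n_steps
t = np.arange(1, n_steps + 1) * dt

W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
pwz = -np.sum(np.pi * np.cos(np.pi * t) * W * dt, axis=1)   # -int_0^1 f'(t) W(t) dt

print(f"sample second moment = {np.mean(pwz**2):.4f}")       # should be near 0.5
print(f"int_0^1 f(t)^2 dt    = {0.5:.4f}")
```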

 

Stratonovich integral

Let \(X_t\) be an Ito process with
\begin{align*}
dX_t&=f_tdt + g_tdW_t
\end{align*}
and \(B_t\) be a second (possibly correlated with \(W\)) Brownian
motion. We define the Stratonovich integral \(\int X_t \circ dB_t\) by
\begin{align*}
\int_0^T X_t \circ dB_t = \int_0^T X_t dB_t + \frac12 \int_0^T \;d\langle X, B \rangle_t
\end{align*}
Recall that if \(B_t=W_t\) then \(d\langle B, W \rangle_t =dt\) and it is zero if they are independent. Use this definition to calculate:

  1. \(\int_0^T B_t \circ dB_t\) (Explain why this agrees with the answer you obtained here).
  2. Let \(F\) be a smooth function. Find the equation satisfied by \(Y_t=F(B_t)\) written in terms of Stratonovich integrals. (Use Ito’s formula to find the equation for \(dY_t\) in terms of Ito integrals and then use the above definition to rewrite the Ito integrals as Stratonovich integrals “\(\circ dB_t\)”.) How does this compare to classical calculus?
  3. (Integration by parts) Let \(Z_t\) be a second Ito process satisfying
    \begin{align*}
    dZ_t&=b_tdt + \sigma_tdW_t\;.
    \end{align*}
    Calculate \(d(X_t Z_t)\) using Ito’s formula and then write it in terms of Stratonovich integrals. Why is this part of the problem labeled integration by parts? (Write the integral form of the expression you derived for \(d(X_t Z_t)\) in the two cases. What are the differences?)
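For orientation (not part of the problem), here is a one-path numerical sketch of item 1 above, with the assumed special case \(X_t=B_t\) (so \(f_t=0\), \(g_t=1\), \(W=B\)). The definition then reads \(\int_0^T B_t\circ dB_t=\int_0^T B_t\,dB_t+\tfrac12 T\), which should match \(\tfrac12 B_T^2\) up to discretization error.

```python
import numpy as np

# Compare the Ito left-endpoint sum and the Stratonovich value for int B o dB
# on a single finely discretized path.
rng = np.random.default_rng(2)
n_steps, T = 100_000, 1.0
dt = T / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=n_steps)
B_left = np.concatenate(([0.0], np.cumsum(dB)[:-1]))   # B(t_j), left endpoints
B_T = B_left[-1] + dB[-1]

ito = np.sum(B_left * dB)                 # Ito sum for int_0^T B dB
strat = ito + 0.5 * T                     # add (1/2) <B, B>_T = T/2
print(f"Stratonovich: {strat:.4f}   vs  B_T^2/2       = {0.5 * B_T**2:.4f}")
print(f"Ito:          {ito:.4f}   vs  B_T^2/2 - T/2 = {0.5 * B_T**2 - 0.5 * T:.4f}")
```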

 

A simple Ito Integral

Let \(\mathcal F_t\) be a filtration of \(\sigma\)-algebras and \(W_t\) a standard Brownian Motion adapted to the filtration. Define the adapted stochastic process \(X_t\) by

\[ X_t = \alpha_0 \mathbf 1_{[0,\frac12]}(t) +  \alpha_{\frac12} \mathbf 1_{(\frac12,1]}(t) \]

where \(\alpha_0\) is a random variable measurable with respect to \(\mathcal F_0\) and \(\alpha_{\frac12}\) is a random variable measurable with respect to \(\mathcal F_{\frac12}\).

Write explicitly the Ito integral

\[\int_0^t X_s dW_s\]

and show by direct calculation that

\[\mathbf E \Big( \int_0^t X_s dW_s\Big) = 0\]

and

\[\mathbf E \Big[\Big( \int_0^t X_s dW_s\Big)^2\Big] = \int_0^t \mathbf E X_s^2 ds\]
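As a numerical sanity check (not part of the problem), here is a Monte Carlo sketch with the assumed illustrative choices \(\alpha_0=1\) and \(\alpha_{\frac12}=W_{\frac12}\), at \(t=1\). For a simple integrand the Ito integral reduces to a finite sum (this is essentially what the problem asks you to write out), and here \(\int_0^1\mathbf E X_s^2\,ds = 1\cdot\tfrac12+\mathbf E W_{\frac12}^2\cdot\tfrac12=\tfrac34\).

```python
import numpy as np

# Assumed illustrative choices: alpha_0 = 1 (F_0-measurable),
# alpha_{1/2} = W_{1/2} (F_{1/2}-measurable), t = 1.
rng = np.random.default_rng(3)
n_paths = 200_000

W_half = rng.normal(0.0, np.sqrt(0.5), size=n_paths)            # W_{1/2}
W_one = W_half + rng.normal(0.0, np.sqrt(0.5), size=n_paths)    # W_1

alpha_0 = np.ones(n_paths)
alpha_half = W_half
integral = alpha_0 * W_half + alpha_half * (W_one - W_half)

print(f"mean          = {np.mean(integral):+.4f}   (expect 0)")
print(f"second moment = {np.mean(integral**2):.4f}   (expect 0.75)")
```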

 

 

Quadratic Variation of Ito Integrals

Given stochastic processes \(f_t\) and \(g_t\) adapted to a filtration \(\mathcal F_t\) satisfying

\[\int_0^T\mathbf E f_t^2 dt < \infty\quad\text{and}\quad \int_0^T\mathbf E g_t^2 dt < \infty\]

define

\[M_t =\int_0^t f_s dW_s \quad \text{and}\quad N_t =\int_0^t g_s dW_s\]

for some standard Brownian Motion \(W_t\) also adapted to the filtration \(\mathcal F_t\). Though it is not necessary, assume that there exists a \(K>0\) so that \(|f_t|\) and \(|g_t|\) are less than \(K\) for all \(t\) almost surely.

Let \(\{ t_i^{(n)} : i=0,\dots,N(n)\}\) be a sequence of partitions of \([0,T]\) of the form

\[ 0 =t_0^{(n)} < t_1^{(n)} <\cdots<t_{N(n)}^{(n)}=T\]

such that

\[ \lim_{n \rightarrow \infty} \sup_i |t_{i+1}^{(n)} - t_i^{(n)}| = 0\]

Define

\[V_n[M]=\sum_{i=1}^{N(n)} \big(M_{t_i} -M_{t_{i-1}}\big)^2\]

and

\[Q_n[M,N]= \sum_{i=1}^{N(n)} \big(M_{t_i} -M_{t_{i-1}}\big)\big(N_{t_i} -N_{t_{i-1}}\big)\]

Clearly \(V_n[M]= Q_n[M,M]\). Show  that the following points hold.

  1. The “polarization equality” holds:
    \[ 4 Q_n[M,N] =V_n[M+N] -V_n[M-N]\]
    Hence it is enough to understand the limit as \(n \rightarrow \infty\) of \(Q_n\) or \(V_n\).
  2. \[\mathbf E V_n[M]= \int_0^T \mathbf E f_t^2 dt\]
  3. * \(V_n[M]\rightarrow \int_0^T  f_t^2 dt\) as \(n \rightarrow \infty\) in \(L^2\). That is to say
    \[ \lim_{n \rightarrow \infty}\mathbf E \Big[ \big( V_n[M] - \int_0^T  f_t^2 dt  \big)^2 \Big]=0\]
    This limit is called the Quadratic Variation of the Martingale \(M\).
  4. Using the results above, show that \(Q_n[M,N]\rightarrow \int_0^T  f_t g_t dt\) as \(n \rightarrow \infty\) in \(L^2\). This is called the cross-quadratic variation of \(M\) and \(N\).
  5. * Prove by direct calculation, in the spirit of 3) above, that \(Q_n[M,N]\rightarrow \int_0^T  f_t g_t dt\) as \(n \rightarrow \infty\) in \(L^2\).

 

In this context, one writes \(\langle M \rangle_T\) for the limit of the \(V_n[M]\)  which is called the quadratic variation process of \(M_T\). Similarly  one writes  \(\langle M,N \rangle_T\) for the  limit of \(Q_n[M,N]\)  which is called the cross-quadratic variation process of \(M_T\) and \(N_T\). Clearly \(\langle M \rangle_T = \langle M,M \rangle_T\) and \( \langle M+N,M \rangle_T = \langle M,  M+N\rangle_T= \langle M \rangle_T + \langle M,  N\rangle_T\).
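For a quick numerical illustration of item 3 (not part of the problem), take the assumed example \(f_t=\cos(W_t)\), which is adapted and bounded by \(K=1\). The process \(M\) is simulated on a fine grid, and \(V_n[M]\) on coarser sub-partitions is compared with \(\int_0^T f_t^2\,dt\) along the same path.

```python
import numpy as np

# Quadratic variation sums V_n[M] for M_t = int_0^t cos(W_s) dW_s,
# compared against int_0^T f_t^2 dt on the same simulated path.
rng = np.random.default_rng(4)
n_fine, T = 2**16, 1.0
dt = T / n_fine

dW = rng.normal(0.0, np.sqrt(dt), size=n_fine)
W_left = np.concatenate(([0.0], np.cumsum(dW)[:-1]))     # W(t_j), left endpoints
f = np.cos(W_left)
M = np.concatenate(([0.0], np.cumsum(f * dW)))           # M on the fine grid
qv_target = np.sum(f**2 * dt)                            # int_0^T f_t^2 dt (Riemann sum)

for n in (2**6, 2**10, 2**14):                           # coarser partitions of [0, T]
    idx = np.arange(0, n_fine + 1, n_fine // n)
    V_n = np.sum(np.diff(M[idx])**2)
    print(f"n = {n:6d}:  V_n[M] = {V_n:.4f}   int_0^T f^2 dt = {qv_target:.4f}")
```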

 

 

Covariance of Ito Integrals

Let \(f_t\) and \(g_t\) be two stochastic processes adapted to a filtration \(\mathcal F_t\) such that

\[\int_0^\infty \mathbf E (f_t^2) dt < \infty \qquad \text{and} \qquad \int_0^\infty \mathbf E (g_t^2) dt < \infty\]

Let \(W_t\) be a standard Brownian motion also adapted to the filtration \(\mathcal F_t\) and define the stochastic processes

\[ X_t =\int_0^t f_s dW_s \qquad \text{and} \qquad Y_t=\int_0^t g_s dW_s\]

Calculate the following:

  1. \( \mathbf E (X_t  X_s ) \)
  2. \( \mathbf E (X_t  Y_t ) \)
    Hint: You know how to compute \( \mathbf E (X_t^2 ) \) and \( \mathbf E (Y_t^2 ) \). Use the fact that \((a+b)^2 = a^2 +2ab + b^2\) to answer the question. Simplify the result to get a compact expression for the answer.
  3. Show that if \(f_t=\sin(2\pi t)\) and \(g_t=\cos(2\pi t)\) then \(X_1\) and \(Y_1\) are independent random variables. (Hint: use the result here to deduce that \(X_1\) and \(Y_1\) are mean zero Gaussian random variables. Now use the above results to show that the covariance of \(X_1\) and \(Y_1\) is zero. Combining these two facts implies that the random variables are independent.)
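As a quick Monte Carlo check of item 3 (not part of the problem): with \(f_t=\sin(2\pi t)\) and \(g_t=\cos(2\pi t)\), the sample covariance of \(X_1\) and \(Y_1\) should be near \(\int_0^1\sin(2\pi t)\cos(2\pi t)\,dt=0\), while each variance should be near \(1/2\).

```python
import numpy as np

# Covariance check for X_1 = int sin(2 pi t) dW and Y_1 = int cos(2 pi t) dW.
rng = np.random.default_rng(5)
n_paths, n_steps = 50_000, 1_000
dt = 1.0 / n_steps
t = np.arange(n_steps) * dt                       # left endpoints of the grid

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X1 = dW @ np.sin(2 * np.pi * t)
Y1 = dW @ np.cos(2 * np.pi * t)

print(f"Cov(X_1, Y_1) = {np.mean(X1 * Y1):+.4f}   (expect ~ 0)")
print(f"Var(X_1)      = {np.var(X1):.4f}   (expect ~ 0.5)")
print(f"Var(Y_1)      = {np.var(Y1):.4f}   (expect ~ 0.5)")
```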

Gaussian Ito Integrals

In this problem, we will show that the Ito integral of a deterministic function is a Gaussian Random Variable.

Let \(\phi\) be a deterministic elementary function. In other words, there exists a finite sequence of real numbers \(\{c_k : k=1,2,\dots,N\}\) so that

\[ \sum_{k=1}^N c_k^2 < \infty\]

and there exists a partition

\[0=t_0 < t_1< t_2 <\cdots<t_N=T\]

so that

\[ \phi(t) = \sum_{k=1}^N c_k \mathbf{1}_{[t_{k-1},t_k)}(t) \]

 

  1. Show that if \(W(t)\) is a standard Brownian motion then the Ito integral
    \[ \int_0^T \phi(t) dW(t)\]
    is a Gaussian random variable with mean zero and variance
    \[ \int_0^T \phi(t)^2 dt \]
  2. * Let \(f\colon [0,T] \rightarrow \mathbf R\) be a deterministic function such that
    \[\int_0^T f(t)^2 dt < \infty\]
    Then it can be shown that there exists a sequence of  deterministic elementary functions \(\phi_n\) as above such that
    \[\int_0^T (f(t)-\phi_n(t))^2 dt \rightarrow 0\qquad\text{as}\qquad n \rightarrow \infty\]
    Assuming this fact, let \(\psi_n\) be the characteristic function  of the random variable
    \[ \int_0^T \phi_n(t) dW(t)\]
    Show that for all \(\lambda \in \mathbf R\),
    \[ \lim_{n \rightarrow \infty} \psi_n(\lambda) = \exp \Big( -\frac{\lambda^2}2 \big( \int_0^T f(t)^2 dt \big)  \Big)\]
    Then use the convergence result here to conclude that
    \[ \int_0^T f(t) dW(t)\]
    is a Gaussian Random Variable with mean zero and variance
    \[\int_0^T f(t)^2 dt \]
    by identifying the limit of the characteristic functions above.

    Note: When probabilists say the “characteristic function” of a random variable they just mean the Fourier transform of its distribution. See here.
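For orientation (not part of the problem), here is a numerical sketch of item 1 for an assumed elementary integrand, \(\phi=1\) on \([0,\tfrac12)\) and \(\phi=2\) on \([\tfrac12,1)\), so that \(\int_0^1\phi^2\,dt=\tfrac52\). The empirical characteristic function of \(\int\phi\,dW\) is compared with \(\exp(-\lambda^2\cdot\tfrac52/2)\).

```python
import numpy as np

# Empirical characteristic function of an elementary Ito integral versus the
# Gaussian characteristic function with variance int_0^1 phi^2 dt = 5/2.
rng = np.random.default_rng(6)
n_paths = 200_000

dW1 = rng.normal(0.0, np.sqrt(0.5), size=n_paths)   # W_{1/2} - W_0
dW2 = rng.normal(0.0, np.sqrt(0.5), size=n_paths)   # W_1 - W_{1/2}
I = 1.0 * dW1 + 2.0 * dW2                           # int_0^1 phi dW for this phi

var = 2.5                                           # int_0^1 phi(t)^2 dt
for lam in (0.5, 1.0, 2.0):
    empirical = np.mean(np.exp(1j * lam * I)).real
    print(f"lambda = {lam}:  E exp(i lam I) = {empirical:.4f}"
          f"   exp(-lam^2 var/2) = {np.exp(-lam**2 * var / 2):.4f}")
```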

Homogeneous Martingales and BDG Inequality

Part I

  1. Let \(f(x,y):\mathbb{R}^2 \rightarrow \mathbb{R}\) be a twice differentiable function in both \(x\) and \(y\). Let \(M(t)\) be defined by \[M(t)=\int_0^t \sigma(s,\omega) dB(s,\omega)\]. Assume that \(\sigma(t,\omega)\) is adapted and that \(\mathbb{E}\, M(t)^2 < \infty\) for all \(t\). (Here \(B(t)\) is standard Brownian Motion.) Let \(\langle M \rangle(t)\) be the quadratic variation process of \(M(t)\). What equation does \(f\) have to satisfy so that \(Y(t)=f(M(t),\langle M \rangle(t))\) is again a martingale, if we assume that \(\mathbf E\int_0^t \sigma(s,\omega)^2 ds < \infty\)?
  2. Set
    \begin{align*}
    f_n(x,y) = \sum_{0 \leq m \leq \lfloor n/2 \rfloor} C_{n,m} x^{n-2m}y^m
    \end{align*}
    where \(\lfloor n/2 \rfloor\) is the largest integer less than or equal to \(n/2\). Set \(C_{n,0}=1\) for all \(n\). Then find a recurrence relation for \(C_{n,m+1}\) in terms of \(C_{n,m}\), so that \(Y(t)=f_n(B(t),t)\) will be a martingale. Write out explicitly \(f_1(B(t),t), \cdots, f_4(B(t),t)\) as defined in the previous item.
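As a numerical sanity check related to item 2 (not part of the problem): the quartic combination that appears in Part II below, \(B(t)^4-6tB(t)^2+3t^2\), should have constant (here zero) expectation in \(t\) if it is indeed a martingale started at \(0\).

```python
import numpy as np

# Check that E[ B_t^4 - 6 t B_t^2 + 3 t^2 ] stays (approximately) zero in t.
rng = np.random.default_rng(7)
n_paths = 500_000

for t in (0.25, 0.5, 1.0, 2.0):
    B = rng.normal(0.0, np.sqrt(t), size=n_paths)   # B(t) ~ N(0, t)
    Y = B**4 - 6 * t * B**2 + 3 * t**2
    print(f"t = {t}:  E[Y(t)] = {np.mean(Y):+.3f}   (expect ~ 0)")
```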

Part II

Now consider \(I(t)\) defined by \[I(t)=\int_0^t \sigma(s,\omega)dB(s,\omega)\] where \(\sigma\) is adapted and \(|\sigma(t,\omega)| \leq K\) for all \(t\) with probability one. In light of the above let us set
\begin{align*}Y(t,\omega)=I(t)^4 - 6 I(t)^2\langle I \rangle(t) + 3 \langle I \rangle(t)^2 \ .\end{align*}

  1. Quote the problem “Ito Moments” to show that \(\mathbb{E}\{ |Y(t)|^2\} < \infty\) for all \(t\). Then use the first part of this problem to conclude that \(Y\) is a martingale.
  2. Show that \[\mathbb{E}\{ I(t)^4 \} \leq 6 \mathbb{E} \big\{ I(t)^2\langle I \rangle(t) \big\}\]
  3. Recall the Cauchy-Schwarz inequality. In our language it states that
    \begin{align*}
    \mathbb{E} \{AB\} \leq (\mathbb{E}\{A^2\})^{1/2} (\mathbb{E}\{B^2\})^{1/2}
    \end{align*}
    Combine this with the previous inequality to show that\begin{align*}\mathbb{E}\{ I(t)^4 \} \leq 36 \mathbb{E} \big\{\langle I \rangle(t)^2 \big\} \end{align*}
  4. As discussed in class \(I^4\) is a submartingale (because \(x \mapsto x^4\) is convex). Use the Kolmogorov-Doob inequality and all that we have just derived to show that
    \begin{align*}
    \mathbb{P}\left\{ \sup_{0\leq s \leq T}|I(s)|^4 \geq \lambda \right\} \leq ( \text{const}) \frac{ \mathbb{E}\left( \int_0^T \sigma(s,\omega)^2 ds\right)^2 }{\lambda}
    \end{align*}

 

Ito Moments

Use Ito’s formula to show that if \(\sigma(t,\omega)\) is a bounded nonanticipating functional, \(|\sigma|\leq M\), then for the stochastic integral \(I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\) we have the moment estimates
\[
\mathbf E\{ |I(t)|^{2p}\}\leq 1\cdot 3\cdot 5 \cdots (2p-1) (M^2 t)^p
\]
for \(p=1,2,3,…\). Follow the steps below to achieve this result.

 

  1. First assume that
    \[\mathbf E \int_0^t |I_s|^{k} \sigma_s dB_s =0\]
    for all positive integers \(k\). Under this assumption, prove the result. Why don’t we know a priori that this expectation is zero?
  2. For positive \(L\), define \(\chi^{(p)}_L(x)\) to be \(x^p\) for \(|x| < L\), to be 0 if \(|x| > L+1\), and to interpolate monotonically in between so that the whole function is smooth. Now define
    \[\psi^{(p)}_L(x) = p\int_0^x \chi^{(p-1)}_L(y) dy\qquad\text{and}\qquad\phi^{(p)}_L(x)= p\int_0^x \psi^{(p-1)}_L(y) dy\]
    Observe that all three of the functions are globally bounded for a given \(L\). Apply Ito’s formula to \(\phi^{(2p)}_L(I_t)\) and use the fact that Fatou’s lemma implies that
    \[\mathbf E |I_t|^{2p} \leq \lim_{L \rightarrow \infty} \mathbf E \phi^{(2p)}_L(I_t)\]
    to prove the estimate stated at the start by following an induction step similar to the one used above.
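For orientation, here is the shape of the induction step in item 1 (a sketch only, assuming as stated there that the \(dB\) term has zero expectation; the details are left to the reader). Ito’s formula applied to \(x \mapsto x^{2p}\) gives
\begin{align*}
I_t^{2p} = 2p\int_0^t I_s^{2p-1}\,\sigma_s\, dB_s + p(2p-1)\int_0^t I_s^{2p-2}\,\sigma_s^2\, ds\;,
\end{align*}
so taking expectations and using \(|\sigma|\leq M\) together with the inductive bound \(\mathbf E\, I_s^{2p-2} \leq 1\cdot 3\cdots (2p-3)\,(M^2 s)^{p-1}\),
\begin{align*}
\mathbf E\, I_t^{2p} \;\leq\; p(2p-1)\,M^2\int_0^t \mathbf E\, I_s^{2p-2}\, ds
\;\leq\; p(2p-1)\,M^2\cdot 1\cdot 3\cdots (2p-3)\,M^{2(p-1)}\,\frac{t^p}{p}
\;=\; 1\cdot 3\cdot 5\cdots (2p-1)\,(M^2 t)^p\;.
\end{align*}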

 

Ito to Stratonovich

Let’s think about different ways to make sense of \[\int_0^t W(s)dW(s)\] where \(W(t)\) is a standard Brownian motion. Fix any \(\alpha \in [0,1]\) and define

\begin{equation*}
I_N^\alpha(t)=\sum_{j=0}^{N-1} W(t_j^\alpha)[W(t_{j+1})-W(t_j)]
\end{equation*}
where \(t_j=\frac{j t}N\) and \(t_j^\alpha=\alpha t_j + (1-\alpha)t_{j+1}\).
Calculate

  1. \[\lim_{N\rightarrow \infty}\mathbf E I_N^\alpha(t) \ .\]
  2. * \[\lim_{N\rightarrow \infty}\mathbf E \big( I_N^\alpha(t)\big)^2\]
  3. * For which choice of \(\alpha\) is \(I_N^\alpha(t)\) a martingale?

What choice of \(\alpha\) gives the standard Ito integral? What choice gives the Stratonovich integral?
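Not part of the problem: a Monte Carlo sketch (with the assumed parameters \(t=1\), \(N=200\)) that estimates \(\mathbf E\, I_N^\alpha(t)\) for a few values of \(\alpha\), so the limit computed in item 1 can be checked against simulation. Note that with the convention above, \(\alpha=1\) samples the left endpoint of each subinterval and \(\alpha=\tfrac12\) the midpoint.

```python
import numpy as np

# Estimate E[ I_N^alpha(t) ] by Monte Carlo.  Each subinterval is refined into
# {t_j, t_j^alpha, t_{j+1}}, with independent Gaussian increments on each piece.
rng = np.random.default_rng(8)
n_paths, N, t = 100_000, 200, 1.0
dt = t / N

for alpha in (1.0, 0.5, 0.0):
    d1 = rng.normal(0.0, np.sqrt((1 - alpha) * dt), size=(n_paths, N))  # W(t_j^alpha) - W(t_j)
    d2 = rng.normal(0.0, np.sqrt(alpha * dt), size=(n_paths, N))        # W(t_{j+1}) - W(t_j^alpha)
    W_left = np.cumsum(d1 + d2, axis=1) - (d1 + d2)                     # W(t_j)
    I = np.sum((W_left + d1) * (d1 + d2), axis=1)                       # sum W(t_j^alpha)[W(t_{j+1}) - W(t_j)]
    print(f"alpha = {alpha}:  E[I_N^alpha(t)] ~ {np.mean(I):+.4f}")
```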

Moment Bounds on Ito Integrals

Use Ito’s formula to prove moment bounds on stochastic integrals. Assume that \(\sigma(t,\omega)\) is a
nonanticipating random function which is bounded; that is to say,

\[ |\sigma(t,\omega)|\leq M\]

for all \(t \geq 0\) and all \(\omega\).

  1. Under this assumption show that the stochastic integral
    \[I(t,\omega)=\int_0^t \sigma(s,\omega) dB(s,\omega)\]
    satisfies  the following moment estimates
    \[\mathbf E\{ |I(t)|^{2p}\}\leq 1\cdot 3\cdot 5 \cdots (2p-1) (M^2 t)^p\]
    for \(p=1,2,3,…\) if one assumes
    \[ \mathbf E \int_0^t |I(s)|^k \sigma(s) dB(s) =0\]
    for any positive integer \(k\).
  2. Prove the above result without assuming that
    \[ \mathbf E \int_0^t |I(s)|^k \sigma(s) dB(s) =0\]
    since this requires that
    \[ \mathbf E \int_0^t |I(s)|^{2k} \sigma^2(s) ds  < \infty\]
    which we do not know a priori.