Discovering the Bessel Process
Let \(W_t=(W^{(1)}_t,\dots,W^{(n)}_t) \) be an \(n\)-dimensional Brownian motion, where the \( W^{(i)}_t\) are independent standard one-dimensional Brownian motions and \(n \geq 2\).
Let
\[X_t = \|W_t\| = \Big(\sum_{i=1}^n (W^{(i)}_t)^2\Big)^{\frac12}\]
be the norm of \(W_t\). Even though the norm is not differentiable at the origin, we can still apply Ito’s formula since Brownian motion never visits the origin when the dimension is at least two.
- Use Ito’s formula to show that \(X_t\) satisfies the Ito process
\[ dX_t = \frac{n-1}{2 X_t} dt + \sum_{i=1}^n \frac{W^{(i)}_t }{X_t} dW^{(i)}_t \ .\]
(The key partial derivatives are sketched just after this list.)
- Using the Levy–Doob theorem, show that
\[Z_t =\sum_{i=1}^n \int_0^t \frac{W^{(i)}_s }{X_s} dW^{(i)}_s \]
is a standard Brownian motion.
- In light of the above discussion, argue that \(X_t\) and \(Y_t\) have the same distribution if \(Y_t\) is defined by
\[ dY_t = \frac{n-1}{2 Y_t} dt + dB_t\]
where \(B_t\) is a standard Brownian motion.
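As a hint toward the Ito computation (this sketch is our addition, with \(r(w)=\|w\|\) our notation), the partial derivatives that Ito’s formula requires are
\[
\frac{\partial r}{\partial w_i} = \frac{w_i}{\|w\|}, \qquad
\frac{\partial^2 r}{\partial w_i^2} = \frac{1}{\|w\|} - \frac{w_i^2}{\|w\|^3},
\qquad\text{so that}\qquad
\frac12\sum_{i=1}^n \frac{\partial^2 r}{\partial w_i^2} = \frac{n-1}{2\|w\|} \ .
\]
Substituting \(w = W_t\) produces the drift and diffusion coefficients claimed in the first item; the quadratic-variation bookkeeping in Ito’s formula is what remains to be checked.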
Take a moment to reflect on what has been shown. \(W_t\) is an \(\mathbf R^n\)-valued Markov process. However, there is no guarantee a priori that the one-dimensional process \(X_t\) will again be a Markov process, much less a diffusion. The above calculation shows that the distribution of \(X_{t+h}\) is determined completely by \(X_t\); in particular, \(X_t\) solves a one-dimensional SDE. We were sure that \(X_t\) would be an Ito process, but we had no guarantee that it could be written as a single closed SDE (namely, that the coefficients would be functions of \(X_t\) alone and not of the details of the individual \(W^{(i)}_t\)’s).
One dimensional stationary measure
Consider the one dimensional SDE
\[dX_t = f(X_t) dt + g(X_t) dW_t\]
which we assume has a unique global in time solution. For simplicity let us assume that there is a positive constant \(c\) so that \( 1/c < g(x)<c\) for all \(x\) and that \(f\) and \(g\) are smooth.
A stationary measure for the problem is a probability measure \(\mu\) so that if the initial distribution \(X_0\) is distributed according to \(\mu\) and independent of the Brownian Motion \(W\) then \(X_t\) will be distributed as \(\mu\) for any \(t \geq 0\).
If the functions \(f\) and \(g\) are “nice” then the distribution at time \(t\) has a density with respect to Lebesgue measure (“dx”). That is to say, there is a function \(p_x(t,y)\) so that for any test function \(\phi\)
\[\mathbf E_x \phi(X_t) = \int_{-\infty}^\infty p_x(t,y)\phi(y) dy \ .\]
More generally, if \(X_0\) is distributed with density \(\phi\) (the pdf of \(X_0\)), then the density \(p_\phi(t,y)\) of \(X_t\) solves the following equation
\[\frac{\partial p_\phi}{\partial t}(t,y) = (L^* p_\phi)(t,y)\]
with \( p_\phi(0,y) = \phi(y)\).
\(L^*\) is the formal adjoint of the generator \(L\) of \(X_t\) and is defined by
\[(L^*\phi)(y) = - \frac{\partial\ }{\partial y}( f \phi)(y) + \frac12 \frac{\partial^2\ }{\partial y^2}( g^2 \phi)(y) \ .\]
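For orientation (this remark is our addition), recall that the generator of the diffusion is
\[ (L\phi)(y) = f(y)\,\phi'(y) + \frac12 g^2(y)\,\phi''(y) \ ,\]
and \(L^*\) above is exactly what integration by parts (applied twice) produces:
\[\int_{-\infty}^\infty (L\phi)(y)\,\rho(y)\,dy = \int_{-\infty}^\infty \phi(y)\,(L^*\rho)(y)\,dy\]
for smooth, compactly supported test functions \(\phi\) and densities \(\rho\), the boundary terms vanishing.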
Since we want the density \(p_\phi(t,y)\) not to change when it is evolved forward with the above equation, we want \( \frac{\partial p_\phi}{\partial t}=0\), or in other words
\[(L^* p_\phi)(t,y) =0\]
- Let \(F\) be such that \(-F' = f/g^2\). Show that
\[ \rho(y)=\frac{K}{g^2(y)}\exp\Big( - 2F(y) \Big)\]
is an invariant density, where \(K\) is a normalization constant which ensures that
\[\int \rho(y) dy =1 \ .\]
(A symbolic check of this formula is sketched just after this list.)
- Find the stationary measure for each of the following SDEs:
\[dX_t = (X_t - X^3_t) dt + \sqrt{2} dW_t\]
\[dX_t = - F'(X_t) dt + \sqrt{2} dW_t\]
- Assuming that the formula derived above makes sense more generally, compare the invariant measures of
\[ dX_t = -X_t \, dt + dW_t\]
and
\[ dX_t = -\mathrm{sign}(X_t) dt + \frac{1}{\sqrt{|X_t|}} dW_t \ .\]
- Again proceeding formally, assuming everything is well defined and makes sense, find the stationary density of
\[dX_t = - 2\frac{\mathrm{sign}(X_t)}{|X_t|} dt + \sqrt{2} dW_t \ .\]
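As a sanity check on the general formula in the first item (this snippet is our addition and assumes SymPy is available), one can verify symbolically that \(L^*\rho=0\) whenever \(f=-F'\,g^2\):

```python
# Symbolic check (SymPy) that rho = exp(-2F)/g^2 satisfies L* rho = 0
# when the drift is f = -F' * g^2, for generic smooth F and positive g.
import sympy as sp

y = sp.symbols('y', real=True)
F = sp.Function('F')(y)
g = sp.Function('g')(y)

f = -sp.diff(F, y) * g**2              # drift chosen so that -F' = f / g^2
rho = sp.exp(-2 * F) / g**2            # candidate invariant density (normalization K omitted)

# Formal adjoint applied to rho:  L* rho = -(f rho)' + (1/2) (g^2 rho)''
Lstar_rho = -sp.diff(f * rho, y) + sp.Rational(1, 2) * sp.diff(g**2 * rho, y, 2)
print(sp.simplify(Lstar_rho))          # prints 0
```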
Numerical SDEs – Euler–Maruyama method
If we wanted to simulate the SDE
\[dX_t = b(X_t) dt + \sigma(X_t) dW_t\]
on a computer, then what we really want to approximate is the associated integral equation over a short time interval of length \(h\). Namely,
\[X_{t+h} - X_t= \int_t^{t+h}b(X_s) ds + \int_t^{t+h}\sigma(X_s) dW_s \ .\]
It is reasonable for \(s \in [t,t+h]\) to use the approximation
\begin{align}b(X_s) \approx b(X_t) \qquad\text{and}\qquad\sigma(X_s)\approx \sigma(X_t)\end{align}
which implies
\[X_{t+h} - X_t\approx b(X_t) h + \sigma(X_t) \big( W_{t+h}-W_{t}\big)\]
Since \(W_{t+h}-W_{t}\) is a Gaussian random variable with mean zero and variance \(h\), this discussion suggests the following numerical scheme:
\[ X_{n+1} = X_n + b(X_n) h + \sigma(X_n) \sqrt{h} \eta_n\]
where \(h\) is the time step and the \(\{ \eta_n : n=0,\dots\}\) are a collection of mutually independent standard Gaussian random variables (i.e. mean zero and variance 1). This is called the Euler–Maruyama method.
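For concreteness, here is a minimal sketch of the scheme in Python/NumPy (the function name, arguments, and parameter values are our own choices, not part of the problem statement):

```python
# Minimal Euler-Maruyama sketch:  X_{n+1} = X_n + b(X_n) h + sigma(X_n) sqrt(h) eta_n
import numpy as np
import matplotlib.pyplot as plt

def euler_maruyama(b, sigma, x0, t_final, h, rng):
    """Return the time grid and one approximate path of dX = b(X) dt + sigma(X) dW."""
    n_steps = int(t_final / h)
    t = np.linspace(0.0, n_steps * h, n_steps + 1)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for n in range(n_steps):
        eta = rng.standard_normal()      # eta_n ~ N(0, 1), independent across steps
        x[n + 1] = x[n] + b(x[n]) * h + sigma(x[n]) * np.sqrt(h) * eta
    return t, x

# Example: a few trajectories of dX_t = -X_t dt + dW_t (the first SDE in the list below).
rng = np.random.default_rng(1)
for _ in range(5):
    t, x = euler_maruyama(b=lambda x: -x, sigma=lambda x: 1.0,
                          x0=2.0, t_final=5.0, h=0.01, rng=rng)
    plt.plot(t, x)
plt.xlabel("t")
plt.ylabel("X_t")
plt.show()
```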
Use this method to numerically approximate (and plot) several trajectories of the following SDEs.
- \[dX_t = -X_t dt + dW_t\]
- For \(r=-1, 0, 1/4, 1/2, 1\),
\[ dX_t = r X_t dt + X_t dW_t \ .\]
Does it head to infinity or approach zero? Look at different small \(h\). Does the solution go negative? Should it?
- \[dX_t = (X_t -X_t^3) dt + \alpha dW_t\]
Try different values of \(\alpha\), for example \(\alpha = 1/10, 1, 2, 10\). How does it act? What is the long time behavior? How does it compare with what was learned about the stationary measure of this equation in the “One dimensional stationary measure” problem?
Solving a class of SDEs
Let us try a systematic procedure which works for a whole class of SDEs. Let
\begin{align*}
X(t)=a(t)\left[ x_0 + \int_0^t b(s) dB(s) \right] +c(t) \ .
\end{align*}
Assuming \(a\), \(b\), and \(c\) are differentiable, use Ito’s formula to find the equation for \(dX(t)\) of the form
\begin{align*}
dX(t)=[ F(t) X(t) + H(t)] dt + G(t)dB(t)
\end{align*}
where \(F(t)\), \(G(t)\), and \(H(t)\) are some functions of time depending on \(a\), \(b\), \(c\), and possibly their derivatives. Solve the following equations by matching the coefficients. Let \(\alpha\), \(\gamma\) and \(\beta\) be fixed numbers.
Notice that
\begin{align*}
X(t)=a(t)\left[ x_0 + \int_0^t b(s) dB(s) \right] +c(t)=\Phi(t,Y(t)) \ ,
\end{align*}
where \(\Phi(t,y)=a(t)(x_0+y)+c(t)\) and \(Y(t)=\int_0^t b(s) dB(s)\), so that \(dY(t)=b(t) dB(t)\). Then you can apply Ito’s formula to this representation to find \(dX(t)\).
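As a hint for this first step (our sketch; the coefficient matching for the equations below is left to the reader), note that \(\Phi(t,y)\) is linear in \(y\), so the second-derivative term in Ito’s formula drops out and
\[
dX(t) = \Big(a'(t)\big[x_0 + Y(t)\big] + c'(t)\Big)\,dt + a(t)\,b(t)\,dB(t) \ .
\]
Rewriting \(x_0+Y(t) = \big(X(t)-c(t)\big)/a(t)\) puts this in the form \(dX(t)=[F(t)X(t)+H(t)]\,dt + G(t)\,dB(t)\), from which \(F\), \(G\), and \(H\) can be read off.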
- First consider
\[dX_t = (-\alpha X_t + \gamma) dt + \beta dB_t\]
with \(X_0 =x_0\). Solve this for \( t \geq 0\).
- Now consider
\[dY(t)=\frac{\beta-Y(t)}{1-t} dt + dB(t) ~,~~ 0\leq t < 1 ~,~~Y(0)=\alpha.\]
Solve this for \( t\in[0,1] \).
- \begin{align*}
dX_t = -2 \frac{X_t}{1-t} dt + \sqrt{2 t(1-t)} dB_t ~,~~X(0)=\alpha
\end{align*}
Solve this for \( t\in[0,1] \).
Around the Circle
Consider the equation
\begin{align}
dX_t &= -Y_t dB_t - \frac12 X_t dt\\
dY_t &= X_t dB_t – \frac12 Y_t dt
\end{align}
Let \((X_0,Y_0)=(x,y)\) with \(x^2+y^2=1\). Show that \(X_t^2 + Y_t^2 =1\) for all \(t\) and hence the SDE lives on the unit circle. Does this make intuitive sense?
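A possible starting point (our hint, not part of the original statement): by Ito’s formula,
\[
d\big(X_t^2+Y_t^2\big) = 2X_t\,dX_t + 2Y_t\,dY_t + d\langle X\rangle_t + d\langle Y\rangle_t \ ,
\]
and the quadratic variations \(d\langle X\rangle_t = Y_t^2\,dt\) and \(d\langle Y\rangle_t = X_t^2\,dt\) can be read off from the equations above; the cancellation that remains is the content of the exercise.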
Correlated SDEs
Let \(B_t\) and \(W_t\) be standard Brownian motions which are
independent. Consider
\begin{align*}
dX_t&= (-X_t +1)dt + \rho dB_t + \sqrt{1-\rho^2} dW_t\\
dY_t&= -Y_t dt + dB_t \ .
\end{align*}
Find the covariance \(\text{Cov}(X_t,Y_t)=\mathbf{E} (X_t Y_t) - \mathbf{E} (X_t)\, \mathbf{E}( Y_t)\).
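If you want to check an analytic answer numerically, here is a hedged Monte Carlo sketch using the Euler–Maruyama scheme from above (the values of \(\rho\), the final time, the step size, the sample count, and the zero initial conditions are arbitrary choices of ours):

```python
# Monte Carlo estimate of Cov(X_t, Y_t) for the coupled system above,
# simulated with Euler-Maruyama; intended only as a numerical sanity check.
import numpy as np

rng = np.random.default_rng(0)
rho, t_final, h, n_paths = 0.5, 2.0, 1e-3, 100_000
n_steps = int(t_final / h)

X = np.zeros(n_paths)   # X_0 = 0 for every sample path (our choice)
Y = np.zeros(n_paths)   # Y_0 = 0 for every sample path (our choice)
for _ in range(n_steps):
    dB = np.sqrt(h) * rng.standard_normal(n_paths)   # shared Brownian increment
    dW = np.sqrt(h) * rng.standard_normal(n_paths)   # independent Brownian increment
    X += (-X + 1.0) * h + rho * dB + np.sqrt(1.0 - rho**2) * dW
    Y += -Y * h + dB

print("estimated Cov(X_t, Y_t) at t =", t_final, ":", np.cov(X, Y)[0, 1])
```

The exercise itself asks for the covariance in closed form; the simulation only provides a consistency check against whatever formula you derive.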