
Stochastic Differential Equations - Uni Ulm Aktuelles

Stochastic Differential Equations
Lecture Notes, Summer Term 2012, by Markus Kunze

Preface

The present manuscript contains the notes for a lecture given at the University of Ulm in the summer term 2012. It was my goal to give an overview of existence and uniqueness results for stochastic differential equations. Moreover, I wanted to give a presentation of the results which is more or less self-contained; thus I wanted to avoid merely quoting results, even if the results are somewhat technical. As prerequisites, I assumed basic knowledge from measure theory, probability theory and functional analysis, as well as some familiarity with ordinary differential equations. Some more advanced results are recalled and (with the exception of Prokhorov's theorem) also proved in the appendices.




In the preparation of this manuscript I used the following monographs, which I also recommend for further reading: The book by Kallenberg [4] gives an overview of all of probability and is a source of concise and elegant proofs. The books of Karatzas and Shreve [5], Øksendal [6] and Revuz and Yor [7] are standard introductions to the topic, with the book by Øksendal maybe being the most student-friendly. The books by Stroock and Varadhan [8] and Ethier and Kurtz [2] are more focused on the martingale problem, with [8] more focused on diffusion processes (and thus partial differential equations), whereas [2] also treats more general Markov processes (which need not have continuous paths). The current manuscript is a preliminary version.

It might be changed during the semester. If you find any mistakes, please let me know.

Contents

Preface
Chapter 1. A First Glance at Stochastic Integration
  (Brownian Motion; The Wiener Integral; Itô's Integral; Exercises)
Chapter 2. Continuous Local Martingales
  (Martingales: Basic Results; Quadratic Variation; Covariation; Exercises)
Chapter 3. Stochastic Calculus
  (The Itô Integral; Itô's Formula; First Applications of Itô's Formula; Exercises)
Chapter 4. Stochastic Differential Equations with Locally Lipschitz Coefficients
  (Solutions via Banach's Fixed Point Theorem; Extension to Locally Lipschitz Coefficients; Examples; Exercises)

Chapter 5. Yamada-Watanabe Theory
  (Different Notions of Existence and Uniqueness; On Strong and Weak Uniqueness; Pathwise Uniqueness for Some One-Dimensional Equations; Exercises)
Chapter 6. Martingale Problems
  (The Martingale Problem Associated to an SDE; Existence of Weak Solutions; Uniqueness of Solutions; Exercises)
Appendix A. Continuity of Paths for Brownian Motion
Appendix B. Stochastic Processes as Random Elements
Appendix C. Stieltjes Integrals
Appendix D. Measures on Topological Spaces
Bibliography

Chapter 1. A First Glance at Stochastic Integration

Brownian Motion

Definition. Let (Ω, Σ, P) be a probability space and I ⊂ ℝ. A stochastic process is a family (X(t))_{t ∈ I} of random variables X(t): Ω → ℝ.

Definition. Let (Ω, Σ, P) be a probability space. A Brownian motion or Wiener process is a stochastic process (W(t))_{t ≥ 0} such that

(1) W(0) = 0 almost surely;
(2) W(t+s) − W(t) is independent of σ(W(r) : 0 ≤ r ≤ t) for all t, s ≥ 0;
(3) W(t+s) − W(t) has distribution N(0, s), the Gaussian distribution with mean 0 and variance s.

Given a filtration F = (F_t)_{t ≥ 0}, i.e. a family of sub-σ-algebras F_t with F_s ⊂ F_t for s ≤ t, we will say that a process (W(t))_{t ≥ 0} is an F-Brownian motion if it is adapted, i.e. W(t) is F_t-measurable, and (1), (2') and (3) hold, where

(2') W(t) − W(s) is independent of F_s for all 0 ≤ s < t.

Remark. A Brownian motion (W(t))_{t ≥ 0} is an F^W-Brownian motion, where F^W is the filtration generated by (W(t))_{t ≥ 0}, i.e. F_t = σ(W(s) : s ≤ t).
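The defining properties translate directly into a simulation recipe: sample independent N(0, Δt) increments on a time grid and take cumulative sums. The following is a minimal sketch (not from the notes; function names, grid and sample counts are our own choices), with a Monte Carlo check that Var(W(1)) is close to 1 as property (3) demands:

```python
import numpy as np

def brownian_path(t_grid, rng):
    """Sample a Brownian path on an increasing grid with t_grid[0] == 0.

    Uses properties (1)-(3): W(0) = 0 and independent increments
    W(t+s) - W(t) ~ N(0, s).
    """
    dt = np.diff(t_grid)
    increments = rng.normal(loc=0.0, scale=np.sqrt(dt))
    return np.concatenate([[0.0], np.cumsum(increments)])

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 101)

# Monte Carlo check of property (3): Var(W(1)) should be close to 1.
endpoints = np.array([brownian_path(t, rng)[-1] for _ in range(5000)])
```

Note that this samples the path only at finitely many grid points; continuity of the full path is a separate matter, treated in Appendix A.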

To construct Brownian motion, we make use of so-called isonormal Gaussian processes.

Definition. Let (Ω, Σ, P) be a probability space and H be a Hilbert space. An H-isonormal Gaussian process is a map W: H → L²(Ω, Σ, P) such that

(1) W(h) is a (centered) Gaussian random variable for all h ∈ H, i.e. for some q ∈ [0, ∞) we have φ_{W(h)}(t) := E e^{itW(h)} = e^{−qt²/2}. Thus, W(h) is either constantly zero (q = 0) or has distribution N(0, q) (q > 0).
(2) (h₁, h₂)_H = (W(h₁), W(h₂))_{L²(Ω)} for all h₁, h₂ ∈ H.

Remark. If X is centered Gaussian with φ_X(t) = e^{−qt²/2}, then

φ′_X(t) = −qt e^{−qt²/2},
φ″_X(t) = (−q + q²t²) e^{−qt²/2},
φ‴_X(t) = (3q²t − q³t³) e^{−qt²/2},
φ⁗_X(t) = (3q² − 6q³t² + q⁴t⁴) e^{−qt²/2}.

Hence, for the first and third moments of X we have EX = i⁻¹ φ′_X(0) = 0 = i⁻³ φ‴_X(0) = EX³; for the second moment we obtain EX² = i⁻² φ″_X(0) = q, and for the fourth moment EX⁴ = i⁻⁴ φ⁗_X(0) = 3q².
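The moment identities EX = EX³ = 0, EX² = q and EX⁴ = 3q² can be checked empirically by sampling from N(0, q); a small sketch (not from the notes; the value q = 2 and the sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
q = 2.0  # arbitrary variance for the check
x = rng.normal(0.0, np.sqrt(q), size=200_000)

# Empirical moments; should match EX = EX^3 = 0, EX^2 = q, EX^4 = 3 q^2.
m1, m2, m3, m4 = (np.mean(x**k) for k in (1, 2, 3, 4))
```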

Lemma. Let (Ω, Σ, P) be a probability space, H a Hilbert space and W: H → L²(Ω, Σ, P) an H-isonormal Gaussian process. Then W is linear.

Proof. For α, β ∈ ℝ and h₁, h₂ ∈ H we have ‖W(αh₁ + βh₂) − αW(h₁) − βW(h₂)‖²_{L²(Ω)} = 0; indeed, expanding the square and using property (2) repeatedly, all terms cancel. This proves that W is linear. □

Proposition. Let H be a separable Hilbert space and (Ω, Σ, P) a probability space on which a sequence (ξ_n)_{n ∈ ℕ} of independent standard Gaussian random variables is defined (e.g. Ω = ∏_{k ∈ ℕ} ℝ, Σ = ⊗_{k ∈ ℕ} B(ℝ), P = ⊗_{n ∈ ℕ} N(0, 1), with ξ_k((x_n)_{n ∈ ℕ}) = x_k). Then there exists an H-isonormal Gaussian process W: H → L²(Ω).

Proof. Let (e_n)_{n ∈ ℕ} be an orthonormal basis of H and define

W(h) := Σ_{n ∈ ℕ} ξ_n (h, e_n)_H.

Then W is an H-isonormal Gaussian process. Indeed, by independence, Σ_{k=1}^N ξ_k (h, e_k) is a Gaussian random variable with variance Σ_{k=1}^N |(h, e_k)|² for every N ∈ ℕ. Upon letting N → ∞, we see that W(h) is Gaussian with variance Σ_{k=1}^∞ |(h, e_k)|² = ‖h‖²_H. □

Theorem. There exists a Brownian motion.

Proof. Let W be an L²([0, ∞))-isonormal Gaussian process and put W(t) := W(1_{[0,t)}). Then ‖W(0)‖_{L²(Ω)} = ‖0‖_{L²([0,∞))} = 0, hence W(0) = 0 almost surely. Given 0 ≤ t₁ < t₂ < … < t_n = t < t + s, observe that the functions 1_{[t₁,t₂)}, …, 1_{[t_{n−1},t_n)}, 1_{[t,t+s)} are orthogonal in L²([0, ∞)). Since W is isometric, W(t₂) − W(t₁) = W(1_{[t₁,t₂)}), …, W(t_n) − W(t_{n−1}) = W(1_{[t_{n−1},t_n)}), W(t+s) − W(t) = W(1_{[t,t+s)}) are orthogonal in L²(Ω); since the latter random variables are jointly Gaussian, they are independent.
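The finite-dimensional case of this construction is easy to test: for H = ℝ^d with the standard basis, the series reduces to W(h) = Σ_n ξ_n h_n, and the isometry property (2) can be checked by Monte Carlo. A sketch (not from the notes; the dimension and test vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_samples = 3, 100_000
xi = rng.normal(size=(n_samples, d))  # independent standard Gaussians xi_n

def W(h):
    # Isonormal process for H = R^d with the standard basis e_n:
    # W(h) = sum_n xi_n * (h, e_n) = xi . h
    return xi @ h

h1 = np.array([1.0, 2.0, 0.0])
h2 = np.array([0.0, 1.0, 3.0])

# Property (2): E[W(h1) W(h2)] should approximate (h1, h2) = 2.
cov = np.mean(W(h1) * W(h2))
```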

Since σ(W(r) : 0 ≤ r ≤ t) = σ(W(r) − W(q) : 0 ≤ q ≤ r ≤ t), it follows that W(t+s) − W(t) is independent of σ(W(r) : 0 ≤ r ≤ t). Finally, W(t+s) − W(t) is Gaussian with mean 0 and variance ‖1_{[t,t+s)}‖²_{L²([0,∞))} = s. □

In Appendix A, we prove that every Brownian motion (W(t))_{t ≥ 0} has a continuous version, i.e. there exists a family W̃(t): Ω → ℝ of random variables such that

(1) t ↦ W̃(t, ω) is continuous for all ω ∈ Ω;
(2) P(W(t) = W̃(t)) = 1 for all t ≥ 0.

Note that (W̃(t))_{t ≥ 0} is also a Brownian motion. Indeed, properties (1) and (3) are clear since W̃(0) = W(0) and W̃(t+s) − W̃(t) = W(t+s) − W(t) almost surely for all t, s ≥ 0. As for (2), note that if t₁ < t₂ < … < t_n = t < t + s, then

(W̃(t+s) − W̃(t), W̃(t_n) − W̃(t_{n−1}), …, W̃(t₂) − W̃(t₁))

= (W(t+s) − W(t), W(t_n) − W(t_{n−1}), …, W(t₂) − W(t₁))

almost surely. Hence these vectors are identically distributed. Now (2) follows as in the proof of the theorem above. From now on, we will always use Brownian motions with continuous paths.

The Wiener Integral

Definition. Let W be an L²([0, ∞))-isonormal Gaussian process and (W(t))_{t ≥ 0} be the Brownian motion constructed from this isonormal process as in the proof of the theorem above. For φ ∈ L²([0, ∞)), the Wiener integral of φ is defined as

∫₀^∞ φ(s) dW(s) := W(φ).

For t > 0, we define ∫₀ᵗ φ(s) dW(s) := ∫₀^∞ φ(s) 1_{[0,t)}(s) dW(s).

Remark. Let t > 0 and 0 = t₀ < t₁ < … < t_n = t be a partition of [0, t]. If φ = Σ_{k=1}^n a_k 1_{[t_{k−1},t_k)}, then

∫₀ᵗ Σ_{k=1}^n a_k 1_{[t_{k−1},t_k)}(s) dW(s) = Σ_{k=1}^n a_k W(1_{[t_{k−1},t_k)}) = Σ_{k=1}^n a_k (W(t_k) − W(t_{k−1})).
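For a step function, the Wiener integral is thus just a weighted sum of Brownian increments, and since W is an isometry, E[(∫₀ᵗ φ dW)²] = ‖φ‖²_{L²} = Σ_k a_k² (t_k − t_{k−1}). Both facts can be checked by simulation; a sketch (not from the notes; partition and coefficients are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.array([0.0, 0.25, 0.5, 1.0])   # partition 0 = t_0 < ... < t_n = t
a = np.array([1.0, -2.0, 3.0])        # phi = sum_k a_k 1_{[t_{k-1}, t_k)}

# Each row holds the increments W(t_k) - W(t_{k-1}) ~ N(0, t_k - t_{k-1}).
n = 50_000
dW = rng.normal(0.0, np.sqrt(np.diff(t)), size=(n, len(a)))
integrals = dW @ a  # one Wiener integral of phi per sample path

# Isometry: Var(integral) = ||phi||^2 = sum_k a_k^2 (t_k - t_{k-1}).
expected = np.sum(a**2 * np.diff(t))
```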

