
LECTURE 5 - UC Davis Mathematics






We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes. [1]

In many problems that involve modeling the behavior of some system, we lack sufficiently detailed information to determine how the system behaves, or the behavior of the system is so complicated that an exact description of it becomes irrelevant or impossible. In that case, a probabilistic model is often useful. Probability and randomness have many different philosophical interpretations, but, whatever interpretation one adopts, there is a clear mathematical formulation of probability in terms of measure theory, due to Kolmogorov. Probability theory is an enormous field with applications in many different areas.

Here we simply aim to provide an introduction to some aspects that are useful in applied mathematics. We will do so in the context of stochastic processes of a continuous time variable, which may be thought of as a probabilistic analog of deterministic ODEs. We will focus on Brownian motion and stochastic differential equations, both because of their usefulness and the interest of the concepts they involve. Before discussing Brownian motion in Section 3, we provide a brief review of some basic concepts from probability theory and stochastic processes.

1. Probability

Mathematicians are like Frenchmen: whatever you say to them they translate into their own language and forthwith it is something entirely different. [2]

A probability space $(\Omega, \mathcal{F}, P)$ consists of: (a) a sample space $\Omega$, whose points label all possible outcomes of a random trial; (b) a $\sigma$-algebra $\mathcal{F}$ of measurable subsets of $\Omega$, whose elements are the events about which it is possible to obtain information; (c) a probability measure $P : \mathcal{F} \to [0,1]$, where $0 \le P(A) \le 1$ is the probability that the event $A \in \mathcal{F}$ occurs.

If $P(A) = 1$, we say that the event $A$ occurs almost surely. When the $\sigma$-algebra $\mathcal{F}$ and the probability measure $P$ are understood from the context, we will refer to the probability space as $\Omega$.

In this definition, we say that $\mathcal{F}$ is a $\sigma$-algebra on $\Omega$ if it is a collection of subsets of $\Omega$ such that $\emptyset$ and $\Omega$ belong to $\mathcal{F}$, the complement of a set in $\mathcal{F}$ belongs to $\mathcal{F}$, and a countable union or intersection of sets in $\mathcal{F}$ belongs to $\mathcal{F}$. A probability measure $P$ on $\mathcal{F}$ is a function $P : \mathcal{F} \to [0,1]$ such that $P(\emptyset) = 0$, $P(\Omega) = 1$, and for any sequence $\{A_n\}$ of pairwise disjoint sets (meaning that $A_i \cap A_j = \emptyset$ for $i \ne j$) we have
$$P\left(\bigcup_{n=1}^{\infty} A_n\right) = \sum_{n=1}^{\infty} P(A_n).$$

Example. Let $\Omega$ be a set and $\mathcal{F}$ a $\sigma$-algebra on $\Omega$. Suppose that $\{\omega_n : n \in \mathbb{N}\}$ is a countable subset of $\Omega$ and $\{p_n\}$ is a sequence of numbers $0 \le p_n \le 1$ such that $p_1 + p_2 + p_3 + \dots = 1$. Then we can define a probability measure $P : \mathcal{F} \to [0,1]$ by
$$P(A) = \sum_{\omega_n \in A} p_n.$$

If $\mathcal{E}$ is a collection of subsets of a set $\Omega$, then the $\sigma$-algebra generated by $\mathcal{E}$, denoted $\sigma(\mathcal{E})$, is the smallest $\sigma$-algebra that contains $\mathcal{E}$. The open subsets of $\mathbb{R}$ generate a $\sigma$-algebra $\mathcal{B}$ called the Borel $\sigma$-algebra of $\mathbb{R}$.

[1] Pierre Simon Laplace, in A Philosophical Essay on Probabilities.
[2] Goethe. It has been suggested that Goethe should have said "Probabilists are like Frenchmen (or Frenchwomen)."
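The discrete measure in the example above can be sketched numerically. This is our own illustration, not part of the notes; the specific weights $p_n = 2^{-(n+1)}$ are an arbitrary choice satisfying $\sum_n p_n = 1$.

```python
# Discrete probability measure on Omega = {0, 1, 2, ...}.
# Illustrative weights: p_n = 2^-(n+1), so p_0 + p_1 + p_2 + ... = 1.

def p(n):
    """Weight assigned to the single outcome n."""
    return 2.0 ** -(n + 1)

def prob(event):
    """P(A) = sum of p_n over the outcomes n in the (finite) event A."""
    return sum(p(n) for n in event)

print(prob({0}))             # 0.5
print(prob({0, 1, 2}))       # 0.875
print(prob(set(range(50))))  # close to 1 (the series truncated at n = 49)
```

Countable additivity is visible here: the probability of a union of disjoint singletons is just the sum of their weights.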

This algebra is also generated by the closed sets, or by the collection of intervals. The interval $[0,1]$ equipped with the $\sigma$-algebra $\mathcal{B}$ of its Borel subsets and Lebesgue measure, which assigns to an interval a measure equal to its length, forms a probability space. This space corresponds to the random trial of picking a uniformly distributed real number from $[0,1]$.

Random variables

A function $X : \Omega \to \mathbb{R}$ defined on a set $\Omega$ with a $\sigma$-algebra $\mathcal{F}$ is said to be $\mathcal{F}$-measurable, or simply measurable when $\mathcal{F}$ is understood, if $X^{-1}(A) \in \mathcal{F}$ for every Borel set $A \in \mathcal{B}$ in $\mathbb{R}$. A random variable on a probability space $(\Omega, \mathcal{F}, P)$ is a real-valued $\mathcal{F}$-measurable function $X : \Omega \to \mathbb{R}$. Intuitively, a random variable is a real-valued quantity that can be measured from the outcome of a random trial.

If $f : \mathbb{R} \to \mathbb{R}$ is a Borel measurable function, meaning that $f^{-1}(A) \in \mathcal{B}$ for every $A \in \mathcal{B}$, and $X$ is a random variable, then $Y = f \circ X$, defined by $Y(\omega) = f(X(\omega))$, is also a random variable.

We denote the expected value of a random variable $X$ with respect to the probability measure $P$ by $E_P[X]$, or $E[X]$ when the measure $P$ is understood. The expected value is a real number which gives the mean value of the random variable $X$.

Here, we assume that $X$ is integrable, meaning that the expected value $E[|X|] < \infty$ is finite. This is the case if large values of $X$ occur with sufficiently low probability.

Example. If $X$ is a random variable with mean $\mu = E[X]$, the variance $\sigma^2$ of $X$ is defined by
$$\sigma^2 = E\left[(X - \mu)^2\right],$$
assuming it is finite. The standard deviation $\sigma$ provides a measure of the departure of $X$ from its mean $\mu$. The covariance of two random variables $X_1$, $X_2$ with means $\mu_1$, $\mu_2$, respectively, is defined by
$$\operatorname{cov}(X_1, X_2) = E\left[(X_1 - \mu_1)(X_2 - \mu_2)\right].$$
We will also loosely refer to this quantity as a correlation function, although strictly speaking the correlation function of $X_1$, $X_2$ is equal to their covariance divided by their standard deviations.

The expectation is a linear functional on random variables, meaning that for integrable random variables $X$, $Y$ and real numbers $c$ we have
$$E[X + Y] = E[X] + E[Y], \qquad E[cX] = cE[X].$$
The expectation of an integrable random variable $X$ may be expressed as an integral with respect to the probability measure $P$ as
$$E[X] = \int_\Omega X(\omega)\, dP(\omega).$$
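As a numerical sanity check on these definitions (our own sketch, not part of the notes), the mean, variance, and covariance can be estimated from samples. Here $X$ is uniform on $[0,1]$, so $E[X] = 1/2$ and $\sigma^2 = 1/12$, and $Y = 2X + \varepsilon$ with small independent noise $\varepsilon$, so $\operatorname{cov}(X, Y) = 2\operatorname{var}(X) = 1/6$.

```python
import random

random.seed(0)
N = 200_000

# Samples of X ~ Uniform(0, 1) and Y = 2X + noise, so cov(X, Y) = 2 var(X).
xs = [random.random() for _ in range(N)]
ys = [2.0 * x + random.gauss(0.0, 0.1) for x in xs]

def mean(v):
    return sum(v) / len(v)

mu_x, mu_y = mean(xs), mean(ys)
var_x = mean([(x - mu_x) ** 2 for x in xs])   # sigma^2 = E[(X - mu)^2]
cov_xy = mean([(x - mu_x) * (y - mu_y) for x, y in zip(xs, ys)])

print(mu_x)    # close to 1/2
print(var_x)   # close to 1/12
print(cov_xy)  # close to 1/6
```

Linearity of expectation is what makes these sample averages consistent estimators of the corresponding integrals.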

In particular, the probability of an event $A \in \mathcal{F}$ is given by
$$P(A) = \int_A dP(\omega) = E[\mathbf{1}_A],$$
where $\mathbf{1}_A : \Omega \to \{0,1\}$ is the indicator function of $A$,
$$\mathbf{1}_A(\omega) = \begin{cases} 1 & \text{if } \omega \in A, \\ 0 & \text{if } \omega \notin A. \end{cases}$$
We will say that two random variables are equal $P$-almost surely, or almost surely when $P$ is understood, if they are equal on an event $A$ such that $P(A) = 1$. Similarly, we say that a random variable $X : A \subset \Omega \to \mathbb{R}$ is defined almost surely if $P(A) = 1$. Functions of random variables that are equal almost surely have the same expectations, and we will usually regard such random variables as equivalent.

Suppose that $\{X_\lambda : \lambda \in \Lambda\}$ is a collection of functions $X_\lambda : \Omega \to \mathbb{R}$. The $\sigma$-algebra generated by $\{X_\lambda : \lambda \in \Lambda\}$, denoted $\sigma(X_\lambda : \lambda \in \Lambda)$, is the smallest $\sigma$-algebra $\mathcal{G}$ such that $X_\lambda$ is $\mathcal{G}$-measurable for every $\lambda \in \Lambda$. Equivalently, $\mathcal{G} = \sigma(\mathcal{E})$ where $\mathcal{E} = \{X_\lambda^{-1}(A) : \lambda \in \Lambda,\ A \in \mathcal{B}(\mathbb{R})\}$.

Absolutely continuous and singular measures

Suppose that $P, Q : \mathcal{F} \to [0,1]$ are two probability measures defined on the same $\sigma$-algebra $\mathcal{F}$ of a sample space $\Omega$. We say that $Q$ is absolutely continuous with respect to $P$ if there is an integrable random variable $f : \Omega \to \mathbb{R}$ such that for every $A \in \mathcal{F}$ we have
$$Q(A) = \int_A f(\omega)\, dP(\omega).$$
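The identity $P(A) = E[\mathbf{1}_A]$ lends itself to a Monte Carlo sketch. The event and sample count below are our own illustrative choices: drawing uniform points from $[0,1]$ and averaging the indicator of $A = \{\omega : \omega < 0.3\}$ estimates $P(A) = 0.3$.

```python
import random

random.seed(1)

def indicator_A(w):
    """Indicator of the event A = {w in [0,1] : w < 0.3}."""
    return 1 if w < 0.3 else 0

# P(A) = E[1_A]: average the indicator over uniform samples from [0, 1].
N = 100_000
estimate = sum(indicator_A(random.random()) for _ in range(N)) / N
print(estimate)  # close to 0.3
```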

We will write this relation as
$$dQ = f\, dP,$$
and call $f$ the density of $Q$ with respect to $P$. It is defined $P$-almost surely. In that case, if $E_P$ and $E_Q$ denote the expectations with respect to $P$ and $Q$, respectively, and $X$ is a random variable which is integrable with respect to $Q$, then
$$E_Q[X] = \int_\Omega X\, dQ = \int_\Omega fX\, dP = E_P[fX].$$

We say that probability measures $P$ and $Q$ on $\mathcal{F}$ are singular if there is an event $A \in \mathcal{F}$ such that $P(A) = 1$ and $Q(A) = 0$ (or, equivalently, $P(A^c) = 0$ and $Q(A^c) = 1$). This means that events which occur with finite probability with respect to $P$ almost surely do not occur with respect to $Q$, and conversely.

Example. Let $P$ be the Lebesgue probability measure on $([0,1], \mathcal{B})$ described in the example above. If $f : [0,1] \to [0,\infty)$ is a nonnegative, integrable function with
$$\int_0^1 f(\omega)\, d\omega = 1,$$
where $d\omega$ denotes integration with respect to Lebesgue measure, then we can define a measure $Q$ on $([0,1], \mathcal{B})$ by
$$Q(A) = \int_A f(\omega)\, d\omega.$$
The measure $Q$ is absolutely continuous with respect to $P$ with density $f$. Note that $P$ is not necessarily absolutely continuous with respect to $Q$; this is the case only if $f \ne 0$ almost surely and $1/f$ is integrable.
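The identity $E_Q[X] = E_P[fX]$ can be checked numerically. In this sketch (our own example, not from the notes) $P$ is the uniform measure on $[0,1]$, the density is $f(\omega) = 2\omega$, and $X(\omega) = \omega^2$, so $E_Q[X] = \int_0^1 2\omega \cdot \omega^2\, d\omega = 1/2$.

```python
import random

random.seed(2)

def f(w):
    """Density of Q with respect to P: dQ = f dP, with f(w) = 2w on [0, 1]."""
    return 2.0 * w

def X(w):
    """A random variable on [0, 1]."""
    return w ** 2

# E_Q[X] = E_P[f X]: average f(w) X(w) over samples drawn from P.
N = 200_000
eq_x = sum(f(w) * X(w) for w in (random.random() for _ in range(N))) / N
print(eq_x)  # close to 1/2
```

This is exactly the change-of-measure trick behind importance sampling: samples from $P$, reweighted by the density $f$, estimate expectations under $Q$.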

If $R$ is a discrete measure on $([0,1], \mathcal{B})$ of the type given in the earlier example, then $R$ and $P$ (or $R$ and $Q$) are singular because the Lebesgue measure of any countable set is equal to zero.

Probability densities

The distribution function $F : \mathbb{R} \to [0,1]$ of a random variable $X : \Omega \to \mathbb{R}$ is defined by
$$F(x) = P\{\omega \in \Omega : X(\omega) \le x\}$$
or, in more concise notation, $F(x) = P\{X \le x\}$. We say that a random variable is continuous if the probability measure it induces on $\mathbb{R}$ is absolutely continuous with respect to Lebesgue measure. [3] Most of the random variables we consider here will be continuous. If $X$ is a continuous random variable with distribution function $F$, then $F$ is differentiable and
$$p(x) = F'(x)$$
is the probability density function of $X$. If $A \in \mathcal{B}(\mathbb{R})$ is a Borel subset of $\mathbb{R}$, then
$$P\{X \in A\} = \int_A p(x)\, dx.$$
The density satisfies $p(x) \ge 0$ and $\int_{-\infty}^{\infty} p(x)\, dx = 1$. Moreover, if $f : \mathbb{R} \to \mathbb{R}$ is any Borel-measurable function such that $f(X)$ is integrable, then
$$E[f(X)] = \int_{-\infty}^{\infty} f(x)\, p(x)\, dx.$$

Example. A random variable $X$ is Gaussian with mean $\mu$ and variance $\sigma^2$ if it has the probability density
$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}.$$
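The Gaussian density above can be coded directly. As a sketch (our own check, not from the notes), a midpoint Riemann sum over a wide interval confirms that it integrates to 1; the tails beyond $\pm 10$ standard deviations are negligible.

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """p(x) = exp(-(x - mu)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2)."""
    return (math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))
            / math.sqrt(2.0 * math.pi * sigma ** 2))

# Midpoint Riemann sum of the standard Gaussian density over [-10, 10].
n = 20_000
h = 20.0 / n
total = sum(gaussian_pdf(-10.0 + (i + 0.5) * h) * h for i in range(n))
print(total)  # close to 1
```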

[3] This excludes, for example, counting-type random variables that take only integer values.

We say that random variables $X_1, X_2, \dots, X_n : \Omega \to \mathbb{R}$ are jointly continuous if there is a joint probability density function $p(x_1, x_2, \dots, x_n)$ such that
$$P\{X_1 \in A_1, X_2 \in A_2, \dots, X_n \in A_n\} = \int_A p(x_1, \dots, x_n)\, dx_1\, dx_2 \cdots dx_n,$$
where $A = A_1 \times A_2 \times \cdots \times A_n$. Then $p(x_1, x_2, \dots, x_n) \ge 0$ and
$$\int_{\mathbb{R}^n} p(x_1, x_2, \dots, x_n)\, dx_1\, dx_2 \cdots dx_n = 1.$$
Expected values of functions of the $X_i$ are given by
$$E[f(X_1, X_2, \dots, X_n)] = \int_{\mathbb{R}^n} f(x_1, \dots, x_n)\, p(x_1, \dots, x_n)\, dx_1 \cdots dx_n.$$
We can obtain the joint probability density of a subset of the $X_i$'s by integrating out the other variables. For example, if $p(x, y)$ is the joint probability density of random variables $X$ and $Y$, then the marginal probability densities $p_X(x)$ and $p_Y(y)$ of $X$ and $Y$, respectively, are given by
$$p_X(x) = \int_{-\infty}^{\infty} p(x, y)\, dy, \qquad p_Y(y) = \int_{-\infty}^{\infty} p(x, y)\, dx.$$
Of course, in general, we cannot obtain the joint density $p(x, y)$ from the marginal densities $p_X(x)$, $p_Y(y)$, since the marginal densities do not contain any information about how $X$ and $Y$ are related.

Example. A random vector $\vec{X} = (X_1, \dots, X_n)$ is Gaussian with mean $\vec{\mu} = (\mu_1, \dots, \mu_n)$ and invertible covariance matrix $C = (C_{ij})$, where
$$\mu_i = E[X_i], \qquad C_{ij} = E\left[(X_i - \mu_i)(X_j - \mu_j)\right],$$
if it has the probability density
$$p(\vec{x}) = \frac{1}{(2\pi)^{n/2} (\det C)^{1/2}} \exp\left\{-\tfrac{1}{2}(\vec{x} - \vec{\mu})^\top C^{-1} (\vec{x} - \vec{\mu})\right\}.$$
Gaussian random variables are completely specified by their mean and covariance.

Independence

Random variables $X_1, X_2, \dots, X_n : \Omega \to \mathbb{R}$ are said to be independent if
$$P\{X_1 \in A_1, X_2 \in A_2, \dots, X_n \in A_n\} = P\{X_1 \in A_1\}\, P\{X_2 \in A_2\} \cdots P\{X_n \in A_n\}$$
for arbitrary Borel sets $A_1, A_2, \dots, A_n \subset \mathbb{R}$. If $X_1, X_2, \dots, X_n$ are independent random variables, then
$$E[f_1(X_1)\, f_2(X_2) \cdots f_n(X_n)] = E[f_1(X_1)]\, E[f_2(X_2)] \cdots E[f_n(X_n)].$$
Jointly continuous random variables are independent if their joint probability density factorizes into a product:
$$p(x_1, x_2, \dots, x_n) = p_1(x_1)\, p_2(x_2) \cdots p_n(x_n).$$
If the densities $p_i = p_j$ are the same for every $1 \le i, j \le n$, then we say that $X_1, X_2, \dots, X_n$ are independent, identically distributed random variables. Heuristically, each random variable in a collection of independent random variables defines a different coordinate axis of the probability space on which they are defined.
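Integrating out one variable of a joint density can be sketched numerically. The joint density $p(x,y) = 4xy$ on $[0,1]^2$ below is our own illustrative choice; its marginal is $p_X(x) = \int_0^1 4xy\, dy = 2x$.

```python
def joint_p(x, y):
    """Illustrative joint density on [0,1]^2: p(x, y) = 4 x y."""
    return 4.0 * x * y

def marginal_x(x, n=10_000):
    """p_X(x) = integral of p(x, y) over y, by a midpoint Riemann sum."""
    h = 1.0 / n
    return sum(joint_p(x, (j + 0.5) * h) * h for j in range(n))

print(marginal_x(0.5))   # close to 2 * 0.5 = 1.0
print(marginal_x(0.25))  # close to 2 * 0.25 = 0.5
```

Note that this particular joint density factorizes as $(2x)(2y)$, so $X$ and $Y$ are in fact independent; a non-product joint density would give the same marginals only by coincidence.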

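As a small sketch of the Gaussian density formula (our own code, with $n = 2$ so the determinant and inverse of $C$ are explicit), note that when $C$ is diagonal the joint density factorizes into a product of one-dimensional Gaussians, which is exactly the independence criterion above.

```python
import math

def gaussian2_pdf(x, mu, C):
    """Density of a 2D Gaussian vector with mean mu and invertible covariance C."""
    a, b, c, d = C[0][0], C[0][1], C[1][0], C[1][1]
    det = a * d - b * c
    inv = [[d / det, -b / det],    # explicit inverse of a 2x2 matrix
           [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    quad = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    # (2 pi)^(n/2) (det C)^(1/2) = 2 pi sqrt(det C) for n = 2.
    return math.exp(-0.5 * quad) / (2.0 * math.pi * math.sqrt(det))

def gaussian1_pdf(x, mu=0.0, sigma=1.0):
    return (math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))
            / math.sqrt(2.0 * math.pi * sigma ** 2))

# Diagonal C: the joint density equals the product of the 1D marginals.
v = gaussian2_pdf([0.3, -0.7], [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
print(v)  # equals gaussian1_pdf(0.3) * gaussian1_pdf(-0.7)
```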
