Conditional expectation - Department of Mathematics

Conditional densities and expectations

We previously defined the conditional density of X given Y to be
$f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}$
Then
$P(a \le X \le b \mid Y=y) = \int_a^b f_{X|Y}(x|y)\,dx$
Conditioning on Y = y is conditioning on an event with probability zero, so the left side above is not defined directly; we make sense of it by a limiting procedure:
$P(a \le X \le b \mid Y=y) = \lim_{\epsilon \to 0^+} P(a \le X \le b \mid |Y-y| < \epsilon)$
We then define the conditional expectation of X given Y = y to be
$E[X \mid Y=y] = \int_{-\infty}^{\infty} x\, f_{X|Y}(x|y)\,dx$
We have the following continuous analog of the partition theorem:
$E[Y] = \int_{-\infty}^{\infty} E[Y \mid X=x]\, f_X(x)\,dx$

Now recall the discrete case. In some sense the very first definition of conditioning is the one for events A and B:
$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$
assuming that P(B) > 0. If X is a discrete RV, the conditional density of X given the event B is
$f(x|B) = P(X=x \mid B) = \frac{P(X=x,\, B)}{P(B)}$
and the conditional expectation of X given B is
$E[X|B] = \sum_x x\, f(x|B)$
The partition theorem says that if $B_n$ is a partition of the sample space, then
$E[X] = \sum_n E[X|B_n]\, P(B_n)$
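The continuous formulas above can be checked numerically for any concrete joint density. Below is a minimal sketch (my own illustration, not part of the original notes) assuming the made-up joint density $f_{X,Y}(x,y) = x + y$ on the unit square: it builds the marginal $f_X$, computes E[Y|X=x] from the conditional density, and verifies the continuous analog of the partition theorem by numerical integration.

    # Sketch (assumption): check E[Y] = integral of E[Y|X=x] f_X(x) dx numerically
    # for the made-up joint density f_{X,Y}(x,y) = x + y on the unit square.
    import numpy as np

    n = 1000
    dx = dy = 1.0 / n
    xs = (np.arange(n) + 0.5) * dx                   # midpoint grid on [0,1]
    ys = (np.arange(n) + 0.5) * dy

    f = xs[:, None] + ys[None, :]                    # joint density f(x,y) = x + y on the grid

    fX = f.sum(axis=1) * dy                          # marginal f_X(x) = x + 1/2
    # E[Y | X=x] = (1/f_X(x)) * integral of y * f(x,y) dy
    EY_given_x = (f * ys[None, :]).sum(axis=1) * dy / fX

    EY_direct = (f * ys[None, :]).sum() * dx * dy    # E[Y] straight from the joint density
    EY_partition = (EY_given_x * fX).sum() * dx      # continuous partition theorem
    print(EY_direct, EY_partition)                   # both close to 7/12 = 0.5833...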

Now suppose that X and Y are discrete RV's. If y is in the range of Y, then Y = y is an event with nonzero probability, so we can use it as the B in the above. So f(x|Y=y) is defined, and we can change the notation to make it look like the continuous case and write f(x|Y=y) as $f_{X|Y}(x|y)$. Of course it is given by
$f_{X|Y}(x|y) = \frac{P(X=x, Y=y)}{P(Y=y)} = \frac{f_{X,Y}(x,y)}{f_Y(y)}$
This looks identical to the formula in the continuous case, but it is really a different formula: here $f_{X,Y}$ and $f_Y$ are pmf's, while in the continuous case they are pdf's. We have
$E[X \mid Y=y] = \sum_x x\, f_{X|Y}(x|y)$
and the partition theorem is
$E[X] = \sum_y E[X \mid Y=y]\, P(Y=y)$
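Here is a companion sketch for the discrete case (again my own addition, using an arbitrary made-up joint pmf): it computes E[X|Y=y] from the conditional pmf and confirms the discrete partition theorem by direct enumeration.

    # Sketch (assumption): a small made-up joint pmf to check the discrete partition theorem.
    from collections import defaultdict

    joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.25,
             (1, 1): 0.15, (2, 0): 0.05, (2, 1): 0.25}   # P(X=x, Y=y)

    pY = defaultdict(float)
    for (x, y), p in joint.items():
        pY[y] += p                                       # marginal pmf of Y

    def E_X_given_Y(y):
        # E[X|Y=y] = sum_x x * P(X=x, Y=y) / P(Y=y)
        return sum(x * p for (x, yy), p in joint.items() if yy == y) / pY[y]

    EX_direct = sum(x * p for (x, y), p in joint.items())
    EX_partition = sum(E_X_given_Y(y) * pY[y] for y in pY)
    print(EX_direct, EX_partition)                       # the two numbers agree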

Conditional expectation as a random variable

Conditional expectations such as E[X|Y=2] or E[X|Y=5] are numbers. If we consider E[X|Y=y], it is a number that depends on y; so it is a function of y. In this section we will study a new object, E[X|Y], that is a random variable.

Example: Roll a die until we get a 6. Let Y be the total number of rolls and X the number of 1's we get. First we compute E[X|Y=y]. The event Y = y means that there were y-1 rolls that were not a 6 and then the y-th roll was a 6. So given Y = y, X has a binomial distribution with n = y-1 trials and probability of success p = 1/5. So
$E[X \mid Y=y] = np = \frac{1}{5}(y-1)$
Now do the experiment and get an outcome $\omega$. (In this example, $\omega$ would be a string of 1, 2, 3, 4, 5's ending with a 6.) Then we compute $y = Y(\omega)$. (In this example y would just be the number of rolls.) Then we compute E[X|Y=y]. This process gives a function
$\omega \to E[X \mid Y=Y(\omega)]$
So this is a random variable. It is usually written E[X|Y]. In our example $\omega$ is mapped to (y-1)/5 where $y = Y(\omega)$. So $\omega$ is mapped to $(Y(\omega)-1)/5$, and the random variable E[X|Y] is just (Y-1)/5. Note that E[X|Y] is a function of Y. This will be true in general.
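A quick Monte Carlo sketch of this example (my own check, conditioning on the arbitrarily chosen value y = 7): simulate rolling until a 6 appears, record Y and X, and compare the average of X over the runs with Y = 7 against (7-1)/5 = 1.2.

    # Sketch (assumption): simulate the die example and check E[X | Y=y] = (y-1)/5.
    import random

    random.seed(0)
    y_target = 7                         # arbitrary value of Y to condition on
    count, total_ones = 0, 0
    for _ in range(500_000):
        rolls, ones = 0, 0
        while True:
            rolls += 1
            face = random.randint(1, 6)
            if face == 6:
                break
            if face == 1:
                ones += 1
        if rolls == y_target:
            count += 1
            total_ones += ones

    print(total_ones / count)            # empirical E[X | Y=7]
    print((y_target - 1) / 5)            # predicted value 1.2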

Let's try another conditional expectation in the same example: E[X^2|Y]. Again, given Y = y, X has a binomial distribution with n = y-1 trials and p = 1/5. The variance of such a random variable is np(1-p) = 4(y-1)/25. So
$E[X^2 \mid Y=y] - (E[X \mid Y=y])^2 = (y-1)\frac{4}{25}$
Using what we found before,
$E[X^2 \mid Y=y] - \left(\frac{1}{5}(y-1)\right)^2 = (y-1)\frac{4}{25}$
And so
$E[X^2 \mid Y=y] = \frac{1}{25}(y-1)^2 + \frac{4}{25}(y-1)$
Thus
$E[X^2 \mid Y] = \frac{1}{25}(Y-1)^2 + \frac{4}{25}(Y-1) = \frac{1}{25}(Y^2 + 2Y - 3)$
Once again, E[X^2|Y] is a function of Y.
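The algebra in the last two displays is easy to confirm symbolically. A small sketch of such a check (my addition), using sympy:

    # Sketch: symbolic check that (1/25)(y-1)^2 + (4/25)(y-1) = (1/25)(y^2 + 2y - 3).
    import sympy as sp

    y = sp.symbols('y')
    lhs = sp.Rational(1, 25) * (y - 1)**2 + sp.Rational(4, 25) * (y - 1)
    rhs = sp.Rational(1, 25) * (y**2 + 2*y - 3)
    print(sp.simplify(lhs - rhs))        # prints 0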

Intuitively, E[X|Y] is the function of Y that best approximates X. This is a vague statement, since we have not said what "best" means, but consider two extreme cases. First suppose that X is itself a function of Y, e.g., $Y^2$ or $e^Y$. Then the function of Y that best approximates X is X itself. (Whatever best means, you can't do any better than this.) The other extreme case is when X and Y are independent. In this case, knowing Y tells us nothing about X, so we might expect that E[X|Y] will not depend on Y. Indeed, we will see below (property (iv)) that in this case E[X|Y] = E[X].

Properties of conditional expectation

Before we list all the properties of E[X|Y], we need to explain conditioning on more than one random variable. Let X, Y, Z be random variables. E[X|Y=y, Z=z] makes sense, and we can think of it as a function of the random outcome $\omega$:
$\omega \to E[X \mid Y=Y(\omega), Z=Z(\omega)]$
So it is a random variable, which we denote by E[X|Y, Z]. In the continuous case we need to define E[X|Y=y, Z=z] by a limiting procedure; it gives a function of y and z that we can once again interpret as a random variable.

Properties: Let X, Y, Z be random variables, $a, b \in \mathbb{R}$, and $g: \mathbb{R} \to \mathbb{R}$. Assuming all the following expectations exist, we have
(i) $E[a \mid Y] = a$
(ii) $E[aX + bZ \mid Y] = a E[X \mid Y] + b E[Z \mid Y]$
(iii) $E[X \mid Y] \ge 0$ if $X \ge 0$.
(iv) $E[X \mid Y] = E[X]$ if X and Y are independent.
(v) $E[E[X \mid Y]] = E[X]$
(vi) $E[X g(Y) \mid Y] = g(Y) E[X \mid Y]$. In particular, $E[g(Y) \mid Y] = g(Y)$.
(vii) $E[X \mid Y, g(Y)] = E[X \mid Y]$
(viii) $E[E[X \mid Y, Z] \mid Y] = E[X \mid Y]$
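Two of these properties can be verified by direct enumeration on a small discrete example. The following sketch (my own, with made-up marginals and g(y) = y^2 as an arbitrary choice) checks (iv) and (vi) when X and Y are independent.

    # Sketch (assumption): check properties (iv) and (vi) on a made-up discrete example
    # with X and Y independent and g(y) = y**2 chosen arbitrarily.
    pX = {1: 0.2, 2: 0.5, 3: 0.3}                    # marginal pmf of X
    pY = {1: 0.4, 2: 0.6}                            # marginal pmf of Y

    def g(y):
        return y ** 2

    joint = {(x, y): px * py for x, px in pX.items() for y, py in pY.items()}
    EX = sum(x * px for x, px in pX.items())

    for y, py in pY.items():
        E_X_given_y = sum(x * joint[(x, y)] for x in pX) / py
        E_Xg_given_y = sum(x * g(y) * joint[(x, y)] for x in pX) / py
        print(E_X_given_y, EX)                       # property (iv): the two agree
        print(E_Xg_given_y, g(y) * E_X_given_y)      # property (vi): the two agree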

Partial proofs: The first three are not hard to prove, and we leave them to the reader.

Consider (iv). We prove the continuous case and leave the discrete case to the reader. If X and Y are independent, then
$f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{f_X(x) f_Y(y)}{f_Y(y)} = f_X(x)$
So
$E[X \mid Y=y] = \int x\, f_{X|Y}(x|y)\,dx = \int x\, f_X(x)\,dx = E[X]$

Consider (v). We need to compute the expected value of the random variable E[X|Y]. It is a function of Y, and it takes on the value E[X|Y=y] when Y = y. So by the law of the unconscious whatever,
$E[E[X \mid Y]] = \sum_y E[X \mid Y=y]\, P(Y=y)$
By the partition theorem this is equal to E[X]. So in the discrete case, (v) is really the partition theorem; in the continuous case it is the continuous analog of the partition theorem.

Consider (vi). We must compute E[Xg(Y)|Y=y]. Given that Y = y, the possible values of Xg(Y) are x g(y) where x varies over the range of X. The probability of the value x g(y) given that Y = y is just P(X=x|Y=y). So
$E[X g(Y) \mid Y=y] = \sum_x x\, g(y)\, P(X=x \mid Y=y) = g(y) \sum_x x\, P(X=x \mid Y=y) = g(y)\, E[X \mid Y=y]$
This proves (vi).
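Property (v) can also be seen in the earlier die example: there E[X|Y] = (Y-1)/5, and since Y has mean 6, property (v) predicts E[X] = (E[Y]-1)/5 = 1. A short Monte Carlo sketch (my own) of this check:

    # Sketch (assumption): property (v) in the die example, E[X] = E[(Y-1)/5] = (6-1)/5 = 1.
    import random

    random.seed(3)
    runs = 500_000
    sum_x = sum_y = 0
    for _ in range(runs):
        rolls, ones = 0, 0
        while True:
            rolls += 1
            face = random.randint(1, 6)
            if face == 6:
                break
            if face == 1:
                ones += 1
        sum_x += ones
        sum_y += rolls

    print(sum_x / runs)                  # empirical E[X], close to 1
    print((sum_y / runs - 1) / 5)        # empirical E[(Y-1)/5] = E[E[X|Y]], also close to 1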

Consider (viii). Again, we need to compute E[E[X|Y,Z]|Y=y]. E[X|Y,Z] is a random variable; given Y = y, its possible values are E[X|Y=y, Z=z] where z varies over the range of Z. Given that Y = y, the probability that E[X|Y,Z] equals E[X|Y=y, Z=z] is just P(Z=z|Y=y). Hence
$E[E[X \mid Y, Z] \mid Y=y] = \sum_z E[X \mid Y=y, Z=z]\, P(Z=z \mid Y=y)$
$= \sum_z \sum_x x\, P(X=x \mid Y=y, Z=z)\, P(Z=z \mid Y=y)$
$= \sum_{z,x} x\, \frac{P(X=x, Y=y, Z=z)}{P(Y=y, Z=z)}\, \frac{P(Z=z, Y=y)}{P(Y=y)}$
$= \sum_{z,x} x\, \frac{P(X=x, Y=y, Z=z)}{P(Y=y)}$
$= \sum_x x\, \frac{P(X=x, Y=y)}{P(Y=y)}$
$= \sum_x x\, P(X=x \mid Y=y)$
$= E[X \mid Y=y] \qquad (1)$
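This calculation can be mimicked directly in code. A sketch (my own, using a randomly generated joint pmf for (X, Y, Z)): it computes E[X|Y=y, Z=z], averages over z with weights P(Z=z|Y=y), and compares the result with E[X|Y=y] computed directly.

    # Sketch (assumption): check E[ E[X|Y,Z] | Y=y ] = E[X|Y=y] on a randomly generated pmf.
    import itertools, random

    random.seed(1)
    xs, ys, zs = [0, 1, 2], [0, 1], [0, 1]
    w = {k: random.random() for k in itertools.product(xs, ys, zs)}
    tot = sum(w.values())
    p = {k: v / tot for k, v in w.items()}                 # joint pmf of (X, Y, Z)

    for y in ys:
        py = sum(p[(x, y, z)] for x in xs for z in zs)                     # P(Y=y)
        E_X_given_y = sum(x * p[(x, y, z)] for x in xs for z in zs) / py   # E[X|Y=y]
        tower = 0.0
        for z in zs:
            pyz = sum(p[(x, y, z)] for x in xs)                            # P(Y=y, Z=z)
            E_X_given_yz = sum(x * p[(x, y, z)] for x in xs) / pyz         # E[X|Y=y, Z=z]
            tower += E_X_given_yz * (pyz / py)             # weight by P(Z=z | Y=y)
        print(E_X_given_y, tower)                          # the two values agree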

Example: Let X and Y be independent; each is uniformly distributed on [0,1]. Let Z = X + Y. Find E[Z|X], E[X|Z], E[XZ|X], E[XZ|Z].

We start with the easy ones:
$E[Z \mid X] = E[X+Y \mid X] = E[X \mid X] + E[Y \mid X] = X + E[Y] = X + \frac{1}{2}$
where we have used the independence of X and Y, property (iv), and the special case of (vi). Using property (vi),
$E[XZ \mid X] = X\, E[Z \mid X] = X\left(X + \frac{1}{2}\right)$
Now we do the hard one: E[X|Z]. We need the joint pdf of X and Z, so we do a change of variables: W = X, Z = X + Y. This is a linear transformation, so the Jacobian will be a constant. We need to find the image of the square $0 \le x, y \le 1$. Since the transformation is linear, the four edges of the square will be mapped to line segments, and to find them we can just compute where the four corners of the square are mapped:
$(x,y) = (0,0) \to (w,z) = (0,0)$
$(x,y) = (1,0) \to (w,z) = (1,1)$
$(x,y) = (0,1) \to (w,z) = (0,1)$
$(x,y) = (1,1) \to (w,z) = (1,2)$
So the image of the square is the parallelogram A with vertices (0,0), (1,1), (0,1) and (1,2). The joint density of W and Z will be constant on A. Since A has area 1, we conclude
$f_{W,Z}(w,z) = \mathbf{1}\big((w,z) \in A\big)$
Note that W = X, so from this we can figure out what $f_{X|Z}(x|z)$ must be.
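A Monte Carlo sketch (my own check) that the joint density of (W, Z) really is the indicator of this parallelogram: sample X and Y, form W = X and Z = X + Y, and compare the fraction of samples falling in a few small rectangles inside A with the rectangles' areas; the two should match when the density there equals 1.

    # Sketch (assumption): empirical check that f_{W,Z} = 1 on the parallelogram A.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.uniform(size=n)
    y = rng.uniform(size=n)
    w, z = x, x + y                      # W = X, Z = X + Y

    # a few small boxes that lie inside A = {0 <= w <= 1, w <= z <= w + 1}
    boxes = [(0.2, 0.3, 0.4, 0.5), (0.6, 0.7, 0.9, 1.0), (0.1, 0.2, 0.8, 0.9)]
    for (w0, w1, z0, z1) in boxes:
        frac = np.mean((w >= w0) & (w < w1) & (z >= z0) & (z < z1))
        area = (w1 - w0) * (z1 - z0)
        print(frac, area)                # should agree if the density is 1 on A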

We consider two cases. First suppose $0 \le z \le 1$. Given Z = z, X is uniformly distributed between 0 and z, so E[X|Z=z] = z/2. In the other case, $1 \le z \le 2$, X is uniformly distributed between z-1 and 1, so E[X|Z=z] = (z-1+1)/2 = z/2. So in both cases E[X|Z=z] = z/2, and thus E[X|Z] = Z/2. Finally, using property (vi),
$E[XZ \mid Z] = Z\, E[X \mid Z] = Z^2/2$
We can get a small check on our answers using property (v). We found E[Z|X] = X + 1/2, whose mean is E[X] + 1/2 = 1/2 + 1/2 = 1; property (v) says E[E[Z|X]] = E[Z] = E[X] + E[Y] = 1/2 + 1/2 = 1. We also found E[X|Z] = Z/2, so the mean of this random variable is E[Z]/2 = 1/2; property (v) says it should be E[X] = 1/2.
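A companion Monte Carlo sketch (again my own) for these answers: bin the simulated samples by Z and compare the average of X in each thin slice with z/2, and similarly check E[Z|X=x] = x + 1/2 by slicing on X.

    # Sketch (assumption): Monte Carlo check that E[X|Z=z] = z/2 and E[Z|X=x] = x + 1/2.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    x = rng.uniform(size=n)
    z = x + rng.uniform(size=n)

    for z0 in (0.5, 1.0, 1.5):
        sel = np.abs(z - z0) < 0.01                  # thin slice around Z = z0
        print(z0, x[sel].mean(), z0 / 2)             # empirical vs predicted z0/2

    for x0 in (0.25, 0.75):
        sel = np.abs(x - x0) < 0.01                  # thin slice around X = x0
        print(x0, z[sel].mean(), x0 + 0.5)           # empirical vs predicted x0 + 1/2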

Example (random sums): Let $X_1, X_2, \ldots$ be independent, identically distributed random variables with mean $\mu$ and variance $\sigma^2$. Let N be a RV that is independent of all the $X_n$ and takes on the values 1, 2, 3, .... Let
$S_N = \sum_{j=1}^{N} X_j$
Note that the number of terms in the sum is random. We will find E[S_N|N] and E[S_N^2|N] and use them to compute the mean and variance of S_N. Given N = n, S_N is a sum with a fixed number of terms, $\sum_{j=1}^{n} X_j$, so $E[S_N \mid N=n] = n\mu$ and thus $E[S_N \mid N] = N\mu$. Since the $X_j$ are independent, their variances add, and so
$E[S_N^2 \mid N=n] - (E[S_N \mid N=n])^2 = \mathrm{var}[S_N \mid N=n] = n\sigma^2$
So
$E[S_N^2 \mid N] = N\sigma^2 + (E[S_N \mid N])^2 = N\sigma^2 + N^2\mu^2$
Now using property (v), we have
$E[S_N] = E[E[S_N \mid N]] = E[N\mu] = \mu\, E[N]$
$E[S_N^2] = E[E[S_N^2 \mid N]] = E[N\sigma^2 + N^2\mu^2] = \sigma^2 E[N] + \mu^2 E[N^2]$
and so the variance of S_N is
$\mathrm{var}(S_N) = E[S_N^2] - (E[S_N])^2 = \sigma^2 E[N] + \mu^2 E[N^2] - \mu^2 E[N]^2 = \sigma^2 E[N] + \mu^2\, \mathrm{var}(N)$
(A simulation sketch checking these formulas appears at the end of this section.)

Conditional expectation as a "best approximation"

We have said that E[X|Y] is the function of Y that best approximates X. In this section we make this precise. We will assume our random variables are discrete, but what we do here is true in the continuous case as well. We need to make precise what it means to "be a function of Y." In the discrete case we can simply say that X is a function of Y if there is a function $g: \mathbb{R} \to \mathbb{R}$ such that X = g(Y).
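As promised above, here is a simulation sketch of the random-sums formulas (my own check, with arbitrary distributional choices: N geometric on {1, 2, 3, ...} with parameter p, and the X_j exponential with mean 1, so that mu = sigma^2 = 1).

    # Sketch (assumption): check var(S_N) = sigma^2 E[N] + mu^2 var(N) by simulation,
    # with N ~ Geometric(p) on {1,2,...} and X_j ~ Exponential(1) as arbitrary choices.
    import numpy as np

    rng = np.random.default_rng(2)
    p = 0.3
    mu, sigma2 = 1.0, 1.0                       # mean and variance of Exponential(1)
    EN, varN = 1 / p, (1 - p) / p**2            # mean and variance of the geometric N

    n_runs = 200_000
    N = rng.geometric(p, size=n_runs)
    S = np.array([rng.exponential(1.0, size=k).sum() for k in N])

    print(S.var())                              # empirical variance of S_N
    print(sigma2 * EN + mu**2 * varN)           # predicted value, about 11.1 here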

