
Notes 3: Modes of convergence


Math 733-734: Theory of Probability
Lecturer: Sebastien Roch
References: [Wil91, Chapters ...], [Dur10, Sections ...].

1 Modes of convergence

Let $(\Omega, \mathcal{F}, P)$ be a probability space. We will encounter various modes of convergence for sequences of RVs on $(\Omega, \mathcal{F}, P)$.

DEF (Modes of convergence) Let $\{X_n\}_n$ be a sequence of (not necessarily independent) RVs and let $X$ be a RV. Then we have the following definitions.

- Convergence in probability: $\forall \varepsilon > 0$, $P[|X_n - X| > \varepsilon] \to 0$ (as $n \to +\infty$), which we denote by $X_n \to_P X$.
- Almost sure convergence: $P[X_n \to X] = 1$.
- Convergence in $L^p$ ($p \ge 1$): $E|X_n - X|^p \to 0$.

To better understand the relationship between these different modes of convergence, we will need Markov's inequality as well as the Borel-Cantelli lemmas. We first state these, then come back to applications of independent interest.
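As a quick numerical illustration of the first definition (this example is not part of the original notes), the following Python sketch estimates $P[|X_n - X| > \varepsilon]$ by Monte Carlo for $X_n$ the sample mean of $n$ fair coin flips and $X = 1/2$; the probability shrinks as $n$ grows, as convergence in probability requires. All parameters are illustrative choices.

    # Monte Carlo sketch (not from the notes): estimate P[|X_n - X| > eps] for
    # X_n = (number of heads in n fair coin flips)/n and X = 1/2.
    import numpy as np

    rng = np.random.default_rng(0)
    eps = 0.05          # the epsilon in the definition of convergence in probability
    trials = 100_000    # Monte Carlo sample size (illustrative choice)

    for n in [10, 100, 1_000, 10_000]:
        X_n = rng.binomial(n, 0.5, size=trials) / n   # realizations of the sample mean
        prob = np.mean(np.abs(X_n - 0.5) > eps)       # empirical P[|X_n - 1/2| > eps]
        print(f"n = {n:6d}   P[|X_n - 1/2| > {eps}] ~ {prob:.5f}")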

1.1 Markov's inequality

LEM (Markov's inequality) Let $Z \ge 0$ be a RV on $(\Omega, \mathcal{F}, P)$. Then for all $a > 0$,
\[
P[Z \ge a] \le \frac{E[Z]}{a}.
\]
Proof: We have
\[
E[Z] \ge E[Z \mathbf{1}_{\{Z \ge a\}}] \ge a\, E[\mathbf{1}_{\{Z \ge a\}}] = a\, P[Z \ge a],
\]
where the first inequality uses that $Z \ge 0$.

Recall that (assuming the first and second moments exist)
\[
\mathrm{Var}[X] = E[(X - E[X])^2] = E[X^2] - (E[X])^2.
\]

LEM (Chebyshev's inequality) Let $X$ be a RV on $(\Omega, \mathcal{F}, P)$ with $\mathrm{Var}[X] < +\infty$. Then for all $a > 0$,
\[
P[|X - E[X]| > a] \le \frac{\mathrm{Var}[X]}{a^2}.
\]
Proof: Apply Markov's inequality to $Z = (X - E[X])^2$.

An immediate application of Chebyshev's inequality is the following.

THM Let $(S_n)_n$ be a sequence of RVs with $\mu_n = E[S_n]$ and $\sigma_n^2 = \mathrm{Var}[S_n]$. If $\sigma_n^2 / b_n^2 \to 0$, then
\[
\frac{S_n - \mu_n}{b_n} \to_P 0.
\]
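The following sketch (not from the notes) checks Chebyshev's inequality numerically for a Binomial$(n, 1/2)$ RV, comparing the empirical tail probability to the bound $\mathrm{Var}[X]/a^2$; the distribution and parameters are illustrative choices.

    # Sanity check (not from the notes): empirical tail P[|X - E[X]| > a] versus
    # the Chebyshev bound Var[X]/a^2 for X ~ Binomial(n, 1/2).
    import numpy as np

    rng = np.random.default_rng(1)
    n, p, trials = 1000, 0.5, 100_000
    X = rng.binomial(n, p, size=trials)
    mean, var = n * p, n * p * (1 - p)

    for a in [10, 20, 40]:
        empirical = np.mean(np.abs(X - mean) > a)
        bound = var / a**2
        print(f"a = {a:3d}   empirical tail = {empirical:.4f}   Chebyshev bound = {bound:.4f}")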

1.2 Borel-Cantelli lemmas

DEF (Almost surely) Event $A$ occurs almost surely (a.s.) if $P[A] = 1$.

DEF (Infinitely often, eventually) Let $(A_n)_n$ be a sequence of events. Then we define
\[
\{A_n \text{ i.o.}\} = \{\omega : \omega \text{ is in infinitely many } A_n\} = \limsup_n A_n = \bigcap_m \bigcup_{n \ge m} A_n,
\]
\[
\{A_n \text{ ev.}\} = \{\omega : \omega \text{ is in } A_n \text{ for all large } n\} = \liminf_n A_n = \bigcup_m \bigcap_{n \ge m} A_n,
\]
where "i.o." stands for infinitely often and "ev." for eventually. Note that we have $(A_n \text{ ev.})^c = (A_n^c \text{ i.o.})$.

LEM (First Borel-Cantelli lemma (BC1)) Let $(A_n)_n$ be as above. If $\sum_n P[A_n] < +\infty$, then $P[A_n \text{ i.o.}] = 0$.
Proof: This follows from the monotone convergence theorem (or Fubini's theorem). Indeed let $N = \sum_n \mathbf{1}_{A_n}$. Then
\[
E[N] = \sum_n P[A_n] < +\infty,
\]
and therefore $N < +\infty$ a.s.

EX Let $X_1, X_2, \ldots$ be independent with $P[X_n = f_n] = p_n$ and $P[X_n = 0] = 1 - p_n$ for nondecreasing $f_n > 0$ and nonincreasing $p_n > 0$. By (BC1), if $\sum_n p_n < +\infty$ then $X_n \to 0$ a.s.

The converse is only true in general for independent sequences.

LEM (Second Borel-Cantelli lemma (BC2)) If the events $(A_n)_n$ are independent, then $\sum_n P[A_n] = +\infty$ implies $P[A_n \text{ i.o.}] = 1$.
Proof: Take $M < N < +\infty$. Then by independence
\[
P\Big[\bigcap_{n=M}^{N} A_n^c\Big] = \prod_{n=M}^{N} (1 - P[A_n]) \le \exp\Big(-\sum_{n=M}^{N} P[A_n]\Big) \to 0,
\]
as $N \to +\infty$. So $P[\bigcup_{n \ge M} A_n] = 1$, and further
\[
P\Big[\bigcap_{M} \bigcup_{n \ge M} A_n\Big] = 1,
\]
since a countable intersection of almost sure events is almost sure.
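The dichotomy between (BC1) and (BC2) can be seen in a finite-horizon simulation (not from the notes): for independent events $A_n$ with $P[A_n] = 1/n$ the sum diverges and occurrences keep appearing, while for $P[A_n] = 1/n^2$ the sum converges and occurrences typically stop early. The horizon and seed below are illustrative choices.

    # Finite-horizon illustration (not from the notes) of the Borel-Cantelli dichotomy.
    import numpy as np

    rng = np.random.default_rng(2)
    horizon = 100_000
    idx = np.arange(1, horizon + 1)

    for label, probs in [("p_n = 1/n  ", 1.0 / idx), ("p_n = 1/n^2", 1.0 / idx**2)]:
        occurred = rng.random(horizon) < probs              # indicator of A_n for each n
        last = idx[occurred].max() if occurred.any() else 0
        print(f"{label}   occurrences = {int(occurred.sum()):5d}   "
              f"last occurrence (n <= {horizon}): {int(last)}")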

EX Let $X_1, X_2, \ldots$ be independent with $P[X_n = f_n] = p_n$ and $P[X_n = 0] = 1 - p_n$ for nondecreasing $f_n > 0$ and nonincreasing $p_n > 0$. By (BC1) and (BC2), $X_n \to 0$ a.s. if and only if $\sum_n p_n < +\infty$.

1.3 Returning to convergence modes

We return to our example.

EX Let $X_1, X_2, \ldots$ be independent with $P[X_n = f_n] = p_n$ and $P[X_n = 0] = 1 - p_n$ for nondecreasing $f_n > 0$ and nonincreasing $p_n > 0$. The cases $f_n = 1$, $f_n = \sqrt{n}$, and $f_n = n^2$ are interesting.

In the first one, convergence in probability (which is equivalent to $p_n \to 0$) and convergence in $L^r$ (also equivalent to $p_n \to 0$) are identical, but a.s. convergence follows from a stronger condition ($\sum_n p_n < +\infty$).

In the second one, convergence in $L^1$ ($\sqrt{n}\, p_n \to 0$) can happen without convergence a.s. ($\sum_n p_n < +\infty$) or in $L^2$ ($n p_n \to 0$). Take for instance $p_n = 1/n$.

In the last one, a.s. convergence ($\sum_n p_n < +\infty$) can happen without convergence in $L^1$ ($n^2 p_n \to 0$) or in $L^2$ ($n^4 p_n \to 0$). Take for instance $p_n = 1/n^2$.
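To make the second case concrete (an illustration that is not part of the notes), the following sketch tabulates, for $f_n = \sqrt{n}$ and $p_n = 1/n$, the quantities $E|X_n| = \sqrt{n}\, p_n$, $E|X_n|^2 = n p_n$, and the partial sums $\sum_{k \le n} p_k$: the first tends to $0$ ($L^1$ convergence), the second stays at $1$ (no $L^2$ convergence), and the partial sums diverge (so, by (BC2), no a.s. convergence).

    # Illustration (not from the notes) of the case f_n = sqrt(n), p_n = 1/n.
    import math

    for n in [10, 100, 1_000, 10_000]:
        p_n = 1.0 / n
        first_moment = math.sqrt(n) * p_n                      # E|X_n|
        second_moment = n * p_n                                # E|X_n|^2
        partial_sum = sum(1.0 / k for k in range(1, n + 1))    # sum of p_k up to n
        print(f"n = {n:6d}   E|X_n| = {first_moment:.4f}   E|X_n|^2 = {second_moment:.1f}"
              f"   sum p_k = {partial_sum:.2f}")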

In general we have:

THM (Implications)
- a.s. convergence implies convergence in prob (Hint: Fatou's lemma).
- Convergence in $L^p$ implies convergence in prob (Hint: Markov's inequality).
- For $r \ge p \ge 1$, convergence in $L^r$ implies convergence in $L^p$ (Hint: Jensen's inequality).
- $X_n \to X$ in prob if and only if every subsequence contains a further subsequence that converges to $X$ a.s. (Hint: (BC1) for the $\Rightarrow$ direction).

Proof: We prove the first, the second, and (one direction of) the fourth implication. For the first one, we need the following lemma.

LEM (Reverse Fatou lemma) Let $(S, \Sigma, \mu)$ be a measure space. Let $(f_n)_n \subseteq (m\Sigma)^+$ be such that there is $g \in (m\Sigma)^+$ with $f_n \le g$ for all $n$ and $\mu(g) < +\infty$. Then
\[
\mu(\limsup_n f_n) \ge \limsup_n \mu(f_n).
\]
Proof: This follows from applying (FATOU) to $g - f_n$.

Using the previous lemma on $\mathbf{1}_{\{|X_n - X| > \varepsilon\}}$ gives the first claim. For the second claim, note that by Markov's inequality
\[
P[|X_n - X| > \varepsilon] = P[|X_n - X|^p > \varepsilon^p] \le \frac{E|X_n - X|^p}{\varepsilon^p}.
\]
One direction of the fourth claim follows from (BC1). Indeed let $(X_{n(m)})_m$ be a subsequence of $(X_n)_n$. Take $\varepsilon_k \downarrow 0$ and let $m_k$ be such that $n(m_k) > n(m_{k-1})$ and
\[
P[|X_{n(m_k)} - X| > \varepsilon_k] \le 2^{-k},
\]
which is summable. Therefore by (BC1), $P[|X_{n(m_k)} - X| > \varepsilon_k \text{ i.o.}] = 0$, that is, $X_{n(m_k)} \to X$ a.s. For the other direction, see [D].

As a consequence of the last implication we get the following.

THM If $f$ is continuous and $X_n \to X$ in prob, then $f(X_n) \to f(X)$ in prob.
Proof: For every subsequence $(X_{n(m)})_m$ there is a further subsequence $(X_{n(m_k)})_k$ which converges to $X$ a.s., and hence $f(X_{n(m_k)}) \to f(X)$ a.s. By the fourth implication above, this implies that $f(X_n) \to f(X)$ in prob.

The previous example and theorem show that a.s. convergence does not come from a topology (or in particular from a metric). In contrast, it is possible to show that convergence in probability corresponds to the Ky Fan metric
\[
\rho(X, Y) = \inf\{\varepsilon \ge 0 : P[|X - Y| > \varepsilon] \le \varepsilon\}.
\]
See [D].
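The Ky Fan distance can be estimated numerically; the sketch below (not from the notes) does so by Monte Carlo for $X \sim N(0,1)$ and $Y = X$ plus independent $N(0, \sigma^2)$ noise, scanning a grid of candidate $\varepsilon$ values. The helper name `ky_fan_estimate`, the distributions, and all parameters are illustrative assumptions; as the noise shrinks, the estimated distance shrinks, reflecting convergence in probability.

    # Crude Monte Carlo estimate (not from the notes) of the Ky Fan distance
    # rho(X, Y) = inf{eps >= 0 : P[|X - Y| > eps] <= eps}.
    import numpy as np

    def ky_fan_estimate(diff_samples, grid_size=10_000):
        """Estimate rho from samples of X - Y by scanning increasing eps values."""
        abs_diff = np.abs(diff_samples)
        for eps in np.linspace(0.0, abs_diff.max(), grid_size):
            if np.mean(abs_diff > eps) <= eps:    # first eps satisfying the defining condition
                return eps
        return abs_diff.max()

    rng = np.random.default_rng(3)
    samples = 100_000
    X = rng.normal(0.0, 1.0, size=samples)
    for sigma in [1.0, 0.3, 0.1, 0.01]:
        Y = X + rng.normal(0.0, sigma, size=samples)
        print(f"sigma = {sigma:5.2f}   estimated rho(X, Y) ~ {ky_fan_estimate(X - Y):.4f}")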

1.4 Statement of laws of large numbers

Our first goal will be to prove the following.

THM (Strong law of large numbers) Let $X_1, X_2, \ldots$ be IID with $E|X_1| < +\infty$. (In fact, pairwise independence suffices.) Let $S_n = \sum_{k \le n} X_k$ and $\mu = E[X_1]$. Then
\[
\frac{S_n}{n} \to \mu, \quad \text{a.s.}
\]
If instead $E|X_1| = +\infty$, then
\[
P\Big[\lim_n \frac{S_n}{n} \text{ exists in } (-\infty, +\infty)\Big] = 0.
\]

THM (Weak law of large numbers) Let $(X_n)_n$ be IID and $S_n = \sum_{k \le n} X_k$. A necessary and sufficient condition for the existence of constants $(\mu_n)_n$ such that
\[
\frac{S_n}{n} - \mu_n \to_P 0
\]
is $n\, P[|X_1| > n] \to 0$. In that case, the choice $\mu_n = E[X_1 \mathbf{1}_{\{|X_1| \le n\}}]$ works.
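The contrast between the two regimes of the strong law can be seen in a simulation (not part of the notes): running sample means of an IID Exponential$(1)$ sequence (finite mean $1$) settle down, while those of an IID standard Cauchy sequence ($E|X_1| = +\infty$) keep fluctuating. Distributions, sample sizes, and seed are illustrative choices.

    # Illustration (not from the notes): running sample means S_n/n for a finite-mean
    # IID sequence (Exponential(1)) versus an infinite-mean one (standard Cauchy).
    import numpy as np

    rng = np.random.default_rng(4)
    N = 1_000_000
    ns = np.arange(1, N + 1)
    exp_means = np.cumsum(rng.exponential(1.0, size=N)) / ns
    cauchy_means = np.cumsum(rng.standard_cauchy(size=N)) / ns

    for n in [100, 10_000, 1_000_000]:
        print(f"n = {n:8d}   Exp(1) mean ~ {exp_means[n - 1]:8.4f}"
              f"   Cauchy mean ~ {cauchy_means[n - 1]:10.4f}")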

Before we give the proofs of these theorems, we discuss further applications of Markov's inequality and the Borel-Cantelli lemmas.

2 Further applications

2.1 Of Chebyshev's inequality

Chebyshev's inequality and the theorem above can be used to derive limit laws in some cases where the sequences are not necessarily IID. We give several important examples from [D].

EX (Occupancy problem) Suppose we throw $r$ balls into $n$ bins independently and uniformly at random. Let $N_n$ be the number of empty bins. If $A_i$ is the event that the $i$-th bin is empty, we have
\[
P[A_i] = \Big(1 - \frac{1}{n}\Big)^r,
\]
so that $N_n = \sum_{k \le n} \mathbf{1}_{A_k}$ (the indicators are not independent) and
\[
E[N_n] = n\Big(1 - \frac{1}{n}\Big)^r.
\]
In particular, if $r/n \to c$ we have $E[N_n]/n \to e^{-c}$. Because there is no independence, the variance calculation is trickier. Note that
\[
E[N_n^2] = E\Big[\Big(\sum_{m=1}^n \mathbf{1}_{A_m}\Big)^2\Big] = \sum_{1 \le m, m' \le n} P[A_m \cap A_{m'}],
\]
and
\[
\mathrm{Var}[N_n] = E[N_n^2] - (E[N_n])^2 = \sum_{1 \le m, m' \le n} \big(P[A_m \cap A_{m'}] - P[A_m] P[A_{m'}]\big)
\]
\[
= n(n-1)\big[(1 - 2/n)^r - (1 - 1/n)^{2r}\big] + n\big[(1 - 1/n)^r - (1 - 1/n)^{2r}\big] = o(n^2) + O(n),
\]
where we divided the sum into the cases $m \ne m'$ and $m = m'$. Taking $b_n = n$ in the theorem above, we have $N_n/n \to_P e^{-c}$.
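A quick simulation (not in the notes) of the occupancy problem: throw $r = \lfloor c n \rfloor$ balls into $n$ bins and compare the fraction of empty bins $N_n/n$ to the limit $e^{-c}$. The value of $c$, the sizes, and the seed are illustrative choices.

    # Simulation (not from the notes): fraction of empty bins versus e^{-c} when r ~ c*n.
    import numpy as np

    rng = np.random.default_rng(5)
    c = 1.5
    for n in [100, 1_000, 10_000, 100_000]:
        r = int(c * n)
        bins = rng.integers(0, n, size=r)        # bin index chosen by each ball
        empty = n - len(np.unique(bins))         # number of empty bins N_n
        print(f"n = {n:6d}   N_n/n = {empty / n:.4f}   e^(-c) = {np.exp(-c):.4f}")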

EX (Coupon collector's problem) Let $X_1, X_2, \ldots$ be IID uniform on $[n] = \{1, \ldots, n\}$. We are interested in the time it takes to see every element of $[n]$ at least once. Let
\[
\tau^n_k = \inf\{m : |\{X_1, \ldots, X_m\}| = k\}
\]
be the first time we have collected $k$ different items, with the convention $\tau^n_0 = 0$, and let $T_n = \tau^n_n$. Define $X_{n,k} = \tau^n_k - \tau^n_{k-1}$ and note that the $X_{n,k}$'s are independent (but not identically distributed), with geometric distribution of parameter $1 - (k-1)/n$. Recall that a geometric RV $N$ with parameter $p$ has law
\[
P[N = i] = p(1-p)^{i-1},
\]
and moments
\[
E[N] = \frac{1}{p}, \qquad \mathrm{Var}[N] = \frac{1-p}{p^2} \le \frac{1}{p^2}.
\]
Hence
\[
E[T_n] = \sum_{k=1}^n \Big(1 - \frac{k-1}{n}\Big)^{-1} = n \sum_{m=1}^n \frac{1}{m} \sim n \log n,
\]
and
\[
\mathrm{Var}[T_n] \le \sum_{k=1}^n \Big(1 - \frac{k-1}{n}\Big)^{-2} = n^2 \sum_{m=1}^n \frac{1}{m^2} \le C n^2,
\]
for some $C > 0$ not depending on $n$. Applying the theorem above (with $b_n = n \log n$) gives
\[
\frac{T_n - n \sum_{m=1}^n m^{-1}}{n \log n} \to_P 0,
\]
or $T_n/(n \log n) \to_P 1$.

The previous example involved a so-called triangular array $\{X_{n,k}\}_{n \ge 1,\, 1 \le k \le n}$.
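The limit $T_n/(n \log n) \to_P 1$ is easy to observe in a simulation (not part of the notes); the helper `collect_time` below draws uniform coupons until all $n$ have been seen. Names and sizes are illustrative.

    # Simulation (not from the notes): coupon collector time T_n compared to n*log(n).
    import numpy as np

    rng = np.random.default_rng(6)

    def collect_time(n):
        """Draw uniform coupons from {0,...,n-1} until all are seen; return the number of draws."""
        seen, draws = set(), 0
        while len(seen) < n:
            seen.add(int(rng.integers(0, n)))
            draws += 1
        return draws

    for n in [100, 1_000, 10_000]:
        T_n = collect_time(n)
        print(f"n = {n:6d}   T_n = {T_n:7d}   T_n / (n log n) = {T_n / (n * np.log(n)):.3f}")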

EX (Random permutations) Any permutation can be decomposed into cycles. For example, if $\pi = [3, 9, 6, 8, 2, 1, 5, 4, 7]$, then $\pi = (136)(2975)(48)$. In fact, a uniform permutation can be generated by following a cycle until it closes, then starting over from the smallest unassigned element, and so on. Let $X_{n,k}$ be the indicator that the $k$-th element in this construction precedes the closure of a cycle, that is, it is the last element of its cycle. In the example above, we have $X_{9,3} = X_{9,7} = X_{9,9} = 1$. The construction implies that the $X_{n,k}$'s are independent and
\[
P[X_{n,j} = 1] = \frac{1}{n - j + 1}.
\]
That is because only one of the remaining elements closes the cycle. Letting $S_n = \sum_{k \le n} X_{n,k}$ be the number of cycles, we have
\[
E[S_n] = \sum_{j=1}^n \frac{1}{n - j + 1} \sim \log n,
\]
and
\[
\mathrm{Var}[S_n] = \sum_{j=1}^n \mathrm{Var}[X_{n,j}] \le \sum_{j=1}^n E[X_{n,j}^2] = \sum_{j=1}^n E[X_{n,j}] = E[S_n].
\]
Taking $b_n = \log n$ in the theorem above, we have $S_n/\log n \to_P 1$.
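As a closing illustration (not part of the notes), the sketch below counts the cycles $S_n$ of a uniform random permutation and compares $S_n/\log n$ to $1$; the helper `count_cycles` and the sizes are illustrative.

    # Simulation (not from the notes): number of cycles of a uniform random permutation
    # of {0,...,n-1}, compared to log(n).
    import numpy as np

    rng = np.random.default_rng(7)

    def count_cycles(perm):
        """Count the cycles of the permutation i -> perm[i]."""
        seen = np.zeros(len(perm), dtype=bool)
        cycles = 0
        for start in range(len(perm)):
            if not seen[start]:
                cycles += 1
                j = start
                while not seen[j]:     # follow the cycle until it closes
                    seen[j] = True
                    j = perm[j]
        return cycles

    for n in [100, 10_000, 100_000]:
        S_n = count_cycles(rng.permutation(n))
        print(f"n = {n:7d}   S_n = {S_n:3d}   S_n / log n = {S_n / np.log(n):.3f}")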

