
1 Time-reversible Markov chains - Columbia University

Copyright © 2009 by Karl Sigman

1 Time-reversible Markov chains

In these notes we study positive recurrent Markov chains $\{X_n : n \ge 0\}$ which, when in steady state (stationarity), yield the same Markov chain (in distribution) if time is reversed. The fundamental condition required is that for each pair of states $i,j$ the long-run rate at which the chain makes a transition from state $i$ to state $j$ equals the long-run rate at which the chain makes a transition from state $j$ to state $i$:
$$\pi_i P_{i,j} = \pi_j P_{j,i}.$$

1.1 Two-sided stationary extensions of Markov chains

For a positive recurrent Markov chain $\{X_n : n \in \mathbb{N}\}$ with transition matrix $P$ and stationary distribution $\pi$, let $\{X^*_n : n \in \mathbb{N}\}$ denote a stationary version of the chain, that is, one in which $X^*_0 \sim \pi$. It turns out that we can extend this process to have time $n$ take on negative values as well, that is, extend it to $\{X^*_n : n \in \mathbb{Z}\}$.

This is a way of imagining/assuming that the chain started off initially in the infinite past, and we call this a two-sided extension of our process. To get such an extension$^1$ we start by shifting the origin to be time $k \ge 1$ and extending the process $k$ time units into the past: define $X^*_n(k) = X^*_{n+k}$, $-k \le n < \infty$. Note how $\{X^*_n(k) : n \in \mathbb{N}\}$ has the same distribution as $\{X^*_n : n \in \mathbb{N}\}$ by stationarity, and in fact this extension on $-k \le n < \infty$ is still stationary too. Now as $k \to \infty$, the process $\{X^*_n(k) : -k \le n < \infty\}$ converges (in distribution) to a truly two-sided extension and it remains stationary; we get the desired two-sided stationary extension $\{X^*_n : n \in \mathbb{Z}\}$. And for each time $n \in \mathbb{Z}$ it holds that $P(X^*_n = j) = \pi_j$ for all states $j$.

1.2 Defining time-reversibility: the time-reversibility equations

Let $\{X^*_n : n \in \mathbb{Z}\}$ be a two-sided extension of a positive recurrent Markov chain with transition matrix $P$ and stationary distribution $\pi$.

The Markov property is stated as "the future is independent of the past given the present state", and thus can be re-stated as "the past is independent of the future given the present state". But this means that the process $X^{(r)}_n = X^*_{-n}$, $n \in \mathbb{N}$, denoting the process in reverse time, is still a (stationary) Markov chain. (By reversing time, the future and past are swapped.) In fact it has transition probabilities that can be exactly computed in terms of $\pi$ and $P$: letting $P^{(r)} = (P^{(r)}_{i,j})$ denote the time-reversed transition probabilities, and using stationarity to shift the time index,
$$P^{(r)}_{i,j} = P(X^{(r)}_1 = j \mid X^{(r)}_0 = i) = P(X^*_0 = j \mid X^*_1 = i) = \frac{P(X^*_1 = i \mid X^*_0 = j)\, P(X^*_0 = j)}{P(X^*_1 = i)} = \frac{\pi_j}{\pi_i} P_{j,i}.$$
Thus the time-reversed Markov chain is a Markov chain with transition probabilities given by
$$P^{(r)}_{i,j} = \frac{\pi_j}{\pi_i} P_{j,i}. \qquad (1)$$

$^1$ The mathematical justification for extending a stationary stochastic process to be two-sided stationary is called Kolmogorov's extension theorem from probability theory.

Definition 1.1 A positive recurrent Markov chain with transition matrix $P$ and stationary distribution $\pi$ is called time reversible if the reverse-time stationary Markov chain $\{X^{(r)}_n : n \in \mathbb{N}\}$ has the same distribution as the forward-time stationary Markov chain $\{X^*_n : n \in \mathbb{N}\}$, that is, if $P^{(r)} = P$: $P^{(r)}_{i,j} = P_{i,j}$ for all pairs of states $i,j$.

Equivalently, this means that it satisfies the time-reversibility equations
$$\pi_i P_{i,j} = \pi_j P_{j,i}, \quad \text{for all pairs of states } i,j.$$
In words: for each pair of states $i,j$, the long-run rate at which the chain makes a transition from state $i$ to state $j$ equals the long-run rate at which the chain makes a transition from state $j$ to state $i$.
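
As a concrete, hypothetical illustration (not part of the original notes), the following Python sketch checks the time-reversibility equations for a given pair $(P, \pi)$ and also builds the reversed matrix $P^{(r)}$ of equation (1). The two example matrices are made up: a symmetric chain (time reversible) and a "cyclic" doubly stochastic chain (same uniform stationary distribution, but not time reversible).

    import numpy as np

    def reversed_matrix(P, pi):
        """Equation (1): P^(r)_{i,j} = pi_j * P_{j,i} / pi_i."""
        return P.T * pi / pi[:, None]

    def is_time_reversible(P, pi, tol=1e-12):
        """Check pi_i P_{i,j} = pi_j P_{j,i} for all pairs of states i, j."""
        flow = pi[:, None] * P          # flow[i, j] = pi_i * P_{i,j}
        return np.allclose(flow, flow.T, atol=tol)

    # Both matrices are doubly stochastic, so the uniform distribution is stationary.
    P_sym = np.array([[0.5, 0.3, 0.2],
                      [0.3, 0.4, 0.3],
                      [0.2, 0.3, 0.5]])
    P_cyc = np.array([[0.0, 0.8, 0.2],
                      [0.2, 0.0, 0.8],
                      [0.8, 0.2, 0.0]])
    pi = np.full(3, 1.0 / 3.0)

    print(is_time_reversible(P_sym, pi))                   # True
    print(np.allclose(reversed_matrix(P_sym, pi), P_sym))  # True: P^(r) = P
    print(is_time_reversible(P_cyc, pi))                   # False: the cycle has a preferred direction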

One inconvenience with our definition is that it requires us to have at our disposal the stationary distribution $\pi$ in advance so as to check whether the chain is time-reversible. But the following can help us avoid having to know $\pi$ in advance, and can even help us find $\pi$:

Proposition 1.1 If for an irreducible Markov chain with transition matrix $P$ there exists a probability solution $\pi$ to the time-reversibility set of equations, $\pi_i P_{i,j} = \pi_j P_{j,i}$ for all pairs of states $i,j$, then the chain is positive recurrent and time-reversible, and the solution $\pi$ is the unique stationary distribution.

Proof: It suffices to show that such a solution $\pi$ also satisfies $\pi = \pi P$, for then (via the corresponding result in Lecture Notes 4) it is the unique stationary distribution, and since it satisfies the time-reversibility equations, the chain is also time reversible. To this end, fixing a state $j$ and summing over all $i$ yields
$$\sum_i \pi_i P_{i,j} = \sum_i \pi_j P_{j,i} = \pi_j \sum_i P_{j,i} = \pi_j \cdot 1 = \pi_j,$$
namely, $\pi = \pi P$.

The importance of the above Proposition is that the time-reversibility equations are simpler to solve/check than the $\pi = \pi P$ equations. So if you suspect (via some intuition) that your chain is time-reversible, then you should first try to solve the time-reversibility equations. Similarly, if you think your chain is time-reversible and have a guess for $\pi$ at hand, then you should check whether it satisfies the time-reversibility equations.
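
For instance (a small made-up example, not from the notes), for a two-state chain with $P_{0,1} = a$ and $P_{1,0} = b$, the single time-reversibility equation $\pi_0 a = \pi_1 b$ gives $\pi \propto (b, a)$; the sketch below confirms numerically that this $\pi$ also solves $\pi = \pi P$, exactly as the proof asserts.

    import numpy as np

    a, b = 0.3, 0.7                    # hypothetical transition probabilities
    P = np.array([[1 - a, a],
                  [b, 1 - b]])

    # Solve the single time-reversibility equation pi_0 * a = pi_1 * b
    # and normalize, instead of solving pi = pi P directly.
    pi = np.array([b, a]) / (a + b)

    print(np.allclose(pi @ P, pi))             # True: pi is the stationary distribution
    print(np.isclose(pi[0] * a, pi[1] * b))    # True: the time-reversibility equation holds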

Example: simple random walk on the non-negative integers. Here is an example where intuition quickly tells us that we have a time-reversible chain. Consider a negative-drift simple random walk, restricted to be non-negative, in which $P_{0,1} = 1$ and otherwise $P_{i,i+1} = p < 1/2$, $P_{i,i-1} = 1-p > 1/2$, for $i \ge 1$. In this case, since the chain can only make a transition (change of state) of magnitude 1, we immediately conclude that for each state $i \ge 0$, the rate from $i$ to $i+1$ equals the rate from $i+1$ to $i$. This is by the same elementary reasoning as argued for why the rate out of state $i$ equals the rate into state $i$, for each state $i$; it holds for any function/path and has nothing to do with Markov chains: every time there is a change of state from $i$ to $i+1$ there must be (soon after) a change of state from $i+1$ to $i$, because that is the only way the process can, yet again, go from $i$ to $i+1$; there is a one-to-one correspondence. But "the rate from $i$ to $i+1$ equals the rate from $i+1$ to $i$" is equivalent to (in words) the time-reversibility equations, since here a pair $i,j$ can only be of the form $i, i+1$ or $i, i-1$. Thus the time-reversibility equations are
$$\pi_0 = (1-p)\,\pi_1, \qquad p\,\pi_i = (1-p)\,\pi_{i+1}, \quad i \ge 1,$$
yielding
$$\pi_1 = \frac{\pi_0}{1-p}, \quad \pi_2 = \frac{p\,\pi_0}{(1-p)^2}, \ \ldots, \ \pi_n = \frac{p^{n-1}\pi_0}{(1-p)^n}, \quad n \ge 1.$$
Since $\sum_n \pi_n = 1$ must hold, we get
$$\pi_0\left(1 + \frac{1}{1-p}\sum_{n \ge 0}\left[\frac{p}{1-p}\right]^n\right) = 1,$$
and since $\frac{p}{1-p} < 1$, the geometric series converges and we can solve explicitly for the stationary distribution:
$$\pi_0 = \left(1 + \frac{1}{1-2p}\right)^{-1}, \qquad \pi_n = \frac{1}{1-p}\left[\frac{p}{1-p}\right]^{n-1}\pi_0, \quad n \ge 1,$$
which simplifies to
$$\pi_0 = \frac{1-2p}{2(1-p)}, \qquad \pi_n = \frac{1-2p}{2(1-p)^2}\left[\frac{p}{1-p}\right]^{n-1}, \quad n \ge 1.$$
As we will see later, this Markov chain is the embedded discrete-time chain for an M/M/1 queue in which $p = \lambda/(\lambda + \mu)$, where $\lambda$ is the Poisson arrival rate of customers and $\mu$ is the exponential service-time rate.
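
As a quick numerical sanity check (a sketch; the value $p = 0.3$ below is chosen arbitrarily, subject to $p < 1/2$), one can compute $\pi_n$ from the closed form above, confirm that the probabilities sum to (essentially) 1, and confirm that the time-reversibility equations hold:

    import numpy as np

    p = 0.3                       # hypothetical value; must satisfy p < 1/2
    r = p / (1 - p)

    def pi_n(n):
        """Stationary probabilities of the reflected negative-drift random walk."""
        pi0 = (1 - 2 * p) / (2 * (1 - p))
        return pi0 if n == 0 else pi0 * r ** (n - 1) / (1 - p)

    pis = np.array([pi_n(n) for n in range(200)])   # truncate; the geometric tail is negligible

    print(pis.sum())                                      # ~ 1.0
    print(np.isclose(pi_n(0) * 1.0, pi_n(1) * (1 - p)))   # pi_0 P_{0,1} = pi_1 P_{1,0}
    print(all(np.isclose(p * pi_n(i), (1 - p) * pi_n(i + 1)) for i in range(1, 50)))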

Example: random walk on a connected graph. Consider a finite connected graph with $n \ge 2$ nodes, labeled $1, \ldots, n$, and positive weights $w_{i,j} = w_{j,i} > 0$ for any pair of nodes $i,j$ for which there is an arc ($w_{i,j} \overset{\text{def}}{=} 0$ if there is no arc). We can define a Markov chain random walk (with state-dependent transition probabilities) on the nodes of the graph via
$$P_{i,j} = \frac{w_{i,j}}{\sum_k w_{i,k}}.$$
This chain is irreducible (by definition of connected graph). Moreover, because of the symmetry $w_{i,j} = w_{j,i}$, and the way in which the $P_{i,j}$ are defined, we expect that this chain should be time-reversible too. We can explicitly solve the time-reversibility equations, which in this case (since $w_{i,j} = w_{j,i}$) become
$$\frac{\pi_i}{\sum_k w_{i,k}} = \frac{\pi_j}{\sum_k w_{j,k}}, \quad \text{for each pair } i,j,$$
or equivalently that for a constant $C$,
$$\frac{\pi_i}{\sum_k w_{i,k}} = C, \quad \text{for all } i,$$
or equivalently $\pi_i = C \sum_k w_{i,k}$ for all $i$. Since it must hold that $\sum_i \pi_i = 1$, we conclude that $C = \left[\sum_i \sum_k w_{i,k}\right]^{-1}$, yielding the solution
$$\pi_i = \frac{\sum_k w_{i,k}}{\sum_i \sum_k w_{i,k}}.$$
Thus the chain is time-reversible and we have solved for the stationary distribution. Note that when all positive weights are defined to be 1, the chain always moves to a next node by choosing it uniformly from among all possible arcs out: if there is an arc from $i$ to $j$, then $P_{i,j} = 1/b$, where $b = b(i)$ denotes the total number of arcs out of node $i$.
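
To make the formula concrete, here is a small Python sketch (the weight matrix below is made up) that builds $P_{i,j} = w_{i,j}/\sum_k w_{i,k}$ from a symmetric weight matrix, computes $\pi_i = \sum_k w_{i,k} / \sum_i \sum_k w_{i,k}$, and verifies both $\pi = \pi P$ and the time-reversibility equations. With all weights equal to 1 this reduces to choosing a neighboring node uniformly at random, and $\pi_i$ becomes proportional to the number of arcs out of node $i$.

    import numpy as np

    # Hypothetical symmetric weight matrix on 4 nodes (w[i, j] = 0 means no arc).
    w = np.array([[0., 2., 1., 0.],
                  [2., 0., 3., 1.],
                  [1., 3., 0., 4.],
                  [0., 1., 4., 0.]])
    assert np.allclose(w, w.T), "weights must satisfy w_{i,j} = w_{j,i}"

    row_sums = w.sum(axis=1)              # sum_k w_{i,k}
    P = w / row_sums[:, None]             # P_{i,j} = w_{i,j} / sum_k w_{i,k}
    pi = row_sums / row_sums.sum()        # pi_i = sum_k w_{i,k} / sum_i sum_k w_{i,k}

    flow = pi[:, None] * P                # flow[i, j] = pi_i * P_{i,j}
    print(np.allclose(pi @ P, pi))        # True: pi = pi P
    print(np.allclose(flow, flow.T))      # True: pi_i P_{i,j} = pi_j P_{j,i}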