
Absorbing Markov Chains - Dartmouth College



Absorbing Markov Chains (8/11/2006)

More Examples of Markov Chains

The President of the United States tells person A his or her intention to run or not to run in the next election. Then A relays the news to B, who in turn relays the message to C, and so forth, always to some new person. We assume that there is a probability $a$ that a person will change the answer from yes to no when transmitting it to the next person, and a probability $b$ that he or she will change it from no to yes. We choose as states the message, either yes or no. With states ordered (yes, no), the transition matrix is then

$$P = \begin{pmatrix} 1-a & a \\ b & 1-b \end{pmatrix}.$$

The initial state represents the President's choice.

In the Dark Ages, Harvard, Dartmouth, and Yale admitted only male students. Assume that, at that time, 80 percent of the sons of Harvard men went to Harvard and the rest went to Yale; 40 percent of the sons of Yale men went to Yale, and the rest split evenly between Harvard and Dartmouth; and of the sons of Dartmouth men, 70 percent went to Dartmouth, 20 percent to Harvard, and 10 percent to Yale.
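The message-passing chain can be explored numerically. A minimal Python sketch (the flip probabilities a = 1/10 and b = 1/5 are illustrative values, not from the text, and mat_mul is a hypothetical helper; exact arithmetic via the standard-library fractions module):

```python
from fractions import Fraction as F

def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Message chain with states (yes, no): a is the chance of flipping
# yes -> no at each relay, b of flipping no -> yes.
a, b = F(1, 10), F(1, 5)            # illustrative values
P = [[1 - a, a],
     [b, 1 - b]]

# Distribution of the message after 3 relays, starting from "yes".
dist = [[F(1), F(0)]]               # row vector: certainly "yes"
for _ in range(3):
    dist = mat_mul(dist, P)
print(dist[0][0])                   # P(message is still "yes")
```

Each relay multiplies the current distribution by P; with a = b the distribution would tend toward (1/2, 1/2).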

We form a Markov chain with states ordered (H, Y, D) and transition matrix

$$P = \begin{pmatrix} .8 & .2 & 0 \\ .3 & .4 & .3 \\ .2 & .1 & .7 \end{pmatrix}.$$

Absorbing Markov Chains

A state $s_i$ of a Markov chain is called absorbing if it is impossible to leave it (i.e., $p_{ii} = 1$). A Markov chain is absorbing if it has at least one absorbing state, and if from every state it is possible to go to an absorbing state (not necessarily in one step). In an absorbing Markov chain, a state which is not absorbing is called transient.

Example: Drunkard's Walk

A man walks along a four-block stretch of Park Avenue. If he is at corner 1, 2, or 3, then he walks to the left or right with equal probability. He continues until he reaches corner 4, which is a bar, or corner 0, which is his home. If he reaches either home or the bar, he stays there.

Transition Matrix

With states ordered 0 through 4,

$$P = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 1/2 & 0 & 1/2 & 0 & 0 \\ 0 & 1/2 & 0 & 1/2 & 0 \\ 0 & 0 & 1/2 & 0 & 1/2 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}.$$
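The two defining conditions (some $p_{ii} = 1$, and every state can reach an absorbing state) can be checked mechanically. A Python sketch for the Drunkard's Walk matrix above (can_reach_absorbing is a hypothetical helper doing a graph search; exact arithmetic via the standard-library fractions module):

```python
from fractions import Fraction as F

# Drunkard's Walk transition matrix, states 0..4 (0 = home, 4 = bar).
h = F(1, 2)
P = [[1, 0, 0, 0, 0],
     [h, 0, h, 0, 0],
     [0, h, 0, h, 0],
     [0, 0, h, 0, h],
     [0, 0, 0, 0, 1]]

absorbing = [i for i, row in enumerate(P) if row[i] == 1]   # p_ii = 1

def can_reach_absorbing(P, absorbing):
    """Check that every state can reach some absorbing state, by
    searching the graph with an edge i -> j whenever p_ij > 0."""
    targets = set(absorbing)
    for start in range(len(P)):
        seen, frontier = {start}, [start]
        while frontier:
            i = frontier.pop()
            for j, p in enumerate(P[i]):
                if p > 0 and j not in seen:
                    seen.add(j)
                    frontier.append(j)
        if not (seen & targets):
            return False
    return True

print(absorbing)                          # the absorbing states
print(can_reach_absorbing(P, absorbing))  # the chain is absorbing
```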

Canonical Form

For an absorbing Markov chain, renumber the states so that the transient states come first. If there are $r$ absorbing states and $t$ transient states, the transition matrix will have the following canonical form:

$$P = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}.$$

The first $t$ states are transient and the last $r$ states are absorbing. $I$ is an $r$-by-$r$ identity matrix, $0$ is an $r$-by-$t$ zero matrix, $R$ is a nonzero $t$-by-$r$ matrix, and $Q$ is a $t$-by-$t$ matrix.

Recall that the entry $p^{(n)}_{ij}$ of the matrix $P^n$ is the probability of being in state $s_j$ after $n$ steps, when the chain is started in state $s_i$. $P^n$ is of the form

$$P^n = \begin{pmatrix} Q^n & * \\ 0 & I \end{pmatrix}.$$

Probability of Absorption

In an absorbing Markov chain, the probability that the process will be absorbed is 1 (i.e., $Q^n \to 0$ as $n \to \infty$).

The Fundamental Matrix

For an absorbing Markov chain the matrix $I - Q$ has an inverse $N$ and $N = I + Q + Q^2 + \cdots$. For an absorbing Markov chain $P$, the matrix $N = (I - Q)^{-1}$ is called the fundamental matrix for $P$. The entry $n_{ij}$ of $N$ gives the expected number of times that the process is in the transient state $s_j$ if it is started in the transient state $s_i$.

Drunkard's Walk example, in canonical form with the transient states 1, 2, 3 first and the absorbing states 0, 4 last:

$$P = \begin{pmatrix} 0 & 1/2 & 0 & 1/2 & 0 \\ 1/2 & 0 & 1/2 & 0 & 0 \\ 0 & 1/2 & 0 & 0 & 1/2 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}.$$
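Given the canonical form, $Q$ and $R$ are just the top-left and top-right blocks, and the claim $Q^n \to 0$ can be observed directly. A Python sketch for the Drunkard's Walk canonical matrix above (mat_mul is a hypothetical helper; exact arithmetic via the standard-library fractions module):

```python
from fractions import Fraction as F

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

h = F(1, 2)
# Canonical-form P: transient states 1, 2, 3 first, then 0 and 4.
P = [[0, h, 0, h, 0],
     [h, 0, h, 0, 0],
     [0, h, 0, 0, h],
     [0, 0, 0, 1, 0],
     [0, 0, 0, 0, 1]]

t, r = 3, 2
Q = [row[:t] for row in P[:t]]      # transient-to-transient block
R = [row[t:] for row in P[:t]]      # transient-to-absorbing block

# Q^n -> 0 as n grows, so absorption happens with probability 1.
Qn = Q
for _ in range(19):                 # Qn = Q^20
    Qn = mat_mul(Qn, Q)
print(max(x for row in Qn for x in row))   # largest entry of Q^20
```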

For the Drunkard's Walk,

$$Q = \begin{pmatrix} 0 & 1/2 & 0 \\ 1/2 & 0 & 1/2 \\ 0 & 1/2 & 0 \end{pmatrix}, \qquad I - Q = \begin{pmatrix} 1 & -1/2 & 0 \\ -1/2 & 1 & -1/2 \\ 0 & -1/2 & 1 \end{pmatrix},$$

and, with rows and columns indexed by the transient states 1, 2, 3,

$$N = (I - Q)^{-1} = \begin{pmatrix} 3/2 & 1 & 1/2 \\ 1 & 2 & 1 \\ 1/2 & 1 & 3/2 \end{pmatrix}.$$

Time to Absorption

Let $t_i$ be the expected number of steps before the chain is absorbed, given that the chain starts in state $s_i$, and let $t$ be the column vector whose $i$th entry is $t_i$. Then

$$t = Nc,$$

where $c$ is a column vector all of whose entries are 1.

Absorption Probabilities

Let $b_{ij}$ be the probability that an absorbing chain will be absorbed in the absorbing state $s_j$ if it starts in the transient state $s_i$. Let $B$ be the matrix with entries $b_{ij}$. Then $B$ is a $t$-by-$r$ matrix, and

$$B = NR,$$

where $N$ is the fundamental matrix and $R$ is as in the canonical form.

For the Drunkard's Walk example,

$$t = Nc = \begin{pmatrix} 3/2 & 1 & 1/2 \\ 1 & 2 & 1 \\ 1/2 & 1 & 3/2 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 4 \\ 3 \end{pmatrix}.$$
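The fundamental matrix can be sanity-checked by confirming $N(I - Q) = I$, and $t = Nc$ is just the vector of row sums of $N$. A Python sketch with the Drunkard's Walk values (mat_mul is a hypothetical helper; exact arithmetic via the standard-library fractions module):

```python
from fractions import Fraction as F

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

h = F(1, 2)
Q = [[0, h, 0],                     # transient block for states 1, 2, 3
     [h, 0, h],
     [0, h, 0]]
I = [[F(int(i == j)) for j in range(3)] for i in range(3)]
I_minus_Q = [[I[i][j] - Q[i][j] for j in range(3)] for i in range(3)]

# Fundamental matrix from the text; confirm it inverts I - Q.
N = [[F(3, 2), F(1), F(1, 2)],
     [F(1),    F(2), F(1)],
     [F(1, 2), F(1), F(3, 2)]]
print(mat_mul(N, I_minus_Q) == I)   # True: N really is (I - Q)^(-1)

# Expected steps to absorption t = Nc: the row sums of N.
t = [sum(row) for row in N]
print(t)                            # expected steps from corners 1, 2, 3
```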

With rows indexed by the transient states 1, 2, 3 and columns by the absorbing states 0, 4,

$$B = NR = \begin{pmatrix} 3/2 & 1 & 1/2 \\ 1 & 2 & 1 \\ 1/2 & 1 & 3/2 \end{pmatrix} \begin{pmatrix} 1/2 & 0 \\ 0 & 0 \\ 0 & 1/2 \end{pmatrix} = \begin{pmatrix} 3/4 & 1/4 \\ 1/2 & 1/2 \\ 1/4 & 3/4 \end{pmatrix}.$$
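The product $B = NR$ can be checked the same way. A Python sketch with the Drunkard's Walk values (exact arithmetic via the standard-library fractions module):

```python
from fractions import Fraction as F

N = [[F(3, 2), F(1), F(1, 2)],
     [F(1),    F(2), F(1)],
     [F(1, 2), F(1), F(3, 2)]]
R = [[F(1, 2), F(0)],               # one-step transient -> absorbing
     [F(0),    F(0)],               # probabilities; columns are the
     [F(0),    F(1, 2)]]            # absorbing states: home (0), bar (4)

# Absorption probabilities B = NR (rows: start corners 1, 2, 3).
B = [[sum(N[i][k] * R[k][j] for k in range(3)) for j in range(2)]
     for i in range(3)]
for row in B:
    print([str(x) for x in row])    # each fraction as a string
```

Each row of B sums to 1, matching the earlier fact that an absorbing chain is absorbed with probability 1.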

