1 Limiting distribution for a Markov chain
Copyright © 2009 by Karl Sigman

In these Lecture Notes, we shall study the limiting behavior of Markov chains as time n → ∞. In particular, under suitable easy-to-check conditions, we will see that a Markov chain possesses a limiting probability distribution, π = (π_j)_{j∈S}, and that the chain, if started off initially with such a distribution, will be a stationary stochastic process. We will also see that we can find π by merely solving a set of linear equations.

1.1 Communication classes and irreducibility for Markov chains

For a Markov chain with state space S, consider a pair of states (i, j). We say that j is reachable from i, denoted by i → j, if there exists an integer n ≥ 0 such that P^n_{ij} > 0. This means that, starting in state i, there is a positive probability (but not necessarily equal to 1) that the chain will be in state j at time n (that is, n steps later): P(X_n = j | X_0 = i) > 0.
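The reachability condition P^n_{ij} > 0 can be checked directly by computing matrix powers of the transition matrix. The following is a minimal sketch; the 3-state transition matrix P is a hypothetical example chosen for illustration, not one from these notes.

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1), for illustration only.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
])

def reachable(P, i, j):
    """Return True if state j is reachable from state i, i.e. (P^n)_{ij} > 0
    for some integer n >= 0. For a finite chain with k states it suffices
    to check n = 0, 1, ..., k-1."""
    k = P.shape[0]
    Pn = np.eye(k)           # P^0 = I: every state reaches itself with n = 0 steps
    for _ in range(k):
        if Pn[i, j] > 0:
            return True
        Pn = Pn @ P          # advance to the next matrix power P^n
    return False

print(reachable(P, 0, 2))    # True: the path 0 -> 1 -> 2 has positive probability
# The chain is irreducible if every state is reachable from every other:
print(all(reachable(P, i, j) for i in range(3) for j in range(3)))
```

Since P^0 = I, every state is trivially reachable from itself; irreducibility then amounts to checking reachability for every ordered pair of states.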
1.2 Limiting stationary distribution

A probability distribution π = (π_j)_{j∈S}, with π_j ≥ 0 and Σ_{j∈S} π_j = 1, satisfying

    π_j = lim_{n→∞} P^n_{ij} = lim_{n→∞} P(X_n = j | X_0 = i), for all i, j ∈ S,

is called the limiting or stationary or steady-state distribution of the Markov chain.
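As noted above, π can be found by solving a set of linear equations: the stationarity condition πP = π together with the normalization Σ_j π_j = 1. A minimal sketch, again using a hypothetical 3-state transition matrix (assumed irreducible and aperiodic, so the limit exists):

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1), for illustration only.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
])

def stationary_distribution(P):
    """Solve pi P = pi together with sum(pi) = 1 as one linear system.

    Rearranged: pi (P - I) = 0, i.e. (P^T - I) pi^T = 0, plus a
    normalization row of ones; solved by least squares."""
    k = P.shape[0]
    A = np.vstack([P.T - np.eye(k), np.ones(k)])   # (k+1) x k system
    b = np.zeros(k + 1)
    b[-1] = 1.0                                    # normalization: sum(pi) = 1
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

pi = stationary_distribution(P)
print(pi)                                          # the stationary distribution
print(np.allclose(pi @ P, pi))                     # stationarity: pi P = pi
# For an irreducible aperiodic chain, every row of P^n converges to pi:
print(np.allclose(np.linalg.matrix_power(P, 50)[0], pi))
```

The last line illustrates the limiting property itself: raising P to a large power makes each row approach π, regardless of the starting state i.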