
Essentials of Stochastic Processes




Rick Durrett

[Cover figure: price curves labeled "at expiry," "10 May," "10 Jun," and "10 Sep," plotted against strike.]

Almost Final Version of the 2nd Edition, December 2011. Copyright 2011, all rights reserved.

Between the first undergraduate course in probability and the first graduate course that uses measure theory, there are a number of courses that teach Stochastic Processes to students with many different interests and with varying degrees of mathematical sophistication. To allow readers (and instructors) to choose their own level of detail, many of the proofs begin with a nonrigorous answer to the question "Why is this true?" followed by a Proof that fills in the missing details. As it is possible to drive a car without knowing about the working of the internal combustion engine, it is also possible to apply the theory of Markov chains without knowing the details of the proofs. It is my personal philosophy that probability theory was developed to solve problems, so most of our effort will be spent on analyzing examples.

Readers who want to master the subject will have to do more than a few of the twenty dozen carefully chosen exercises.

This book began as notes I typed in the spring of 1997 as I was teaching ORIE 361 at Cornell for the second time. In Spring 2009, the mathematics department there introduced its own version of this course, MATH 474. This started me on the task of preparing the second edition. The plan was to have this finished in Spring 2010 after the second time I taught the course, but when May rolled around, completing the book lost out to getting ready to move to Durham after 25 years in Ithaca. In the Fall of 2011, I taught Duke's version of the course, Math 216, to 20 undergrads and 12 graduate students, and over the Christmas break the second edition was completed.

The second edition differs substantially from the first, though curiously the length and the number of problems have remained roughly constant. Throughout the book there are many new examples and problems, with solutions that use the TI-83 to eliminate the tedious details of solving linear equations by hand. My students tell me I should just use MATLAB, and maybe I will in the next edition.

The Markov chains chapter has been reorganized.

The chapter on Poisson processes has moved up from third to second, and is now followed by a treatment of the closely related topic of renewal theory. Continuous time Markov chains remain fourth, with a new section on exit distributions and hitting times, and reduced coverage of queueing networks. Martingales, a difficult subject for students at this level, now comes fifth, in order to set the stage for their use in a new sixth chapter on mathematical finance. The treatment of finance expands the two sections of the previous treatment to include American options and the capital asset pricing model. Brownian motion makes a cameo appearance in the discussion of the Black-Scholes theorem, but in contrast to the previous edition, is not discussed in detail.

As usual, the second edition has profited from people who have told me about typos over the last dozen years. If you find new ones, email: Durrett

Contents

1 Markov Chains
1.1 Definitions and Examples

1.2 Multistep Transition Probabilities
1.3 Classification of States
1.4 Stationary Distributions
1.5 Limit Behavior
1.6 Special Examples
1.6.1 Doubly stochastic chains
1.6.2 Detailed balance condition
1.6.3 Reversibility
1.6.4 The Metropolis-Hastings algorithm
1.7 Proofs of the Main Theorems*
1.8 Exit Distributions
1.9 Exit Times
1.10 Infinite State Spaces*
1.11 Chapter Summary
1.12 Exercises

2 Poisson Processes
2.1 Exponential Distribution
2.2 Defining the Poisson Process
2.3 Compound Poisson Processes
2.4 Transformations
2.4.1 Thinning
2.4.2 Superposition
2.4.3 Conditioning
2.5 Chapter Summary
2.6 Exercises

3 Renewal Processes
3.1 Laws of Large Numbers
3.2 Applications to Queueing Theory
3.2.1 GI/G/1 queue
3.2.2 Cost equations
3.2.3 M/G/1 queue
3.3 Age and Residual Life*
3.3.1 Discrete case
3.3.2 General case
3.4 Chapter Summary
3.5 Exercises

4 Continuous Time Markov Chains
4.1 Definitions and Examples

4.2 Computing the Transition Probability
4.3 Limiting Behavior
4.4 Exit Distributions and Hitting Times
4.5 Markovian Queues
4.6 Queueing Networks*
4.7 Chapter Summary
4.8 Exercises

5 Martingales
5.1 Conditional Expectation
5.2 Examples, Basic Properties
5.3 Gambling Strategies, Stopping Times
5.4 Applications
5.5 Convergence
5.6 Exercises

6 Mathematical Finance
6.1 Two Simple Examples
6.2 Binomial Model
6.3 Concrete Examples
6.4 Capital Asset Pricing Model
6.5 American Options
6.6 Black-Scholes formula
6.7 Calls and Puts
6.8 Exercises

A Review of Probability
A.1 Probabilities, Independence
A.2 Random Variables, Distributions
A.3 Expected Value, Moments

Chapter 1. Markov Chains

1.1 Definitions and Examples

The importance of Markov chains comes from two facts: (i) there are a large number of physical, biological, economic, and social phenomena that can be modeled in this way, and (ii) there is a well-developed theory that allows us to do computations.

We begin with a famous example, then describe the property that is the defining feature of Markov chains.

Example 1.1 (Gambler's ruin). Consider a gambling game in which on any turn you win $1 with probability p = 0.4 or lose $1 with probability 1 − p = 0.6. Suppose further that you adopt the rule that you quit playing if your fortune reaches $N. Of course, if your fortune reaches $0 the casino makes you stop.

Let X_n be the amount of money you have after n plays. Your fortune, X_n, has the Markov property. In words, this means that given the current state, X_n, any other information about the past is irrelevant for predicting the next state X_{n+1}. To check this for the gambler's ruin chain, we note that if you are still playing at time n, i.e., your fortune X_n = i with 0 < i < N, then for any possible history of your wealth i_{n−1}, i_{n−2}, ..., i_1, i_0,

P(X_{n+1} = i + 1 | X_n = i, X_{n−1} = i_{n−1}, ..., X_0 = i_0) = 0.4

since to increase your wealth by one unit you have to win your next bet. Here we have used P(B|A) for the conditional probability of the event B given that A occurs.

Recall that this is defined by

P(B|A) = P(B ∩ A) / P(A)

If you need help with this notion, see Section A.1 of the appendix.

Turning now to the formal definition, we say that X_n is a discrete time Markov chain with transition matrix p(i, j) if for any j, i, i_{n−1}, ..., i_0,

P(X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, ..., X_0 = i_0) = p(i, j)    (1.1)

Here and in what follows, boldface indicates a word or phrase that is being defined or emphasized. Equation (1.1) explains what we mean when we say that "given the current state X_n, any other information about the past is irrelevant for predicting X_{n+1}." In formulating (1.1) we have restricted our attention to the temporally homogeneous case in which the transition probability

p(i, j) = P(X_{n+1} = j | X_n = i)

does not depend on the time n.

Intuitively, the transition probability gives the rules of the game. It is the basic information needed to describe a Markov chain. In the case of the gambler's ruin chain, the transition probability has

p(i, i + 1) = 0.4,  p(i, i − 1) = 0.6,  if 0 < i < N
p(0, 0) = 1
p(N, N) = 1

When N = 5 the matrix is

      0    1    2    3    4    5
0   1.0   0    0    0    0    0
1   0.6   0   0.4   0    0    0
2    0   0.6   0   0.4   0    0
3    0    0   0.6   0   0.4   0
4    0    0    0   0.6   0   0.4
5    0    0    0    0    0   1.0

or the chain may be represented pictorially as a diagram of the states 0, 1, 2, 3, 4, 5, with arrows to the right labeled 0.4, arrows to the left labeled 0.6, and self-loops of probability 1 at 0 and 5.
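Since the text does its computations on a TI-83 (and mentions MATLAB), any concrete tool will do; purely as an illustration, here is a short Python sketch, with a helper name of my own choosing, that builds this transition matrix for general N and win probability p and checks that every row sums to 1:

```python
import numpy as np

def gamblers_ruin_matrix(N, p):
    """Transition matrix for the gambler's ruin chain on states 0..N."""
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = 1.0              # $0 is absorbing: the casino makes you stop
    P[N, N] = 1.0              # $N is absorbing: you quit playing
    for i in range(1, N):
        P[i, i + 1] = p        # win $1 with probability p
        P[i, i - 1] = 1 - p    # lose $1 with probability 1 - p
    return P

P = gamblers_ruin_matrix(5, 0.4)
print(P)                 # reproduces the N = 5 matrix above
print(P.sum(axis=1))     # each row sums to 1
```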

Example 1.2 (Ehrenfest chain). This chain originated in physics as a model for two cubical volumes of air connected by a small hole. In the mathematical version, we have two urns, i.e., two of the exalted trash cans of probability theory, in which there are a total of N balls. We pick one of the N balls at random and move it to the other urn.

Let X_n be the number of balls in the left urn after the nth draw. It should be clear that X_n has the Markov property; i.e., if we want to guess the state at time n + 1, then the current number of balls in the left urn, X_n, is the only relevant information from the observed sequence of states X_n, X_{n−1}, ..., X_1, X_0. To check this we note that

P(X_{n+1} = i + 1 | X_n = i, X_{n−1} = i_{n−1}, ..., X_0 = i_0) = (N − i)/N

since to increase the number we have to pick one of the N − i balls in the other urn. The number can also decrease by 1 with probability i/N. In symbols, we have computed that the transition probability is given by

p(i, i + 1) = (N − i)/N,  p(i, i − 1) = i/N  for 0 ≤ i ≤ N

with p(i, j) = 0 otherwise.
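As a quick sanity check of these formulas, the same kind of sketch (again Python, again my illustration rather than the book's method) builds the Ehrenfest matrix for any N directly from p(i, i+1) = (N − i)/N and p(i, i−1) = i/N; running it with N = 4 reproduces the matrix displayed next:

```python
import numpy as np

def ehrenfest_matrix(N):
    """Ehrenfest chain on states 0..N (balls in the left urn)."""
    P = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            P[i, i + 1] = (N - i) / N   # draw a ball from the right urn
        if i > 0:
            P[i, i - 1] = i / N         # draw a ball from the left urn
    return P

print(ehrenfest_matrix(4))   # reproduces the N = 4 matrix shown next
```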

When N = 4, for example, the matrix is

      0    1    2    3    4
0     0    1    0    0    0
1    1/4   0   3/4   0    0
2     0   2/4   0   2/4   0
3     0    0   3/4   0   1/4
4     0    0    0    1    0

In the first two examples we began with a verbal description and then wrote down the transition probabilities. However, one more commonly describes a Markov chain by writing down a transition probability p(i, j) with

(i) p(i, j) ≥ 0, since they are probabilities.
(ii) ∑_j p(i, j) = 1, since when X_n = i, X_{n+1} will be in some state j.

The equation in (ii) is read "sum p(i, j) over all possible values of j." In words, the last two conditions say: the entries of the matrix are nonnegative and each ROW of the matrix sums to 1.

Any matrix with properties (i) and (ii) gives rise to a Markov chain, X_n. To construct the chain we can think of playing a board game. When we are in state i, we roll a die (or generate a random number on a computer) to pick the next state, going to j with probability p(i, j).
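This board-game recipe translates directly into code. A minimal sketch, assuming numpy and a hypothetical helper name simulate_chain of my own: at each step we "roll the die" by drawing the next state j with probability p(i, j):

```python
import numpy as np

def simulate_chain(P, i0, n_steps, seed=None):
    """Simulate n_steps of a Markov chain with transition matrix P,
    starting from state i0."""
    rng = np.random.default_rng(seed)
    path = [i0]
    for _ in range(n_steps):
        i = path[-1]
        # roll the die: move to state j with probability P[i, j]
        path.append(int(rng.choice(len(P), p=P[i])))
    return path

# Example (assuming the gamblers_ruin_matrix sketch above has been run):
# simulate_chain(gamblers_ruin_matrix(5, 0.4), i0=3, n_steps=10)
```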

Example 1.3 (Weather chain). Let X_n be the weather on day n in Ithaca, NY, which we assume is either: 1 = rainy, or 2 = sunny. Even though the weather is not exactly a Markov chain, we can propose a Markov chain model for the weather by writing down a transition probability

      1    2
1    0.6  0.4
2    0.2  0.8

The table says, for example, that the probability a rainy day (state 1) is followed by a sunny day (state 2) is p(1, 2) = 0.4. A typical question of interest is:

Q. What is the long-run fraction of days that are sunny?

Example 1.4 (Social mobility). Let X_n be a family's social class in the nth generation, which we assume is either 1 = lower, 2 = middle, or 3 = upper. In our simple version of sociology, changes of status are a Markov chain with the following transition probability

      1    2    3
1    0.7  0.2  0.1
2    0.3  0.5  0.2
3    0.2  0.4  0.4

Q. Do the fractions of people in the three classes approach a limit?

Example 1.5 (Brand preference). Suppose there are three types of laundry detergent, 1, 2, and 3, and let X_n be the brand chosen on the nth purchase. Customers who try these brands are satisfied and choose the same thing again with probabilities 0.8, 0.6, and 0.4, respectively.
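Questions like Q can be explored numerically before any theory is developed: raising the weather chain's matrix to a large power makes every row converge to the same vector, and the entry for sunny gives the long-run fraction. A sketch, assuming the weather matrix above (the (1/3, 2/3) limit anticipates the stationary distributions studied later in the chapter):

```python
import numpy as np

# Weather chain: state 1 = rainy, state 2 = sunny (rows 0 and 1 here).
P = np.array([[0.6, 0.4],
              [0.2, 0.8]])

# After many steps every row of P^n is nearly the same vector; its second
# entry is the long-run fraction of sunny days.
print(np.linalg.matrix_power(P, 50))   # both rows approach (1/3, 2/3)
```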

