
Random Variables and Stochastic Processes


1 Random Variables and Stochastic Processes

2 Randomness
The word "random" effectively means unpredictable. In engineering practice we may treat some signals as random to simplify the analysis, even though they may not actually be random.

3 Random Variable Defined
X(\zeta)
A random variable is the assignment of numerical values to the outcomes of experiments.

4 Random Variables
Examples of assignments of numbers to the outcomes of experiments.

5 Discrete-Value vs. Continuous-Value Random Variables
A discrete-value (DV) random variable has a set of distinct values separated by values that cannot occur. A random variable associated with the outcomes of coin flips, card draws, or dice tosses would be a DV random variable. A continuous-value (CV) random variable may take on any value in a continuum of values, which may be finite or infinite in size.

6 Distribution Functions
F_X(x) = P(X \le x)
The distribution function of a random variable X is the probability that it is less than or equal to some value, as a function of that value. Since the distribution function is a probability, it must satisfy the requirements for a probability.
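As a quick illustration (not part of the original slides), the sketch below estimates a distribution function from simulated die rolls and compares it with F_X(x) = P(X \le x). NumPy, the seed, and the helper name empirical_F are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=100_000)   # simulated fair-die outcomes, a DV random variable

# Empirical distribution function: fraction of outcomes less than or equal to x
def empirical_F(x, samples=rolls):
    return np.mean(samples <= x)

for x in [0.5, 1, 3, 5.5, 6]:
    exact = min(max(np.floor(x), 0), 6) / 6          # exact F_X(x) for a fair die
    print(f"F_X({x}) ~ {empirical_F(x):.3f}  (exact: {exact:.3f})")
```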

0 \le F_X(x) \le 1, \quad -\infty < x < \infty
F_X(-\infty) = 0 \quad \text{and} \quad F_X(+\infty) = 1
P(x_1 < X \le x_2) = F_X(x_2) - F_X(x_1)
F_X(x) is a monotonic function and its derivative is never negative.

7 Distribution Functions
The distribution function for tossing a single die:
F_X(x) = \frac{1}{6}\left[u(x-1) + u(x-2) + u(x-3) + u(x-4) + u(x-5) + u(x-6)\right]

8 Distribution Functions
A possible distribution function for a continuous random variable.

9 Probability Density
The derivative of the distribution function is the probability density function (pdf):
f_X(x) \triangleq \frac{d}{dx} F_X(x)
Probability density can also be defined by
f_X(x)\,dx = P(x < X \le x + dx)
Properties:
f_X(x) \ge 0, \quad -\infty < x < +\infty
\int_{-\infty}^{\infty} f_X(x)\,dx = 1
F_X(x) = \int_{-\infty}^{x} f_X(\lambda)\,d\lambda
P(x_1 < X \le x_2) = \int_{x_1}^{x_2} f_X(x)\,dx
Proakis uses the notation p(x) instead of f_X(x) for probability density.
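A small numerical check of these pdf properties, using an illustrative density f_X(x) = 2x on [0, 1] that is not from the slides; the running trapezoid integration is just one reasonable way to build F_X from f_X.

```python
import numpy as np

# Illustrative continuous-value example: f_X(x) = 2x on [0, 1], zero elsewhere (my choice)
x = np.linspace(0.0, 1.0, 10_001)
f = 2.0 * x

# F_X(x) is the running integral of the pdf, built with a trapezoid rule
F = np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(x))))

print("total probability:", F[-1])                                            # ~1.0, the pdf integrates to one
print("P(0.25 < X <= 0.5):", np.interp(0.5, x, F) - np.interp(0.25, x, F))    # exact value 0.1875
print("dF/dx at x = 0.7:", np.gradient(F, x)[7000], "vs f_X(0.7) =", 2 * 0.7) # derivative of the CDF recovers the pdf
```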

10 Probability Mass and Density
The pdf for tossing a die.

11 Expectation and Moments
Imagine an experiment with M possible distinct outcomes performed N times. The average of those N outcomes is
\bar{X} = \frac{1}{N} \sum_{i=1}^{M} n_i x_i
where x_i is the ith distinct value of X and n_i is the number of times that value occurred. Then
\bar{X} = \frac{1}{N} \sum_{i=1}^{M} n_i x_i = \sum_{i=1}^{M} \frac{n_i}{N} x_i = \sum_{i=1}^{M} r_i x_i
The expected value of X is
E(X) = \lim_{N \to \infty} \sum_{i=1}^{M} \frac{n_i}{N} x_i = \lim_{N \to \infty} \sum_{i=1}^{M} r_i x_i = \sum_{i=1}^{M} P(X = x_i)\, x_i

12 Expectation and Moments
The probability that X lies within some small range can be approximated by
P\!\left(x_i - \tfrac{\Delta x}{2} < X \le x_i + \tfrac{\Delta x}{2}\right) \approx f_X(x_i)\,\Delta x
and the expected value is then approximated by
E(X) \approx \sum_{i=1}^{M} P\!\left(x_i - \tfrac{\Delta x}{2} < X \le x_i + \tfrac{\Delta x}{2}\right) x_i \approx \sum_{i=1}^{M} x_i f_X(x_i)\,\Delta x
where M is now the number of subdivisions of width \Delta x of the range of the random variable.

13 Expectation and Moments
In the limit as \Delta x approaches zero,
E(X) = \int_{-\infty}^{\infty} x f_X(x)\,dx
Similarly,
E(g(X)) = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx
The nth moment of a random variable is
E(X^n) = \int_{-\infty}^{\infty} x^n f_X(x)\,dx
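The sketch below, assuming NumPy and arbitrary example choices, compares the relative-frequency form of E(X) for a die with the Riemann-sum form \sum x_i f_X(x_i) \Delta x for the same illustrative pdf f_X(x) = 2x used earlier.

```python
import numpy as np

rng = np.random.default_rng(1)

# Relative-frequency form of the expected value for a die: E(X) = sum of r_i x_i
rolls = rng.integers(1, 7, size=200_000)
values, counts = np.unique(rolls, return_counts=True)
print("sum r_i x_i :", np.sum((counts / rolls.size) * values))   # ~3.5

# Riemann-sum form for a continuous RV with f_X(x) = 2x on [0, 1] (illustrative pdf, exact E(X) = 2/3)
dx = 1e-4
xi = np.arange(0.0, 1.0, dx) + dx / 2        # midpoints of the subdivisions of width dx
print("sum x_i f_X(x_i) dx :", np.sum(xi * (2 * xi) * dx))
```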

14 Expectation and Moments
The first moment of a random variable is its expected value:
E(X) = \int_{-\infty}^{\infty} x f_X(x)\,dx
The second moment of a random variable is its mean-squared value (which is the mean of its square, not the square of its mean):
E(X^2) = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx

15 Expectation and Moments
A central moment of a random variable is the moment of that random variable after its expected value is subtracted:
E\big[(X - E(X))^n\big] = \int_{-\infty}^{\infty} (x - E(X))^n f_X(x)\,dx
The first central moment is always zero. The second central moment (for real-valued random variables) is the variance,
\sigma_X^2 = E\big[(X - E(X))^2\big] = \int_{-\infty}^{\infty} (x - E(X))^2 f_X(x)\,dx
The positive square root of the variance is the standard deviation.

16 Expectation and Moments
Properties of expectation:
E(a) = a, \quad E(aX) = a\,E(X), \quad E\!\left(\sum_n X_n\right) = \sum_n E(X_n)
where a is a constant.
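A hedged numerical illustration of these definitions; the distribution (an exponential with scale 2) and the sample size are arbitrary choices, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=500_000)   # illustrative samples; any distribution would do

m1 = np.mean(x)                  # first moment: the expected value (exact 2.0 for this choice)
m2 = np.mean(x**2)               # second moment: the mean-squared value (exact 8.0)
c1 = np.mean(x - m1)             # first central moment, always ~0
var = np.mean((x - m1) ** 2)     # second central moment: the variance (exact 4.0)
std = np.sqrt(var)               # standard deviation: positive square root of the variance

print(m1, m2, c1, var, std)
```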

These properties can be used to prove the handy relationship
\sigma_X^2 = E(X^2) - E^2(X)
The variance of a random variable is the mean of its square minus the square of its mean.

17 Expectation and Moments
For complex-valued random variables, absolute moments are useful. The nth absolute moment of a random variable is defined by
E\big(|X|^n\big) = \int_{-\infty}^{\infty} |x|^n f_X(x)\,dx
and the nth absolute central moment is defined by
E\big(|X - E(X)|^n\big) = \int_{-\infty}^{\infty} |x - E(X)|^n f_X(x)\,dx

18 Joint Probability Density
Let X and Y be two random variables. Their joint distribution function is
F_{XY}(x, y) \triangleq P(X \le x \cap Y \le y)
0 \le F_{XY}(x, y) \le 1, \quad -\infty < x < \infty, \ -\infty < y < \infty
F_{XY}(-\infty, -\infty) = F_{XY}(x, -\infty) = F_{XY}(-\infty, y) = 0
F_{XY}(\infty, \infty) = 1
F_{XY}(\infty, y) = F_Y(y) \quad \text{and} \quad F_{XY}(x, \infty) = F_X(x)
F_{XY}(x, y) does not decrease if either x or y increases, or both increase.

19 Joint Probability Density
Joint distribution function for tossing two dice.

20 Joint Probability Density
f_{XY}(x, y) = \frac{\partial^2}{\partial x\,\partial y} F_{XY}(x, y)
f_{XY}(x, y) \ge 0, \quad -\infty < x < \infty, \ -\infty < y < \infty
\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = 1
F_{XY}(x, y) = \int_{-\infty}^{x}\!\int_{-\infty}^{y} f_{XY}(\alpha, \beta)\,d\beta\,d\alpha
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy \quad \text{and} \quad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx
P(x_1 < X \le x_2, \ y_1 < Y \le y_2) = \int_{x_1}^{x_2}\!\int_{y_1}^{y_2} f_{XY}(x, y)\,dy\,dx
E(g(X, Y)) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x, y) f_{XY}(x, y)\,dx\,dy
P((X, Y) \in R) = \iint_R f_{XY}(x, y)\,dx\,dy
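As a sketch of the joint-density ideas (and of the variance identity proved above), using the two-dice joint probabilities as a discrete stand-in for f_XY; the array layout and the choice of dice are mine.

```python
import numpy as np

# Joint probability mass for two fair dice, a discrete stand-in for the joint density f_XY
p_xy = np.full((6, 6), 1 / 36)

print("total probability:", p_xy.sum())          # 1.0
print("marginal of X:", p_xy.sum(axis=1))        # summing over y gives the single-die probabilities, each 1/6
print("F_XY(3, 2) = P(X <= 3, Y <= 2):", p_xy[:3, :2].sum())   # 6/36

# The handy identity sigma_X^2 = E(X^2) - E(X)^2, checked for one die (exact value 35/12)
x = np.arange(1, 7)
px = np.full(6, 1 / 6)
print("variance of one die:", np.sum(x**2 * px) - np.sum(x * px) ** 2)
```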

21 Independent Random Variables
If two random variables X and Y are independent, then
f_{XY}(x, y) = f_X(x)\, f_Y(y)
E(XY) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x y\, f_{XY}(x, y)\,dx\,dy = \int_{-\infty}^{\infty} y f_Y(y)\,dy \int_{-\infty}^{\infty} x f_X(x)\,dx = E(X)\,E(Y)
and their correlation is the product of their expected values.

22 Covariance
\sigma_{XY} \triangleq E\big[(X - E(X))(Y - E(Y))^*\big] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} (x - E(X))(y^* - E(Y^*))\, f_{XY}(x, y)\,dx\,dy
\sigma_{XY} = E(XY^*) - E(X)\,E(Y^*)
If X and Y are independent,
\sigma_{XY} = E(X)\,E(Y^*) - E(X)\,E(Y^*) = 0

23 Independent Random Variables
If two random variables are independent, their covariance is zero. However, if two random variables have a zero covariance, that does not mean they are necessarily independent.
Zero Covariance \nRightarrow Independence
Independence \Rightarrow Zero Covariance

24 Independent Random Variables
In the traditional jargon of random variable analysis, two "uncorrelated" random variables have a covariance of zero. However, this does not also imply that their correlation is zero.
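A minimal example of the "zero covariance does not imply independence" point, using the standard construction Y = X^2 with X uniform on {-1, 0, 1}; this particular pair is my choice, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(3)

# Dependent variables with zero covariance: X uniform on {-1, 0, 1} and Y = X^2
x = rng.choice([-1, 0, 1], size=500_000)
y = x**2

cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print("covariance ~", cov)                        # ~0, even though Y is completely determined by X

# Dependence shows up in the joint probabilities: P(X=1, Y=0) = 0, but P(X=1) P(Y=0) > 0
print(np.mean((x == 1) & (y == 0)), np.mean(x == 1) * np.mean(y == 0))
```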

If their correlation is zero they are said to be orthogonal.
X and Y are "uncorrelated" \Leftrightarrow \sigma_{XY} = 0
X and Y are "uncorrelated" \nRightarrow E(XY) = 0

25 Independent Random Variables
The variance of a sum of random variables X and Y is
\sigma_{X+Y}^2 = \sigma_X^2 + \sigma_Y^2 + 2\sigma_{XY} = \sigma_X^2 + \sigma_Y^2 + 2\rho_{XY}\,\sigma_X \sigma_Y
If Z is a linear combination of random variables X_i,
Z = a_0 + \sum_{i=1}^{N} a_i X_i
then
E(Z) = a_0 + \sum_{i=1}^{N} a_i\,E(X_i)
\sigma_Z^2 = \sum_{i=1}^{N}\sum_{j=1}^{N} a_i a_j \sigma_{X_i X_j} = \sum_{i=1}^{N} a_i^2 \sigma_{X_i}^2 + \sum_{i=1}^{N}\sum_{\substack{j=1 \\ j \ne i}}^{N} a_i a_j \sigma_{X_i X_j}

26 Independent Random Variables
If the X_i's are all independent of each other, the variance of the linear combination is a linear combination of the variances:
\sigma_Z^2 = \sum_{i=1}^{N} a_i^2 \sigma_{X_i}^2
If Z is simply the sum of the X_i's, and the X_i's are all independent of each other, then the variance of the sum is the sum of the variances:
\sigma_Z^2 = \sum_{i=1}^{N} \sigma_{X_i}^2
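A quick Monte Carlo check of the linear-combination variance formula, assuming NumPy; the coefficients, variances, and use of Gaussian samples are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(4)

# Z = a0 + sum a_i X_i for independent X_i
a0 = 2.0
a = np.array([1.0, -3.0, 0.5])
sigma = np.array([1.0, 2.0, 4.0])                  # standard deviations of the X_i

X = rng.normal(0.0, sigma, size=(1_000_000, 3))    # independent columns
Z = a0 + X @ a

print("sample variance of Z:", Z.var())
print("sum a_i^2 sigma_i^2 :", np.sum(a**2 * sigma**2))   # the two agree when the X_i are independent
```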

27 The Central Limit Theorem
If N independent random variables are added to form a resultant random variable Z,
Z = \sum_{n=1}^{N} X_n
then
f_Z(z) = f_{X_1}(z) * f_{X_2}(z) * \cdots * f_{X_N}(z)
and it can be shown that, under very general conditions, the pdf of a sum of a large number of independent random variables with continuous pdf's approaches a limiting shape called the Gaussian pdf, regardless of the shapes of the individual pdf's.

28 The Central Limit Theorem
(Figure illustrating the central limit theorem.)

29 The Central Limit Theorem
The Gaussian pdf:
f_X(x) = \frac{1}{\sigma_X \sqrt{2\pi}}\, e^{-(x - \mu_X)^2 / 2\sigma_X^2}
\mu_X = E(X) \quad \text{and} \quad \sigma_X^2 = E\big[(X - E(X))^2\big]

30 The Central Limit Theorem
The Gaussian pdf:
- Its maximum value occurs at the mean value of its argument.
- It is symmetrical about the mean value.
- The points of maximum absolute slope occur at one standard deviation above and below the mean.
- Its maximum value is inversely proportional to its standard deviation.
- The limit as the standard deviation approaches zero is a unit impulse:
\delta(x - \mu_X) = \lim_{\sigma_X \to 0} \frac{1}{\sigma_X \sqrt{2\pi}}\, e^{-(x - \mu_X)^2 / 2\sigma_X^2}

31 The Central Limit Theorem
The normal pdf is a Gaussian pdf with a mean of zero and a variance of one.
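A sketch of the central limit theorem in action: summing N independent uniform random variables and comparing the histogram of the sum with a Gaussian pdf of matching mean and standard deviation. The value of N, the sample count, and the uniform choice are mine.

```python
import numpy as np

rng = np.random.default_rng(5)

# Sum of N independent uniform random variables; its pdf approaches the Gaussian shape
N = 30
Z = rng.uniform(0.0, 1.0, size=(200_000, N)).sum(axis=1)

mu, sigma = Z.mean(), Z.std()
gauss = lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Compare the histogram of Z (an estimate of f_Z) with the Gaussian pdf at a few points
hist, edges = np.histogram(Z, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
for c, h in list(zip(centers, hist))[::15]:
    print(f"z={c:6.2f}  histogram={h:.4f}  gaussian={gauss(c):.4f}")
```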

f_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}
The central moments of the Gaussian pdf are
E\big[(X - E(X))^n\big] = \begin{cases} 0, & n \text{ odd} \\ 1 \cdot 3 \cdot 5 \cdots (n-1)\,\sigma_X^n, & n \text{ even} \end{cases}

32 Stochastic Processes
A random variable is a number assigned to every outcome of an experiment: X(\zeta).
A stochastic process is the assignment of a function of t to each outcome of an experiment: X(t, \zeta).
The set of functions corresponding to the N outcomes of an experiment is called an ensemble, and each member is called a sample function of the stochastic process:
\{X(t, \zeta_1), X(t, \zeta_2), \ldots, X(t, \zeta_N)\}, \quad X(t, \zeta_i)
A common convention in the notation describing stochastic processes is to write the sample functions as functions of t only and to indicate the stochastic process by X(t) instead of X(t, \zeta) and any particular sample function by X_i(t) instead of X(t, \zeta_i).
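A Monte Carlo spot-check of the Gaussian central-moment formula above; the value of \sigma_X and the sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(6)
sigma = 1.5
x = rng.normal(0.0, sigma, size=2_000_000)   # zero-mean Gaussian samples, so x**n is a central moment

for n in range(1, 7):
    sample = np.mean(x**n)
    if n % 2:
        exact = 0.0                                            # odd central moments vanish
    else:
        exact = np.prod(np.arange(1, n, 2)) * sigma**n         # 1*3*...*(n-1) sigma^n
    print(f"n={n}: sample {sample:9.3f}   formula {exact:9.3f}")
```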

33 Stochastic Processes
(Figure: an ensemble of sample functions.)
The values of X(t) at a particular time t_1 define a random variable X(t_1), or just X_1.

34 Example of a Stochastic Process
Suppose we place a temperature sensor at every airport control tower in the world and record the temperature at noon every day for a year. Then we have a discrete-time, continuous-value (DTCV) stochastic process.

35 Example of a Stochastic Process
Suppose there is a large number of people, each flipping a fair coin every minute. If we assign the value 1 to a head and the value 0 to a tail, we have a discrete-time, discrete-value (DTDV) stochastic process.

36 Continuous-Value vs.
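A small simulation of the coin-flip (DTDV) process described above, assuming NumPy; rows of the array play the role of sample functions and a fixed column plays the role of the random variable X(t_1). The ensemble size and variable names are my own.

```python
import numpy as np

rng = np.random.default_rng(7)

# A DTDV stochastic process like the coin-flip example: each row is one sample function
# (one person's sequence of flips), each column is one time instant
n_people, n_minutes = 1000, 60
ensemble = rng.integers(0, 2, size=(n_people, n_minutes))   # 1 = head, 0 = tail

# Fixing a particular time t1 and looking across the ensemble gives a random variable X(t1)
t1 = 10
X_t1 = ensemble[:, t1]
print("P(X(t1) = 1) ~", X_t1.mean())        # ~0.5 for a fair coin

# One sample function X_i(t): fix the outcome (the person) and look across time
print("first person's flips:", ensemble[0, :10])
```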

