
Lecture Notes 7: Stationary Random Processes (EE 278)

• Strict-Sense and Wide-Sense Stationarity
• Autocorrelation Function of a Stationary Process
• Power Spectral Density
• Stationary Ergodic Random Processes




Stationary Random Processes

Stationarity refers to time invariance of some, or all, of the statistics of a random process, such as its mean, autocorrelation, and n-th-order distributions. We define two types of stationarity: strict sense (SSS) and wide sense (WSS).

A random process X(t) (or X_n) is said to be SSS if all its finite-order distributions are time invariant, i.e., the joint cdfs (pdfs, pmfs) of

X(t₁), X(t₂), …, X(t_k) and X(t₁+τ), X(t₂+τ), …, X(t_k+τ)

are the same for all k, all t₁, t₂, …, t_k, and all time shifts τ.

So for an SSS process, the first-order distribution is independent of t, and the second-order distribution, i.e., the distribution of any two samples X(t₁) and X(t₂), depends only on τ = t₂ − t₁. To see this, note that from the definition of stationarity, for any t, the joint distribution of X(t₁) and X(t₂) is the same as the joint distribution of X(t₁ + (t − t₁)) = X(t) and X(t₂ + (t − t₁)) = X(t + (t₂ − t₁)).

Example: The random phase signal X(t) = α cos(ωt + Θ), where Θ ~ U[0, 2π], is SSS.

We already know that the first-order pdf is

f_{X(t)}(x) = 1 / (πα √(1 − (x/α)²)),  −α < x < +α,

which is independent of t, and is therefore stationary.

To find the second-order pdf, note that if we are given the value of X(t) at one point, say t₁, there are (at most) two possible sample functions, giving two possible values x₂₁ and x₂₂ at time t₂. [Figure: the two candidate sample paths through (t₁, x₁).]

The second-order pdf can thus be written as

f_{X(t₁),X(t₂)}(x₁, x₂) = f_{X(t₁)}(x₁) f_{X(t₂)|X(t₁)}(x₂ | x₁)
                        = f_{X(t₁)}(x₁) [ ½ δ(x₂ − x₂₁) + ½ δ(x₂ − x₂₂) ],

which depends only on t₂ − t₁, and thus the second-order pdf is stationary. Now if we know that X(t₁) = x₁ and X(t₂) = x₂, the sample path is totally determined (except when x₁ = x₂ = 0, where two paths may be possible), and thus all n-th-order pdfs are stationary.

• IID processes are SSS.
• The random walk and Poisson processes are not SSS.
• The Gauss-Markov process (as we defined it) is not SSS.
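As a quick numerical illustration of why the random walk fails to be SSS: its first-order distribution depends on n. The sketch below (with ±1 steps, an arbitrary choice) computes the exact pmf of X_n by convolving step pmfs and shows Var(X_n) = n, so the distribution of X_n is not time invariant.

```python
import numpy as np

# Exact pmf of the random walk X_n = Z_1 + ... + Z_n with IID steps
# Z_i = +1 or -1, each with probability 1/2, via repeated convolution.
step = np.array([0.5, 0.5])        # pmf of Z_i on support {-1, +1}
pmf = np.array([1.0])              # pmf of X_0 = 0
variances = []
for n in range(1, 6):
    pmf = np.convolve(pmf, step)   # support of X_n is {-n, -n+2, ..., n}
    support = np.arange(-n, n + 1, 2)
    variances.append(np.sum(pmf * support**2))  # mean is 0, so Var = E[X_n^2]

print(variances)  # Var(X_n) grows with n: first-order distribution not stationary
```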

However, if we set X₁ to the steady-state distribution of X_n, it becomes SSS (see homework exercise).

Wide-Sense Stationary Random Processes

A random process X(t) is said to be wide-sense stationary (WSS) if its mean and autocorrelation functions are time invariant, i.e.,

• E[X(t)] = μ, independent of t
• R_X(t₁, t₂) is a function only of the time difference t₂ − t₁
• E[X(t)²] < ∞ (technical condition)

Since R_X(t₁, t₂) = R_X(t₂, t₁), for any wide-sense stationary process X(t), R_X(t₁, t₂) is a function only of |t₂ − t₁|.

Clearly SSS ⇒ WSS. The converse is not necessarily true.

Example: Let

X(t) = +sin t with probability ¼, −sin t with probability ¼, +cos t with probability ¼, −cos t with probability ¼.

Then E[X(t)] = 0 and R_X(t₁, t₂) = ½ cos(t₂ − t₁), thus X(t) is WSS. But X(0) and X(π/4) do not have the same pmf (they take different sets of values), so the first-order pmf is not stationary, and the process is not SSS.

• For Gaussian random processes, WSS ⇒ SSS, since the process is completely specified by its mean and autocorrelation functions.
• The random walk is not WSS, since R_X(n₁, n₂) = min{n₁, n₂} is not time invariant.
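The four-sample-function example above can be checked by direct enumeration, since the process has only four equally likely paths. The sketch below evaluates E[X(t₁)] and R_X(t₁, t₂) exactly on a grid of time pairs:

```python
import numpy as np

# X(t) is one of {+sin t, -sin t, +cos t, -cos t}, each with probability 1/4.
t1, t2 = np.meshgrid(np.linspace(0, 6, 50), np.linspace(0, 6, 50))
paths = [np.sin, lambda t: -np.sin(t), np.cos, lambda t: -np.cos(t)]

mean = sum(0.25 * g(t1) for g in paths)       # E[X(t1)] over the four paths
R = sum(0.25 * g(t1) * g(t2) for g in paths)  # R_X(t1, t2)

assert np.allclose(mean, 0)                   # zero mean for every t1
assert np.allclose(R, 0.5 * np.cos(t2 - t1))  # depends only on t2 - t1: WSS
```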

Similarly, the Poisson process is not WSS.

Autocorrelation Function of WSS Processes

Let X(t) be a WSS process. Relabel R_X(t₁, t₂) as R_X(τ), where τ = t₁ − t₂.

1. R_X(τ) is real and even, i.e., R_X(τ) = R_X(−τ) for every τ.

2. |R_X(τ)| ≤ R_X(0) = E[X²(t)], the average power of X(t).
   This can be shown as follows. For every t,

   (R_X(τ))² = (E[X(t)X(t+τ)])²
             ≤ E[X²(t)] E[X²(t+τ)]   (by the Schwarz inequality)
             = (R_X(0))²              (by stationarity).

3. If R_X(T) = R_X(0) for some T ≠ 0, then R_X(τ) is periodic with period T, and so is X(t) (with probability 1)! That is,

   R_X(τ) = R_X(τ + T) and X(τ) = X(τ + T) w.p. 1, for every τ.

Example: The autocorrelation function of the periodic signal with random phase X(t) = α cos(ωt + Θ) is R_X(τ) = (α²/2) cos ωτ (also periodic).

To prove property 3, we again use the Schwarz inequality. For every τ,

(R_X(τ) − R_X(τ+T))² = (E[X(t)(X(t+τ) − X(t+τ+T))])²
                     ≤ E[X²(t)] E[(X(t+τ) − X(t+τ+T))²]
                     = R_X(0)(2R_X(0) − 2R_X(T))
                     = R_X(0)(2R_X(0) − 2R_X(0)) = 0.

Thus R_X(τ) = R_X(τ+T) for all τ, i.e., R_X(τ) is periodic with period T.

The above properties of R_X(τ) are necessary but not sufficient for a function to qualify as an autocorrelation function for a WSS process.
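A Monte Carlo sketch of the random-phase example above, checking both the closed form R_X(τ) = (α²/2) cos ωτ and property 2. The parameter values α = 2, ω = 3, the reference time t = 0.7, and the seed are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, omega = 2.0, 3.0
theta = rng.uniform(0, 2 * np.pi, size=1_000_000)  # random phase ~ U[0, 2*pi]

def R_est(tau, t=0.7):
    # Monte Carlo estimate of E[X(t) X(t+tau)] over the random phase
    return np.mean(alpha * np.cos(omega * t + theta) *
                   alpha * np.cos(omega * (t + tau) + theta))

for tau in [0.0, 0.5, 1.0]:
    est = R_est(tau)
    exact = alpha**2 / 2 * np.cos(omega * tau)
    assert abs(est - exact) < 0.02            # matches (alpha^2/2) cos(omega*tau)
    assert abs(est) <= R_est(0.0) + 0.02      # property 2: |R(tau)| <= R(0)
```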

The necessary and sufficient condition for a function to be an autocorrelation function for a WSS process is that it be real, even, and nonnegative definite. By nonnegative definite we mean that for any n, any t₁, t₂, …, t_n, and any real vector a = (a₁, …, a_n),

Σ_{i=1}^{n} Σ_{j=1}^{n} aᵢ aⱼ R(tᵢ − tⱼ) ≥ 0.

To see why this is necessary, recall that the correlation matrix for a random vector must be nonnegative definite, so if we take a set of n samples from the WSS random process, their correlation matrix must be nonnegative definite. The condition is sufficient since such an R(τ) can specify a zero-mean stationary Gaussian random process.

The nonnegative definite condition may be difficult to verify directly. It turns out, however, to be equivalent to the condition that the Fourier transform of R_X(τ), which is called the power spectral density S_X(f), is nonnegative for all frequencies f.
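The nonnegative-definite condition can be probed numerically by forming the sample correlation matrix R[i,j] = R(tᵢ − tⱼ) and checking its eigenvalues. The sketch below (sample times chosen arbitrarily) contrasts e^{−|τ|}, a valid autocorrelation, with a rectangular pulse, which is real and even but not nonnegative definite:

```python
import numpy as np

# Sample times and the matrix of pairwise time differences.
t = np.array([0.0, 0.9, 1.8, 2.5, 4.0])
D = t[:, None] - t[None, :]

# e^{-|tau|} is a valid autocorrelation: every sample matrix is PSD.
R_exp = np.exp(-np.abs(D))
print(np.linalg.eigvalsh(R_exp).min())   # >= 0, up to roundoff

# A rectangular pulse (1 for |tau| < 1, else 0) is real and even but NOT
# nonnegative definite (its Fourier transform, a sinc, goes negative).
R_rect = (np.abs(D) < 1).astype(float)
print(np.linalg.eigvalsh(R_rect).min())  # strictly negative eigenvalue
```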

Which Functions Can Be an R_X(τ)?

[Figures (two slides): plots of candidate autocorrelation functions, including e^{−|τ|}, a sinc, a triangular pulse, a rectangular pulse, and 2^{−|n|}.]

Interpretation of the Autocorrelation Function

Let X(t) be WSS with zero mean. If R_X(τ) drops quickly with τ, this means that samples become uncorrelated quickly as we increase τ. Conversely, if R_X(τ) drops slowly with τ, samples are highly correlated.

So R_X(τ) is a measure of the rate of change of X(t) with time t, i.e., of the frequency response of X(t). It turns out that this is not just an intuitive interpretation: the Fourier transform of R_X(τ) (the power spectral density) is in fact the average power density of X(t) over frequency.

Power Spectral Density

The power spectral density (psd) of a WSS random process X(t) is the Fourier transform of R_X(τ):

S_X(f) = F[R_X(τ)] = ∫_{−∞}^{∞} R_X(τ) e^{−i2πfτ} dτ.

For a discrete-time process X_n, the power spectral density is the discrete-time Fourier transform (DTFT) of the sequence R_X(n).
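The continuous-time transform above can be sanity-checked numerically. The sketch below approximates the psd integral for R(τ) = e^{−α|τ|} by a trapezoidal sum and compares it to the closed form 2α/(α² + (2πf)²) derived later in the notes; α = 1.5 and the grid are arbitrary choices, and the tails beyond |τ| = 40 are negligible:

```python
import numpy as np

alpha = 1.5
tau = np.linspace(-40.0, 40.0, 400_001)
h = tau[1] - tau[0]
R = np.exp(-alpha * np.abs(tau))

for f in [0.0, 0.3, 1.0]:
    # R is even, so the e^{-i 2 pi f tau} integral reduces to a cosine integral.
    g = R * np.cos(2 * np.pi * f * tau)
    S_num = h * (g.sum() - 0.5 * (g[0] + g[-1]))   # trapezoidal rule
    S_exact = 2 * alpha / (alpha**2 + (2 * np.pi * f)**2)
    assert abs(S_num - S_exact) < 1e-6
```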

S_X(f) = Σ_{n=−∞}^{∞} R_X(n) e^{−i2πnf},  |f| < ½.

R_X(τ) (or R_X(n)) can be recovered from S_X(f) by taking the inverse Fourier transform or inverse DTFT:

R_X(τ) = ∫_{−∞}^{∞} S_X(f) e^{i2πfτ} df,  R_X(n) = ∫_{−½}^{½} S_X(f) e^{i2πnf} df.

Properties of the Power Spectral Density

1. S_X(f) is real and even, since the Fourier transform of the real and even function R_X(τ) is real and even.

2. ∫_{−∞}^{∞} S_X(f) df = R_X(0) = E[X²(t)], the average power of X(t), i.e., the area under S_X is the average power.

3. S_X(f) is the average power density, i.e., the average power of X(t) in the frequency band [f₁, f₂] is

   ∫_{−f₂}^{−f₁} S_X(f) df + ∫_{f₁}^{f₂} S_X(f) df = 2 ∫_{f₁}^{f₂} S_X(f) df

   (we will show this soon).

From property 3, it follows that S_X(f) ≥ 0. (Why?)

In general, a function S(f) is a psd if and only if it is real, even, nonnegative, and ∫_{−∞}^{∞} S(f) df < ∞.

Examples:
• R_X(τ) = e^{−α|τ|}  ↔  S_X(f) = 2α / (α² + (2πf)²)
• R_X(τ) = (α²/2) cos ωτ  ↔  S_X(f) = (α²/4) [δ(f − ω/2π) + δ(f + ω/2π)]
• R_X(n) = 2^{−|n|}  ↔  S_X(f) = 3 / (5 − 4 cos 2πf)
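The discrete-time pair in the last example converges fast enough to verify by summing the DTFT series directly. A minimal sketch (truncating at |n| = 60, where the tail is far below machine precision):

```python
import numpy as np

# Check the DTFT pair R(n) = 2^{-|n|}  <->  S(f) = 3 / (5 - 4 cos 2*pi*f).
n = np.arange(-60, 61)
R = 2.0 ** (-np.abs(n))

for f in [0.0, 0.1, 0.25, 0.5]:
    # R(n) is even, so the sum of R(n) e^{-i 2 pi n f} is real.
    S_series = np.sum(R * np.cos(2 * np.pi * f * n))
    S_closed = 3 / (5 - 4 * np.cos(2 * np.pi * f))
    assert abs(S_series - S_closed) < 1e-12
```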

Discrete-time white noise process: X₁, X₂, …, X_n, … zero mean, uncorrelated, with average power N:

R_X(n) = N for n = 0, and 0 otherwise;  S_X(f) = N, |f| < ½.

If X_n is also a GRP, then we obtain a discrete-time WGN process.

Bandlimited white noise process: a WSS zero-mean process X(t) with

S_X(f) = N/2 for |f| ≤ B, and 0 otherwise;  R_X(τ) = NB sinc(2Bτ).

For any t, the samples X(t + n/2B) for n = 0, ±1, ±2, … are uncorrelated.

White noise process: if we let B → ∞ in the previous example, we obtain a white noise process, which has

S_X(f) = N/2 for all f;  R_X(τ) = (N/2) δ(τ).

If, in addition, X(t) is a GRP, then we obtain the famous white Gaussian noise (WGN) process.

Remarks on white noise:
• For a white noise process, all samples are uncorrelated.
• The process is not physically realizable, since it has infinite power.
• However, it plays a role in random processes similar to that of the point mass in physics and the delta function in linear systems.
• Thermal noise and shot noise are well modeled as white Gaussian noise, since they have a very flat psd over a very wide band (GHz).
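A small simulation sketch of discrete-time WGN: IID zero-mean Gaussian samples with average power N = 2 (the value, sample size, and seed are arbitrary choices), whose estimated autocorrelation should be N at lag 0 and near 0 at every other lag:

```python
import numpy as np

rng = np.random.default_rng(1)
N0 = 2.0                                  # average power N of the process
x = rng.normal(0, np.sqrt(N0), size=500_000)

def R_hat(k):
    # Sample autocorrelation estimate at lag k.
    return np.mean(x[:len(x) - k] * x[k:]) if k else np.mean(x * x)

print(R_hat(0))                # close to N0: the average power
print(R_hat(1), R_hat(5))      # close to 0: samples are uncorrelated
```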

Stationary Ergodic Random Processes

Ergodicity refers to certain time averages of random processes converging to their respective statistical averages. We focus only on mean ergodicity of WSS processes.

Let X_n, n = 1, 2, …, be a discrete-time WSS process with mean μ and autocorrelation function R_X(n). To estimate the mean of X_n, we form the sample mean

X̄_n = (1/n) Σ_{i=1}^{n} X_i.

The process X_n is said to be mean ergodic if X̄_n → μ in mean square, i.e.,

lim_{n→∞} E[(X̄_n − μ)²] = 0.

Since E[X̄_n] = μ, this condition is equivalent to: Var(X̄_n) → 0 as n → ∞.

We can express this condition in terms of C_X(n) = R_X(n) − μ² as follows:

Var(X̄_n) = (1/n²) Σ_{i=1}^{n} Σ_{j=1}^{n} E[(Xᵢ − μ)(Xⱼ − μ)]
          = (1/n²) Σ_{i=1}^{n} Σ_{j=1}^{n} C_X(i − j)
          = (1/n) C_X(0) + (2/n²) Σ_{i=1}^{n−1} (n − i) C_X(i).

Since by definition C_X(0) < ∞, the condition for mean ergodicity is:

(2/n²) Σ_{i=1}^{n−1} (n − i) C_X(i) → 0.
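The collapse of the double sum into the single-sum formula above can be verified directly. The sketch below uses an assumed autocovariance C_X(n) = ρ^{|n|} (an arbitrary choice for illustration) and checks that the two expressions for Var(X̄_n) agree:

```python
import numpy as np

rho, n = 0.6, 50
C = lambda k: rho ** abs(k)              # assumed autocovariance C_X(k)

# Direct double sum: (1/n^2) sum_i sum_j C_X(i - j)
direct = sum(C(i - j) for i in range(1, n + 1) for j in range(1, n + 1)) / n**2

# Collapsed form: (1/n) C_X(0) + (2/n^2) sum_{i=1}^{n-1} (n - i) C_X(i)
collapsed = C(0) / n + (2 / n**2) * sum((n - i) * C(i) for i in range(1, n))

assert abs(direct - collapsed) < 1e-12   # the two expressions agree
```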

Example: Let X_n be a WSS process with C_X(n) = 0 for n ≠ 0. Then Var(X̄_n) = C_X(0)/n → 0, so the process is mean ergodic. The process does not need to have uncorrelated samples for it to be mean ergodic, however (see the stationary Gauss-Markov process problem in HW 7).

Not every WSS process is mean ergodic. Example: Consider the coin with random bias P from an earlier set of lecture notes. The resulting random process X₁, X₂, … is stationary. However, it is not mean ergodic, since X̄_n → P in mean square, and P is random rather than a constant.

Remarks:
• The process in the above example can be viewed as a mixture of IID Bernoulli(p) processes, each of which is stationary ergodic. (It turns out that every SSS process is a mixture of stationary ergodic processes.)
• Ergodicity can be defined for general (not necessarily stationary) processes; this is beyond the scope of this course.

Mean Ergodicity for Continuous-Time WSS Processes

Let X(t) be a WSS process with mean μ and autocorrelation function R_X(τ). To estimate the mean, we form the time average

X̄(t) = (1/t) ∫₀ᵗ X(τ) dτ.

But what does this integral mean?
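The coin-with-random-bias example above can be simulated. The sketch below (realization counts, horizon, and seed are arbitrary choices) draws a bias P ~ U[0,1] once per realization, flips IID Bernoulli(P) coins, and shows that Var(X̄_n) levels off near Var(P) = 1/12 instead of going to 0, so the process is not mean ergodic:

```python
import numpy as np

rng = np.random.default_rng(2)
reps, n = 5000, 1000
P = rng.uniform(0, 1, size=(reps, 1))             # one random bias per realization
X = (rng.uniform(size=(reps, n)) < P).astype(float)
sample_means = X.mean(axis=1)                     # one sample mean per realization

# X_bar_n converges to P, not to the ensemble mean 1/2, so its variance
# across realizations stays near Var(P) = 1/12 instead of vanishing.
print(sample_means.var())
```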

