TIME SERIES - University of Cambridge


Contents

Syllabus
Books
Keywords
1 Models for time series data
   Trend, seasonality, cycles and residuals
   Stationary processes
   Autoregressive processes
   Moving average processes
   White noise
   The turning point test
2 Models of stationary processes
   Purely indeterministic processes
   ARMA processes
   ARIMA processes
   Estimation of the autocovariance function
   Identifying an MA(q) process
   Identifying an AR(p) process
   Distributions of the ACF and PACF
3 Spectral methods
   The discrete Fourier transform
   The spectral density
   Analysing the effects of smoothing
4 Estimation of the spectral density
   The periodogram
   Distribution of spectral estimates
   The fast Fourier transform
5 Linear filters
   The Filter Theorem
   Application to autoregressive processes
   Application to moving average processes
   The general linear process
   Filters and ARMA processes
   Calculating autocovariances in ARMA models
6 Estimation of trend and seasonality
   Moving averages
   Centred moving averages
   The Slutzky-Yule effect
   Exponential smoothing
   Calculation of seasonal indices
7 Fitting ARIMA models
   The Box-Jenkins procedure
   Identification
   Estimation
   Verification
   Tests for white noise
   Forecasting with ARMA models
8 State space models
   Models with unobserved states
   The Kalman filter
   Prediction
   Parameter estimation revisited

Syllabus

Time series analysis refers to problems in which observations are collected at regular time intervals and there are correlations among successive observations. Applications cover virtually all areas of statistics, but some of the most important include economic and financial time series, and many areas of environmental or ecological statistics. In this course, I shall cover some of the most important methods for dealing with these problems. In the case of time series, these include the basic definitions of autocorrelations etc., then time-domain model fitting including autoregressive and moving average processes, spectral methods, and some discussion of the effect of time series correlations on other kinds of statistical inference, such as the estimation of means and regression coefficients.

Books

1. Brockwell and Davis, Time Series: Theory and Methods, Springer Series in Statistics (1986).

2. C. Chatfield, The Analysis of Time Series: Theory and Practice, Chapman and Hall (1975). Good general introduction, especially for those completely new to time series.
3. P. Diggle, Time Series: A Biostatistical Introduction, Oxford University Press (1990).
4. M. Kendall, Time Series, Charles Griffin (1976).

Keywords

ACF, AR(p), ARIMA(p,d,q), ARMA(p,q), autocorrelation function, autocovariance function, autoregressive moving average process, autoregressive process, Box-Jenkins, classical decomposition, estimation, filter generating function, Gaussian process, identifiability, identification, integrated autoregressive moving average process, invertible process, MA(q), moving average process, nondeterministic, nonnegative definite sequence, PACF, periodogram, sample partial autocorrelation coefficient, second order stationary, spectral density function, spectral distribution function, strictly stationary, strongly stationary, turning point test, verification, weakly stationary, white noise, Yule-Walker equations

1 Models for time series data

A time series is a set of statistics, usually collected at regular intervals.
Time series data occur naturally in many application areas:
• economics, e.g., monthly data for unemployment, hospital admissions, etc.
• finance, e.g., daily exchange rate, a share price, etc.
• environmental, e.g., daily rainfall, air quality readings.
• medicine, e.g., ECG brain wave activity every 2-8 secs.
The methods of time series analysis pre-date those for general stochastic processes and Markov chains. The aims of time series analysis are to describe and summarise time series data, fit low-dimensional models, and make forecasts. We write our real-valued series of observations as ..., X_{-2}, X_{-1}, X_0, X_1, X_2, ..., a doubly infinite sequence of real-valued random variables indexed by Z.

Trend, seasonality, cycles and residuals

One simple method of describing a series is that of classical decomposition.
The notion is that the series can be decomposed into four elements:

Trend (T_t): long term movements in the mean;
Seasonal effects (I_t): cyclical fluctuations related to the calendar;
Cycles (C_t): other cyclical fluctuations (such as business cycles);
Residuals (E_t): other random or systematic fluctuations.

The idea is to create separate models for these four elements and then combine them, either additively

   X_t = T_t + I_t + C_t + E_t

or multiplicatively

   X_t = T_t · I_t · C_t · E_t.

Stationary processes

1. A sequence {X_t, t ∈ Z} is strongly stationary or strictly stationary if

   (X_{t_1}, ..., X_{t_k}) =_D (X_{t_1+h}, ..., X_{t_k+h})

for all sets of time points t_1, ..., t_k and integers h.

2. A sequence is weakly stationary, or second order stationary, if
(a) E(X_t) = μ, and
(b) cov(X_t, X_{t+k}) = γ_k,
where μ is constant and γ_k is independent of t.

3. The sequence {γ_k, k ∈ Z} is called the autocovariance function.

4. We also define ρ_k = γ_k/γ_0 = corr(X_t, X_{t+k}) and call {ρ_k, k ∈ Z} the autocorrelation function (ACF).
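To make the ACF definition concrete, here is a minimal NumPy sketch that estimates ρ_k from data. The function name `sample_acf` and the choice of the usual biased estimator are my own illustration, not part of the notes:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Estimate the autocorrelation function rho_k for k = 0..max_lag.

    Uses the common biased estimator: each autocovariance is divided
    by n rather than by n - k.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    gamma0 = np.sum((x - xbar) ** 2) / n   # sample autocovariance at lag 0
    acf = []
    for k in range(max_lag + 1):
        gamma_k = np.sum((x[:n - k] - xbar) * (x[k:] - xbar)) / n
        acf.append(gamma_k / gamma0)
    return np.array(acf)

# For white noise, rho_k should be near 0 for all k > 0.
rng = np.random.default_rng(0)
acf = sample_acf(rng.standard_normal(10_000), max_lag=5)
print(acf)
```

By construction acf[0] = 1, and for an uncorrelated series of length n the remaining estimates fluctuate around zero with standard deviation of order 1/√n.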
Remarks.

1. A strictly stationary process is weakly stationary.
2. If the process is Gaussian, that is, (X_{t_1}, ..., X_{t_k}) is multivariate normal for all t_1, ..., t_k, then weak stationarity implies strong stationarity.
3. γ_0 = var(X_t) > 0, assuming X_t is genuinely random.
4. By symmetry, γ_k = γ_{-k}, for all k.

Autoregressive processes

The autoregressive process of order p is denoted AR(p), and defined by

   X_t = Σ_{r=1}^{p} φ_r X_{t-r} + ε_t,

where φ_1, ..., φ_p are fixed constants and {ε_t} is a sequence of independent (or uncorrelated) random variables with mean 0 and variance σ².

The AR(1) process is defined by

   X_t = φ_1 X_{t-1} + ε_t.

To find its autocovariance function we make successive substitutions, to get

   X_t = ε_t + φ_1(ε_{t-1} + φ_1(ε_{t-2} + ···)) = ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + ···

The fact that {X_t} is second order stationary follows from the observation that E(X_t) = 0 and that the autocovariance function can be calculated as follows:

   γ_0 = E[(ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + ···)²] = (1 + φ_1² + φ_1⁴ + ···)σ² = σ²/(1 - φ_1²),

   γ_k = E[(Σ_{r=0}^{∞} φ_1^r ε_{t-r})(Σ_{s=0}^{∞} φ_1^s ε_{t+k-s})]
= σ² φ_1^k / (1 - φ_1²).

There is an easier way to obtain these results. Multiply the AR(1) equation by X_{t-k} and take the expected value, to give

   E(X_t X_{t-k}) = E(φ_1 X_{t-1} X_{t-k}) + E(ε_t X_{t-k}).

Thus γ_k = φ_1 γ_{k-1}, k = 1, 2, ...

Similarly, squaring the AR(1) equation and taking the expected value gives

   E(X_t²) = φ_1² E(X_{t-1}²) + 2φ_1 E(X_{t-1} ε_t) + E(ε_t²) = φ_1² E(X_{t-1}²) + 0 + σ²,

and so γ_0 = σ²/(1 - φ_1²).

More generally, the AR(p) process is defined as

   X_t = φ_1 X_{t-1} + φ_2 X_{t-2} + ··· + φ_p X_{t-p} + ε_t.

Again, the autocorrelation function can be found by multiplying this equation by X_{t-k}, taking the expected value and dividing by γ_0, thus producing the Yule-Walker equations

   ρ_k = φ_1 ρ_{k-1} + φ_2 ρ_{k-2} + ··· + φ_p ρ_{k-p},   k = 1, 2, ...

These are linear recurrence relations, with general solution of the form

   ρ_k = C_1 ω_1^{|k|} + ··· + C_p ω_p^{|k|},

where ω_1, ..., ω_p are the roots of

   ω^p - φ_1 ω^{p-1} - φ_2 ω^{p-2} - ··· - φ_p = 0

and C_1,
..., C_p are determined by ρ_0 = 1 and the equations for k = 1, ..., p - 1. It is natural to require ρ_k → 0 as k → ∞, in which case the roots must lie inside the unit circle, that is, |ω_i| < 1. Thus there is a restriction on the values of φ_1, ..., φ_p that can be chosen.

Moving average processes

The moving average process of order q is denoted MA(q) and defined by

   X_t = Σ_{s=0}^{q} θ_s ε_{t-s},

where θ_1, ..., θ_q are fixed constants, θ_0 = 1, and {ε_t} is a sequence of independent (or uncorrelated) random variables with mean 0 and variance σ².

It is clear from the definition that this is second order stationary and that

   γ_k = 0,   |k| > q,
   γ_k = σ² Σ_{s=0}^{q-|k|} θ_s θ_{s+|k|},   |k| ≤ q.

We remark that two moving average processes can have the same autocorrelation function. For example,

   X_t = ε_t + θ ε_{t-1}   and   X_t = ε_t + (1/θ) ε_{t-1}

both have ρ_1 = θ/(1 + θ²), ρ_k = 0, |k| > 1.
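The MA(q) autocorrelation formula, and the ambiguity just noted, are easy to check numerically. A small sketch (the helper `ma_acf` is my own illustration, not from the notes):

```python
import numpy as np

def ma_acf(theta, k):
    """ACF of the MA(q) process X_t = sum_s theta_s eps_{t-s}, theta[0] == 1.

    rho_k = sum_{s=0}^{q-|k|} theta_s * theta_{s+|k|} / sum_s theta_s**2
    for |k| <= q, and rho_k = 0 for |k| > q; sigma^2 cancels.
    """
    theta = np.asarray(theta, dtype=float)
    k = abs(k)
    if k >= len(theta):
        return 0.0
    return float(np.dot(theta[: len(theta) - k], theta[k:]) / np.dot(theta, theta))

# The MA(1) processes with parameters theta and 1/theta share the same ACF:
theta = 0.5
r1 = ma_acf([1.0, theta], 1)        # rho_1 = theta / (1 + theta^2)
r2 = ma_acf([1.0, 1 / theta], 1)
print(r1, r2)   # both 0.4
```

With θ = 0.5 both processes give ρ_1 = 0.5/1.25 = 2/5 = 0.4, while ρ_k vanishes for |k| > 1, illustrating why the ACF alone cannot distinguish the two.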
However, the first gives

   ε_t = X_t - θ ε_{t-1} = X_t - θ(X_{t-1} - θ ε_{t-2}) = X_t - θ X_{t-1} + θ² X_{t-2} - ···

This is only valid for |θ| < 1, a so-called invertible process. No two invertible processes have the same autocorrelation function.

White noise

The sequence {ε_t}, consisting of independent (or uncorrelated) random variables with mean 0 and variance σ², is called white noise (for reasons that will become clear later). It is a second order stationary series with γ_0 = σ² and γ_k = 0 for k ≠ 0.

The turning point test

We may wish to test whether a series can be considered to be white noise, or whether a more complicated model is required. In later chapters we shall consider various ways to do this; for example, we might estimate the autocovariance function, say {γ̂_k}, and observe whether or not γ̂_k is near zero for all k > 0. However, a very simple diagnostic is the turning point test, which examines a series {X_t} to test whether it is purely random.
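A sketch of the turning point test in Python follows. It is my own illustration of the test described above, using the standard moments for an i.i.d. series, E(T) = 2(n - 2)/3 and var(T) = (16n - 29)/90, which are not derived in this excerpt:

```python
import numpy as np

def turning_point_test(x):
    """Count turning points and return (T, z).

    A turning point occurs at time t if X_t is a strict local maximum or
    minimum. For a purely random (i.i.d.) series of length n, T has
    E(T) = 2(n - 2)/3 and var(T) = (16n - 29)/90, and is approximately
    normal, so z = (T - E(T)) / sd(T) can be referred to N(0, 1).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    mid, left, right = x[1:-1], x[:-2], x[2:]
    t = int(np.sum(((mid > left) & (mid > right)) | ((mid < left) & (mid < right))))
    mean = 2 * (n - 2) / 3
    var = (16 * n - 29) / 90
    z = (t - mean) / np.sqrt(var)
    return t, z

rng = np.random.default_rng(1)
t, z = turning_point_test(rng.standard_normal(1000))
print(t, z)   # |z| should be small for white noise
```

A strongly trending series such as np.arange(100.0) has no turning points at all, giving a large negative z and a clear rejection of pure randomness.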
