STAT 520 (Spring 2010) Lecture 2, January 14, Thursday
1. Stationary models and the autocorrelation function
Why is stationarity such an important concept? Consider the problem of using data to make
statements (say, predictions) about a presumed underlying time series process. One needs
to be sure that the dynamics of the process stay the same over time. One way to formalize
this is to require that the distribution of any finite sequence of random variables from the
process does not depend on the choice of time origin.
A process $\{X_t\}$ is strictly stationary if
$$(X_{t_1}, \ldots, X_{t_k}) \stackrel{d}{=} (X_{t_1 + t}, \ldots, X_{t_k + t}), \quad \text{for any } t \text{ and } t_1, \ldots, t_k.$$
Suppose that $\{X_t\}$ is a time series with $E(X_t^2) < \infty$. Its mean function is
$$\mu_t = E(X_t).$$
Its autocovariance function is
$$\gamma_X(s, t) = \mathrm{Cov}(X_s, X_t) = E[(X_s - \mu_s)(X_t - \mu_t)].$$
We say that $\{X_t\}$ is (weakly) stationary if
1. $\mu_t$ is independent of $t$, i.e., $\mu_t = \mu$, and
2. for each $h$, $\gamma_X(t+h, t)$ is independent of $t$, i.e., $\gamma_X(t+h, t) = \gamma_X(h)$.
Then its autocorrelation function (ACF) is
$$\rho_X(h) = \frac{\gamma_X(h)}{\gamma_X(0)} = \frac{\mathrm{Cov}(X_{t+h}, X_t)}{\mathrm{Var}(X_t)} = \mathrm{Corr}(X_{t+h}, X_t).$$
Can you give an example of a weakly stationary process that is not strictly stationary?
Can you give an example of a strictly stationary process that is not weakly stationary?
In summary:
Strict stationarity + finite variance $\Rightarrow$ (weak) stationarity.
Strict stationarity + finite variance $\Leftarrow$ (weak) stationarity + normality.
Now consider a counter-example, a process that is not (weakly) stationary: the random walk. How
does it look graphically? In particular, what does a typical realization of the random walk
process look like? A noticeable feature is that the random walk's observations
really "walk randomly" across a wide range of values. In contrast, the white noise process
with standard deviation 1 oscillates around its mean of zero and stays within a
range of roughly $-2$ to $2$ (see the simulation sketch below).
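
As an illustration (my own sketch, not part of the lecture), the following minimal NumPy simulation generates both processes and compares their ranges; the variable names and seed are arbitrary:

```python
import numpy as np

# A minimal simulation sketch: compare white noise Z_t ~ N(0, 1)
# with the random walk S_t = Z_1 + ... + Z_t built from it.
rng = np.random.default_rng(0)
n = 200
z = rng.standard_normal(n)   # white noise with standard deviation 1
s = np.cumsum(z)             # random walk: partial sums of the noise

# The white noise oscillates around 0, staying roughly within -2 to 2,
# while the random walk wanders over a much wider range of values.
print(f"white noise range: {z.min():.2f} to {z.max():.2f}")
print(f"random walk range: {s.min():.2f} to {s.max():.2f}")
```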
Consider the following (weakly) stationary processes and calculate their autocovariance
and autocorrelation functions: iid noise, white noise, MA(1), and AR(1) processes.
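
As a worked illustration for one of these (a standard derivation not spelled out in the notes, assuming the usual MA(1) parameterization $X_t = Z_t + \theta Z_{t-1}$ with $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$):
$$\gamma_X(h) = \mathrm{Cov}(Z_{t+h} + \theta Z_{t+h-1},\; Z_t + \theta Z_{t-1}) =
\begin{cases}
(1 + \theta^2)\sigma^2, & h = 0,\\
\theta \sigma^2, & |h| = 1,\\
0, & |h| > 1,
\end{cases}
\qquad
\rho_X(h) =
\begin{cases}
1, & h = 0,\\
\theta/(1 + \theta^2), & |h| = 1,\\
0, & |h| > 1.
\end{cases}$$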
The behavior pattern of $\gamma_X(h)$ is useful when examining the sample ACF of a data set.
Definition 1.4.4 Let $x_1, \ldots, x_n$ be observations of a time series. The sample mean of
$x_1, \ldots, x_n$ is
$$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i.$$
The sample autocovariance function is
$$\hat{\gamma}(h) = \frac{1}{n} \sum_{t=1}^{n - |h|} (x_{t + |h|} - \bar{x})(x_t - \bar{x}), \quad \text{for } -n < h < n.$$
The sample ACF is
$$\hat{\rho}(h) = \frac{\hat{\gamma}(h)}{\hat{\gamma}(0)}, \quad \text{for } -n < h < n.$$
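
Here is a minimal Python/NumPy sketch of these estimators (my own illustration, not part of the course software); note that it divides by $n$, not $n - |h|$, exactly as in the definition:

```python
import numpy as np

# A minimal sketch of the sample ACF defined above.
def sample_acf(x, max_lag):
    """Return rho-hat(0), ..., rho-hat(max_lag) for the series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    # gamma-hat(h): divide by n, not n - |h|, as in Definition 1.4.4
    gamma = np.array([((x[h:] - xbar) * (x[:n - h] - xbar)).sum() / n
                      for h in range(max_lag + 1)])
    return gamma / gamma[0]   # rho-hat(h) = gamma-hat(h) / gamma-hat(0)
```

Dividing by $n$ rather than $n - |h|$ guarantees that the resulting sample autocovariance matrix is nonnegative definite, which is the point of the remark cited below.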
Remarks:
1. See page 19, Remark 3. (Why divide by $n$ for $\hat{\gamma}(h)$?)
2. The 95% confidence bounds for $\hat{\rho}(h)$ with $h \neq 0$ are $\pm 1.96 n^{-1/2}$ for $\mathrm{WN}(0, \sigma^2)$. (Figures 1-12 and 1-13, pp. 19-20)
3. If $\hat{\rho}(h)$ takes values outside the bounds only for $h = 1$, you might model the data
as an MA(1) process. If $\hat{\rho}(h)$ decreases geometrically, you could consider an
AR(1) process (see the simulation sketch after this list).
4. For data containing a trend, $|\hat{\rho}(h)|$ will decay slowly, and for data with a
substantial deterministic periodic component, $|\hat{\rho}(h)|$ will exhibit similar
periodicity. (Figure 1-14, p. 21)
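
To see remarks 2 and 3 in action, here is a hedged simulation sketch (my own; it assumes the `sample_acf` function from the sketch above is in scope, and the coefficients 0.8 are arbitrary choices): the MA(1) sample ACF should fall inside the $\pm 1.96 n^{-1/2}$ band beyond lag 1, while the AR(1) sample ACF should decay roughly geometrically.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
z = rng.standard_normal(n + 1)
theta, phi = 0.8, 0.8

# MA(1): X_t = Z_t + theta * Z_{t-1}  -> ACF cuts off after lag 1
x_ma = z[1:] + theta * z[:-1]

# AR(1): X_t = phi * X_{t-1} + Z_t    -> ACF decays roughly like phi^h
x_ar = np.empty(n)
x_ar[0] = z[1]
for t in range(1, n):
    x_ar[t] = phi * x_ar[t - 1] + z[t + 1]

bound = 1.96 / np.sqrt(n)  # white-noise band from remark 2
print("MA(1) sample ACF, lags 1-5:", np.round(sample_acf(x_ma, 5)[1:], 2))
print("AR(1) sample ACF, lags 1-5:", np.round(sample_acf(x_ar, 5)[1:], 2))
print("95% white-noise band: +/-", round(bound, 3))
```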
Example: Consider the Lake Huron data (LAKE.TSM).
Please read the ITSM Tutorial D.3.