Random Series / White Noise
Notation
• WN (white noise) – uncorrelated
• iid – independent and identically distributed
• Yt ~ iid N(m, s) – Random Series
• et ~ iid N(0, s) – White Noise
Data Generation
• Independent observations at every t from the normal distribution N(m, s)
[Figure: timeplots of simulated series Yt against t]
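A minimal numpy sketch of this data-generating process (the values of m, s, and n below are illustrative choices, not values from the slides):

```python
import numpy as np

rng = np.random.default_rng(seed=123)   # seed fixed only for reproducibility

m, s, n = 10.0, 2.0, 50                 # illustrative mean, std. dev., sample size
Y = rng.normal(loc=m, scale=s, size=n)  # independent draws from N(m, s) at t = 1, ..., n
print(Y[:5])                            # first few observations of the random series
```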
Identification of WN Process
How do we determine whether the data come from a WN process?
Tests of Randomness - 1
• Timeplot of the Data
Check trend
Check heteroscedasticity
Check seasonality
Generating a Random Series
Using EViews
• Command: nrnd generates a random N(0, 1) draw
[Figure: timeplot of a generated white-noise series WN, 50 observations, values roughly between -3 and 3]
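nrnd itself is an EViews command; a rough Python analogue (assuming numpy and matplotlib) that generates and timeplots 50 N(0, 1) draws would be:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
wn = rng.standard_normal(50)       # analogue of nrnd: 50 draws from N(0, 1)

plt.plot(range(1, 51), wn)         # timeplot: check for trend, changing spread, seasonality
plt.axhline(0, linestyle="--")
plt.xlabel("t")
plt.ylabel("WN")
plt.show()
```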
Test of Randomness - 2
Correlogram
Sample: 1 50
Included observations: 50
Autocorrelation   Partial Correlation        AC      PAC   Q-Stat   Prob
     .|.                .|.            1  -0.042  -0.042   0.0952  0.758
     .|*.               .|*.           2   0.112   0.111   0.7777  0.678
     .|**               .|**           3   0.275   0.288   4.9515  0.175
    **|.               **|.            4  -0.215  -0.218   7.5666  0.109
     .|.                .|.            5   0.036  -0.053   7.6427  0.177
     .|.                .|.            6   0.047   0.033   7.7741  0.255
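A correlogram like this one can be reproduced in Python; a sketch assuming the statsmodels package and a freshly simulated white-noise series (so the numbers will not match the table above):

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(seed=1)
wn = rng.standard_normal(50)      # simulated white-noise series, 50 observations

print(acf(wn, nlags=6))           # sample autocorrelations (index 0 is lag 0)
print(pacf(wn, nlags=6))          # sample partial autocorrelations
```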
Scatterplot and Correlation
Coefficient - Review
[Figure: scatterplot of Y against X]
Autocorrelation Coefficient
• Definition:
The correlation coefficient between Yt and Yt-k is
called the autocorrelation coefficient at lag k and
is denoted rk. By definition, r0 = 1.
• Autocorrelation of a Random Series:
If the series is random, rk = 0 for k = 1, 2, ...
Process Correlogram
[Figure: rk plotted against lag k; rk lies between -1 and 1]
Sample Autocorrelation
Coefficient
Sample autocorrelation at lag k:

$$\hat r_k = \frac{\sum_{t=k+1}^{n}\left(Y_t - \bar Y\right)\left(Y_{t-k} - \bar Y\right)}{\sum_{t=1}^{n}\left(Y_t - \bar Y\right)^{2}}$$
Standard Error of the Sample
Autocorrelation Coefficient
• Standard Error of the sample autocorrelation
if the series is random:

$$s_{\hat r_k} = \sqrt{\frac{n-k}{n(n+2)}} \approx \frac{1}{\sqrt{n}}$$
Z-Test of H0: rk = 0

$$z = \frac{\hat r_k}{s_{\hat r_k}}$$
Reject H0 if Z < -1.96 or Z > 1.96
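A numpy sketch of this lag-k test, implementing the three formulas above directly (the series and the lag k = 3 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
Y = rng.standard_normal(50)                  # illustrative series
n, k = len(Y), 3
Ybar = Y.mean()

# sample autocorrelation at lag k
r_k = np.sum((Y[k:] - Ybar) * (Y[:-k] - Ybar)) / np.sum((Y - Ybar) ** 2)

# standard error under randomness and the z statistic
se_k = np.sqrt((n - k) / (n * (n + 2)))
z = r_k / se_k

print(r_k, se_k, z, abs(z) > 1.96)           # True => reject H0: rk = 0
```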
Box-Ljung Q Statistic
• Definition:

$$Q_{BL}(m) = n(n+2)\sum_{k=1}^{m}\frac{\hat r_k^{\,2}}{n-k}$$
Sampling Distribution of QBL(m) | H0
• H0: r1 = r2 = … = rm = 0
• QBL(m) | H0 follows a χ² distribution with m degrees of freedom (DF = m)
Reject H0 if QBL(m) > the 95th percentile of χ²(DF = m)
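In Python this test is available directly; a sketch assuming a recent statsmodels (the simulated series is only illustrative):

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(seed=1)
Y = rng.standard_normal(50)

# Box-Ljung Q statistics and p-values for m = 1, ..., 6, as in the correlogram
print(acorr_ljungbox(Y, lags=6))
```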
Test of Normality - 1
Graphical Test
• Normal Probability Plot of the Data
Check the shape: straight, convex,
S-shaped
Construction of a Normal
Probability Plot
• Alternative estimates of the cumulative relative frequency
of an observation
– pi = (i - 0.5)/ n
– pi = i / (n+1)
– pi = (i - 0.375) / (n+0.25)
• Estimate of the percentile | Normal
– Standardized Q(pi) = NORMSINV(pi)
– Q(pi) = NORMINV(pi, mean, stand. dev.)
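A sketch of this construction in Python (assuming scipy; norm.ppf plays the role of NORMSINV, and the pi = (i - 0.375)/(n + 0.25) plotting position is used as one of the alternatives listed above):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(seed=1)
Y = np.sort(rng.standard_normal(50))   # ordered data Y(1) <= ... <= Y(n)
n = len(Y)
i = np.arange(1, n + 1)

p = (i - 0.375) / (n + 0.25)           # estimated cumulative relative frequency
q = norm.ppf(p)                        # standardized percentile | Normal (NORMSINV)

# plotting Y against q should be roughly a straight line if the data are normal
print(np.corrcoef(q, Y)[0, 1])
```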
Non-Normal Populations
[Figure: normal probability plots (Data vs. Expected | Normal) for a flat and a skewed population]
Test of Normality - 2
Test Statistics
• Stand. Dev.:
$$\hat s = \sqrt{\frac{1}{n}\sum_{t=1}^{n}\left(Y_t - \bar Y\right)^{2}}$$
• Skewness:
$$\hat S = \frac{1}{n}\sum_{t=1}^{n}\left(\frac{Y_t - \bar Y}{\hat s}\right)^{3}$$
• Kurtosis:
$$\hat K = \frac{1}{n}\sum_{t=1}^{n}\left(\frac{Y_t - \bar Y}{\hat s}\right)^{4}$$
The Jarque-Bera Test
If the population is normal and the data are random,
then:
$$JB = \frac{n}{6}\left(\hat S^{2} + \frac{(\hat K - 3)^{2}}{4}\right)$$

follows approximately a χ² distribution with 2 degrees of freedom.
Reject H0 if JB > 6
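A numpy sketch that implements the skewness, kurtosis, and JB formulas above, with scipy's built-in test printed for comparison (the simulated data are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
Y = rng.standard_normal(50)
n = len(Y)

s_hat = np.sqrt(np.mean((Y - Y.mean()) ** 2))   # std. dev. with divisor n
Z = (Y - Y.mean()) / s_hat
S = np.mean(Z ** 3)                             # skewness
K = np.mean(Z ** 4)                             # kurtosis

JB = n / 6 * (S ** 2 + (K - 3) ** 2 / 4)
print(JB, JB > 6)                               # True => reject normality at roughly 5%
print(stats.jarque_bera(Y))                     # scipy's version for comparison
```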
Forecasting Random Series
• Given the data Y1, ..., Yn, the one-step-ahead 95% interval forecast of Yn+1 is

$$\bar Y \pm (\text{t-coeff})\, s\,\sqrt{1 + \frac{1}{n}} \quad\text{or approximately}\quad \bar Y \pm (\text{z-coeff})\, s$$
Forecasting a Random Series
• If it is determined that Yt is iid N(m, s):
a) The best point forecast of Yt = E(Yt) = m
b) A 95% interval forecast of Yt =
(m – 1.96 s, m + 1.96 s)
for all t (an important long-run implication of a
stationary series).
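A small numpy/scipy sketch of these forecasts, with m and s estimated by the sample mean and standard deviation (illustrative data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
Y = 10 + 2 * rng.standard_normal(50)        # illustrative data Y1, ..., Yn
n = len(Y)
Ybar, s = Y.mean(), Y.std(ddof=1)

point = Ybar                                # best point forecast of Y(n+1)

t = stats.t.ppf(0.975, df=n - 1)            # t-coefficient for a 95% interval
half = t * s * np.sqrt(1 + 1 / n)
exact = (Ybar - half, Ybar + half)          # exact 95% interval forecast

approx = (Ybar - 1.96 * s, Ybar + 1.96 * s) # approximate (z-based) interval
print(point, exact, approx)
```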
The Sampling Distribution
of the von Neumann Ratio
The vN Ratio | H0 follows an approximate normal
with:
Expected Value of v: E(v) = 2
Standard Error of v:
$$SE(v) = \sqrt{\frac{4(n-2)}{n^{2}-1}}$$
Appendix:
The von Neumann Ratio
• Definition:

$$v = \frac{\sum_{t=2}^{n}\left(Y_t - Y_{t-1}\right)^{2}}{(n-1)\, s_Y^{2}}$$
The von Neumann ratio of the regression residuals is the Durbin-Watson statistic.
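A numpy sketch of the von Neumann ratio and the normal approximation above (illustrative series; s_Y² is taken as the sample variance with divisor n - 1):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
Y = rng.standard_normal(50)                 # illustrative series
n = len(Y)

num = np.sum(np.diff(Y) ** 2)               # sum of (Y_t - Y_{t-1})^2, t = 2, ..., n
v = num / ((n - 1) * Y.var(ddof=1))         # von Neumann ratio

se_v = np.sqrt(4 * (n - 2) / (n ** 2 - 1))  # SE of v under randomness
z = (v - 2) / se_v                          # compare with N(0, 1)
print(v, z, abs(z) > 1.96)                  # True => reject randomness
```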