Testing Time Reversibility of Markov Processes
Efstathios Paparoditis
University of Cyprus, Department of Mathematics and Statistics
P.O.Box 20537
CY-1678 Nicosia, Cyprus
E-mail: [email protected]
Dimitris N. Politis
University of California, San Diego, Department of Mathematics
La Jolla, CA 92093-0112, USA
E-mail: [email protected]
A stationary process $\{X_t,\; t \in \mathbb{Z}\}$ is said to be time reversible if, for every $n \in \mathbb{N}$ and all $t_1, t_2, \ldots, t_n$, the random vectors $(X_{t_1}, X_{t_2}, \ldots, X_{t_n})$ and $(X_{-t_1}, X_{-t_2}, \ldots, X_{-t_n})$ have the same joint probability distribution. In the following we assume that $\{X_t\}$ is a geometrically ergodic $p$th-order Markov process. For $t \in \mathbb{Z}$ and $l \in \mathbb{N}$ we denote by $Y_{t,l}$ the random vector $(X_t, X_{t-1}, \ldots, X_{t-l+1})'$.

Let $\theta = (\theta_1, \theta_2, \ldots, \theta_{p+1})' \in \mathbb{R}^{p+1}$ and let $\varphi(\theta)$ be the characteristic function of $Y_{t,p+1}$, given by $\varphi(\theta) = E(\exp\{i\,\theta' Y_{t,p+1}\})$. If $\{X_t\}$ is time reversible then $\varphi(\theta) = \varphi(\tilde{\theta})$ for every $\theta \in \mathbb{R}^{p+1}$, where here and in the sequel, for every (not necessarily random) vector $x = (x_1, x_2, \ldots, x_l)'$, $\tilde{x}$ denotes the vector with elements appearing in reversed order, i.e., $\tilde{x} = (x_l, x_{l-1}, \ldots, x_1)'$. Now, let $\theta_m = e_1 - e_{m+1}$, where $e_l$ is the $(p+1)$-dimensional vector with one in the $l$th position and zeros elsewhere. Time reversibility of $\{X_t\}$ then implies that $\varphi(\theta_m)$ is real-valued, i.e., that $E(\sin(\theta_m' Y_{t,p+1})) = 0$ for every $m = 1, 2, \ldots, p$. Given observations $X_1, X_2, \ldots, X_n$, one way to test the null hypothesis of time reversibility is to use the statistic
$$T_n = n\, \hat{S}'\, \Sigma_p^{-1} \hat{S},$$

where $\hat{S} = (\hat{S}_1, \hat{S}_2, \ldots, \hat{S}_p)'$, $\hat{S}_m = (n-p)^{-1} \sum_{t=p+1}^{n} \sin(X_t - X_{t-m})$ for $m = 1, 2, \ldots, p$, $\Sigma_p = \lim_{n \to \infty} n \operatorname{Var}(\hat{S}) = \sum_{h=-\infty}^{\infty} \Gamma(h)$, and $\Gamma(h)$ is the lag-$h$ covariance matrix of the $p$-dimensional process $\{V_t = (V_{1,t}, V_{2,t}, \ldots, V_{p,t})'\}$, the $m$th component of which is given by $V_{m,t} = \sin(X_t - X_{t-m})$. That is, $\Gamma(h) = (\gamma_{r,c}(h))_{r,c=1,2,\ldots,p}$, where $\gamma_{r,c}(h) = E\big[(V_{r,t} - E(V_{r,t}))(V_{c,t+h} - E(V_{c,t+h}))\big]$.
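As a concrete illustration, the components $V_{m,t}$ and the statistics $\hat{S}_m$ can be computed directly from a sample path. The following NumPy sketch (the function name and array layout are our own) builds the $(n-p) \times p$ matrix of $V_{m,t}$ values and averages its columns:

```python
import numpy as np

def reversibility_components(x, p):
    """Build V_{m,t} = sin(X_t - X_{t-m}) for m = 1..p, and the statistics
    S_hat_m = (n - p)^{-1} * sum_{t=p+1}^{n} sin(X_t - X_{t-m}).
    `x` is a 1-d array of length n; rows of V correspond to t = p+1, ..., n."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Column m-1 holds sin(X_t - X_{t-m}) for t = p+1..n (0-indexed slices)
    V = np.column_stack([np.sin(x[p:] - x[p - m:n - m]) for m in range(1, p + 1)])
    S_hat = V.mean(axis=0)  # componentwise average over the n - p terms
    return V, S_hat
```

Under the null hypothesis each component of `S_hat` has mean zero, which is what the test statistic below exploits.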
To perform the test, and because
$$\sum_{h=-\infty}^{\infty} \Gamma(h) = 2\pi f(0),$$
where $f(0)$ is the spectral density matrix of $\{V_t\}$ at frequency zero, the unknown covariance matrix $\Sigma_p$ can be replaced by the estimator
$$\hat{\Sigma}_p = \sum_{h=-N+1}^{N-1} \lambda_N(h)\, \hat{\Gamma}(h),$$
where $\hat{\Gamma}(h) = (\hat{\gamma}_{rc}(h))_{r,c=1,2,\ldots,p}$, $\hat{\gamma}_{rc}(h) = N^{-1} \sum_{t=1}^{N-h} (V_{r,t} - \bar{V}_r)(V_{c,t+h} - \bar{V}_c)$, and $\bar{V}_m = N^{-1} \sum_{t=m+1}^{N} V_{m,t}$ for $m \in \{1, 2, \ldots, p\}$. Furthermore, $\lambda_N(h) = K(h/M)$ is a lag window with $M \in \mathbb{N}$ a truncation parameter and $K$ a continuous even function satisfying $K(0) = 1$, $\int K(u)\,du = 1$, $\int K^2(u)\,du < \infty$, and $K(u) = 0$ for $|u| > 1$.
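A minimal Python sketch of this lag-window estimator, using the Bartlett window as one admissible choice of $K$ (the paper fixes only the conditions on $K$, not a particular kernel) and ordinary column means for centering:

```python
import numpy as np

def bartlett_kernel(u):
    """An admissible K: continuous, even, K(0) = 1, vanishing for |u| > 1."""
    u = np.abs(np.asarray(u, dtype=float))
    return np.where(u <= 1.0, 1.0 - u, 0.0)

def sigma_hat(V, M, kernel=bartlett_kernel):
    """Lag-window estimator: sum over h of K(h/M) * Gamma_hat(h), where
    Gamma_hat(h) is the sample lag-h autocovariance matrix of the rows of V.
    (Centering with ordinary column means is our slight simplification of
    the paper's bar-V_m.)"""
    N, p = V.shape
    Vc = V - V.mean(axis=0)
    sigma = np.zeros((p, p))
    for h in range(0, min(M, N - 1) + 1):     # K(h/M) = 0 beyond |h| = M
        w = float(kernel(h / M))
        G = Vc[: N - h].T @ Vc[h:] / N        # Gamma_hat(h)
        sigma += w * G if h == 0 else w * (G + G.T)  # Gamma_hat(-h) = Gamma_hat(h)'
    return sigma
```

The Bartlett choice also guarantees a positive semidefinite estimate, which keeps the quadratic form in the test statistic nonnegative.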
Using a central limit theorem for strongly mixing sequences of random variables and Slutsky's theorem, it can be shown that, under some appropriate conditions, $\mathcal{L}(T_n) \Rightarrow \chi^2_p$ as $M, n \to \infty$, provided $M/n \to 0$. Here $\mathcal{L}(X)$ denotes the law of a random variable $X$ and `$\Rightarrow$' denotes weak convergence.

Apart from this asymptotic result, however, the quality of the $\chi^2$-approximation might be poor in finite-sample situations, due to the nonparametric nature of the estimator $\hat{\Sigma}_p$ used and its dependence on the "smoothing parameter" $M$.
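Putting the pieces together, the test statistic can be assembled in a few lines of self-contained Python; the Bartlett window and ordinary-mean centering are again our illustrative simplifications, and the result is compared against $\chi^2_p$ quantiles:

```python
import numpy as np

def tr_statistic(x, p, M):
    """T_n = n * S_hat' Sigma_hat_p^{-1} S_hat for the time-reversibility test.
    Illustrative choices (not fixed by the paper): Bartlett lag window
    K(u) = max(0, 1 - |u|) and ordinary column means for centering."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # V_{m,t} = sin(X_t - X_{t-m}); rows t = p+1..n, columns m = 1..p
    V = np.column_stack([np.sin(x[p:] - x[p - m:n - m]) for m in range(1, p + 1)])
    S = V.mean(axis=0)                        # S_hat
    N = V.shape[0]
    Vc = V - V.mean(axis=0)
    sigma = np.zeros((p, p))
    for h in range(0, min(M, N - 1) + 1):     # lag-window estimate of Sigma_p
        w = max(0.0, 1.0 - h / M)
        G = Vc[: N - h].T @ Vc[h:] / N
        sigma += w * G if h == 0 else w * (G + G.T)
    return n * S @ np.linalg.solve(sigma, S)  # compare with chi^2_p quantiles
```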
As an alternative to this large-sample approximation we propose the following procedure, which is based on the time reversible local bootstrap algorithm. This algorithm generates, through the following two steps and in a fully nonparametric way, a pseudoseries $X_1^*, X_2^*, \ldots, X_n^*$ which correctly imitates the Markov dependence structure of the observations.

1. Select a resampling width $b = b(n) > 0$, a resampling kernel $W$, and a set of starting values $X_1^*, X_2^*, \ldots, X_p^*$. $W$ is an everywhere positive probability density on $\mathbb{R}^p$ with $\int u\, W(u)\,du = 0$ and $\int u_i^2\, W(u)\,du < \infty$ for every $i \in \{1, 2, \ldots, p\}$.
2. For any $t + 1 \in \{p+1, p+2, \ldots, n\}$, suppose that $Y_{t,p}^* = (X_t^*, X_{t-1}^*, \ldots, X_{t-p+1}^*)'$ has been generated already. Let $J_1$, $J_2$ and $I(Y_{t,p}^*)$ be three random variables defined as follows:

(i) $J_1$ and $J_2$ are discrete random variables taking values in the sets $N_{p+1,n-1} = \{p+1, p+2, \ldots, n-1\}$ and $N_{2,n-p} = \{2, 3, \ldots, n-p\}$ respectively, with probability mass functions given by
$$P(J_1 = s) = \frac{W_b(Y_{t,p}^* - Y_{s,p})}{\sum_{l \in N_{p+1,n-1}} W_b(Y_{t,p}^* - Y_{l,p})} \quad\text{and}\quad P(J_2 = s) = \frac{W_b(Y_{t,p}^* - \widetilde{Y}_{s,p})}{\sum_{l \in N_{2,n-p}} W_b(Y_{t,p}^* - \widetilde{Y}_{l,p})}.$$
Here $Y_{s,p}$ denotes the set of $p$ past values $(X_s, X_{s-1}, \ldots, X_{s-p+1})'$ and $\widetilde{Y}_{s,p}$ denotes the set of $p$ future values $(X_s, X_{s+1}, \ldots, X_{s+p-1})'$.

(ii) $I(Y_{t,p}^*)$ is a Bernoulli random variable with probability of success given by $\hat{c}(Y_{t,p}^*)$, where
$$\hat{c}(Y_{t,p}^*) = \frac{\sum_{l \in N_{p+1,n-1}} W_b(Y_{t,p}^* - Y_{l,p})}{\sum_{l \in N_{p+1,n-1}} W_b(Y_{t,p}^* - Y_{l,p}) + \sum_{l \in N_{2,n-p}} W_b(Y_{t,p}^* - \widetilde{Y}_{l,p})}.$$

The bootstrap replicate $X_{t+1}^*$ is then defined by $X_{t+1}^* = X_{J_1+1}$ if $I(Y_{t,p}^*) = 1$ and $X_{t+1}^* = X_{J_2-1}$ if $I(Y_{t,p}^*) = 0$.
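The two steps above can be sketched in Python as follows. The (unnormalized) Gaussian product kernel for $W_b$ and the choice of the first $p$ observations as starting values are our own illustrative assumptions; the paper only requires a positive density with mean zero and finite second moments:

```python
import numpy as np

def trl_bootstrap(x, p, b, rng=None):
    """Time reversible local bootstrap: generate a pseudo-series of length n
    by resampling forward (via J1, past blocks) or backward (via J2, future
    blocks) according to the Bernoulli variable I(Y*_{t,p})."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)

    def w_b(diff):  # Gaussian product kernel applied row-wise (illustrative W)
        return np.exp(-0.5 * np.sum((diff / b) ** 2, axis=1))

    # Past blocks Y_{s,p} = (X_s, ..., X_{s-p+1}) for s in {p+1, ..., n-1}
    past_idx = np.arange(p, n - 1)                 # 0-indexed s
    past = np.column_stack([x[past_idx - m] for m in range(p)])
    # Future blocks Y~_{s,p} = (X_s, ..., X_{s+p-1}) for s in {2, ..., n-p}
    fut_idx = np.arange(1, n - p)
    fut = np.column_stack([x[fut_idx + m] for m in range(p)])

    xs = list(x[:p])                               # starting values X*_1..X*_p
    for t in range(p, n):
        y = np.array(xs[-1:-p - 1:-1])             # Y*_{t,p}, most recent first
        wp, wf = w_b(past - y), w_b(fut - y)
        c_hat = wp.sum() / (wp.sum() + wf.sum())   # success prob. of I(Y*_{t,p})
        if rng.random() < c_hat:                   # forward: X*_{t+1} = X_{J1+1}
            j = rng.choice(past_idx, p=wp / wp.sum())
            xs.append(x[j + 1])
        else:                                      # backward: X*_{t+1} = X_{J2-1}
            j = rng.choice(fut_idx, p=wf / wf.sum())
            xs.append(x[j - 1])
    return np.array(xs)
```

Note that every generated value is an actual observation of the original series, so the pseudo-series lives on the observed sample.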
Given a series $X_1^*, X_2^*, \ldots, X_n^*$ generated according to the above time reversible local bootstrap algorithm, let $\hat{S}^* = (\hat{S}_1^*, \hat{S}_2^*, \ldots, \hat{S}_p^*)'$, where $\hat{S}_m^* = (n-p)^{-1} \sum_{t=p+1}^{n} \sin(X_t^* - X_{t-m}^*)$, and let $\hat{\Sigma}_p^* = \sum_{h=-N+1}^{N-1} \lambda_N(h)\, \hat{\Gamma}^*(h)$, where $\hat{\Gamma}^*(h) = (\hat{\gamma}_{r,c}^*(h))_{r,c=1,2,\ldots,p}$ with $\hat{\gamma}_{r,c}^*(h) = N^{-1} \sum_{t=1}^{N-h} (V_{r,t}^* - \bar{V}_r^*)(V_{c,t+h}^* - \bar{V}_c^*)$, $\bar{V}_m^* = N^{-1} \sum_{t=m+1}^{N} V_{m,t}^*$ and $V_{m,t}^* = \sin(X_t^* - X_{t-m}^*)$. Under some appropriate conditions, and if $b \to 0$ at an appropriate rate as $n \to \infty$, it can be shown that
$$d_0\big(\mathcal{L}(T_n^* \mid X_1, X_2, \ldots, X_n),\, \mathcal{L}(T_n)\big) \to 0$$
in probability, where $T_n^* = n\, \hat{S}^{*\prime}\, \hat{\Sigma}_p^{*-1} \hat{S}^*$ is the bootstrap analogue of $T_n$ and $d_0$ denotes Kolmogorov's distance, defined by $d_0(P, Q) = \sup_{x \in \mathbb{R}} |P(X \le x) - Q(X \le x)|$.
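In practice this consistency result is used to calibrate the test by Monte Carlo: generate $B$ pseudo-series, recompute the statistic on each, and reject when the observed $T_n$ is extreme relative to the bootstrap distribution. A generic Python sketch of that scheme (the function names and the $(B+1)^{-1}$ p-value convention are our own; the paper itself only states the consistency of the approximation):

```python
import numpy as np

def bootstrap_pvalue(x, statistic, resample, B=500, rng=None):
    """Approximate the null law L(T_n* | X_1..X_n) with B bootstrap
    pseudo-series and return the fraction of bootstrap statistics at least
    as large as the observed one.  `statistic` maps a series to T_n;
    `resample` maps (series, rng) to a pseudo-series, e.g. the time
    reversible local bootstrap above."""
    rng = np.random.default_rng(rng)
    t_obs = statistic(x)
    t_star = np.array([statistic(resample(x, rng)) for _ in range(B)])
    return (1 + np.sum(t_star >= t_obs)) / (B + 1)
```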