Stable Process 3: Stable Random Processes and Stochastic Integrals
August, 2006
1. Stable stochastic processes.
2. Definition of stable integrals as a stochastic process.
3. α-stable random measures.
4. Constructive definition of stable integrals.
5. Properties of stable integrals.
6. Examples.
7. Sub-Gaussian process and sub-stable process.
8. Series representation for α-stable random measures.
9. Condition S.
10. Poisson representation.
1 Stable stochastic processes
The finite-dimensional distributions of a stochastic process {X(t), t ∈ T} are the distributions of the vectors
(X(t1), ..., X(td)), t1, ..., td ∈ T, d ≥ 1.
Definition A stochastic process {X(t), t ∈ T} is (strictly/symmetric) stable if all its finite-dimensional distributions are (strictly/symmetric) stable. By consistency, they must all have the same index of stability α, and we use the term α-stable stochastic process here.
Theorem 1.1 Let {X(t), t ∈ T } be a stochastic process.
(a) {X(t), t ∈ T } is strictly stable if and only if all linear combinations are strictly stable.
(b) {X(t), t ∈ T } is symmetric stable if and only if all linear combinations are symmetric stable.
(c) {X(t), t ∈ T } is α-stable if and only if all linear combinations are α-stable, provided α ≥ 1.
Example α-stable Lévy motion. A stochastic process {X(t), t ≥ 0} is called (standard) α-stable Lévy motion if
(1) X(0) = 0 a.s.
Stable Process 3
Zhi Ouyang
August, 2006
(2) X has independent increments.
(3) X(t) − X(s) ∼ Sα((t − s)^{1/α}, β, 0) for any 0 ≤ s < t < ∞ and for some 0 < α ≤ 2, −1 ≤ β ≤ 1.
NOTE: It is Brownian motion when α = 2. The role that α-stable Lévy motion plays among α-stable processes is similar to the role that Brownian motion plays among Gaussian processes.
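As an aside, the increments can be simulated directly: by property (3), the increment over (s, t] is (t − s)^{1/α} times an independent standard α-stable variate. Below is a minimal sketch for the symmetric case β = 0, using the Chambers-Mallows-Stuck method; the function names are illustrative, not from the notes.

```python
import math
import random

def sas_sample(alpha, rng):
    """One standard SaS variate via the Chambers-Mallows-Stuck method (beta = 0)."""
    v = rng.uniform(-math.pi / 2, math.pi / 2)  # uniform angle
    w = rng.expovariate(1.0)                    # unit-mean exponential
    if alpha == 1.0:
        return math.tan(v)                      # standard Cauchy
    return (math.sin(alpha * v) / math.cos(v) ** (1.0 / alpha)
            * (math.cos(v - alpha * v) / w) ** ((1.0 - alpha) / alpha))

def levy_motion_path(alpha, times, rng):
    """Sample SaS Levy motion at increasing times: the increment over (s, t]
    is (t - s)^{1/alpha} times an independent standard SaS variate."""
    path, x, prev = [], 0.0, 0.0
    for t in times:
        x += (t - prev) ** (1.0 / alpha) * sas_sample(alpha, rng)
        path.append(x)
        prev = t
    return path

rng = random.Random(0)
path = levy_motion_path(1.5, [i / 100.0 for i in range(1, 101)], rng)
```

The skewed case β ≠ 0 needs the general (two-parameter) version of the same method.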
2 Definition of stable integrals as a stochastic process
Suppose f is a non-random function defined on a set E, and I(·) is an operator which can be applied to f. The goal is to connect the collection {I(f), f ∈ F} with a stochastic process. That is to say, each element I(f) should be treated as a random variable, and the stochastic process is indexed by f ∈ F.
Billingsley (1986) shows that, in order to define a stochastic process, one only needs to define the finite-dimensional distributions and show that they are consistent, according to Kolmogorov's existence theorem.
2.1 Construction
Suppose (E, E, m) is a measurable space with m σ-finite, and β : E → [−1, 1] is a measurable function. When α ≠ 1, define F = L^α(E, E, m); when α = 1, define
\[ F = F(m, \beta) = \Big\{ f \in L^1(E, \mathcal{E}, m) : \int_E \big| f(x)\,\beta(x)\,\ln|f(x)| \big| \, m(dx) < \infty \Big\}. \]
Note that F is a vector space.
For f1, ..., fd ∈ F, associate (I(f1), ..., I(fd)) with a probability measure Pf on R^d via the characteristic function
\[
\varphi_f(\theta) =
\begin{cases}
\exp\Big\{ -\displaystyle\int_E \Big|\sum_{j=1}^d \theta_j f_j(x)\Big|^\alpha \Big(1 - i\,\beta(x)\,\mathrm{sign}\Big(\sum_{j=1}^d \theta_j f_j(x)\Big) \tan\frac{\pi\alpha}{2}\Big)\, m(dx) \Big\}, & \alpha \ne 1, \\[10pt]
\exp\Big\{ -\displaystyle\int_E \Big|\sum_{j=1}^d \theta_j f_j(x)\Big| \Big(1 + i\,\frac{2}{\pi}\,\beta(x)\,\mathrm{sign}\Big(\sum_{j=1}^d \theta_j f_j(x)\Big) \ln\Big|\sum_{j=1}^d \theta_j f_j(x)\Big|\Big)\, m(dx) \Big\}, & \alpha = 1.
\end{cases}
\]
One can show that this is indeed the characteristic function of an α-stable vector. Notice that m is not the spectral measure, since E is not necessarily the sphere S_d. Nevertheless, one can make transformations such that m becomes Γ and E becomes S_d. For example, when α ≠ 1, on the set E+ = {x ∈ E : Σ_{j=1}^d f_j(x)² > 0}, transform the vector f(x) to g(x) := f(x)/‖f(x)‖; meanwhile, m is transformed to m1 = ‖f(x)‖^α m. Since f ∈ L^α, m1 is a finite measure.
Next, transform x to g(x) and, meanwhile, convert m1 to Γ. This requires splitting the integrand into two components, such as
\[
\Big(1 - i\,\mathrm{sign}\Big(\sum_{j=1}^d \theta_j g_j(x)\Big)\tan\frac{\pi\alpha}{2}\Big)\,\frac{1-\beta}{2}
+ \Big(1 + i\,\mathrm{sign}\Big(\sum_{j=1}^d \theta_j g_j(x)\Big)\tan\frac{\pi\alpha}{2}\Big)\,\frac{1+\beta}{2},
\]
and making different transformations in the two components.
The case α = 1 is a bit more subtle, but similar.
To verify the consistency of Pf, note that for any permutation π we have
\[ \varphi_{f_{\pi(1)},\dots,f_{\pi(d)}}(\theta_{\pi(1)},\dots,\theta_{\pi(d)}) = \varphi_{f_1,\dots,f_d}(\theta_1,\dots,\theta_d), \]
and for any n ≤ d,
\[ \varphi_{f_1,\dots,f_n}(\theta_1,\dots,\theta_n) = \varphi_{f_1,\dots,f_d}(\theta_1,\dots,\theta_n,0,\dots,0). \]
By Kolmogorov's existence theorem, there is a stochastic process, which we denote {I(f); f ∈ F}, whose finite-dimensional distributions are given in this section. I(f) is called the α-stable integral of f; we shall see why it is called an integral in later sections. The measure m is called the control measure, and the function β is called the skewness intensity.
2.2 Properties
Proposition 2.1 For any integrable f1, ..., fd, the integrals I(f1), ..., I(fd) are jointly α-stable. They are jointly SαS if the skewness intensity is zero.
Proposition 2.2 Let f be integrable. Then I(f) ∼ Sα(σf, βf, μf), where
\[
\sigma_f = \Big( \int_E |f(x)|^\alpha\, m(dx) \Big)^{1/\alpha}, \qquad
\beta_f = \frac{\int_E f(x)^{\langle\alpha\rangle}\,\beta(x)\, m(dx)}{\int_E |f(x)|^\alpha\, m(dx)},
\]
\[
\mu_f = \begin{cases} 0 & \text{if } \alpha \ne 1; \\[2pt] -\dfrac{2}{\pi}\displaystyle\int_E f(x)\,\beta(x)\ln|f(x)|\, m(dx) & \text{if } \alpha = 1. \end{cases}
\]
Here $a^{\langle p\rangle} = |a|^p\,\mathrm{sign}(a)$ denotes the signed power.
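The parameters in Proposition 2.2 can be checked numerically for a concrete choice. The sketch below (illustrative, not from the notes) takes E = [0, 1], m Lebesgue, β(x) ≡ 1, f(x) = x and α = 1.5, so that σf = (∫₀¹ x^{3/2} dx)^{2/3} = 0.4^{2/3}, βf = 1 and μf = 0.

```python
alpha = 1.5
n = 100000
h = 1.0 / n
xs = [(k + 0.5) * h for k in range(n)]   # midpoint grid on E = [0, 1]

f = lambda x: x            # integrand
beta = lambda x: 1.0       # skewness intensity

denom = sum(abs(f(x)) ** alpha for x in xs) * h          # int_E |f|^alpha dm
sigma_f = denom ** (1.0 / alpha)
# signed power f^{<alpha>} = |f|^alpha * sign(f); here f >= 0, so it equals |f|^alpha
beta_f = sum(abs(f(x)) ** alpha * beta(x) for x in xs) * h / denom
mu_f = 0.0                 # alpha != 1
```

The same quadrature works for any integrable f; only the α = 1 case needs the extra logarithmic integral for μf.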
Proposition 2.3 I is a linear operator on F, i.e. if f1 and f2 are integrable, then
\[ I(a_1 f_1 + a_2 f_2) = a_1 I(f_1) + a_2 I(f_2) \quad \text{a.s.} \]
for any real numbers a1 and a2.
3 α-stable random measures
In order to view I(f) as an integral, we first need to define a measure to serve as the integrator. A random measure is best viewed as a stochastic process M(·) indexed by sets A. Its two major characteristics are that (M(A1), ..., M(An)) is a random vector and that M(∪Ai) = Σ M(Ai) a.s. if the Ai are disjoint.
Denote by (Ω, F, P) the underlying probability space, and by L⁰(Ω) the set of all real random variables on this probability space. (E, E, m) is a measurable space, and β : E → [−1, 1] is a measurable function. Let E0 be the subset of E consisting of all sets of finite m-measure.
Definition An independently scattered, σ-additive set function
M : E0 → L⁰(Ω)
such that for each A ∈ E0,
\[ M(A) \sim S_\alpha\Big( m(A)^{1/\alpha},\; \frac{\int_A \beta(x)\, m(dx)}{m(A)},\; 0 \Big) \]
is called an α-stable random measure on (E, E) with control measure m and skewness intensity β.
Independently scattered means that the random variables M(Aj) are independent for disjoint Aj in E0. σ-additive means that, in addition, if ∪_{j=1}^∞ Aj ∈ E0, then
\[ M\Big(\bigcup_{j=1}^\infty A_j\Big) = \sum_{j=1}^\infty M(A_j) \quad \text{a.s.} \]
Since the shift parameter is always 0, the random measure is characterized by m and β only.
Existence is easy to show from the previous section. First, we have shown the existence of a stochastic process {I(f); f ∈ F}; setting M(A) = I(1_A) for A ∈ E0 yields the existence of a stochastic process {M(A); A ∈ E0}. Its finite-dimensional distributions can be characterized via:
Proposition 3.1 For A1, ..., Ad ∈ E0 and θ1, ..., θd ∈ R, we have
\[ \sum_{j=1}^d \theta_j M(A_j) \sim S_\alpha(\sigma, \beta, \mu), \tag{1} \]
where
\[ \sigma = \Big( \int_E \Big|\sum_{j=1}^d \theta_j 1_{A_j}(x)\Big|^\alpha m(dx) \Big)^{1/\alpha}, \qquad
\beta = \frac{\int_E \big(\sum_{j=1}^d \theta_j 1_{A_j}(x)\big)^{\langle\alpha\rangle}\, \beta(x)\, m(dx)}{\int_E \big|\sum_{j=1}^d \theta_j 1_{A_j}(x)\big|^\alpha m(dx)}, \]
\[ \mu = \begin{cases} 0 & \text{if } \alpha \ne 1, \\[2pt] -\dfrac{2}{\pi} \displaystyle\int_E \sum_{j=1}^d \theta_j 1_{A_j}(x)\, \beta(x) \ln\Big|\sum_{j=1}^d \theta_j 1_{A_j}(x)\Big|\, m(dx) & \text{if } \alpha = 1. \end{cases} \]
The rest is to show that M is independently scattered (via characteristic functions) and σ-additive.
Proposition 3.2 M is an SαS random measure if the skewness intensity β = 0.
Example Let M be an α-stable random measure on ([0, ∞), B) with Lebesgue control measure and constant skewness intensity β(x) = β, 0 ≤ x < ∞. Let
X(t) = M([0, t]), 0 ≤ t < ∞.
M([0, t]) is well defined since m([0, t]) = t is finite for any t < ∞.
The stochastic process {X(t), 0 ≤ t < ∞} is an α-stable Lévy motion.
4 Constructive definition of stable integrals
Given the definition of the random measure, we can show that I(f) is a bona fide integral, which we denote ∫_E f(x) M(dx). First, we approximate this integral by ∫_E f^{(n)}(x) M(dx), where the f^{(n)} are simple functions approximating f; then we show that the limit is the desired integral.
Let M be an α-stable random measure on (E, E) with control measure m and skewness intensity β, and let E0 be the collection of all sets of finite m-measure in E.
4.1 Simple functions
Write a simple function as f(x) = Σ_{j=1}^n c_j 1_{A_j}(x) with disjoint Aj ∈ E0, and define
\[ I(f) = \int_E f(x)\, M(dx) = \sum_{j=1}^n c_j M(A_j). \tag{2} \]
From the definition of the random measure, we know that the M(Aj) are independent α-stable random variables with parameters
\[ S_\alpha\Big( m(A_j)^{1/\alpha},\; \frac{\int_{A_j} \beta(x)\, m(dx)}{m(A_j)},\; 0 \Big). \]
A linear combination of these random variables yields another α-stable random variable with parameters Sα(σf, βf, μf), where
\[ \sigma_f = \Big( \int_E |f(x)|^\alpha m(dx) \Big)^{1/\alpha}, \tag{3} \]
\[ \beta_f = \frac{\int_E \beta(x)\, (f(x))^{\langle\alpha\rangle}\, m(dx)}{\int_E |f(x)|^\alpha m(dx)}, \tag{4} \]
\[ \mu_f = \begin{cases} 0 & \text{if } \alpha \ne 1, \\[2pt] -\dfrac{2}{\pi} \displaystyle\int_E f(x)\,\beta(x)\ln|f(x)|\, m(dx) & \text{if } \alpha = 1. \end{cases} \tag{5} \]
The operator I(·) is linear on the subspace of all simple functions.
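For a simple function, the parameters (3)-(5) reduce to finite sums, e.g. σf^α = Σ_j |c_j|^α m(Aj). A small numerical sketch (the particular values are chosen only for illustration):

```python
import math

alpha, beta0 = 1.5, 0.5
# f = 2 on A1 with m(A1) = 1, and f = -3 on A2 with m(A2) = 3
cs = [2.0, -3.0]
ms = [1.0, 3.0]

signed = lambda c: math.copysign(abs(c) ** alpha, c)          # c^{<alpha>}

scale_a = sum(abs(c) ** alpha * m for c, m in zip(cs, ms))    # sigma_f^alpha
sigma_f = scale_a ** (1.0 / alpha)
# constant skewness intensity beta(x) = beta0 factors out of (4)
beta_f = beta0 * sum(signed(c) * m for c, m in zip(cs, ms)) / scale_a
```

The signed power makes the negative piece of f pull βf below zero, as expected.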
4.2 General functions
In general, one can always find simple functions f^{(n)} that approximate f, i.e.
\[ f^{(n)}(x) \to f(x) \quad \text{for almost every } x \in E, \tag{6} \]
\[ |f^{(n)}(x)| \le \theta(x) \quad \text{for all } n, x, \text{ and some } \theta \in F. \tag{7} \]
Then the sequence of integrals I(f^{(n)}) converges in probability, and I(f) is defined as
\[ I(f) = \operatorname{plim}_{n\to\infty} I(f^{(n)}), \]
where plim denotes the limit in probability. A check that the definition does not depend on the choice of approximating sequence shows that it is well defined.
Since convergence in probability implies convergence in distribution, the characteristic function of I(f) is
\[ \varphi_f(\theta) = \begin{cases}
\exp\Big\{ -\displaystyle\int_E |\theta f(x)|^\alpha \Big(1 - i\,\beta(x)\,\mathrm{sign}\big(\theta f(x)\big) \tan\frac{\pi\alpha}{2}\Big)\, m(dx) \Big\}, & \alpha \ne 1, \\[10pt]
\exp\Big\{ -\displaystyle\int_E |\theta f(x)| \Big(1 + i\,\frac{2}{\pi}\,\beta(x)\,\mathrm{sign}\big(\theta f(x)\big) \ln|\theta f(x)|\Big)\, m(dx) \Big\}, & \alpha = 1.
\end{cases} \]
Once the linearity of I(·) is established, the characteristic function of the vector (I(f1), ..., I(fd)) can be derived, and it is not difficult to see that this construction is equivalent to the one described in Section 2.
5 Properties of stable integrals
Proposition 5.1 Let Xj = ∫_E fj(x) M(dx), j = 1, 2, ..., and X = ∫_E f(x) M(dx), where M is an α-stable random measure with control measure m and skewness intensity β. Then
plim_{j→∞} Xj = X
if and only if
\[ \lim_{j\to\infty} \int_E |f_j(x) - f(x)|^\alpha\, m(dx) = 0, \tag{8} \]
and, in the case α = 1, also
\[ \lim_{j\to\infty} \int_E \big(f_j(x) - f(x)\big)\, \beta(x)\, \ln|f_j(x) - f(x)|\, m(dx) = 0. \tag{9} \]
Proposition 5.2 Let Xj = ∫_E fj(x) M(dx), j = 1, 2, where M is an SαS random measure with 1 < α ≤ 2 and control measure m. Then the covariation is
\[ [X_1, X_2]_\alpha = \int_E f_1(x)\, f_2(x)^{\langle\alpha-1\rangle}\, m(dx). \tag{10} \]
Theorem 5.3 Let Xj = ∫_E fj(x) M(dx), j = 1, 2, be two α-stable random integrals with respect to an α-stable random measure with control measure m and 0 < α < 2. Then X1 and X2 are independent if and only if
f1(x) f2(x) = 0, m-a.e. on E.
HINT: Using characteristic functions, independence is equivalent to saying that, for all θi,
\[ \int_E |\theta_1 f_1(x) + \theta_2 f_2(x)|^\alpha m(dx) = |\theta_1|^\alpha \int_E |f_1(x)|^\alpha m(dx) + |\theta_2|^\alpha \int_E |f_2(x)|^\alpha m(dx). \]
NOTE: This theorem is not true when α = 2. For example, let (R, B, dx) be the underlying probability space, E = [−1, 1], E = B([−1, 1]), and let M be a 2-stable random measure with control measure m equal to Lebesgue measure and zero skewness intensity. Take f1(x) = x and f2(x) = 1 for x ∈ [−1, 1]. The joint distribution of (I(f1), I(f2)) has characteristic function exp{−½(4/3 θ1² + 4θ2²)}. Hence I(f1) and I(f2) are independent (normal), but m{supp(f1) ∩ supp(f2)} = 2 > 0.
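The counterexample can be verified numerically: the Gaussian cross term ∫_{−1}^{1} f1 f2 dx vanishes, so the joint characteristic function factors, while the supports overlap on a set of measure 2. A quick quadrature check (a sketch, not part of the notes):

```python
n = 200000
h = 2.0 / n
xs = [-1.0 + (k + 0.5) * h for k in range(n)]   # midpoint grid on E = [-1, 1]

f1 = lambda x: x
f2 = lambda x: 1.0

cross = sum(f1(x) * f2(x) for x in xs) * h      # Gaussian cross term -> 0 by symmetry
q1 = sum(f1(x) ** 2 for x in xs) * h            # int f1^2 dm = 2/3
q2 = sum(f2(x) ** 2 for x in xs) * h            # int f2^2 dm = 2
overlap = sum(h for x in xs if f1(x) * f2(x) != 0.0)   # m(supp f1 ∩ supp f2) = 2
```

Since the exponent of the joint characteristic function is q1 θ1² + q2 θ2² with no θ1 θ2 term, the two Gaussian integrals are independent despite the overlapping supports.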
Corollary 5.4 Let Xj = ∫_E fj(x) M(dx), j = 1, ..., d, be jointly α-stable. They are independent if and only if they are pairwise independent, i.e., if and only if f_{k1}(x) f_{k2}(x) = 0 m-a.e. on E for any subset {k1, k2} of {1, ..., d}.
Proposition 5.5 (Change of variables) Let M_m and M_ν be either two α-stable random measures with 0 < α ≤ 2 and α ≠ 1, or two S1S random measures, with identical skewness intensity and with control measures m and ν, respectively, satisfying
\[ \frac{m(dx)}{\nu(dx)} = (r(x))^\alpha, \quad x \in E, \]
where r(x) ≥ 0. Then
\[ \int_E f(x)\, M_m(dx) \stackrel{\text{dist}}{=} \int_E f(x)\, r(x)\, M_\nu(dx) \tag{11} \]
for any f ∈ L^α(E, m).
HINT: Since these are all α-stable distributions, we only need to show that their scale, skewness, and shift parameters are the same; recall (3).
NOTE: This change of variables does not hold if α = 1 and M_m is not symmetric.
Definition We say a process {X(t), t ∈ T} has the representation {Y(t), t ∈ T} if X =ᵈ Y, i.e. X and Y have the same finite-dimensional distributions.
From the α-stable random variable perspective, a spectral measure Γ on S_d and a shift vector μ⁰ determine the distribution. From the α-stable random measure perspective, we need a measurable space (E, E) on which a random measure M is defined; for a d-dimensional vector, we also need d functions fj in L^α(E, m) to define strictly stable random variables I(fj), and adding a shift vector η in R^d then yields an α-stable random vector:
\[ (X_1, \dots, X_d) \stackrel{\text{dist}}{=} \Big( \int_E f_1(x)\, M(dx), \dots, \int_E f_d(x)\, M(dx) \Big) + \eta. \tag{12} \]
Theorem 5.6 (Representation theorem in R^d)
Let X = (X1, ..., Xd) be an α-stable random vector in R^d.
(i) Let ‖·‖ be an arbitrary norm on R^d, S_d^{‖·‖} = {s : ‖s‖ = 1} the corresponding unit sphere, and Γ_{‖·‖}, μ⁰_{‖·‖} the corresponding spectral measure and shift vector. The relation (12) always holds with:
(E, E) = (S_d^{‖·‖}, the Borel σ-algebra on S_d^{‖·‖}),
M having control measure m = Γ_{‖·‖} and skewness intensity β(·) = 1,
fj : S_d^{‖·‖} → R defined by fj(s1, ..., sd) = sj, j = 1, ..., d,
η = μ⁰_{‖·‖},
so that
\[ X = (X_1, \dots, X_d) \stackrel{\text{dist}}{=} \Big( \int_{S_d^{\|\cdot\|}} s_1\, M(ds), \int_{S_d^{\|\cdot\|}} s_2\, M(ds), \dots, \int_{S_d^{\|\cdot\|}} s_d\, M(ds) \Big) + \mu^0_{\|\cdot\|}. \tag{13} \]
(ii) If X is SαS, then there exist bounded measurable functions fj such that
\[ X \stackrel{\text{dist}}{=} \Big( \int_0^1 f_1(x)\, M(dx), \int_0^1 f_2(x)\, M(dx), \dots, \int_0^1 f_d(x)\, M(dx) \Big), \]
where M is an SαS random measure with control measure m(dx) = dx.
(iii) If X is strictly α-stable with α ≠ 1, then there exist bounded measurable functions fj such that
\[ X \stackrel{\text{dist}}{=} \Big( \int_0^1 f_1(x)\, M(dx), \int_0^1 f_2(x)\, M(dx), \dots, \int_0^1 f_d(x)\, M(dx) \Big), \]
where M is an α-stable random measure with control measure m(dx) = dx and skewness intensity β(x) = 1.
(iv) If α = 1, then there exist bounded measurable functions fj such that
\[ X \stackrel{\text{dist}}{=} \Big( \int_0^1 f_1(x)\, M(dx), \int_0^1 f_2(x)\, M(dx), \dots, \int_0^1 f_d(x)\, M(dx) \Big) + \eta, \]
where M is a 1-stable random measure with control measure m(dx) = dx and skewness intensity β(x) = 1, and where η = (η1, ..., ηd) with
\[ \eta_j = \frac{1}{\pi} \int_0^1 f_j(x) \ln\Big( \sum_{k=1}^d f_k(x)^2 \Big)\, dx + \mu^0_j, \quad j = 1, \dots, d. \]
HINT: Recall equation (3), which proves (i). The key to the remaining parts can be viewed as a sort of transformation. This integral representation holds not only for α-stable random vectors, but also for most α-stable stochastic processes; in that case, one may have to replace the common ([0, 1], B([0, 1]), dx) by some general measurable space (E, E, m).
NOTE: Proposition 3.2 shows that we can construct SαS random vectors via symmetric α-stable random measures. However, we can also construct SαS random vectors via non-symmetric α-stable random measures. For example,
\[ X_k = \int_{-d}^{d} \big( 1_{[k-1,k]}(x) - 1_{[-k,-k+1]}(x) \big)\, M(dx), \quad k = 1, \dots, d, \]
where E = [−d, d] and M is an α-stable random measure with Lebesgue control measure and skewness intensity β = 1.
Corollary 5.7 Let X1 , . . . , Xd be jointly α-stable. If they are pairwise independent, then they are
independent.
6 Examples
Example SαS Lévy motion.
Let X(t) = M([0, t]) = ∫₀^∞ 1{x ≤ t} M(dx) = ∫₀ᵗ M(dx), t ≥ 0, where M is SαS on [0, ∞) with control measure m(dx) = dx. Then
X(0) = 0 a.s.,
\[ X(t) - X(s) = \int_s^t M(dx) = M([s, t]) \sim S_\alpha\big(|t - s|^{1/\alpha}, 0, 0\big). \]
If 0 ≤ t1 < t2 < ... < tn, then
\[ \big( X(t_2) - X(t_1), \dots, X(t_n) - X(t_{n-1}) \big) = \Big( \int_{t_1}^{t_2} M(dx), \dots, \int_{t_{n-1}}^{t_n} M(dx) \Big), \]
and these increments are independent because the intervals are disjoint. To summarize, {X(t), t ≥ 0} is a process that starts at 0 and has stationary, independent increments and SαS finite-dimensional distributions. Hence it is an SαS Lévy motion.
Example Moving average.
Let f be a measurable function on R¹ satisfying ∫_{−∞}^∞ |f(x)|^α dx < ∞, 0 < α ≤ 2, and define
\[ X(t) = \int_{-\infty}^{\infty} f(t - x)\, M(dx), \quad t \in \mathbb{R}^1, \tag{14} \]
where M is SαS with Lebesgue control measure. A process X(·) of the form (14) is called an SαS moving average process. X(·) is stationary, since for any t1, ..., td, h ∈ R and real θ1, ..., θd,
\[ \Big\| \sum_{k=1}^d \theta_k X(t_k + h) \Big\|_\alpha^\alpha = \int_{-\infty}^{\infty} \Big| \sum_{k=1}^d \theta_k f(t_k + h - x) \Big|^\alpha dx = \int_{-\infty}^{\infty} \Big| \sum_{k=1}^d \theta_k f(t_k - x) \Big|^\alpha dx = \Big\| \sum_{k=1}^d \theta_k X(t_k) \Big\|_\alpha^\alpha. \]
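The stationarity computation can be spot-checked numerically for a particular kernel. The sketch below (an illustration under stated assumptions: kernel f(x) = e^{−x} 1_{[0,∞)}(x), α = 1.5, two time points, a finite truncation window) evaluates ∫ |Σ_k θ_k f(t_k + h − x)|^α dx for two values of h by midpoint quadrature:

```python
import math

alpha = 1.5
f = lambda x: math.exp(-x) if x >= 0.0 else 0.0   # a concrete kernel in L^alpha
thetas, ts = [1.0, -2.0], [0.3, 0.9]

def norm_alpha_pow(h, lo=-20.0, hi=20.0, n=200000):
    """Midpoint quadrature of int |sum_k theta_k f(t_k + h - x)|^alpha dx."""
    step = (hi - lo) / n
    total = 0.0
    for k in range(n):
        x = lo + (k + 0.5) * step
        s = sum(th * f(t + h - x) for th, t in zip(thetas, ts))
        total += abs(s) ** alpha
    return total * step

a0, a1 = norm_alpha_pow(0.0), norm_alpha_pow(0.7)
```

Up to discretization and truncation error, the two values agree, reflecting the shift invariance of Lebesgue measure.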
Example Ornstein-Uhlenbeck process.
Let λ > 0 and let M be an SαS random measure, 0 < α ≤ 2, with Lebesgue control measure. The process
\[ X(t) = \int_{-\infty}^{t} e^{-\lambda(t-x)}\, M(dx), \quad t \in \mathbb{R}^1, \]
is called an Ornstein-Uhlenbeck process. Note that it is a moving average process with
f(x) = e^{−λx} 1_{[0,∞)}(x), x ∈ R¹;
hence it is well defined and stationary.
Note that for any fixed s < t, we have
\[ X(t) - e^{-\lambda(t-s)} X(s) = \int_s^t e^{-\lambda(t-x)}\, M(dx), \tag{15} \]
which is an SαS random variable independent of any linear combination Σ_{k=1}^d b_k X(u_k), u_k ≤ s, k = 1, ..., d. Therefore X(t) − e^{−λ(t−s)} X(s) is independent of σ(X(u), u ≤ s), implying that the Ornstein-Uhlenbeck process is Markovian.
Example Reverse Ornstein-Uhlenbeck process.
Let λ > 0 and let M be an SαS random measure, 0 < α ≤ 2, with Lebesgue control measure. The process
\[ X(t) = \int_t^{\infty} e^{-\lambda(x-t)}\, M(dx), \quad t \in \mathbb{R}^1, \]
is called a reverse Ornstein-Uhlenbeck process. Note that for any fixed s > t, we have
\[ X(t) - e^{-\lambda(s-t)} X(s) = \int_t^s e^{-\lambda(x-t)}\, M(dx). \tag{16} \]
Hence the reverse Ornstein-Uhlenbeck process is also Markovian.
Recall that in the Gaussian case α = 2, all stationary Markov processes are Ornstein-Uhlenbeck, meaning that when α = 2 the Ornstein-Uhlenbeck process and the reverse Ornstein-Uhlenbeck process are the same process. However, they are different when 0 < α < 2.
Example Well-balanced linear fractional stable motion.
Let M be SαS, 0 < α ≤ 2, with Lebesgue control measure, and let
\[ X(t) = \int_{-\infty}^{\infty} \Big( |t - x|^{H - 1/\alpha} - |x|^{H - 1/\alpha} \Big)\, M(dx), \quad t \in \mathbb{R}^1, \]
where 0 < H < 1 and H ≠ 1/α. The process is called the well-balanced linear fractional stable motion, and it has two major properties.
• It is self-similar with index H, or H-ss, i.e.
\[ \big( X(ct_1), \dots, X(ct_d) \big) \stackrel{\text{dist}}{=} \big( c^H X(t_1), \dots, c^H X(t_d) \big). \]
• It has stationary increments, i.e.
\[ \{X(t) - X(0), \, t \in \mathbb{R}\} \stackrel{\text{dist}}{=} \{X(t + \tau) - X(\tau), \, t \in \mathbb{R}\}. \]
Example Log-fractional stable motion.
It looks like the well-balanced linear fractional stable motion in the limit as H goes to 1/α, with the power function replaced by the log function. Assuming 1 < α ≤ 2 (which is needed for the kernel to belong to L^α), the process
\[ X(t) = \int_{-\infty}^{\infty} \big( \ln|t - x| - \ln|x| \big)\, M(dx), \quad t \in \mathbb{R}^1, \]
is called the (symmetric) log-fractional stable motion. It is self-similar with index 1/α and has stationary increments.
NOTE: When α = 2, the log-fractional stable motion and the well-balanced linear fractional stable motion become one stochastic process, called fractional Brownian motion.
7 Sub-stable processes

7.1 Sub-Gaussian processes
Recall that if {G(t), t ∈ T} is a Gaussian process, A ∼ S_{α/2}((cos(πα/4))^{2/α}, 1, 0) with α < 2, and A is independent of {G(t), t ∈ T}, then {X(t) = A^{1/2} G(t), t ∈ T} is a sub-Gaussian process with underlying Gaussian process {G(t), t ∈ T}. {X(t)} is SαS and has sub-Gaussian SαS finite-dimensional distributions.
Let (Ω, F, P) be the underlying probability space on which {G(t)} is defined. Note that in this case E = Ω and E = F in the previous notation. Write {G(t), t ∈ T} = {G(t, ω), t ∈ T, ω ∈ Ω} to make the dependence on Ω explicit; to avoid confusion, use x instead of ω. Let M be an SαS random measure on (Ω, F) with control measure P.
Proposition 7.1 The sub-Gaussian process {X(t) = A^{1/2} G(t), t ∈ T} has the representation
\[ \{X(t), t \in T\} \stackrel{\text{dist}}{=} \Big\{ \frac{1}{d_\alpha \sqrt{2}} \int_\Omega G(t, x)\, M(dx), \; t \in T \Big\}, \tag{17} \]
where d_α = (E|Z|^α)^{1/α} with Z ∼ N(0, 1).
Proof For any d ≥ 1, t1, ..., td ∈ T, and θ1, ..., θd ∈ R,
\[ E \exp\Big\{ i \sum_{j=1}^d \theta_j \frac{1}{d_\alpha\sqrt 2} \int_\Omega G(t_j, x)\, M(dx) \Big\} = \exp\Big\{ - d_\alpha^{-\alpha} \int_\Omega \Big| \frac{1}{\sqrt 2} \sum_{j=1}^d \theta_j G(t_j, x) \Big|^\alpha P(dx) \Big\} \]
[ notice that G is Gaussian, and E|Y|^α = d_α^α (EY²)^{α/2} for centered Gaussian Y ]
\[ = \exp\Big\{ - \Big( \int_\Omega \Big( \frac{1}{\sqrt 2} \sum_{j=1}^d \theta_j G(t_j, x) \Big)^2 P(dx) \Big)^{\alpha/2} \Big\} \]
[ expand ]
\[ = \exp\Big\{ - \Big( \frac{1}{2} \sum_{j=1}^d \sum_{k=1}^d \theta_j \theta_k\, E\big[G(t_j) G(t_k)\big] \Big)^{\alpha/2} \Big\} \]
[ notice X = A^{1/2} G ]
\[ = E \exp\Big\{ i \sum_{j=1}^d \theta_j X(t_j) \Big\}. \]
7.2 Sub-α-stable processes
To generalize the result, let {Y(t), t ∈ T} be an Sα′S process with α′ < 2. For fixed 0 < α < α′, let
\[ A \sim S_{\alpha/\alpha'}\Big( \Big(\cos\frac{\pi\alpha}{2\alpha'}\Big)^{\alpha'/\alpha}, \, 1, \, 0 \Big). \]
Definition The process
\[ X(t) = A^{1/\alpha'}\, Y(t), \quad t \in T, \tag{18} \]
is SαS, and it is called a sub-stable process, or sub-α-stable process, with underlying stable process {Y(t), t ∈ T}.
Similarly, let M be an SαS random measure on (Ω, F) with control measure P.
Proposition 7.2 The sub-stable process {X(t) = A^{1/α′} Y(t), t ∈ T} has the representation
\[ \{X(t), t \in T\} \stackrel{\text{dist}}{=} \Big\{ d_{\alpha,\alpha'}^{-1} \int_\Omega Y(t, x)\, M(dx), \; t \in T \Big\}, \tag{19} \]
where d_{p,α′} = (E|Z|^p)^{1/p} with p = α and Z ∼ Sα′(1, 0, 0).
NOTE: When the random measure M is not symmetric, the process {X(t) = ∫_Ω W(t) M(dx), t ∈ T} is still well defined when the corresponding W(t) has a finite αth moment. However, X(t) no longer has the simple form A^p W(t).
8 Series representation for α-stable random measures
Generalizing the corresponding result for one-dimensional α-stable random variables indicates that α-stable random measures have an essentially "discrete" structure.
Let M be an α-stable random measure on (E, E) with finite control measure m and skewness intensity β(·). Denote by m̂ the standardized control measure m/m(E). Let {Γ1, Γ2, ...} be the sequence of arrival times of a Poisson process with unit arrival rate, and let {(V1, γ1), (V2, γ2), ...} be a sequence of i.i.d. random vectors, independent of {Γi}, such that Vi has distribution m̂ on E and
\[ P(\gamma_i = 1 \mid V_i) = 1 - P(\gamma_i = -1 \mid V_i) = \frac{1 + \beta(V_i)}{2}. \]
Theorem 8.1 The series representation for M is
\[ \{M(A), A \in \mathcal{E}\} \stackrel{\text{dist}}{=} \Big\{ \big(C_\alpha\, m(E)\big)^{1/\alpha} \sum_{i=1}^{\infty} \Big[ \gamma_i \Gamma_i^{-1/\alpha}\, 1(V_i \in A) - b_i^{(\alpha)} \int_A \beta(x)\, \hat m(dx) \Big] + \eta_A, \; A \in \mathcal{E} \Big\}, \tag{20} \]
where Cα is the constant governing the tail behavior, i.e.
\[ C_\alpha = \Big( \int_0^\infty x^{-\alpha} \sin x \, dx \Big)^{-1} = \begin{cases} \dfrac{1-\alpha}{\Gamma(2-\alpha)\cos(\pi\alpha/2)}, & \text{if } \alpha \ne 1; \\[6pt] 2/\pi, & \text{if } \alpha = 1, \end{cases} \]
and
\[ b_i^{(\alpha)} = \begin{cases} 0 & \text{if } 0 < \alpha < 1, \\[4pt] \displaystyle\int_{1/i}^{1/(i-1)} x^{-2} \sin x \, dx & \text{if } \alpha = 1, \\[8pt] \dfrac{\alpha}{\alpha-1} \Big( i^{(\alpha-1)/\alpha} - (i-1)^{(\alpha-1)/\alpha} \Big) & \text{if } 1 < \alpha < 2, \end{cases} \qquad
\eta_A = \begin{cases} 0 & \text{if } \alpha \ne 1, \\[4pt] \dfrac{2}{\pi} \ln\Big( \dfrac{2}{\pi}\, m(E) \Big) \displaystyle\int_A \beta(x)\, m(dx) & \text{if } \alpha = 1. \end{cases} \]
Equation (20) shows that M is decomposed into two components: a random component, consisting of signed random point masses γi Γi^{−1/α} placed at random points Vi, and a non-random component proportional to the signed measure m_s(dx) = β(x) m(dx). The non-random component is absent when 0 < α < 1, and it is equal to zero when M is SαS.
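The two closed forms of Cα are consistent, since Γ(2 − α) = (1 − α)Γ(1 − α), and the α ≠ 1 expression tends to 2/π as α → 1. A quick numerical check (a sketch; the helper names are illustrative):

```python
import math

def C(alpha):
    """C_alpha = (1 - alpha) / (Gamma(2 - alpha) * cos(pi * alpha / 2)), alpha != 1."""
    return (1.0 - alpha) / (math.gamma(2.0 - alpha) * math.cos(math.pi * alpha / 2.0))

def C_alt(alpha):
    """Equivalent form 1 / (Gamma(1 - alpha) * cos(pi * alpha / 2)) for 0 < alpha < 1,
    using Gamma(2 - alpha) = (1 - alpha) * Gamma(1 - alpha)."""
    return 1.0 / (math.gamma(1.0 - alpha) * math.cos(math.pi * alpha / 2.0))
```

Evaluating C just to either side of α = 1 recovers the stated value 2/π, so the piecewise definition is continuous in α.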
Theorem 8.2 For any integrable function f ∈ F, we have I(f) =ᵈ S(f), where S(f) is an a.s. convergent random series defined as follows:
(i) When 0 < α < 1,
\[ S(f) = \big(C_\alpha\, m(E)\big)^{1/\alpha} \sum_{i=1}^{\infty} \gamma_i \Gamma_i^{-1/\alpha} f(V_i). \]
(ii) When α = 1,
\[ S(f) = \frac{2}{\pi}\, m(E) \sum_{i=1}^{\infty} \Big[ \gamma_i \Gamma_i^{-1} f(V_i) - b_i^{(1)} \int_E f(x)\beta(x)\, \hat m(dx) \Big] + \eta_f, \]
where
\[ \eta_f = \frac{2}{\pi} \ln\Big( \frac{2}{\pi}\, m(E) \Big) \int_E f(x)\beta(x)\, m(dx). \tag{21} \]
(iii) When 1 < α < 2,
\[ S(f) = \big(C_\alpha\, m(E)\big)^{1/\alpha} \sum_{i=1}^{\infty} \Big[ \gamma_i \Gamma_i^{-1/\alpha} f(V_i) - b_i^{(\alpha)} \int_E f(x)\beta(x)\, \hat m(dx) \Big]. \]
We refer to S(f) as the series representation of the stable stochastic integral I(f). This definition is equivalent to the previous definitions of the α-stable integral of f.
Corollary 8.3 If fi ∈ F, then
\[ \big( I(f_1), \dots, I(f_d) \big) \stackrel{\text{dist}}{=} \big( S(f_1), \dots, S(f_d) \big). \]
Example Lévy α-stable motion.
Let {X(t), 0 ≤ t ≤ 1} be standard Lévy α-stable motion on [0, 1], with X(1) ∼ Sα(1, β, 0). We have shown that
\[ \{X(t), 0 \le t \le 1\} \stackrel{\text{dist}}{=} \Big\{ \int_0^1 1_{[0,t]}(x)\, M(dx), \; 0 \le t \le 1 \Big\}, \]
where M is an α-stable random measure on ([0, 1], B) with Lebesgue control measure and constant skewness intensity β(x) = β.
Hence, for 0 < α < 1,
\[ \{X(t), 0 \le t \le 1\} \stackrel{\text{dist}}{=} \Big\{ C_\alpha^{1/\alpha} \sum_{i=1}^{\infty} \gamma_i \Gamma_i^{-1/\alpha}\, 1(U_i \le t), \; 0 \le t \le 1 \Big\}; \]
for α = 1,
\[ \{X(t), 0 \le t \le 1\} \stackrel{\text{dist}}{=} \Big\{ \frac{2}{\pi} \sum_{i=1}^{\infty} \Big[ \gamma_i \Gamma_i^{-1}\, 1(U_i \le t) - \beta t\, b_i^{(1)} \Big] + \frac{2}{\pi}\, \beta t \ln\frac{2}{\pi}, \; 0 \le t \le 1 \Big\}; \]
and for 1 < α < 2,
\[ \{X(t), 0 \le t \le 1\} \stackrel{\text{dist}}{=} \Big\{ C_\alpha^{1/\alpha} \sum_{i=1}^{\infty} \Big[ \gamma_i \Gamma_i^{-1/\alpha}\, 1(U_i \le t) - \beta t\, b_i^{(\alpha)} \Big], \; 0 \le t \le 1 \Big\}, \]
where {γi} is a sequence of i.i.d. random variables satisfying
\[ P(\gamma_i = 1) = 1 - P(\gamma_i = -1) = \frac{1 + \beta}{2}, \]
{Γi} is the sequence of arrival times of a Poisson process with unit arrival rate, and {Ui} is a sequence of i.i.d. random variables uniformly distributed on [0, 1]. These three sequences are independent of one another.
This means that, when 0 < α < 1, the α-stable Lévy motion (and, when 1 < α < 2, the SαS Lévy motion) can be regarded as a pure jump process. The instants Ui of the jumps are uniformly distributed in [0, 1]; the jumps are up with probability (1 + β)/2 and down with probability (1 − β)/2; and the jump heights, in decreasing order, are distributed as the −1/α power of the arrival times of a Poisson process with unit rate.
Non-symmetric α-stable Lévy motion with 1 ≤ α < 2 can be regarded as a pure jump process with a linear deterministic trend.
Corollary 8.4 In the symmetric case, i.e. if M is SαS, 0 < α < 2, then for any f ∈ F,
\[ I(f) \stackrel{\text{dist}}{=} \big(C_\alpha\, m(E)\big)^{1/\alpha} \sum_{i=1}^{\infty} \varepsilon_i \Gamma_i^{-1/\alpha} f(V_i), \]
where {εi} is a fair-coin (Rademacher) Bernoulli sequence independent of {Γi} and {Vi}.
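Corollary 8.4 can be tried out directly: with m(E) = 1 and f ≡ 1 the series reduces to the classical LePage series Σ_i ε_i Γ_i^{−1/α}, which converges a.s. for 0 < α < 2 (absolutely when α < 1). A simulation sketch (function name illustrative; α = 0.8 chosen so the convergence is absolute):

```python
import math
import random

def lepage_partial_sum(alpha, n_terms, rng):
    """Partial sum of sum_i eps_i * Gamma_i^{-1/alpha}, with Rademacher signs eps_i
    and unit-rate Poisson arrival times Gamma_i."""
    gamma_i, total, last = 0.0, 0.0, 0.0
    for _ in range(n_terms):
        gamma_i += rng.expovariate(1.0)        # next arrival time Gamma_i
        eps = rng.choice([-1.0, 1.0])          # fair sign
        last = eps * gamma_i ** (-1.0 / alpha)
        total += last
    return total, abs(last)

rng = random.Random(42)
value, last_term = lepage_partial_sum(0.8, 20000, rng)
```

Since Γ_i ≈ i by the law of large numbers, the terms decay like i^{−1/α}, which is why truncating the series gives a usable approximate sample.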
9 Condition S
The Rademacher sequence {εi} in Corollary 8.4 can be replaced by the sequence {d_α^{−1} Gi}, where the Gi are i.i.d. standard normal random variables and d_α = (E|G1|^α)^{1/α}, so that E|d_α^{−1} G1|^α = 1. Notice that
\[ d_\alpha^\alpha = 2^{\alpha/2}\, \pi^{-1/2}\, \Gamma\Big(\frac{\alpha+1}{2}\Big); \]
then denote
\[ C'_\alpha = d_\alpha^{-\alpha} C_\alpha = 2^{-\alpha/2}\, \pi^{1/2}\, \Gamma\Big(\frac{\alpha+1}{2}\Big)^{-1} C_\alpha. \tag{22} \]
Then
\[ I(f) \stackrel{\text{dist}}{=} \big(C'_\alpha\, m(E)\big)^{1/\alpha} \sum_{i=1}^{\infty} G_i \Gamma_i^{-1/\alpha} f(V_i). \tag{23} \]
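The closed form for d_α^α can be checked against direct numerical integration of E|G|^α (a sketch, with α = 1.3 chosen arbitrarily):

```python
import math

alpha = 1.3
# closed form: E|G|^alpha = 2^{alpha/2} * Gamma((alpha + 1)/2) / sqrt(pi)
closed = 2.0 ** (alpha / 2.0) * math.gamma((alpha + 1.0) / 2.0) / math.sqrt(math.pi)

# midpoint quadrature of E|G|^alpha = int |x|^alpha * phi(x) dx over [-12, 12]
n, lo, hi = 400000, -12.0, 12.0
step = (hi - lo) / n
num = 0.0
for k in range(n):
    x = lo + (k + 0.5) * step
    num += abs(x) ** alpha * math.exp(-x * x / 2.0)
num *= step / math.sqrt(2.0 * math.pi)
```

The truncation at |x| = 12 is harmless because the Gaussian tail beyond it is negligible.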
For an SαS stochastic process represented as
\[ \Big\{ X(t) = \int_E f_t(x)\, M(dx), \; t \in T \Big\}, \]
Corollary 8.3 gives
\[ \{X(t), t \in T\} \stackrel{\text{dist}}{=} \Big\{ \big(C'_\alpha\, m(E)\big)^{1/\alpha} \sum_{i=1}^{\infty} G_i \Gamma_i^{-1/\alpha} f_t(V_i), \; t \in T \Big\}. \tag{24} \]
View the series on the right-hand side over the product of two probability spaces: (Ω1, F1, P1), on which the Gaussian sequence Gi is defined, and (Ω2, F2, P2), on which the Γi and Vi are defined. Hence, conditionally on the Γi and Vi, the series is a mixture of zero-mean Gaussian processes on the probability space (Ω1, F1, P1).
Proposition 9.1 Let M be an SαS random measure on (E, E) with a finite control measure m, and let f_t ∈ F. Then the SαS stochastic process {X(t) = ∫_E f_t(x) M(dx), t ∈ T} is conditionally centered Gaussian, i.e., it is a probability mixture of zero-mean Gaussian processes.
This indicates that, besides sub-Gaussian processes, many more SαS processes are conditionally centered Gaussian.
Definition Let {X(t), t ∈ T} be a stochastic process. It satisfies condition S if there is a countable subset T0 ⊆ T such that for every t ∈ T there exist tk ∈ T0 with X(t) = plim_{k→∞} X(tk).
NOTE: In condition S, the letter "S" stands for separability in probability. Every SαS process that satisfies condition S has the representation
\[ \{X(t), t \in T\} \stackrel{\text{dist}}{=} \Big\{ \int_E f_t(x)\, M(dx), \; t \in T \Big\}, \quad m \text{ finite}. \tag{25} \]
Moreover, we have the following chain of implications, each condition implying the one above it:
• X is conditionally centered Gaussian;
• X has a representation ∫_E f_t(x) M(dx) with m finite;
• X has a representation ∫_E g_t(x) M′(dx) with m′ σ-finite;
• X satisfies condition S;
• (T, d) is separable, and there is a countable T1 ⊆ T such that {X(t), t ∈ T\T1} is continuous in probability.
Example (Counterexample) Let {X(t), t ∈ [0, 1]} be i.i.d. (non-degenerate) SαS random variables. Then it certainly does not satisfy condition S, nor does it have a representation ∫_E f_t M(dx) with finite (or σ-finite) m. But it is conditionally centered Gaussian, since we can write X(t) = A(t)^{1/2} G(t), where the A(t) are i.i.d. totally skewed α/2-stable random variables and the G(t) are i.i.d. standard normal random variables independent of the A(t).
10 Poisson representation
Let M be an α-stable random measure on (E, E) with control measure m and skewness intensity β(·), and let f ∈ F. The goal of this section is to write I(f) in terms of an integral with respect to a Poisson random measure N on E × R0, where R0 = R\{0}.
Note that f ∈ F implies that m is σ-finite on {x : f(x) ≠ 0}, which is a very important condition here. In fact, since ∫_E |f(x)|^α m(dx) < ∞, the set E_i = {x : 2^{−i} < |f(x)| ≤ 2^{−i+1}} has finite m-measure for every integer i, and the union of these sets is {x ∈ E : f(x) ≠ 0}.
To define a Poisson random measure, recall the counterpart of the definition of an α-stable random measure.
Definition Let (S, S, n) be a measurable space, with S0 = {A ∈ S : n(A) < ∞}, and let (Ω, F, P) be an underlying probability space. A Poisson random measure N on (S, S, n) is an independently scattered, σ-additive set function
N : S0 → L⁰(Ω), A ↦ N(A) ∼ Po(n(A)).
That is to say, N(A) has the discrete distribution
\[ P\big(N(A) = k\big) = e^{-n(A)}\, \frac{n(A)^k}{k!}, \quad k = 0, 1, 2, \dots \]
n is called the control measure of N.
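A Poisson random measure with an atomless control measure can be realized as the counting measure of a Poisson point process, and σ-additivity then holds pathwise. A simulation sketch on S = [0, 1] with n(dx) = 50 dx (the construction via rescaled unit-rate arrival times is standard; names are illustrative):

```python
import random

rng = random.Random(7)
rate = 50.0   # control measure n(dx) = rate * dx on S = [0, 1]

# Arrival times of a unit-rate Poisson process on [0, rate], rescaled to [0, 1]:
points, t = [], 0.0
while True:
    t += rng.expovariate(1.0)
    if t >= rate:
        break
    points.append(t / rate)

def N(a, b):
    """N([a, b)): number of points of the process falling in [a, b)."""
    return sum(1 for p in points if a <= p < b)
```

For this realization, N([a, b)) ∼ Po(rate · (b − a)), counts over disjoint intervals are independent, and additivity over disjoint sets holds exactly for every sample path.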
In this context, set S = E × R0 and
\[ n(dx, du) = E\,N(dx, du) = \begin{cases} \big(1 + \beta(x)\big)\, m(dx)\, \dfrac{du}{|u|^{\alpha+1}}, & u > 0, \\[6pt] \big(1 - \beta(x)\big)\, m(dx)\, \dfrac{du}{|u|^{\alpha+1}}, & u < 0. \end{cases} \tag{26} \]