Bull. Korean Math. Soc. 38 (2001), No. 4, pp. 763–772
ON CONVERGENCE OF SERIES OF
INDEPENDENT RANDOM VARIABLES
Soo Hak Sung and Andrei I. Volodin
Abstract. The rate of convergence for an almost surely convergent series $S_n = \sum_{i=1}^{n} X_i$ of independent random variables is studied in this paper. More specifically, when $S_n$ converges almost surely to a random variable $S$, the tail series $T_n \equiv S - S_{n-1} = \sum_{i=n}^{\infty} X_i$ is a well-defined sequence of random variables with $T_n \to 0$ almost surely. Conditions are provided so that for a given positive sequence $\{b_n, n \ge 1\}$, the limit law $\sup_{k \ge n} |T_k|/b_n \xrightarrow{P} 0$ holds. This result generalizes a result of Nam and Rosalsky [4].
1. Introduction
Throughout this paper, $\{X_n, n \ge 1\}$ is a sequence of independent random variables defined on a probability space $(\Omega, \mathcal{F}, P)$. As usual, their partial sums will be denoted by $S_n = \sum_{i=1}^{n} X_i$, $n \ge 1$. If $S_n$ converges almost surely (a.s.) to a random variable $S$, then (set $S_0 = 0$)
$$T_n \equiv S - S_{n-1} = \sum_{i=n}^{\infty} X_i, \quad n \ge 1$$
is a well-defined sequence of random variables (referred to as the tail series) with
(1.1) $$T_n \to 0 \quad \text{a.s.}$$
Received April 6, 2001.
2000 Mathematics Subject Classification: 60F05, 60F15.
Key words and phrases: convergence in probability, tail series, independent random variables, weak law of large numbers, almost sure convergence.
The first author was supported by Korea Research Foundation Grant (KRF-2000-015-DP0049).
In this paper, we shall be concerned with the rate at which $S_n$ converges to $S$ or, equivalently, at which the tail series $T_n$ converges to $0$. Recalling that (1.1) is equivalent to
$$\sup_{k \ge n} |T_k| \xrightarrow{P} 0,$$
Nam and Rosalsky [4] proved the limit law
(1.2) $$\frac{\sup_{k \ge n} |T_k|}{b_n} \xrightarrow{P} 0$$
if $\{X_n, n \ge 1\}$ is a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and $\{b_n, n \ge 1\}$ is a sequence of positive constants such that
(1.3) $$\sum_{i=n}^{\infty} E|X_i|^p = o(b_n^p)$$
for some $1 < p \le 2$. Rosalsky and Rosenblatt [7] extended Nam and Rosalsky's result to Banach spaces of Rademacher type $p$, and Rosalsky and Rosenblatt [8] extended it to the setting of martingale differences. The main purpose of this paper is to generalize Nam and Rosalsky's result. More specifically, we will provide conditions more general than (1.3) under which the limit law (1.2) holds.
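To make the limit law (1.2) concrete before turning to the lemmas, the following Monte Carlo sketch simulates the tail series for one hypothetical choice of the ingredients. The choices $X_i = Y_i/i^2$ with $Y_i$ i.i.d. standard normal, $p = 2$, $b_n = 1/n$, and the truncation level $N$ are assumptions made for illustration only, not taken from the paper; for this choice $\sum_{i=n}^{\infty} E|X_i|^2 \sim n^{-3}/3 = o(b_n^2)$, so (1.3) holds, and the empirical probability that $\sup_{k \ge n} |T_k|/b_n$ exceeds a fixed threshold should shrink as $n$ grows.

```python
import numpy as np

# Monte Carlo sketch of the limit law (1.2).  The concrete choices below
# (X_i = Y_i / i**2 with Y_i i.i.d. N(0,1), p = 2, b_n = 1/n, and the
# truncation level N) are assumptions made for illustration only.
rng = np.random.default_rng(0)
N, reps, eps = 5000, 300, 0.1
i = np.arange(1, N + 1)

for n in (10, 100, 1000):
    exceed = 0
    for _ in range(reps):
        X = rng.standard_normal(N) / i**2              # X_i = Y_i / i^2
        T = np.cumsum(X[::-1])[::-1]                   # T_k ~ sum_{j=k}^{N} X_j
        exceed += np.max(np.abs(T[n - 1:])) * n > eps  # divide by b_n = 1/n
    print(f"n = {n:4d}: empirical P(sup_k>=n |T_k|/b_n > {eps}) = {exceed / reps:.3f}")
```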
2. Preliminary lemmas
Some lemmas are needed to establish the main results. The first
lemma is due to von Bahr and Esseen [1].
Lemma 1 (von Bahr and Esseen [1]). Let $X_1, \cdots, X_n$ be random variables such that $E\{X_{m+1} \mid S_m\} = 0$ for $0 \le m \le n-1$, where $S_0 = 0$ and $S_m = \sum_{i=1}^{m} X_i$ for $1 \le m \le n$. Then
$$E|S_n|^p \le 2 \sum_{i=1}^{n} E|X_i|^p \quad \text{for all } 1 \le p \le 2.$$
Note that Lemma 1 holds when $X_1, \cdots, X_n$ are independent random variables with $EX_m = 0$ for $1 \le m \le n$. Nam, Rosalsky, and Volodin [6] proved the following maximal inequality for the tail series using Etemadi's [3] maximal inequality for the partial sums. Furthermore, the maximal inequalities for the partial sums and the tail sums hold for Banach space valued random variables.
Lemma 2 (Nam, Rosalsky, and Volodin [6]). Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables such that $\sum_{n=1}^{\infty} X_n$ converges a.s. Then, setting $T_n = \sum_{i=n}^{\infty} X_i$, $n \ge 1$,
$$P\Big\{\sup_{k \ge n} |T_k| > \varepsilon\Big\} \le 4 \sup_{k \ge n} P\Big\{|T_k| > \frac{\varepsilon}{4}\Big\}, \quad \varepsilon > 0.$$
Lemma 3. Let $\{X_i, n \le i \le N\}$ be a sequence of independent random variables with $EX_i = 0$, $n \le i \le N$. Let $\{g_i(x), n \le i \le N\}$ be a sequence of functions defined on $[0, \infty)$ such that
(2.1) $$0 \le g_i(0) \le g_i(x), \quad \frac{g_i(x)}{x} \uparrow, \quad \frac{g_i(x)}{x^p} \downarrow \ \text{ on } (0, \infty), \quad n \le i \le N$$
for some $1 \le p \le 2$. Let $\{b_i, n \le i \le N\}$ be a sequence of positive constants. Then for $\varepsilon > 0$
$$P\Big\{\Big|\sum_{i=n}^{N} \frac{X_i}{b_i}\Big| > \varepsilon\Big\} \le \Big(\frac{2^{2p+1}}{\varepsilon^p} + \frac{2^2}{\varepsilon}\Big) \sum_{i=n}^{N} \frac{E g_i(|X_i|)}{g_i(b_i)}.$$
Proof. Define $X_i' = X_i I(|X_i| \le b_i)$ and $X_i'' = X_i I(|X_i| > b_i)$ for $n \le i \le N$. Noting that $EX_i = 0$ for $n \le i \le N$, it follows that
$$\sum_{i=n}^{N} \frac{X_i}{b_i} = \sum_{i=n}^{N} \frac{X_i' - EX_i'}{b_i} + \sum_{i=n}^{N} \frac{X_i'' - EX_i''}{b_i}.$$
By Markov's inequality and Lemma 1, we have
(2.2) $$\begin{aligned}
P\Big\{\Big|\sum_{i=n}^{N} \frac{X_i}{b_i}\Big| > \varepsilon\Big\}
&\le P\Big\{\Big|\sum_{i=n}^{N} \frac{X_i' - EX_i'}{b_i}\Big| > \frac{\varepsilon}{2}\Big\} + P\Big\{\Big|\sum_{i=n}^{N} \frac{X_i'' - EX_i''}{b_i}\Big| > \frac{\varepsilon}{2}\Big\} \\
&\le \frac{2^p}{\varepsilon^p} E\Big|\sum_{i=n}^{N} \frac{X_i' - EX_i'}{b_i}\Big|^p + \frac{2}{\varepsilon} E\Big|\sum_{i=n}^{N} \frac{X_i'' - EX_i''}{b_i}\Big| \\
&\le \frac{2^{p+1}}{\varepsilon^p} \sum_{i=n}^{N} E\Big|\frac{X_i' - EX_i'}{b_i}\Big|^p + \frac{2}{\varepsilon} \sum_{i=n}^{N} E\Big|\frac{X_i'' - EX_i''}{b_i}\Big| \\
&\le \frac{2^{2p+1}}{\varepsilon^p} \sum_{i=n}^{N} \frac{E|X_i'|^p}{b_i^p} + \frac{2^2}{\varepsilon} \sum_{i=n}^{N} \frac{E|X_i''|}{b_i}.
\end{aligned}$$
It follows from (2.1) that on the set $\{x : |x| \le b_i\}$ we have $|x|^p/b_i^p \le g_i(|x|)/g_i(b_i)$. Thus we have
(2.3) $$\sum_{i=n}^{N} \frac{E|X_i'|^p}{b_i^p} \le \sum_{i=n}^{N} \frac{E g_i(|X_i'|)}{g_i(b_i)} \le \sum_{i=n}^{N} \frac{E g_i(|X_i|)}{g_i(b_i)}.$$
On the set $\{x : |x| > b_i\}$, we have $|x|/b_i \le g_i(|x|)/g_i(b_i)$. Thus we get
(2.4) $$\sum_{i=n}^{N} \frac{E|X_i''|}{b_i} \le \sum_{i=n}^{N} \frac{E g_i(|X_i''|)}{g_i(b_i)} \le \sum_{i=n}^{N} \frac{E g_i(|X_i|)}{g_i(b_i)}.$$
The result follows from (2.2), (2.3), and (2.4). $\square$
Remark 1. When $p = 1$, Lemma 3 holds without the independence condition. However, the independence condition is necessary if $1 < p \le 2$; a counterexample can easily be constructed.
The following lemma was proved by Nam and Rosalsky [5].
Lemma 4. Let $\{X_n, n \ge 1\}$ be a sequence of independent and symmetric random variables such that $\sum_{n=1}^{\infty} X_n$ converges a.s. Then the tail series $\{T_n = \sum_{i=n}^{\infty} X_i, n \ge 1\}$ is a well-defined sequence of random variables, and for every $\varepsilon > 0$ and $n \ge 1$ the inequality
$$P\Big\{\sup_{k \ge n} |T_k| > \varepsilon\Big\} \le 2 P\{|T_n| > \varepsilon\}$$
holds.
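As a quick numerical sanity check of Lemma 4, one can estimate both sides of the inequality by simulation. The sketch below is an illustration only: the choice $X_i = Y_i/i$ with $Y_i$ i.i.d. standard normal (hence symmetric), the truncation level $N$, and the values of $n$ and $\varepsilon$ are assumptions, not part of the lemma.

```python
import numpy as np

# Numerical sanity check of Lemma 4 for symmetric summands.  The choices
# X_i = Y_i / i with Y_i i.i.d. N(0,1), the truncation level N, and the
# values of n and eps are assumptions made for this illustration.
rng = np.random.default_rng(1)
N, reps, n, eps = 1000, 5000, 5, 0.5
i = np.arange(1, N + 1)

Y = rng.standard_normal((reps, N))
T = np.cumsum((Y / i)[:, ::-1], axis=1)[:, ::-1]   # T_k ~ sum_{j=k}^{N} X_j
tail = np.abs(T[:, n - 1:])                        # |T_k| for k >= n

lhs = np.mean(tail.max(axis=1) > eps)              # P{ sup_{k>=n} |T_k| > eps }
rhs = 2 * np.mean(tail[:, 0] > eps)                # 2 P{ |T_n| > eps }
print(f"P(sup|T_k| > {eps}) = {lhs:.3f}  <=  2 P(|T_n| > {eps}) = {rhs:.3f}")
```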
The following lemma gives conditions so that the tail series weak law of large numbers
(2.5) $$\frac{T_n}{b_n} \xrightarrow{P} 0$$
and the limit law
(2.6) $$\frac{\sup_{k \ge n} |T_k|}{b_n} \xrightarrow{P} 0$$
are equivalent, where $T_n = \sum_{i=n}^{\infty} X_i$, $n \ge 1$.
Lemma 5. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables such that $\sum_{n=1}^{\infty} X_n$ converges a.s. Let $\{b_n, n \ge 1\}$ be a sequence of positive constants. Assume that either $\{b_n, n \ge 1\}$ is non-increasing or $\{X_n, n \ge 1\}$ is a sequence of symmetric random variables. Then (2.5) and (2.6) are equivalent.
Proof. Since (2.6) clearly implies (2.5), it suffices to show that (2.5) implies (2.6). When $\{b_n\}$ is non-increasing, Nam and Rosalsky [5] proved that (2.5) implies (2.6). Now let $\{X_n\}$ be a sequence of independent and symmetric random variables. By Lemma 4, we have
$$P\Big\{\frac{\sup_{k \ge n} |T_k|}{b_n} > \varepsilon\Big\} \le 2 P\Big\{\frac{|T_n|}{b_n} > \varepsilon\Big\}.$$
Hence (2.5) implies (2.6). $\square$
3. Main results
With the preliminaries accounted for, the main results of this paper may be established. Theorem 1 gives conditions so that the series $\sum_{n=1}^{\infty} X_n$ converges a.s.
Theorem 1. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and let $\{b_n, n \ge 1\}$ be a sequence of positive constants. Let $\{g_n(x), n \ge 1\}$ be a sequence of functions defined on $[0, \infty)$ such that
(3.1) $$0 \le g_n(0) \le g_n(x), \quad 0 < g_n(x) \uparrow \ \text{as } n \uparrow \infty \ \text{for each } x > 0$$
and
(3.2) $$\frac{g_n(x)}{x} \uparrow, \quad \frac{g_n(x)}{x^p} \downarrow \ \text{ on } (0, \infty), \quad n \ge 1$$
for some $1 < p \le 2$. If
(3.3) $$\sum_{n=1}^{\infty} E g_n(|X_n|) < \infty,$$
then $\{T_n, n \ge 1\}$ is a well-defined sequence of random variables, and for all $n \ge 1$ and all $\varepsilon > 0$,
$$P\Big\{\frac{\sup_{k \ge n} |T_k|}{b_n} > \varepsilon\Big\} \le \Big(\frac{2^{4p+3}}{\varepsilon^p} + \frac{2^6}{\varepsilon}\Big) \sum_{i=n}^{\infty} \frac{E g_i(|X_i|)}{g_i(b_n)}.$$
Proof. First, we show that $T_n$ is a well-defined random variable, i.e., that $\sum_{n=1}^{\infty} X_n$ converges a.s. Let $X_n' = X_n I(|X_n| \le 1)$, $n \ge 1$. Noting that (3.2) implies $g_n(x) \uparrow$ on $(0, \infty)$, $n \ge 1$, we have
$$\sum_{n=1}^{\infty} P\{|X_n| > 1\} \le \sum_{n=1}^{\infty} P\{g_n(|X_n|) \ge g_n(1)\} \le \sum_{n=1}^{\infty} \frac{E g_n(|X_n|)}{g_n(1)} \le \frac{1}{g_1(1)} \sum_{n=1}^{\infty} E g_n(|X_n|) < \infty$$
by (3.1) and (3.3). It follows from (3.2) that on the set $\{x : |x| \le 1\}$ we have $|x|^2 \le |x|^p \le g_n(|x|)/g_n(1)$. Thus we have
$$\sum_{n=1}^{\infty} \mathrm{Var}(X_n') \le \sum_{n=1}^{\infty} E|X_n'|^2 \le \sum_{n=1}^{\infty} \frac{1}{g_n(1)} E g_n(|X_n'|) \le \frac{1}{g_1(1)} \sum_{n=1}^{\infty} E g_n(|X_n|) < \infty.$$
On the set $\{x : |x| > 1\}$ we have $|x| \le g_n(|x|)/g_n(1)$. Using this and the fact that $EX_n = 0$, we get
$$\sum_{n=1}^{\infty} |EX_n'| = \sum_{n=1}^{\infty} |EX_n I(|X_n| > 1)| \le \sum_{n=1}^{\infty} E|X_n| I(|X_n| > 1) \le \sum_{n=1}^{\infty} \frac{E g_n(|X_n| I(|X_n| > 1))}{g_n(1)} \le \frac{1}{g_1(1)} \sum_{n=1}^{\infty} E g_n(|X_n|) < \infty.$$
Thus $\sum_{n=1}^{\infty} X_n$ converges a.s. by Kolmogorov's three-series theorem.
Now we find an upper bound for $P\{\sup_{k \ge n} |T_k|/b_n > \varepsilon\}$. Observe that, by Theorem 8.1.3 of Chow and Teicher [2], for $\varepsilon > 0$
$$P\{|T_k| > \varepsilon\} = P\Big\{\lim_{M \to \infty} \Big|\sum_{i=k}^{M} X_i\Big| > \varepsilon\Big\} \le \liminf_{M \to \infty} P\Big\{\Big|\sum_{i=k}^{M} X_i\Big| > \varepsilon\Big\}.$$
Thus, it follows by Lemma 2 and Lemma 3 with $b_i = b_n$, $k \le i \le M$, that
$$\begin{aligned}
P\Big\{\frac{\sup_{k \ge n} |T_k|}{b_n} > \varepsilon\Big\}
&\le 4 \sup_{k \ge n} P\Big\{\frac{|T_k|}{b_n} > \frac{\varepsilon}{4}\Big\} \\
&\le 4 \sup_{k \ge n} \liminf_{M \to \infty} P\Big\{\frac{\big|\sum_{i=k}^{M} X_i\big|}{b_n} > \frac{\varepsilon}{4}\Big\} \\
&\le 4 \sup_{k \ge n} \liminf_{M \to \infty} \Big(\frac{2^{2p+1}}{(\varepsilon/4)^p} + \frac{2^2}{\varepsilon/4}\Big) \sum_{i=k}^{M} \frac{E g_i(|X_i|)}{g_i(b_n)} \\
&= \Big(\frac{2^{4p+3}}{\varepsilon^p} + \frac{2^6}{\varepsilon}\Big) \sum_{i=n}^{\infty} \frac{E g_i(|X_i|)}{g_i(b_n)}.
\end{aligned}$$
Hence the proof is complete. $\square$
Theorem 2. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and let $\{g_n(x), n \ge 1\}$ be a sequence of functions defined on $[0, \infty)$ satisfying (3.1) and (3.2). Let $\{b_n, n \ge 1\}$ be a sequence of positive constants such that
(3.4) $$\sum_{i=n}^{\infty} E g_i(|X_i|) = o(g_n(b_n)).$$
Then $\{T_n, n \ge 1\}$ is a well-defined sequence of random variables obeying (2.6).
Proof. Since (3.4) implies (3.3), $T_n$ is well-defined by Theorem 1. Also, Theorem 1 implies that
$$P\Big\{\frac{\sup_{k \ge n} |T_k|}{b_n} > \varepsilon\Big\} \le \Big(\frac{2^{4p+3}}{\varepsilon^p} + \frac{2^6}{\varepsilon}\Big) \sum_{i=n}^{\infty} \frac{E g_i(|X_i|)}{g_i(b_n)} \le \Big(\frac{2^{4p+3}}{\varepsilon^p} + \frac{2^6}{\varepsilon}\Big) \frac{1}{g_n(b_n)} \sum_{i=n}^{\infty} E g_i(|X_i|) = o(1).$$
Thus the proof is complete. $\square$
Nam and Rosalsky [4] proved the following corollary, which is the special case $g_n(x) = |x|^p$, $x \ge 0$, $n \ge 1$, of Theorem 2.
Corollary 1. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and let $\{b_n, n \ge 1\}$ be a sequence of positive constants such that
(3.5) $$\sum_{i=n}^{\infty} E|X_i|^p = o(b_n^p)$$
for some $1 < p \le 2$. Then $\{T_n, n \ge 1\}$ is a well-defined sequence of random variables obeying (2.6).
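One situation where Corollary 1 with $p < 2$ is useful is when the summands have infinite variance, so that a second-moment condition is unavailable but (3.5) still holds for a smaller $p$. The sketch below illustrates this with a hypothetical choice (all concrete choices are assumptions made for this illustration, not from the paper): $Y_i$ i.i.d. Student $t$ with 1.8 degrees of freedom, so $EY_i^2 = \infty$ but $E|Y_i|^{3/2} < \infty$; $X_i = Y_i/i^2$ and $b_n = 1/n$, for which $\sum_{i=n}^{\infty} E|X_i|^{3/2} \sim c\,n^{-2} = o(b_n^{3/2})$.

```python
import numpy as np

# Illustration of Corollary 1 with infinite-variance summands.  Assumed
# example (not from the paper): Y_i i.i.d. Student t with 1.8 degrees of
# freedom, so E Y_i^2 = infinity but E|Y_i|^(3/2) < infinity;
# X_i = Y_i / i**2, p = 3/2, b_n = 1/n, so that
# sum_{i>=n} E|X_i|^p ~ c * n**(-2) = o(b_n**p) = o(n**(-3/2)).
rng = np.random.default_rng(2)
N, reps, eps = 5000, 300, 0.5
i = np.arange(1, N + 1)

for n in (10, 100, 1000):
    exceed = 0
    for _ in range(reps):
        X = rng.standard_t(1.8, size=N) / i**2         # X_i = Y_i / i^2
        T = np.cumsum(X[::-1])[::-1]                   # T_k ~ sum_{j=k}^{N} X_j
        exceed += np.max(np.abs(T[n - 1:])) * n > eps  # b_n = 1/n
    print(f"n = {n:4d}: empirical P(sup_k>=n |T_k|/b_n > {eps}) = {exceed / reps:.3f}")
```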
The following example illustrates the sharpness of Theorem 2. It shows that if the $o$ in (3.4) is replaced by $O$, then Theorem 2 can fail. It also shows that if $p > 2$, then Theorem 2 can fail.
Example 1. Let $\{Y_n, n \ge 1\}$ be a sequence of independent and identically distributed $N(0,1)$ random variables, and let $\{a_n, n \ge 1\}$ be a sequence of positive constants such that $\sum_{n=1}^{\infty} a_n^2 < \infty$. Let $X_n = a_n Y_n$, $b_n = \sqrt{\sum_{i=n}^{\infty} a_i^2}$, $n \ge 1$, and $g_n(x) = |x|^p$ $(p > 1)$, $x \ge 0$, $n \ge 1$. Then $\{X_n, n \ge 1\}$ is a sequence of independent and symmetric random variables with $EX_n = 0$, $n \ge 1$. Since $\sum_{n=1}^{\infty} \mathrm{Var}(X_n) = \sum_{n=1}^{\infty} a_n^2 < \infty$, $T_n$ is well-defined.
To show that the limit law $\sup_{k \ge n} |T_k|/b_n \xrightarrow{P} 0$ does not hold, it is enough, by Lemma 5, to show that the tail series weak law of large numbers does not hold. To do this, let $S_{n,k} = \sum_{i=n}^{k} X_i$, $k \ge n \ge 1$, and let $\phi_{n,k}(t)$ be the characteristic function of $S_{n,k}$. Then $S_{n,k} \sim N(0, \sum_{i=n}^{k} a_i^2)$, and so
$$\phi_{n,k}(t) = \exp\Big\{-\frac{1}{2} t^2 \sum_{i=n}^{k} a_i^2\Big\} \to \exp\Big\{-\frac{1}{2} t^2 \sum_{i=n}^{\infty} a_i^2\Big\}$$
as $k \to \infty$. Since $\lim_{k \to \infty} \phi_{n,k}(t)$ is the characteristic function of $N(0, \sum_{i=n}^{\infty} a_i^2)$, it follows by the Lévy continuity theorem that $T_n \sim N(0, \sum_{i=n}^{\infty} a_i^2)$. Thus $T_n/b_n$ has a standard normal distribution, which implies that the tail series weak law of large numbers does not hold.
Now let $a_n = 1/2^n$, $n \ge 1$. Note that
$$\sum_{i=n}^{\infty} E g_i(|X_i|) = E|Y_1|^p \sum_{i=n}^{\infty} \frac{1}{2^{ip}} \sim C_1 \frac{1}{2^{np}}, \qquad g_n(b_n) = \Big(\sum_{i=n}^{\infty} \frac{1}{2^{2i}}\Big)^{p/2} \sim C_2 \frac{1}{2^{np}}$$
for some constants $C_1 > 0$ and $C_2 > 0$. Thus
$$\sum_{i=n}^{\infty} E g_i(|X_i|) = O(g_n(b_n)),$$
and, as shown before, $T_n$ is well-defined, but the limit law $\sup_{k \ge n} |T_k|/b_n \xrightarrow{P} 0$ does not hold.
On the other hand, if we take $a_n = 1/n$, $n \ge 1$, then
$$\sum_{i=n}^{\infty} E g_i(|X_i|) = E|Y_1|^p \sum_{i=n}^{\infty} \frac{1}{i^p} \sim C_3 \frac{1}{n^{p-1}}, \qquad g_n(b_n) = \Big(\sum_{i=n}^{\infty} \frac{1}{i^2}\Big)^{p/2} \sim C_4 \frac{1}{n^{p/2}}$$
for some constants $C_3 > 0$ and $C_4 > 0$. Thus, when $p > 2$, the limit law $\sup_{k \ge n} |T_k|/b_n \xrightarrow{P} 0$ does not hold although (3.4) holds and $T_n$ is well-defined.
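A small simulation makes the first half of Example 1 concrete (the truncation level, sample size, and the values of $n$ below are assumptions made for the illustration). With $a_n = 2^{-n}$, the ratio $T_n/b_n$ has a standard normal distribution for every $n$, so the exceedance probability $P\{|T_n|/b_n > 1\}$ stays near $2(1 - \Phi(1)) \approx 0.317$ and the tail series weak law (2.5), and hence (2.6), fails.

```python
import numpy as np

# Simulation of the first half of Example 1 with a_n = 2**(-n): T_n / b_n
# is standard normal for every n, so P{|T_n|/b_n > 1} stays near
# 2(1 - Phi(1)) ~ 0.317 and the tail series weak law (2.5), hence (2.6),
# fails.  The truncation level N and the sample size are assumptions.
rng = np.random.default_rng(3)
N, reps, eps = 60, 100000, 1.0

a = 2.0 ** (-np.arange(1, N + 1))                 # a_i = 2^{-i}
b = np.sqrt(np.cumsum((a**2)[::-1])[::-1])        # b_n = sqrt(sum_{i>=n} a_i^2)

Y = rng.standard_normal((reps, N))
T = np.cumsum((a * Y)[:, ::-1], axis=1)[:, ::-1]  # T_n ~ sum_{i=n}^{N} a_i Y_i

for n in (1, 5, 20):
    ratio = T[:, n - 1] / b[n - 1]
    print(f"n = {n:2d}: P(|T_n|/b_n > {eps}) = {np.mean(np.abs(ratio) > eps):.3f}, "
          f"std(T_n/b_n) = {ratio.std():.3f}")
```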
References
[1] B. von Bahr and C.-G. Esseen, Inequalities for the r-th absolute moment of a
sum of random variables, 1 ≤ r ≤ 2, Ann. Math. Statist. 36 (1965), 299–303.
[2] Y. S. Chow and H. Teicher, Probability Theory: Independence, Interchangeability, Martingales, 2nd ed., Springer-Verlag, New York, 1988.
[3] N. Etemadi, On some classical results in probability theory, Sankhyā Ser. A 47 (1985), 215–221.
[4] E. Nam and A. Rosalsky, On the rate of convergence of series of random variables, Teor. Īmovīrnost. ta Mat. Statist. 52 (1995), 120–131, In Ukrainian, English translation in Theor. Probab. Math. Statist. 52 (1996), 129–140.
[5] E. Nam and A. Rosalsky, On the convergence rate of series of independent random variables, In: E. Brunner and M. Denker (eds.), Research Developments in Probability and Statistics: Festschrift in Honor of Madan L. Puri on the Occasion of his 65th Birthday, pp. 33–44, VSP International Science Publ., Utrecht, The Netherlands, 1996.
[6] E. Nam, A. Rosalsky, and A. Volodin, On convergence of series of independent random elements in Banach spaces, Stochastic Anal. Appl. 17 (1999), 85–98.
[7] A. Rosalsky and J. Rosenblatt, On the rate of convergence of series of Banach space valued random elements, Nonlinear Anal. 30 (1997), 4237–4248.
[8] A. Rosalsky and J. Rosenblatt, On convergence of series of random variables with applications to martingale convergence and to convergence of series with orthogonal summands, Stochastic Anal. Appl. 16 (1998), 553–566.
[9] W. F. Stout, Almost Sure Convergence, Academic Press, New York, 1974.
Soo Hak Sung, Department of Applied Mathematics, PaiChai University,
Taejon 302-735, Korea
E-mail: [email protected]
Andrei I. Volodin, Department of Mathematics and Statistics, University
of Regina, Regina, Saskatchewan, Canada
E-mail: [email protected]