Commun. Korean Math. Soc. 25 (2010), No. 1, pp. 119–128
DOI 10.4134/CKMS.2010.25.1.119
SOME RESULTS ON ASYMPTOTIC BEHAVIORS OF
RANDOM SUMS OF INDEPENDENT IDENTICALLY
DISTRIBUTED RANDOM VARIABLES
Tran Loc Hung and Tran Thien Thanh
Abstract. Let {Xn , n ≥ 1} be a sequence of independent identically
distributed (i.i.d.) random variables (r.vs.), defined on a probability space
(Ω, A, P ), and let {Nn , n ≥ 1} be a sequence of positive integer-valued
r.vs., defined on the same probability space (Ω, A, P ). Furthermore, we
assume that the r.vs. Nn , n ≥ 1 are independent of all r.vs. Xn , n ≥ 1.
In the present paper we are interested in asymptotic behaviors of the random
sum
SNn = X1 + X2 + · · · + XNn , S0 = 0,
where the r.vs. Nn , n ≥ 1 obey some defined probability laws. Since the
appearance of Robbins’s results in 1948 ([8]), the random sums SNn
have been investigated in the theory of probability and stochastic processes
for quite some time (see [1], [4], [2], [3], [5]).
Recently, the random sum approach has been used in applied problems
of stochastic processes, stochastic modeling, random walks, queueing theory,
network theory, and estimation theory (see [10], [12]).
The main aim of this paper is to establish some results related to
the asymptotic behaviors of the random sum SNn , in cases when the
Nn , n ≥ 1 are assumed to follow concrete probability laws such as the
Poisson, Bernoulli, binomial, or geometric laws.
1. Introduction
Let {Xn , n ≥ 1} be a sequence of independent identically distributed (i.i.d.)
random variables (r.vs.), defined on a probability space (Ω, A, P ) and let {Nn ,
n ≥ 1} be a sequence of positive integer-valued r.vs., defined on the same
probability space (Ω, A, P ). Furthermore, we assume that the r.vs. Nn , n ≥ 1
are independent of all i.i.d.r.vs. Xn , n ≥ 1. From now on, the random sum is
defined by
(1) SNn = X1 + X2 + · · · + XNn ,  S0 = 0.
Received October 16, 2008; Revised August 13, 2009.
2000 Mathematics Subject Classification. 60F05, 60G50.
Key words and phrases. random sum, independent identically distributed random variables, asymptotic behavior, Poisson law, Bernoulli law, binomial law, geometric law.
©2010 The Korean Mathematical Society
Since the appearance of Robbins’s results in 1948 (see [8] for more details),
the random sums SNn have been investigated in the theory of probability and
stochastic processes for quite some time (see [5], [1], [2] and [12] for a complete
bibliography).
In the classical theory of limit theorems we can consider the non-random
index n in the sum Sn = X1 + X2 + · · · + Xn as a random variable degenerated
at a point n. Therefore, the replacement of the number n of the sum Sn by the
positive integer-valued r.vs. is natural. In simple terms, a random sum is a sum
of a random number of r.vs. The number of terms Nn , n ≥ 1 in the sum,
as well as the individual terms, can obey various probability laws. In stochastic
theory, the Nn , n ≥ 1 are often assumed to follow the Poisson law or the geometric law.
In general, the r.vs. Nn , n ≥ 1 should satisfy some conditions. To illustrate this,
we recall three types of classical conditions on the r.vs. Nn as follows:
(2) E(Nn ) → +∞ as n → ∞,
(3) Nn /n −P→ 1 as n → ∞,
(4) Nn −P→ ∞ as n → ∞.
Condition (2) was used in the well-known results of H. Robbins in 1948 (see
details in [8]). Condition (3) was applied in Feller’s theorem for random
sums (cf. [1]), while the last condition (4) was used in various papers
such as [4], [5], [9], [6], [7]. It should be noticed that the conditions (2), (3) and
(4) satisfy the following implications:
(3) ⇒ (4) ⇒ (2).
This paper is organized as follows. In Section 2 we present the main results
related to the asymptotic behaviors of the random sum in (1), when the
r.vs. Nn , n ≥ 1 follow some discrete probability laws. The proofs of these
main results are presented in Section 3.
2. Main results
From now on, the random variable with the standard normal law N (0, 1) will be
denoted by X ∗ , the notation −d→ will mean convergence in distribution, and
−P→ will denote convergence in probability.
Theorem 2.1. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs., Xj ∼ Bernoulli(p),
p ∈ (0, 1), j = 1, 2, . . . , n. Moreover, suppose that {Nn , n ≥ 1} is a sequence
of positive integer-valued r.vs. independent of all Xj , j = 1, 2, . . . , n. Then,
(i) SNn ∼ Poisson(λp), if Nn ∼ Poisson(λ), λ > 0, n ≥ 1.
(ii) SNn ∼ Geometry(q/(p + q − pq)), if Nn ∼ Geometry(q), q ∈ (0, 1), p + q =
1, n ≥ 1, with P(Nn = k) = q(1 − q)^k , k = 0, 1, . . .
(iii) SNn ∼ Binomial(n, pq), if Nn ∼ Binomial(n, q), q ∈ (0, 1), p + q = 1, n ≥ 1.
It is well known that the sum of a fixed number of independent Bernoulli r.vs.
follows the binomial law. But in cases (i) and (ii) the random sums of independent
Bernoulli r.vs. do not obey the binomial law: the final distributions of the
random sums SNn depend on the distribution of the r.vs. Nn , n ≥ 1. The same
conclusions can be found in B. Gnedenko’s or V. Kruglov’s and V. Korolev’s
papers, in cases when the r.vs. Xj , j = 1, 2, . . . , n were independent standard
normal distributed while the r.vs. Nn , n ≥ 1 were uniformly distributed (see
for more details [2], [3] and [5]). The conclusion in (iii) is a very interesting
result: the random sum SNn and the r.vs. Nn , n ≥ 1 follow the same type of
law. Furthermore, we can obtain the Poisson approximation theorem by using
Theorem 2.1(i), as follows.
Theorem 2.2. Assume that for each n = 1, 2, . . ., {Xnk , k = 1, 2, . . . , n}
is a sequence of independent and identically Bernoulli distributed r.vs. with
parameter pn ∈ (0, 1), and pn → 0, npn → λ (λ > 0) as n → ∞. Then, as
n → ∞,
Sn = Xn1 + Xn2 + · · · + Xnn −d→ Poisson(λ).
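Theorem 2.2 is the classical Poisson approximation of the binomial law. As an illustrative numerical check (assuming the simplest scheme pn = λ/n, which is not prescribed by the theorem itself), the total variation distance between Binomial(n, λ/n) and Poisson(λ) shrinks as n grows:

```python
import math

def binom_pmf(n, p, k):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def tv_distance(n, lam, k_max=80):
    # total variation distance between Binomial(n, lam/n) and Poisson(lam),
    # truncated at k_max (tails beyond are negligible for moderate lam)
    p = lam / n
    return 0.5 * sum(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
                     for k in range(min(n, k_max) + 1))

dists = [tv_distance(n, 2.0) for n in (10, 100, 1000)]
# the distance decreases as n grows with n*p_n fixed at lam = 2
assert dists[0] > dists[1] > dists[2] > 0
```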
Theorem 2.3. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. Suppose that
{Nn , n ≥ 1} is a sequence of positive integer-valued r.vs. independent of all
Xj , j = 1, 2, . . . , n. Furthermore, assume that Nn ∼ Geometry(p), n ≥ 1.
Then, we have
(i) SNn ∼ Exp(λp), when Xj ∼ Exp(λ), j = 1, 2, . . . , n.
(ii) SNn ∼ Geometry(pq), when Xj ∼ Geometry(q), j = 1, 2, . . . , n.
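Case (i) of Theorem 2.3 can be illustrated by simulation. The Python sketch below is illustrative only: λ = 2 and p = 0.3 are arbitrary, and the geometric law is taken on {1, 2, . . .} to match the generating function pt/(1 − (1 − p)t) used in the proof. It checks that the first two moments of the simulated SNn agree with those of Exp(λp).

```python
import math
import random

random.seed(42)
lam, p = 2.0, 0.3   # X_j ~ Exp(lam), N_n ~ Geometry(p) on {1, 2, ...}

def geometric(p):
    # number of Bernoulli(p) trials up to and including the first success,
    # sampled by inversion
    u = 1.0 - random.random()          # uniform on (0, 1]
    return math.floor(math.log(u) / math.log(1.0 - p)) + 1

samples = [sum(random.expovariate(lam) for _ in range(geometric(p)))
           for _ in range(100_000)]
mean = sum(samples) / len(samples)
m2_s = sum(s * s for s in samples) / len(samples)

# Theorem 2.3(i): S_{N_n} ~ Exp(lam*p), with mean 1/(lam*p) and
# second moment 2/(lam*p)^2
assert abs(mean - 1.0 / (lam * p)) < 0.05
assert abs(m2_s - 2.0 / (lam * p) ** 2) < 0.3
```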
Theorem 2.4. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. such that E(X1 ) =
0, Var(X1 ) = 1. Moreover, suppose that {Nn , n ≥ 1} is a sequence of r.vs.
following the Poisson law Poisson(λn ) and independent of all Xn , n ≥ 1. If
λn /n → 1 as n → ∞,
then
SNn /√n −d→ X ∗ as n → ∞.
Theorem 2.5. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. such that E(X1 ) =
µ1 , E(X1²) = µ2 . Suppose that {Nn , n ≥ 1} is a sequence of r.vs. from the Poisson
law Poisson(λn ) and independent of all Xn , n ≥ 1. Assume that
λn /n → 1 as n → ∞
and
√n (λn /n − 1) → 0 as n → ∞.
Then
(SNn − nµ1 )/√(nµ2 ) −d→ X ∗ as n → ∞.
Theorem 2.6. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. such that E(X1 ) <
∞, 0 < E(X1²) < ∞. Furthermore, assume that {Nn , n ≥ 1} is a sequence
of r.vs. from the Poisson law Poisson(λn ) and independent of all Xn , n ≥ 1. If
λn → +∞ as n → ∞,
then
(SNn − λn E(X1 ))/√(λn E(X1²)) −d→ X ∗ as n → ∞.
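The convergence in Theorems 2.4–2.6 can be observed numerically through the compound-Poisson characteristic function ψn (t) = e^{λn (ϕ(t) − 1)} used in the proofs below. In this sketch (illustrative only) we take X1 ∼ Exp(1), so E(X1 ) = 1 and E(X1²) = 2, standardize as in Theorem 2.6, and compare ψn with the standard normal characteristic function e^{−t²/2}:

```python
import cmath
import math

M1, M2 = 1.0, 2.0          # E(X1) and E(X1^2) for X1 ~ Exp(1)

def phi_exp(t):
    # characteristic function of X1 ~ Exp(1)
    return 1.0 / (1.0 - 1j * t)

def psi(t, lam_n):
    # characteristic function of (S_{N_n} - lam_n*E(X1)) / sqrt(lam_n*E(X1^2))
    # for N_n ~ Poisson(lam_n), via the compound-Poisson form
    s = math.sqrt(lam_n * M2)
    return cmath.exp(lam_n * (phi_exp(t / s) - 1.0) - 1j * t * lam_n * M1 / s)

for t in (0.5, 1.0, 2.0):
    # should approach the standard normal characteristic function
    assert abs(psi(t, 10_000.0) - math.exp(-t * t / 2.0)) < 1e-2
```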
Theorem 2.7. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. such that E(X1 ) =
0, Var(X1 ) = 1. Moreover, suppose that {Nn , n ≥ 1} is a sequence of
r.vs. from the binomial law B(n, p) and they are independent of all Xn , n ≥ 1.
Then
SNn /√(E(Nn )) −d→ X ∗ as n → ∞.
Theorem 2.8. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. with finite moments.
Suppose that {Nn , n ≥ 1} is a sequence of r.vs. from the binomial law
B(n, p) and they are independent of all Xn , n ≥ 1. Then
(SNn − E(SNn ))/√(Var(SNn )) −d→ X ∗ as n → ∞.
Theorem 2.9. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. such that E(X1 ) = 0
and Var(X1 ) = 1. We assume that {Nn , n ≥ 1} is a sequence of random
variables of the Geometry law Geometry(pn ) and they are independent of all Xn , n ≥
1. If
pn → 0 as n → ∞,
then
SNn /√(E(Nn )) −d→ Z as n → ∞,
where Z does not follow the law N (0, 1).
Remark 1. It is worth pointing out that in Theorem 2.9 we have E(Nn ) =
1/pn → ∞, as n → ∞, but the random sum SNn does not obey the central limit
theorem. Therefore, the condition E(Nn ) → ∞, as n → ∞, is not sufficient for
the central limit theorem to hold for the random sum SNn .
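A simulation makes the failure of normality in Theorem 2.9 visible. The limiting characteristic function 2/(t² + 2) obtained in the proof below is of Laplace type, whose fourth-moment ratio m4 /m2² equals 6, against 3 for the normal law. The sketch below is illustrative: it takes Xj standard normal, so that SNn given Nn = n is exactly N (0, n) and can be sampled in one draw.

```python
import math
import random

random.seed(1)
pn = 0.01                           # small p_n, so E(N_n) = 1/p_n = 100

def geometric(p):
    # Geometry(p) on {1, 2, ...} sampled by inversion
    u = 1.0 - random.random()       # uniform on (0, 1]
    return math.floor(math.log(u) / math.log(1.0 - p)) + 1

vals = []
for _ in range(100_000):
    n = geometric(pn)
    # with X_j ~ N(0,1), S_{N_n} given N_n = n is exactly N(0, n)
    s = math.sqrt(n) * random.gauss(0.0, 1.0)
    vals.append(s * math.sqrt(pn))  # S_{N_n} / sqrt(E(N_n))

m2 = sum(v * v for v in vals) / len(vals)
m4 = sum(v ** 4 for v in vals) / len(vals)
kurtosis = m4 / (m2 * m2)
# a normal limit would give ~3; the Laplace-type limit gives ~6
assert kurtosis > 4.5
```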
Theorem 2.10. Let {Xn , n ≥ 1} be a sequence of independent standard normal
distributed r.vs. Suppose that {Nn , n ≥ 1} is a sequence of positive integer-valued
r.vs. such that the condition (2) holds, and
(5) E|Nn − E(Nn )|/E(Nn ) → 0 as n → ∞.
Then
SNn /√(E(Nn )) −d→ X ∗ as n → ∞.
Remark 2. Based on the fact that E|Nn − E(Nn )| ≤ √(Var(Nn )), we can
conclude that the condition Var(Nn )/E(Nn ) → 0, as n → ∞, in Robbins’s
results [8] is stronger than the condition (5).
Theorem 2.11. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs. such that E|X1 | <
∞, E(X1 ) = µ, and suppose that {Nn , n ≥ 1} is a sequence of positive integer-valued
r.vs. independent of all Xn , n ≥ 1. Assume that the condition (3) holds.
Then
SNn /n −P→ µ as n → ∞.
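Theorem 2.11 can be illustrated by simulation under an arbitrary choice of laws satisfying its hypotheses. In the sketch below (illustrative only) Nn is taken uniform on {n − 100, . . . , n + 100} with n = 10 000, so that Nn /n concentrates near 1 as condition (3) requires, and X1 ∼ Exp(2) with µ = 1/2; a sum of N such exponentials is sampled directly as a gamma variable.

```python
import random

random.seed(3)
n = 10_000
mu = 0.5                      # E(X1) for X1 ~ Exp(2)

devs = []
for _ in range(2_000):
    # N_n uniform on [n - 100, n + 100], so N_n/n is close to 1
    N = random.randint(n - 100, n + 100)
    # sum of N i.i.d. Exp(2) variables is Gamma(shape=N, scale=1/2)
    s = random.gammavariate(N, 0.5)
    devs.append(abs(s / n - mu))

# S_{N_n}/n should concentrate near mu, as the theorem asserts
close = sum(d < 0.03 for d in devs) / len(devs)
assert close > 0.99
```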
Theorem 2.12. Let {Xn , n ≥ 1} be a sequence of i.i.d.r.vs., and assume that
Nn ∼ Binomial(n, pn ), n ≥ 1, satisfying npn → λ as n → ∞. Then,
SNn −d→ SN as n → ∞,
where N ∼ Poisson(λ).
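Theorem 2.12 can be checked at the level of characteristic functions, exactly as in its proof: (1 + pn [ϕ(t) − 1])^n → e^{λ[ϕ(t)−1]} when npn → λ. The sketch below is illustrative; the test law X1 uniform on {1, 2} and λ = 2 are arbitrary choices.

```python
import cmath

LAM = 2.0

def phi(t):
    # characteristic function of an arbitrary test law: X1 uniform on {1, 2}
    return 0.5 * (cmath.exp(1j * t) + cmath.exp(2j * t))

def psi_n(t, n):
    # generating function of Binomial(n, p_n) composed with phi, p_n = LAM/n
    return (1.0 + (LAM / n) * (phi(t) - 1.0)) ** n

def psi_limit(t):
    # characteristic function of S_N with N ~ Poisson(LAM)
    return cmath.exp(LAM * (phi(t) - 1.0))

err = max(abs(psi_n(0.5 * k, 10_000) - psi_limit(0.5 * k))
          for k in range(1, 10))
assert err < 5e-3
```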
3. Proofs
Proof of Theorem 2.1. (i) It is clear that the generating function of the r.vs. Nn , n
≥ 1 is g(t) = e^{λ(t−1)} and the characteristic function of the r.vs. Xn , n ≥ 1 is
ϕ(t) = 1 + p(e^{it} − 1). Then, the characteristic function of the random sum SNn is
given by
ψ(t) = g(ϕ(t)) = e^{λp(e^{it}−1)} .
Thus SNn ∼ Poisson(λp).
(ii) Let us denote by g(t) = q/(1 − (1 − q)t) the generating function of the r.vs. Nn and
let the characteristic function of the r.vs. Xn , n ≥ 1 be ϕ(t) = 1 + p(e^{it} − 1).
Then, the characteristic function of the random sum SNn can be calculated as
ψ(t) = g(ϕ(t)) = r/(1 − (1 − r)e^{it}), where r = q/(p + q − pq).
Therefore SNn ∼ Geometry(q/(p + q − pq)).
(iii) Let g(t) = [1 + q(t − 1)]^n and ϕ(t) = 1 + p(e^{it} − 1) be the generating
function and the characteristic function of the r.vs. Nn , n ≥ 1 and Xj , j = 1, 2, . . . ,
respectively. Then, the characteristic function of the random sum SNn is
ψ(t) = g(ϕ(t)) = [1 + pq(e^{it} − 1)]^n .
In this way, SNn ∼ Binomial(n, pq). □
Proof of Theorem 2.2. Let us consider Nn ∼ Poisson(n). According to Theorem
2.1(i) we have SNn ∼ Poisson(npn ). Then,
P(SNn = k) = e^{−npn }(npn )^k /k! → e^{−λ}λ^k /k! as n → ∞,
or SNn −d→ Poisson(λ). Therefore, it remains only to show that
∆n (t) = |ϕSNn (t) − ϕSn (t)| → 0.
We have
∆n (t) = |e^{npn (e^{it}−1)} − [1 − pn (1 − e^{it})]^n | ≤ n |e^{pn (e^{it}−1)} − 1 − pn (e^{it} − 1)|.
Since Re(pn (e^{it} − 1)) ≤ 0, applying the inequality |e^α − 1 − α| ≤ |α|²/2 for
Re(α) ≤ 0, we obtain
∆n (t) ≤ n |pn (e^{it} − 1)|²/2 = npn · pn |e^{it} − 1|²/2 → 0 as n → ∞.
This completes the proof. □
Proof of Theorem 2.3. (i) Let us denote by g(t) = pt/(1 − (1 − p)t) the generating
function of Nn , n ≥ 1 and by ϕ(t) = λ/(λ − it) the characteristic function of Xj , j = 1, 2, . . . ,
respectively. Then, the characteristic function of the random sum SNn is given by
ψ(t) = g(ϕ(t)) = λp/(λp − it).
We conclude that SNn ∼ Exp(λp).
(ii) Let g(t) = pt/(1 − (1 − p)t) be the generating function of Nn , n ≥ 1 and denote
by ϕ(t) = qe^{it}/(1 − (1 − q)e^{it}) the characteristic function of Xj , j = 1, 2, . . . . Then, the
characteristic function of the random sum SNn , n ≥ 1 is given by
ψ(t) = g(ϕ(t)) = pqe^{it}/(1 − (1 − pq)e^{it}).
Hence, SNn ∼ Geometry(pq). □
Proof of Theorem 2.4. Denote by gn (t) = e^{λn (t−1)} the generating function of Nn ,
n ≥ 1, and let ϕ, ψn be the characteristic functions of X1 and SNn /√n, respectively. Then,
ψn (t) = ϕSNn (t/√n) = gn (ϕ(t/√n)) = e^{λn [ϕ(t/√n) − 1]} ,
where ϕ(t/√n) = 1 − t²/(2n) + o(1/n).
Hence
ln ψn (t) = λn [−t²/(2n) + o(1/n)] = −(λn /n) · (t²/2) + o(λn /n).
According to the assumptions, as n → ∞, we then have ln ψn (t) → −t²/2. This
finishes the proof. □
Proof of Theorem 2.5. Let gn (t) = e^{λn (t−1)} be the generating function of Nn , n ≥
1 and denote by ϕ, ψn the characteristic functions of X1 and (SNn − nµ1 )/√(nµ2 ), respectively.
Then,
ψn (t) = e^{−itnµ1 /√(nµ2 )} · ϕSNn (t/√(nµ2 )) = e^{λn [ϕ(t/√(nµ2 )) − 1]} · e^{−itnµ1 /√(nµ2 )} ,
where ϕ(t/√(nµ2 )) = 1 + itµ1 /√(nµ2 ) − t²/(2n) + o(1/n).
From this,
ln ψn (t) = −itnµ1 /√(nµ2 ) + itλn µ1 /√(nµ2 ) − (λn /n) · (t²/2) + o(λn /n)
= it (µ1 /√µ2 ) · √n (λn /n − 1) − (λn /n) · (t²/2) + o(λn /n).
On account of the assumptions, when n → ∞, we get ln ψn (t) → −t²/2. This
completes the proof. □
Proof of Theorem 2.6. Put µ1 = E(X1 ), µ2 = E(X1²), let gn (t) =
e^{λn (t−1)} be the generating function of Nn , n ≥ 1, and let ϕ, ψn be the characteristic
functions of X1 and (SNn − λn µ1 )/√(λn µ2 ), respectively. Then,
ψn (t) = e^{−itλn µ1 /√(λn µ2 )} · ϕSNn (t/√(λn µ2 )) = e^{λn [ϕ(t/√(λn µ2 )) − 1]} · e^{−itλn µ1 /√(λn µ2 )} ,
where ϕ(t/√(λn µ2 )) = 1 + itµ1 /√(λn µ2 ) − t²/(2λn ) + o(1/λn ).
Hence
ln ψn (t) = λn [−t²/(2λn ) + o(1/λn )] = −t²/2 + o(1).
If n → ∞, then ln ψn (t) → −t²/2. We obtain the proof. □
Proof of Theorem 2.7. It is easy to see that the Nn , n ≥ 1 have the generating
function gn (t) = [1 + p(t − 1)]^n and E(Nn ) = np. Let ϕ, ψn be the characteristic
functions of X1 and SNn /√(E(Nn )), respectively. Then,
ψn (t) = gn (ϕ(t/√(np))) = [1 + p(ϕ(t/√(np)) − 1)]^n ,
where ϕ(t/√(np)) = 1 − t²/(2np) + o(1/n).
Therefore,
ψn (t) = [1 − t²/(2n) + o(1/n)]^n .
If n → ∞, then ψn (t) → e^{−t²/2} . We have the complete proof. □
Proof of Theorem 2.8. Let us denote µ1 = E(X1 ); µ2 = E(X1²); δ = µ2 − µ1²p.
Then,
E(SNn ) = npµ1 ,  Var(SNn ) = np(µ2 − µ1²p) = npδ.
Clearly, gn (t) = [1 + p(t − 1)]^n is the generating function of Nn , n ≥ 1. We
denote by ϕ and ψn the characteristic functions of X1 and (SNn − npµ1 )/√(npδ), respectively.
Then,
ψn (t) = e^{−itµ1 √(np/δ)} · ϕSNn (t/√(npδ)) = e^{−itµ1 √(np/δ)} [1 + p(ϕ(t/√(npδ)) − 1)]^n ,
where ϕ(t/√(npδ)) = 1 + iµ1 t/√(npδ) − t²µ2 /(2npδ) + o(1/n).
Therefore
ψn (t) = e^{−itµ1 √(np/δ)} [1 + itµ1 √(p/(nδ)) − t²(δ + µ1²p)/(2nδ) + o(1/n)]^n
= e^{−itµ1 √(np/δ)} [1 + itµ1 √(p/(nδ)) − t²µ1²p/(2nδ) − t²/(2n) + o(1/n)]^n
= e^{−itµ1 √(np/δ)} [e^{itµ1 √(p/(nδ))} − t²/(2n) + o(1/n)]^n
= [1 − (t²/(2n)) e^{−itµ1 √(p/(nδ))} + o(1/n)]^n .
By letting n → ∞, ψn (t) → e^{−t²/2} . We have the proof. □
Proof of Theorem 2.9. Let gn (t) = pn t/(1 − (1 − pn )t) be the generating function of
Nn , n ≥ 1, so that E(Nn ) = 1/pn . We denote ϕ and ψn
the characteristic functions of X1 and √pn SNn , respectively. Then,
ψn (t) = gn (ϕ(t√pn )) = pn ϕ(t√pn )/(1 − (1 − pn )ϕ(t√pn )),
where ϕ(t√pn ) = 1 − pn t²/2 + o(pn ).
Therefore
ψn (t) = (1 − pn t²/2 + o(pn ))/(1 + t²/2 + o(1)).
From the assumptions, by letting n → ∞, then ψn (t) → 2/(t² + 2) ≠ e^{−t²/2} . We get
the complete proof. □
Proof of Theorem 2.10. The Xj , j = 1, 2, . . . have the characteristic
function ϕ(t) = e^{−t²/2} . Let us put gn (t) = E(t^{Nn }), pk = P(Nn = k), an =
E(Nn ). Moreover, suppose that ψn (t) is the characteristic function of SNn /√an . Then,
ψn (t) = gn (ϕ(t/√an )) = Σ_{k=0}^{∞} pk e^{−t²k/(2an )} .
Therefore
|ψn (t) − e^{−t²/2} | = |Σ_{k=0}^{∞} pk (e^{−t²k/(2an )} − e^{−t²/2} )| ≤ Σ_{k=0}^{∞} pk |e^{−t²k/(2an )} − e^{−t²/2} |.
On the other hand, using Lagrange’s mean value theorem for the function h(x) = (e^{−t²/2} )^x ,
continuous on [k/an , 1] or [1, k/an ], we have
|e^{−t²k/(2an )} − e^{−t²/2} | = |h(k/an ) − h(1)| = |k/an − 1| · |h′(c)| (for some c between k/an and 1)
= |k/an − 1| · (t²/2) · h(c) ≤ (t²/2) · |k/an − 1|
(because c ≥ 0 and h(x) is decreasing, so h(c) ≤ 1).
Then
|ψn (t) − e^{−t²/2} | ≤ (t²/2) Σ_{k=0}^{∞} pk |k/an − 1| = (t²/2) · E|Nn − an |/an .
According to the assumptions, by letting n → ∞, then ψn (t) → e^{−t²/2} . We have the
proof. □
Proof of Theorem 2.11. According to the weak law of large numbers, we
have Sn /n −P→ µ. By the assumptions, using the considerations on the relationships
among the conditions (3) and (4), it follows that Nn −P→ ∞. Now, for
ε > 0, there exists n0 such that for all n > n0 : P(|Sn /n − µ| > ε) < ε, and
P(|SNn /Nn − µ| > ε) = Σ_{k=1}^{∞} P(Nn = k) P(|Sk /k − µ| > ε) = Σ_{k=1}^{n0 } + Σ_{k=n0 +1}^{∞}
≤ P(Nn ≤ n0 ) + ε.
Since P(Nn ≤ n0 ) → 0, it is easy to derive SNn /Nn −P→ µ. Then,
SNn /n = (SNn /Nn ) · (Nn /n) −P→ µ.
We obtain the proof. □
Proof of Theorem 2.12. Let gn (t) = [1 + pn (t − 1)]^n be the generating function
of Nn and suppose that ϕ and ψn are the characteristic functions of X1 and SNn ,
respectively. Then
ψn (t) = gn (ϕ(t)) = (1 + pn [ϕ(t) − 1])^n = (1 + (npn /n)[ϕ(t) − 1])^n .
By putting n → ∞, then ψn (t) → e^{λ[ϕ(t)−1]} . The proof is finished. □
Acknowledgements. The authors would like to express their gratitude to
referees for several constructive suggestions.
References
[1] W. Feller, An Introduction to Probability Theory and Its Applications, volume II, 2nd
edition, John Wiley & Sons, New York, 1971.
[2] B. Gnedenko, Limit theorems for sums of a random number of positive independent random variables, Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics
and Probability (Univ. California, Berkeley, Calif., 1970/1971), Vol. II: Probability theory, Univ. California Press, Berkeley, Calif. (1972), 537–549.
[3] B. Gnedenko, On limit theorems for a random number of random variables, Probability theory
and mathematical statistics (Tbilisi, 1982), Lecture Notes in Math., 1021, Springer,
Berlin, (1983), 167–176.
[4] A. Krajka, Z. Rychlik, Necessary and sufficient conditions for weak convergence of
random sums of independent random variables, Comment. Math. Univ. Carolinae 34
(1993), no. 3, 465–482.
[5] V. Kruglov and V. Korolev, Limit Theorems for Random Sums, Moscow University
Press, Moscow, 1990.
[6] J. Melamed, Limit theorems in the set-up of summation of a random number of independent identically distributed random variables, Lecture Notes in Math., Vol. 1412,
Springer, Berlin, (1989), 194–228.
[7] J. Mogyoródi, A central limit theorem for the sums of a random number of random
variables, Ann. Univ. Sci. Budapest. Eötvös Sect. Math. 10 (1967), 171–182.
[8] H. Robbins, The asymptotic distribution of the sum of a random number of random
variables, Bull. Amer. Math. Soc. 54 (1948), 1151–1161.
[9] Z. Rychlik and D. Szynal, On the limit behaviour of sums of a random number of
independent random variables, Colloq. Math. 28 (1973), 147–159.
[10] Z. Rychlik and T. Walczynski, Convergence in law of random sums with nonrandom
centering, J. Math. Sci. (New York) 106 (2001), 2860–2864.
[11] D. O. Selivanova, Estimates for the rate of convergence in some limit theorems for
geometric random sums, Moscow Univ. Comput. Math. Cybernet. (1995), no. 2, 27–31.
[12] Yu. B. Shvetlov and John J. Borkowski, Random sum estimators and their efficiency,
Technical Report, Department of Mathematical Science, Montana State University,
(2004), 1–20.
[13] D. Szasz, Limit theorems for the distributions of the sums of a random number of
random variables, Ann. Math. Statist. 43 (1972), 1902–1913.
[14] P. Vellaisamy and B. Chaudhuri, Poisson and compound Poisson approximations for
random sums of random variables, J. Appl. Probab. 33 (1996), 127–137.
Tran Loc Hung
Department of Mathematics
Hue University
77 Nguyen Hue, Hue city, +84-54, Vietnam
E-mail address: [email protected]
Tran Thien Thanh
Department of Mathematics
Hue University
77 Nguyen Hue, Hue city, +84-54, Vietnam
E-mail address: [email protected]