
Hindawi Publishing Corporation, Journal of Inequalities and Applications, Volume 2009, Article ID 271265, 14 pages, doi:10.1155/2009/271265

Research Article

Moment Inequalities and Complete Moment Convergence

Soo Hak Sung

Department of Applied Mathematics, Pai Chai University, Taejon 302-735, South Korea

Correspondence should be addressed to Soo Hak Sung, sungsh@pcu.ac.kr

Received 22 August 2009; Accepted 26 September 2009

Recommended by Andrei Volodin

Let $\{Y_i, 1 \le i \le n\}$ and $\{Z_i, 1 \le i \le n\}$ be sequences of random variables. For any $\epsilon > 0$ and $a > 0$, bounds for $E(|\sum_{i=1}^{n}(Y_i + Z_i)| - \epsilon a)^{+}$ and $E(\max_{1\le k\le n}|\sum_{i=1}^{k}(Y_i + Z_i)| - \epsilon a)^{+}$ are obtained. From these results, we establish general methods for obtaining complete moment convergence. The results of Chow (1988), Zhu (2007), and Wu and Zhu (2009) are generalized and extended from independent (or dependent) random variables to random variables satisfying some mild conditions. Some applications to dependent random variables are discussed.

Copyright © 2009 Soo Hak Sung. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

1. Introduction

Let $\{X_n, n \ge 1\}$ be a sequence of random variables defined on a fixed probability space $(\Omega, \mathcal{F}, P)$. Among the most useful inequalities in probability theory are the Marcinkiewicz-Zygmund and Rosenthal inequalities. For a sequence $\{X_i, 1 \le i \le n\}$ of i.i.d. random variables with $E|X_1|^q < \infty$ for some $q \ge 1$, Marcinkiewicz and Zygmund [1] and Rosenthal [2] ($1 \le q \le 2$ and $q > 2$, resp.) proved that there exist positive constants $A_q$ and $B_q$ depending only on $q$ such that

$$E\left|\sum_{i=1}^{n}(X_i - EX_i)\right|^q \le A_q \sum_{i=1}^{n} E|X_i|^q \quad \text{for } 1 \le q \le 2, \tag{1.1}$$

$$E\left|\sum_{i=1}^{n}(X_i - EX_i)\right|^q \le B_q \left\{\sum_{i=1}^{n} E|X_i|^q + \left(\sum_{i=1}^{n} E|X_i|^2\right)^{q/2}\right\} \quad \text{for } q > 2. \tag{1.2}$$

The following Marcinkiewicz-Zygmund and Rosenthal type maximal inequalities are well known. For a sequence $\{X_i, 1 \le i \le n\}$ of i.i.d. random variables with $E|X_1|^q < \infty$ for some $q \ge 1$, there exist positive constants $C_q$ and $D_q$ depending only on $q$ such that

$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_i - EX_i)\right|^q \le C_q \sum_{i=1}^{n} E|X_i|^q \quad \text{for } 1 \le q \le 2, \tag{1.3}$$

$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_i - EX_i)\right|^q \le D_q \left\{\sum_{i=1}^{n} E|X_i|^q + \left(\sum_{i=1}^{n} E|X_i|^2\right)^{q/2}\right\} \quad \text{for } q > 2. \tag{1.4}$$

Note that (1.3) and (1.4) imply (1.1) and (1.2), respectively. The above inequalities have been obtained for dependent random variables by many authors. Shao [3] proved that (1.3) and (1.4) hold for negatively associated random variables. Asadian et al. [4] proved that (1.1) and (1.2) hold for negatively orthant dependent random variables.

For a sequence of some mixing random variables, (1.4) holds; however, the constant $D_q$ depends on both $q$ and the sequence of mixing random variables. Shao [5] obtained (1.4) for $\phi$-mixing identically distributed random variables satisfying $\sum_{n=1}^{\infty} \phi^{1/2}(2^n) < \infty$. Shao [6] also obtained (1.4) for $\rho$-mixing identically distributed random variables satisfying $\sum_{n=1}^{\infty} \rho^{2/q}(2^n) < \infty$. Utev and Peligrad [7] obtained (1.4) for $\rho^{*}$-mixing random variables.

The concept of complete convergence was introduced by Hsu and Robbins [8]. A sequence $\{X_n, n \ge 1\}$ of random variables is said to converge completely to the constant $\theta$ if

$$\sum_{n=1}^{\infty} P(|X_n - \theta| > \epsilon) < \infty \quad \forall \epsilon > 0. \tag{1.5}$$

In view of the Borel-Cantelli lemma, this implies that $X_n \to \theta$ almost surely. Therefore complete convergence is a very important tool in establishing the almost sure convergence of sums of random variables. Hsu and Robbins [8] proved that the sequence of arithmetic means of i.i.d. random variables converges completely to the expected value if the variance of the summands is finite. Erdős [9] proved the converse.
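To make the Borel-Cantelli step explicit: for each fixed $\epsilon > 0$, (1.5) and the Borel-Cantelli lemma give

$$\sum_{n=1}^{\infty} P(|X_n - \theta| > \epsilon) < \infty \;\Longrightarrow\; P(|X_n - \theta| > \epsilon \text{ infinitely often}) = 0,$$

and applying this along $\epsilon = 1/m$, $m \ge 1$, yields $X_n \to \theta$ almost surely.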

The result of Hsu-Robbins-Erdős has been generalized and extended in several directions. Baum and Katz [10] proved that if $\{X_n, n \ge 1\}$ is a sequence of i.i.d. random variables with $E|X_1| < \infty$, then $E|X_1|^{pt} < \infty$ ($1 \le p < 2$, $t \ge 1$) is equivalent to

$$\sum_{n=1}^{\infty} n^{t-2} P\left(\left|\sum_{i=1}^{n}(X_i - EX_i)\right| > \epsilon n^{1/p}\right) < \infty \quad \forall \epsilon > 0. \tag{1.6}$$

Chow [11] generalized the result of Baum and Katz [10] by showing the following complete moment convergence: if $\{X_n, n \ge 1\}$ is a sequence of i.i.d. random variables with $E|X_1|^{pt} < \infty$ for some $1 \le p < 2$ and $t \ge 1$, then

$$\sum_{n=1}^{\infty} n^{t-2-1/p} E\left(\left|\sum_{i=1}^{n}(X_i - EX_i)\right| - \epsilon n^{1/p}\right)^{+} < \infty \quad \forall \epsilon > 0, \tag{1.7}$$

where $a^{+} = \max\{a, 0\}$. Note that (1.7) implies (1.6) (see Remark 2.6).

Recently, Zhu [12] obtained a complete convergence result for $\rho^{*}$-mixing random variables. Wu and Zhu [13] obtained complete moment convergence results for negatively orthant dependent random variables.

In this paper, we give general methods for obtaining complete moment convergence by using some moment inequalities. From these results, we generalize and extend the results of Chow [11], Zhu [12], and Wu and Zhu [13] from independent (or dependent) random variables to random variables satisfying some conditions similar to (1.1)-(1.4).

2. Complete Moment Convergence for Random Variables

In this section, we give general methods for obtaining complete moment convergence by using some moment inequalities. The first two lemmas are simple inequalities for real numbers.

Lemma 2.1. For any real numbers $a$, $b$, $c$, the following inequality holds:

$$(|a + b| - |c|)^{+} \le (|a| - |c|)^{+} + |b|. \tag{2.1}$$

Proof. The result follows by an elementary calculation. □
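For the reader's convenience, one way to carry out the elementary calculation is the following case check. If $|a + b| \le |c|$, the left-hand side of (2.1) is $0$, while the right-hand side is nonnegative. If $|a + b| > |c|$, then

$$(|a + b| - |c|)^{+} = |a + b| - |c| \le |a| + |b| - |c| \le (|a| - |c|)^{+} + |b|,$$

by the triangle inequality and $|a| - |c| \le (|a| - |c|)^{+}$.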

The following lemma is a slight generalization of Lemma 2.1.

Lemma 2.2. Let $\{a_i, 1 \le i \le n\}$ and $\{b_i, 1 \le i \le n\}$ be two sequences of real numbers. Then for any real number $c$, the following inequality holds:

$$\left(\max_{1\le i\le n}|a_i + b_i| - |c|\right)^{+} \le \left(\max_{1\le i\le n}|a_i| - |c|\right)^{+} + \max_{1\le i\le n}|b_i|. \tag{2.2}$$

Proof. By Lemma 2.1, we obtain

$$\begin{aligned}
\left(\max_{1\le i\le n}|a_i + b_i| - |c|\right)^{+} &= \max_{1\le i\le n}\big(|a_i + b_i| - |c|\big)^{+} \le \max_{1\le i\le n}\left\{(|a_i| - |c|)^{+} + |b_i|\right\}\\
&\le \max_{1\le i\le n}(|a_i| - |c|)^{+} + \max_{1\le i\le n}|b_i| = \left(\max_{1\le i\le n}|a_i| - |c|\right)^{+} + \max_{1\le i\le n}|b_i|.
\end{aligned} \tag{2.3}$$

□

The next two lemmas play essential roles in the paper. Lemma 2.3 gives a moment inequality for the sum of random variables.

Lemma 2.3. Let $\{Y_i, 1 \le i \le n\}$ and $\{Z_i, 1 \le i \le n\}$ be sequences of random variables. Then for any $q > 1$, $\epsilon > 0$, and $a > 0$,

$$E\left(\left|\sum_{i=1}^{n}(Y_i + Z_i)\right| - \epsilon a\right)^{+} \le \left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\frac{1}{a^{q-1}}\, E\left|\sum_{i=1}^{n} Y_i\right|^q + E\left|\sum_{i=1}^{n} Z_i\right|. \tag{2.4}$$

Proof. By Lemma 2.1,

$$E\left(\left|\sum_{i=1}^{n}(Y_i + Z_i)\right| - \epsilon a\right)^{+} \le E\left(\left|\sum_{i=1}^{n} Y_i\right| - \epsilon a\right)^{+} + E\left|\sum_{i=1}^{n} Z_i\right|. \tag{2.5}$$

On the other hand, we have by Markov's inequality that

$$\begin{aligned}
E\left(\left|\sum_{i=1}^{n} Y_i\right| - \epsilon a\right)^{+} &= \int_0^{\infty} P\left(\left|\sum_{i=1}^{n} Y_i\right| - \epsilon a > t\right)dt\\
&= \int_0^{a} P\left(\left|\sum_{i=1}^{n} Y_i\right| > \epsilon a + t\right)dt + \int_a^{\infty} P\left(\left|\sum_{i=1}^{n} Y_i\right| > \epsilon a + t\right)dt\\
&\le a\, P\left(\left|\sum_{i=1}^{n} Y_i\right| > \epsilon a\right) + \int_a^{\infty} P\left(\left|\sum_{i=1}^{n} Y_i\right| > t\right)dt\\
&\le \frac{E\left|\sum_{i=1}^{n} Y_i\right|^q}{\epsilon^q a^{q-1}} + \frac{1}{q-1}\,\frac{E\left|\sum_{i=1}^{n} Y_i\right|^q}{a^{q-1}}
= \left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\frac{1}{a^{q-1}}\, E\left|\sum_{i=1}^{n} Y_i\right|^q. \tag{2.6}
\end{aligned}$$

Substituting (2.6) into (2.5), we have the result. □

The following lemma gives a moment inequality for the maximum partial sum of random variables.

Lemma 2.4. Let $\{Y_i, 1 \le i \le n\}$ and $\{Z_i, 1 \le i \le n\}$ be sequences of random variables. Then for any $q > 1$, $\epsilon > 0$, and $a > 0$,

$$E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(Y_i + Z_i)\right| - \epsilon a\right)^{+} \le \left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\frac{1}{a^{q-1}}\, E\max_{1\le k\le n}\left|\sum_{i=1}^{k} Y_i\right|^q + E\max_{1\le k\le n}\left|\sum_{i=1}^{k} Z_i\right|. \tag{2.7}$$

Proof. By Lemma 2.2,

$$E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(Y_i + Z_i)\right| - \epsilon a\right)^{+} \le E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k} Y_i\right| - \epsilon a\right)^{+} + E\max_{1\le k\le n}\left|\sum_{i=1}^{k} Z_i\right|. \tag{2.8}$$

The rest of the proof is similar to that of Lemma 2.3 and is omitted.
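For completeness, the omitted step can be sketched exactly as in Lemma 2.3: writing $M_n = \max_{1\le k\le n}|\sum_{i=1}^{k} Y_i|$ (a notation used only in this sketch), Markov's inequality gives

$$E(M_n - \epsilon a)^{+} = \int_0^{\infty} P(M_n - \epsilon a > t)\,dt \le a\,P(M_n > \epsilon a) + \int_a^{\infty} P(M_n > t)\,dt \le \left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\frac{E M_n^q}{a^{q-1}},$$

which, combined with (2.8), yields (2.7).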

Now we state and prove one of our main results. The following theorem gives a general method for obtaining complete moment convergence for sums of random variables satisfying (2.9). Condition (2.9) is the well-known Marcinkiewicz-Zygmund type inequality.

Theorem 2.5. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be an array of random variables with $E|X_{ni}| < \infty$ for $1 \le i \le n$, $n \ge 1$. Let $\{a_n, n \ge 1\}$ and $\{b_n, n \ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold.

(i) For some $1 < q \le 2$, there exists a positive constant $C_q$ depending only on $q$ such that

$$E\left|\sum_{i=1}^{n}(X'_{ni} - EX'_{ni})\right|^q \le C_q \sum_{i=1}^{n} E|X'_{ni}|^q \quad \forall n \ge 1, \tag{2.9}$$

where $X'_{ni} = X_{ni} I(|X_{ni}| \le a_n) + a_n I(X_{ni} > a_n) - a_n I(X_{ni} < -a_n)$.

(ii) $\sum_{n=1}^{\infty} b_n a_n^{-q} \sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) < \infty$.

(iii) $\sum_{n=1}^{\infty} b_n a_n^{-1} \sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty$.

Then

$$\sum_{n=1}^{\infty} \frac{b_n}{a_n}\, E\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+} < \infty \quad \forall \epsilon > 0. \tag{2.10}$$

Proof. Observe that

$$\begin{aligned}
E|X'_{ni}|^q &= E|X_{ni}|^q I(|X_{ni}| \le a_n) + a_n^q P(|X_{ni}| > a_n)\\
&\le E|X_{ni}|^q I(|X_{ni}| \le a_n) + a_n^{q-1} E|X_{ni}| I(|X_{ni}| > a_n),
\end{aligned} \tag{2.11}$$

$$\begin{aligned}
E|X_{ni} - X'_{ni}| &= E\big|(X_{ni} - a_n) I(X_{ni} > a_n) + (X_{ni} + a_n) I(X_{ni} < -a_n)\big|\\
&\le E|X_{ni}| I(|X_{ni}| > a_n).
\end{aligned} \tag{2.12}$$

Then we have by Lemma 2.3, (2.9), (2.11), and (2.12) that

$$\begin{aligned}
\sum_{n=1}^{\infty} \frac{b_n}{a_n}\, E\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+}
&\le \left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\, E\left|\sum_{i=1}^{n}(X'_{ni} - EX'_{ni})\right|^q\\
&\quad + \sum_{n=1}^{\infty} \frac{b_n}{a_n}\, E\left|\sum_{i=1}^{n}\big(X_{ni} - X'_{ni} - E(X_{ni} - X'_{ni})\big)\right|\\
&\le C_q\left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\sum_{i=1}^{n} E|X'_{ni}|^q + 2\sum_{n=1}^{\infty} \frac{b_n}{a_n}\sum_{i=1}^{n} E|X_{ni} - X'_{ni}|\\
&\le C_q\left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n)\\
&\quad + \left\{C_q\left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right) + 2\right\}\sum_{n=1}^{\infty} \frac{b_n}{a_n}\sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n).
\end{aligned} \tag{2.13}$$

The above two series converge by (ii) and (iii). Hence the result is proved. □

Remark 2.6. If (2.10) holds, then $\sum_{n=1}^{\infty} b_n P(|\sum_{i=1}^{n}(X_{ni} - EX_{ni})| > \epsilon a_n) < \infty$ for all $\epsilon > 0$, since

$$\begin{aligned}
E\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+}
&= \int_0^{\infty} P\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| - \epsilon a_n > t\right)dt\\
&\ge \int_0^{\epsilon a_n} P\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| > \epsilon a_n + t\right)dt\\
&\ge \epsilon a_n\, P\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| > 2\epsilon a_n\right).
\end{aligned} \tag{2.14}$$

Hence complete moment convergence is more general than complete convergence.

When $q > 2$, we have the following theorem. Condition (2.15) is the well-known Rosenthal type inequality.

Theorem 2.7. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be an array of random variables with $E|X_{ni}| < \infty$ for $1 \le i \le n$, $n \ge 1$. Let $\{a_n, n \ge 1\}$ and $\{b_n, n \ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold.

(i) For some $q > 2$, there exists a positive constant $C_q$ depending only on $q$ such that

$$E\left|\sum_{i=1}^{n}(X'_{ni} - EX'_{ni})\right|^q \le C_q\left\{\sum_{i=1}^{n} E|X'_{ni}|^q + \left(\sum_{i=1}^{n} E|X'_{ni}|^2\right)^{q/2}\right\} \quad \forall n \ge 1, \tag{2.15}$$

where $X'_{ni} = X_{ni} I(|X_{ni}| \le a_n) + a_n I(X_{ni} > a_n) - a_n I(X_{ni} < -a_n)$.

(ii) $\sum_{n=1}^{\infty} b_n a_n^{-q} \sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) < \infty$.

(iii) $\sum_{n=1}^{\infty} b_n a_n^{-1} \sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty$.

(iv) $\sum_{n=1}^{\infty} b_n \left(\sum_{i=1}^{n} E|X_{ni}|^r / a_n^r\right)^{q/2} < \infty$ for some $0 < r < 2$.

Then (2.10) holds.

Proof. The proof is the same as that of Theorem 2.5 except that

$$\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\left(\sum_{i=1}^{n} E|X'_{ni}|^2\right)^{q/2}
= \sum_{n=1}^{\infty} b_n\left(\sum_{i=1}^{n} \frac{E|X'_{ni}|^2}{a_n^2}\right)^{q/2}
\le \sum_{n=1}^{\infty} b_n\left(\sum_{i=1}^{n} \frac{E|X'_{ni}|^r}{a_n^r}\right)^{q/2}
\le \sum_{n=1}^{\infty} b_n\left(\sum_{i=1}^{n} \frac{E|X_{ni}|^r}{a_n^r}\right)^{q/2} < \infty, \tag{2.16}$$

since $|X'_{ni}| \le \min\{|X_{ni}|, a_n\}$ and $0 < r < 2$. □

Corollary 2.8. Let $\{a_n, n \ge 1\}$ be a sequence of positive real numbers. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be an array of random variables satisfying (2.15) for some $q > 2$. Suppose that the following conditions hold.

(i) $\sum_{n=1}^{\infty} a_n^{-q} \sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) < \infty$.

(ii) $\sum_{n=1}^{\infty} a_n^{-1} \sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty$.

(iii) $\sum_{n=1}^{\infty} \left(\sum_{i=1}^{n} E|X_{ni}|^r / a_n^r\right)^{s} < \infty$ for some $0 < r < 2$ and $0 < s \le q/2$.

Then

$$\sum_{n=1}^{\infty} \frac{1}{a_n}\, E\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+} < \infty \quad \forall \epsilon > 0, \tag{2.17}$$

and hence,

$$\sum_{n=1}^{\infty} P\left(\left|\sum_{i=1}^{n}(X_{ni} - EX_{ni})\right| > \epsilon a_n\right) < \infty \quad \forall \epsilon > 0. \tag{2.18}$$

Proof. By Remark 2.6, (2.17) implies (2.18). To prove (2.17), we apply Theorem 2.7 with $b_n = 1$. Since $0 < s \le q/2$, and since $\sum_{n} d_n^{u} \le (\sum_{n} d_n)^{u}$ for any $d_n \ge 0$ and $u \ge 1$,

$$\sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} \frac{E|X_{ni}|^r}{a_n^r}\right)^{q/2} \le \left\{\sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} \frac{E|X_{ni}|^r}{a_n^r}\right)^{s}\right\}^{q/(2s)} < \infty. \tag{2.19}$$

Hence the result follows by Theorem 2.7. □

The following theorem gives a general method for obtaining complete moment convergence for maximum partial sums of random variables satisfying condition (2.20).

Theorem 2.9. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be an array of random variables with $E|X_{ni}| < \infty$ for $1 \le i \le n$, $n \ge 1$. Let $\{a_n, n \ge 1\}$ and $\{b_n, n \ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold.

(i) For some $1 < q \le 2$, there exists a positive constant $C_q$ depending only on $q$ such that

$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X'_{ni} - EX'_{ni})\right|^q \le C_q \sum_{i=1}^{n} E|X'_{ni}|^q \quad \forall n \ge 1, \tag{2.20}$$

where $X'_{ni} = X_{ni} I(|X_{ni}| \le a_n) + a_n I(X_{ni} > a_n) - a_n I(X_{ni} < -a_n)$.

(ii) $\sum_{n=1}^{\infty} b_n a_n^{-q} \sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) < \infty$.

(iii) $\sum_{n=1}^{\infty} b_n a_n^{-1} \sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty$.

Then

$$\sum_{n=1}^{\infty} \frac{b_n}{a_n}\, E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+} < \infty \quad \forall \epsilon > 0. \tag{2.21}$$

Proof. The proof is similar to that of Theorem 2.5. We have by Lemma 2.4, (2.20), (ii), and (iii) that

$$\begin{aligned}
\sum_{n=1}^{\infty} \frac{b_n}{a_n}\, E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+}
&\le \left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\, E\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X'_{ni} - EX'_{ni})\right|^q\\
&\quad + \sum_{n=1}^{\infty} \frac{b_n}{a_n}\, E\max_{1\le k\le n}\left|\sum_{i=1}^{k}\big(X_{ni} - X'_{ni} - E(X_{ni} - X'_{ni})\big)\right|\\
&\le C_q\left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\sum_{i=1}^{n} E|X'_{ni}|^q + 2\sum_{n=1}^{\infty} \frac{b_n}{a_n}\sum_{i=1}^{n} E|X_{ni} - X'_{ni}|\\
&\le C_q\left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n)\\
&\quad + \left\{C_q\left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right) + 2\right\}\sum_{n=1}^{\infty} \frac{b_n}{a_n}\sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty.
\end{aligned} \tag{2.22}$$

Hence the result is proved.

Remark 2.10. If (2.21) holds, then $\sum_{n=1}^{\infty} b_n P(\max_{1\le k\le n}|\sum_{i=1}^{k}(X_{ni} - EX_{ni})| > \epsilon a_n) < \infty$ for all $\epsilon > 0$, since, as in Remark 2.6,

$$E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+} \ge \epsilon a_n\, P\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_{ni} - EX_{ni})\right| > 2\epsilon a_n\right). \tag{2.23}$$

When q > 2, we have the following theorem.

Theorem 2.11. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be an array of random variables with $E|X_{ni}| < \infty$ for $1 \le i \le n$, $n \ge 1$. Let $\{a_n, n \ge 1\}$ and $\{b_n, n \ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold.

(i) For some $q > 2$, there exists a positive constant $C_q$ depending only on $q$ such that

$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X'_{ni} - EX'_{ni})\right|^q \le C_q\left\{\sum_{i=1}^{n} E|X'_{ni}|^q + \left(\sum_{i=1}^{n} E|X'_{ni}|^2\right)^{q/2}\right\} \quad \forall n \ge 1, \tag{2.24}$$

where $X'_{ni} = X_{ni} I(|X_{ni}| \le a_n) + a_n I(X_{ni} > a_n) - a_n I(X_{ni} < -a_n)$.

(ii) $\sum_{n=1}^{\infty} b_n a_n^{-q} \sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) < \infty$.

(iii) $\sum_{n=1}^{\infty} b_n a_n^{-1} \sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty$.

(iv) $\sum_{n=1}^{\infty} b_n \left(\sum_{i=1}^{n} E|X_{ni}|^r / a_n^r\right)^{q/2} < \infty$ for some $0 < r < 2$.

Then (2.21) holds.

Proof. The proof is similar to that of Theorem 2.9 and is omitted. □
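For orientation, a sketch of the omitted argument: by Lemma 2.4 and (2.24),

$$\begin{aligned}
\sum_{n=1}^{\infty} \frac{b_n}{a_n}\, E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+}
&\le C_q\left(\frac{1}{\epsilon^q} + \frac{1}{q-1}\right)\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\left\{\sum_{i=1}^{n} E|X'_{ni}|^q + \left(\sum_{i=1}^{n} E|X'_{ni}|^2\right)^{q/2}\right\}\\
&\quad + 2\sum_{n=1}^{\infty} \frac{b_n}{a_n}\sum_{i=1}^{n} E|X_{ni} - X'_{ni}|,
\end{aligned}$$

and each of the resulting series is finite by (2.11), (2.12), and (2.16) together with conditions (ii)-(iv), exactly as in the proofs of Theorems 2.7 and 2.9.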

Corollary 2.12. Let $\{a_n, n \ge 1\}$ be a sequence of positive real numbers. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be an array of random variables satisfying (2.24) for some $q > 2$. Suppose that the following conditions hold.

(i) $\sum_{n=1}^{\infty} a_n^{-q} \sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) < \infty$.

(ii) $\sum_{n=1}^{\infty} a_n^{-1} \sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty$.

(iii) $\sum_{n=1}^{\infty} \left(\sum_{i=1}^{n} E|X_{ni}|^r / a_n^r\right)^{s} < \infty$ for some $0 < r < 2$ and $0 < s \le q/2$.

Then

$$\sum_{n=1}^{\infty} \frac{1}{a_n}\, E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_{ni} - EX_{ni})\right| - \epsilon a_n\right)^{+} < \infty \quad \forall \epsilon > 0, \tag{2.25}$$

and hence,

$$\sum_{n=1}^{\infty} P\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_{ni} - EX_{ni})\right| > \epsilon a_n\right) < \infty \quad \forall \epsilon > 0. \tag{2.26}$$

Proof. By Remark 2.10, (2.25) implies (2.26). As in the proof of Corollary 2.8,

$$\sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} \frac{E|X_{ni}|^r}{a_n^r}\right)^{q/2} \le \left\{\sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} \frac{E|X_{ni}|^r}{a_n^r}\right)^{s}\right\}^{q/(2s)} < \infty. \tag{2.27}$$

Hence the result follows by Theorem 2.11 with $b_n = 1$. □

3. Corollaries

In this section, we establish some complete moment convergence results by using the results obtained in the previous section.

Throughout this section, let $\{\Psi_n(t), n \ge 1\}$ be a sequence of positive even functions satisfying

$$\frac{\Psi_n(|t|)}{|t|} \uparrow \quad \text{and} \quad \frac{\Psi_n(|t|)}{|t|^p} \downarrow \quad \text{as } |t| \uparrow \tag{3.1}$$

for some $p > 1$.

To obtain complete moment convergence results, the following lemmas are needed.

Lemma 3.1. Let $X$ be a random variable and $\{\Psi_n(t), n \ge 1\}$ a sequence of positive even functions satisfying (3.1) for some $p > 1$. Then for all $a > 0$ and $n \ge 1$, the following statements hold:

(i) if $q \ge p$, then $E|X|^q I(|X| \le a)/a^q \le E\Psi_n(|X|)/\Psi_n(a)$;

(ii) $E|X| I(|X| > a)/a \le E\Psi_n(|X|)/\Psi_n(a)$.

Proof. First note by $\Psi_n(|t|)/|t| \uparrow$ that $\Psi_n(|t|) \uparrow$, and $\Psi_n(|t|)/|t|^p \downarrow$ implies $\Psi_n(|t|)/|t|^q \downarrow$ for $q \ge p$, so that $|t|^q/\Psi_n(|t|)$ is an increasing function. If $q \ge p$, then

$$\frac{\Psi_n(|X|)}{\Psi_n(a)} \ge \frac{\Psi_n(|X| I(|X| \le a))}{\Psi_n(a)} \ge \frac{|X|^q I(|X| \le a)}{a^q}. \tag{3.2}$$

Taking expectations, (i) holds. Since $\Psi_n(|t|)/|t| \uparrow$,

$$\frac{\Psi_n(|X|)}{\Psi_n(a)} \ge \frac{\Psi_n(|X| I(|X| > a))}{\Psi_n(a)} \ge \frac{|X| I(|X| > a)}{a}. \tag{3.3}$$

Taking expectations, (ii) holds. □

Lemma 3.2. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be an array of random variables with $E|X_{ni}| < \infty$ for $1 \le i \le n$, $n \ge 1$. Let $\{a_n, n \ge 1\}$ and $\{b_n, n \ge 1\}$ be sequences of positive real numbers. Assume that $\{\Psi_n(t), n \ge 1\}$ is a sequence of positive even functions satisfying (3.1) for some $p > 1$ and

$$\sum_{n=1}^{\infty} b_n \sum_{i=1}^{n} \frac{E\Psi_i(|X_{ni}|)}{\Psi_i(a_n)} < \infty. \tag{3.4}$$

Then the following statements hold:

(i) if $q \ge p$, then $\sum_{n=1}^{\infty} b_n a_n^{-q} \sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) < \infty$;

(ii) $\sum_{n=1}^{\infty} b_n a_n^{-1} \sum_{i=1}^{n} E|X_{ni}| I(|X_{ni}| > a_n) < \infty$.

Proof. The result follows from Lemma 3.1. □
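In more detail, applying Lemma 3.1 with $X = X_{ni}$, $a = a_n$, and the function $\Psi_i$, and then summing, one obtains for $q \ge p$

$$\sum_{n=1}^{\infty} \frac{b_n}{a_n^q}\sum_{i=1}^{n} E|X_{ni}|^q I(|X_{ni}| \le a_n) \le \sum_{n=1}^{\infty} b_n \sum_{i=1}^{n} \frac{E\Psi_i(|X_{ni}|)}{\Psi_i(a_n)} < \infty$$

by (3.4), and (ii) follows in the same way from Lemma 3.1(ii).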

By using Lemma 3.2, we can obtain Corollaries 3.3, 3.4, 3.5, 3.6 from Theorem 2.5, Corollary 2.8, Theorem 2.9, Corollary 2.12, respectively.

Corollary 3.3. Let $\{a_n, n \ge 1\}$ and $\{b_n, n \ge 1\}$ be sequences of positive real numbers and $\{\Psi_n(t), n \ge 1\}$ a sequence of positive even functions satisfying (3.1) for some $1 < p \le 2$. Assume that $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ is an array of random variables satisfying (2.9) for $q = p$ and (3.4). Then (2.10) holds.

Corollary 3.4. Let $\{a_n, n \ge 1\}$ be a sequence of positive real numbers and $\{\Psi_n(t), n \ge 1\}$ a sequence of positive even functions satisfying (3.1) for some $p > 2$. Assume that $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ is an array of random variables satisfying (2.15) for some $q \ge \max\{p, 2s\}$ ($s$ is the same as in (3.6)),

$$\sum_{n=1}^{\infty} \sum_{i=1}^{n} \frac{E\Psi_i(|X_{ni}|)}{\Psi_i(a_n)} < \infty, \tag{3.5}$$

$$\sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} \frac{E|X_{ni}|^r}{a_n^r}\right)^{s} < \infty \quad \text{for some } 0 < r < 2, \; s > 0. \tag{3.6}$$

Then (2.17) holds and hence, (2.18) holds.

Corollary 3.5. Let $\{a_n, n \ge 1\}$ and $\{b_n, n \ge 1\}$ be sequences of positive real numbers and $\{\Psi_n(t), n \ge 1\}$ a sequence of positive even functions satisfying (3.1) for some $1 < p \le 2$. Assume that $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ is an array of random variables satisfying (2.20) for $q = p$ and (3.4). Then (2.21) holds.

Corollary 3.6. Let $\{a_n, n \ge 1\}$ be a sequence of positive real numbers and $\{\Psi_n(t), n \ge 1\}$ a sequence of positive even functions satisfying (3.1) for some $p > 2$. Assume that $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ is an array of random variables satisfying (2.24) for some $q > \max\{p, 2s\}$ ($s$ is the same as in (3.6)), (3.5), and (3.6). Then (2.25) holds and hence, (2.26) holds.

Remark 3.7. Marcinkiewicz-Zygmund and Rosenthal (type) inequalities hold for dependent random variables as well as independent random variables.

(1) For an array $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ of rowwise negatively associated random variables, condition (2.20) holds if $1 < q \le 2$, and (2.24) holds if $q > 2$, by Shao's [3] results. Note that $\{X'_{ni}, 1 \le i \le n, n \ge 1\}$ is still an array of rowwise negatively associated random variables. Hence Corollaries 3.3-3.6 hold for arrays of rowwise negatively associated random variables.

(2) For an array $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ of rowwise negatively orthant dependent random variables, condition (2.9) holds if $1 < q \le 2$, and (2.15) holds if $q > 2$, by the results of Asadian et al. [4]. Hence Corollaries 3.3 and 3.4 hold for arrays of rowwise negatively orthant dependent random variables. These results were also proved by Wu and Zhu [13]. Hence Corollaries 3.3 and 3.4 extend the results of Wu and Zhu [13] from an array of negatively orthant dependent random variables to an array of random variables satisfying (2.9) and (2.15).

(3) For an array $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ of rowwise $\rho^{*}$-mixing random variables, condition (2.24) does not necessarily hold if $q > 2$. As mentioned in Section 1, Utev and Peligrad [7] proved (1.4) for $\rho^{*}$-mixing random variables; however, the constant $D_q$ depends on both $q$ and the sequence of $\rho^{*}$-mixing random variables. Hence condition (2.24) holds for an array of rowwise $\rho^{*}$-mixing random variables under the additional condition that the constants $D_q$, which depend on the sequence of random variables in each row, are bounded. So Corollary 3.6 holds for arrays of rowwise $\rho^{*}$-mixing random variables satisfying this additional condition. Zhu [12] obtained only (2.26) in Corollary 3.6 when the array consists of rowwise $\rho^{*}$-mixing random variables satisfying the additional condition. This additional condition should be added in Zhu [12]. Hence Corollary 3.6 generalizes and extends Zhu's [12] result from $\rho^{*}$-mixing random variables to more general random variables.

Finally, we apply the complete moment convergence results obtained in the previous section to a sequence of identically distributed random variables.

Corollary 3.8. Let $\{X_n, n \ge 1\}$ be a sequence of identically distributed random variables with $E|X_1|^{pt} < \infty$ for some $1 \le p < 2$ and $t \ge 1$. Assume that for any $q > 2$, there exists a positive constant $C_q$ depending only on $q$ such that

$$E\left|\sum_{i=1}^{n}(X^{*}_{ni} - EX^{*}_{ni})\right|^q \le C_q\left\{\sum_{i=1}^{n} E|X^{*}_{ni}|^q + \left(\sum_{i=1}^{n} E|X^{*}_{ni}|^2\right)^{q/2}\right\}, \tag{3.7}$$

where $X^{*}_{ni} = X_i I(|X_i| \le n^{1/p}) + n^{1/p} I(X_i > n^{1/p}) - n^{1/p} I(X_i < -n^{1/p})$. Then (1.7) holds.

Proof. Let $X_{ni} = X_i$ for $1 \le i \le n$, $n \ge 1$. We apply Theorem 2.7 with $a_n = n^{1/p}$ and $b_n = n^{t-2}$. Take $r$ and $q > 2$ such that $p < r < \min\{2, pt\}$, $q/p - t > 0$, and $(r/p - 1)(q/2) - t + 1 > 0$. Then it is easy to see that

$$\begin{gathered}
\sum_{n=1}^{\infty} n^{t-2-q/p}\sum_{i=1}^{n} E|X_i|^q I(|X_i| \le n^{1/p}) < \infty, \qquad
\sum_{n=1}^{\infty} n^{t-2-1/p}\sum_{i=1}^{n} E|X_i| I(|X_i| > n^{1/p}) < \infty,\\
\sum_{n=1}^{\infty} n^{t-2}\left(\sum_{i=1}^{n} \frac{E|X_i|^r}{n^{r/p}}\right)^{q/2} < \infty.
\end{gathered} \tag{3.8}$$

Hence the result follows from Theorem 2.7. □
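As an illustration of how the stated choice of $r$ and $q$ is used, the third series in (3.8) can be checked directly: since the $X_i$ are identically distributed and $E|X_1|^r < \infty$ (because $r < pt$),

$$\sum_{n=1}^{\infty} n^{t-2}\left(\sum_{i=1}^{n} \frac{E|X_i|^r}{n^{r/p}}\right)^{q/2} = \left(E|X_1|^r\right)^{q/2}\sum_{n=1}^{\infty} n^{t-2-(r/p-1)(q/2)} < \infty,$$

because $(r/p - 1)(q/2) - t + 1 > 0$. The first two series in (3.8) are verified by the standard truncation argument using $E|X_1|^{pt} < \infty$ and $q/p - t > 0$.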

Corollary 3.9. Let $\{X_n, n \ge 1\}$ be a sequence of identically distributed random variables with $E|X_1|^{pt} < \infty$ for some $1 \le p < 2$ and $t \ge 1$. Assume that for any $q > 2$, there exists a positive constant $C_q$ depending only on $q$ such that

$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X^{*}_{ni} - EX^{*}_{ni})\right|^q \le C_q\left\{\sum_{i=1}^{n} E|X^{*}_{ni}|^q + \left(\sum_{i=1}^{n} E|X^{*}_{ni}|^2\right)^{q/2}\right\}, \tag{3.9}$$

where $X^{*}_{ni} = X_i I(|X_i| \le n^{1/p}) + n^{1/p} I(X_i > n^{1/p}) - n^{1/p} I(X_i < -n^{1/p})$. Then

$$\sum_{n=1}^{\infty} n^{t-2-1/p}\, E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k}(X_i - EX_i)\right| - \epsilon n^{1/p}\right)^{+} < \infty \quad \forall \epsilon > 0. \tag{3.10}$$

Proof. As in the proof of Corollary 3.8, the conditions in (3.8) are satisfied. So the result follows from Theorem 2.11. □

Remark 3.10. If $\{X_n, n \ge 1\}$ is a sequence of i.i.d. random variables, then conditions (3.7) and (3.9) are satisfied when $q > 2$. Hence Corollaries 3.8 and 3.9 generalize and extend the result of Chow [11]. There are many sequences of dependent random variables satisfying (3.7) for all $q > 2$. Examples include sequences of negatively orthant dependent random variables, negatively associated random variables, $\rho^{*}$-mixing random variables, $\phi$-mixing identically distributed random variables satisfying $\sum_{n=1}^{\infty} \phi^{1/2}(2^n) < \infty$, and $\rho$-mixing identically distributed random variables satisfying $\sum_{n=1}^{\infty} \rho^{2/q}(2^n) < \infty$. The above sequences of dependent random variables, except negatively orthant dependent random variables, also satisfy (3.9) when $q > 2$. Hence Corollaries 3.8 and 3.9 hold for many dependent random variables as well as independent random variables.

Acknowledgments

The author would like to thank the referees for the helpful comments and suggestions. This work was supported by the Korea Science and Engineering Foundation (KOSEF) grant funded by the Korea government (MOST) (no. R01-2007-000-20053-0).

References

[1] J. Marcinkiewicz and A. Zygmund, "Sur les fonctions indépendantes," Fundamenta Mathematicae, vol. 29, pp. 60-90, 1937.

[2] H. P. Rosenthal, "On the subspaces of L^p (p > 2) spanned by sequences of independent random variables," Israel Journal of Mathematics, vol. 8, pp. 273-303, 1970.

[3] Q.-M. Shao, "A comparison theorem on moment inequalities between negatively associated and independent random variables," Journal of Theoretical Probability, vol. 13, no. 2, pp. 343-356, 2000.

[4] N. Asadian, V. Fakoor, and A. Bozorgnia, "Rosenthal's type inequalities for negatively orthant dependent random variables," Journal of the Iranian Statistical Society, vol. 5, pp. 69-75, 2006.

[5] Q.-M. Shao, "A moment inequality and its applications," Acta Mathematica Sinica, vol. 31, no. 6, pp. 736-747, 1988 (Chinese).

[6] Q.-M. Shao, "Maximal inequalities for partial sums of ρ-mixing sequences," The Annals of Probability, vol. 23, no. 2, pp. 948-965, 1995.

[7] S. Utev and M. Peligrad, "Maximal inequalities and an invariance principle for a class of weakly dependent random variables," Journal of Theoretical Probability, vol. 16, no. 1, pp. 101-115, 2003.

[8] P. L. Hsu and H. Robbins, "Complete convergence and the law of large numbers," Proceedings of the National Academy of Sciences of the United States of America, vol. 33, pp. 25-31, 1947.

[9] P. Erdős, "On a theorem of Hsu and Robbins," Annals of Mathematical Statistics, vol. 20, pp. 286-291, 1949.

[10] L. E. Baum and M. Katz, "Convergence rates in the law of large numbers," Transactions of the American Mathematical Society, vol. 120, pp. 108-123, 1965.

[11] Y. S. Chow, "On the rate of moment convergence of sample sums and extremes," Bulletin of the Institute of Mathematics, Academia Sinica, vol. 16, no. 3, pp. 177-201, 1988.

[12] M.-H. Zhu, "Strong laws of large numbers for arrays of rowwise ρ*-mixing random variables," Discrete Dynamics in Nature and Society, vol. 2007, Article ID 74296, 6 pages, 2007.

[13] Y.-F. Wu and D.-J. Zhu, "Convergence properties of partial sums for arrays of rowwise negatively orthant dependent random variables," Journal of the Korean Statistical Society. In press.
