Hindawi Publishing Corporation, Journal of Probability and Statistics, Volume 2011, Article ID 708087, 10 pages, doi:10.1155/2011/708087

Research Article

Strong Laws of Large Numbers for Arrays of Rowwise NA and LNQD Random Variables

Jiangfeng Wang and Qunying Wu

College of Science, Guilin University of Technology, Guilin 541004, China

Correspondence should be addressed to Jiangfeng Wang, yuanzhang987@163.com and Qunying Wu, wqy666@glite.edu.cn

Received 25 May 2011; Revised 21 October 2011; Accepted 23 October 2011

Academic Editor: Man Lai Tang

Copyright © 2011 J. Wang and Q. Wu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Some strong laws of large numbers and strong convergence properties for arrays of rowwise negatively associated and linearly negative quadrant dependent random variables are obtained. The results obtained not only generalize the result of Hu and Taylor to negatively associated and linearly negative quadrant dependent random variables, but also improve it.

1. Introduction

Let $\{X_n\}_{n\in\mathbb{N}}$ be a sequence of independent and identically distributed random variables. The Marcinkiewicz-Zygmund strong law of large numbers (SLLN) states that

$$\frac{1}{n^{1/\alpha}}\sum_{i=1}^{n}(X_i - EX_1) \to 0 \quad \text{a.s. for } 1 \le \alpha < 2,$$

$$\frac{1}{n^{1/\alpha}}\sum_{i=1}^{n}X_i \to 0 \quad \text{a.s. for } 0 < \alpha < 1, \text{ as } n \to \infty, \tag{1.1}$$

if and only if $E|X_1|^{\alpha} < \infty$. The case $\alpha = 1$ is due to Kolmogorov. In the independent (but not necessarily identically distributed) case, Hu and Taylor [1] proved the following strong law of large numbers.
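As a quick numerical illustration (ours, not part of the original argument), the following sketch simulates the normalized sums $n^{-1/\alpha}\sum_{i=1}^{n}(X_i - EX_i)$ for centered exponential variables, which have finite moments of every order, so the Marcinkiewicz-Zygmund SLLN applies for any $1 \le \alpha < 2$:

```python
import random

def mz_scaled_sum(n, alpha, rng):
    # X_i = Exp(1) - 1: mean zero with E|X_i|^alpha < infinity for every
    # alpha, so n^{-1/alpha} * sum(X_i - EX_i) should tend to 0 a.s.
    total = sum(rng.expovariate(1.0) - 1.0 for _ in range(n))
    return total / n ** (1.0 / alpha)

rng = random.Random(0)
for n in (10**3, 10**4, 10**5):
    print(n, mz_scaled_sum(n, 1.5, rng))
```

The printed values should drift toward zero as $n$ grows; taking $\alpha$ closer to $2$ visibly slows the convergence, reflecting the tightening moment condition $E|X|^{\alpha} < \infty$.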

Theorem 1.1. Let $\{X_{ni}, 1 \le i \le n, n \ge 1\}$ be a triangular array of rowwise independent random variables. Let $\{a_n\}_{n\in\mathbb{N}}$ be a sequence of positive real numbers such that $0 < a_n \uparrow \infty$. Let $\psi(t)$ be a positive, even function such that $\psi(t)/|t|^{p}$ is an increasing function of $|t|$ and $\psi(t)/|t|^{p+1}$ is a decreasing function of $|t|$, respectively, that is,

$$\frac{\psi(t)}{|t|^{p}} \uparrow, \qquad \frac{\psi(t)}{|t|^{p+1}} \downarrow, \quad \text{as } |t| \uparrow, \tag{1.2}$$

for some positive integer $p$. If $p \ge 2$ and

$$EX_{ni} = 0, \qquad \sum_{n=1}^{\infty}\sum_{i=1}^{n} E\frac{\psi(|X_{ni}|)}{\psi(a_n)} < \infty, \qquad \sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} E\left(\frac{X_{ni}}{a_n}\right)^{2}\right)^{2k} < \infty, \tag{1.3}$$

where $k$ is a positive integer, then

$$\frac{1}{a_n}\sum_{i=1}^{n} X_{ni} \to 0 \quad \text{a.s.} \tag{1.4}$$

Definition 1.2 (cf. [2]). A finite family of random variables $\{X_i, 1 \le i \le n\}$ is said to be negatively associated (NA, in short) if, for any disjoint subsets $A$ and $B$ of $\{1,2,\ldots,n\}$ and any real coordinatewise nondecreasing functions $f$ on $\mathbb{R}^{A}$ and $g$ on $\mathbb{R}^{B}$,

$$\operatorname{Cov}\bigl(f(X_i, i \in A),\, g(X_j, j \in B)\bigr) \le 0, \tag{1.5}$$

whenever the covariance exists. An infinite family of random variables is NA if every finite subfamily is NA. This concept was introduced by Joag-Dev and Proschan [2].
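For intuition, a standard concrete instance (a sanity-check sketch of ours): the coordinates of a uniformly random permutation of fixed numbers form an NA family, so coordinatewise nondecreasing $f$ and $g$ acting on disjoint index sets must have nonpositive covariance. Exhaustive enumeration over all permutations checks this exactly:

```python
from itertools import permutations

def perm_cov(f, g, idx_a, idx_b, values=(1, 2, 3, 4)):
    # X = uniformly random permutation of `values`; permutation
    # distributions are a classical NA example (Joag-Dev and Proschan).
    perms = list(permutations(values))
    fa = [f(*(p[i] for i in idx_a)) for p in perms]
    gb = [g(*(p[i] for i in idx_b)) for p in perms]
    m = len(perms)
    return sum(a * b for a, b in zip(fa, gb)) / m - (sum(fa) / m) * (sum(gb) / m)

# f, g coordinatewise nondecreasing on the disjoint index sets {0,1} and {2,3}
c = perm_cov(lambda x, y: x + y, lambda u, v: max(u, v), (0, 1), (2, 3))
assert c <= 0  # (1.5): Cov(f, g) <= 0 under negative association
print(c)       # strictly negative in this example
```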

Definition 1.3 (cf. [3, 4]). Two random variables $X$ and $Y$ are said to be negative quadrant dependent (NQD, in short) if, for any $x, y \in \mathbb{R}$,

$$P(X \le x, Y \le y) \le P(X \le x)P(Y \le y). \tag{1.6}$$

A sequence $\{X_n\}_{n\in\mathbb{N}}$ of random variables is said to be pairwise NQD if $X_i$ and $X_j$ are NQD for all $i, j \in \mathbb{N}^{+}$ with $i \ne j$.

Definition 1.4 (cf. [5]). A sequence $\{X_n\}_{n\in\mathbb{N}}$ of random variables is said to be linearly negative quadrant dependent (LNQD, in short) if, for any disjoint finite subsets $A, B \subset \mathbb{Z}^{+}$ and positive reals $r_j$,

$$\sum_{k\in A} r_k X_k \quad \text{and} \quad \sum_{j\in B} r_j X_j \quad \text{are NQD.} \tag{1.7}$$

Remark 1.5. It is easily seen that if {Xn}neN is a sequence of LNQD random variables, then {aXn + b}neN is still a sequence of LNQD random variables, where a and b are real numbers.

The NA property has attracted wide interest because of its numerous applications in reliability theory, percolation theory, and multivariate statistical analysis. Over the past decades, much effort has been devoted to limit theorems for NA random variables. A Kolmogorov-type strong law of large numbers for NA random variables, matching the i.i.d. case, was established by Matula [6]; a Marcinkiewicz-type strong law of large numbers for identically distributed NA sequences was obtained by Su and Wang [7]; and Yang et al. [8] gave a general method for strong laws of large numbers.

The concept of an LNQD sequence was introduced by Newman [5], and several applications have since been found. For example, Newman [5] established the central limit theorem for a strictly stationary LNQD process, Wang and Zhang [9] provided uniform rates of convergence in the central limit theorem for LNQD sequences, Ko et al. [10] obtained a Hoeffding-type inequality for LNQD sequences, and Ko et al. [11] studied the strong convergence of weighted sums of LNQD arrays.

The aim of this paper is to establish strong laws of large numbers for arrays of NA and LNQD random variables. The results obtained not only extend Theorem 1.1 above from independent arrays to NA and LNQD arrays, but also improve it.

Lemma 1.6 (cf. [12]). Let $\{X_n, n \ge 1\}$ be NA random variables with $EX_n = 0$, $E|X_n|^{q} < \infty$, $n \ge 1$, $q \ge 2$. Then there exists a positive constant $c$ such that

$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k} X_i\right|^{q} \le c\left(\sum_{i=1}^{n} E|X_i|^{q} + \left(\sum_{i=1}^{n} EX_i^{2}\right)^{q/2}\right), \quad \forall n \ge 1. \tag{1.8}$$

Throughout, $c$ denotes a positive constant which is not necessarily the same at each appearance.

Lemma 1.7 (cf. [3, 4]). Let the random variables $X$ and $Y$ be NQD. Then

(1) EXY < EXEY;

(2) P(X <x, Y <y) < P(X< x)P(Y < y);

(3) if $f$ and $g$ are both nondecreasing (or both nonincreasing) functions, then $f(X)$ and $g(Y)$ are NQD.
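Properties (1) and (3) can be checked exactly on the antithetic NQD pair $X = U$, $Y = 1 - U$ with $U$ uniform on $(0,1)$ (a sketch of ours; the required moments are Beta integrals):

```python
from fractions import Fraction
from math import factorial

def beta_moment(a, b):
    # E[U^a (1 - U)^b] = a! b! / (a + b + 1)!  (a Beta integral)
    return Fraction(factorial(a) * factorial(b), factorial(a + b + 1))

# Property (1): E[XY] <= E[X] E[Y], i.e. 1/6 <= 1/4.
assert beta_moment(1, 1) <= Fraction(1, 2) * Fraction(1, 2)

# Property (3): f(t) = t^3 is nondecreasing, so U^3 and (1 - U)^3 are
# again NQD; property (1) for this pair reads
# E[U^3 (1-U)^3] <= E[U^3] E[(1-U)^3], i.e. 1/140 <= 1/16.
assert beta_moment(3, 3) <= Fraction(1, 4) * Fraction(1, 4)
```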


Lemma 1.8. Let $\{X_n, n \ge 1\}$ be a sequence of LNQD random variables with mean zero and $0 < B_n = \sum_{k=1}^{n} EX_k^{2}$, and set $S_n = \sum_{k=1}^{n} X_k$. Then

$$P(|S_n| \ge x) \le \sum_{k=1}^{n} P\bigl(|X_k| \ge y\bigr) + 2\exp\left(\frac{x}{y} - \frac{x}{y}\log\left(1 + \frac{xy}{B_n}\right)\right), \tag{1.10}$$

for any $x > 0$, $y > 0$.

This lemma is easily proved by following Fuk and Nagaev [13]. Here, we omit the details of the proof.
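To make the bound (1.10) concrete, the sketch below evaluates its right-hand side for independent Uniform$[-1,1]$ summands (independence is a trivial case of LNQD) and compares it with a Monte Carlo estimate of the left-hand side; the parameter choices are ours for illustration:

```python
import math
import random

def fuk_nagaev_bound(x, y, bn, tail_sum):
    # Right-hand side of (1.10)
    r = x / y
    return tail_sum + 2.0 * math.exp(r - r * math.log(1.0 + x * y / bn))

# X_k ~ Uniform[-1, 1]: independent (hence trivially LNQD), EX_k = 0, EX_k^2 = 1/3
n, x, y = 30, 25.0, 1.5
bn = n / 3.0
tail_sum = 0.0  # P(|X_k| >= 1.5) = 0 for Uniform[-1, 1]
bound = fuk_nagaev_bound(x, y, bn, tail_sum)

rng = random.Random(1)
trials = 5000
hits = sum(abs(sum(rng.uniform(-1.0, 1.0) for _ in range(n))) >= x
           for _ in range(trials))
emp = hits / trials
assert emp <= bound < 1.0  # here the bound is nontrivial and not violated
print(bound, emp)
```

With these parameters $x/y$ is moderate while $xy/B_n$ is large, so the exponential term is far below 1; the empirical tail probability is (unsurprisingly) even smaller.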

2. Main Results

Theorem 2.1. Let $\{X_{ni}; i \ge 1, n \ge 1\}$ be an array of rowwise NA random variables. Let $\{a_n\}_{n\in\mathbb{N}}$ be a sequence of positive real numbers such that $0 < a_n \uparrow \infty$. Let $\psi(t)$ be a positive, even function such that $\psi(t)/|t|$ is an increasing function of $|t|$ and $\psi(t)/|t|^{p}$ is a decreasing function of $|t|$, respectively, that is,

$$\frac{\psi(t)}{|t|} \uparrow, \qquad \frac{\psi(t)}{|t|^{p}} \downarrow, \quad \text{as } |t| \uparrow, \tag{2.1}$$

for some positive integer $p$. If $p \ge 2$ and

$$EX_{ni} = 0, \qquad \sum_{n=1}^{\infty}\sum_{i=1}^{n} E\frac{\psi(|X_{ni}|)}{\psi(a_n)} < \infty, \qquad \sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} E\left(\frac{X_{ni}}{a_n}\right)^{2}\right)^{v/2} < \infty, \tag{2.2}$$

where $v$ is a positive integer with $v \ge p$, then

$$\sum_{n=1}^{\infty} P\left(\max_{1\le j\le n}\left|\frac{1}{a_n}\sum_{i=1}^{j} X_{ni}\right| > \varepsilon\right) < \infty, \quad \text{for any } \varepsilon > 0. \tag{2.3}$$

Proof of Theorem 2.1. For all $i \ge 1$, let $X_i^{(n)} = -a_n I(X_{ni} < -a_n) + X_{ni} I(|X_{ni}| \le a_n) + a_n I(X_{ni} > a_n)$ and $T_j^{(n)} = (1/a_n)\sum_{i=1}^{j}(X_i^{(n)} - EX_i^{(n)})$. Then, for all $\varepsilon > 0$,

$$P\left(\max_{1\le j\le n}\left|\frac{1}{a_n}\sum_{i=1}^{j} X_{ni}\right| > \varepsilon\right)
\le P\left(\max_{1\le i\le n}|X_{ni}| > a_n\right)
+ P\left(\max_{1\le j\le n}\left|T_j^{(n)}\right| > \varepsilon - \max_{1\le j\le n}\left|\frac{1}{a_n}\sum_{i=1}^{j} EX_i^{(n)}\right|\right). \tag{2.4}$$

First, we show that

$$\max_{1\le j\le n}\left|\frac{1}{a_n}\sum_{i=1}^{j} EX_i^{(n)}\right| \to 0, \quad \text{as } n \to \infty. \tag{2.5}$$

In fact, by $EX_{ni} = 0$, $\psi(t)/|t| \uparrow$ as $|t| \uparrow$, and $\sum_{n=1}^{\infty}\sum_{i=1}^{n} E(\psi(|X_{ni}|)/\psi(a_n)) < \infty$,

$$\begin{aligned}
\max_{1\le j\le n}\left|\frac{1}{a_n}\sum_{i=1}^{j} EX_i^{(n)}\right|
&\le \max_{1\le j\le n}\frac{1}{a_n}\left(\sum_{i=1}^{j}\left|EX_{ni}I\bigl(|X_{ni}| \le a_n\bigr)\right| + \sum_{i=1}^{j} E\bigl(a_n I\bigl(|X_{ni}| > a_n\bigr)\bigr)\right)\\
&= \max_{1\le j\le n}\frac{1}{a_n}\left(\sum_{i=1}^{j}\left|EX_{ni}I\bigl(|X_{ni}| > a_n\bigr)\right| + \sum_{i=1}^{j} E\bigl(a_n I\bigl(|X_{ni}| > a_n\bigr)\bigr)\right)\\
&\le 2\sum_{i=1}^{n}\frac{E|X_{ni}| I\bigl(|X_{ni}| > a_n\bigr)}{a_n}
\le 2\sum_{i=1}^{n}\frac{E\psi(|X_{ni}|) I\bigl(|X_{ni}| > a_n\bigr)}{\psi(a_n)}\\
&\le 2\sum_{i=1}^{n}\frac{E\psi(|X_{ni}|)}{\psi(a_n)} \to 0, \quad \text{as } n \to \infty. \tag{2.6}
\end{aligned}$$

From (2.4) and (2.5), it follows that, for $n$ sufficiently large,

$$P\left(\max_{1\le j\le n}\left|\frac{1}{a_n}\sum_{i=1}^{j} X_{ni}\right| > \varepsilon\right)
\le \sum_{j=1}^{n} P\bigl(|X_{nj}| > a_n\bigr) + P\left(\max_{1\le j\le n}\left|T_j^{(n)}\right| > \frac{\varepsilon}{2}\right). \tag{2.7}$$

Hence, we need only prove that

$$I := \sum_{n=1}^{\infty}\sum_{j=1}^{n} P\bigl(|X_{nj}| > a_n\bigr) < \infty, \qquad
II := \sum_{n=1}^{\infty} P\left(\max_{1\le j\le n}\left|T_j^{(n)}\right| > \frac{\varepsilon}{2}\right) < \infty. \tag{2.8}$$

From the fact that $\sum_{n=1}^{\infty}\sum_{i=1}^{n} E(\psi(|X_{ni}|)/\psi(a_n)) < \infty$, it follows easily that

$$I = \sum_{n=1}^{\infty}\sum_{j=1}^{n} P\bigl(|X_{nj}| > a_n\bigr) \le \sum_{n=1}^{\infty}\sum_{j=1}^{n} E\frac{\psi(|X_{nj}|)}{\psi(a_n)} < \infty. \tag{2.9}$$

By $v \ge p$ and $\psi(t)/|t|^{p} \downarrow$ as $|t| \uparrow$, we also have $\psi(t)/|t|^{v} \downarrow$ as $|t| \uparrow$.

By the Markov inequality, Lemma 1.6, and $\sum_{n=1}^{\infty}(\sum_{i=1}^{n} E(X_{ni}/a_n)^{2})^{v/2} < \infty$, we have

$$\begin{aligned}
II &= \sum_{n=1}^{\infty} P\left(\max_{1\le j\le n}\left|T_j^{(n)}\right| > \frac{\varepsilon}{2}\right)
\le \left(\frac{\varepsilon}{2}\right)^{-v}\sum_{n=1}^{\infty} E\max_{1\le j\le n}\left|T_j^{(n)}\right|^{v}\\
&\le c\left(\frac{\varepsilon}{2}\right)^{-v}\sum_{n=1}^{\infty}\frac{1}{a_n^{v}}\sum_{j=1}^{n} E\left|X_j^{(n)}\right|^{v}
+ c\left(\frac{\varepsilon}{2}\right)^{-v}\sum_{n=1}^{\infty}\left(\sum_{j=1}^{n}\frac{E\left|X_j^{(n)}\right|^{2}}{a_n^{2}}\right)^{v/2}\\
&\le c\sum_{n=1}^{\infty}\frac{1}{a_n^{v}}\sum_{j=1}^{n}\Bigl(E|X_{nj}|^{v} I\bigl(|X_{nj}| \le a_n\bigr) + a_n^{v} P\bigl(|X_{nj}| > a_n\bigr)\Bigr)
+ c\sum_{n=1}^{\infty}\left(\sum_{j=1}^{n} E\left(\frac{X_{nj}}{a_n}\right)^{2}\right)^{v/2}\\
&\le c\sum_{n=1}^{\infty}\sum_{j=1}^{n} E\frac{\psi(|X_{nj}|)}{\psi(a_n)} + cI
+ c\sum_{n=1}^{\infty}\left(\sum_{j=1}^{n} E\left(\frac{X_{nj}}{a_n}\right)^{2}\right)^{v/2} < \infty,
\end{aligned} \tag{2.10}$$

where $E|X_{nj}|^{v} I(|X_{nj}| \le a_n)/a_n^{v} \le E\psi(|X_{nj}|)/\psi(a_n)$ because $\psi(t)/|t|^{v} \downarrow$, and $|X_j^{(n)}| \le |X_{nj}|$ gives the second-moment bound.

This completes the proof of Theorem 2.1.

Corollary 2.2. Under the conditions of Theorem 2.1,

$$\frac{1}{a_n}\sum_{i=1}^{n} X_{ni} \to 0 \quad \text{a.s.} \tag{2.11}$$

Proof of Corollary 2.2. By Theorem 2.1 and the Borel-Cantelli lemma, the conclusion is immediate.
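As a sanity check of Corollary 2.2 in its simplest special case (a sketch of ours; rowwise independent entries are in particular NA): with bounded centered entries $X_{ni} = U_{ni} - 1/2$, $a_n = n$, $\psi(t) = t^{4}$ (so $\psi(t)/|t|$ is increasing and $\psi(t)/t^{4}$ is constant, hence nonincreasing), and $v = 4$, the conditions of Theorem 2.1 are easy to verify, and the normalized row sums should shrink:

```python
import random

def row_scaled_sum(n, rng):
    # |(1/a_n) * sum_{i<=n} X_ni| with a_n = n and X_ni = U_ni - 1/2
    return abs(sum(rng.random() - 0.5 for _ in range(n))) / n

rng = random.Random(7)
for n in (10**2, 10**4):
    print(n, row_scaled_sum(n, rng))
```

Here condition (2.2) holds since $\sum_n n\,EX^{4}/n^{4} < \infty$ and $\sum_n (n\,EX^{2}/n^{2})^{2} < \infty$.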

Remark 2.3. Corollary 2.2 not only generalizes the result of Hu and Taylor to NA random variables, but also improves it.

Theorem 2.4. Let $\{X_{ni}; i \ge 1, n \ge 1\}$ be an array of rowwise LNQD random variables. Let $\{a_n\}_{n\in\mathbb{N}}$ be a sequence of positive real numbers such that $0 < a_n \uparrow \infty$. Let $\psi(t)$ be a positive, even function such that $\psi(t)/|t|$ is an increasing function of $|t|$ and $\psi(t)/|t|^{p}$ is a decreasing function of $|t|$, respectively, that is,

$$\frac{\psi(t)}{|t|} \uparrow, \qquad \frac{\psi(t)}{|t|^{p}} \downarrow, \quad \text{as } |t| \uparrow, \tag{2.12}$$

for some positive integer $p$. If $1 < p \le 2$ and

$$EX_{ni} = 0, \qquad \sum_{n=1}^{\infty}\sum_{i=1}^{n} E\frac{\psi(|X_{ni}|)}{\psi(a_n)} < \infty, \tag{2.13}$$

then

$$\sum_{n=1}^{\infty} P\left(\left|\frac{1}{a_n}\sum_{i=1}^{n} X_{ni}\right| > \varepsilon\right) < \infty, \quad \text{for any } \varepsilon > 0. \tag{2.14}$$

(2.14)

Proof of Theorem 2.4. For any $1 \le k \le n$, $n \ge 1$, let

$$\begin{aligned}
Y_{nk} &= -a_n I(X_{nk} < -a_n) + X_{nk} I\bigl(|X_{nk}| \le a_n\bigr) + a_n I(X_{nk} > a_n),\\
Z_{nk} &= X_{nk} - Y_{nk} = (X_{nk} + a_n) I(X_{nk} < -a_n) + (X_{nk} - a_n) I(X_{nk} > a_n).
\end{aligned} \tag{2.15}$$

To prove (2.14), it suffices to show that

$$\frac{1}{a_n}\sum_{k=1}^{n} Z_{nk} \to 0 \quad \text{completely}, \tag{2.16}$$

$$\frac{1}{a_n}\sum_{k=1}^{n}(Y_{nk} - EY_{nk}) \to 0 \quad \text{completely}, \tag{2.17}$$

$$\frac{1}{a_n}\sum_{k=1}^{n} EY_{nk} \to 0 \quad \text{as } n \to \infty. \tag{2.18}$$
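The truncation decomposition in (2.15) can be sanity-checked numerically; a small sketch of ours verifying $Y_{nk} + Z_{nk} = X_{nk}$, $|Y_{nk}| \le |X_{nk}| \wedge a_n$, and that $Z_{nk}$ vanishes unless $|X_{nk}| > a_n$:

```python
def truncate(x, a):
    # Returns (Y, Z) as in (2.15): Y is x clipped to [-a, a], Z the overshoot.
    y = max(-a, min(a, x))
    return y, x - y

a = 2.0
for x in (-6.5, -2.0, -0.7, 0.0, 1.3, 2.0, 5.0):
    y, z = truncate(x, a)
    assert y + z == x                   # Z = X - Y by definition
    assert abs(y) <= min(abs(x), a)     # |Y| <= |X| and |Y| <= a
    assert (z == 0.0) == (abs(x) <= a)  # Z != 0 only on {|X| > a}
    assert abs(z) <= abs(x)             # hence |Z| <= |X| I(|X| > a)
```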

Firstly, we prove (2.16). Since $|Z_{nk}| \le |X_{nk}| I(|X_{nk}| > a_n)$ and $\psi(t)/|t| \uparrow$,

$$\sum_{n=1}^{\infty} P\left(\left|\frac{1}{a_n}\sum_{k=1}^{n} Z_{nk}\right| > \varepsilon\right)
\le \sum_{n=1}^{\infty}\frac{E\left|\sum_{k=1}^{n} Z_{nk}\right|}{\varepsilon a_n}
\le C\sum_{n=1}^{\infty}\sum_{k=1}^{n}\frac{E|X_{nk}| I\bigl(|X_{nk}| > a_n\bigr)}{a_n}
\le C\sum_{n=1}^{\infty}\sum_{k=1}^{n} E\frac{\psi(|X_{nk}|)}{\psi(a_n)} < \infty. \tag{2.19}$$

Secondly, we prove (2.17). By Lemma 1.7, $\{Y_{nk} - EY_{nk}, 1 \le k \le n, n \ge 1\}$ is an array of rowwise LNQD random variables with mean zero. Let $B_n' = \sum_{k=1}^{n} E(Y_{nk} - EY_{nk})^{2}$. Take $x = \varepsilon a_n$ and $y = 2\varepsilon a_n/v$, where $v \ge 2$ is an integer. By Lemma 1.8, for all $\varepsilon > 0$,

$$\sum_{n=1}^{\infty} P\left(\left|\sum_{k=1}^{n}(Y_{nk} - EY_{nk})\right| > \varepsilon a_n\right)
\le \sum_{n=1}^{\infty}\sum_{k=1}^{n} P\left(|Y_{nk} - EY_{nk}| \ge \frac{2\varepsilon a_n}{v}\right)
+ 2\sum_{n=1}^{\infty}\exp\left(\frac{v}{2} - \frac{v}{2}\log\left(1 + \frac{2\varepsilon^{2} a_n^{2}}{v B_n'}\right)\right)
:= I_1 + I_2. \tag{2.20}$$

From (2.12), (2.13), the Markov inequality, and the $C_r$-inequality,

$$I_1 = \sum_{n=1}^{\infty}\sum_{k=1}^{n} P\left(|Y_{nk} - EY_{nk}| \ge \frac{2\varepsilon a_n}{v}\right)
\le C\sum_{n=1}^{\infty}\sum_{k=1}^{n}\frac{E|Y_{nk}|}{a_n}
\le C\sum_{n=1}^{\infty}\sum_{k=1}^{n} E\frac{\psi(|Y_{nk}|)}{\psi(a_n)}
\le C\sum_{n=1}^{\infty}\sum_{k=1}^{n} E\frac{\psi(|X_{nk}|)}{\psi(a_n)} < \infty. \tag{2.21}$$

Note that $|Y_{nk}| \le |X_{nk}| \wedge a_n$, $n \ge 1$, and $1 < p \le 2$. From (2.12), (2.13), and the $C_r$-inequality,

$$\begin{aligned}
I_2 &\le C\sum_{n=1}^{\infty}\left(\sum_{k=1}^{n} a_n^{-2} E(Y_{nk} - EY_{nk})^{2}\right)^{v/2}
\le C\sum_{n=1}^{\infty}\left(\sum_{k=1}^{n}\frac{E|Y_{nk}|^{p}}{a_n^{p}}\right)^{v/2}
\le C\left(\sum_{n=1}^{\infty}\sum_{k=1}^{n}\frac{E|Y_{nk}|^{p}}{a_n^{p}}\right)^{v/2}\\
&\le C\left(\sum_{n=1}^{\infty}\sum_{k=1}^{n} E\frac{\psi(|Y_{nk}|)}{\psi(a_n)}\right)^{v/2}
\le C\left(\sum_{n=1}^{\infty}\sum_{k=1}^{n} E\frac{\psi(|X_{nk}|)}{\psi(a_n)}\right)^{v/2} < \infty. \tag{2.22}
\end{aligned}$$

Finally, we prove (2.18). For $1 \le k \le n$, $n \ge 1$, $EX_{nk} = 0$, hence $EY_{nk} = -EZ_{nk}$. From the definition of $Z_{nk}$: if $X_{nk} > a_n$, then $0 < Z_{nk} = X_{nk} - a_n < X_{nk}$; if $X_{nk} < -a_n$, then $X_{nk} < Z_{nk} = X_{nk} + a_n < 0$. So $|Z_{nk}| \le |X_{nk}| I(|X_{nk}| > a_n)$. Consequently,

$$\left|\frac{1}{a_n}\sum_{k=1}^{n} EY_{nk}\right|
\le \sum_{k=1}^{n}\frac{E|Z_{nk}|}{a_n}
\le \sum_{k=1}^{n}\frac{E|X_{nk}| I\bigl(|X_{nk}| > a_n\bigr)}{a_n}
\le \sum_{k=1}^{n} E\frac{\psi(|X_{nk}|)}{\psi(a_n)} \to 0 \quad \text{as } n \to \infty. \tag{2.23}$$

The proof is completed.

Theorem 2.5. Let $\{X_{ni}; i \ge 1, n \ge 1\}$ be an array of rowwise LNQD random variables. Let $\{a_n\}_{n\in\mathbb{N}}$ be a sequence of positive real numbers such that $0 < a_n \uparrow \infty$, and let $\{\psi_n(t)\}_{n\in\mathbb{N}}$ be a sequence of positive, even functions satisfying (2.12) for $p > 2$. Suppose that

$$\sum_{n=1}^{\infty}\left(\sum_{i=1}^{n} E\left(\frac{X_{ni}}{a_n}\right)^{2}\right)^{v/2} < \infty, \tag{2.24}$$

where $v$ is a positive integer with $v \ge p$. Then the conditions (2.13) and (2.24) imply (2.14).

Proof of Theorem 2.5. Following the notation and the method of the proof of Theorem 2.4, (2.16), (2.18), and $I_1 < \infty$ still hold, so we only need to show that $I_2 < \infty$. Since $E(Y_{nk} - EY_{nk})^{2} \le EY_{nk}^{2} \le EX_{nk}^{2}$, by (2.24) we have

$$I_2 \le C\sum_{n=1}^{\infty}\left(\sum_{k=1}^{n} a_n^{-2} E(Y_{nk} - EY_{nk})^{2}\right)^{v/2}
\le C\sum_{n=1}^{\infty}\left(\sum_{k=1}^{n} E\left(\frac{X_{nk}}{a_n}\right)^{2}\right)^{v/2} < \infty. \tag{2.25}$$

The proof is completed.

Corollary 2.6. Under the conditions of Theorem 2.4 or Theorem 2.5,

$$\frac{1}{a_n}\sum_{i=1}^{n} X_{ni} \to 0 \quad \text{a.s.} \tag{2.26}$$

Remark 2.7. Although a maximal inequality comparable to Lemma 1.6 is not available for LNQD sequences, the results we have obtained for LNQD arrays still generalize and improve the result of Hu and Taylor.

Acknowledgments

The work is supported by the National Natural Science Foundation of China (11061012), the Guangxi China Science Foundation (2011GXNSFA018147), and the Innovation Project of Guangxi Graduate Education (2010105960202M32).

References

[1] T. C. Hu and R. L. Taylor, "On the strong law for arrays and for the bootstrap mean and variance," International Journal of Mathematics and Mathematical Sciences, vol. 20, no. 2, pp. 375-382, 1997.

[2] K. Joag-Dev and F. Proschan, "Negative association of random variables, with applications," Annals of Statistics, vol. 11, no. 1, pp. 286-295, 1983.

[3] E. L. Lehmann, "Some concepts of dependence," Annals of Mathematical Statistics, vol. 37, pp. 1137-1153, 1966.

[4] E. L. Lehmann, "Some concepts of dependence," Annals of Mathematical Statistics, vol. 78, pp. 794-803, 1966.

[5] C. M. Newman, "Asymptotic independence and limit theorems for positively and negatively dependent random variables," in Inequalities in Statistics and Probability, Y. L. Tong, Ed., vol. 5 of IMS Lecture Notes—Monograph Series, pp. 127-140, Institute of Mathematical Statistics, Hayward, Calif, USA, 1984.

[6] P. Matula, "A note on the almost sure convergence of sums of negatively dependent random variables," Statistics and Probability Letters, vol. 15, no. 3, pp. 209-213, 1992.

[7] C. Su and Y. B. Wang, "The strong convergence of the identical distribution NA sequence," Chinese Journal of Applied Probability and Statistics, vol. 14, no. 2, pp. 131-140, 1998.

[8] S. Yang, C. Su, and K. Yu, "A general method to the strong law of large numbers and its applications," Statistics and Probability Letters, vol. 78, no. 6, pp. 794-803, 2008.

[9] J. F. Wang and L. X. Zhang, "A Berry-Esseen theorem for weakly negatively dependent random variables and its applications," Acta Mathematica Hungarica, vol. 110, no. 4, pp. 293-308, 2006.

[10] M. H. Ko, Y. K. Choi, and Y. S. Choi, "Exponential probability inequality for linearly negative quadrant dependent random variables," Korean Mathematical Society. Communications, vol. 22, no. 1, pp. 137-143, 2007.

[11] M. H. Ko, D. H. Ryu, and T. S. Kim, "Limiting behaviors of weighted sums for linearly negative quadrant dependent random variables," Taiwanese Journal of Mathematics, vol. 11, no. 2, pp. 511-522, 2007.

[12] Q. M. Shao, "A comparison theorem on moment inequalities between negatively associated and independent random variables," Journal of Theoretical Probability, vol. 13, no. 2, pp. 343-356, 2000.

[13] D. H. Fuk and S. V. Nagaev, "Probabilistic inequalities for sums of independent random variables," Theory of Probability and Its Applications, vol. 16, pp. 643-660, 1971.
