Periodica Mathematica Hungarica

Period Math Hung

DOI 10.1007/s10998-014-0077-5

Covariance and comparison inequalities under quadrant dependence

Przemysław Matuła • Maciej Ziemba

© The Author(s) 2015. This article is published with open access at Springerlink.com

Abstract We study the difference between the distributions of the random vectors [X, Y] and [X', Y'], where X' and Y' are independent, X' has the same law as X and Y' the same as Y. Particular interest is focused on positively quadrant dependent random variables X and Y; in this case the bounds for the difference in question are expressed in terms of the covariance of X and Y.

Keywords Covariance • Positive and negative dependence • Probabilistic inequalities • Comparison theorems

Mathematics Subject Classification 60E15 • 62H20

1 Introduction and the notation

Positive and negative dependence concepts play a very important role not only in mathematical statistics but also in applications of probability theory, in particular in mathematical physics. The definitions of positively and negatively quadrant dependent random variables (r.v.'s) were introduced by Lehmann in 1966 (cf. [5]) and soon after extended to the multivariate case by Esary et al. and Joag-Dev and Proschan, who introduced the notions of positive and negative association (cf. [3,4]). Nowadays, a comprehensive study of this topic is contained in the monographs of Bulinski and Shashkin (cf. [2]), Oliveira (cf. [10]) and Prakasa Rao (cf. [11]).

P. Matuła (corresponding author)

Institute of Mathematics, Maria Curie-Skłodowska University, pl. M.C.-Skłodowskiej 1, 20-031 Lublin, Poland. e-mail: matula@hektor.umcs.lublin.pl

M. Ziemba

Department of Mathematics, Lublin University of Technology, ul. Nadbystrzycka 38d, 20-618 Lublin, Poland e-mail: maciek.ziemba@gmail.com

Published online: 06 January 2015


Let us recall that the random variables X, Y are positively quadrant dependent (PQD) if

H_{X,Y}(t, s) := P(X ≤ t, Y ≤ s) − P(X ≤ t)P(Y ≤ s) ≥ 0

for all t, s ∈ R, and X, Y are negatively quadrant dependent (NQD) if H_{X,Y}(t, s) ≤ 0. It is well known that X, Y are NQD iff X, −Y are PQD. In view of this duality, we shall focus only on the PQD case later on.
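As an aside not contained in the paper, the PQD condition is easy to check mechanically on a small, hand-picked discrete distribution; the following sketch (our own illustrative example, not the authors' construction) evaluates H_{X,Y}(t, s) from a joint pmf:

```python
# Illustrative check of PQD: H(t,s) = P(X<=t, Y<=s) - P(X<=t)P(Y<=s) >= 0
# on a toy joint pmf with positive dependence (an assumed example).

from itertools import product

pmf = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.3}

def joint_cdf(t, s):
    return sum(p for (x, y), p in pmf.items() if x <= t and y <= s)

def marginal_cdf_x(t):
    return sum(p for (x, _), p in pmf.items() if x <= t)

def marginal_cdf_y(s):
    return sum(p for (_, y), p in pmf.items() if y <= s)

def H(t, s):
    return joint_cdf(t, s) - marginal_cdf_x(t) * marginal_cdf_y(s)

# a grid of check points suffices here, since the support is {0,1}^2
for t, s in product([0, 1], repeat=2):
    assert H(t, s) >= 0, (t, s, H(t, s))

print(round(H(0, 0), 4))  # 0.3 - 0.5*0.5 = 0.05
```

Swapping the diagonal and off-diagonal masses would make H(0, 0) negative, i.e. an NQD pair.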

Denote by

Cov_H(X, Y) := ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} H_{X,Y}(t, s) dt ds

the so-called Hoeffding covariance (it is always well defined for PQD or NQD r.v.'s, even though it may be infinite, and if the usual product-moment covariance exists, then it is equal to the Hoeffding covariance). From this fact it follows that uncorrelated PQD (or NQD) r.v.'s are independent. Therefore, in the study of limit theorems, covariance is usually used to "control" the dependence of r.v.'s. It is also well known that monotone functions of positively (negatively) dependent r.v.'s inherit these properties. In particular, the indicators I_{(−∞,t⟩}(X), I_{(−∞,s⟩}(Y) are PQD (NQD), provided X, Y are PQD (NQD). Here and in the sequel I_A(x) denotes the indicator function of a set A.
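The equality of the product-moment and Hoeffding covariances can be sanity-checked numerically. For integer-valued r.v.'s, H_{X,Y} is constant on each unit cell [i, i+1) × [j, j+1), so the double integral reduces to a sum; a sketch on an assumed toy pmf (not from the paper):

```python
# Sanity check of the Hoeffding identity Cov(X,Y) = ∫∫ H(t,s) dt ds
# for integer-valued r.v.'s, where the integral collapses to a cell sum.

pmf = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.3}

def H(t, s):
    F = sum(p for (x, y), p in pmf.items() if x <= t and y <= s)
    Fx = sum(p for (x, _), p in pmf.items() if x <= t)
    Fy = sum(p for (_, y), p in pmf.items() if y <= s)
    return F - Fx * Fy

# product-moment covariance
EX = sum(x * p for (x, _), p in pmf.items())
EY = sum(y * p for (_, y), p in pmf.items())
EXY = sum(x * y * p for (x, y), p in pmf.items())
cov = EXY - EX * EY

# Hoeffding covariance: H is constant on [i,i+1) x [j,j+1), so the
# double integral is a sum of H(i, j) over the (finite) support grid
cov_h = sum(H(i, j) for i in (0, 1) for j in (0, 1))

assert abs(cov - cov_h) < 1e-12
print(round(cov, 4))  # 0.05
```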

It is easy to see that

Cov(I_{(−∞,t⟩}(X), I_{(−∞,s⟩}(Y)) = H_{X,Y}(t, s),

thus, in the study of limit theorems for empirical processes based on positively or negatively dependent observations, it is important to control H_{X,Y}(t, s). In this context, upper bounds for H_{X,Y}(t, s) in terms of Cov_H(X, Y) are very useful. On the other hand, it is interesting to establish how far the random variables X, Y are from independent ones, in the sense of the difference between the joint distribution and the product of its marginals. Bounds for the covariance of the indicator functions in terms of the covariance of the random variables themselves are called covariance inequalities.

The first inequalities of the form

sup_{t,s∈R} H_{X,Y}(t, s) ≤ C · φ(Cov(X, Y)),   (1.1)

where X, Y are associated and absolutely continuous and C depends on the densities of X, Y, were obtained by Bagai and Prakasa Rao (cf. [1]) and Roussas (cf. [13]). These authors studied properties of estimators of the survival function and kernel estimators of the density based on samples of associated r.v.'s. The inequalities (1.1) were intensively studied in [6,7] and [8]. In particular, in [8] it was proved that if X, Y are absolutely continuous PQD r.v.'s with bounded densities f_X, f_Y (in the L^∞ norm), then

sup_{t,s∈R} H_{X,Y}(t, s) ≤ ((3/2) ||f_X||_∞ ||f_Y||_∞ Cov_H(X, Y))^{1/3}.   (1.2)

The discrete case was considered in [6], where it was proved that if X, Y are integer-valued PQD r.v.'s, then

sup_{t,s∈R} H_{X,Y}(t, s) ≤ Cov_H(X, Y).   (1.3)

We would also like to refer the reader to the monograph [2], where the covariance inequalities for Lipschitz functions of associated r.v.'s are studied.

In recent years, however, lively interest in positively and negatively dependent r.v.'s has led to fruitful results beyond covariance inequalities. Several limit theorems have been proved under this kind of dependence, in particular: laws of large numbers, the CLT and the rate of convergence in the CLT, the invariance principle, moment bounds, convergence of empirical processes, etc. (see [2,10,11], where further references are given).

Now let X, Y be any random variables and X', Y' independent copies of X and Y, i.e. X' has the same distribution as X, Y' the same as Y, and X', Y' are independent. We may write

H_{X,Y}(t, s) = P([X, Y] ∈ Q(t, s)) − P([X', Y'] ∈ Q(t, s)),

where Q(t, s) = (−∞, t⟩ × (−∞, s⟩ is a quadrant. Let us introduce the following notation:

H_{X,Y}(D) = P([X, Y] ∈ D) − P([X', Y'] ∈ D),

for D ⊂ R². Because the notions of uncorrelatedness and independence coincide, the key assumption in limit theorems for positively or negatively dependent r.v.'s always involves their covariance structure. Indeed, covariance plays the role of a measure of dependence between r.v.'s. Taking this into account, it appears important to establish how, in fact, the covariance assesses the dependence. The answer may be given by finding bounds for H_{X,Y}(D) in terms of Cov_H(X, Y) (we shall call these comparison inequalities from now on). This is the main goal of the paper.

The paper is organized as follows. In the second section we consider the comparison inequalities for integer-valued r.v.'s, while the third one is devoted to absolutely continuous random vectors. Section 4 presents the special case of Farlie-Gumbel-Morgenstern (FGM) r.v.'s.

2 Discrete case

With a view to stating the main result of this section, we introduce the notion of a δ-hull of a set D ⊂ R² as follows:

D^{(δ)} = {(x, y) ∈ R² : x = t + a, y = s + b, for some (t, s) ∈ D and |a| ≤ δ, |b| ≤ δ}.

Let Z denote the set of integers. For integer-valued r.v.'s we have the following general comparison theorem.

Theorem 2.1 Let X, Y be any random variables with values in Z and D ⊂ R² be any set, then

|H_{X,Y}(D)| ≤ 4 ∫∫_{D^{(1/2)}} |H_{X,Y}(t, s)| dt ds.   (2.1)

For the proof of this theorem and the results of the next section, we shall need the following identity of Newman (cf. (4.10) in [9]).

Lemma 2.2 Let g_1 and g_2 be absolutely continuous functions and X, Y random variables such that g_1(X) and g_2(Y) are square-integrable. Then

Cov(g_1(X), g_2(Y)) = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} g_1'(t) g_2'(s) [P(X > t, Y > s) − P(X > t)P(Y > s)] dt ds.

It is worth mentioning here that

P(X > t, Y > s) − P(X > t)P(Y > s) = P(X ≤ t, Y ≤ s) − P(X ≤ t)P(Y ≤ s) = H_{X,Y}(t, s).

Proof of Theorem 2.1 For i ∈ Z let us define the functions

f_i(x) = max(1 − 2|x − i|, 0),

which are absolutely continuous and differentiable except at the points i − 1/2, i, i + 1/2. We shall use these functions to approximate the indicators; in fact, f_i(X) = I_{{i}}(X). Let us observe that for (i, j) ∈ Z², we have

H_{X,Y}(D) = Σ_{(i,j)∈D} (P(X = i, Y = j) − P(X' = i, Y' = j)) = Σ_{(i,j)∈D} Cov(f_i(X), f_j(Y))

and by Lemma 2.2, we get

H_{X,Y}(D) = Σ_{(i,j)∈D} ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} f_i'(t) f_j'(s) H_{X,Y}(t, s) dt ds.

It is easy to see that

|Σ_{(i,j)∈D} f_i'(t) f_j'(s)| ≤ 4 Σ_{(i,j)∈D} I_{⟨i−1/2, i+1/2⟩ × ⟨j−1/2, j+1/2⟩}(t, s),

since |f_i'| = 2 on (i − 1/2, i + 1/2) and f_i' = 0 elsewhere. Thus

|H_{X,Y}(D)| ≤ ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} |Σ_{(i,j)∈D} f_i'(t) f_j'(s)| · |H_{X,Y}(t, s)| dt ds ≤ 4 ∫∫_{D^{(1/2)}} |H_{X,Y}(t, s)| dt ds. □

For PQD r.v.'s we immediately obtain the following corollary.

Corollary 2.3 Let X, Y be PQD r.v.'s with values in Z and D ⊂ R² be any set, then

|H_{X,Y}(D)| ≤ 4 Cov_H(X, Y).   (2.2)

The inequality (2.2) is optimal up to a constant, in the sense that the left- and right-hand sides may approach 0 at the same speed. Let us illustrate this with an example.

Example 2.4 Let X_n, Y_n have the following distribution: P(X_n = 0, Y_n = 0) = P(X_n = 1, Y_n = 1) = 1/4 + 1/n, P(X_n = 0, Y_n = 1) = P(X_n = 1, Y_n = 0) = 1/4 − 1/n, n ≥ 5. Then X_n, Y_n are PQD and P([X_n, Y_n] ∈ {(0, 0), (1, 1)}) = 1/2 + 2/n, while P([X_n', Y_n'] ∈ {(0, 0), (1, 1)}) = 1/2. Thus H_{X_n,Y_n}({(0, 0), (1, 1)}) = 2/n. Moreover, Cov(X_n, Y_n) = 1/n.
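A short exact computation (our own check, using the distribution of Example 2.4 with a concrete n) confirms these values and inequality (2.2):

```python
# Exact verification of Example 2.4 for n = 8, using rational arithmetic.

from fractions import Fraction as F

n = 8
pmf = {(0, 0): F(1, 4) + F(1, n), (1, 1): F(1, 4) + F(1, n),
       (0, 1): F(1, 4) - F(1, n), (1, 0): F(1, 4) - F(1, n)}

D = {(0, 0), (1, 1)}
p_joint = sum(pmf[ij] for ij in D)               # P([X,Y] in D)
px = {0: F(1, 2), 1: F(1, 2)}                    # both marginals are uniform on {0,1}
p_indep = sum(px[i] * px[j] for (i, j) in D)     # P([X',Y'] in D)
H_D = p_joint - p_indep

EXY = pmf[(1, 1)]                                # X*Y = 1 only at (1,1)
cov = EXY - F(1, 2) * F(1, 2)

assert H_D == F(2, n) and cov == F(1, n)
assert H_D <= 4 * cov                            # inequality (2.2)
print(H_D, cov)  # 1/4 1/8
```

Both sides scale like 1/n, so the ratio stays bounded, which is exactly the optimality claim.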

The inequality (2.2) may also easily be extended to r.v.'s taking values in a lattice. Let L(a, b) = {ia + b : i ∈ Z} be a set of lattice points, where a > 0. If X, Y are PQD and take their values in L(a, b), then U = (X − b)/a and V = (Y − b)/a are PQD as well, with values in Z. Furthermore, Cov_H(U, V) = Cov_H(X, Y)/a². Therefore we get another corollary.

Corollary 2.5 Let X, Y be PQD r.v.'s with values in the lattice L(a, b) and D ⊂ R² be any set, then

|H_{X,Y}(D)| ≤ (4/a²) Cov_H(X, Y).   (2.3)
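The rescaling step behind Corollary 2.5 — Cov(X, Y) = a² Cov(U, V) for X = aU + b, Y = aV + b — can be checked exactly on a toy lattice distribution (an illustrative sketch, not from the paper):

```python
# Exact check of the lattice rescaling used for Corollary 2.5.

from fractions import Fraction as F

a, b = F(3), F(2)                     # lattice L(a, b) = {i*a + b : i in Z}
# a PQD pmf on {0,1}^2 for (U, V)
pmf_uv = {(0, 0): F(3, 10), (0, 1): F(1, 5), (1, 0): F(1, 5), (1, 1): F(3, 10)}

def cov(pmf):
    EX = sum(x * p for (x, _), p in pmf.items())
    EY = sum(y * p for (_, y), p in pmf.items())
    EXY = sum(x * y * p for (x, y), p in pmf.items())
    return EXY - EX * EY

# push (U, V) onto the lattice: X = a*U + b, Y = a*V + b
pmf_xy = {(a * i + b, a * j + b): p for (i, j), p in pmf_uv.items()}

# Cov(X, Y) = a^2 * Cov(U, V), hence Cov_H(U, V) = Cov_H(X, Y) / a^2
assert cov(pmf_xy) == a * a * cov(pmf_uv)
print(cov(pmf_uv), cov(pmf_xy))  # 1/20 9/20
```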

3 Absolutely continuous case

In this section we study absolutely continuous random vectors [X, Y]. Denote by f_{X,Y}(x, y) the joint density of [X, Y] and by f_X(x), f_Y(y) the marginal densities of X and Y, respectively. We shall assume that these densities are bounded in the essential supremum norm (the L^∞ norm). We put

C_f := ||f_{X,Y}||_∞ + ||f_X||_∞ · ||f_Y||_∞.

We will obtain bounds for H_{X,Y}(D), where D ⊂ R² is a compact set whose boundary is a curve Γ (continuous, piecewise C¹, without self-intersections). The length of Γ will be denoted by L and the planar (Lebesgue) measure of D by μ(D). Recall that, by the isoperimetric inequality, we have μ(D) ≤ L²/(4π).

Theorem 3.1 Let the r.v.'s X, Y and the set D be as above, then

|H_{X,Y}(D)| ≤ C · (sup_{(t,s)∈D} |H_{X,Y}(t, s)|)^{1/4},   (3.1)

where C = (2√2 (1/(4π) + 1)(C_f + 1) + 8√2 C_f) max(L², 1). In the proof we shall use the following elementary lemma.

Lemma 3.2 Let [X, Y] be a random vector with bounded density f_{X,Y}. Let φ, ψ : R² → R be measurable functions such that 0 ≤ φ, ψ ≤ 1. Let us also put

A = {(x, y) ∈ R² : φ(x, y) ≠ ψ(x, y)},

then

|Eφ(X, Y) − Eψ(X, Y)| ≤ μ(A) · ||f_{X,Y}||_∞.

Proof of Theorem 3.1 For i, j ∈ Z, δ > 0 and 0 < η ≤ δ/2 let us introduce the following notation. Consider the family of squares with vertices in the lattice L(δ, 0) and disjoint interiors

S_{i,j,δ} = ⟨iδ, (i + 1)δ) × ⟨jδ, (j + 1)δ).

Define a family of "square rings"

R_{i,j,δ,η} = S_{i,j,δ} \ (iδ + η, (i + 1)δ − η) × (jδ + η, (j + 1)δ − η)

and a family of "hat-like" functions

f_{i,δ,η}(x) = min(max((δ/2 − |x − δ(i + 1/2)|)/η, 0), 1).

These functions are absolutely continuous, equal to 0 for x ∈ (−∞, iδ⟩ ∪ ⟨(i + 1)δ, +∞), equal to 1 for x ∈ ⟨iδ + η, (i + 1)δ − η⟩ and linear otherwise. They are differentiable except at four points, and |f_{i,δ,η}'(x)| ≤ 1/η.

We begin with a comparison inequality for a single square S_{i,j,δ}. Let us observe that

|H_{X,Y}(S_{i,j,δ})|
= |E I_{⟨iδ,(i+1)δ)}(X) I_{⟨jδ,(j+1)δ)}(Y) − E I_{⟨iδ,(i+1)δ)}(X') I_{⟨jδ,(j+1)δ)}(Y')|
≤ |E I_{⟨iδ,(i+1)δ)}(X) I_{⟨jδ,(j+1)δ)}(Y) − E f_{i,δ,η}(X) f_{j,δ,η}(Y)|
+ |E I_{⟨iδ,(i+1)δ)}(X') I_{⟨jδ,(j+1)δ)}(Y') − E f_{i,δ,η}(X') f_{j,δ,η}(Y')|
+ |E f_{i,δ,η}(X) f_{j,δ,η}(Y) − E f_{i,δ,η}(X') f_{j,δ,η}(Y')|
≤ 4(δη − η²) ||f_{X,Y}||_∞ + 4(δη − η²) ||f_X||_∞ ||f_Y||_∞ + |Cov(f_{i,δ,η}(X), f_{j,δ,η}(Y))|

by Lemma 3.2; therefore, by Lemma 2.2, we get

|H_{X,Y}(S_{i,j,δ})| ≤ 4(δη − η²) C_f + (1/η²) ∫∫_{R_{i,j,δ,η}} |H_{X,Y}(t, s)| dt ds.   (3.2)

Let {S_{i,j,δ}}_{(i,j)∈I} be a cover of D, i.e. (i, j) ∈ I ⊂ Z² iff S_{i,j,δ} ∩ D ≠ ∅. Let us split the set of indices I into two disjoint parts I_Γ and I_int. The family {S_{i,j,δ}}_{(i,j)∈I_Γ} covers the boundary Γ, i.e. (i, j) ∈ I_Γ iff S_{i,j,δ} ∩ Γ ≠ ∅. The family {S_{i,j,δ}}_{(i,j)∈I_int} fills the interior of D, i.e. (i, j) ∈ I_int iff S_{i,j,δ} ⊂ int D. Let us observe that

Card(I_int) ≤ μ(D)/δ² + 1 ≤ L²/(4πδ²) + 1.   (3.3)

Further,

Card(I_Γ) ≤ 4(L/δ + 1),   (3.4)

which follows from the fact that Γ may be divided into at most ⌈L/δ⌉ consecutive parts of length δ (the last of length at most δ), each of which may be covered by a square of side δ parallel to the axes, which in turn may be covered by at most four squares S_{i,j,δ}. Now, we have

|H_{X,Y}(D)|
≤ |P([X, Y] ∈ ∪_{(i,j)∈I_int} S_{i,j,δ}) − P([X', Y'] ∈ ∪_{(i,j)∈I_int} S_{i,j,δ})|
+ P([X, Y] ∈ ∪_{(i,j)∈I_Γ} S_{i,j,δ} ∩ D) + P([X', Y'] ∈ ∪_{(i,j)∈I_Γ} S_{i,j,δ} ∩ D)
≤ Σ_{(i,j)∈I_int} |H_{X,Y}(S_{i,j,δ})| + Σ_{(i,j)∈I_Γ} P([X, Y] ∈ S_{i,j,δ}) + Σ_{(i,j)∈I_Γ} P([X', Y'] ∈ S_{i,j,δ}).   (3.5)

Therefore, from (3.2), (3.3) and (3.4), it follows that

|H_{X,Y}(D)| ≤ 4(δη − η²) C_f (L²/(4πδ²) + 1) + (1/η²) Σ_{(i,j)∈I_int} ∫∫_{R_{i,j,δ,η}} |H_{X,Y}(t, s)| dt ds + 4(L/δ + 1) δ² C_f.   (3.6)

Let us put h = sup_{(t,s)∈D} |H_{X,Y}(t, s)| and assume that h > 0 (if h = 0, then it is easy to prove that |H_{X,Y}(D)| = 0). Further, let L̄ = max(L, 1) and assume that δ ≤ 1. Since the rings R_{i,j,δ,η}, (i, j) ∈ I_int, are disjoint subsets of D, each of area at most 4δη, (3.6) yields

|H_{X,Y}(D)| ≤ 4(1/(4π) + 1) L̄² C_f (η/δ) + 4(1/(4π) + 1) L̄² h/(δη) + 8δ L̄² C_f.

We put δ = (4h)^{1/4} and η = √h to obtain an optimal exponent at h. Since h ≤ 1/4, we see that δ ≤ 1 and η ≤ δ/2. Thus we get

|H_{X,Y}(D)| ≤ h^{1/4} · L̄² (2√2 (1/(4π) + 1)(C_f + 1) + 8√2 C_f)

and the proof is completed. □

By direct application of (1.2) to Theorem 3.1, for PQD r.v.'s we get the following inequality:

|H_{X,Y}(D)| ≤ C · (Cov_H(X, Y))^{1/12},

where C = (2√2 (1/(4π) + 1)(C_f + 1) + 8√2 C_f) ((3/2) C_f)^{1/12} max(L², 1).

A more careful study of the proof of Theorem 3.1, in the case of PQD r.v.'s, leads to the following result.

Theorem 3.3 Let the assumptions of Theorem 3.1 be satisfied and let X, Y be PQD r.v.'s. Then

|H_{X,Y}(D)| ≤ C · (Cov_H(X, Y))^{1/5},

where C = 4 + C_f (1/(2π) + 10) max(L², 1).

Proof As in the proof of Theorem 3.1, from (3.6) we get

|H_{X,Y}(D)| ≤ 4(1/(4π) + 1)(η/δ) L̄² C_f + (1/η²) Cov_H(X, Y) + 8δ L̄² C_f,

since for PQD r.v.'s the integrals of H_{X,Y} over the disjoint rings sum to at most Cov_H(X, Y). If Cov_H(X, Y) ≥ 1, then the conclusion is a trivial inequality. Assume Cov_H(X, Y) < 1 and take δ = (Cov_H(X, Y))^{1/5} and η = (Cov_H(X, Y))^{2/5}/2 to optimize the exponent in Cov_H(X, Y) and arrive at the conclusion. □

4 FGM case

The r.v.'s X, Y are said to have a joint FGM distribution function if

F_{X,Y}(x, y) = F_X(x)F_Y(y) + ρ F_X(x)(1 − F_X(x)) F_Y(y)(1 − F_Y(y)),

where ρ ∈ [−1, 1] and F_{X,Y}, F_X, F_Y are the joint distribution function and the marginal d.f.'s, respectively. Denote by f_{X,Y}, f_X, f_Y their densities. The corresponding copula takes the form

C(u, v) = uv + ρ u(1 − u) v(1 − v).   (4.1)

(For details on copulas see [12].) In this section we shall consider a more general form of (4.1).
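As a numerical aside (not from the paper), for the classical FGM copula (4.1) with uniform (0, 1) marginals the Hoeffding covariance works out to ρ/36, since ∫₀¹ u(1 − u) du = 1/6 and the integrand factorizes; a midpoint-rule check:

```python
# For the FGM copula C(u,v) = uv + rho*u*(1-u)*v*(1-v) with uniform(0,1)
# marginals, Cov_H = ∫∫ (C(u,v) - uv) du dv = rho * (∫ u(1-u) du)^2 = rho/36.

rho = 0.5
N = 10_000                 # midpoint-rule resolution
h = 1.0 / N
I = 0.0
for i in range(N):         # one-dimensional integral ∫_0^1 u(1-u) du = 1/6
    u = (i + 0.5) * h
    I += u * (1 - u) * h

cov_h = rho * I * I        # the double integral factorizes
assert abs(cov_h - rho / 36) < 1e-8
print(round(cov_h, 6))  # 0.013889
```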

Let H be a family of monotonically nonincreasing functions h : ⟨0, 1⟩ → R such that

(1) ||h||_∞ ≤ 1,
(2) ∫₀¹ h(t) dt = 0,
(3) ∫₀ˣ h(t) dt ≥ 0 for x ∈ ⟨0, 1⟩.

Consider copulas of the form

C(u, v) = uv + ρ H₁(u) H₂(v),   (4.2)

where H₁(u) = ∫₀ᵘ h₁(t) dt, H₂(v) = ∫₀ᵛ h₂(t) dt for some h₁, h₂ ∈ H and ρ ∈ (0, 1). It is easy to see that C(u, v) is a copula with the PQD property. Covariance inequalities for absolutely continuous r.v.'s X, Y with a copula of the form (4.2) were studied in [7], where only measurability of the functions h₁ and h₂ is demanded.

From the monotonicity of the functions h_i ∈ H, i ∈ {1, 2}, we conclude that H_i is a concave function. As a result, we get

∫₀¹ H_i(t) dt ≥ (1/2) H_i(x), for any x ∈ ⟨0, 1⟩,   (4.3)

which means that under the graph of H_i one can fit a triangle with unit base and height H_i(x).

Now, let us put x_i = sup{x ∈ ⟨0, 1⟩ : h_i(x) ≥ 0}. From ∫₀¹ h_i(t) dt = 0, we get ∫₀^{x_i} h_i(t) dt = −∫_{x_i}¹ h_i(t) dt. Thus, it is easy to see that

∫₀¹ |h_i(t)| dt = ∫₀^{x_i} h_i(t) dt − ∫_{x_i}¹ h_i(t) dt = 2 ∫₀^{x_i} h_i(t) dt = 2 H_i(x_i).   (4.4)

Formulas (4.3) and (4.4) lead to

∫₀¹ |h_i(t)| dt ≤ 4 ∫₀¹ H_i(x) dx.   (4.5)

The above inequality enables us to prove the following comparison theorem for FGM-distributed r.v.'s.

Theorem 4.1 Let X, Y be absolutely continuous r.v.'s with the copula (4.2) and bounded densities. Then, for any Borel set B ⊂ R²,

|H_{X,Y}(B)| ≤ 16 ||f_X||_∞ ||f_Y||_∞ Cov_H(X, Y).   (4.6)

Proof Under our assumptions

F_{X,Y}(x, y) = F_X(x)F_Y(y) + ρ H₁(F_X(x)) H₂(F_Y(y)),

f_{X,Y}(x, y) = f_X(x) f_Y(y) + ρ h₁(F_X(x)) f_X(x) h₂(F_Y(y)) f_Y(y).

Thus, for any Borel set B ⊂ R² we get

|H_{X,Y}(B)| = |P([X, Y] ∈ B) − P([X', Y'] ∈ B)| = |∫∫_B ρ h₁(F_X(x)) f_X(x) h₂(F_Y(y)) f_Y(y) dx dy|
≤ ρ ∫₀¹ ∫₀¹ |h₁(u) h₂(v)| du dv ≤ 16 ρ ∫₀¹ ∫₀¹ H₁(u) H₂(v) du dv
= 16 ∫₀¹ ∫₀¹ (C(u, v) − uv) du dv ≤ 16 ||f_X||_∞ ||f_Y||_∞ Cov_H(X, Y),

by applying the transformation u = F_X(x), v = F_Y(y) and inequality (4.5). □

Similarly to Example 2.4, it may be shown that (4.6) is optimal up to a constant, i.e. the left- and right-hand sides of this inequality may converge to 0 at the same rate.

Example 4.2 Let

h₁(t) = h₂(t) = 1 for t ∈ ⟨0, 1/2⟩ and −1 for t ∈ (1/2, 1⟩.

Consider a random vector [X, Y] such that X and Y have uniform distributions on the interval (0, 1) and the distribution function of [X, Y] coincides with the copula (4.2) with ρ = 1/n. Then

H_{X,Y}(⟨0, 1/2⟩ × ⟨0, 1/2⟩ ∪ (1/2, 1⟩ × (1/2, 1⟩) = 1/(2n) and Cov(X, Y) = 1/(16n).
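The arithmetic of Example 4.2 can be verified exactly for a concrete n (our own sketch; block_prob is a hypothetical helper encoding the piecewise-constant density 1 + ρh₁(x)h₂(y)):

```python
# Exact verification of Example 4.2 for n = 4, i.e. rho = 1/4.

from fractions import Fraction as F

n = 4
rho = F(1, n)

# the density of [X, Y] is 1 + rho*h1(x)*h2(y), constant on each of the
# four quarter-squares of (0,1)^2; h_i = +1 on <0,1/2>, -1 on (1/2,1>
def block_prob(sx, sy):             # sx, sy in {+1, -1}: signs of h1, h2
    return F(1, 4) * (1 + rho * sx * sy)

# D = <0,1/2> x <0,1/2>  union  (1/2,1> x (1/2,1>
p_joint = block_prob(1, 1) + block_prob(-1, -1)
p_indep = F(1, 2)                   # independent copies: 2 * (1/2)*(1/2)
H_D = p_joint - p_indep

# Cov(X, Y) = rho * (∫_0^1 H1(u) du)^2, and the integral of the triangular
# function H1 (height 1/2, base 1) equals 1/4
cov = rho * F(1, 4) ** 2

assert H_D == rho / 2 and cov == rho / 16
assert H_D <= 16 * cov              # inequality (4.6); here ||f_X|| = ||f_Y|| = 1
print(H_D, cov)  # 1/8 1/64
```

Both sides decay like 1/n, matching the stated optimality of (4.6) up to a constant.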

Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.

References

1. I. Bagai, B.L.S. Prakasa Rao, Estimation of the survival function for stationary associated processes. Stat. Probab. Lett. 12, 385-391 (1991)

2. A. Bulinski, A. Shashkin, Limit Theorems for Associated Random Fields and Related Systems, Advanced Series on Statistical Science and Applied Probability, vol. 10 (World Scientific, Hackensack, 2007)

3. J. Esary, F. Proschan, D. Walkup, Association of random variables with applications. Ann. Math. Stat. 38, 1466-1474 (1967)

4. K. Joag-Dev, F. Proschan, Negative association of random variables with applications. Ann. Stat. 11, 286-295 (1983)

5. E.L. Lehmann, Some concepts of dependence. Ann. Math. Stat. 37, 1137-1153 (1966)

6. P. Matula, On some inequalities for positively and negatively dependent random variables with applications. Publ. Math. Debr. 63/4, 511-521 (2003)

7. P. Matula, A note on some inequalities for certain classes of positively dependent random variables. Probab. Math. Stat. 24, 17-26 (2004)

8. P. Matula, M. Ziemba, Generalized covariance inequalities. Cent. Eur. J. Math. 9(2), 281-293 (2011)

9. C.M. Newman, Asymptotic independence and limit theorems for positively and negatively dependent random variables, in Inequalities in Statistics and Probability, ed. by Y.L. Tong (Institute of Mathematical Statistics, Hayward, 1984)

10. P.E. Oliveira, Asymptotics for Associated Random Variables (Springer, Berlin, 2012)

11. B.L.S. Prakasa Rao, Associated Sequences, Demimartingales and Nonparametric Inference (Birkhauser, Basel, 2012)

12. R.B. Nelsen, An Introduction to Copulas, 2nd edn. (Springer, New York, 2006)

13. G.G. Roussas, Kernel estimates under association. Strong uniform consistency. Stat. Probab. Lett. 12, 393-403 (1991)