# Complete moment convergence for weighted sums of negatively superadditive dependent random variablesAcademic research paper on "Mathematics"


Xue et al. Journal of Inequalities and Applications (2015) 2015:117. DOI 10.1186/s13660-015-0635-2

RESEARCH | Open Access


Zhen Xue*, Liangliang Zhang, Yingjie Lei and Zhigang Chen

Correspondence: mathnuc@126.com. School of Science, North University of China, Taiyuan, 030051, P.R. China

Abstract

In this paper, we investigate the complete moment convergence for maximal partial sums of negatively superadditive dependent (NSD) random variables under some general conditions. The results obtained in this paper generalize and improve some known ones.

MSC: 60F15

Keywords: complete moment convergence; negatively superadditive dependent random variables


1 Introduction

Let {X_ni, 1 ≤ i ≤ n, n ≥ 1} be an array of rowwise random variables defined on a fixed probability space (Ω, F, P), and let {b_ni, 1 ≤ i ≤ n, n ≥ 1} be an array of real numbers. As is well known, the limiting behavior of the maximum of weighted sums max_{1≤m≤n} |∑_{i=1}^m b_ni X_ni| is very useful in many probabilistic derivations and stochastic models. Several versions are available in the literature for independent random variables under moment conditions. While the independent case is classical, the treatment of dependent variables is more recent.

One of the dependence structures that has attracted the interest of probabilists and statisticians is negative association. The concept of negatively associated random variables was introduced by Alam and Saxena [1] and carefully studied by Joag-Dev and Proschan [2].

A finite family of random variables {X_i, 1 ≤ i ≤ n} is said to be negatively associated (NA, in short) if for every pair of disjoint subsets A, B ⊂ {1, 2, ..., n},

Cov(f(X_i, i ∈ A), g(X_j, j ∈ B)) ≤ 0,

whenever f and g are coordinatewise nondecreasing functions such that this covariance exists. An infinite family of random variables is negatively associated if every finite subfamily is negatively associated.

The next dependence notion is negatively superadditive dependence, which is weaker than negative association. The concept of negatively superadditive dependent random variables was introduced by Hu [3] as follows.

© 2015 Xue et al.; licensee Springer. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.

Definition 1.1 (cf. Kemperman [4]) A function φ : Rⁿ → R is called superadditive if φ(x ∨ y) + φ(x ∧ y) ≥ φ(x) + φ(y) for all x, y ∈ Rⁿ, where ∨ stands for componentwise maximum and ∧ stands for componentwise minimum.
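
Superadditivity can be checked numerically for concrete functions. The sketch below (an illustration, not part of the paper) tests the defining inequality on random pairs of vectors for the hypothetical choice φ(x) = (∑_i x_i)², which is supermodular and hence superadditive:

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    # Hypothetical example function: phi(x) = (sum of coordinates)^2.
    # Its mixed second partials equal 2 >= 0, so it is supermodular,
    # hence superadditive in the sense of Definition 1.1.
    return x.sum() ** 2

def superadditivity_gap(x, y):
    # phi(x v y) + phi(x ^ y) - phi(x) - phi(y); the definition requires
    # this to be nonnegative for every pair (x, y).
    return phi(np.maximum(x, y)) + phi(np.minimum(x, y)) - phi(x) - phi(y)

gaps = [superadditivity_gap(rng.normal(size=5), rng.normal(size=5))
        for _ in range(10_000)]
print(min(gaps) >= -1e-12)  # no sampled pair violates the inequality
```

A function failing the check on even one pair (up to floating-point rounding) would not be superadditive.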

Definition 1.2 (cf. Hu [3]) A random vector X = (X₁, X₂, ..., X_n) is said to be negatively superadditive dependent (NSD) if

Eφ(X₁, X₂, ..., X_n) ≤ Eφ(X₁*, X₂*, ..., X_n*), (1.1)

where X₁*, X₂*, ..., X_n* are independent such that X_i* and X_i have the same distribution for each i, and φ is a superadditive function such that the expectations in (1.1) exist.

A sequence {X_n, n ≥ 1} of random variables is said to be NSD if for all n ≥ 1, (X₁, X₂, ..., X_n) is NSD.

An array {X_ni, i ≥ 1, n ≥ 1} of random variables is said to be rowwise NSD if for all n ≥ 1, {X_ni, i ≥ 1} is NSD.
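
As a concrete illustration (again not from the paper): the coordinates of a multinomial vector are negatively associated, hence NSD, so inequality (1.1) can be observed by Monte Carlo with the superadditive test function φ(x) = (∑_i x_i)², comparing against independent coordinates with the same marginal distributions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, m = 200_000, 10
p = np.array([1 / 3, 1 / 3, 1 / 3])

# Multinomial coordinates are NA, hence NSD; row sums are constant (= m).
X = rng.multinomial(m, p, size=n_trials)
# Independent coordinates with the same Binomial(m, 1/3) marginals.
X_star = rng.binomial(m, p, size=(n_trials, 3))

phi = lambda a: a.sum(axis=1) ** 2   # superadditive test function

lhs = phi(X).mean()        # E phi(X): exactly m^2 = 100 here
rhs = phi(X_star).mean()   # E phi(X*): 100 + Var(sum) = 100 + 20/3
print(lhs <= rhs)
```

The dependent side is exactly 100 (the row sum is constant), while the independent side exceeds it by the variance of the sum, in line with (1.1).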

The concept of NSD random variables was introduced by Hu [3] on the basis of the class of superadditive functions. Hu [3] gave an example showing that NSD does not imply NA, and posed the open problem of whether NA implies NSD; he also provided some basic properties and three structural theorems for NSD. Christofides and Vaggelatou [5] solved this open problem and showed that NA implies NSD. The NSD structure is thus an extension of the negatively associated structure; it is sometimes more useful and can be used to obtain many important probability inequalities. Eghbal et al. [6] derived two maximal inequalities and a strong law of large numbers for quadratic forms of NSD random variables under the assumption that {X_i, i ≥ 1} is a sequence of nonnegative NSD random variables with EX_i^r < ∞ for all i ≥ 1 and some r > 1. Shen et al. [7] established strong limit theorems for NSD random variables. Wang et al. [8] investigated the complete convergence for arrays of rowwise NSD random variables and gave applications to a nonparametric regression model. Wang et al. [9] obtained the complete convergence for weighted sums of NSD random variables and its application in the EV regression model. The main purpose of this work is to further study the complete moment convergence for weighted sums of arrays of rowwise NSD random variables without identical distribution, which generalizes and improves some known results.

Definition 1.3 A sequence of random variables {U_n, n ≥ 1} is said to converge completely to a constant a if for any ε > 0,

∑_{n=1}^∞ P(|U_n − a| > ε) < ∞.

In this case, we write U_n → a completely. This notion was first given by Hsu and Robbins [10].

Definition 1.4 Let {Z_n, n ≥ 1} be a sequence of random variables, and let a_n > 0, b_n > 0, q > 0. If

∑_{n=1}^∞ a_n E{b_n^{-1}|Z_n| − ε}_+^q < ∞ for all ε > 0,

then the above is called complete moment convergence, a notion due to Chow [11].
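
The quantity in Definition 1.4 can be computed through the tail-probability representation E{b_n^{-1}|Z_n| − ε}_+^q = ∫_0^∞ P{b_n^{-1}|Z_n| − ε > t^{1/q}} dt, which is exactly the identity used later in the proof of Theorem 3.1. Below is a numerical sanity check of this identity (an illustration with hypothetical choices Z ~ N(0,1), b_n = 1, ε = 0.5, q = 2):

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(2)
q, eps = 2.0, 0.5

# Left side: Monte Carlo estimate of E{|Z| - eps}_+^q for Z ~ N(0, 1).
Z = np.abs(rng.normal(size=1_000_000))
lhs = np.mean(np.maximum(Z - eps, 0.0) ** q)

# Right side: integral of P{|Z| - eps > t^(1/q)} dt over (0, inf).
# Substituting t = u^q gives the integrand q u^(q-1) P(|Z| > eps + u),
# with the exact half-normal tail P(|Z| > x) = erfc(x / sqrt(2)).
u = np.linspace(0.0, 8.0, 20_001)
f = q * u ** (q - 1.0) * np.array([erfc((eps + v) / sqrt(2.0)) for v in u])
rhs = float(np.sum((f[1:] + f[:-1]) * 0.5 * np.diff(u)))  # trapezoidal rule

print(abs(lhs - rhs) < 0.01)
```

The Monte Carlo average and the tail integral agree up to sampling error, as the Fubini argument predicts.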

Let {X_nk, k ≥ 1, n ≥ 1} be an array of NSD random variables, let {a_n, n ≥ 1} be a sequence of positive real numbers such that a_n ↑ ∞, and let {ψ_k(t), k ≥ 1} be a sequence of positive even functions such that

ψ_k(|t|)/|t|^q ↑ and ψ_k(|t|)/|t|^p ↓ as |t| ↑ (1.2)

for some 1 ≤ q < p and each k ≥ 1. In order to state our results, we mention the following conditions:

EX_nk = 0, k ≥ 1, n ≥ 1, (1.3)

∑_{n=1}^∞ ∑_{k=1}^n E ψ_k(X_nk)/ψ_k(a_n) < ∞, (1.4)

∑_{n=1}^∞ ( ∑_{k=1}^n E(X_nk/a_n)² )^{v/2} < ∞, (1.5)

where v ≥ p is a positive integer.

The following are examples of functions ψ_k(t) satisfying assumption (1.2): ψ_k(t) = |t|^β for some q ≤ β ≤ p, or ψ_k(t) = |t|^q log(1 + |t|^{p−q}) for t ∈ (−∞, +∞). Note that such functions need not be monotone on (−∞, +∞), while it is simple to show that, under condition (1.2), ψ_k(t) is increasing for t > 0. In fact, ψ_k(t) = (ψ_k(t)/|t|^q) · |t|^q for t > 0, and both factors are nondecreasing as |t| ↑, hence ψ_k(t) ↑.
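
Condition (1.2) can be verified directly for the second example. The sketch below (illustrative, with hypothetically chosen exponents q = 1.5, p = 2.5) checks numerically that ψ(t)/|t|^q is nondecreasing and ψ(t)/|t|^p is nonincreasing for ψ(t) = |t|^q log(1 + |t|^{p−q}):

```python
import numpy as np

# Hypothetical exponents with 1 <= q < p, as required by (1.2).
q, p = 1.5, 2.5
t = np.linspace(0.01, 50.0, 100_000)
psi = t ** q * np.log1p(t ** (p - q))

ratio_q = psi / t ** q   # = log(1 + t^(p-q)): increasing in t
ratio_p = psi / t ** p   # = log(1 + s)/s with s = t^(p-q): decreasing in t

print(np.all(np.diff(ratio_q) >= 0), np.all(np.diff(ratio_p) <= 0))
```

Both monotonicity requirements hold on the whole grid, so this ψ is an admissible choice in (1.2).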

Recently, Shen et al. [7] obtained the following complete convergence result for weighted sums of NSD random variables.

Theorem A Let {X_n, n ≥ 1} be a sequence of NSD random variables. Assume that {g_n(x), n ≥ 1} is a sequence of even functions defined on R, positive and nondecreasing on the half-line x > 0. Suppose that one or the other of the following conditions is satisfied for every n ≥ 1:

(i) for some 0 < r ≤ 1, x^r/g_n(x) is a nondecreasing function of x on the half-line x > 0;

(ii) for some 1 < r ≤ 2, x/g_n(x) and g_n(x)/x^r are nonincreasing functions of x on the half-line x > 0, and EX_n = 0.

For any positive sequence {a_n, n ≥ 1} with a_n ↑ ∞, if we assume that

∑_{n=1}^∞ E g_n(X_n)/g_n(a_n) < ∞, (1.6)

then ∑_{n=1}^∞ X_n/a_n converges almost surely, and therefore lim_{n→∞} (1/a_n) ∑_{i=1}^n X_i = 0 a.s.

For more details about this type of complete convergence, one can refer to Wu [12, 13], Gan and Chen [14], Yang [15], Shao [16], Wu [17, 18], Chen and Sung [19], and so on. The purpose of this paper is to extend Theorem A to complete moment convergence, which is a more general version of complete convergence. Throughout this work, the symbol C always stands for a generic positive constant, which may vary from one place to another.

2 Preliminary lemmas

In this section, we give the following lemmas which will be used to prove our main results.

Lemma 2.1 (cf. Hu [3]) If (X₁, X₂, ..., X_n) is NSD and g₁, g₂, ..., g_n are nondecreasing functions, then (g₁(X₁), g₂(X₂), ..., g_n(X_n)) is NSD.

Lemma 2.2 (cf. Wang et al. [8]) Let p > 1, and let {X_n, n ≥ 1} be a sequence of NSD random variables with E|X_i|^p < ∞ for each i ≥ 1. Then, for all n ≥ 1,

E( max_{1≤j≤n} |∑_{i=1}^j X_i|^p ) ≤ C_p ∑_{i=1}^n E|X_i|^p for 1 < p ≤ 2,

E( max_{1≤j≤n} |∑_{i=1}^j X_i|^p ) ≤ C_p { ∑_{i=1}^n E|X_i|^p + ( ∑_{i=1}^n EX_i² )^{p/2} } for p > 2,

where C_p depends only on p.
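
Since independent random variables are trivially NSD (equality holds in (1.1)), the Rosenthal-type bound can be sanity-checked by Monte Carlo. The sketch below (illustrative; the constant 20 is an arbitrary hypothetical choice, not the optimal C_p) checks the p > 2 case for iid standard normals with p = 4:

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials, p = 100, 20_000, 4

# iid N(0, 1) rows; independent variables are in particular NSD.
X = rng.normal(size=(trials, n))
S = np.cumsum(X, axis=1)                        # partial sums S_1, ..., S_n
lhs = np.mean(np.max(np.abs(S), axis=1) ** p)   # E max_j |S_j|^p, Monte Carlo

sum_p = n * 3.0          # sum of E|X_i|^4; E Z^4 = 3 for Z ~ N(0, 1)
sum_2 = float(n)         # sum of E X_i^2
bound = sum_p + sum_2 ** (p / 2)                # Rosenthal-type right-hand side

print(lhs <= 20 * bound)
```

Here E max_j |S_j|^4 is of order n², matching the (∑ EX_i²)^{p/2} term, which is why that term cannot be dropped for p > 2.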

Lemma 2.3 Let {X_nk, k ≥ 1, n ≥ 1} be an array of NSD random variables, and let {a_n, n ≥ 1} be a sequence of positive real numbers such that a_n ↑ ∞. Also, let {ψ_k(t), k ≥ 1} be a sequence of positive even functions satisfying (1.2) for 1 ≤ q < p. Then (1.4) implies the following statements:

(i) for r ≥ 1 and 0 < u ≤ q,

∑_{n=1}^∞ ( ∑_{k=1}^n a_n^{-u} E|X_nk|^u I(|X_nk| > a_n) )^r < ∞; (2.1)

(ii) for r ≥ 1 and v ≥ p,

∑_{n=1}^∞ ( ∑_{k=1}^n a_n^{-v} E|X_nk|^v I(|X_nk| ≤ a_n) )^r < ∞. (2.2)

Proof From (1.2), for |t| > a_n and 0 < u ≤ q we have |t|^u/a_n^u ≤ |t|^q/a_n^q ≤ ψ_k(t)/ψ_k(a_n), while for |t| ≤ a_n and v ≥ p we have |t|^v/a_n^v ≤ |t|^p/a_n^p ≤ ψ_k(t)/ψ_k(a_n). Hence, by (1.4),

∑_{n=1}^∞ ∑_{k=1}^n a_n^{-u} E|X_nk|^u I(|X_nk| > a_n) ≤ ∑_{n=1}^∞ ∑_{k=1}^n E ψ_k(X_nk)/ψ_k(a_n) ≤ C,

∑_{n=1}^∞ ∑_{k=1}^n a_n^{-v} E|X_nk|^v I(|X_nk| ≤ a_n) ≤ ∑_{n=1}^∞ ∑_{k=1}^n E ψ_k(X_nk)/ψ_k(a_n) ≤ C.

In particular the inner sums tend to 0 as n → ∞, so they are eventually bounded by 1, and raising them to any power r ≥ 1 preserves summability. This proves (2.1) and (2.2), where r ≥ 1, 0 < u ≤ q, and v ≥ p. The proof is complete.

3 Main results and their proofs

Theorem 3.1 Let {X_nk, k ≥ 1, n ≥ 1} be an array of NSD random variables, and let {a_n, n ≥ 1} be a sequence of positive real numbers such that a_n ↑ ∞. Also, let {ψ_k(t), k ≥ 1} be a sequence of positive even functions satisfying (1.2) for 1 ≤ q < p ≤ 2. Then, under conditions (1.3) and (1.4), we have

∑_{n=1}^∞ a_n^{-q} E{ max_{1≤j≤n} |∑_{k=1}^j X_nk| − εa_n }_+^q < ∞, ∀ε > 0. (3.1)

Proof For n ≥ 1, denote M_n(X) = max_{1≤j≤n} |∑_{k=1}^j X_nk|. It is easy to check that

∑_{n=1}^∞ a_n^{-q} E{M_n(X) − εa_n}_+^q
= ∑_{n=1}^∞ a_n^{-q} ∫_0^∞ P{M_n(X) − εa_n > t^{1/q}} dt
= ∑_{n=1}^∞ a_n^{-q} ( ∫_0^{a_n^q} P{M_n(X) > εa_n + t^{1/q}} dt + ∫_{a_n^q}^∞ P{M_n(X) > εa_n + t^{1/q}} dt )
≤ ∑_{n=1}^∞ P{M_n(X) > εa_n} + ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ P{M_n(X) > t^{1/q}} dt
=: I₁ + I₂.

To prove (3.1), it suffices to prove that I₁ < ∞ and I₂ < ∞. Now let us prove them step by step. Firstly, we prove that I₁ < ∞. For all n ≥ 1, define

X_nk^(n) = X_nk I(|X_nk| ≤ a_n), T_j^(n) = (1/a_n) ∑_{k=1}^j (X_nk^(n) − EX_nk^(n));

then for all ε > 0 it is easy to see that

P( max_{1≤j≤n} |∑_{k=1}^j X_nk| > εa_n ) ≤ P( max_{1≤k≤n} |X_nk| > a_n ) + P( max_{1≤j≤n} |T_j^(n)| > ε − max_{1≤j≤n} (1/a_n) |∑_{k=1}^j EX_nk^(n)| ). (3.2)

By (1.2), (1.3), (1.4), and Lemma 2.3, we have

max_{1≤j≤n} (1/a_n) |∑_{k=1}^j EX_nk^(n)|
= max_{1≤j≤n} (1/a_n) |∑_{k=1}^j EX_nk I(|X_nk| ≤ a_n)|
= max_{1≤j≤n} (1/a_n) |∑_{k=1}^j EX_nk I(|X_nk| > a_n)|
≤ (1/a_n) ∑_{k=1}^n E|X_nk| I(|X_nk| > a_n) → 0 as n → ∞, (3.3)

where the second equality uses EX_nk = 0. From (3.2) and (3.3), it follows that for n large enough,

P( max_{1≤j≤n} |∑_{k=1}^j X_nk| > εa_n ) ≤ ∑_{k=1}^n P(|X_nk| > a_n) + P( max_{1≤j≤n} |T_j^(n)| > ε/2 ). (3.4)

Hence we only need to prove that

I := ∑_{n=1}^∞ ∑_{k=1}^n P(|X_nk| > a_n) < ∞, II := ∑_{n=1}^∞ P( max_{1≤j≤n} |T_j^(n)| > ε/2 ) < ∞.

For I, it follows by Markov's inequality and Lemma 2.3 that

I = ∑_{n=1}^∞ ∑_{k=1}^n P(|X_nk| > a_n) ≤ ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) < ∞.

For II, take r ≥ 2. Since p ≤ 2 ≤ r, we have by Markov's inequality, Lemma 2.1, Lemma 2.2, the C_r-inequality, and Lemma 2.3 that

II ≤ ∑_{n=1}^∞ (2/ε)^r E max_{1≤j≤n} |T_j^(n)|^r
≤ C ∑_{n=1}^∞ a_n^{-r} ( ∑_{k=1}^n E|X_nk^(n)|^r + ( ∑_{k=1}^n E|X_nk^(n)|² )^{r/2} )
≤ C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-p} E|X_nk|^p I(|X_nk| ≤ a_n) + C ∑_{n=1}^∞ ( ∑_{k=1}^n a_n^{-p} E|X_nk|^p I(|X_nk| ≤ a_n) )^{r/2}
< ∞,

where the third inequality uses |X_nk^(n)|^r ≤ a_n^{r−p} |X_nk|^p I(|X_nk| ≤ a_n) and |X_nk^(n)|² ≤ a_n^{2−p} |X_nk|^p I(|X_nk| ≤ a_n), since r ≥ 2 ≥ p.

Next we prove that I₂ < ∞. Denote Y_nk = X_nk I(|X_nk| ≤ t^{1/q}), Z_nk = X_nk − Y_nk, and M_n(Y) = max_{1≤j≤n} |∑_{k=1}^j Y_nk|. Obviously,

P{M_n(X) > t^{1/q}} ≤ ∑_{k=1}^n P{|X_nk| > t^{1/q}} + P{M_n(Y) > t^{1/q}}.

Hence,

I₂ ≤ ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{a_n^q}^∞ P{|X_nk| > t^{1/q}} dt + ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ P{M_n(Y) > t^{1/q}} dt
=: I₃ + I₄.

For I₃, by Lemma 2.3, we have

I₃ = ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{a_n^q}^∞ P{|X_nk| I(|X_nk| > a_n) > t^{1/q}} dt
≤ ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_0^∞ P{|X_nk| I(|X_nk| > a_n) > t^{1/q}} dt
= ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) < ∞,

where the last equality uses the identity E|ξ|^q = ∫_0^∞ P(|ξ| > t^{1/q}) dt.

Now let us prove that I₄ < ∞. Firstly, it follows by (1.3) and Lemma 2.3 that

max_{t ≥ a_n^q} max_{1≤j≤n} t^{-1/q} |∑_{k=1}^j EY_nk|
= max_{t ≥ a_n^q} max_{1≤j≤n} t^{-1/q} |∑_{k=1}^j EZ_nk|
≤ max_{t ≥ a_n^q} t^{-1/q} ∑_{k=1}^n E|X_nk| I(|X_nk| > t^{1/q})
≤ a_n^{-1} ∑_{k=1}^n E|X_nk| I(|X_nk| > a_n)
≤ ∑_{k=1}^n a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) → 0 as n → ∞. (3.5)

Therefore, for n sufficiently large,

max_{t ≥ a_n^q} max_{1≤j≤n} t^{-1/q} |∑_{k=1}^j EY_nk| < 1/2.

Then, for n sufficiently large,

P{M_n(Y) > t^{1/q}} ≤ P{ max_{1≤j≤n} |∑_{k=1}^j (Y_nk − EY_nk)| > t^{1/q}/2 }, t ≥ a_n^q. (3.6)

Let d_n = [a_n] + 1. By (3.6), Markov's inequality, Lemma 2.1, Lemma 2.2, and the C_r-inequality, we can see that

I₄ ≤ C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-2/q} E( max_{1≤j≤n} |∑_{k=1}^j (Y_nk − EY_nk)| )² dt
≤ C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-2/q} ∑_{k=1}^n E(Y_nk − EY_nk)² dt
≤ C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{a_n^q}^∞ t^{-2/q} EY_nk² dt
= C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{a_n^q}^∞ t^{-2/q} EX_nk² I(|X_nk| ≤ d_n) dt
  + C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{d_n^q}^∞ t^{-2/q} EX_nk² I(d_n < |X_nk| ≤ t^{1/q}) dt
=: I₄₁ + I₄₂.

For I₄₁, since q < 2, we have

I₄₁ = C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} EX_nk² I(|X_nk| ≤ d_n) ∫_{a_n^q}^∞ t^{-2/q} dt
≤ C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-2} EX_nk² I(|X_nk| ≤ d_n)
= C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-2} EX_nk² I(|X_nk| ≤ a_n) + C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-2} EX_nk² I(a_n < |X_nk| ≤ d_n)
=: I'₄₁ + I''₄₁.

Since p ≤ 2, Lemma 2.3 implies I'₄₁ < ∞. Now we prove that I''₄₁ < ∞. Since q < 2 and (a_n + 1)/a_n → 1 as n → ∞, by Lemma 2.3 we have

I''₄₁ ≤ C ∑_{n=1}^∞ ∑_{k=1}^n (d_n^{2−q}/a_n²) E|X_nk|^q I(a_n < |X_nk| ≤ d_n)
≤ C ∑_{n=1}^∞ ∑_{k=1}^n ((a_n + 1)/a_n)^{2−q} a_n^{-q} E|X_nk|^q I(|X_nk| > a_n)
≤ C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) < ∞.

Let t = u^q in I₄₂ (so that t^{-2/q} dt = q u^{q−3} du). Note that for q < 2,

∫_{d_n}^∞ u^{q−3} EX_nk² I(d_n < |X_nk| ≤ u) du
= ∫_{d_n}^∞ u^{q−3} E[ X_nk² I(|X_nk| > d_n) I(|X_nk| ≤ u) ] du
= E[ X_nk² I(|X_nk| > d_n) ∫_{|X_nk|}^∞ u^{q−3} du ]
= C E[ X_nk² I(|X_nk| > d_n) |X_nk|^{q−2} ]
= C E|X_nk|^q I(|X_nk| > d_n).

Then, by Lemma 2.3 and d_n > a_n, we have

I₄₂ = C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{d_n}^∞ u^{q−3} EX_nk² I(d_n < |X_nk| ≤ u) du
≤ C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) < ∞.

This completes the proof of Theorem 3.1.

Theorem 3.2 Let {X_nk, k ≥ 1, n ≥ 1} be an array of NSD random variables, and let {a_n, n ≥ 1} be a sequence of positive real numbers such that a_n ↑ ∞. Also, let {ψ_k(t), k ≥ 1} be a sequence of positive even functions satisfying (1.2) for 1 ≤ q < p and p ≥ 2. Then conditions (1.3)-(1.5) imply (3.1).

Proof Keeping the notation of Theorem 3.1, by a similar argument we can easily prove that I₁ < ∞, I₃ < ∞, and that (3.2) and (3.3) hold. To complete the proof, we only need to prove that I₄ < ∞.

Take an integer δ ≥ v with δ > p, and let d_n = [a_n] + 1. By (3.6), Markov's inequality, Lemma 2.1, Lemma 2.2, and the C_r-inequality, we can get

I₄ ≤ C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} E( max_{1≤j≤n} |∑_{k=1}^j (Y_nk − EY_nk)| )^δ dt
≤ C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} ( ∑_{k=1}^n E|Y_nk|^δ + ( ∑_{k=1}^n EY_nk² )^{δ/2} ) dt
= C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} E|Y_nk|^δ dt + C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} ( ∑_{k=1}^n EY_nk² )^{δ/2} dt
=: I₄₃ + I₄₄.

For I₄₃, we have

I₄₃ = C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} E|X_nk|^δ I(|X_nk| ≤ d_n) dt
  + C ∑_{n=1}^∞ ∑_{k=1}^n a_n^{-q} ∫_{d_n^q}^∞ t^{-δ/q} E|X_nk|^δ I(d_n < |X_nk| ≤ t^{1/q}) dt
=: I'₄₃ + I''₄₃.

By arguments similar to those for I₄₁ < ∞ and I₄₂ < ∞ (replacing the exponent 2 by δ), we get I'₄₃ < ∞ and I''₄₃ < ∞.

For I₄₄, since δ > 2, we can see by the C_r-inequality that

I₄₄ = C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} ( ∑_{k=1}^n EX_nk² I(|X_nk| ≤ a_n) + ∑_{k=1}^n EX_nk² I(a_n < |X_nk| ≤ t^{1/q}) )^{δ/2} dt
≤ C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} ( ∑_{k=1}^n EX_nk² I(|X_nk| ≤ a_n) )^{δ/2} dt
  + C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} ( ∑_{k=1}^n EX_nk² I(a_n < |X_nk| ≤ t^{1/q}) )^{δ/2} dt
=: I'₄₄ + I''₄₄.

Since δ > p ≥ q, we have ∫_{a_n^q}^∞ t^{-δ/q} dt = C a_n^{q−δ}, and hence

I'₄₄ = C ∑_{n=1}^∞ a_n^{-q} ( ∑_{k=1}^n EX_nk² I(|X_nk| ≤ a_n) )^{δ/2} ∫_{a_n^q}^∞ t^{-δ/q} dt
≤ C ∑_{n=1}^∞ ( ∑_{k=1}^n E(X_nk/a_n)² )^{δ/2} < ∞,

where the last series converges by (1.5): since ∑_{k=1}^n E(X_nk/a_n)² → 0, it is eventually bounded by 1, and δ/2 ≥ v/2.

Next we prove that I''₄₄ < ∞. To start with, we consider the case 1 ≤ q < 2. Since EX_nk² I(a_n < |X_nk| ≤ t^{1/q}) ≤ t^{(2−q)/q} E|X_nk|^q I(|X_nk| > a_n) and δ > 2, by Lemma 2.3 we have

I''₄₄ ≤ C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} ( t^{(2−q)/q} ∑_{k=1}^n E|X_nk|^q I(|X_nk| > a_n) )^{δ/2} dt
= C ∑_{n=1}^∞ a_n^{-q} ( ∑_{k=1}^n E|X_nk|^q I(|X_nk| > a_n) )^{δ/2} ∫_{a_n^q}^∞ t^{-δ/2} dt
= C ∑_{n=1}^∞ ( ∑_{k=1}^n a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) )^{δ/2} < ∞.

Finally, we prove that I''₄₄ < ∞ in the case 2 ≤ q < p. Since δ > q and δ > 2, we have by Lemma 2.3 that

I''₄₄ ≤ C ∑_{n=1}^∞ a_n^{-q} ∫_{a_n^q}^∞ t^{-δ/q} ( ∑_{k=1}^n EX_nk² I(|X_nk| > a_n) )^{δ/2} dt
= C ∑_{n=1}^∞ a_n^{-q} ( ∑_{k=1}^n EX_nk² I(|X_nk| > a_n) )^{δ/2} ∫_{a_n^q}^∞ t^{-δ/q} dt
= C ∑_{n=1}^∞ ( ∑_{k=1}^n a_n^{-2} EX_nk² I(|X_nk| > a_n) )^{δ/2}
≤ C ∑_{n=1}^∞ ( ∑_{k=1}^n a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) )^{δ/2} < ∞,

where the last inequality uses a_n^{-2} EX_nk² I(|X_nk| > a_n) ≤ a_n^{-q} E|X_nk|^q I(|X_nk| > a_n) for q ≥ 2.

Thus we get the desired result immediately. The proof is completed.

Corollary 3.1 Let {X_nk, k ≥ 1, n ≥ 1} be an array of NSD random variables with mean zero. If for some α > 0, v > 2, and q ≥ 1,

max_{1≤k≤n} E|X_nk|^v = O(n^α), (3.7)

where v/q − α > max{v/2, 2}, then for any ε > 0,

∑_{n=1}^∞ n^{-1} E{ max_{1≤j≤n} |∑_{k=1}^j X_nk| − εn^{1/q} }_+^q < ∞.

Proof Put ψ_k(|t|) = |t|^v, p = v + δ₀ for some δ₀ > 0, and a_n = n^{1/q}. Since v > 2 and v/q − α > max{v/2, 2}, we have

ψ_k(|t|)/|t|^q = |t|^{v−q} ↑ and ψ_k(|t|)/|t|^p = 1/|t|^{δ₀} ↓ as |t| ↑ ∞,

so (1.2) holds. It follows by (3.7) and v/q − α > 2 that

∑_{n=1}^∞ ∑_{k=1}^n E ψ_k(X_nk)/ψ_k(a_n) = ∑_{n=1}^∞ ∑_{k=1}^n E|X_nk|^v / n^{v/q} ≤ C ∑_{n=1}^∞ 1/n^{v/q − α − 1} < ∞, (3.9)

so (1.4) holds. Since v > 2, by Jensen's inequality it follows that

∑_{k=1}^n E(X_nk/a_n)² = ∑_{k=1}^n EX_nk² / n^{2/q} ≤ ∑_{k=1}^n (E|X_nk|^v)^{2/v} / n^{2/q} ≤ C / n^{2/q − 2α/v − 1}.

Clearly 2/q − 2α/v − 1 > 0, since v/q − α > v/2. Take an integer s ≥ p such that (s/2)(2/q − 2α/v − 1) > 1. Therefore,

∑_{n=1}^∞ ( ∑_{k=1}^n E(X_nk/a_n)² )^{s/2} ≤ C ∑_{n=1}^∞ n^{-(s/2)(2/q − 2α/v − 1)} < ∞, (3.10)

so (1.5) holds with v = s there. Combining Theorem 3.2 with (3.9) and (3.10), we obtain Corollary 3.1 immediately.

Remark 3.1 Note that in this paper we consider the case 1 ≤ q < p, which has a wider scope than the case q = 1 in Shen et al. [7]. In addition, compared with a single sequence of NSD random variables, arrays of rowwise NSD random variables not only enjoy many related properties but also have a wider range of applications, so it is significant to study them.

Remark 3.2 Under the conditions of Theorem 3.1, we have

∞ > ∑_{n=1}^∞ a_n^{-q} E{ max_{1≤j≤n} |∑_{k=1}^j X_nk| − εa_n }_+^q
= ∑_{n=1}^∞ a_n^{-q} ∫_0^∞ P{ max_{1≤j≤n} |∑_{k=1}^j X_nk| − εa_n > t^{1/q} } dt
≥ ∑_{n=1}^∞ a_n^{-q} ∫_0^{a_n^q} P{ max_{1≤j≤n} |∑_{k=1}^j X_nk| > εa_n + t^{1/q} } dt
≥ ∑_{n=1}^∞ a_n^{-q} ∫_0^{a_n^q} P{ max_{1≤j≤n} |∑_{k=1}^j X_nk| > 2εa_n } dt
= ∑_{n=1}^∞ P{ max_{1≤j≤n} |∑_{k=1}^j X_nk| > 2εa_n }.

Then max_{1≤j≤n} |∑_{k=1}^j X_nk| / a_n → 0 completely, and the conclusion of Theorem A is obtained directly. So the result of Theorem 3.1 implies Theorem A and generalizes the corresponding result.

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

All authors contributed equally to the writing of this paper. All authors read and approved the final manuscript.

Acknowledgements

We sincerely thank the referee for the valuable suggestions and comments which improved the paper.

Received: 16 December 2014. Accepted: 18 March 2015. Published online: 01 April 2015.

References

1. Alam, K, Saxena, KML: Positive dependence in multivariate distributions. Commun. Stat., Theory Methods 10, 1183-1196 (1981)
2. Joag-Dev, K, Proschan, F: Negative association of random variables with applications. Ann. Stat. 11(1), 286-295 (1983)
3. Hu, TZ: Negatively superadditive dependence of random variables with applications. Chinese J. Appl. Probab. Statist. 16, 133-144 (2000)
4. Kemperman, JHB: On the FKG-inequalities for measures on a partially ordered space. Proc. K. Ned. Akad. Wet., Ser. A, Indag. Math. 80, 313-331 (1977)
5. Christofides, TC, Vaggelatou, E: A connection between supermodular ordering and positive/negative association. J. Multivar. Anal. 88, 138-151 (2004)
6. Eghbal, N, Amini, M, Bozorgnia, A: Some maximal inequalities for quadratic forms of negative superadditive dependence random variables. Stat. Probab. Lett. 80, 587-591 (2010)
7. Shen, Y, Wang, XJ, Yang, WZ, Hu, SH: Almost sure convergence theorem and strong stability for weighted sums of NSD random variables. Acta Math. Sin. Engl. Ser. 29(4), 743-756 (2013)
8. Wang, XJ, Deng, X, Zheng, LL, Hu, SH: Complete convergence for arrays of rowwise negatively superadditive dependent random variables and its applications. Statistics 48(4), 834-850 (2014)
9. Wang, XJ, Shen, AT, Chen, ZY, Hu, SH: Complete convergence for weighted sums of NSD random variables and its application in the EV regression model. TEST 24, 166-184 (2015)
10. Hsu, PL, Robbins, H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 33(2), 25-31 (1947)
11. Chow, YS: On the rate of moment complete convergence of sample sums and extremes. Bull. Inst. Math. Acad. Sin. 16, 177-201 (1988)
12. Wu, QY: A complete convergence theorem for weighted sums of arrays of rowwise negatively dependent random variables. J. Inequal. Appl. 2012, Article ID 50 (2012)
13. Wu, QY: Probability Limit Theory for Mixing Sequences. Science Press of China, Beijing (2006)
14. Gan, SX, Chen, PY: Some limit theorems for sequences of pairwise NQD random variables. Acta Math. Sci. 28(2), 269-281 (2008)
15. Yang, SC: Maximal moment inequality for partial sum of strong mixing sequences and application. Acta Math. Sin. Engl. Ser. 23(6), 1013-1024 (2007)
16. Shao, QM: Almost sure invariance principles for mixing sequences of random variables. Stoch. Process. Appl. 48(2), 319-334 (1993)
17. Wu, YF: Limiting behavior for arrays of rowwise ρ*-mixing random variables. Lith. Math. J. 52, 214-221 (2012)
18. Wu, YF: On limiting behavior for arrays of rowwise negatively orthant dependent random variables. J. Korean Stat. Soc. 42, 61-70 (2013)
19. Chen, PY, Sung, SH: On the strong convergence for weighted sums of negatively associated random variables. Stat. Probab. Lett. 92, 45-52 (2014)
