Electronic Journal of Probability
Vol. 14 (2009), Paper no. 35, pages 978–1011.
Journal URL: http://www.math.washington.edu/~ejpecp/
Rates of convergence for minimal distances in the central limit theorem under projective criteria
Jérôme Dedecker∗ Florence Merlevède† Emmanuel Rio‡
Abstract
In this paper, we give estimates of ideal or minimal distances between the distribution of the normalized partial sum and the limiting Gaussian distribution for stationary martingale difference sequences or stationary sequences satisfying projective criteria. Applications to functions of linear processes and to functions of expanding maps of the interval are given.
Key words: Minimal and ideal distances, rates of convergence, martingale difference sequences, stationary sequences, projective criteria, weak dependence, uniform mixing.
AMS 2000 Subject Classification: Primary 60F05.
Submitted to EJP on November 5, 2007, final version accepted June 6, 2008.
∗Université Paris 6, LSTA, 175 rue du Chevaleret, 75013 Paris, FRANCE. E-mail: jerome.dedecker@upmc.fr
†Université Paris Est, Laboratoire de mathématiques, UMR 8050 CNRS, Bâtiment Copernic, 5 Boulevard Descartes, 77435 Champs-Sur-Marne, FRANCE. E-mail: florence.merlevede@univ-mlv.fr
‡Université de Versailles, Laboratoire de mathématiques, UMR 8100 CNRS, Bâtiment Fermat, 45 Avenue des Etats- Unis, 78035 Versailles, FRANCE. E-mail: rio@math.uvsq.fr
1 Introduction and Notations
Let $X_1, X_2, \dots$ be a strictly stationary sequence of real-valued random variables (r.v.) with mean zero and finite variance. Set $S_n = X_1 + X_2 + \cdots + X_n$. We denote by $P_{n^{-1/2}S_n}$ the law of $n^{-1/2}S_n$ and by $G_{\sigma^2}$ the normal distribution $N(0,\sigma^2)$. In this paper, we shall give quantitative estimates of the approximation of $P_{n^{-1/2}S_n}$ by $G_{\sigma^2}$ in terms of minimal or ideal metrics.
Let $L(\mu,\nu)$ be the set of probability laws on $\mathbb{R}^2$ with marginals $\mu$ and $\nu$. Let us consider the following minimal distances (sometimes called Wasserstein distances of order $r$):
$$ W_r(\mu,\nu) = \begin{cases} \inf\Big\{\displaystyle\int |x-y|^r \, P(dx,dy) : P \in L(\mu,\nu)\Big\} & \text{if } 0 < r < 1, \\[2mm] \inf\Big\{\Big(\displaystyle\int |x-y|^r \, P(dx,dy)\Big)^{1/r} : P \in L(\mu,\nu)\Big\} & \text{if } r \ge 1. \end{cases} $$
It is well known that for two probability measures $\mu$ and $\nu$ on $\mathbb{R}$ with respective distribution functions (d.f.) $F$ and $G$,
$$ W_r(\mu,\nu) = \Big(\int_0^1 |F^{-1}(u) - G^{-1}(u)|^r \, du\Big)^{1/r} \quad \text{for any } r \ge 1. \qquad (1.1) $$
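For $r \ge 1$ and empirical measures with equal numbers of atoms, the quantile-coupling formula (1.1) says the optimal coupling matches order statistics. A minimal numerical sketch (the function name and sample values are ours; for $0 < r < 1$ sorted matching only yields an upper bound, so the sketch sticks to $r \ge 1$):

```python
def wasserstein_r(xs, ys, r=1.0):
    """Empirical W_r, r >= 1, between two equal-size samples via (1.1):
    in dimension one the optimal coupling matches order statistics,
    i.e. F^{-1} and G^{-1} evaluated on a common grid of levels u."""
    assert r >= 1 and len(xs) == len(ys)
    cost = sum(abs(x - y) ** r for x, y in zip(sorted(xs), sorted(ys))) / len(xs)
    return cost ** (1.0 / r)

# Shifting a sample by a constant c gives W_r = c for every r >= 1.
print(wasserstein_r([0.0, 1.0, 2.0], [1.0, 2.0, 3.0], r=2.0))  # prints 1.0
```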
We also consider the following ideal distances of order $r$ (Zolotarev distances of order $r$). For two probability measures $\mu$ and $\nu$, and $r$ a positive real, let
$$ \zeta_r(\mu,\nu) = \sup\Big\{\int f \, d\mu - \int f \, d\nu : f \in \Lambda_r\Big\}, $$
where $\Lambda_r$ is defined as follows: denoting by $l$ the natural integer such that $l < r \le l+1$, $\Lambda_r$ is the class of real functions $f$ which are $l$-times continuously differentiable and such that
$$ |f^{(l)}(x) - f^{(l)}(y)| \le |x-y|^{r-l} \quad \text{for any } (x,y) \in \mathbb{R} \times \mathbb{R}. \qquad (1.2) $$
It follows from the Kantorovich–Rubinstein theorem (1958) that for any $0 < r \le 1$,
$$ W_r(\mu,\nu) = \zeta_r(\mu,\nu). \qquad (1.3) $$
For probability laws on the real line, Rio (1998) proved that for any $r > 1$,
$$ W_r(\mu,\nu) \le c_r \big(\zeta_r(\mu,\nu)\big)^{1/r}, \qquad (1.4) $$
where $c_r$ is a constant depending only on $r$.
For independent random variables, Ibragimov (1966) established that if $X_1 \in L^p$ for $p \in ]2,3]$, then $W_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$ (see his Theorem 4.3). Still in the case of independent r.v.'s, Zolotarev (1976) obtained the following upper bound for the ideal distance: if $X_1 \in L^p$ for $p \in ]2,3]$, then $\zeta_p(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$. From (1.4), the result of Zolotarev entails that, for $p \in ]2,3]$, $W_p(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1/p-1/2})$ (which was obtained by Sakhanenko (1985) for any $p > 2$). From (1.1) and Hölder's inequality, we easily get that for independent random variables in $L^p$ with $p \in ]2,3]$,
$$ W_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-(p-2)/2r}) \quad \text{for any } 1 \le r \le p. \qquad (1.5) $$
In this paper, we are interested in extensions of (1.5) to sequences of dependent random variables.
More precisely, for $X_1 \in L^p$ and $p$ in $]2,3]$, we shall give $L^p$-projective criteria under which, for $r \in [p-2, p]$ and $(r,p) \neq (1,3)$,
$$ W_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-(p-2)/2\max(1,r)}). \qquad (1.6) $$
As we shall see in Remark 2.4, (1.6) applied to $r = p-2$ provides the rate of convergence $O(n^{-(p-2)/(2(p-1))})$ in the Berry–Esseen theorem.
When $(r,p) = (1,3)$, Dedecker and Rio (2008) obtained that $W_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2})$ for stationary sequences of random variables in $L^3$ satisfying $L^1$ projective criteria or weak dependence assumptions (a similar result was obtained by Pène (2005) in the case where the variables are bounded). In this particular case our approach provides a new criterion under which $W_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2}\log n)$.
Our paper is organized as follows. In Section 2, we give projective conditions for stationary martingale difference sequences to satisfy (1.6) in the case $(r,p) \neq (1,3)$. To be more precise, let $(X_i)_{i\in\mathbb{Z}}$ be a stationary sequence of martingale differences with respect to some $\sigma$-algebras $(\mathcal{F}_i)_{i\in\mathbb{Z}}$ (see Section 1.1 below for the definition of $(\mathcal{F}_i)_{i\in\mathbb{Z}}$). As a consequence of our Theorem 2.1, we obtain that if $(X_i)_{i\in\mathbb{Z}}$ is in $L^p$ with $p \in ]2,3]$ and satisfies
$$ \sum_{n=1}^{\infty} \frac{1}{n^{2-p/2}} \Big\| E\Big(\frac{S_n^2}{n} \Big| \mathcal{F}_0\Big) - \sigma^2 \Big\|_{p/2} < \infty, \qquad (1.7) $$
then the upper bound (1.6) holds provided that $(r,p) \neq (1,3)$. In the case $r = 1$ and $p = 3$, we obtain the upper bound $W_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2}\log n)$.
In Section 3, starting from the coboundary decomposition going back to Gordin (1969), and using the results of Section 2, we obtain $L^p$-projective criteria ensuring (1.6) (if $(r,p) \neq (1,3)$). For instance, if $(X_i)_{i\in\mathbb{Z}}$ is a stationary sequence of $L^p$ random variables adapted to $(\mathcal{F}_i)_{i\in\mathbb{Z}}$, we obtain (1.6) for any $p \in ]2,3[$ and any $r \in [p-2,p]$ provided that (1.7) holds and the series $E(S_n|\mathcal{F}_0)$ converges in $L^p$. In the case where $p = 3$, this last condition has to be strengthened. Our approach also makes it possible to treat the case of non-adapted sequences.
Section 4 is devoted to applications. In particular, we give sufficient conditions for some functions of Harris recurrent Markov chains and for functions of linear processes to satisfy the bound (1.6) in the case $(r,p) \neq (1,3)$, and the rate $O(n^{-1/2}\log n)$ when $r = 1$ and $p = 3$. Since projective criteria are verified under weak dependence assumptions, we give an application to functions of $\phi$-dependent sequences in the sense of Dedecker and Prieur (2007). These conditions apply to unbounded functions of uniformly expanding maps.
1.1 Preliminary notations
Throughout the paper, $Y$ is a $N(0,1)$-distributed random variable. We shall also use the following notations. Let $(\Omega, \mathcal{A}, P)$ be a probability space, and $T : \Omega \mapsto \Omega$ be a bijective bimeasurable transformation preserving the probability $P$. For a $\sigma$-algebra $\mathcal{F}_0$ satisfying $\mathcal{F}_0 \subseteq T^{-1}(\mathcal{F}_0)$, we define the nondecreasing filtration $(\mathcal{F}_i)_{i\in\mathbb{Z}}$ by $\mathcal{F}_i = T^{-i}(\mathcal{F}_0)$. Let $\mathcal{F}_{-\infty} = \bigcap_{k\in\mathbb{Z}} \mathcal{F}_k$ and $\mathcal{F}_{\infty} = \bigvee_{k\in\mathbb{Z}} \mathcal{F}_k$. We shall sometimes denote by $E_i$ the conditional expectation with respect to $\mathcal{F}_i$. Let $X_0$ be a zero mean random variable with finite variance, and define the stationary sequence $(X_i)_{i\in\mathbb{Z}}$ by $X_i = X_0 \circ T^i$.
2 Stationary sequences of martingale differences.
In this section we give bounds for the ideal distance of order $r$ in the central limit theorem for stationary martingale difference sequences $(X_i)_{i\in\mathbb{Z}}$ under projective conditions.
Notation 2.1. For any $p > 2$, define the envelope norm $\|\cdot\|_{1,\Phi,p}$ by
$$ \|X\|_{1,\Phi,p} = \int_0^1 \big(1 \vee \Phi^{-1}(1-u/2)\big)^{p-2} Q_X(u) \, du, $$
where $\Phi$ denotes the d.f. of the $N(0,1)$ law, and $Q_X$ denotes the quantile function of $|X|$, that is the cadlag inverse of the tail function $x \to P(|X| > x)$.
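As a concrete numerical illustration (our own sketch, not part of the paper), the envelope norm of a standard normal variable can be evaluated by midpoint integration, using that $Q_X(u) = \Phi^{-1}(1-u/2)$ when $X \sim N(0,1)$. For $p = 2$ the weight is identically 1 and the norm reduces to $E|X| = \sqrt{2/\pi}$, which gives a check on the quadrature:

```python
from statistics import NormalDist

def envelope_norm_std_normal(p, steps=100_000):
    """Midpoint-rule evaluation of ||X||_{1,Phi,p} for X ~ N(0,1),
    where Q_X(u) = Phi^{-1}(1 - u/2) is the quantile function of |X|."""
    inv = NormalDist().inv_cdf
    total = 0.0
    for k in range(steps):
        u = (k + 0.5) / steps        # midpoint grid on (0, 1)
        q = inv(1 - u / 2)           # Q_X(u), nonnegative since 1 - u/2 >= 1/2
        total += max(1.0, q) ** (p - 2) * q
    return total / steps

# For p = 2 the weight (1 v Phi^{-1}(1-u/2))^{p-2} is 1, so the norm is E|X|.
print(round(envelope_norm_std_normal(2.0), 3))  # -> 0.798, i.e. sqrt(2/pi)
```

The weight grows with $p$, so the norm is nondecreasing in $p$, consistent with the Hölder comparison of Remark 2.1 below.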
Theorem 2.1. Let $(X_i)_{i\in\mathbb{Z}}$ be a stationary martingale difference sequence with respect to $(\mathcal{F}_i)_{i\in\mathbb{Z}}$. Let $\sigma$ denote the standard deviation of $X_0$. Let $p \in ]2,3]$. Assume that $E|X_0|^p < \infty$ and that
$$ \sum_{n=1}^{\infty} \frac{1}{n^{2-p/2}} \Big\| E\Big(\frac{S_n^2}{n} \Big| \mathcal{F}_0\Big) - \sigma^2 \Big\|_{1,\Phi,p} < \infty, \qquad (2.1) $$
and
$$ \sum_{n=1}^{\infty} \frac{1}{n^{2/p}} \Big\| E\Big(\frac{S_n^2}{n} \Big| \mathcal{F}_0\Big) - \sigma^2 \Big\|_{p/2} < \infty. \qquad (2.2) $$
Then, for any $r \in [p-2, p]$ with $(r,p) \neq (1,3)$, $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$, and for $p = 3$, $\zeta_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2}\log n)$.
Remark 2.1. Let $a > 1$ and $p > 2$. Applying Hölder's inequality, we see that there exists a positive constant $C(p,a)$ such that $\|X\|_{1,\Phi,p} \le C(p,a)\|X\|_a$. Consequently, if $p \in ]2,3]$, the two conditions (2.1) and (2.2) are implied by the condition (1.7) given in the introduction.
Remark 2.2. Under the assumptions of Theorem 2.1, $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-r/2})$ if $r < p-2$. Indeed, let $p' = r+2$. Since $p' < p$, if the conditions (2.1) and (2.2) are satisfied for $p$, they also hold for $p'$. Hence Theorem 2.1 applies with $p'$.
From (1.3) and (1.4), the following result holds for the Wasserstein distances of order $r$.
Corollary 2.1. Under the conditions of Theorem 2.1, $W_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-(p-2)/2\max(1,r)})$ for any $r$ in $[p-2,p]$, provided that $(r,p) \neq (1,3)$.
Remark 2.3. For $p$ in $]2,3]$, $W_p(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-(p-2)/2p})$. This bound was obtained by Sakhanenko (1985) in the independent case. For $p < 3$, we have $W_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$. This bound was obtained by Ibragimov (1966) in the independent case.
Remark 2.4. Recall that for two real valued random variables $X, Y$, the Ky Fan metric $\alpha(X,Y)$ is defined by $\alpha(X,Y) = \inf\{\epsilon > 0 : P(|X-Y| > \epsilon) \le \epsilon\}$. Let $\Pi(\mu,\nu)$ be the Prokhorov distance between $\mu$ and $\nu$. By Theorem 11.3.5 in Dudley (1989) and Markov's inequality, one has, for any $r > 0$,
$$ \Pi(P_X, P_Y) \le \alpha(X,Y) \le \big(E(|X-Y|^r)\big)^{1/(r+1)}. $$
Taking the minimum over the random couples $(X,Y)$ with law in $L(\mu,\nu)$, we obtain that, for any $0 < r \le 1$, $\Pi(\mu,\nu) \le (W_r(\mu,\nu))^{1/(r+1)}$. Hence, if $\Pi_n$ is the Prokhorov distance between the law of $n^{-1/2}S_n$ and the normal distribution $N(0,\sigma^2)$,
$$ \Pi_n \le \big(W_r(P_{n^{-1/2}S_n}, G_{\sigma^2})\big)^{1/(r+1)} \quad \text{for any } 0 < r \le 1. $$
Taking $r = p-2$, it follows that under the assumptions of Theorem 2.1,
$$ \Pi_n = O\big(n^{-(p-2)/(2(p-1))}\big) \ \text{ if } p < 3, \quad \text{and} \quad \Pi_n = O\big(n^{-1/4}\sqrt{\log n}\big) \ \text{ if } p = 3. \qquad (2.3) $$
For $p$ in $]2,4]$, under (2.2), we have that $\|\sum_{i=1}^n E(X_i^2 - \sigma^2 | \mathcal{F}_{i-1})\|_{p/2} = O(n^{2/p})$ (apply Theorem 2 in Wu and Zhao (2006)). Applying then the result in Heyde and Brown (1970), we get that if $(X_i)_{i\in\mathbb{Z}}$ is a stationary martingale difference sequence in $L^p$ such that (2.2) is satisfied, then
$$ \|F_n - \Phi_\sigma\|_\infty = O\big(n^{-(p-2)/(2(p+1))}\big), $$
where $F_n$ is the distribution function of $n^{-1/2}S_n$ and $\Phi_\sigma$ is the d.f. of $G_{\sigma^2}$. Now
$$ \|F_n - \Phi_\sigma\|_\infty \le \big(1 + \sigma^{-1}(2\pi)^{-1/2}\big)\,\Pi_n. $$
Consequently the bounds obtained in (2.3) improve the one given in Heyde and Brown (1970), provided that (2.1) holds.
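The first chain of bounds in Remark 2.4 can be checked on a toy coupling. In the sketch below (our own illustration; the names and sample values are ours), a coupling is represented by an equally weighted sample of differences $X - Y$, and we verify that the Ky Fan metric is dominated by $(E|X-Y|^r)^{1/(r+1)}$:

```python
def ky_fan(diffs, grid_size=1000):
    """Ky Fan metric alpha(X, Y) = inf{eps > 0 : P(|X-Y| > eps) <= eps},
    computed on a finite grid for an equally weighted sample of X - Y."""
    n = len(diffs)
    for k in range(1, grid_size + 1):
        eps = k / grid_size
        if sum(1 for d in diffs if abs(d) > eps) / n <= eps:
            return eps
    return 1.0

# Coupling with one large discrepancy among four equally weighted points.
diffs = [0.0, 0.0, 0.0, 2.0]
alpha = ky_fan(diffs)            # P(|X-Y| > eps) = 1/4 for eps < 2, so alpha = 0.25
r = 1.0
markov_bound = (sum(abs(d) ** r for d in diffs) / len(diffs)) ** (1 / (r + 1))
assert alpha <= markov_bound     # 0.25 <= sqrt(0.5) ~ 0.707, as Markov predicts
```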
Remark 2.5. If $(X_i)_{i\in\mathbb{Z}}$ is a stationary martingale difference sequence in $L^3$ such that $E(X_0^2) = \sigma^2$ and
$$ \sum_{k>0} k^{-1/2} \|E(X_k^2|\mathcal{F}_0) - \sigma^2\|_{3/2} < \infty, \qquad (2.4) $$
then, according to Remark 2.1, the conditions (2.1) and (2.2) hold for $p = 3$. Consequently, if (2.4) holds, then Remark 2.4 gives $\|F_n - \Phi_\sigma\|_\infty = O(n^{-1/4}\sqrt{\log n})$. This result has to be compared with Theorem 6 in Jan (2001), which states that $\|F_n - \Phi_\sigma\|_\infty = O(n^{-1/4})$ if $\sum_{k>0} \|E(X_k^2|\mathcal{F}_0) - \sigma^2\|_{3/2} < \infty$.
Remark 2.6. Notice that if $(X_i)_{i\in\mathbb{Z}}$ is a stationary martingale difference sequence, then the conditions (2.1) and (2.2) are respectively equivalent to
$$ \sum_{j\ge 0} 2^{j(p/2-1)} \|2^{-j} E(S_{2^j}^2|\mathcal{F}_0) - \sigma^2\|_{1,\Phi,p} < \infty, \quad \text{and} \quad \sum_{j\ge 0} 2^{j(1-2/p)} \|2^{-j} E(S_{2^j}^2|\mathcal{F}_0) - \sigma^2\|_{p/2} < \infty. $$
To see this, let $A_n = \|E(S_n^2|\mathcal{F}_0) - E(S_n^2)\|_{1,\Phi,p}$ and $B_n = \|E(S_n^2|\mathcal{F}_0) - E(S_n^2)\|_{p/2}$. We first show that $A_n$ and $B_n$ are subadditive sequences. Indeed, by the martingale property and the stationarity of the sequence, for all positive $i$ and $j$,
$$ A_{i+j} = \big\|E\big(S_i^2 + (S_{i+j}-S_i)^2 \,\big|\, \mathcal{F}_0\big) - E\big(S_i^2 + (S_{i+j}-S_i)^2\big)\big\|_{1,\Phi,p} \le A_i + \big\|E\big((S_{i+j}-S_i)^2 - E(S_j^2) \,\big|\, \mathcal{F}_0\big)\big\|_{1,\Phi,p}. $$
Proceeding as in the proof of (4.6), p. 65 in Rio (2000), one can prove that, for any $\sigma$-field $\mathcal{A}$ and any integrable random variable $X$, $\|E(X|\mathcal{A})\|_{1,\Phi,p} \le \|X\|_{1,\Phi,p}$. Hence
$$ \big\|E\big((S_{i+j}-S_i)^2 - E(S_j^2) \,\big|\, \mathcal{F}_0\big)\big\|_{1,\Phi,p} \le \big\|E\big((S_{i+j}-S_i)^2 - E(S_j^2) \,\big|\, \mathcal{F}_i\big)\big\|_{1,\Phi,p}. $$
By stationarity, it follows that $A_{i+j} \le A_i + A_j$. Similarly $B_{i+j} \le B_i + B_j$. The proof of the equivalences then follows by using the same arguments as in the proof of Lemma 2.7 in Peligrad and Utev (2005).
3 Rates of convergence for stationary sequences
In this section, we give estimates for the ideal distances of order $r$ for stationary sequences which are not necessarily adapted to $\mathcal{F}_i$.
Theorem 3.1. Let $(X_i)_{i\in\mathbb{Z}}$ be a stationary sequence of centered random variables in $L^p$ with $p \in ]2,3[$, and let $\sigma_n^2 = n^{-1}E(S_n^2)$. Assume that
$$ \sum_{n>0} E(X_n|\mathcal{F}_0) \quad \text{and} \quad \sum_{n>0} \big(X_{-n} - E(X_{-n}|\mathcal{F}_0)\big) \ \text{ converge in } L^p, \qquad (3.1) $$
and
$$ \sum_{n\ge 1} n^{-2+p/2} \big\|n^{-1}E(S_n^2|\mathcal{F}_0) - \sigma_n^2\big\|_{p/2} < \infty. \qquad (3.2) $$
Then the series $\sum_{k\in\mathbb{Z}} \mathrm{Cov}(X_0, X_k)$ converges to some nonnegative $\sigma^2$, and
1. $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$ for $r \in [p-2, 2]$,
2. $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma_n^2}) = O(n^{1-p/2})$ for $r \in ]2, p]$.
Remark 3.1. According to the bound (5.40), we infer that, under the assumptions of Theorem 3.1, the condition (3.2) is equivalent to
$$ \sum_{n\ge 1} n^{-2+p/2} \big\|n^{-1}E(S_n^2|\mathcal{F}_0) - \sigma^2\big\|_{p/2} < \infty. \qquad (3.3) $$
The same remark applies to the next theorem with $p = 3$.
Remark 3.2. The result of item 1 is valid with $\sigma_n$ instead of $\sigma$. On the contrary, the result of item 2 is no longer true if $\sigma_n$ is replaced by $\sigma$, because for $r \in ]2,3]$, a necessary condition for $\zeta_r(\mu,\nu)$ to be finite is that the first two moments of $\nu$ and $\mu$ are equal. Note that under the assumptions of Theorem 3.1, both $W_r(P_{n^{-1/2}S_n}, G_{\sigma^2})$ and $W_r(P_{n^{-1/2}S_n}, G_{\sigma_n^2})$ are of the order of $n^{-(p-2)/2\max(1,r)}$. Indeed, in the case where $r \in ]2,p]$, one has that
$$ W_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) \le W_r(P_{n^{-1/2}S_n}, G_{\sigma_n^2}) + W_r(G_{\sigma_n^2}, G_{\sigma^2}), $$
and the second term is of order $|\sigma - \sigma_n| = O(n^{-1/2})$.
In the case wherep=3, the condition (3.1) has to be strengthened.
Theorem 3.2. Let $(X_i)_{i\in\mathbb{Z}}$ be a stationary sequence of centered random variables in $L^3$, and let $\sigma_n^2 = n^{-1}E(S_n^2)$. Assume that (3.1) holds for $p = 3$ and that
$$ \sum_{n\ge 1} \frac{1}{n} \Big\| \sum_{k\ge n} E(X_k|\mathcal{F}_0) \Big\|_3 < \infty \quad \text{and} \quad \sum_{n\ge 1} \frac{1}{n} \Big\| \sum_{k\ge n} \big(X_{-k} - E(X_{-k}|\mathcal{F}_0)\big) \Big\|_3 < \infty. \qquad (3.4) $$
Assume in addition that
$$ \sum_{n\ge 1} n^{-1/2} \big\|n^{-1}E(S_n^2|\mathcal{F}_0) - \sigma_n^2\big\|_{3/2} < \infty. \qquad (3.5) $$
Then the series $\sum_{k\in\mathbb{Z}} \mathrm{Cov}(X_0, X_k)$ converges to some nonnegative $\sigma^2$ and
1. $\zeta_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2}\log n)$,
2. $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2})$ for $r \in ]1,2]$,
3. $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma_n^2}) = O(n^{-1/2})$ for $r \in ]2,3]$.
4 Applications
4.1 Martingale difference sequences and functions of Markov chains
Recall that the strong mixing coefficient of Rosenblatt (1956) between two $\sigma$-algebras $\mathcal{A}$ and $\mathcal{B}$ is defined by $\alpha(\mathcal{A},\mathcal{B}) = \sup\{|P(A\cap B) - P(A)P(B)| : (A,B) \in \mathcal{A}\times\mathcal{B}\}$. For a strictly stationary sequence $(X_i)_{i\in\mathbb{Z}}$, let $\mathcal{F}_i = \sigma(X_k, k \le i)$. Define the mixing coefficients $\alpha_1(n)$ of the sequence $(X_i)_{i\in\mathbb{Z}}$ by
$$ \alpha_1(n) = \alpha(\mathcal{F}_0, \sigma(X_n)). $$
For the sake of brevity, let $Q = Q_{X_0}$ (see Notation 2.1 for the definition). According to the results of Section 2, the following proposition holds.
Proposition 4.1. Let $(X_i)_{i\in\mathbb{Z}}$ be a stationary martingale difference sequence in $L^p$ with $p \in ]2,3]$. Assume moreover that the series
$$ \sum_{k\ge 1} \frac{1}{k^{2-p/2}} \int_0^{\alpha_1(k)} \big(1 \vee \log(1/u)\big)^{(p-2)/2} Q^2(u) \, du \quad \text{and} \quad \sum_{k\ge 1} \frac{1}{k^{2/p}} \Big(\int_0^{\alpha_1(k)} Q^p(u) \, du\Big)^{2/p} \qquad (4.1) $$
are convergent. Then the conclusions of Theorem 2.1 hold.
Remark 4.1. From Theorem 2.1(b) in Dedecker and Rio (2008), a sufficient condition to get $W_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2}\log n)$ is
$$ \sum_{k\ge 0} \int_0^{\alpha_1(k)} Q^3(u) \, du < \infty. $$
This condition is always strictly stronger than the condition (4.1) when $p = 3$.
We now give an example. Consider the homogeneous Markov chain $(Y_i)_{i\in\mathbb{Z}}$ with state space $\mathbb{Z}$ described at page 320 in Davydov (1973). The transition probabilities are given by $p_{n,n+1} = p_{-n,-n-1} = a_n$ for $n \ge 0$, $p_{n,0} = p_{-n,0} = 1 - a_n$ for $n > 0$, $p_{0,0} = 0$, $a_0 = 1/2$ and $1/2 \le a_n < 1$ for $n \ge 1$. This chain is irreducible and aperiodic. It is Harris positively recurrent as soon as $\sum_{n\ge 2} \prod_{k=1}^{n-1} a_k < \infty$. In that case the stationary chain is strongly mixing in the sense of Rosenblatt (1956).
Denote by $K$ the Markov kernel of the chain $(Y_i)_{i\in\mathbb{Z}}$. The functions $f$ such that $K(f) = 0$ almost everywhere are obtained by linear combinations of the two functions $f_1$ and $f_2$ given by $f_1(1) = 1$, $f_1(-1) = -1$ and $f_1(n) = f_1(-n) = 0$ if $n \neq 1$, and $f_2(0) = 1$, $f_2(1) = f_2(-1) = 0$ and $f_2(n+1) = f_2(-n-1) = 1 - a_n^{-1}$ if $n > 0$. Hence the functions $f$ such that $K(f) = 0$ are bounded.
If $(X_i)_{i\in\mathbb{Z}}$ is defined by $X_i = f(Y_i)$, with $K(f) = 0$, then Proposition 4.1 applies if
$$ \alpha_1(n) = O\big(n^{1-p/2}(\log n)^{-p/2-\epsilon}\big) \ \text{ for some } \epsilon > 0, \qquad (4.2) $$
which holds as soon as $P_0(\tau = n) = O\big(n^{-1-p/2}(\log n)^{-p/2-\epsilon}\big)$, where $P_0$ is the probability of the chain starting from 0, and $\tau = \inf\{n > 0 : Y_n = 0\}$. Now $P_0(\tau = n) = (1-a_n)\prod_{i=1}^{n-1} a_i$ for $n \ge 2$. Consequently, if
$$ a_i = 1 - \frac{p}{2i}\Big(1 + \frac{1+\epsilon}{\log i}\Big) \quad \text{for } i \text{ large enough}, $$
the condition (4.2) is satisfied and the conclusion of Theorem 2.1 holds.
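As a numerical sanity check (our own sketch, not part of the paper), one can compute $P_0(\tau = n) = (1-a_n)\prod_{i=1}^{n-1} a_i$ for this choice of $a_i$, clipping the formula into $[1/2, 1)$ for small $i$ where it is not yet valid (the example only prescribes $a_i$ for $i$ large). Each term telescopes as $\prod_{i<n} a_i - \prod_{i\le n} a_i$, so the partial sums converge:

```python
import math

def a(i, p=3.0, eps=0.1):
    """a_i = 1 - (p/(2i))(1 + (1+eps)/log i) for i large enough, clipped
    into [1/2, 1) for small i (our convention for the unspecified values)."""
    if i < 2:
        return 0.5
    val = 1.0 - (p / (2.0 * i)) * (1.0 + (1.0 + eps) / math.log(i))
    return min(max(val, 0.5), 1.0 - 1e-12)

# P_0(tau = n) = (1 - a_n) prod_{i=1}^{n-1} a_i for n >= 2.
N = 20_000
prod = a(1)      # running product prod_{i=1}^{n-1} a_i
total = 0.0
for n in range(2, N + 1):
    total += (1.0 - a(n)) * prod
    prod *= a(n)

# By telescoping, total = a_1 - prod_{i<=N} a_i, and the product decays
# roughly like N^{-p/2} times logarithmic factors, so the return-time
# tail is indeed summable with the advertised polynomial decay.
assert abs(total + prod - a(1)) < 1e-9
assert prod < 1e-4
```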
Remark 4.2. If $f$ is bounded and $K(f) \neq 0$, the central limit theorem may fail to hold for $S_n = \sum_{i=1}^n \big(f(Y_i) - E(f(Y_i))\big)$. We refer to Example 2, page 321, in Davydov (1973), where $S_n$ properly normalized converges to a stable law with exponent strictly less than 2.
Proof of Proposition 4.1. For $q \ge 1$, let $B_q(\mathcal{F}_0)$ be the set of $\mathcal{F}_0$-measurable random variables $Z$ such that $\|Z\|_q \le 1$. We first notice that
$$ \|E(X_k^2|\mathcal{F}_0) - \sigma^2\|_{p/2} = \sup_{Z \in B_{p/(p-2)}(\mathcal{F}_0)} \mathrm{Cov}(Z, X_k^2). $$
Applying Rio's covariance inequality (1993), we get that
$$ \|E(X_k^2|\mathcal{F}_0) - \sigma^2\|_{p/2} \le 2\Big(\int_0^{\alpha_1(k)} Q^p(u) \, du\Big)^{2/p}, $$
which shows that the convergence of the second series in (4.1) implies (2.2). Now, from Fréchet (1957), we have that
$$ \|E(X_k^2|\mathcal{F}_0) - \sigma^2\|_{1,\Phi,p} = \sup\big\{E\big((1 \vee |Z|^{p-2})\,\big|E(X_k^2|\mathcal{F}_0) - \sigma^2\big|\big) : Z \ \mathcal{F}_0\text{-measurable},\ Z \sim N(0,1)\big\}. $$
Hence, setting $\epsilon_k = \mathrm{sign}\big(E(X_k^2|\mathcal{F}_0) - \sigma^2\big)$,
$$ \|E(X_k^2|\mathcal{F}_0) - \sigma^2\|_{1,\Phi,p} = \sup\big\{\mathrm{Cov}\big(\epsilon_k(1 \vee |Z|^{p-2}), X_k^2\big) : Z \ \mathcal{F}_0\text{-measurable},\ Z \sim N(0,1)\big\}. $$
Applying again Rio's covariance inequality (1993), we get that
$$ \|E(X_k^2|\mathcal{F}_0) - \sigma^2\|_{1,\Phi,p} \le C \int_0^{\alpha_1(k)} \big(1 \vee \log(u^{-1})\big)^{(p-2)/2} Q^2(u) \, du, $$
which shows that the convergence of the first series in (4.1) implies (2.1).
4.2 Linear processes and functions of linear processes
In what follows we say that the series $\sum_{i\in\mathbb{Z}} a_i$ converges if the two series $\sum_{i\ge 0} a_i$ and $\sum_{i<0} a_i$ converge.
Theorem 4.1. Let $(a_i)_{i\in\mathbb{Z}}$ be a sequence of real numbers in $\ell^2$ such that $\sum_{i\in\mathbb{Z}} a_i$ converges to some real $A$. Let $(\epsilon_i)_{i\in\mathbb{Z}}$ be a stationary sequence of martingale differences in $L^p$ for $p \in ]2,3]$. Let $X_k = \sum_{j\in\mathbb{Z}} a_j \epsilon_{k-j}$, and $\sigma_n^2 = n^{-1}E(S_n^2)$. Let $b_0 = a_0 - A$ and $b_j = a_j$ for $j \neq 0$. Let
$$ A_n = \sum_{j\in\mathbb{Z}} \Big(\sum_{k=1}^n b_{k-j}\Big)^2. $$
If $A_n = o(n)$, then $\sigma_n^2$ converges to $\sigma^2 = A^2 E(\epsilon_0^2)$. If moreover
$$ \sum_{n=1}^{\infty} \frac{1}{n^{2-p/2}} \Big\| E\Big(\frac{1}{n}\Big(\sum_{j=1}^n \epsilon_j\Big)^2 \Big| \mathcal{F}_0\Big) - E(\epsilon_0^2) \Big\|_{p/2} < \infty, \qquad (4.3) $$
then we have
1. If $A_n = O(1)$, then $\zeta_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2}\log n)$, for $p = 3$,
2. If $A_n = O(n^{(r+2-p)/r})$, then $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$, for $r \in [p-2, 1]$ and $p \neq 3$,
3. If $A_n = O(n^{3-p})$, then $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$, for $r \in ]1,2]$,
4. If $A_n = O(n^{3-p})$, then $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma_n^2}) = O(n^{1-p/2})$, for $r \in ]2,p]$.
Remark 4.3. If the condition given by Heyde (1975) holds, that is
$$ \sum_{n=1}^{\infty} \Big(\sum_{k\ge n} a_k\Big)^2 < \infty \quad \text{and} \quad \sum_{n=1}^{\infty} \Big(\sum_{k\le -n} a_k\Big)^2 < \infty, \qquad (4.4) $$
then $A_n = O(1)$, so that it satisfies all the conditions of items 1–4.
Remark 4.4. Under the additional assumption $\sum_{i\in\mathbb{Z}} |a_i| < \infty$, one has the bound $A_n \le 4B_n$, where
$$ B_n = \sum_{k=1}^n \Big(\Big(\sum_{j\ge k} |a_j|\Big)^2 + \Big(\sum_{j\le -k} |a_j|\Big)^2\Big). \qquad (4.5) $$
Proof of Theorem 4.1. We start with the following decomposition:
$$ S_n = A\sum_{j=1}^n \epsilon_j + \sum_{j=-\infty}^{\infty} \Big(\sum_{k=1}^n b_{k-j}\Big)\epsilon_j. \qquad (4.6) $$
Let $R_n = \sum_{j=-\infty}^{\infty} \big(\sum_{k=1}^n b_{k-j}\big)\epsilon_j$. Since $\|R_n\|_2^2 = A_n\|\epsilon_0\|_2^2$ and since $|\sigma_n - \sigma| \le n^{-1/2}\|R_n\|_2$, the fact that $A_n = o(n)$ implies that $\sigma_n$ converges to $\sigma$. We now give an upper bound for $\|R_n\|_p$. From Burkholder's inequality, there exists a constant $C$ such that
$$ \|R_n\|_p \le C\Big\{\Big\|\sum_{j=-\infty}^{\infty} \Big(\sum_{k=1}^n b_{k-j}\Big)^2 \epsilon_j^2\Big\|_{p/2}\Big\}^{1/2} \le C\|\epsilon_0\|_p\sqrt{A_n}. \qquad (4.7) $$
According to Remark 2.1, since (4.3) holds, the two conditions (2.1) and (2.2) of Theorem 2.1 are satisfied by the martingale $M_n = A\sum_{k=1}^n \epsilon_k$. To conclude the proof, we use Lemma 5.2 given in Section 5.2, with the upper bound (4.7).
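The decomposition (4.6) is purely algebraic: the coefficient of $\epsilon_l$ on both sides is $\sum_{k=1}^n a_{k-l}$. For a finitely supported coefficient sequence it can be checked mechanically (our own sketch; the concrete $a_i$ and $\epsilon_j$ values below are arbitrary choices):

```python
# Coefficients a_i supported on |i| <= m (hypothetical example values).
m, n = 2, 5
a = {-2: 0.1, -1: 0.3, 0: 1.0, 1: 0.5, 2: 0.2}
A = sum(a.values())                                      # A = sum_i a_i
b = {j: (v - A if j == 0 else v) for j, v in a.items()}  # b_0 = a_0 - A, b_j = a_j

# Deterministic "innovations" epsilon_j on the index range actually used.
eps = {j: ((-1) ** j) / (1 + abs(j)) for j in range(1 - m, n + m + 1)}

# Left side: S_n = sum_{k=1}^n X_k with X_k = sum_j a_j eps_{k-j}.
S_n = sum(sum(a[j] * eps[k - j] for j in a) for k in range(1, n + 1))

# Right side of (4.6): A sum_{j=1}^n eps_j + sum_j (sum_{k=1}^n b_{k-j}) eps_j,
# the outer sum running over the finitely many j with a nonzero coefficient.
rhs = A * sum(eps[j] for j in range(1, n + 1)) + sum(
    sum(b[k - j] for k in range(1, n + 1) if (k - j) in b) * eps[j]
    for j in range(1 - m, n + m + 1)
)
assert abs(S_n - rhs) < 1e-12
```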
Proof of Remarks 4.3 and 4.4. To prove Remark 4.3, note first that
$$ A_n = \sum_{j=1}^n \Big(\sum_{l=-\infty}^{-j} a_l + \sum_{l=n+1-j}^{\infty} a_l\Big)^2 + \sum_{i=1}^{\infty} \Big(\sum_{l=i}^{n+i-1} a_l\Big)^2 + \sum_{i=1}^{\infty} \Big(\sum_{l=-i-n+1}^{-i} a_l\Big)^2. $$
It follows easily that $A_n = O(1)$ under (4.4). To prove the bound (4.5), note first that
$$ A_n \le 3B_n + \sum_{i=n+1}^{\infty} \Big(\sum_{l=i}^{n+i-1} |a_l|\Big)^2 + \sum_{i=n+1}^{\infty} \Big(\sum_{l=-i-n+1}^{-i} |a_l|\Big)^2. $$
Let $T_i = \sum_{l=i}^{\infty} |a_l|$ and $Q_i = \sum_{l=-\infty}^{-i} |a_l|$. We have that
$$ \sum_{i=n+1}^{\infty} \Big(\sum_{l=i}^{n+i-1} |a_l|\Big)^2 \le T_{n+1}\sum_{i=n+1}^{\infty} (T_i - T_{n+i}) \le nT_{n+1}^2 $$
and
$$ \sum_{i=n+1}^{\infty} \Big(\sum_{l=-i-n+1}^{-i} |a_l|\Big)^2 \le Q_{n+1}\sum_{i=n+1}^{\infty} (Q_i - Q_{n+i}) \le nQ_{n+1}^2. $$
Since $n(T_{n+1}^2 + Q_{n+1}^2) \le B_n$, (4.5) follows.
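As an illustration (our own numerical sketch with an arbitrary summable sequence), the bound $A_n \le 4B_n$ of Remark 4.4 can be checked directly, truncating the two-sided index set to a window large enough that the neglected tails are negligible for geometric coefficients $a_i = 2^{-|i|}$:

```python
# Geometric two-sided coefficients a_i = 2^{-|i|} (an arbitrary example).
W = 60                                      # truncation window for the index j
def a(i):
    return 2.0 ** (-abs(i)) if abs(i) <= W else 0.0

A = sum(a(i) for i in range(-W, W + 1))     # A = sum_i a_i = 3 here
def b(j):
    return a(j) - A if j == 0 else a(j)     # b_0 = a_0 - A, b_j = a_j

for n in (1, 2, 5, 10, 20):
    # A_n = sum_j (sum_{k=1}^n b_{k-j})^2, truncated to |j| <= W + n.
    A_n = sum(
        sum(b(k - j) for k in range(1, n + 1)) ** 2
        for j in range(-W - n, W + n + 1)
    )
    # B_n = sum_{k=1}^n ((sum_{j>=k} |a_j|)^2 + (sum_{j<=-k} |a_j|)^2).
    B_n = sum(
        sum(abs(a(j)) for j in range(k, W + 1)) ** 2
        + sum(abs(a(j)) for j in range(-W, -k + 1)) ** 2
        for k in range(1, n + 1)
    )
    assert A_n <= 4.0 * B_n + 1e-9          # Remark 4.4, up to truncation error
```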
In the next result, we shall focus on functions of real-valued linear processes:
$$ X_k = h\Big(\sum_{i\in\mathbb{Z}} a_i\epsilon_{k-i}\Big) - E\Big(h\Big(\sum_{i\in\mathbb{Z}} a_i\epsilon_{k-i}\Big)\Big), \qquad (4.8) $$
where $(\epsilon_i)_{i\in\mathbb{Z}}$ is a sequence of iid random variables. Denote by $w_h(\cdot, M)$ the modulus of continuity of the function $h$ on the interval $[-M, M]$, that is
$$ w_h(t, M) = \sup\{|h(x) - h(y)| : |x-y| \le t,\ |x| \le M,\ |y| \le M\}. $$
Theorem 4.2. Let $(a_i)_{i\in\mathbb{Z}}$ be a sequence of real numbers in $\ell^2$ and $(\epsilon_i)_{i\in\mathbb{Z}}$ be a sequence of iid random variables in $L^2$. Let $X_k$ be defined as in (4.8) and $\sigma_n^2 = n^{-1}E(S_n^2)$. Assume that $h$ is $\gamma$-Hölder on any compact set, with $w_h(t, M) \le C t^\gamma M^\alpha$, for some $C > 0$, $\gamma \in ]0,1]$ and $\alpha \ge 0$. If for some $p \in ]2,3]$,
$$ E\big(|\epsilon_0|^{2\vee(\alpha+\gamma)p}\big) < \infty \quad \text{and} \quad \sum_{i\ge 1} i^{p/2-1}\Big(\sum_{|j|\ge i} a_j^2\Big)^{\gamma/2} < \infty, \qquad (4.9) $$
then the series $\sum_{k\in\mathbb{Z}} \mathrm{Cov}(X_0, X_k)$ converges to some nonnegative $\sigma^2$, and
1. $\zeta_1(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{-1/2}\log n)$, for $p = 3$,
2. $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma^2}) = O(n^{1-p/2})$ for $r \in [p-2, 2]$ and $(r,p) \neq (1,3)$,
3. $\zeta_r(P_{n^{-1/2}S_n}, G_{\sigma_n^2}) = O(n^{1-p/2})$ for $r \in ]2,p]$.
Proof of Theorem 4.2. Theorem 4.2 is a consequence of the following proposition:
Proposition 4.2. Let $(a_i)_{i\in\mathbb{Z}}$, $(\epsilon_i)_{i\in\mathbb{Z}}$ and $(X_i)_{i\in\mathbb{Z}}$ be as in Theorem 4.2. Let $(\epsilon_i')_{i\in\mathbb{Z}}$ be an independent copy of $(\epsilon_i)_{i\in\mathbb{Z}}$. Let $V_0 = \sum_{i\in\mathbb{Z}} a_i\epsilon_{-i}$, and
$$ M_{1,i} = |V_0| \vee \Big|\sum_{j<i} a_j\epsilon_{-j} + \sum_{j\ge i} a_j\epsilon_{-j}'\Big| \quad \text{and} \quad M_{2,i} = |V_0| \vee \Big|\sum_{j<i} a_j\epsilon_{-j}' + \sum_{j\ge i} a_j\epsilon_{-j}\Big|. $$
If for some $p \in ]2,3]$,
$$ \sum_{i\ge 1} i^{p/2-1}\Big\| w_h\Big(\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|, M_{1,i}\Big)\Big\|_p < \infty \quad \text{and} \quad \sum_{i\ge 1} i^{p/2-1}\Big\| w_h\Big(\Big|\sum_{j<-i} a_j\epsilon_{-j}\Big|, M_{2,-i}\Big)\Big\|_p < \infty, \qquad (4.10) $$
then the conclusions of Theorem 4.2 hold.
To prove Theorem 4.2, it remains to check (4.10). We only check the first condition. Since $w_h(t, M) \le C t^\gamma M^\alpha$ and the random variables $\epsilon_i$ are iid, we have
$$ \Big\| w_h\Big(\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|, M_{1,i}\Big)\Big\|_p \le C\Big\|\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|^\gamma |V_0|^\alpha\Big\|_p + C\Big\|\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|^\gamma\Big\|_p \big\||V_0|^\alpha\big\|_p, $$
so that
$$ \Big\| w_h\Big(\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|, M_{1,i}\Big)\Big\|_p \le C\Big(2^\alpha\Big\|\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|^{\alpha+\gamma}\Big\|_p + \Big\|\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|^\gamma\Big\|_p\Big(\big\||V_0|^\alpha\big\|_p + 2^\alpha\Big\|\Big|\sum_{j<i} a_j\epsilon_{-j}\Big|^\alpha\Big\|_p\Big)\Big). $$
From Burkholder's inequality, for any $\beta > 0$,
$$ \Big\|\Big|\sum_{j\ge i} a_j\epsilon_{-j}\Big|^\beta\Big\|_p = \Big\|\sum_{j\ge i} a_j\epsilon_{-j}\Big\|_{\beta p}^{\beta} \le K\Big(\sum_{j\ge i} a_j^2\Big)^{\beta/2} \|\epsilon_0\|_{2\vee\beta p}^{\beta}. $$
Applying this inequality with $\beta = \gamma$ or $\beta = \alpha+\gamma$, we infer that the first part of (4.10) holds under (4.9). The second part can be handled in the same way.
Proof of Proposition 4.2. Let $\mathcal{F}_i = \sigma(\epsilon_k, k \le i)$. We shall first prove that the condition (3.2) of Theorem 3.1 holds. We write
$$ \|E(S_n^2|\mathcal{F}_0) - E(S_n^2)\|_{p/2} \le 2\sum_{i=1}^n \sum_{k=0}^{n-i} \|E(X_iX_{k+i}|\mathcal{F}_0) - E(X_iX_{k+i})\|_{p/2} \le 4\sum_{i=1}^n \sum_{k=i}^n \|E(X_iX_{k+i}|\mathcal{F}_0)\|_{p/2} + 2\sum_{i=1}^n \sum_{k=1}^i \|E(X_iX_{k+i}|\mathcal{F}_0) - E(X_iX_{k+i})\|_{p/2}. $$
We first control the second term. Let $\epsilon'$ be an independent copy of $\epsilon$, and denote by $E_\epsilon(\cdot)$ the conditional expectation with respect to $\epsilon$. Define
$$ Y_i = \sum_{j<i} a_j\epsilon_{i-j}, \quad Y_i' = \sum_{j<i} a_j\epsilon_{i-j}', \quad Z_i = \sum_{j\ge i} a_j\epsilon_{i-j}, \quad \text{and} \quad Z_i' = \sum_{j\ge i} a_j\epsilon_{i-j}'. $$
Taking $\mathcal{F}_\ell = \sigma(\epsilon_i, i \le \ell)$, and setting $h_0 = h - E\big(h\big(\sum_{i\in\mathbb{Z}} a_i\epsilon_i\big)\big)$, we have
$$ \|E(X_iX_{k+i}|\mathcal{F}_0) - E(X_iX_{k+i})\|_{p/2} = \big\| E_\epsilon\big(h_0(Y_i'+Z_i)\,h_0(Y_{k+i}'+Z_{k+i})\big) - E_\epsilon\big(h_0(Y_i'+Z_i')\,h_0(Y_{k+i}'+Z_{k+i}')\big)\big\|_{p/2}. $$
Applying first the triangle inequality, and next Hölder's inequality, we get that
$$ \|E(X_iX_{k+i}|\mathcal{F}_0) - E(X_iX_{k+i})\|_{p/2} \le \|h_0(Y_{k+i}'+Z_{k+i})\|_p \,\|h_0(Y_i'+Z_i) - h_0(Y_i'+Z_i')\|_p + \|h_0(Y_i'+Z_i')\|_p \,\|h_0(Y_{k+i}'+Z_{k+i}) - h_0(Y_{k+i}'+Z_{k+i}')\|_p. $$