
Complete moment convergence for product sums of sequence of extended negatively dependent random variables

Abstract

Complete moment convergence for product sums of sequences of extended negatively dependent random variables is discussed by utilizing the method of Wang et al. (Chin. Ann. Math., Ser. A 22:701-706, 2001). Sufficient conditions for complete moment convergence of product sums of sequences of extended negatively dependent random variables are obtained.

1 Introduction and main results

Let \(\{\Omega,\mathfrak{F},P\}\) be a complete probability space. The random variables we deal with are all defined on \(\{\Omega,\mathfrak {F},P\}\). The concept of complete convergence of a sequence of random variables was introduced by Hsu and Robbins [1] as follows. A sequence \(\{U_{n}, n\ge1\}\) of random variables converges completely to the constant θ if

$$\sum_{n=1}^{\infty}P\bigl(|U_{n}-\theta|> \epsilon\bigr)< \infty \quad\mbox{for all } \epsilon> 0. $$

Moreover, they proved that the sequence of arithmetic means of independent identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. This result has been generalized and extended in several directions; see, for example, [2–10].
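
The Hsu–Robbins phenomenon can be illustrated numerically. The following is a minimal sketch (not from the paper), assuming standard normal summands, so that \(U_{n}=S_{n}/n\) is exactly \(N(0,1/n)\) and the tail probabilities are available in closed form through the complementary error function:

```python
import math

# Sketch (illustrative): with X_i i.i.d. N(0,1), the sample mean
# U_n = S_n / n is exactly N(0, 1/n), so
#   P(|U_n| > eps) = erfc(eps * sqrt(n / 2)).
def tail_prob(n, eps):
    return math.erfc(eps * math.sqrt(n / 2.0))

def partial_sum(N, eps):
    return sum(tail_prob(n, eps) for n in range(1, N + 1))

eps = 0.5
s_small, s_large = partial_sum(50, eps), partial_sum(5000, eps)
# The series sum_n P(|U_n| > eps) converges: its partial sums stabilize.
assert s_large - s_small < 0.01
print(round(s_large, 3))
```

The Gaussian tails decay like \(e^{-n\epsilon^{2}/2}\), so the partial sums of the series settle after a few dozen terms, in line with the finiteness required by complete convergence.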

Chow [11] first investigated complete moment convergence, which is more precise than complete convergence. He obtained the following result. Let \(p>1/\alpha\) and \(1/2<\alpha\le1\). Let \(\{X, X_{n},n\ge1\}\) be a sequence of i.i.d. random variables. Assume that \(EX=0\) and \(E\{|X|^{p}+|X|\log(1+|X|)\}<\infty\). Then

$$\sum_{n=1}^{\infty}n^{(p-1)\alpha-2}E \Biggl\{ \Biggl\vert \sum_{k=1}^{n} X_{k}\Biggr\vert -\epsilon n^{\alpha}\Biggr\} _{+}< \infty \quad\mbox{for all } \epsilon>0, $$

where \(x_{+}=\max\{0, x\}\). Chow’s result has been generalized and extended in several directions; see, for example, [12–18].

Definition 1.1

A sequence of random variables \(\{ X_{n}, n\ge1\} \) is said to be extended negatively dependent (END) if there exists a constant \(M\ge1\) such that for each \(n \ge2\),

$$P(X_{1}\le x_{1}, \ldots, X_{n}\le x_{n})\le M\prod_{i=1}^{n} P(X_{i}\le x_{i}) $$

and

$$P(X_{1}> x_{1}, \ldots, X_{n}> x_{n})\le M\prod_{i=1}^{n} P(X_{i}> x_{i}) $$

hold for every sequence \(\{x_{1}, \ldots, x_{n}\}\) of real numbers.

The concept was introduced by Liu [19]. When \(M=1\), the notion of END random variables reduces to the well-known notion of negatively dependent (ND) random variables, which was introduced by Alam and Saxena [20], Block et al. [21], and Joag-Dev and Proschan [22]. As Liu [19] mentioned, the END structure is substantially more comprehensive than the ND structure in that it can reflect not only a negative dependence structure but also, to some extent, a positive one. Liu [19] pointed out that END random variables can be taken as negatively or positively dependent and provided some interesting examples to support this idea. Joag-Dev and Proschan [22] also pointed out that negatively associated (NA) random variables must be ND, whereas ND random variables are not necessarily NA; thus NA random variables are END. A great number of articles on ND random variables have appeared in the literature. For further research on END random variables, please see [5, 7–9, 23–28] and so on.
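
Definition 1.1 can be verified exactly on small discrete examples. The following sketch (illustrative, not from the paper) checks both defining inequalities with \(M=1\) for a counter-monotone pair \(X_{2}=1-X_{1}\) with \(X_{1}\) Bernoulli(1/2), which is ND and hence END:

```python
# Sketch (illustrative): a counter-monotone pair X2 = 1 - X1 with
# X1 Bernoulli(1/2) is ND, hence END with M = 1.
joint = {(0, 1): 0.5, (1, 0): 0.5}          # joint pmf of (X1, X2)

def p_joint(pred):                          # P((X1, X2) satisfies pred)
    return sum(p for xy, p in joint.items() if pred(xy))

def cdf(i, x):                              # marginal P(X_i <= x)
    return p_joint(lambda xy: xy[i] <= x)

def sf(i, x):                               # marginal P(X_i > x)
    return p_joint(lambda xy: xy[i] > x)

M = 1.0
grid = [-1, 0, 1]
ok = all(
    p_joint(lambda xy: xy[0] <= x1 and xy[1] <= x2) <= M * cdf(0, x1) * cdf(1, x2) + 1e-12
    and p_joint(lambda xy: xy[0] > x1 and xy[1] > x2) <= M * sf(0, x1) * sf(1, x2) + 1e-12
    for x1 in grid for x2 in grid
)
print(ok)  # True
```

Because the supports are finite, the thresholds only need to be checked between consecutive atoms, so the grid above is exhaustive for this example.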

The aim of this paper is to extend and improve Chow’s result in the i.i.d. case to extended negatively dependent (END) random variables. Some new sufficient conditions for complete moment convergence of product sums of sequences of END random variables are obtained.

Definition 1.2

A sequence of random variables \(\{ X_{n}, n\ge1\} \) is said to be stochastically dominated by a random variable X in the Cesàro sense if there exists a constant \(D>0\) such that for all \(x>0\) and \(n\ge1\),

$$\sum_{i=1}^{n} P\bigl(|X_{i}|>x\bigr)\le Dn P\bigl(|X|>x\bigr). $$

In this case we write \(\{X_{n},n\ge1\}\prec X\).
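
For identically distributed sequences, Definition 1.2 holds with \(D=1\) and equality. The sketch below (illustrative, not from the paper) instead takes a non-identically distributed pair (\(X_{1}\equiv1\), \(X_{2}\equiv3\), X uniform on \(\{1,3\}\)) and evaluates the domination ratio on a grid of thresholds:

```python
# Sketch (illustrative): degenerate X1 ≡ 1, X2 ≡ 3, and X uniform on
# {1, 3}. The Cesàro domination constant
#   D = sup_x sum_{i<=n} P(|X_i| > x) / (n P(|X| > x))
# equals 1 on this example, so {X_n} ≺ X in the sense of Definition 1.2.
def tail_const(c, x):                  # P(|X_i| > x) when X_i ≡ c > 0
    return 1.0 if c > x else 0.0

def tail_X(x):                         # P(|X| > x) for X uniform on {1, 3}
    return 0.5 * tail_const(1, x) + 0.5 * tail_const(3, x)

n = 2
grid = [0.0, 0.5, 1.0, 2.0, 2.9]       # thresholds below the largest atom
D = max((tail_const(1, x) + tail_const(3, x)) / (n * tail_X(x)) for x in grid)
print(D)  # 1.0
```

The point of the Cesàro formulation is visible here: \(X_{2}\equiv3\) is not dominated by \(X_{1}\)'s distribution pointwise, yet the average of the tails is controlled by the tail of X.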

Now we state the main results; some lemmas will be given in Section 2, and the proofs of the main results will be given in Section 3.

Theorem 1.1

Let \(\alpha>1/2\), \(p>1/\alpha\), and let m be a positive integer. Let \(\{X_{n},n\ge1\}\) be a sequence of END random variables with \(\{X_{n},n\ge1\}\prec X\). Moreover, assume that \(EX_{n}=0\) for all \(n\ge1\) when \(\alpha\le1\). If

$$ E|X|^{p}< \infty, $$
(1.1)

then the following statements hold:

$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha p-2 } P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\epsilon n^{m\alpha} \Biggr)< \infty \quad\textit{for all } \epsilon>0, \end{aligned}$$
(1.2)
$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha p-2 } P \Biggl(\sup_{k\ge n} k^{-m\alpha }\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\epsilon \Biggr)< \infty \quad\textit{for all } \epsilon>0. \end{aligned}$$
(1.3)

Theorem 1.2

Let \(q>0\), \(\alpha>1/2\), \(p>1/\alpha\), and let m, v be positive integers such that \(1\le v\le m\). Let \(\{X_{n},n\ge1\}\) be a sequence of END random variables with \(\{X_{n},n\ge1\}\prec X\). Moreover, assume that \(EX_{n}=0\) for all \(n\ge1\) when \(\alpha\le1\). Assume

$$ \textstyle\begin{cases} E|X|^{p}< \infty,& mq< p,\\ E|X|^{p}\log(1+|X|)< \infty,& mq=p,\\ E|X|^{mq}< \infty,& mq>p. \end{cases} $$
(1.4)

Then the following statements hold:

$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 } E \Biggl\{ \max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k} \prod_{j=1}^{m} X_{i_{j}}\Biggr\vert -\epsilon n^{m\alpha} \Biggr\} _{+}^{q}< \infty\quad\textit{for all } \epsilon>0, \end{aligned}$$
(1.5)
$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha p-2 } E \Biggl\{ \sup_{ k \ge n} k^{-m\alpha }\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert -\epsilon \Biggr\} _{+}^{q}< \infty \quad\textit{for all } \epsilon>0, \end{aligned}$$
(1.6)

where \(x_{+}^{q}=(x_{+})^{q}\).

Throughout this paper, C denotes a positive constant, not depending on n, which may differ from one appearance to the next; the symbol ♯A denotes the number of elements of the set A.

2 Lemmas

In order to prove our main result, we need the following lemmas.

Lemma 2.1

(Liu [19])

Let \(X_{1},X_{2},\ldots,X_{n}\) be END random variables. Assume that \(f_{1},f_{2},\ldots,f_{n}\) are Borel functions all of which are monotone increasing (or monotone decreasing). Then \(f_{1}(X_{1}),f_{2}(X_{2}),\ldots,f_{n}(X_{n})\) are END random variables.

Lemma 2.2

(Shen [26])

For any \(s\ge2\), there is a positive constant \(C_{s}\) depending only on s such that if \(\{X_{n}, n\ge1\}\) is a sequence of END random variables with \(EX_{n}=0\) for every \(n\ge1\), then for all \(n\ge1\),

$$E\Biggl\vert \sum_{j=1}^{n}X_{j} \Biggr\vert ^{s} \le C_{s} \Biggl\{ \sum _{j=1}^{n} E|X_{j}|^{s}+ \Biggl(\sum_{j=1}^{n }E|X_{j}|^{2} \Biggr)^{s/2} \Biggr\} . $$

By Lemma 2.2 and the same argument as in the proof of Theorem 2.3.1 in Stout [29], the following lemma holds.

Lemma 2.3

For any \(s\ge2\), there is a positive constant \(C_{s}\) depending only on s such that if \(\{X_{n}, n\ge1\}\) is a sequence of END random variables with \(EX_{n}=0\) for every \(n\ge1\), then for all \(n\ge1\),

$$E\max_{1\le k\le n}\Biggl\vert \sum_{j=1}^{k}X_{j} \Biggr\vert ^{s} \le C_{s} \bigl(\log (4n) \bigr)^{s} \Biggl\{ \sum_{j=1}^{n} E|X_{j}|^{s}+ \Biggl(\sum_{j=1}^{n}E|X_{j}|^{2} \Biggr)^{s/2} \Biggr\} . $$

Lemma 2.4

(Kuczmaszewska [4])

Let s, x be positive constants. Let \(\{X_{n},n\ge1\}\) be a sequence of random variables with \(\{X_{n},n\ge1\}\prec X\).

(i) If \(E|X|^{s}<\infty\), then \(\frac{1}{n}\sum_{j=1}^{n} E|X_{j}|^{s} \le CE|X|^{s} \);

(ii) \(\frac{1}{n}\sum_{j=1}^{n} E|X_{j}|^{s} I(|X_{j}|\le x)\le C \{E|X|^{s} I(|X|\le x)+x^{s} P(|X|>x) \}\);

(iii) \(\frac{1}{n}\sum_{j=1}^{n} E|X_{j}|^{s} I(|X_{j}|>x) \le C E|X|^{s} I(|X|>x) \).

Lemma 2.5

(Wang et al. [30])

Let m, n be positive integers such that \(1\le m\le n\), and let \(\{x_{n}, n\ge1\}\) be a sequence of real numbers. Then

$$\sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le n } \prod_{k=1}^{m} x_{i_{k}}=\sum_{\sum _{k=1}^{m}r_{k}s_{k}=m} A(m,r_{k},s_{k}:k=1, \ldots,m)\prod_{k=1}^{m}\Biggl(\sum _{j=1}^{n} x_{j}^{r_{k}} \Biggr)^{s_{k}}, $$

where \(A(m,r_{k},s_{k}:k=1,\ldots,m)\) are constants and \(r_{k}\), \(s_{k}\) are positive integers depending only on m.
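
For small m, Lemma 2.5 reduces to the classical expansions of elementary symmetric sums in power sums \(p_{r}=\sum_{j} x_{j}^{r}\), namely \(e_{2}=(p_{1}^{2}-p_{2})/2\) and \(e_{3}=(p_{1}^{3}-3p_{1}p_{2}+2p_{3})/6\). A brute-force numerical check of these two instances (a sketch, with arbitrary illustrative values):

```python
import itertools
import math

# Brute-force check of the m = 2 and m = 3 instances of Lemma 2.5,
# with power sums p_r = sum_j x_j**r:
#   e2 = (p1**2 - p2) / 2,   e3 = (p1**3 - 3*p1*p2 + 2*p3) / 6.
def e_brute(xs, m):
    return sum(math.prod(c) for c in itertools.combinations(xs, m))

def power_sum(xs, r):
    return sum(x ** r for x in xs)

xs = [0.5, -1.0, 2.0, 3.0, -0.25]
p1, p2, p3 = (power_sum(xs, r) for r in (1, 2, 3))
assert abs(e_brute(xs, 2) - (p1**2 - p2) / 2) < 1e-9
assert abs(e_brute(xs, 3) - (p1**3 - 3*p1*p2 + 2*p3) / 6) < 1e-9
print("identities verified")
```

This is exactly the structure exploited in the proofs: a maximal product sum is controlled by maxima of the much simpler power sums \(\sum_{j\le k} x_{j}^{r}\).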

Lemma 2.6

Let α, β be positive constants such that \(\alpha+\beta=1\). Let X, Y be random variables. Then for all \(\varepsilon>0\),

$$\bigl(|X+Y|>\varepsilon\bigr)\subseteq\bigl(|X|> \alpha\varepsilon\bigr) \cup\bigl(|Y|>\beta \varepsilon\bigr),\qquad \bigl(|XY|>\varepsilon\bigr)\subseteq\bigl(|X|>\varepsilon^{\alpha}\bigr)\cup \bigl(|Y|>\varepsilon^{\beta}\bigr). $$

Proof

Suppose \(\omega\notin (|X|> \alpha\varepsilon )\cup(|Y|>\beta\varepsilon)\); then \(\omega\notin (|X|> \alpha \varepsilon)\) and \(\omega\notin (|Y|>\beta\varepsilon)\). Since \(|X(\omega)+Y(\omega)|\le|X(\omega)|+|Y(\omega)|\le \alpha\varepsilon +\beta\varepsilon= \varepsilon\), we have \(\omega\notin (|X+Y|>\varepsilon)\). Thus \((|X+Y|>\varepsilon)\subseteq(|X|> \alpha \varepsilon)\cup(|Y|>\beta\varepsilon)\) holds. Similarly, \((|XY|>\varepsilon)\subseteq(|X|>\varepsilon^{\alpha})\cup (|Y|>\varepsilon^{\beta})\) is true. □
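
Both inclusions can also be sanity-checked pointwise over a grid of values, a brute-force sketch with illustrative constants \(a+b=1\) standing in for α, β:

```python
# Pointwise check of Lemma 2.6 (illustrative constants): if |X+Y| > eps
# then |X| > a*eps or |Y| > b*eps (with a + b = 1), and if |XY| > eps
# then |X| > eps**a or |Y| > eps**b.
a, b, eps = 0.4, 0.6, 2.0
grid = [k / 7.0 for k in range(-35, 36)]        # values in [-5, 5]
for X in grid:
    for Y in grid:
        if abs(X + Y) > eps:
            assert abs(X) > a * eps or abs(Y) > b * eps
        if abs(X * Y) > eps:
            assert abs(X) > eps ** a or abs(Y) > eps ** b
print("inclusions hold on the grid")
```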

Lemma 2.7

Let \(\alpha>0\), \(p>0\), \(q>0\), \(0<\gamma<1\), m be a positive integer. Let \(\{X_{n},n\ge1\}\) be a sequence of random variables. If \(s>\max\{p,mq\}/(1-\gamma)\), then the following statements hold:

$$\begin{aligned} & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha }}^{\infty}t^{-(1-\gamma)s/(mq)} \Biggl( \sum_{j=1}^{n} P \bigl(|X_{j}|>t^{\gamma /(mq)}\bigr) \Biggr)\,dt< \infty, \\ & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha }}^{\infty}t^{-s/(mq)} \Biggl( \sum_{j=1}^{n}E|X_{j}|^{s}I \bigl(|X_{j}|\le t^{\gamma/(mq)}\bigr) \Biggr)\,dt < \infty. \end{aligned}$$

Proof

Since \(s>\max\{p,mq\}/(1-\gamma)\), we obtain

$$\begin{aligned} &\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-(1-\gamma)s/(mq)} \Biggl( \sum_{j=1}^{n} P \bigl(|X_{j}|>t^{\gamma/(mq)}\bigr) \Biggr)\,dt \\ & \quad\le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-1 }\bigl(\log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-(1-\gamma)s/(mq)}\,dt \\ & \quad= \sum_{n=m}^{\infty}n^{\alpha p-(1-\gamma)s\alpha-1 } \bigl(\log(4n)\bigr)^{s} \\ &\quad < \infty \end{aligned}$$

and

$$\begin{aligned} & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-s/(mq)} \Biggl( \sum_{j=1}^{n}E|X_{j}|^{s}I \bigl(|X_{j}|\le t^{\gamma/(mq)}\bigr) \Biggr)\,dt \\ &\quad\le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-1 }\bigl(\log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-(1-\gamma)s/(mq)}\,dt < \infty. \end{aligned}$$

 □

Lemma 2.8

Let \(\alpha>0\), \(p>0\), \(q>0\), \(0<\gamma<1\), and let m, v be positive integers such that \(1\le v\le m\). Let \(\{X_{n},n\ge1\}\) be a sequence of random variables with \(\{X_{n},n\ge1\} \prec X\). Moreover, assume that \(EX_{n}=0\) for all \(n\ge1\) when \(\alpha\le1\). If (1.4) holds, then the following statements hold:

$$\begin{aligned} & \sum_{n=m}^{\infty}n^{\alpha( p-m q)-2 }\int _{n^{mq \alpha}}^{\infty}\Biggl( \sum _{j=1}^{n} P\bigl(|X_{j}|>t^{1/(mq)} \bigr) \Biggr)\,dt < \infty, \\ & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int _{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl( \sum _{j=1}^{n}E|X_{j}|^{vs}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr) \Biggr)\,dt< \infty\\ &\quad\textit{for all } s>\max\{p,mq\}/v. \end{aligned}$$

Proof

By the mean-value theorem, a standard computation and (1.4), we have

$$\begin{aligned} &\sum_{n=m}^{\infty}n^{\alpha( p-m q)-2 }\int _{n^{mq \alpha }}^{\infty}\Biggl( \sum _{j=1}^{n} P\bigl(|X_{j}|>t^{1/(mq)} \bigr) \Biggr)\,dt \\ &\quad \le C\sum_{n=m}^{\infty}n^{\alpha( p-m q)- 1}\int_{n^{mq \alpha }}^{\infty}P \bigl(|X|>t^{1/(mq)}\bigr)\,dt \\ &\quad = C\sum_{n=m}^{\infty} n^{\alpha( p-m q)-1 } \sum_{k=n}^{\infty}\int_{k^{mq \alpha}}^{(k+1)^{mq \alpha}} P\bigl(|X|> t^{1/(mq)}\bigr)\,dt \\ &\quad \le C\sum_{n=1}^{\infty} n^{\alpha( p-m q)-1 } \sum_{k=n}^{\infty}k^{mq \alpha-1} P\bigl(|X|> k^{\alpha}\bigr) \\ &\quad = C\sum_{k=1}^{\infty}k^{mq \alpha-1} P\bigl(|X|> k^{\alpha}\bigr) \sum _{n=1}^{k} n^{\alpha( p-m q)-1 } \\ &\quad \le \textstyle\begin{cases} C\sum_{k=1}^{\infty}k^{\alpha p-1} P(|X|> k^{\alpha}) , & \mbox{if } mq< p ,\\ C\sum_{k=1}^{\infty}k^{\alpha p-1}\log(1+k) P(|X|> k^{\alpha}) ,& \mbox{if } mq=p ,\\ C\sum_{k=1}^{\infty}k^{mq \alpha-1} P(|X|> k^{\alpha}) ,&\mbox{if } mq>p \end{cases}\displaystyle \\ &\quad \le \textstyle\begin{cases} C E|X|^{p},& \mbox{if } mq< p ,\\ CE|X|^{p}\log(1+|X|),& \mbox{if } mq=p,\\ CE|X|^{mq},& \mbox{if } mq>p \end{cases}\displaystyle \\ & \quad< \infty. \end{aligned}$$

Since \(s>\max\{p,mq\}/v\), by Lemma 2.4(ii), the mean-value theorem, a standard computation, (1.4), and the argument above, we also have

$$\begin{aligned} &\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int _{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \Biggl( \sum _{j=1}^{n}E|X_{j}|^{vs}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr) \Biggr)\,dt \\ &\quad \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-1 }\int_{n^{mq \alpha }}^{\infty}\bigl( t^{-vs/(mq)} E|X|^{vs}I\bigl(|X|\le t^{1/(mq)}\bigr)+P \bigl(|X|>t^{1/(mq)}\bigr) \bigr)\,dt \\ &\quad \le C\sum_{n=1}^{\infty} n^{\alpha( p-mq)-1} \sum_{k=n}^{\infty}\int _{k^{mq \alpha}}^{(k+1)^{mq \alpha}} x^{-v s/(mq)} E|X|^{vs}I \bigl(|X|\le x^{1/(mq)}\bigr)\,dx+C \\ & \quad\le C\sum_{n=1}^{\infty} n^{\alpha( p-mq)-1} \sum_{k=n}^{\infty}k^{mq \alpha-vs\alpha-1} E|X|^{v s}I\bigl(|X|\le(k+1)^{\alpha}\bigr)+C \\ &\quad \le C\sum_{k=1}^{\infty}k^{mq \alpha-vs\alpha-1} E|X|^{v s}I\bigl(|X|\le (k+1)^{\alpha}\bigr) \sum _{n=1}^{k} n^{\alpha( p-mq)-1}+C \\ & \quad\le \textstyle\begin{cases} C\sum_{k=1}^{\infty} k^{\alpha p-v s\alpha-1} E|X|^{v s} I(|X|\le k^{\alpha})+C ,& \mbox{if } mq< p ,\\ C\sum_{k=1}^{\infty} k^{\alpha p-v s \alpha-1}\log(1+k) E|X|^{v s}I(|X|\le k^{\alpha})+C,&\mbox{if } mq=p ,\\ C\sum_{k=1}^{\infty} k^{mq \alpha-vs\alpha-1} E|X|^{v s}I(|X|\le k^{\alpha})+C,& \mbox{if } mq>p \end{cases}\displaystyle \displaystyle \\ &\quad \le \textstyle\begin{cases} C E|X|^{p}+C,& \mbox{if } mq< p ,\\ CE|X|^{p}\log(1+|X|)+C,& \mbox{if } mq=p,\\ CE|X|^{mq}+C,& \mbox{if } mq>p \end{cases}\displaystyle \displaystyle \displaystyle \displaystyle \displaystyle \\ &\quad < \infty. \end{aligned}$$

 □

3 Proofs

Proof of Theorem 1.1

First, we prove (1.2). By Lemma 2.5, Lemma 2.6, and the Jensen inequality (see Ragusa and Tachikawa [31]), in order to prove (1.2), it suffices to show that

$$ \sum_{n=m}^{\infty}n^{\alpha p-2} P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{j=1}^{k} X_{j}\Biggr\vert >\epsilon n^{\alpha} \Biggr)< \infty,\quad \forall \varepsilon>0, $$
(3.1)

and

$$ \sum_{n=m}^{\infty}n^{\alpha p-2}P \Biggl( \max_{m\le k\le n}\sum_{j=1}^{k} X_{j}^{2}>\epsilon n^{2\alpha} \Biggr)< \infty, \quad\forall \varepsilon>0. $$
(3.2)

Utilizing a method similar to the proof of Theorem 2.1 of Qiu et al. [6], we can prove (3.1). Now we prove (3.2). Since \(X_{n}^{2}=X_{n}^{2}I(X_{n}<0)+X_{n}^{2}I(X_{n}\ge0)\), without loss of generality we may assume that \(X_{n}\ge0\), \(n\ge1\). Note that \(\{X_{n}^{2},n\ge 1\}\prec X^{2}\), \(E(X^{2})^{p/2}= E|X|^{p}<\infty\), and \(2\alpha>1\); hence (3.2) holds by Lemma 2.1 and (3.1). Therefore, (1.2) holds.

Next, we prove (1.3). For any fixed positive integer m, there exists a positive integer \(i_{0}\) such that \(2^{i_{0}-1}\le m<2^{i_{0}}\). Thus, by (1.2) and \(\alpha p>1\) we have

$$\begin{aligned} &\sum_{n=2^{i_{0}}}^{\infty} n^{\alpha p-2} P \Biggl(\sup_{ k\ge n}k^{-m\alpha}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon \Biggr) \\ &\quad = \sum_{i=i_{0}}^{\infty} \sum _{n=2^{i}}^{2^{i+1}-1}n^{\alpha p-2} P \Biggl(\sup _{ k\ge n}k^{-m\alpha}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon \Biggr) \\ &\quad \le C\sum_{i=i_{0}}^{\infty} 2^{i(\alpha p-1)} \sum_{l=i}^{\infty}P \Biggl( \max_{ 2^{l}\le k< 2^{l+1}}k^{-m\alpha}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots < i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon \Biggr) \\ & \quad\le C\sum_{l=i_{0}}^{\infty} 2^{l(\alpha p-1)} P \Biggl(\max_{ 2^{l}\le k< 2^{l+1}}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon2^{m\alpha l} \Biggr) \\ &\quad \le C\sum_{l=i_{0}}^{\infty} 2^{l(\alpha p-1)} P \Biggl(\max_{m\le k< 2^{l+1}}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon2^{m\alpha l} \Biggr) \\ & \quad\le C\sum_{n=m}^{\infty} n^{\alpha p-2} P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon_{0} n^{m\alpha} \Biggr) < \infty, \end{aligned}$$

where \(\varepsilon_{0}=2^{-m\alpha}\varepsilon\). Therefore, (1.3) holds. □

Proof of Theorem 1.2

First, we prove (1.5). Denote \(h(n)=n^{\alpha( p-mq)-2 }\). Note that for all \(\varepsilon>0\),

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n) E \Biggl\{ \max _{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k} \prod_{j=1}^{m} X_{i_{j}}\Biggr\vert -\epsilon n^{m\alpha} \Biggr\} _{+}^{q} \\ &\quad= \sum_{n=m}^{\infty}h(n)\int _{0}^{\infty}P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert - \varepsilon n^{m\alpha}>t^{1/q} \Biggr)\,dt \\ &\quad= \sum_{n=m}^{\infty}h(n)\int _{0}^{n^{m q \alpha}} P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert - \varepsilon n^{m\alpha}>t^{1/q} \Biggr)\,dt \\ &\qquad{} + \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert - \varepsilon n^{m\alpha}>t^{1/q} \Biggr)\,dt \\ &\quad\le \sum_{n=m}^{\infty}n^{\alpha p-2 } P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon n^{m\alpha} \Biggr) \\ &\qquad{} + \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert >t^{1/q} \Biggr)\,dt. \end{aligned}$$
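
The first equality above is the layer-cake identity \(EZ_{+}^{q}=\int_{0}^{\infty}P(Z>t^{1/q})\,dt\). A numerical sketch of the identity for a hypothetical three-point random variable Z (not from the paper), comparing the exact moment with a midpoint-rule evaluation of the integral:

```python
# Numerical sketch of E[Z_+^q] = ∫_0^∞ P(Z > t**(1/q)) dt, with q = 2
# and an illustrative three-point Z.
q = 2.0
atoms = [(-1.0, 0.2), (0.5, 0.5), (2.0, 0.3)]   # (value, probability)

exact = sum(p * max(z, 0.0) ** q for z, p in atoms)

def surv(u):                                    # P(Z > u)
    return sum(p for z, p in atoms if z > u)

dt, T = 1e-4, 8.0                               # Z_+^q <= 4, so T = 8 suffices
integral = sum(surv(((k + 0.5) * dt) ** (1 / q)) * dt
               for k in range(int(T / dt)))     # midpoint rule
assert abs(integral - exact) < 1e-2
print(round(exact, 3))  # 1.325
```

The integrand is a step function, so the quadrature error comes only from the two jump points and is of order dt.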

By Lemma 2.5 and Lemma 2.6, we have

$$\begin{aligned} &P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >t^{1/q} \Biggr) \\ &\quad\le\sum_{\sum_{j=1}^{m}r_{j}s_{j}=m} P \Biggl(\max _{m\le k\le n}\Biggl\vert \prod_{j=1}^{m} \Biggl(\sum_{i=1}^{k} X_{i}^{r_{j}} \Biggr)^{s_{j}}\Biggr\vert > \bigl(b A(m,r_{j},s_{j}:j=1, \ldots,m) \bigr)^{-1}t^{1/q} \Biggr) \\ &\quad \le\sum_{\sum_{j=1}^{m}r_{j}s_{j}=m}\sum ^{m}_{j=1} P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{i=1}^{k} X_{i}^{r_{j}}\Biggr\vert ^{s_{j}}> \bigl(bA(m,r_{j},s_{j}:j=1,\ldots,m) \bigr)^{-r_{j}s_{j}/m}t^{r_{j}s_{j}/(mq)} \Biggr) \\ &\quad =\sum_{\sum_{j=1}^{m}r_{j}s_{j}=m}\sum ^{m}_{j=1} P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{i=1}^{k} X_{i}^{r_{j}}\Biggr\vert > \bigl(bA(m,r_{j},s_{j}:j=1, \ldots ,m) \bigr)^{-r_{j}/m}t^{r_{j}/(mq)} \Biggr), \end{aligned}$$

where \(b=\sharp\{(r_{j},s_{j}:j=1,\ldots,m):\sum_{j=1}^{m}r_{j}s_{j}=m\}\). Obviously, b is a constant depending only on m. Therefore, in order to prove (1.5), by Theorem 1.1 and the above inequality it is enough to show that for all integers v with \(1\le v\le m\) we have

$$ \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j}^{v} \Biggr\vert >\varepsilon t^{v/(mq)} \Biggr)\,dt< \infty\quad\mbox{for all }\varepsilon>0. $$
(3.3)

We first prove that for any fixed positive integer \(n\ge m\), \(\forall \varepsilon>0\), \(1\le v\le m\),

$$\begin{aligned} M_{n}&:=\int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} \bigl( |X_{j}|^{v}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr)+t^{v/(mq)}I \bigl(|X_{j}|>t^{1/(mq)}\bigr) \bigr)> \varepsilon t^{v/(mq)} \Biggr)\,dt \\ &< \infty. \end{aligned}$$
(3.4)

If \(\max\{p, mq\}< v\), take \(s=1\); then by Lemma 2.8 we have

$$\begin{aligned} M_{n} \le{}& \int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum_{j=1}^{n} |X_{j}|^{v}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr)> \varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ &{} +\int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} t^{v/(mq)}I\bigl(|X_{j}|>t^{1/(mq)} \bigr)> \varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ \le{}& 2 \varepsilon^{-1} \int_{n^{mq \alpha}}^{\infty}t^{-v/(mq)} \Biggl(\sum_{j=1}^{n} E |X_{j}|^{v}I\bigl(|X_{j}|\le t^{1/(mq)} \bigr) \Biggr)\,dt \\ &{}+\int_{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl( |X_{j}|>t^{1/(mq)} \bigr) \Biggr)\,dt \\ < {}& \infty. \end{aligned}$$

If \(\max\{p, mq\}\ge v\), by (1.4), the \(C_{r}\) inequality, and Lemma 2.4(i) we have

$$\begin{aligned} M_{n}& \le \int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum_{j=1}^{n} |X_{j}|^{v}> \varepsilon t^{v/(mq)} \Biggr)\,dt \\ & \le\int_{0}^{\infty}P \Biggl( \Biggl(\sum _{j=1}^{n} |X_{j}|^{v} \Biggr)^{mq/v}>\varepsilon^{mq/v} t \Biggr)\,dt = CE \Biggl(\sum _{j=1}^{n} |X_{j}|^{v} \Biggr)^{mq/v} \\ & \le \textstyle\begin{cases} C\sum_{j=1}^{n} E|X_{j}|^{mq},& \mbox{if } mq\le v ,\\ Cn^{mq/v-1}\sum_{j=1}^{n} E|X_{j}|^{mq},& \mbox{if } mq>v \end{cases}\displaystyle \\ & \le \textstyle\begin{cases} Cn E|X|^{mq},& \mbox{if } mq\le v ,\\ Cn^{mq/v} E|X|^{mq},& \mbox{if } mq>v \end{cases}\displaystyle \\ & < \infty. \end{aligned}$$

Therefore, (3.4) holds. To prove (3.3), we consider two cases.

Case 1: \(2\le v\le m\). We note

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j}^{v} \Biggr\vert >\varepsilon t^{\frac {v}{mq}} \Biggr)\,dt \\ &\quad\le \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} |X_{j}|^{v} > \varepsilon t^{\frac{v}{mq}} \Biggr)\,dt \end{aligned}$$
(3.5)

and

$$|X_{j}|^{v}=|X_{j}|^{v}I(X_{j}< 0)+|X_{j}|^{v}I(X_{j} \ge0). $$

Therefore, without loss of generality, we assume that \(X_{j}\ge0\), \(j\ge 1\). Note that

$$X_{j}^{v}= X_{j}^{v}I \bigl(X_{j}\le t^{1/(mq)}\bigr)+t^{v/(mq)}I \bigl(X_{j}>t^{1/(mq)}\bigr)+\bigl(X_{j}^{v}-t^{v/(mq)} \bigr)I\bigl(X_{j}>t^{1/(mq)}\bigr),\quad j\ge1. $$
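
This three-term split is an exact pointwise identity for \(x\ge0\); a quick sketch with an illustrative cutoff c standing in for \(t^{1/(mq)}\):

```python
# Pointwise check (illustrative values) that the three-term split above
# reproduces x**v exactly for x >= 0, with cutoff c in place of t**(1/(m*q)).
def split(x, v, c):
    piece1 = x ** v if x <= c else 0.0           # x**v * I(x <= c)
    piece2 = c ** v if x > c else 0.0            # c**v * I(x > c)
    piece3 = (x ** v - c ** v) if x > c else 0.0 # (x**v - c**v) * I(x > c)
    return piece1 + piece2 + piece3

c, v = 1.7, 3
for x in [0.0, 0.5, 1.7, 2.0, 10.0]:
    assert abs(split(x, v, c) - x ** v) < 1e-9
print("split is exact")
```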

Denote \(Y_{j}^{(v,t)}=X_{j}^{v}I(X_{j}\le t^{1/(mq)})+t^{v/(mq)}I(X_{j}>t^{1/(mq)})\); then we have

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} X_{j}^{v} > \varepsilon t^{v/(mq)} \Biggr)\,dt \\ & \quad\le \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} \bigl(X_{j}^{v}-t^{v/(mq)} \bigr)I\bigl(X_{j}>t^{1/(mq)} \bigr) >\varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ & \qquad{} +\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} Y_{j}^{(v,t)}> \varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ &\quad = I_{1}^{(v)}+I_{2}^{(v)}. \end{aligned}$$

By Lemma 2.8 we have

$$I_{1}^{(v)} \le\sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl( X_{j}>t^{1/(mq)} \bigr) \Biggr)\,dt < \infty. $$

For \(I_{2}^{(v)}\), we first prove

$$ D_{n}:=\sup_{t\ge n^{mq\alpha}}t^{-v/(mq)}\sum _{j=1}^{n} EY_{j}^{(v,t)}\to0, \quad n\to\infty. $$
(3.6)

We consider two cases. The first one is \(p< v\). Using Lemma 2.4(i), (1.4), and \(\alpha p>1\), we get

$$ \begin{aligned}[b] D_{n}&\le\sup_{t\ge n^{mq\alpha}}t^{-v/(mq)}\sum _{j=1}^{n} t^{(v-p)/(mq)} E X_{j}^{p} \le Cn \sup_{t\ge n^{mq\alpha}}t^{-p/(mq)} E |X|^{p} \\ &\le Cn^{1-\alpha p}\to0,\quad n\to\infty. \end{aligned} $$
(3.7)

Let us now consider the second case: \(p\ge v\). Note that \(E|X|^{v}<\infty \) by (1.4), thus, we also get by Lemma 2.4(i) and \(\alpha>1/2\)

$$ D_{n} \le\sup_{t\ge n^{mq\alpha}}t^{-v/(mq)}\sum _{j=1}^{n} E X_{j}^{v} \le Cn \sup_{t\ge n^{mq\alpha}}t^{-v/(mq)} E |X|^{v} \le Cn^{1-v\alpha}\to0,\quad n\to\infty. $$
(3.8)

Therefore, (3.6) holds. In order to prove \(I_{2}^{(v)}<\infty\), by (3.4) and (3.6), it is enough to show that

$$ I_{2}^{(v*)} = \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum _{j=1}^{n} \bigl(Y_{j}^{(v,t)}-EY_{j}^{(v,t)} \bigr)>\varepsilon t^{v/(mq)}/4 \Biggr)\,dt < \infty. $$
(3.9)

Take s such that \(s>\max\{2, p,mq,2mq/p,2(\alpha p-1)/(2v\alpha-1)\}\). Then, using the Markov inequality, Lemma 2.1, Lemma 2.2, and the Jensen inequality, we have

$$\begin{aligned} I_{2}^{(v*)} \le{}& C\sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl\{ \sum _{j=1}^{n}E \bigl(Y_{j}^{(v,t)} \bigr)^{s}+ \Biggl(\sum_{j=1}^{n}E \bigl(Y_{j}^{(v,t)} \bigr)^{2} \Biggr)^{s/2} \Biggr\} \,dt \\ \le{}& C\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl\{ \sum _{j=1}^{n} \bigl(EX_{j}^{vs}I \bigl(X_{j}\le t^{1/(mq)}\bigr)+t^{vs/(mq)} P \bigl(X_{j}>t^{1/(mq)}\bigr) \bigr) \Biggr\} \,dt \\ &{} + C\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl\{ \sum _{j=1}^{n} \bigl(EX_{j}^{2v}I \bigl(X_{j}\le t^{1/(mq)}\bigr)\\ &{}+t^{2v/(mq)} P \bigl(X_{j}>t^{1/(mq)}\bigr) \bigr) \Biggr\} ^{s/2}\,dt \\ ={}& CI_{21}^{(v*)}+CI_{22}^{(v*)}. \end{aligned}$$

By Lemma 2.8, we get \(I_{21}^{(v*)}<\infty\). For \(I_{22}^{(v*)}\), in the case \(p< 2v\), we have by Lemma 2.4(i) and (1.4)

$$\begin{aligned} I_{22}^{(v*)} & \le \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \Biggl(\sum _{j=1}^{n} t^{(2v-p)/(mq)}E|X_{j}|^{p} \Biggr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \int_{n^{mq \alpha }}^{\infty}t^{-ps/(2mq)} \bigl(E|X|^{p} \bigr)^{s/2}\,dt \\ & = C\sum_{n=m}^{\infty}n^{\alpha p-2- (\alpha p-1)s/2 }< \infty. \end{aligned}$$

In the case \(p\ge 2v\), note that \(E|X|^{2v}<\infty\) by (1.4), thus, we get by Lemma 2.4(i)

$$\begin{aligned} I_{22}^{(v*)} & \le \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \Biggl( \sum _{j=1}^{n} E|X_{j}|^{2v} \Biggr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \int_{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \bigl(E|X|^{2v} \bigr)^{s/2}\,dt \le C\sum_{n=m}^{\infty}n^{\alpha p-2-(v\alpha-1/2)s }< \infty. \end{aligned}$$

Therefore, \(I_{2}^{(v*)}<\infty\), hence

$$\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} X_{j}^{v} > \varepsilon t^{v/(mq)} \Biggr)\,dt< \infty. $$

Thus (3.3) holds by (3.5) in the case \(2\le v\le m\).

Case 2: \(v=1\). We choose γ such that \(1/(\alpha p)<\gamma<1\). For all \(j\ge1\) and \(t>0\), let

$$\begin{aligned} &X_{j}^{(t,1)}=-t^{\gamma/(mq)}I\bigl(X_{j}< -t^{\gamma/(mq)} \bigr)+X_{j} I\bigl(|X_{j}|\le t^{\gamma/(mq)} \bigr)+t^{\gamma/(mq)}I\bigl(X_{j}>t^{\gamma/(mq)}\bigr), \\ &X_{j}^{(t,2)}=\bigl(X_{j}-t^{\gamma/(mq)} \bigr)I\bigl(t^{\gamma/(mq)}< X_{j}\le t^{1/(mq)} \bigr)+t^{1/(mq)}I\bigl(X_{j}>t^{1/(mq)}\bigr), \\ &X_{j}^{(t,3)}=\bigl(X_{j}-t^{\gamma/(mq)}-t^{1/(mq)} \bigr)I\bigl(X_{j}>t^{1/(mq)}\bigr), \\ &X_{j}^{(t,4)}=\bigl(X_{j}+t^{\gamma/(mq)} \bigr)I\bigl(-t^{1/(mq)}\le X_{j}< -t^{\gamma /(mq)} \bigr)-t^{1/(mq)}I\bigl(X_{j}< -t^{1/(mq)}\bigr), \\ &X_{j}^{(t,5)}=\bigl(X_{j}+t^{\gamma/(mq)}+t^{1/(mq)} \bigr)I\bigl(X_{j}< -t^{1/(mq)}\bigr). \end{aligned}$$

Hence \(X_{j}=\sum_{l=1}^{5} X_{j}^{(t,l)}\). Note that
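
With \(a=t^{\gamma/(mq)}\) and \(b=t^{1/(mq)}\) (so \(0<a<b\) for \(t\ge1\)), the five truncation pieces reassemble \(X_{j}\) exactly; a pointwise sketch with illustrative cutoffs:

```python
# Pointwise check (illustrative cutoffs 0 < a < b standing in for
# t**(gamma/(m*q)) and t**(1/(m*q))) that the five pieces sum back to x.
# Python booleans act as 0/1 indicator factors below.
def pieces(x, a, b):
    x1 = -a * (x < -a) + x * (abs(x) <= a) + a * (x > a)
    x2 = (x - a) * (a < x <= b) + b * (x > b)
    x3 = (x - a - b) * (x > b)
    x4 = (x + a) * (-b <= x < -a) - b * (x < -b)
    x5 = (x + a + b) * (x < -b)
    return x1 + x2 + x3 + x4 + x5

a, b = 1.5, 2.5
for x in [-10.0, -2.5, -2.0, -1.5, 0.0, 1.0, 2.0, 2.5, 7.0]:
    assert abs(pieces(x, a, b) - x) < 1e-9
print("decomposition verified")
```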

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j} \Biggr\vert >\varepsilon t^{1/(mq)} \Biggr)\,dt \\ &\quad \le\sum_{l=1}^{5} \sum _{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{j=1}^{k}X_{j}^{(t,l)}\Biggr\vert >\varepsilon t^{1/(mq)}/5 \Biggr)\,dt = \sum_{l=1}^{5} J_{l}. \end{aligned}$$

In order to prove (3.3) in the case \(v=1\), it suffices to show that \(J_{l}<\infty\) for \(1\le l\le5\).

For \(J_{1}\), we first prove that for any fixed positive integer \(n\ge m\),

$$ \int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} t^{\gamma /(mq)}I\bigl(|X_{j}|>t^{\gamma/(mq)} \bigr) > \varepsilon t^{1/(mq)} \Biggr)\,dt< \infty \quad\mbox{for all } \varepsilon>0. $$
(3.10)

Choose \(n_{0}>n\) such that \(nn_{0}^{(\gamma-1)\alpha}<\varepsilon/2\); then by Lemma 2.8 we have

$$\begin{aligned} &\int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} t^{\gamma /(mq)}I\bigl(|X_{j}|>t^{\gamma/(mq)} \bigr) > \varepsilon t^{1/(mq)} \Biggr)\,dt \\ &\quad \le \int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} \bigl( t^{(\gamma -1)/(mq)}I \bigl(t^{\gamma/(mq)}< |X_{j}|\le t^{1/(mq)}\bigr) +I \bigl(|X_{j}|> t^{1/(mq)}\bigr) \bigr)> \varepsilon \Biggr)\,dt \\ &\quad \le \int_{n^{mq \alpha}}^{n_{0}^{mq \alpha}}\,dt +\int _{n_{0}^{mq \alpha }}^{\infty}P \Biggl(nn_{0}^{(\gamma-1)\alpha}+ \sum_{j=1}^{n} I\bigl(|X_{j}|> t^{1/(mq)}\bigr) > \varepsilon \Biggr)\,dt \\ &\quad \le C +\int_{n_{0}^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} I\bigl(|X_{j}|> t^{1/(mq)}\bigr) > \varepsilon/2 \Biggr)\,dt \\ &\quad \le C +\int_{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl(|X_{j}|> t^{1/(mq)}\bigr) \Biggr)\,dt \\ &\quad < \infty. \end{aligned}$$

Therefore, (3.10) holds. Now we prove that

$$ E_{n}:=\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\max _{1\le k\le n}\Biggl\vert \sum_{j=1}^{k}EX_{j}^{(t,1)} \Biggr\vert \to0,\quad n\to\infty. $$
(3.11)

We consider three cases. In the case \(\alpha\le 1\), since \(\alpha p>1\) we get \(p>1\). Thus, by \(EX_{j}=0\), \(j\ge1\), Lemma 2.4(iii), and \(1/(\alpha p)<\gamma<1\), we have

$$\begin{aligned} E_{n} & \le \sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n} \bigl(E|X_{j}|I \bigl(|X_{j}|>t^{\gamma/(mq)}\bigr)+t^{\gamma/(mq)}I \bigl(|X_{j}|>t^{\gamma /(mq)}\bigr) \bigr) \\ & \le 2\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n}E|X_{j}|I\bigl(|X_{j}|>t^{\gamma/(mq)} \bigr)\le Cn\sup_{t\ge n^{mq \alpha }}t^{-1/(mq)}E|X|I\bigl(|X|> t^{\gamma/(mq)}\bigr) \\ & \le Cn^{1-\alpha}E|X| I\bigl(|X|> n^{\alpha\gamma}\bigr) \le Cn^{1-\alpha p \gamma-\alpha(1-\gamma)}E|X|^{p}\to0, \quad n\to\infty. \end{aligned}$$

In the case \(\alpha>1\) and \(p< 1\), utilizing a method similar to the proof of (3.7) we also obtain

$$E_{n} \le Cn ^{1-\alpha p \gamma-(1-\gamma)\alpha}\to0,\quad n\to\infty. $$

In the case \(\alpha>1\) and \(p\ge1\), utilizing a method similar to the proof of (3.8) we have

$$E_{n}\le Cn^{1-\alpha}\to0, \quad n\to\infty. $$

Hence, (3.11) holds. In order to prove that \(J_{1}<\infty\), by (3.4), (3.10), and (3.11), it is enough to prove that

$$J_{1}^{*} = \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl(\max _{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j}^{(t,1)}-EX_{j}^{(t,1)}\Biggr\vert > \varepsilon t^{1/(mq)}/10 \Biggr)\,dt < \infty. $$

Take s such that \(s>\max\{2, \frac{mq}{1-\gamma},\frac{p}{1-\gamma },\frac{2mq}{2(1-\gamma)+p\gamma},\frac{2(\alpha p-1)}{2\alpha(1-\gamma )+\alpha p\gamma-1},\frac{2(\alpha p-1)}{2\alpha-1}\}\). Then, by the Markov inequality, Lemma 2.1, Lemma 2.3, and the Jensen inequality, we have

$$\begin{aligned} J_{1}^{*} \le{}& C\sum_{n=m}^{\infty}h(n) \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{-s/(mq)} \Biggl\{ \sum_{j=1}^{n}E \bigl\vert X_{j}^{(t,1)}\bigr\vert ^{s}+ \Biggl( \sum_{j=1}^{n}E \bigl(X_{j}^{(t,1)} \bigr)^{2} \Biggr)^{s/2} \Biggr\} \,dt \\ \le{}& C\sum_{n=m}^{\infty}h(n) \bigl( \log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{\frac{-s}{mq}} \Biggl\{ \sum_{j=1}^{n} \bigl(E|X_{j}|^{s}I\bigl(|X_{j}|\le t^{\frac{\gamma}{mq}}\bigr)+ t^{\frac{\gamma s}{mq}} P\bigl(|X_{j}|>t^{\frac{\gamma}{mq}} \bigr) \bigr) \Biggr\} \,dt \\ &{} + C\sum_{n=m}^{\infty}h(n) \bigl( \log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{\frac{-s}{mq}} \Biggl\{ \sum_{j=1}^{n} \bigl(E|X_{j}|^{2}I\bigl(|X_{j}|\le t^{\frac{\gamma}{mq}}\bigr)\\ &{}+ t^{\frac{2\gamma}{mq}} P\bigl(|X_{j}|>t^{\frac{\gamma}{mq}} \bigr) \bigr) \Biggr\} ^{\frac{s}{2}}\,dt \\ ={}& CJ_{11}^{*} +CJ_{12}^{*} . \end{aligned}$$

By Lemma 2.7, we have \(J_{11}^{*} < \infty\). For \(J_{12}^{*}\), in the case \(0< p< 2\), we have, by Lemma 2.4(i) and (1.4),

$$\begin{aligned} J_{12}^{*} & \le \sum_{n=m}^{\infty}h(n) \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{-s/(mq)} \Biggl(\sum_{j=1}^{n} t^{(2-p)\gamma/(mq)}E |X_{j}|^{p} \Biggr)^{s/2}\,dt \\ &\le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha}}^{\infty}t^{-[2-(2-p)\gamma]s/(2mq)} \bigl(E|X|^{p} \bigr)^{s/2}\,dt \\ &\le C\sum_{n=m}^{\infty}n^{\alpha p -2 -[\alpha(1-\gamma)+(\alpha p\gamma-1)/2] s} \bigl(\log(4n)\bigr)^{s} < \infty. \end{aligned}$$
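The convergence of the final series can be checked directly from the choice of s: the constraint \(s>\frac{2(\alpha p-1)}{2\alpha(1-\gamma)+\alpha p\gamma-1}\) imposed above is exactly equivalent to the exponent of n being smaller than −1,

$$\alpha p -2 -\biggl[\alpha(1-\gamma)+\frac{\alpha p\gamma-1}{2}\biggr] s < -1 \quad\Longleftrightarrow\quad s > \frac{2(\alpha p-1)}{2\alpha(1-\gamma)+\alpha p\gamma-1}, $$

so the general term is \(O(n^{-1-\delta}(\log(4n))^{s})\) for some \(\delta>0\), and the series is finite.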

In the case \(p\ge 2\), noting that \(E|X|^{2}<\infty\), we have by Lemma 2.4(i)

$$\begin{aligned} J_{12}^{*} & \le \sum_{n=m}^{\infty}h(n) \bigl(\log(4n)\bigr)^{s}\int_{n^{m q \alpha }}^{\infty}t^{-s/(mq)} \Biggl(\sum_{j=1}^{n}EX_{j}^{2} \Biggr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha}}^{\infty}t^{-s/(mq)} \bigl(E|X|^{2} \bigr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha p-2-(\alpha-1/2) s } \bigl(\log(4n)\bigr)^{s} < \infty. \end{aligned}$$
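Here, too, the final series is finite precisely because of the constraint on s:

$$\alpha p-2-\biggl(\alpha-\frac{1}{2}\biggr)s< -1 \quad\Longleftrightarrow\quad s>\frac{\alpha p-1}{\alpha-1/2}=\frac{2(\alpha p-1)}{2\alpha-1}, $$

which is one of the conditions in the definition of s above.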

Thus, \(J_{1}<\infty\).

For \(J_{2}\), we first prove

$$ F_{n}:= \sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n}EX_{j}^{(t,2)} \to0, \quad n \to\infty. $$
(3.12)

We consider two cases. In the case \(p\ge1\), we have by Lemma 2.4(iii)

$$\begin{aligned} 0 & \le F_{n}\le \sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n} \bigl\{ EX_{j}I \bigl(X_{j}>t^{\gamma/(mq)}\bigr)+t^{1/(mq)}P \bigl(X_{j}>t^{1/(mq)}\bigr) \bigr\} \\ & \le 2\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n}EX_{j} I\bigl(X_{j}>t^{\gamma/(mq)} \bigr) \\ & \le Cn\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}E|X|I\bigl(|X|>t^{\gamma /(mq)} \bigr) \\ & \le Cn^{1-\alpha p\gamma-\alpha(1-\gamma)} E|X|^{p} \to0,\quad n\to \infty. \end{aligned}$$

In the case \(0< p<1\), by an argument similar to the proof of (3.7) we have

$$0\le F_{n} \le Cn^{1-\alpha p } \to0,\quad n\to\infty. $$

Therefore, (3.12) holds. By (3.4) and (3.12), in order to prove \(J_{2}<\infty\), it is enough to show that

$$J_{2}^{*} = \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum _{j=1}^{n} \bigl(X_{j}^{(t,2)}-EX_{j}^{(t,2)} \bigr)>\varepsilon t^{1/(mq)}/10 \Biggr)\,dt < \infty. $$

Take s such that \(s>\max\{2, p,mq,2mq/p,(\alpha p-1)/(\alpha-1/2)\}\). Then, by an argument similar to the proof of (3.9), we also have \(J_{2}^{*} <\infty\); hence \(J_{2}<\infty\).

For \(J_{3}\), we get by Lemma 2.8

$$\begin{aligned} J_{3} & \le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int_{n^{mq \alpha }}^{\infty}P \Biggl( \bigcup_{j=1}^{n}\bigl(X_{j}^{(t,3)}\neq0 \bigr) \Biggr)\,dt \\ & \le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int _{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl( X_{j}>t^{1/(mq)} \bigr) \Biggr)\,dt \\ & < \infty. \end{aligned}$$

By an argument similar to the proof of \(J_{2}<\infty\) we have \(J_{4}<\infty\), and by an argument similar to the proof of \(J_{3}<\infty\) we have \(J_{5}<\infty\). Therefore, (1.5) holds.

Finally, similar to the proof of (1.3), we obtain (1.6) from (1.5). □

References

  1. Hsu, P, Robbins, H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 33, 25-31 (1947)

  2. Baum, LE, Katz, M: Convergence rates in the law of large numbers. Trans. Am. Math. Soc. 120, 108-123 (1965)

  3. Katz, ML: The probability in the tail of a distribution. Ann. Math. Stat. 34, 312-318 (1963)

  4. Kuczmaszewska, A: On complete convergence in Marcinkiewicz-Zygmund type SLLN for negatively associated random variables. Acta Math. Hung. 128, 116-130 (2010)

  5. Qiu, D, Chen, P, Antonini, RG, Volodin, A: On the complete convergence for arrays of rowwise extended negatively dependent random variables. J. Korean Math. Soc. 50, 379-392 (2013)

  6. Qiu, D, Wu, Q, Chen, P: Complete convergence for negatively orthant dependent random variables. J. Inequal. Appl. 2014, 145 (2014)

  7. Wang, XJ, Hu, SH, Hu, T-C: Complete convergence for weighted sums and arrays of rowwise END sequences. Commun. Stat., Theory Methods 42, 2391-2401 (2013)

  8. Wang, XJ, Li, XQ, Hu, SH, Wang, XH: On complete convergence for an extended negatively dependent sequence. Commun. Stat., Theory Methods 43, 2923-2937 (2014)

  9. Wu, YF, Cabrera, MO, Volodin, A: Complete convergence and complete moment convergence for arrays of rowwise END random variables. Glas. Mat. 49(69), 449-468 (2014)

  10. Zhang, L, Wang, JF: A note on complete convergence of pairwise NQD random sequences. Appl. Math. J. Chin. Univ. Ser. A 19, 203-208 (2004)

  11. Chow, YS: On the rate of moment convergence of sample sums and extremes. Bull. Inst. Math. Acad. Sin. 16, 177-201 (1988)

  12. Chen, P, Wang, D: Complete moment convergence for sequence of identically distributed φ-mixing random variables. Acta Math. Sin. Engl. Ser. 26, 679-690 (2010)

  13. Li, Y, Zhang, L: Complete moment convergence of moving-average processes under dependence assumptions. Stat. Probab. Lett. 70, 191-197 (2004)

  14. Liang, H, Li, D, Rosalsky, A: Complete moment and integral convergence for sums of negatively associated random variables. Acta Math. Sin. Engl. Ser. 26, 419-432 (2010)

  15. Qiu, D, Chen, P: Complete moment convergence for i.i.d. random variables. Stat. Probab. Lett. 91, 76-82 (2014)

  16. Qiu, D, Liu, XD, Chen, P: Complete moment convergence for maximal partial sums under NOD setup. J. Inequal. Appl. 2015, 58 (2015)

  17. Sung, SH: Complete qth moment convergence for arrays of random variables. J. Inequal. Appl. 2013, 24 (2013)

  18. Wang, D, Su, C: Moment complete convergence for sequences of B-valued i.i.d. random elements. Acta Math. Appl. Sin. 27, 440-448 (2004) (in Chinese)

  19. Liu, L: Precise large deviations for dependent random variables with heavy tails. Stat. Probab. Lett. 79, 1290-1298 (2009)

  20. Alam, K, Saxena, KML: Positive dependence in multivariate distributions. Commun. Stat., Theory Methods 10, 1183-1196 (1981)

  21. Block, HW, Savits, TH, Shaked, M: Some concepts of negative dependence. Ann. Probab. 10(3), 765-772 (1982)

  22. Joag-Dev, K, Proschan, F: Negative association of random variables with applications. Ann. Stat. 11, 286-295 (1983)

  23. Chen, Y, Wang, YB, Wang, KY: Asymptotic results for ruin probability of a two-dimensional renewal risk model. Stoch. Anal. Appl. 31, 80-91 (2013)

  24. Chen, YQ, Chen, AY, Ng, KW: The strong law of large numbers for extended negatively dependent random variables. J. Appl. Probab. 47, 908-922 (2010)

  25. Liu, L: Necessary and sufficient conditions for moderate deviations of dependent random variables with heavy tails. Sci. China Math. 53, 1421-1434 (2010)

  26. Shen, AT: Probability inequalities for END sequence and their applications. J. Inequal. Appl. 2011, 98 (2011)

  27. Yang, Y, Wang, YB: Tail behavior of the product of two dependent random variables with applications to risk theory. Extremes 16, 55-74 (2013)

  28. Wu, YF, Peng, JY, Hu, T-C: Limiting behaviour for arrays of row-wise END random variables under conditions of h-integrability. Stochastics 87(3), 409-423 (2015)

  29. Stout, WF: Almost Sure Convergence. Academic Press, New York (1974)

  30. Wang, YB, Yan, J, Cheng, FY, Cai, XZ: On the strong stability for Jamison type weighted product sums of pairwise NQD series with different distribution. Chin. Ann. Math., Ser. A 22, 701-706 (2001)

  31. Ragusa, MA, Tachikawa, A: Partial regularity of the minimizers of quadratic functionals with VMO coefficients. J. Lond. Math. Soc. (2) 72(3), 609-620 (2005)


Acknowledgements

The work of Qiu was supported by the National Natural Science Foundation of China (Grant No. 61300204), the work of Chen was supported by the National Natural Science Foundation of China (Grant No. 11271161).

Author information

Correspondence to Dehua Qiu.


Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors contributed equally to the writing of this paper. All authors read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Qiu, D., Chen, P. Complete moment convergence for product sums of sequence of extended negatively dependent random variables. J Inequal Appl 2015, 212 (2015). https://doi.org/10.1186/s13660-015-0730-4

