On complete moment convergence for weighted sums of negatively superadditive dependent random variables
APPLICATIONS OF MATHEMATICS, No. 4, 355–377
ON COMPLETE MOMENT CONVERGENCE FOR WEIGHTED SUMS OF NEGATIVELY SUPERADDITIVE DEPENDENT RANDOM VARIABLES

Haiwu Huang, Hengyang; Xuewen Lu, Calgary

Received September 23, 2018. Published online June 15, 2020.
Abstract. In this work, the complete moment convergence and complete convergence for weighted sums of negatively superadditive dependent (NSD) random variables are studied, and some equivalent conditions for these strong convergences are established. These main results generalize and improve the corresponding theorems of Baum and Katz (1965) and Chow (1988) to weighted sums of NSD random variables without the assumption of identical distribution. As an application, a Marcinkiewicz–Zygmund-type strong law of large numbers for weighted sums of NSD random variables is obtained.

Keywords: NSD random variables; complete moment convergence; weighted sum; equivalent conditions

MSC 2020: 60F15
1. Introduction

The concept of complete convergence was first introduced by Hsu and Robbins [13] as follows: a sequence $\{X_n;\, n \geq 1\}$ of random variables is said to converge completely to a constant $\lambda$ if
$$\sum_{n=1}^{\infty} P(|X_n - \lambda| > \varepsilon) < \infty \quad \text{for all } \varepsilon > 0.$$
In view of the Borel–Cantelli lemma, this implies that $X_n \to \lambda$ almost surely (a.s.). The converse is true if $\{X_n;\, n \geq 1\}$ is independent. Hsu and Robbins [13] showed that the sequence of arithmetic means of independent and identically distributed random variables converges completely to the expected value of the summands, provided the variance is finite. Erdős [11] proved the converse. The Hsu–Robbins–Erdős theorem is a fundamental result in probability theory, which has been generalized and extended in several ways. One of the most important generalizations was provided by Baum and Katz [4] for the following strong law of large numbers.

Theorem A. Let $\frac{1}{2} < \alpha \leq 1$ and $\alpha p \geq 1$. Let $\{X, X_n;\, n \geq 1\}$ be a sequence of independent and identically distributed random variables with $EX_n = 0$. Then the following statements are equivalent:
$$(1.1) \quad E|X|^p < \infty,$$
$$(1.2) \quad \sum_{n=1}^{\infty} n^{\alpha p - 2}\, P\Bigl(\max_{1 \leq j \leq n} \Bigl|\sum_{i=1}^{j} X_i\Bigr| > \varepsilon n^{\alpha}\Bigr) < \infty \quad \text{for all } \varepsilon > 0.$$

Footnote: This paper is supported by the Science and Technology Plan Project of Hunan Province (2016TP1020), the Scientific Research Fund of Hunan Provincial Education Department (18C0660), the National Statistical Science Research Project of China (2018LY05), the State Scholarship Fund of China Scholarship Council (No. 201908430242) and the Discovery Grants (RGPIN-2018-06466) from the Natural Sciences and Engineering Research Council of Canada. DOI: 10.21136/AM.2020.0255-18
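The content of Theorem A is that finiteness of $E|X|^p$ is exactly what makes the maximal tail probabilities summable against the weights $n^{\alpha p - 2}$. A quick Monte Carlo sketch (not from the paper; standard normal summands, $\alpha = 1$, and $\varepsilon = 0.5$ are illustrative choices) shows how fast the tails $P(|S_n/n| > \varepsilon)$ collapse as $n$ grows:

```python
import random

random.seed(42)

def tail_prob(n, eps=0.5, trials=2000):
    """Monte Carlo estimate of P(|S_n / n| > eps), where S_n is a sum
    of n i.i.d. standard normal variables (the alpha = 1 scaling)."""
    hits = 0
    for _ in range(trials):
        s = sum(random.gauss(0.0, 1.0) for _ in range(n))
        if abs(s) / n > eps:
            hits += 1
    return hits / trials

# The tails decay rapidly in n; this fast decay is what makes the
# weighted series over n in condition (1.2) finite.
for n in (4, 16, 64):
    print(n, tail_prob(n))
```

Since $S_n/n \sim N(0, 1/n)$ here, the exact tail is $2\Phi(-\varepsilon\sqrt{n})$, so the estimates should drop from roughly $0.32$ at $n = 4$ to nearly $0$ at $n = 64$.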
From then on, many researchers have investigated and improved the Baum–Katz theorem for both independent and dependent random variables. Chow [6] first introduced the concept of complete moment convergence by generalizing the result of Baum and Katz [4] as follows: let $\{X_n;\, n \geq 1\}$ be a sequence of random variables, and let $a_n > 0$, $b_n > 0$, $q > 0$. If
$$\sum_{n=1}^{\infty} a_n E\bigl(b_n^{-1}|X_n| - \varepsilon\bigr)_+^q < \infty \quad \text{for all } \varepsilon > 0,$$
then $\{X_n;\, n \geq 1\}$ is said to have the property of complete moment convergence.
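It is worth noting (a standard observation, not stated explicitly in this excerpt) that complete moment convergence is the stronger of the two properties: Markov's inequality applied to the positive part gives

```latex
\[
a_n E\bigl(b_n^{-1}|X_n| - \varepsilon\bigr)_+^q
  \;\geq\; a_n E\Bigl[\bigl(b_n^{-1}|X_n| - \varepsilon\bigr)_+^q
            \,\mathbf{1}\{|X_n| > 2\varepsilon b_n\}\Bigr]
  \;\geq\; \varepsilon^q\, a_n P\bigl(|X_n| > 2\varepsilon b_n\bigr),
\]
so summability of the left-hand side over $n$ forces
\[
\sum_{n=1}^{\infty} a_n P\bigl(|X_n| > 2\varepsilon b_n\bigr) < \infty
\quad \text{for every } \varepsilon > 0,
\]
```

which is the corresponding complete convergence condition (with weights $a_n$ and norming constants $b_n$); here we used that $b_n^{-1}|X_n| - \varepsilon > \varepsilon$ on the event $\{|X_n| > 2\varepsilon b_n\}$.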