Central Limit Theorems for Weakly Dependent Random Fields

Alexander Bulinski and Evgeny Spodarev
Abstract This chapter is a primer on limit theorems for dependent random fields. First, dependence concepts such as mixing, association and their generalizations are introduced. Then, moment inequalities for sums of dependent random variables are stated; these yield, in particular, the asymptotic behaviour of the variance of such sums, which is essential for the proof of limit theorems. Finally, central limit theorems for dependent random fields are given. Applications to excursion sets of random fields and Newman's conjecture in the absence of finite susceptibility are discussed as well.
10.1 Dependence Concepts for Random Fields

This section reviews several important dependence concepts for random variables and random fields, such as mixing and m-dependence (already touched upon in Sect. 4.3 for point processes), association (both positive and negative), quasi-association, etc. Special attention is paid to the association of random elements with values in partially ordered spaces and to the Fortuin–Kasteleyn–Ginibre inequalities.
10.1.1 Families of Independent and Dependent Random Variables

We consider a real-valued random function $\xi = \{\xi(t),\ t \in T\}$ defined on a probability space $(\Omega, \mathcal{A}, \mathrm{P})$ and a set $T$, i.e. $\xi(t)\colon \Omega \to \mathbb{R}$ is a random variable for any $t \in T$. Recall the following basic concept.

Definition 10.1. A family $\xi = \{\xi(t),\ t \in T\}$ consists of independent random variables if for each finite set $J \subset T$ and any collection of sets $B_t \in \mathcal{B}(\mathbb{R})$, $t \in J$, one has

\[
\mathrm{P}\Bigl(\,\bigcap_{t \in J} \{\xi(t) \in B_t\}\Bigr) = \prod_{t \in J} \mathrm{P}\bigl(\xi(t) \in B_t\bigr). \tag{10.1}
\]
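As an illustration of (10.1), consider the two-point index set $J = \{s, t\}$ with $B_s = (-\infty, x]$ and $B_t = (-\infty, y]$; then (10.1) reads

\[
\mathrm{P}\bigl(\xi(s) \le x,\ \xi(t) \le y\bigr) = \mathrm{P}\bigl(\xi(s) \le x\bigr)\,\mathrm{P}\bigl(\xi(t) \le y\bigr), \qquad x, y \in \mathbb{R},
\]

i.e. the familiar factorization of the joint distribution function. Since the intervals $(-\infty, x]$ generate $\mathcal{B}(\mathbb{R})$, requiring such factorizations of the finite-dimensional distribution functions for all finite $J$ already yields (10.1) in full generality, by a standard Dynkin system argument.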
If (10.1) does not hold, then $\xi$ is called a family of dependent random variables. The independence of events $\{A_t,\ t \in T\}$ can be defined as the independence of the random variables $\{\mathbf{1}(A_t),\ t \in T\}$.

Exercise 10.1. Prove that the validity of (10.1) is equivalent to the following statement: for all finite disjoint sets $I = \{s_1, \ldots, s_k\} \subset T$, $J = \{t_1, \ldots, t_m\} \subset T$ (with all possible values $k, m \in \mathbb{N}$) and any bounded Borel functions $f\colon \mathbb{R}^k \to \mathbb{R}$, $g\colon \mathbb{R}^m \to \mathbb{R}$,

\[
\mathrm{cov}\bigl(f(\xi(s_1), \ldots, \xi(s_k)),\ g(\xi(t_1), \ldots, \xi(t_m))\bigr) = 0. \tag{10.2}
\]
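For a quick numerical sanity check of (10.2) in the simplest case $k = m = 1$, one may estimate the covariance by Monte Carlo. The following Python sketch (the distributions and the bounded functions $f$, $g$ are chosen arbitrarily for illustration, not prescribed by the text) compares an independent pair with a dependent one.

```python
import numpy as np

# Monte Carlo illustration of the covariance criterion (10.2) for k = m = 1.
rng = np.random.default_rng(0)
n = 200_000

# Independent case: xi(s) and xi(t) are i.i.d. standard normal.
xs = rng.standard_normal(n)
xt = rng.standard_normal(n)

# Dependent case: eta(t) = xi(s) + noise, so the pair is positively correlated.
et = xs + 0.5 * rng.standard_normal(n)

# Bounded Borel functions f and g (arbitrary choices for the illustration).
f = np.tanh                            # f: R -> (-1, 1)
g = lambda x: np.clip(x, -1.0, 1.0)    # g: R -> [-1, 1]

def emp_cov(a, b):
    """Empirical covariance of two samples of equal length."""
    return float(np.mean(a * b) - np.mean(a) * np.mean(b))

print("independent pair:", emp_cov(f(xs), g(xt)))  # fluctuates around 0
print("dependent pair  :", emp_cov(f(xs), g(et)))  # clearly nonzero
```

For the independent pair the empirical covariance fluctuates around zero at the Monte Carlo error scale $O(n^{-1/2})$, in line with (10.2), whereas for the dependent pair it stabilizes at a clearly nonzero value.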
Definition 10.1 can easily be extended to comprise random elements $\xi(t)\colon \Omega \to S_t$, where $(S_t, \mathcal{B}_t)$ are any measurable spaces and $\xi(t) \in \mathcal{A}|\mathcal{B}_t$ for each $t \in T$. Note that (10.1) is a particular case of the independence notion for an arbitrary family of $\sigma$-algebras (for every $t \in T$ we use the $\sigma$-algebra $\sigma\{\xi(t)\} = \{\xi(t)^{-1}(B) \colon B \in \mathcal{B}_t\}$). Due to Theorem 9.1, one can construct a collection $\{\xi(t),\ t \in T\}$ of independent random variables on some probability space $(\Omega, \mathcal{A}, \mathrm{P})$, defined on an arbitrary set $T$ and taking values in given measurable spaces $(S_t, \mathcal{B}_t)$.
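A small worked remark tying the last two observations together: for an event $A_t \in \mathcal{A}$ the indicator random variable $\mathbf{1}(A_t)$ generates the finite $\sigma$-algebra

\[
\sigma\{\mathbf{1}(A_t)\} = \{\emptyset,\ A_t,\ A_t^{c},\ \Omega\},
\]

so the independence of the events $\{A_t,\ t \in T\}$ introduced above is precisely the independence of these $\sigma$-algebras.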