The Law of the Iterated Logarithm

The central limit theorem tells us that suitably normalized sums can be approximated by a normal distribution. Although arbitrarily large values may, and will, occur, one might try to bound their magnitude in some manner. This is what the law of the iterated logarithm (LIL) does, in that it provides a parabolic bound on how large the oscillations of the partial sums may be as a function of the number of summands.

In Theorem 6.9.1 we presented Borel’s theorem [27], stating that almost all numbers are normal. Later, Khintchine [165] proved that, if $N_n$ equals the number of ones among the first $n$ decimals in the binary expansion of a number (in the unit interval), then

$$\limsup_{n\to\infty} \frac{N_n - n/2}{\sqrt{\tfrac{1}{2}\, n \log\log n}} = 1 \quad \text{a.s.}$$

By symmetry, the liminf equals $-1$ almost surely. We also observe that $n/2$ is the expected number of ones. The conclusion thus tells us that the fluctuations around the expected value stay within a precisely given parabola except, possibly, for a finite number of visits outside. So:

• To what extent can this be generalized?
• The result provides the extreme limit points. Are there any more?

The answer to these questions belongs to the realm of the law of the iterated logarithm. If the law of large numbers and the central limit theorem are the two most central and fundamental limit theorems, the law of the iterated logarithm is the hottest candidate for the third position.

In this chapter the main focus is on the Hartman–Wintner–Strassen theorem, which deals with the i.i.d. case; Hartman and Wintner [136] proved the sufficiency, and Strassen [250] the necessity. The theorem exhibits the extreme limit points. We shall also prove an extension, due to de Acosta [2], to the effect that the interval whose endpoints are the extreme limit points is a cluster set, meaning that, almost surely, the set of limit points is equal to the interval whose endpoints are the extreme limit points. During the course of the proof we find results for subsequences from [109] and [257], respectively, where it is shown, for example, that the cluster set shrinks for rapidly increasing subsequences. The “Some Additional Results and Remarks” section contains a proof of how one can derive the general law from the result for normal random variables via the Berry–Esseen theorem (under the assumption of an additional third moment), more on rates, and additional examples and complements.
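As a quick illustration of Khintchine’s result, the following minimal simulation sketch (not from the text; the sample size, seed, and variable names are arbitrary choices) generates fair coin flips and compares $N_n - n/2$ with the normalizer $\sqrt{\tfrac{1}{2}\, n \log\log n}$:

```python
import numpy as np

# Sketch: simulate the binary-digit LIL (names and constants are illustrative).
# N_n = number of ones among the first n fair coin flips;
# Khintchine's theorem: limsup (N_n - n/2) / sqrt(0.5 * n * log log n) = 1 a.s.

rng = np.random.default_rng(0)
n_max = 10**6
flips = rng.integers(0, 2, size=n_max)    # i.i.d. Bernoulli(1/2) digits
N = np.cumsum(flips)                      # N_n for n = 1, ..., n_max
n = np.arange(1, n_max + 1)

valid = n >= 3                            # log log n is positive only for n >= 3
ratio = (N[valid] - n[valid] / 2) / np.sqrt(0.5 * n[valid] * np.log(np.log(n[valid])))

print("max of normalized fluctuation:", ratio.max())
print("min of normalized fluctuation:", ratio.min())
```

For moderate $n$ the normalized fluctuations typically stay well inside $(-1, 1)$; the limsup value $+1$ is an asymptotic statement and is approached only very slowly.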

1 The Kolmogorov and Hartman–Wintner LILs

In a seminal paper, Kolmogorov [168] proved the following result for independent, not necessarily identically distributed, random variables.

Theorem 1.1 Suppose that $X_1, X_2, \ldots$ are independent random variables with mean $0$ and finite variances $\sigma_k^2$, $k \ge 1$, set $S_n = \sum_{k=1}^{n} X_k$ and $s_n^2 = \sum_{k=1}^{n} \sigma_k^2$, $n \ge 1$. If

$$|X_n| \le o\left(\frac{s_n}{\sqrt{\log\log s_n}}\right) \quad \text{for all } n, \tag{1.1}$$

then

$$\limsup_{n\to\infty} \frac{S_n}{\sqrt{2 s_n^2 \log\log s_n}} = 1 \quad \text{a.s.}$$
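For orientation, a worked special case (an illustration under stated assumptions, not a statement from the text): suppose the $X_k$ are i.i.d. and bounded, with mean $0$ and variance $\sigma^2 > 0$. Then $s_n^2 = n\sigma^2$, and since $s_n/\sqrt{\log\log s_n} \to \infty$, condition (1.1) is satisfied, so the theorem yields

$$\limsup_{n\to\infty} \frac{S_n}{\sqrt{2\sigma^2 n \log\log n}} = 1 \quad \text{a.s.},$$

where we used that $\log\log s_n = \log\log(\sigma\sqrt{n}) \sim \log\log n$, so that replacing one normalizer by the other does not affect the limit. This is the normalization appearing in the Hartman–Wintner LIL for bounded summands.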