Inequalities
Inequalities play an important role in probability theory, because much work concerns the estimation of certain probabilities by others, the estimation of moments of sums by sums of moments, and so on.
In this chapter we have collected a number of inequalities of the following kind:

- tail probabilities are estimated by moments;
- moments of sums are estimated by sums of moments and vice versa;
- the expected value of the product of two random variables is estimated by a suitable product of higher-order moments;
- moments of low order are estimated by moments of a higher order;
- a moment inequality for convex functions of random variables is provided;
- relations between random variables and their symmetrized versions are given;
- the probability that a maximal partial sum of random variables exceeds some given level is related to the probability that the last partial sum does so.
1 Tail Probabilities Estimated Via Moments

We begin with a useful and elementary inequality.

Lemma 1.1 Suppose that g is a non-negative, non-decreasing function such that E g(|X|) < ∞, and let x > 0. Then,
\[
P(|X| > x) \le \frac{E\,g(|X|)}{g(x)}.
\]

Proof We have
\[
E\,g(|X|) \ge E\,g(|X|)\,I\{|X| > x\} \ge g(x)\,E\,I\{|X| > x\} = g(x)\,P(|X| > x).
\]
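As a concrete illustration (added here; not part of the original text), take g(x) = x² and let X be standard normal. The lemma then gives
\[
P(|X| > 2) \le \frac{E\,X^2}{2^2} = \frac{1}{4},
\]
while the true value is 2(1 − Φ(2)) ≈ 0.0455, so the bound holds but is far from sharp; choosing g well for the problem at hand is what makes the lemma powerful.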
Specializing g yields the following famous named inequality.
A. Gut, Probability: A Graduate Course, Springer Texts in Statistics, DOI: 10.1007/978-1-4614-4708-5_3, © Springer Science+Business Media New York 2013
Theorem 1.1 (Markov’s inequality) Suppose that E|X|^r < ∞ for some r > 0, and let x > 0. Then,
\[
P(|X| > x) \le \frac{E|X|^r}{x^r}.
\]
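As a numerical sanity check (my illustration, not from the text), the following sketch compares the empirical tail probability of an Exp(1) random variable with the Markov bound E|X|^r / x^r for r = 1 and r = 2; the distribution, sample size, and level x are arbitrary choices.

```python
import random

random.seed(1)
n = 100_000
# X ~ Exp(1), so E X = 1 and E X^2 = 2
xs = [random.expovariate(1.0) for _ in range(n)]
x = 3.0

tail = sum(1 for v in xs if v > x) / n            # empirical P(X > x), about e^{-3} ~ 0.05
markov_r1 = (sum(xs) / n) / x                     # empirical E X / x, about 1/3
markov_r2 = (sum(v * v for v in xs) / n) / x**2   # empirical E X^2 / x^2, about 2/9
```

Note that here the r = 2 bound is smaller than the r = 1 bound; which r gives the sharper bound depends on the distribution and on x.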
Another useful case is the exponential function applied to bounded random variables.

Theorem 1.2 (i) Suppose that P(|X| ≤ b) = 1 for some b > 0, that E X = 0, and set Var X = σ². Then, for 0 < t < b⁻¹ and x > 0,
\[
P(X > x) \le e^{-tx + t^2\sigma^2},
\]
\[
P(|X| > x) \le 2e^{-tx + t^2\sigma^2}.
\]
(ii) Let X₁, X₂, ..., Xₙ be independent random variables with mean 0, suppose that P(|X_k| ≤ b) = 1 for all k, and set σ_k² = Var X_k. Then, for 0 < t < b⁻¹ and x > 0,
\[
P(|S_n| > x) \le 2\exp\Big\{-tx + t^2 \sum_{k=1}^{n} \sigma_k^2\Big\}.
\]
(iii) If, in addition, X₁, X₂, ..., Xₙ are identically distributed, then
\[
P(|S_n| > x) \le 2\exp\{-tx + nt^2\sigma_1^2\}.
\]

Proof (i): Applying Lemma 1.1 with g(x) = e^{tx}, for 0 ≤ t ≤ b⁻¹, and formula (A.A.1) yields
\[
P(X > x) \le \frac{E\,e^{tX}}{e^{tx}} \le e^{-tx}\big(1 + E\,tX + E\,(tX)^2\big)
= e^{-tx}(1 + t^2\sigma^2) \le e^{-tx}e^{t^2\sigma^2},
\]
which proves the first assertion. The other one follows by considering the negative tail and addition. Statements (ii) and (iii) are then immediate.

The following inequality for bounded random variables, which we state without proof, is due to Hoeffding, [144], Theorem 2.

Theorem 1.3 (Hoeffding’s inequality) Let X₁, X₂, ..., Xₙ be independent random variables, such that P(a_k ≤ X_k ≤ b_k) = 1 for k = 1, 2, ..., n, and let S_n, n ≥ 1, denote the partial sums. Then
\[
P(S_n - E\,S_n > x) \le \exp\Big\{-\frac{2x^2}{\sum_{k=1}^{n}(b_k - a_k)^2}\Big\},
\]
\[
P(|S_n - E\,S_n| > x) \le 2\exp\Big\{-\frac{2x^2}{\sum_{k=1}^{n}(b_k - a_k)^2}\Big\}.
\]
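A small simulation (an added illustration, not part of the text) can check the two-sided Hoeffding bound for sums of independent Uniform(0, 1) variables, where a_k = 0 and b_k = 1 for every k, so the denominator in the exponent is simply n. The sample sizes and the level x are arbitrary choices.

```python
import math
import random

random.seed(2)
n, trials, x = 50, 20_000, 5.0
# S_n is a sum of n independent Uniform(0,1) variables; E S_n = n/2,
# and 0 <= X_k <= 1 gives sum_k (b_k - a_k)^2 = n.
exceed = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    if abs(s - n / 2) > x:
        exceed += 1

empirical = exceed / trials                # empirical P(|S_n - E S_n| > x)
bound = 2 * math.exp(-2 * x**2 / n)        # two-sided Hoeffding bound, 2e^{-1} here
```

With these parameters the empirical tail probability is roughly 0.01–0.02, comfortably below the bound 2e⁻¹ ≈ 0.74; Hoeffding's strength is not tightness at fixed n but the exponential decay in x².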