Description of Distributions I: Real x
In the present chapter, some important distributions are defined and described. The event variable x is real, as opposed to discrete; that is, the distributions are probability densities. In Sect. 4.1, we describe Gaussian models. The exponential distribution is described in Sect. 4.2. The Cauchy and Student's t-distributions are defined in Sect. 4.3. Section A.4 gives the solutions of the problems suggested to the reader.
4.1 Gaussian Distributions

The Gaussian¹ distribution, which seems to have been discovered by A. de Moivre,² is the most frequently used statistical model. Its simplest version is treated in Sect. 4.1.1; the multidimensional Gaussian is introduced in Sect. 4.1.2 and the family of chi-squared distributions in Sect. 4.1.3.
4.1.1 The Simple Gaussian

The Gaussian distribution of a single event variable x has two parameters, the central value ξ and the variance σ². It is given by

q(x \mid \xi, \sigma) = (2\pi\sigma^2)^{-1/2} \exp\!\left[-\frac{(x-\xi)^2}{2\sigma^2}\right] .    (4.1)
¹ Carl Friedrich Gauß, 1777–1855, German mathematician, astronomer, and physicist. He contributed to number theory, celestial and general mechanics, geodesy, differential geometry, magnetism, optics, the theory of complex functions, and statistics.
² Abraham de Moivre, 1667–1754, French mathematician, emigrated to England after the revocation (1685) of the tolerance edict of Nantes.
The distribution (4.1) is represented in Fig. 2.1. The event x and the parameter ξ are defined on the whole real axis. The normalising factor is obtained from the integral

Z(a) = \int_{-\infty}^{\infty} dx \, \exp\!\left[-a(x-\xi)^2\right] = \sqrt{\pi/a} ;    (4.2)

compare Problem A.3.3.
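A quick numerical cross-check of (4.2) is sketched below; the values of a and ξ are arbitrary choices, and the grid is truncated where the integrand has become negligible.

```python
import numpy as np

# Numerical check of Eq. (4.2): Z(a) = ∫ dx exp[-a (x - ξ)^2] = sqrt(pi / a).
# The values of a and xi are arbitrary choices for this illustration.
a, xi = 0.7, 1.5

x = np.linspace(xi - 20.0, xi + 20.0, 400_001)        # wide grid; the truncated tails contribute nothing noticeable
dx = x[1] - x[0]
Z_numeric = np.sum(np.exp(-a * (x - xi) ** 2)) * dx   # plain Riemann sum
Z_exact = np.sqrt(np.pi / a)

print(Z_numeric, Z_exact)                             # both ≈ 2.118
```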
The interested reader should convince himself or herself that the mean value x̄ is equal to ξ.

The variance of a random variable x is the mean square deviation from x̄; that is,

\mathrm{var}(x) = \overline{(x - \bar{x})^2} .    (4.3)

Here, the overlines denote expectation values with respect to the distribution of x. The square root of the variance is called the standard deviation or root mean square deviation. It quantifies the fluctuations of x about its mean value. A transformation of the event variable, y = T x, however, generally changes the value of the standard deviation. The interested reader should show that the variance can also be expressed as

\mathrm{var}(x) = \overline{x^2} - \bar{x}^2 .    (4.4)
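A small Monte Carlo sketch of the equivalence of (4.3) and (4.4); ξ, σ, the sample size, and the random seed are arbitrary choices for the illustration.

```python
import numpy as np

# Compare the two expressions (4.3) and (4.4) for the variance on a sample
# drawn from the Gaussian (4.1).
rng = np.random.default_rng(42)
xi, sigma = 2.0, 0.5
x = rng.normal(loc=xi, scale=sigma, size=1_000_000)

var_43 = np.mean((x - x.mean()) ** 2)      # mean square deviation from the mean, Eq. (4.3)
var_44 = np.mean(x ** 2) - x.mean() ** 2   # mean of squares minus squared mean, Eq. (4.4)

print(var_43, var_44)                      # agree up to rounding; both close to sigma**2 = 0.25
```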
The reader is asked to prove that, for the Gaussian distribution (4.1), one has

\mathrm{var}(x) = \sigma^2 , \qquad \overline{(x-\xi)^4} = 3\sigma^4 .    (4.5)

To calculate the moments \overline{(x-\xi)^{2n}}, it is helpful to consider the derivatives of the function Z(a) of (4.2).
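One way to carry this out, sketched briefly: with the substitution a = 1/(2σ²), the integrand of (4.2) becomes the unnormalised Gaussian (4.1), and Z(a) = \sqrt{\pi}\, a^{-1/2}, so that

\overline{(x-\xi)^2} = -\frac{1}{Z}\frac{dZ}{da} = \frac{1}{2a} = \sigma^2 ,
\qquad
\overline{(x-\xi)^4} = \frac{1}{Z}\frac{d^2 Z}{da^2} = \frac{3}{4a^2} = 3\sigma^4 ,

which reproduces (4.5).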
The prime importance of the Gaussian distribution is a consequence of the central limit theorem. This theorem can be stated as follows. Let there be N independent random variables x_1, …, x_N that all follow the same distribution w(x_k). Then the distribution W(z) of their average z = N^{-1}\sum_{k=1}^{N} x_k tends to a Gaussian for N → ∞, if the mean value \overline{x_k} and the variance \overline{x_k^2} - \overline{x_k}^2 exist. The overline notation is defined in Eq. (2.21).
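A brief numerical illustration of the theorem is sketched below; the exponential parent distribution, the sample sizes, and the number of repetitions are arbitrary choices. The skewness and excess kurtosis of the average, both zero for a Gaussian, shrink as N grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Average N draws from a decidedly non-Gaussian parent distribution (an
# exponential, with skewness 2 and excess kurtosis 6) and watch the
# distribution of the average approach a Gaussian as N grows.
for N in (1, 10, 100):
    z = rng.exponential(scale=1.0, size=(200_000, N)).mean(axis=1)

    u = (z - z.mean()) / z.std()            # standardised average
    skewness = np.mean(u**3)
    excess_kurtosis = np.mean(u**4) - 3.0
    print(f"N={N:4d}  skewness={skewness:+.3f}  excess kurtosis={excess_kurtosis:+.3f}")
```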