Fisher information, sufficiency, and ancillarity: some clarifications
Nitis Mukhopadhyay · Swarnali Banerjee
Received: 25 July 2012 / Accepted: 11 February 2013 / Published online: 28 May 2013 © Sapienza Università di Roma 2013
Abstract  Misconceptions are many when it comes to Fisher information, sufficiency, and ancillarity, especially among beginners. Many believe that $I_{T_1}(\theta) + I_{T_2}(\theta)$ should equal $I_{T_1+T_2}(\theta)$ for all $\theta$. We exhibit precise scenarios where $I_{T_1+T_2}(\theta)$ equals $I_{T_1}(\theta) + I_{T_2}(\theta)$ for all $\theta$. Then, we clarify the process of verifying ancillarity of a statistic with concrete examples.

Keywords  Ancillarity · Dependent observations · Independent observations · Information · Sufficiency

Mathematics Subject Classification  62B05 · 62B10 · 62-01
1 Introduction

The intriguing concepts of sufficiency and ancillarity of statistics are intertwined with the notion of information, more commonly referred to as Fisher information. We begin with a brief introduction to these notions. Suppose that our data consist of $\mathbf{X} = (X_1, \ldots, X_n)$ having a likelihood function $L(\mathbf{x}; \theta)$. We assume that $\mathbf{x}$ and the unknown real-valued parameter $\theta$ belong to appropriate spaces $\mathcal{X}$ and $\Theta$, respectively. For simplicity, let us suppose that $\theta$ is a single parameter. Now, a statistic $T \equiv T(\mathbf{X})$ is a function of $\mathbf{X}$ alone, with $T$'s p.m.f. or p.d.f. denoted by $g(t; \theta)$.
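For a concrete illustration of this setup (our example, not drawn from the paper): if $X_1, \ldots, X_n$ are i.i.d. Bernoulli($\theta$) with $0 < \theta < 1$, then the statistic $T(\mathbf{X}) = \sum_{i=1}^{n} X_i$ has the Binomial$(n, \theta)$ p.m.f.
$$ g(t; \theta) = \binom{n}{t}\, \theta^{t} (1 - \theta)^{n - t}, \quad t = 0, 1, \ldots, n. $$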
N. Mukhopadhyay (B) · S. Banerjee
Department of Statistics, University of Connecticut, Storrs, CT 06269-4120, USA
e-mail: [email protected]
S. Banerjee
e-mail: [email protected]
Fisher [4–6] defined the information (Fisher information) content in $\mathbf{X}$ as follows:
$$ I_{\mathbf{X}}(\theta) = E_\theta\left[\left(\frac{\partial \ln L(\mathbf{X}; \theta)}{\partial \theta}\right)^2\right], \qquad (1) $$
provided that customary regularity conditions hold, allowing one to interchange differentiation with respect to $\theta$ and integration with respect to $\mathbf{x}$.
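As a quick worked example of (1) (our illustration, not part of the paper's development): suppose $X_1, \ldots, X_n$ are i.i.d. $N(\theta, \sigma^2)$ with $\sigma^2$ known. Then $\ln L(\mathbf{X}; \theta) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(X_i - \theta)^2$, so that
$$ \frac{\partial \ln L(\mathbf{X}; \theta)}{\partial \theta} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(X_i - \theta) \quad \text{and hence} \quad I_{\mathbf{X}}(\theta) = \frac{1}{\sigma^4}\, \mathrm{Var}_\theta\left(\sum_{i=1}^{n} X_i\right) = \frac{n}{\sigma^2}, $$
since the score has mean zero and the $X_i$ are independent, each with variance $\sigma^2$.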
The Fisher information in a statistic $T$ is
$$ I_T(\theta) = E_\theta\left[\left(\frac{\partial \ln g(T; \theta)}{\partial \theta}\right)^2\right]. \qquad (2) $$
Now, $T(\mathbf{X})$ is sufficient for $\theta$ if and only if the conditional distribution of $\mathbf{X}$ given $T = t$ is free from $\theta$ for all possible $t$. Equivalently, $T(\mathbf{X})$ is sufficient for $\theta$ if and only if the Fisher information in $T(\mathbf{X})$ is equal to the Fisher information in $\mathbf{X}$ [4–6]. Another important notion is ancillarity. A statistic $T \equiv T(\mathbf{X})$ is ancillary for $\theta$ if and only if $g(t; \theta)$ does not involve $\theta$; that is, an ancillary statistic $T$ has no information about $\theta$. The interplay between sufficiency and ancillarity is vast, deep, and complex. Recent treatments include [7,8] and other important sources cited there. Recall that sufficiency of a statistic is often verified with the help of the Neyman factorization theorem [1,5,9,12] or the Lehmann–Scheffé theorems [10]. One may refer to a standard textbook, including [13, chapter 5], [11, chapter 6], or [3, chapter 6].
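As a standard textbook-style illustration of both notions (ours, added for concreteness): let $X_1, \ldots, X_n$ be i.i.d. $N(\theta, 1)$. The Neyman factorization
$$ L(\mathbf{x}; \theta) = (2\pi)^{-n/2} \exp\left\{-\tfrac{1}{2}\sum_{i=1}^{n}(x_i - \bar{x})^2\right\} \exp\left\{-\tfrac{n}{2}(\bar{x} - \theta)^2\right\} $$
shows that $\bar{X}$ is sufficient for $\theta$, whereas $T = X_1 - X_2$ is distributed as $N(0, 2)$ whatever the value of $\theta$, so $T$ is ancillary for $\theta$.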
While considering two statistics $T_1$ and $T_2$, there is a general scenario where one can claim that $I_{T_1+T_2}(\theta) = I_{T_1}(\theta) + I_{T_2}(\theta)$ for all $\theta$. We claim:
$$ I_{(T_1,T_2)}(\theta) = I_{T_1}(\theta) + I_{T_2}(\theta) \quad \text{when } T_1, T_2 \text{ are independent,} \qquad (3) $$
and additional