Theory of Multivariate Statistics
Our object in writing this book is to present the main results of the modern theory of multivariate statistics to an audience of advanced students who would appreciate a concise and mathematically rigorous treatment of that material. It is intended for us
In memory of my father, Arthur, to my mother, Annette, and to Kahina.

To Rebecca and Deena.
Contents

Preface
List of Tables
List of Figures
1 Linear algebra
2 Random vectors
3 Gamma, Dirichlet, and F distributions
4 Invariance
5 Multivariate normal
6 Multivariate sampling
7 Wishart distributions
8 Tests on mean and variance
9 Multivariate regression
10 Principal components
11 Canonical correlations
12 Asymptotic expansions
13 Robustness
14 Bootstrap confidence regions and tests
A Inversion formulas
B Multivariate cumulants
C S-plus functions
References
Author Index
Subject Index
1 Linear algebra
1.1 Introduction

Multivariate analysis deals with observations on many, usually correlated, variables made on the units of a selected random sample. These units can be of any nature: persons, cars, cities, etc. The observations are gathered as vectors; to each selected unit corresponds a vector of observed variables. An understanding of vectors, matrices, and, more generally, linear algebra is thus fundamental to the study of multivariate analysis. Chapter 1 presents our selection of several important results of linear algebra; they will facilitate a great many of the concepts in multivariate analysis. A useful reference for linear algebra is Strang (1980).
1.2 Vectors and matrices

To express the dependence of a vector $x \in \mathbb{R}^n$ on its coordinates, we may write any of
\[
x = (x_i,\ i = 1, \dots, n) = (x_i) = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}.
\]
In this manner, $x$ is envisaged as a "column" vector. The transpose of $x$ is the "row" vector $x' \in \mathbb{R}_n$,
\[
x' = (x_i)' = (x_1, \dots, x_n).
\]
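For readers who wish to experiment numerically, here is a minimal sketch of the column/row distinction in S-Plus (also valid R) syntax; the particular vector is an arbitrary example, not taken from the text:

    x <- matrix(c(1, 2, 3), ncol = 1)   # x in R^3, stored as a 3 x 1 column
    t(x)                                # its transpose x', the 1 x 3 row (x_1, x_2, x_3)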
An $m \times n$ matrix $A \in \mathbb{R}^m_n$ may also be denoted in various ways:
\[
A = (a_{ij},\ i = 1, \dots, m,\ j = 1, \dots, n) = (a_{ij}) =
\begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix}.
\]
The transpose of $A$ is the $n \times m$ matrix $A' \in \mathbb{R}^n_m$:
\[
A' = (a_{ij})' = (a_{ji}) =
\begin{pmatrix} a_{11} & \cdots & a_{m1} \\ \vdots & \ddots & \vdots \\ a_{1n} & \cdots & a_{mn} \end{pmatrix}.
\]
A square matrix $S \in \mathbb{R}^n_n$ satisfying $S' = S$ is termed symmetric. The product of the $m \times n$ matrix $A$ by the $n \times p$ matrix $B$ is the $m \times p$ matrix $C = AB$ for which
\[
c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}.
\]
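These operations are easy to try out directly; the following is a sketch in S-Plus/R syntax, with arbitrary example matrices:

    A <- matrix(1:6, nrow = 2, ncol = 3)    # a 2 x 3 matrix A, filled column by column
    t(A)                                    # the 3 x 2 transpose A'
    S <- A %*% t(A)                         # AA' is 2 x 2 and symmetric
    all(S == t(S))                          # TRUE: S' = S
    B <- matrix(1:12, nrow = 3, ncol = 4)   # a 3 x 4 matrix B
    A %*% B                                 # the 2 x 4 product C = AB, c_ij = sum_k a_ik b_kj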
The trace of $A \in \mathbb{R}^n_n$ is $\operatorname{tr} A = \sum_{i=1}^{n} a_{ii}$, and one verifies that for $A \in \mathbb{R}^m_n$ and $B \in \mathbb{R}^n_m$, $\operatorname{tr} AB = \operatorname{tr} BA$. In particular, row vectors and column vectors are themselves matrices, so for $x, y \in \mathbb{R}^n$, $x'y = \operatorname{tr} x'y = \operatorname{tr} yx'$.
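The identity $\operatorname{tr} AB = \operatorname{tr} BA$ is simple to check numerically; a small sketch in S-Plus/R syntax, using randomly generated matrices purely for illustration:

    tr <- function(M) sum(diag(M))                   # trace: sum of the diagonal entries
    A <- matrix(rnorm(6), nrow = 2, ncol = 3)        # A in R^2_3
    B <- matrix(rnorm(6), nrow = 3, ncol = 2)        # B in R^3_2
    all.equal(tr(A %*% B), tr(B %*% A))              # TRUE: tr AB = tr BA
    x <- matrix(rnorm(3), ncol = 1)                  # column vectors x, y in R^3
    y <- matrix(rnorm(3), ncol = 1)
    all.equal((t(x) %*% y)[1, 1], tr(y %*% t(x)))    # TRUE: x'y = tr yx'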