ISBN-13: 9781119092483 / English / Hardcover / 2016 / 552 pp.
Preface xi
About the Companion Website xv
1 A Review of Elementary Matrix Algebra 1
1.1 Introduction 1
1.2 Definitions and Notation 1
1.3 Matrix Addition and Multiplication 2
1.4 The Transpose 3
1.5 The Trace 4
1.6 The Determinant 5
1.7 The Inverse 9
1.8 Partitioned Matrices 12
1.9 The Rank of a Matrix 14
1.10 Orthogonal Matrices 15
1.11 Quadratic Forms 16
1.12 Complex Matrices 18
1.13 Random Vectors and Some Related Statistical Concepts 19
Problems 29
2 Vector Spaces 35
2.1 Introduction 35
2.2 Definitions 35
2.3 Linear Independence and Dependence 42
2.4 Matrix Rank and Linear Independence 45
2.5 Bases and Dimension 49
2.6 Orthonormal Bases and Projections 53
2.7 Projection Matrices 58
2.8 Linear Transformations and Systems of Linear Equations 65
2.9 The Intersection and Sum of Vector Spaces 73
2.10 Oblique Projections 76
2.11 Convex Sets 80
Problems 85
3 Eigenvalues and Eigenvectors 95
3.1 Introduction 95
3.2 Eigenvalues, Eigenvectors, and Eigenspaces 95
3.3 Some Basic Properties of Eigenvalues and Eigenvectors 99
3.4 Symmetric Matrices 106
3.5 Continuity of Eigenvalues and Eigenprojections 114
3.6 Extremal Properties of Eigenvalues 116
3.7 Additional Results Concerning Eigenvalues of Symmetric Matrices 123
3.8 Nonnegative Definite Matrices 129
3.9 Antieigenvalues and Antieigenvectors 141
Problems 144
4 Matrix Factorizations and Matrix Norms 155
4.1 Introduction 155
4.2 The Singular Value Decomposition 155
4.3 The Spectral Decomposition of a Symmetric Matrix 162
4.4 The Diagonalization of a Square Matrix 169
4.5 The Jordan Decomposition 173
4.6 The Schur Decomposition 175
4.7 The Simultaneous Diagonalization of Two Symmetric Matrices 178
4.8 Matrix Norms 184
Problems 191
5 Generalized Inverses 201
5.1 Introduction 201
5.2 The Moore–Penrose Generalized Inverse 202
5.3 Some Basic Properties of the Moore–Penrose Inverse 205
5.4 The Moore–Penrose Inverse of a Matrix Product 211
5.5 The Moore–Penrose Inverse of Partitioned Matrices 215
5.6 The Moore–Penrose Inverse of a Sum 219
5.7 The Continuity of the Moore–Penrose Inverse 222
5.8 Some Other Generalized Inverses 224
5.9 Computing Generalized Inverses 232
Problems 238
6 Systems of Linear Equations 247
6.1 Introduction 247
6.2 Consistency of a System of Equations 247
6.3 Solutions to a Consistent System of Equations 251
6.4 Homogeneous Systems of Equations 258
6.5 Least Squares Solutions to a System of Linear Equations 260
6.6 Least Squares Estimation for Less-than-Full-Rank Models 266
6.7 Systems of Linear Equations and the Singular Value Decomposition 271
6.8 Sparse Linear Systems of Equations 273
Problems 278
7 Partitioned Matrices 285
7.1 Introduction 285
7.2 The Inverse 285
7.3 The Determinant 288
7.4 Rank 296
7.5 Generalized Inverses 298
7.6 Eigenvalues 302
Problems 307
8 Special Matrices and Matrix Operations 315
8.1 Introduction 315
8.2 The Kronecker Product 315
8.3 The Direct Sum 323
8.4 The Vec Operator 323
8.5 The Hadamard Product 329
8.6 The Commutation Matrix 339
8.7 Some Other Matrices Associated with the Vec Operator 346
8.8 Nonnegative Matrices 351
8.9 Circulant and Toeplitz Matrices 363
8.10 Hadamard and Vandermonde Matrices 369
Problems 373
9 Matrix Derivatives and Related Topics 387
9.1 Introduction 387
9.2 Multivariable Differential Calculus 387
9.3 Vector and Matrix Functions 390
9.4 Some Useful Matrix Derivatives 396
9.5 Derivatives of Functions of Patterned Matrices 400
9.6 The Perturbation Method 402
9.7 Maxima and Minima 409
9.8 Convex and Concave Functions 413
9.9 The Method of Lagrange Multipliers 417
Problems 423
10 Inequalities 433
10.1 Introduction 433
10.2 Majorization 433
10.3 Cauchy–Schwarz Inequalities 444
10.4 Hölder's Inequality 446
10.5 Minkowski's Inequality 450
10.6 The Arithmetic–Geometric Mean Inequality 452
Problems 453
11 Some Special Topics Related to Quadratic Forms 457
11.1 Introduction 457
11.2 Some Results on Idempotent Matrices 457
11.3 Cochran's Theorem 462
11.4 Distribution of Quadratic Forms in Normal Variates 465
11.5 Independence of Quadratic Forms 471
11.6 Expected Values of Quadratic Forms 477
11.7 The Wishart Distribution 485
Problems 496
References 507
Index 513
James R. Schott, PhD, is Professor in the Department of Statistics at the University of Central Florida. He has published numerous journal articles in the area of multivariate analysis. Dr. Schott's research interests include multivariate analysis, analysis of covariance and correlation matrices, and dimensionality reduction techniques.
An up-to-date version of the complete, self-contained introduction to matrix analysis theory and practice
Providing accessible and in-depth coverage of the most common matrix methods now used in statistical applications, Matrix Analysis for Statistics, Third Edition features an easy-to-follow theorem/proof format. With smooth transitions between topics, the author carefully justifies, step by step, the matrix methods most often used in statistical applications, including eigenvalues and eigenvectors; the Moore–Penrose inverse; matrix differentiation; and the distribution of quadratic forms.
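As a small, hypothetical illustration (not taken from the book) of two of the topics just named, the sketch below uses NumPy to show that the Moore–Penrose inverse of a full-column-rank design matrix yields the same least squares solution as the normal equations; the data values are invented for the example.

```python
import numpy as np

# A hypothetical overdetermined system: 4 observations, 2 unknowns
# (intercept and slope). A has full column rank.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.8])

A_pinv = np.linalg.pinv(A)   # Moore–Penrose inverse, computed via the SVD
x = A_pinv @ b               # least squares solution x = A⁺ b

# The same solution obtained from the normal equations AᵀA x = Aᵀ b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
print(np.allclose(x, x_normal))  # True
```

When A is rank-deficient the normal equations no longer have a unique solution, but A⁺b still picks out the minimum-norm least squares solution, which is one reason the generalized inverse is developed at length in Chapters 5 and 6.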
An ideal introduction to matrix analysis theory and practice, Matrix Analysis for Statistics, Third Edition features:
New chapter or section coverage on inequalities, oblique projections, and antieigenvalues and antieigenvectors
Additional problems and practice exercises at the end of each chapter
Extensive examples that are familiar and easy to understand
Self-contained chapters for flexibility in topic choice
Applications of matrix methods in least squares regression and the analyses of mean vectors and covariance matrices
Matrix Analysis for Statistics, Third Edition is an ideal textbook for upper-undergraduate and graduate-level courses on matrix methods, multivariate analysis, and linear models. The book is also an excellent reference for research professionals in applied statistics.