ISBN-13: 9781119422938 / English / Hardcover / 2018 / 320 pp.
Written by the leading expert in the field, this text reviews the major new developments in envelope models and methods. An Introduction to Envelopes provides an overview of the theory and methods of envelopes, a class of procedures for increasing efficiency in multivariate analyses without altering traditional objectives.
Table of Contents

Preface xv
Notation and Definitions xix
1 Response Envelopes 1
1.1 The Multivariate Linear Model 2
1.1.1 Partitioned Models and Added Variable Plots 5
1.1.2 Alternative Model Forms 6
1.2 Envelope Model for Response Reduction 6
1.3 Illustrations 10
1.3.1 A Schematic Example 10
1.3.2 Compound Symmetry 13
1.3.3 Wheat Protein: Introductory Illustration 13
1.3.4 Cattle Weights: Initial Fit 14
1.4 More on the Envelope Model 19
1.4.1 Relationship with Sufficiency 19
1.4.2 Parameter Count 19
1.4.3 Potential Gains 20
1.5 Maximum Likelihood Estimation 21
1.5.1 Derivation 21
1.5.2 Cattle Weights: Variation of the X-Variant Parts of Y 23
1.5.3 Insights into Ê_Σ(B) 24
1.5.4 Scaling the Responses 25
1.6 Asymptotic Distributions 25
1.7 Fitted Values and Predictions 28
1.8 Testing the Responses 29
1.8.1 Test Development 29
1.8.2 Testing Individual Responses 32
1.8.3 Testing Containment Only 34
1.9 Nonnormal Errors 34
1.10 Selecting the Envelope Dimension, u 36
1.10.1 Selection Methods 36
1.10.1.1 Likelihood Ratio Testing 36
1.10.1.2 Information Criteria 37
1.10.1.3 Cross-validation 37
1.10.2 Inferring About rank(β) 38
1.10.3 Asymptotic Considerations 38
1.10.4 Overestimation Versus Underestimation of u 41
1.10.5 Cattle Weights: Influence of u 43
1.11 Bootstrap and Uncertainty in the Envelope Dimension 45
1.11.1 Bootstrap for Envelope Models 45
1.11.2 Wheat Protein: Bootstrap and Asymptotic Standard Errors, u Fixed 46
1.11.3 Cattle Weights: Bootstrapping u 47
1.11.4 Bootstrap Smoothing 48
1.11.5 Cattle Data: Bootstrap Smoothing 49
2 Illustrative Analyses Using Response Envelopes 51
2.1 Wheat Protein: Full Data 51
2.2 Berkeley Guidance Study 51
2.3 Banknotes 54
2.4 Egyptian Skulls 55
2.5 Australian Institute of Sport: Response Envelopes 58
2.6 Air Pollution 59
2.7 Multivariate Bioassay 63
2.8 Brain Volumes 65
2.9 Reducing Lead Levels in Children 67
3 Partial Response Envelopes 69
3.1 Partial Envelope Model 69
3.2 Estimation 71
3.2.1 Asymptotic Distribution of … 72
3.2.2 Selecting u1 73
3.3 Illustrations 74
3.3.1 Cattle Weight: Incorporating Basal Weight 74
3.3.2 Men's Urine 74
3.4 Partial Envelopes for Prediction 77
3.4.1 Rationale 77
3.4.2 Pulp Fibers: Partial Envelopes and Prediction 78
3.5 Reducing Part of the Response 79
4 Predictor Envelopes 81
4.1 Model Formulations 81
4.1.1 Linear Predictor Reduction 81
4.1.1.1 Predictor Envelope Model 83
4.1.1.2 Expository Example 83
4.1.2 Latent Variable Formulation of Partial Least Squares Regression 84
4.1.3 Potential Advantages 86
4.2 SIMPLS 88
4.2.1 SIMPLS Algorithm 88
4.2.2 SIMPLS When n < p
4.2.2.1 Behavior of the SIMPLS Algorithm 90
4.2.2.2 Asymptotic Properties of SIMPLS 91
4.3 Likelihood-Based Predictor Envelopes 94
4.3.1 Estimation 95
4.3.2 Comparisons with SIMPLS and Principal Component Regression 97
4.3.2.1 Principal Component Regression 98
4.3.2.2 SIMPLS 98
4.3.3 Asymptotic Properties 98
4.3.4 Fitted Values and Prediction 100
4.3.5 Choice of Dimension 101
4.3.6 Relevant Components 101
4.4 Illustrations 102
4.4.1 Expository Example, Continued 102
4.4.2 Australian Institute of Sport: Predictor Envelopes 103
4.4.3 Wheat Protein: Predicting Protein Content 105
4.4.4 Mussels' Muscles: Predictor Envelopes 106
4.4.5 Meat Properties 109
4.5 Simultaneous Predictor-Response Envelopes 109
4.5.1 Model Formulation 109
4.5.2 Potential Gain 110
4.5.3 Estimation 113
5 Enveloping Multivariate Means 117
5.1 Enveloping a Single Mean 117
5.1.1 Envelope Structure 117
5.1.2 Envelope Model 119
5.1.3 Estimation 120
5.1.4 Minneapolis Schools 122
5.1.4.2 Four Untransformed Responses 124
5.1.5 Functional Data 126
5.2 Enveloping Multiple Means with Heteroscedastic Errors 126
5.2.1 Heteroscedastic Envelopes 126
5.2.2 Estimation 128
5.2.3 Cattle Weights: Heteroscedastic Envelope Fit 129
5.3 Extension to Heteroscedastic Regressions 130
6 Envelope Algorithms 133
6.1 Likelihood-Based Envelope Estimation 133
6.2 Starting Values 135
6.2.1 Choosing the Starting Value from the Eigenvectors of M 135
6.2.2 Choosing the Starting Value from the Eigenvectors of M + U 137
6.2.3 Summary 138
6.3 A Non-Grassmann Algorithm for Estimating E_M(V) 139
6.4 Sequential Likelihood-Based Envelope Estimation 141
6.4.1 The 1D Algorithm 141
6.4.2 Envelope Component Screening 142
6.4.2.1 ECS Algorithm 143
6.4.2.2 Alternative ECS Algorithm 144
6.5 Sequential Moment-Based Envelope Estimation 145
6.5.1 Basic Algorithm 145
6.5.2 Krylov Matrices and dim(V) = 1 147
6.5.3 Variations on the Basic Algorithm 147
7 Envelope Extensions 149
7.1 Envelopes for Vector-Valued Parameters 149
7.1.1 Illustrations 151
7.1.2 Estimation Based on a Complete Likelihood 154
7.1.2.1 Likelihood Construction 154
7.1.2.2 Aster Models 156
7.2 Envelopes for Matrix-Valued Parameters 157
7.3 Envelopes for Matrix-Valued Responses 160
7.3.1 Initial Modeling 161
7.3.2 Models with Kronecker Structure 163
7.3.3 Envelope Models with Kronecker Structure 164
7.4 Spatial Envelopes 166
7.5 Sparse Response Envelopes 168
7.5.1 Sparse Response Envelopes when r << n 168
7.5.2 Cattle Weights and Brain Volumes: Sparse Fits 169
7.5.3 Sparse Envelopes when r > n 170
7.6 Bayesian Response Envelopes 171
8 Inner and Scaled Envelopes 173
8.1 Inner Envelopes 173
8.1.1 Definition and Properties of Inner Envelopes 174
8.1.2 Inner Response Envelopes 175
8.1.3 Maximum Likelihood Estimators 176
8.1.4 Race Times: Inner Envelopes 179
8.2 Scaled Response Envelopes 182
8.2.1 Scaled Response Model 183
8.2.2 Estimation 184
8.2.3 Race Times: Scaled Response Envelopes 185
8.3 Scaled Predictor Envelopes 186
8.3.1 Scaled Predictor Model 187
8.3.2 Estimation 188
8.3.3 Scaled SIMPLS Algorithm 189
9 Connections and Adaptations 191
9.1 Canonical Correlations 191
9.1.1 Construction of Canonical Variates and Correlations 191
9.1.2 Derivation of Canonical Variates 193
9.1.3 Connection to Envelopes 194
9.2 Reduced-Rank Regression 195
9.2.1 Reduced-Rank Model and Estimation 195
9.2.2 Contrasts with Envelopes 196
9.2.3 Reduced-Rank Response Envelopes 197
9.2.4 Reduced-Rank Predictor Envelopes 199
9.3 Supervised Singular Value Decomposition 199
9.4 Sufficient Dimension Reduction 202
9.5 Sliced Inverse Regression 204
9.5.1 SIR Methodology 204
9.5.2 Mussels' Muscles: Sliced Inverse Regression 205
9.5.3 The "Envelope Method" 206
9.5.4 Envelopes and SIR 207
9.6 Dimension Reduction for the Conditional Mean 207
9.6.1 Estimating One Vector in S_E(Y|X) 208
9.6.2 Estimating S_E(Y|X) 209
9.7 Functional Envelopes for SDR 211
9.7.1 Functional SDR 211
9.7.2 Functional Predictor Envelopes 211
9.8 Comparing Covariance Matrices 212
9.8.1 SDR for Covariance Matrices 213
9.8.2 Connections with Envelopes 215
9.8.3 Illustrations 216
9.8.4 SDR for Means and Covariance Matrices 217
9.9 Principal Components 217
9.9.1 Introduction 217
9.9.2 Random Latent Variables 219
9.9.2.1 Envelopes 220
9.9.2.2 Envelopes with Isotropic Intrinsic and Extrinsic Variation 222
9.9.2.3 Envelopes with Isotropic Intrinsic Variation 223
9.9.2.4 Selection of the Dimension u 225
9.9.3 Fixed Latent Variables and Isotropic Errors 225
9.9.4 Numerical Illustrations 226
9.10 Principal Fitted Components 229
9.10.1 Isotropic Errors, Σ_X|Y = σ²I_p 230
9.10.2 Anisotropic Errors, Σ_X|Y > 0 231
9.10.3 Nonnormal Errors and the Choice of f 232
9.10.3.1 Graphical Choices 232
9.10.3.2 Basis Functions 232
9.10.3.3 Categorical Response 232
9.10.3.4 Sliced Inverse Regression 233
9.10.4 High-Dimensional PFC 233
Appendix A Envelope Algebra 235
A.1 Invariant and Reducing Subspaces 235
A.2 M-Envelopes 240
A.3 Relationships Between Envelopes 241
A.3.1 Invariance and Equivariance 241
A.3.2 Direct Sums of Envelopes 244
A.3.3 Coordinate Reduction 244
A.4 Kronecker Products, vec and vech 246
A.5 Commutation, Expansion, and Contraction Matrices 248
A.6 Derivatives 249
A.6.1 Derivatives for η, Ω, and Ω₀ 249
A.6.2 Derivatives with Respect to Γ 250
A.6.3 Derivatives of Grassmann Objective Functions 251
A.7 Miscellaneous Results 252
A.8 Matrix Normal Distribution 255
A.9 Literature Notes 256
Appendix B Proofs for Envelope Algorithms 257
B.1 The 1D Algorithm 257
B.2 Sequential Moment-Based Algorithm 262
B.2.1 First Direction Vector w1 263
B.2.2 Second Direction Vector w2 263
B.2.3 (q + 1)st Direction Vector w_{q+1}, q < …
B.2.4 Termination 265
Appendix C Grassmann Manifold Optimization 267
C.1 Gradient Algorithm 268
C.2 Construction of B 269
C.3 Construction of exp{δA(B)} 271
C.4 Starting and Stopping 272
Bibliography 273
Author Index 283
Subject Index 287
R. DENNIS COOK, PHD, is Full Professor in the School of Statistics at the University of Minnesota. He has served as Director of the School of Statistics, Chair of the Department of Applied Statistics, and Director of the Statistical Center, all at the University of Minnesota. He is a Fellow of the American Statistical Association and of the Institute of Mathematical Statistics. His research areas include dimension reduction, linear and nonlinear regression, experimental design, statistical diagnostics, statistical graphics, and population genetics.