ISBN-13: 9781118644614 / English / Hardcover / 2019 / 384 pages
This book discusses current methods of estimation in linear models. In particular, the authors address the methodology of linear multiple regression models, which play an important role in almost every scientific investigation across fields such as economics, engineering, and biostatistics. The standard estimation method for regression parameters is the ordinary least squares (OLS) principle, in which the residual sum of squared errors is minimized. Applied statisticians are often not satisfied with OLS estimators when the design matrix is ill-conditioned, leading to multicollinearity and large variances that make prediction inaccurate.

This book details the ridge regression estimator, which was developed to combat the multicollinearity problem. Another estimator, the Liu estimator due to Liu Kejian, is also addressed, since it provides a competing resolution of the same problem. The ridge regression estimators are complicated nonlinear functions of the ridge parameter, whereas the Liu estimators are linear functions of the shrinkage parameter. With a focus on the ridge regression and Liu estimators, this book provides expanded coverage beyond the usual preliminary test and Stein-type estimators: the authors propose a class of composite estimators obtained by multiplying the OLS, restricted OLS, preliminary test OLS, and Stein-type OLS estimators by the "ridge factor" or "Liu factor." This research is a significant step toward the study of dominance properties as well as comparison with the LASSO estimator. In addition, research materials involving shrinkage and model selection for linear regression models are provided.

Topical coverage includes: preliminaries; linear regression models; multiple regression models; measurement error models; generalized linear models; and autoregressive Gaussian processes.
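For readers unfamiliar with the estimators named above, the standard textbook forms make the contrast between the ridge and Liu estimators concrete. The notation below (design matrix X, response y, ridge parameter k, Liu shrinkage parameter d) is introduced here for illustration and is not taken from the book itself:

\[
\hat{\beta}_{\text{OLS}} = (X^\top X)^{-1} X^\top y,
\qquad
\hat{\beta}_{\text{ridge}}(k) = (X^\top X + k I_p)^{-1} X^\top y, \quad k \ge 0,
\]
\[
\hat{\beta}_{\text{Liu}}(d) = (X^\top X + I_p)^{-1}\bigl(X^\top y + d\,\hat{\beta}_{\text{OLS}}\bigr), \quad 0 < d < 1.
\]

Because k enters the ridge estimator through the matrix inverse, the ridge estimator is a nonlinear function of k, whereas d enters the Liu estimator only through the additive term d times the OLS estimator, so the Liu estimator is linear in its shrinkage parameter, which is the distinction drawn in the description above.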