Clustering is a phenomenon commonly observed across social science research: students are clustered in classrooms, individuals in households, and companies within industrial sectors, to name but a few examples. This book presents an elementary and systematic introduction to modeling between-cluster variation, to interpreting the results, and to computational methods for estimation. The book addresses many important issues in the social sciences that are best described in terms of sources and patterns of variation, such as temporal, between-person, and geographical variation. By providing...
Categorical data analysis is a special area of generalized linear models, which has become the most important area of statistical applications in many disciplines, from medicine to the social sciences. This text presents the standard models, along with many newly developed ones, in a language that can be immediately applied in many modern statistical packages such as GLIM, GENSTAT, S-Plus, SAS, and LISP-STAT. The book is structured around the distinction between independent events occurring to different individuals, resulting in frequencies, and repeated events occurring to the same individuals,...
A well-designed experiment is an efficient method for learning about the physical world. However, since experiments in any setting cannot avoid random error, statistical methods are essential for their design and implementation, and for the analysis of results. In this book, the fundamentals of optimum experimental design theory are presented. The first part discusses the advantages of a statistical approach to the design of experiments and introduces the ideas of models, least squares fitting, and optimum experimental designs. The second part presents a more detailed discussion...
Over recent years, developments in statistical computing have freed statisticians from the burden of calculation and have made possible new methods of analysis that would previously have been too difficult or time-consuming. Until now, these developments have been primarily in numerical computation and graphical display, but equal steps forward are now being made in the area of symbolic computing: the use of computer languages and procedures to manipulate expressions. This allows researchers to compute an algebraic expression, rather than evaluate the expression numerically over a given...
Highly Structured Stochastic Systems (HSSS) is a modern strategy for building statistical models for challenging real-world problems, for computing with them, and for interpreting the resulting inference. The aim of this book is to make recent developments in HSSS accessible to a general statistical audience including graduate students and researchers.
With an abundance of helpful examples, this text expertly presents the essentials of measurement, regression, and calibration. The book develops the fundamentals and underlying theories of key techniques in a clear, step-by-step progression, starting with standard least squares prediction of a single variable and moving on to shrinkage techniques for multiple variables. Self-contained chapters discuss methods that have been specifically developed for spectroscopy, likelihood and Bayesian inference (which may be applied to a wide range of multivariate regression problems), and Bayesian...
The new edition of this important text has been completely revised and expanded to become the most up-to-date and thorough professional reference in this fast-moving and important area of biostatistics. Two new chapters have been added: one on fully parametric models for discrete repeated measures data, and one on statistical models for time-dependent predictors where there may be feedback between the predictor and response variables. It also retains the many useful features of the previous edition, such as design issues, exploratory methods of analysis, linear models for continuous data, and...
This book describes the use of smoothing techniques in statistics and includes both density estimation and nonparametric regression. Incorporating recent advances, it describes a variety of ways to apply these methods to practical problems. Although the emphasis is on using smoothing techniques to explore data graphically, the discussion also covers data analysis with nonparametric curves, as an extension of more standard parametric models. Intended as an introduction, with a focus on applications rather than on detailed theory, the book will be equally valuable for undergraduate and graduate...
This book is designed to introduce graduate students and researchers to the primary methods useful for approximating integrals. The emphasis is on those methods that have been found to be of practical use, focusing on approximating higher-dimensional integrals, with coverage of the lower-dimensional case as well. Included in the book are asymptotic techniques, multiple quadrature and quasi-random techniques, and a complete development of Monte Carlo algorithms. For the Monte Carlo section, importance sampling methods, variance reduction techniques, and the primary Markov chain Monte Carlo...