Order fulfillment time: approx. 22 business days.
Free delivery!
This new edition continues to serve as a comprehensive guide to modern and classical methods of statistical computing. The book comprises four main parts spanning the field:
Optimization
Integration and Simulation
Bootstrapping
Density Estimation and Smoothing
Within these sections, each chapter includes a comprehensive introduction and step-by-step implementation summaries to accompany the explanations of key methods. The new edition includes updated coverage of existing topics as well as new topics such as adaptive MCMC and bootstrapping for correlated data. The book website now includes comprehensive R code for the entire book. There are extensive exercises, real examples, and helpful insights about how to use the methods in practice.
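As a rough illustration of the kind of method new to this edition, the short R sketch below (not taken from the book or its companion website; the AR(1) toy series, block length of 20, and 1,000 resamples are arbitrary choices) shows a basic moving-block bootstrap of a sample mean for correlated data, in the spirit of the dependent-data bootstrap methods covered in Chapter 9.

# Illustrative sketch only, not the book's companion code:
# moving-block bootstrap of the mean of an autocorrelated series.
set.seed(1)
x <- arima.sim(model = list(ar = 0.6), n = 200)  # toy AR(1) series
n <- length(x)
b <- 20                                          # block length (arbitrary choice)
n_blocks <- n - b + 1                            # number of overlapping blocks
B <- 1000                                        # number of bootstrap resamples
boot_means <- replicate(B, {
  starts <- sample(seq_len(n_blocks), ceiling(n / b), replace = TRUE)
  resample <- unlist(lapply(starts, function(s) x[s:(s + b - 1)]))[1:n]
  mean(resample)                                 # statistic for this resample
})
sd(boot_means)  # block-bootstrap estimate of the standard error of the mean

Resampling whole blocks rather than individual observations preserves the short-range dependence in the series, which is why naive resampling of single values tends to understate the variability for correlated data.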
1.2 Taylor’s Theorem and Mathematical Limit Theory 2
1.3 Statistical Notation and Probability Distributions 4
1.4 Likelihood Inference 9
1.5 Bayesian Inference 11
1.6 Statistical Limit Theory 13
1.7 Markov Chains 14
1.8 Computing 17
PART I OPTIMIZATION
2 OPTIMIZATION AND SOLVING NONLINEAR EQUATIONS 21
2.1 Univariate Problems 22
2.2 Multivariate Problems 34
Problems 54
3 COMBINATORIAL OPTIMIZATION 59
3.1 Hard Problems and NP–Completeness 59
3.2 Local Search 65
3.3 Simulated Annealing 68
3.4 Genetic Algorithms 75
3.5 Tabu Algorithms 85
Problems 92
4 EM OPTIMIZATION METHODS 97
4.1 Missing Data, Marginalization, and Notation 97
4.2 The EM Algorithm 98
4.3 EM Variants 111
Problems 121
PART II INTEGRATION AND SIMULATION
5 NUMERICAL INTEGRATION 129
5.1 Newton–Côtes Quadrature 129
5.2 Romberg Integration 139
5.3 Gaussian Quadrature 142
5.4 Frequently Encountered Problems 146
Problems 148
6 SIMULATION AND MONTE CARLO INTEGRATION 151
6.1 Introduction to the Monte Carlo Method 151
6.2 Exact Simulation 152
6.3 Approximate Simulation 163
6.4 Variance Reduction Techniques 180
Problems 195
7 MARKOV CHAIN MONTE CARLO 201
7.1 Metropolis–Hastings Algorithm 202
7.2 Gibbs Sampling 209
7.3 Implementation 218
Problems 230
8 ADVANCED TOPICS IN MCMC 237
8.1 Adaptive MCMC 237
8.2 Reversible Jump MCMC 250
8.3 Auxiliary Variable Methods 256
8.4 Other Metropolis–Hastings Algorithms 260
8.5 Perfect Sampling 264
8.6 Markov Chain Maximum Likelihood 268
8.7 Example: MCMC for Markov Random Fields 269
Problems 279
PART III BOOTSTRAPPING
9 BOOTSTRAPPING 287
9.1 The Bootstrap Principle 287
9.2 Basic Methods 288
9.3 Bootstrap Inference 292
9.4 Reducing Monte Carlo Error 302
9.5 Bootstrapping Dependent Data 303
9.6 Bootstrap Performance 315
9.7 Other Uses of the Bootstrap 316
9.8 Permutation Tests 317
Problems 319
PART IV DENSITY ESTIMATION AND SMOOTHING
10 NONPARAMETRIC DENSITY ESTIMATION 325
10.1 Measures of Performance 326
10.2 Kernel Density Estimation 327
10.3 Nonkernel Methods 341
10.4 Multivariate Methods 345
Problems 359
11 BIVARIATE SMOOTHING 363
11.1 Predictor–Response Data 363
11.2 Linear Smoothers 365
11.3 Comparison of Linear Smoothers 377
11.4 Nonlinear Smoothers 379
11.5 Confidence Bands 384
11.6 General Bivariate Data 388
Problems 389
12 MULTIVARIATE SMOOTHING 393
12.1 Predictor–Response Data 393
12.2 General Multivariate Data 413
Problems 416
DATA ACKNOWLEDGMENTS 421
REFERENCES 423
INDEX 457
GEOF H. GIVENS, PhD, is Associate Professor in the Department of Statistics at Colorado State University. He serves as Associate Editor for Computational Statistics and Data Analysis. His research interests include statistical problems in wildlife conservation biology, including ecology and population modeling and management, as well as automated computer face recognition.
JENNIFER A. HOETING, PhD, is Professor in the Department of Statistics at Colorado State University. She is an award-winning teacher who co-leads large research efforts for the National Science Foundation. She has served as associate editor for the Journal of the American Statistical Association and Environmetrics. Her research interests include spatial statistics, Bayesian methods, and model selection.
Givens and Hoeting have taught graduate courses on computational statistics for nearly twenty years, as well as short courses to leading statisticians and scientists around the world.
A valuable new edition of the complete guide to modern statistical computing
Computational Statistics, Second Edition continues to serve as a comprehensive guide to the theory and practice of statistical computing. Like its predecessor, the new edition spans a broad range of modern and classic topics, including optimization, integration, Monte Carlo methods, bootstrapping, density estimation, and smoothing. Algorithms are explained both conceptually and through step-by-step descriptions, and are illustrated with detailed examples and exercises.
Important features of this Second Edition include:
Examples based on real-world applications from various fields including genetics, ecology, economics, network systems, biology, and medicine
Explanations of how computational methods are important components of major statistical approaches such as Bayesian models, linear and generalized linear models, random effects models, survival models, and hidden Markov models
Expanded coverage of Markov chain Monte Carlo methods
New topics such as sequential sampling methods, particle filters, derivative-free optimization, bootstrapping dependent data, and adaptive MCMC
New exercises and examples that help readers develop the skills needed to apply computational methods to a broad array of statistical problems
A companion website offering datasets and code in the R software package
Computational Statistics, Second Edition is perfect for advanced undergraduate or graduate courses in statistical computing and as a reference for practicing statisticians.