A self-contained introduction to probability, exchangeability and Bayes' rule provides a theoretical understanding of the applied material.
Numerous examples with R code that can be run "as is" allow the reader to perform the data analyses themselves.
The development of Monte Carlo and Markov chain Monte Carlo methods in the context of data analysis examples provides motivation for these computational methods.
"This is an excellent book for its intended audience: statisticians who wish to learn Bayesian methods. Although designed for a statistics audience, it would also be a good book for econometricians who have been trained in frequentist methods but wish to learn Bayes. In relatively few pages, it takes the reader through a vast amount of material, beginning with deep issues in statistical methodology such as de Finetti's theorem, through the nitty-gritty of Bayesian computation, to sophisticated models such as generalized linear mixed effects models and copulas. And it does so in a simple manner, always drawing parallels and contrasts between Bayesian and frequentist methods, so as to allow the reader to see the similarities and differences with clarity." (Econometrics Journal)
"Generally, I think this is an excellent choice for a text for a one-semester Bayesian course. It provides a good overview of the basic tenets of Bayesian thinking for the common one- and two-parameter distributions and gives introductions to Bayesian regression, multivariate-response modeling, hierarchical modeling, and mixed effects models. The book includes an ample collection of exercises for all the chapters. A strength of the book is its good discussion of Gibbs sampling and Metropolis-Hastings algorithms. The author goes beyond a description of the MCMC algorithms and also provides insight into why the algorithms work. ... I believe this text would be an excellent choice for my Bayesian class since it seems to cover a good number of introductory topics and give the student a good introduction to the modern computational tools for Bayesian inference, with illustrations using R." (Journal of the American Statistical Association, June 2010, Vol. 105, No. 490)
"Statisticians and applied scientists. The book is accessible to readers having a basic familiarity with probability theory and grounding statistical methods. The author has succeeded in writing an acceptable introduction to the theory and application of Bayesian statistical methods which is modern and covers both the theory and practice. ... this book can be useful as a quick introduction to Bayesian methods for self study. In addition, I highly recommend this book as a text for a course for Bayesian statistics." (Lasse Koskinen, International Statistical Review, Vol. 78 (1), 2010)
"The book under review covers a balanced choice of topics ... presented with a focus on the interplay between Bayesian thinking and the underlying mathematical concepts. ... the book by Peter D. Hoff appears to be an excellent choice for a main reading in an introductory course. After studying this text the student can go in a direction of his liking at the graduate level." (Krzysztof Latuszynski, Mathematical Reviews, Issue 2011 m)
"The book is a good introductory treatment of methods of Bayes analysis. It should especially appeal to the reader who has had some statistical courses in estimation and modeling, and wants to understand the Bayesian interpretation of those methods. Also, readers who are primarily interested in modeling data and who are working in areas outside of statistics should find this to be a good reference book. ... should appeal to the reader who wants to keep with modern approaches to data analysis." (Richard P. Heydorn, Technometrics, Vol. 54 (1), February, 2012)
Introduction and examples.- Belief, probability and exchangeability.- One-parameter models.- Monte Carlo approximation.- The normal model.- Posterior approximation with the Gibbs sampler.- The multivariate normal model.- Group comparisons and hierarchical modeling.- Linear regression.- Nonconjugate priors and Metropolis-Hastings algorithms.- Linear and generalized linear mixed effects models.- Latent variable methods for ordinal data.
This book provides a compact self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers having a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice.
Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.
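For illustration only (this sketch is not taken from the book and uses made-up data), a minimal example of the kind of Monte Carlo posterior summary described above, for a conjugate beta-binomial model in base R:

    ## Illustrative sketch (not from the book): Monte Carlo summary of a
    ## beta-binomial posterior, using hypothetical data.
    y <- 12          # hypothetical number of successes
    n <- 20          # hypothetical number of trials
    a <- 1; b <- 1   # Beta(1, 1), i.e. uniform, prior

    set.seed(1)
    ## sample from the Beta(a + y, b + n - y) posterior
    theta <- rbeta(10000, a + y, b + n - y)

    ## Monte Carlo approximations of posterior summaries
    mean(theta)                       # posterior mean
    quantile(theta, c(0.025, 0.975))  # 95% credible interval
    mean(theta > 0.5)                 # Pr(theta > 0.5 | data)

For the more elaborate models covered in the book, the direct rbeta draws would be replaced by output from a Gibbs sampler or Metropolis-Hastings algorithm, but posterior summaries are computed from the samples in the same way.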
Peter Hoff is an Associate Professor of Statistics and Biostatistics at the University of Washington. He has developed a variety of Bayesian methods for multivariate data, including covariance and copula estimation, cluster analysis, mixture modeling and social network analysis. He is on the editorial board of the Annals of Applied Statistics.