'In the last decade, Efron has played a leading role in laying down the foundations of large-scale inference, not only in bringing back and developing old ideas, but also linking them with more recent developments, including the theory of false discovery rates and Bayes methods. We are indebted to him for this timely, readable and highly informative monograph, a book he is uniquely qualified to write. It is a synthesis of many of Efron's own contributions over the last decade with that of closely related material, together with some connecting theory, valuable comments, and challenges for the future. His avowed aim is 'not to have the last word' but to help us deal 'with the burgeoning statistical problems of the twenty-first century'. He has succeeded admirably.' Terry Speed, International Statistical Review
Contents:
Introduction and foreword
1. Empirical Bayes and the James–Stein estimator
2. Large-scale hypothesis testing
3. Significance testing algorithms
4. False discovery rate control
5. Local false discovery rates
6. Theoretical, permutation and empirical null distributions
7. Estimation accuracy
8. Correlation questions
9. Sets of cases (enrichment)
10. Combination, relevance, and comparability
11. Prediction and effect size estimation
A. Exponential families
B. Programs and data sets
Bibliography
Index