"Every chapter is filled with detailed examples contemplating a large variety of statistical distributions. Each chapter finishes with the formulation of a list of conceptual exercises and another of computational exercises. ... Each chapter includes a section with a series of R software procedures designed to illustrate the statistical concepts discussed and establish asymptotic properties, revealing and visually demonstrating some otherwise hidden aspects." (Annibal Parracho Sant'Anna, zbMATH 1473.62005, 2021)
Chapter 1: Introduction
Chapter 2: Consistency of an Estimator
Chapter 3: Consistent and Asymptotically Normal Estimators
Chapter 4: CAN Estimators in Exponential and Cramér Families
Chapter 5: Large Sample Test Procedures
Chapter 6: Goodness of Fit Test and Tests for Contingency Tables
Chapter 7: Solutions to Conceptual Exercises
Madhuri Kulkarni has been working as an Assistant Professor at the Department of Statistics, Savitribai Phule Pune University, since 2003. Over a span of 17 years she has taught a variety of courses, including programming languages such as C and C++; core statistical courses such as probability distributions, statistical inference, and regression analysis; and applied statistical courses such as actuarial statistics, Bayesian inference, and reliability theory. She has been using R to teach practical and applied courses for more than a decade. She is a recipient of the prestigious U. S. Nair Young Statistician Award. She has completed research projects for the Armament Research and Development Establishment (ARDE), Pune, and in 2018 received a core research grant from DST-SERB, India, for a research project on software reliability. She writes regularly in English, Hindi, and Marathi on her blog, where she also shares the e-content she has developed.
Shailaja Deshmukh is a visiting faculty member at the Department of Statistics, Savitribai Phule Pune University (formerly known as the University of Pune). She was earlier a Professor of Statistics and Head of the Department of Statistics, before her retirement from the university in November 2015 after thirty-eight years of service. She has taught around twenty-five different theoretical and applied courses. She was a visiting professor at the Department of Statistics, University of Michigan, Ann Arbor, during the 2009–10 academic year. Her areas of interest are inference in stochastic processes, applied probability, actuarial statistics, and analysis of microarray data. She has a number of research publications in various peer-reviewed journals, such as Biometrika, Communications in Statistics (Theory and Methods), Journal of Multivariate Analysis, Journal of the Royal Statistical Society, Australian Journal of Statistics, Biometrical Journal, Statistics and Probability Letters, Journal of Applied Statistics, Australian and New Zealand Journal of Statistics, Environmetrics, Journal of Statistical Planning and Inference, Naval Research Logistics, Journal of Indian Statistical Association, Stochastic Modelling and Applications, Journal of Translational Medicine, and Annals of the Institute of Statistical Mathematics. She has published four books, the most recent being 'Multiple Decrement Models in Insurance: An Introduction Using R' (Springer). She has served as executive editor and as chief editor of the Journal of Indian Statistical Association and is an elected member of the International Statistical Institute.
The book presents the fundamental concepts of asymptotic statistical inference theory, elaborating on some basic large sample optimality properties of estimators and some test procedures. It discusses consistency, the most desirable property of an estimator, and the large sample distribution of an estimator under suitable normalization, the focus being on consistent and asymptotically normal (CAN) estimators. It is shown that for probability models belonging to an exponential family or a Cramér family, the maximum likelihood estimators of the indexing parameters are CAN. The book describes some large sample test procedures, in particular the most frequently used likelihood ratio test procedure. Various applications of the likelihood ratio test procedure are addressed when the underlying probability model is a multinomial distribution; these include tests for goodness of fit and tests for contingency tables. The book also discusses the score test and Wald's test, and their relationship with the likelihood ratio test and Karl Pearson's chi-square test. An important finding is that, when testing any hypothesis about the parameters of a multinomial distribution, the score test statistic and Karl Pearson's chi-square test statistic are identical.
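The identity between the score statistic and Pearson's chi-square statistic can be verified numerically in R. The sketch below is not taken from the book; the cell probabilities and observed counts are hypothetical values chosen for illustration. It computes both statistics for a simple null hypothesis about multinomial cell probabilities and shows that they coincide.

```r
# Hypothetical example: H0: p = p0 for a multinomial with k = 3 cells
p0  <- c(0.5, 0.3, 0.2)   # hypothesised cell probabilities (assumed values)
obs <- c(48, 35, 17)      # hypothetical observed counts
n   <- sum(obs)
k   <- length(p0)

# Score vector and Fisher information for the free parameters (p_1, ..., p_{k-1})
U <- obs[-k] / p0[-k] - obs[k] / p0[k]
I <- n * (diag(1 / p0[-k], k - 1) + 1 / p0[k])

# Score (Rao) test statistic
score_stat <- as.numeric(t(U) %*% solve(I) %*% U)

# Karl Pearson's chi-square statistic
pearson_stat <- sum((obs - n * p0)^2 / (n * p0))

c(score = score_stat, pearson = pearson_stat)  # both equal about 1.363 here
```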
Numerous illustrative examples of varying difficulty are incorporated to clarify the concepts. To aid assimilation of the notions, exercises are included in each chapter, and solutions to almost all of them are given in the last chapter, to motivate students to attempt the exercises and to reinforce the underlying concepts.
The concepts of asymptotic inference are crucial in modern statistics, but they are difficult to grasp because of their abstract nature. To overcome this difficulty, and in keeping with the current trend of using R software for statistical computations, the book uses R extensively to illustrate the concepts, verify the properties of estimators, and carry out various test procedures. The last section of each chapter presents R code that reveals and visually demonstrates otherwise hidden aspects of the concepts and procedures. Augmenting the theory with R software in this way is a novel and unique feature of the book.
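As an indication of the kind of illustration this enables, the following is a minimal R sketch, not reproduced from the book, in which the true parameter value and sample sizes are arbitrary choices. It checks by simulation that the maximum likelihood estimator of a Poisson mean is CAN: its simulated mean stays close to the true value, its variance shrinks at the rate lambda/n, and the standardised estimator approaches the standard normal density.

```r
# Simulation check of the CAN property of the MLE of a Poisson mean
# (illustrative only; lambda and the sample sizes are assumed values)
set.seed(1)
lambda <- 2

for (n in c(10, 50, 500)) {
  mle <- replicate(5000, mean(rpois(n, lambda)))   # MLE of lambda = sample mean
  cat("n =", n,
      " mean of MLE =", round(mean(mle), 3),
      " var of MLE =", round(var(mle), 4),
      " lambda/n =", round(lambda / n, 4), "\n")
}

# Standardised MLE for n = 500 compared with the limiting N(0, 1) density
z <- sqrt(500) * (replicate(5000, mean(rpois(500, lambda))) - lambda) / sqrt(lambda)
hist(z, freq = FALSE, breaks = 40, main = "Standardised MLE, n = 500", xlab = "z")
curve(dnorm(x), add = TRUE, lwd = 2)
```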
The book is designed primarily to serve as a textbook for a one-semester introductory course in asymptotic statistical inference in a postgraduate program in Statistics, Biostatistics, or Econometrics. It also provides sufficient background for studying inference in stochastic processes. The book caters to the need for a concise yet clear, student-friendly text introducing the basics of asymptotic inference, both conceptually and computationally.