Images are all around us. The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something--an artery, a road, a DNA marker, an oil spill--from imagery that may be noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However, this book focuses not so much on images per se as on spatial data sets, with one or more measurements...
This book is a comprehensive and accessible introduction to the cross-entropy (CE) method. The CE method started life around 1997 when the first author proposed an adaptive algorithm for rare-event simulation using a cross-entropy minimization technique. It was soon realized that the underlying ideas had a much wider range of application than just in rare-event simulation; they could be readily adapted to tackle quite general combinatorial and multi-extremal optimization problems, including many problems associated with the field of learning algorithms and neural computation. The book is...
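The core iteration behind the method is compact. The following is a minimal sketch of the CE method for continuous minimization, assuming a Gaussian sampling distribution, an elite fraction of 10%, and a toy quadratic objective; all of these choices are illustrative assumptions, not the book's reference implementation.

    import numpy as np

    def cross_entropy_minimize(f, mu, sigma, n_samples=100, elite_frac=0.1, iters=50):
        # Sample candidates, keep the best ("elite") fraction, and refit
        # the Gaussian sampling distribution to the elites -- the update
        # that cross-entropy minimization yields for this family.
        n_elite = max(1, int(n_samples * elite_frac))
        for _ in range(iters):
            x = np.random.normal(mu, sigma, size=(n_samples, len(mu)))
            elite = x[np.argsort([f(xi) for xi in x])[:n_elite]]
            mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-8
        return mu

    # Toy usage: minimize a shifted quadratic in two dimensions.
    best = cross_entropy_minimize(lambda x: float(((x - 3.0) ** 2).sum()),
                                  mu=np.zeros(2), sigma=np.ones(2))

Shrinking sigma toward the elites concentrates the search; the same sample-select-refit loop carries over to the combinatorial problems the book treats, with the Gaussian replaced by a discrete parametric family.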
My thanks are due to the many people who have assisted in the work reported here and in the preparation of this book. The work is incomplete and this account of it rougher than it might be. Such virtues as it has owe much to others; the faults are all mine. My work leading to this book began when David Boulton and I attempted to develop a method for intrinsic classification. Given data on a sample from some population, we aimed to discover whether the population should be considered to be a mixture of different types, classes or species of thing, and, if so, how many classes were present, what each class looked like,...
Probabilistic graphical models and decision graphs are powerful modeling tools for reasoning and decision making under uncertainty. As modeling languages they allow a natural specification of problem domains with inherent uncertainty, and from a computational perspective they support efficient algorithms for automatic construction and query answering. This includes belief updating, finding the most probable explanation for the observed evidence, detecting conflicts in the evidence entered into the network, determining optimal strategies, analyzing for relevance, and performing sensitivity...
Monte Carlo methods are revolutionizing the on-line analysis of data in fields as diverse as financial modeling, target tracking and computer vision. These methods, appearing under the names of bootstrap filters, condensation, optimal Monte Carlo filters, particle filters and survival of the fittest, have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft...
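To make the family of methods concrete, here is a minimal bootstrap particle filter, sketched for an assumed one-dimensional random-walk state observed in Gaussian noise; the model and all parameter values are illustrative, not taken from the book.

    import numpy as np

    def bootstrap_particle_filter(observations, n_particles=1000,
                                  process_std=1.0, obs_std=1.0):
        # Propagate particles through the assumed random-walk dynamics,
        # weight them by the Gaussian observation likelihood, and resample
        # in proportion to the weights ("survival of the fittest").
        particles = np.zeros(n_particles)
        estimates = []
        for y in observations:
            particles = particles + np.random.normal(0.0, process_std, n_particles)
            weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2) + 1e-300
            weights /= weights.sum()
            idx = np.random.choice(n_particles, size=n_particles, p=weights)
            particles = particles[idx]
            estimates.append(particles.mean())
        return np.array(estimates)

    # Toy usage: filter noisy observations of a slowly drifting state.
    obs = np.cumsum(np.random.normal(0.0, 1.0, 100)) + np.random.normal(0.0, 1.0, 100)
    means = bootstrap_particle_filter(obs)

The named variants (condensation, optimal Monte Carlo filters, and so on) differ mainly in the proposal distribution and the resampling scheme layered on this propagate-weight-resample loop.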
In the fall of 1999, I was asked to teach a course on computer intrusion detection for the Department of Mathematical Sciences of The Johns Hopkins University. That course was the genesis of this book. I had been working in the field for several years at the Naval Surface Warfare Center, in Dahlgren, Virginia, under the auspices of the SHADOW program, with some funding by the Office of Naval Research. In designing the class, I was concerned both with giving an overview of the basic problems in computer security, and with providing information that was of interest to a department of...
The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include:
* the setting of learning problems based on the model of minimizing the risk functional from empirical data
* a comprehensive analysis of the empirical risk minimization principle, including...
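As a reading aid, the two central quantities behind that setting can be written down directly; the notation below follows the standard formulation of empirical risk minimization, not necessarily the book's exact symbols.

    R(\alpha) = \int L\big(y, f(x, \alpha)\big) \, dP(x, y),
    \qquad
    R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell} \sum_{i=1}^{\ell} L\big(y_i, f(x_i, \alpha)\big)

Here L is a loss function, {f(x, alpha)} the set of admissible functions, and P the unknown data distribution; the ERM principle selects the alpha minimizing the empirical risk over the sample (x_1, y_1), ..., (x_l, y_l) as a surrogate for minimizing the unknown risk R(alpha).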
Probabilistic Conditional Independence Structures provides a mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods to describe them and takes an algebraic approach.
The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable...
Methods of dimensionality reduction provide a way to understand and visualize the structure of complex data sets. Traditional methods like principal component analysis and classical metric multidimensional scaling suffer from being based on linear models. Until recently, very few methods were able to reduce data dimensionality in a nonlinear way. However, since the late nineties, many new methods have been developed, and nonlinear dimensionality reduction, also called manifold learning, has become a hot topic. New advances that account for this rapid growth include, for example, the use of graphs...
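For contrast with the nonlinear methods the book surveys, the linear baseline fits in a few lines. This PCA sketch via the singular value decomposition (on illustrative random data) shows what "based on linear models" means in practice: the embedding is a projection onto a linear subspace.

    import numpy as np

    def pca_project(X, n_components=2):
        # Center the data, take the SVD, and project onto the leading
        # right singular vectors -- a purely linear map, which is the
        # limitation that nonlinear dimensionality reduction addresses.
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    # Toy usage: reduce 5-dimensional data to 2 dimensions.
    X = np.random.randn(200, 5)
    Y = pca_project(X, n_components=2)

Data lying on a curved manifold (a spiral, a Swiss roll) cannot be unrolled by any such projection, which is what motivates the graph-based and manifold-learning methods the book covers.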
Twenty-five years have passed since the publication of the Russian version of the book Estimation of Dependencies Based on Empirical Data (EDBED for short). Twenty-five years is a long period of time. During these years many things have happened. Looking back, one can see how rapidly life and technology have changed, and how slow and difficult it is to change the theoretical foundation of the technology and its philosophy. I pursued two goals writing this Afterword: to update the technical results presented in EDBED (the easy goal) and to describe a general picture of how the new ideas developed...