Case-based reasoning (CBR) has received a great deal of attention in recent years and has established itself as a core methodology in the field of artificial intelligence. The key idea of CBR is to tackle new problems by referring to similar problems that have already been solved in the past. More precisely, CBR proceeds from individual experiences in the form of cases. The generalization beyond these experiences typically relies on a kind of regularity assumption demanding that 'similar problems have similar solutions'.
Making use of different frameworks of approximate reasoning...
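The regularity assumption 'similar problems have similar solutions' can be illustrated by the retrieval step at the heart of CBR: given a new problem, look up the most similar solved case and reuse its solution. The case base, distance function, and feature encoding below are illustrative assumptions, not taken from the book.

```python
def euclidean(a, b):
    """Distance between two problem descriptions (feature vectors)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def retrieve(case_base, problem, distance=euclidean):
    """Return the stored (problem, solution) case nearest to `problem`."""
    return min(case_base, key=lambda case: distance(case[0], problem))

# Hypothetical case base: problems are 2-D feature vectors,
# solutions are labels attached to them.
case_base = [
    ((0.0, 0.0), "solution-A"),
    ((1.0, 1.0), "solution-B"),
    ((5.0, 5.0), "solution-C"),
]

# The new problem (0.9, 1.2) lies closest to the second stored case,
# so its solution is the one reused.
print(retrieve(case_base, (0.9, 1.2))[1])  # → solution-B
```

In a full CBR cycle this retrieval step would be followed by adapting the reused solution to the new problem and retaining the result as a new case.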
Aggregation of individual opinions into a social decision is a problem widely observed in everyday life. For centuries people have tried to invent the 'best' aggregation rule. In 1951 the young American scientist and future Nobel Prize winner Kenneth Arrow formulated the problem in an axiomatic way, i.e., he specified a set of axioms which every reasonable aggregation rule has to satisfy, and showed that these axioms are inconsistent. This result, often called Arrow's Paradox or the General Impossibility Theorem, became a cornerstone of social choice theory. The main condition used by Arrow was his...
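The difficulty Arrow axiomatized can be glimpsed in the classic Condorcet profile, where pairwise majority voting, a seemingly reasonable aggregation rule, yields a cyclic (intransitive) social preference. The three voter rankings below are the textbook example, included here as an illustration rather than anything drawn from the text.

```python
# Three voters, each ranking alternatives A, B, C from best to worst.
voters = [
    ["A", "B", "C"],  # voter 1: A > B > C
    ["B", "C", "A"],  # voter 2: B > C > A
    ["C", "A", "B"],  # voter 3: C > A > B
]

def majority_prefers(x, y):
    """True if a strict majority of voters ranks x above y."""
    wins = sum(1 for ranking in voters if ranking.index(x) < ranking.index(y))
    return wins > len(voters) / 2

# Pairwise majorities form a cycle: A beats B, B beats C, yet C beats A,
# so no alternative can be declared the social optimum.
for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"majority prefers {x} over {y}: {majority_prefers(x, y)}")
```

Each comparison is won 2-to-1, so the 'social preference' produced by majority rule is not an ordering at all.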
Beliefs, Interactions and Preferences in Decision Making mixes a selection of papers, presented at the Eighth Foundations and Applications of Utility and Risk Theory (FUR VIII) conference in Mons, Belgium, with a few solicited papers from well-known authors in the field. This book addresses some of the questions that have recently emerged in research on decision-making and risk theory. In particular, authors have increasingly modeled as interactions between the individual and the environment, or between different individuals, the emergence of beliefs as well as the...
This book presents the content of a year's course in decision processes for third and fourth year students given at the University of Toronto. A principal theme of the book is the relationship between normative and descriptive decision theory. The distinction between the two approaches is not clear to everyone, yet it is of great importance. Normative decision theory addresses itself to the question of how people ought to make decisions in various types of situations, if they wish to be regarded (or to regard themselves) as 'rational'. Descriptive decision theory purports to describe how...
Fundamentals of Convex Analysis offers an in-depth look at some of the fundamental themes covered within an area of mathematical analysis called convex analysis. In particular, it explores the topics of duality, separation, representation, and resolution. The work is intended for students of economics, management science, engineering, and mathematics who need exposure to the mathematical foundations of matrix games, optimization, and general equilibrium analysis. It is written at the advanced undergraduate to beginning graduate level and...
Econometrics as an applied discipline attempts to use information in a most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data...
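The uncertainty analysis the blurb refers to rests on Shannon entropy, which measures the uncertainty carried by a probability distribution. A minimal sketch, with illustrative distributions of my own choosing:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # two equally likely outcomes: 1 bit, maximal for n=2
print(entropy([1.0, 0.0]))   # a certain outcome carries no uncertainty: 0 bits
print(entropy([0.25] * 4))   # uniform over four outcomes: 2 bits
```

Entropy is largest for the uniform distribution and shrinks as probability mass concentrates, which is what makes it a natural index of the imprecision of data.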
Non-Additive Measure and Integral is the first systematic approach to the subject. Much of the additive theory (convergence theorems, Lebesgue spaces, representation theorems) is generalized, at least for submodular measures, which are characterized by having a subadditive integral. The theory is of interest for applications to economic decision theory (decisions under risk and uncertainty), to statistics (including belief functions, fuzzy measures), to cooperative game theory, artificial intelligence, insurance, etc. Non-Additive Measure and Integral collects the...
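The integral generalized here can be sketched for the finite case via the Choquet integral of a non-negative function with respect to a capacity (a monotone set function vanishing on the empty set). The two-element capacity below is an illustrative assumption, not an example from the book.

```python
def choquet(values, capacity):
    """Choquet integral of {element: value} w.r.t. capacity: frozenset -> float.

    Sort values in decreasing order; each decrement is weighted by the
    capacity of the corresponding upper level set.
    """
    items = sorted(values.items(), key=lambda kv: kv[1], reverse=True)
    total, level_set = 0.0, set()
    for i, (elem, val) in enumerate(items):
        level_set.add(elem)
        next_val = items[i + 1][1] if i + 1 < len(items) else 0.0
        total += (val - next_val) * capacity[frozenset(level_set)]
    return total

# A non-additive capacity on {a, b}: v({a, b}) < v({a}) + v({b}).
capacity = {
    frozenset({"a"}): 0.5,
    frozenset({"b"}): 0.5,
    frozenset({"a", "b"}): 0.8,
}

# (3 - 1) * v({a}) + (1 - 0) * v({a, b}) = 2 * 0.5 + 1 * 0.8 = 1.8
print(choquet({"a": 3.0, "b": 1.0}, capacity))
```

When the capacity happens to be additive, this construction reduces to the ordinary (finite) Lebesgue integral, which is the sense in which the additive theory is generalized.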
Statistical Analysis of Observations of Increasing Dimension is devoted to the investigation of the limit distribution of the empirical generalized variance, covariance matrices, their eigenvalues and solutions of the system of linear algebraic equations with random coefficients, which are an important function of observations in multidimensional statistical analysis. A general statistical analysis is developed in which observed random vectors may not have density and their components have an arbitrary dependence structure. The methods of this theory have very important...
This text is an Elementary Introduction to Stochastic Processes in discrete and continuous time with an initiation to statistical inference. The material is standard and classical for a first course in Stochastic Processes at the senior/graduate level (lessons 1-12). To provide students with a view of statistics of stochastic processes, three lessons (13-15) were added. These lessons can be either optional or serve as an introduction to statistical inference with dependent observations. Several points of this text need to be elaborated. (1) The pedagogy is somewhat obvious. Since this...
In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem...