The regression estimation problem has a long history. Already in 1632 Galileo Galilei used a procedure which can be interpreted as fitting a linear relationship to contaminated observed data. Such fitting of a line through a cloud of points is the classical linear regression problem. A solution of this problem is provided by the famous principle of least squares, which was discovered independently by A. M. Legendre and C. F. Gauss and published in 1805 and 1809, respectively. The principle of least squares can also be applied to construct nonparametric regression estimates, where one does not...
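As a brief illustration of the least squares principle mentioned above (a sketch added for clarity, not part of the original description): given observations $(x_1, y_1), \ldots, (x_n, y_n)$, the fitted line $y = a x + b$ is obtained by choosing the coefficients $a$ and $b$ that minimize the sum of squared residuals $\sum_{i=1}^{n} (y_i - a x_i - b)^2$.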
This volume provides a systematic in-depth analysis of nonparametric learning. It covers the theoretical limits and the asymptotically optimal algorithms and estimates for problems such as pattern recognition, nonparametric regression estimation, universal prediction, vector quantization, distribution and density estimation, and genetic programming.
This volume investigates algorithmic methods based on machine learning in order to design sequential investment strategies for financial markets. Such sequential investment strategies use information collected from the market's past and determine, at the beginning of a trading period, a portfolio; that is, a way to invest the currently available capital among the assets available for purchase or investment. The aim is to produce a self-contained text intended for a wide audience, including researchers and graduate students in computer science, finance, statistics, mathematics,...
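To make the notion of a sequential investment strategy concrete, here is a sketch of a standard formalization (the notation is illustrative and not taken verbatim from the volume): with $d$ assets, a portfolio is a vector $b = (b^{(1)}, \ldots, b^{(d)})$ with $b^{(j)} \ge 0$ and $\sum_{j=1}^{d} b^{(j)} = 1$, where $b^{(j)}$ is the fraction of capital invested in asset $j$. If $x_i^{(j)}$ denotes the price ratio of asset $j$ over trading period $i$, then starting from initial capital $S_0$ the wealth after $n$ periods is $S_n = S_0 \prod_{i=1}^{n} \langle b_i, x_i \rangle$, and a sequential strategy chooses each portfolio $b_i$ as a function of the past market vectors $x_1, \ldots, x_{i-1}$.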
A self-contained and coherent account of probabilistic techniques, covering: distance measures, kernel rules, nearest neighbour rules, Vapnik-Chervonenkis theory, parametric classification, and feature extraction. Each chapter concludes with problems and exercises to further the reader's understanding. Both research workers and graduate students will benefit from this wide-ranging and up-to-date account of a fast-moving field.