Minimum Divergence Methods in Statistical Machine Learning: From an Information Geometric Viewpoint

ISBN-13: 9784431569206 / English / Hardcover / 2022 / 232 pp.

Shinto Eguchi; Osamu Komori

Price: 564,88 zł
(net: 537,98 zł, VAT: 5%)

Lowest price in the last 30 days: 501,19 zł
Order processing time:
approx. 22 business days
No guaranteed delivery before the holidays

Free delivery!

This book explores minimum divergence methods in statistical machine learning for estimation, regression, prediction, and related tasks, using information geometry to elucidate the intrinsic properties of the corresponding loss functions, learning algorithms, and statistical models. One of the most elementary examples is Gauss's least squares estimator in a linear regression model, where the estimator minimizes the sum of squared differences between the response vector and a vector in the linear subspace spanned by the explanatory vectors. This extends to Fisher's maximum likelihood estimator (MLE) for an exponential model, where the estimator minimizes an empirical analogue of the Kullback-Leibler (KL) divergence between the data distribution and a parametric distribution of the exponential model. Such minimization procedures thus admit a geometric interpretation in which a right triangle satisfies a Pythagorean identity in the sense of the KL divergence. This understanding reveals a dualistic interplay between statistical estimation and the statistical model, which requires dual geodesic paths, called m-geodesic and e-geodesic paths, in the framework of information geometry.
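To make the least-squares case concrete, here is a minimal sketch (NumPy; variable names are illustrative, not taken from the book) that computes Gauss's estimator as the minimizer of the residual sum of squares over the subspace spanned by the explanatory vectors. Under a Gaussian error model this coincides with the MLE, i.e. the minimizer of the empirical KL divergence.

```python
import numpy as np

# Minimal sketch of Gauss's least squares estimator: project the
# response y onto the subspace spanned by the columns of X.
# Names and data are illustrative, not from the book.
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))                # explanatory vectors
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(size=n)     # response vector

# beta_hat minimizes ||y - X @ beta||^2 (the normal equations).
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Under Gaussian errors this least squares solution is also the MLE,
# i.e. the minimizer of the empirical KL divergence to the model.
print(beta_hat)
```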
We extend this dualistic structure of the MLE and the exponential model to that of the minimum divergence estimator and the maximum entropy model, with applications to robust statistics, maximum entropy, density estimation, principal component analysis, independent component analysis, regression analysis, manifold learning, boosting algorithms, clustering, dynamic treatment regimes, and so forth. We consider a variety of information divergence measures, typically including the KL divergence, to express the departure of one probability distribution from another. An information divergence decomposes into the cross-entropy and the (diagonal) entropy, where the entropy is associated with a generative model as a family of maximum entropy distributions, and the cross-entropy is associated with a statistical estimation method via minimization of its empirical analogue based on given data. Thus any statistical divergence encodes an intrinsic pairing of a generative model and an estimation method. Typically, the KL divergence leads to the exponential model and maximum likelihood estimation. It is shown that any information divergence induces a Riemannian metric and a pair of linear connections in the framework of information geometry.
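The cross-entropy/entropy decomposition can be checked numerically in the KL case. The toy sketch below (discrete distributions p and q chosen arbitrarily for illustration) verifies that D_KL(p, q) equals the cross-entropy H(p, q) minus the diagonal entropy H(p).

```python
import numpy as np

# Toy check of the decomposition D_KL(p, q) = H(p, q) - H(p)
# for discrete distributions; p and q are arbitrary examples.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

cross_entropy = -np.sum(p * np.log(q))   # H(p, q)
entropy       = -np.sum(p * np.log(p))   # H(p), the diagonal term
kl            =  np.sum(p * np.log(p / q))

assert np.isclose(kl, cross_entropy - entropy)
print(kl, cross_entropy - entropy)
```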
We focus on a class of information divergences generated by an increasing convex function U, called U-divergences. Any generator function U gives rise to a U-entropy and a U-divergence, with a dualistic structure between the minimum U-divergence method and the maximum U-entropy model. A specific choice of U yields a robust statistical procedure via the minimum U-divergence method. If U is an exponential function, the corresponding U-entropy and U-divergence reduce to the Boltzmann-Shannon entropy and the KL divergence, and the minimum U-divergence estimator is equivalent to the MLE. For robust supervised learning to predict a class label, the U-boosting algorithm performs well under contamination by mislabeled examples if U is appropriately selected. We present such maximum U-entropy and minimum U-divergence methods, in particular selecting a power function as U, to provide flexible performance in statistical machine learning.
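As a hint of the robustness available from a power-type choice of U, the sketch below implements a standard density power divergence estimator of location under a normal model, a classical example in this literature rather than code from the book; the value of beta, the contamination level, and all names are illustrative. The stationarity condition yields an iteratively reweighted mean in which outliers receive exponentially small weights; as beta tends to 0 the estimator reduces to the sample mean, i.e. the MLE.

```python
import numpy as np

# Sketch: robust location estimation via a power-type divergence
# (density power divergence with index beta, normal model, sigma = 1).
# The stationarity condition gives an iteratively reweighted mean:
# outliers receive exponentially small weights. Illustrative only.
def min_power_divergence_mean(x, beta=0.5, n_iter=100):
    mu = np.median(x)                       # robust starting point
    for _ in range(n_iter):
        w = np.exp(-beta * (x - mu) ** 2 / 2.0)
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(1)
clean = rng.normal(loc=0.0, size=950)
outliers = rng.normal(loc=8.0, size=50)     # 5% contamination
x = np.concatenate([clean, outliers])

print("sample mean (MLE):", x.mean())       # pulled toward the outliers
print("min power divergence:", min_power_divergence_mean(x))  # near 0
```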

Categories:
Science, Biology and Nature
BISAC categories:
Mathematics > Probability & Statistics
Computers > Mathematical & Statistical Software
Computers > Computer Science
Publisher:
Springer Japan
Language:
English
ISBN-13:
9784431569206
Year of publication:
2022
Number of pages:
232
Weight:
0.50 kg
Dimensions (cm):
23.39 x 15.6 x 1.42
Binding:
Hardcover
Volumes:
1
Additional information:
Illustrated edition

Contents:
• Information geometry
• Information divergence
• Maximum entropy model
• Minimum divergence method
• Unsupervised learning algorithms
• Regression model
• Classification

Shinto Eguchi is currently Professor at the Institute of Statistical Mathematics.

Osamu Komori is Associate Professor at Seikei University.
