| Field | Value |
| --- | --- |
| Title | Minimum Divergence Methods in Statistical Machine Learning |
| Subtitle | From an Information Geometric Viewpoint |
| Authors | Shinto Eguchi, Osamu Komori |
| Overview | Provides various applications, including boosting and kernel methods in machine learning, from a geometric invariance viewpoint. Facilitates a deeper understanding of the complementary relation between s |
| Description | This book explores minimum divergence methods of statistical machine learning for estimation, regression, prediction, and so forth, using information geometry to elucidate the intrinsic properties of the corresponding loss functions, learning algorithms, and statistical models. One of the most elementary examples is Gauss's least squares estimator in a linear regression model, in which the estimator is given by minimizing the sum of squares between a response vector and a vector in the linear subspace spanned by the explanatory vectors. This is extended to Fisher's maximum likelihood estimator (MLE) for an exponential model, in which the estimator is obtained by minimizing, in an empirical analogue, the Kullback-Leibler (KL) divergence between the data distribution and a parametric distribution of the exponential model. Thus, such minimization procedures admit a geometric interpretation in which a right triangle satisfies a Pythagorean identity in the sense of the KL divergence. This understanding sublimates the dualistic interplay between statistical estimation and the model, which requires dual geodesic paths, called m-geodesic and e-geodesic paths. |
| Published | 2022 (Book) |
| Keywords | Boosting; Independent Component Analysis; Information Geometry; Kernel Method; Machine Learning |
| Edition | 1 |
| DOI | https://doi.org/10.1007/978-4-431-56922-0 |
| ISBN (eBook) | 978-4-431-56922-0 |
| Copyright | Springer Japan KK, part of Springer Nature 2022 |
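As a minimal illustration of the relation stated in the description, that the MLE arises from minimizing an empirical analogue of the KL divergence, the following sketch uses generic notation assumed for this note and not taken from the book:

```latex
% Sketch only: generic notation assumed for illustration, not taken from the book.
% Let \hat{p} be the empirical distribution of data x_1,\dots,x_n and p_\theta a model density.
\[
  \mathrm{KL}\!\left(\hat{p}\,\middle\|\,p_\theta\right)
  \;=\; \underbrace{\sum_{x}\hat{p}(x)\log\hat{p}(x)}_{\text{constant in }\theta}
  \;-\; \frac{1}{n}\sum_{i=1}^{n}\log p_\theta(x_i),
\]
% so \arg\min_\theta \mathrm{KL}(\hat{p}\,\|\,p_\theta) = \arg\max_\theta \sum_{i}\log p_\theta(x_i);
% minimizing the KL divergence in its empirical analogue recovers Fisher's MLE,
% just as minimizing the sum of squares recovers Gauss's least squares estimator.
```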