Minimum Divergence Methods in Statistical Machine Learning; From an Information. Shinto Eguchi, Osamu Komori. Book 2022, Springer Japan KK, part of Springer Nature.

Title: Minimum Divergence Methods in Statistical Machine Learning
Subtitle: From an Information
Editors: Shinto Eguchi, Osamu Komori
Overview: Provides various applications including boosting and kernel methods in machine learning with a geometric invariance viewpoint. Facilitates a deeper understanding of the complementary relation between s
Book cover: http://image.papertrans.cn/m/image/634627.jpg
Description: This book explores minimum divergence methods of statistical machine learning for estimation, regression, prediction, and so forth, in which information geometry is employed to elucidate the intrinsic properties of the corresponding loss functions, learning algorithms, and statistical models. One of the most elementary examples is Gauss's least squares estimator in a linear regression model, in which the estimator is given by minimizing the sum of squared differences between the response vector and a vector in the linear subspace spanned by the explanatory vectors. This is extended to Fisher's maximum likelihood estimator (MLE) for an exponential model, in which the estimator is given by minimizing an empirical analogue of the Kullback-Leibler (KL) divergence between the data distribution and a parametric distribution of the exponential model. Thus, we envisage a geometric interpretation of such minimization procedures, in which a right triangle satisfying a Pythagorean identity in the sense of the KL divergence is preserved. This understanding sublimates the dualistic interplay between statistical estimation and the statistical model, which requires dual geodesic paths, called the m-geodesic and e-geodesic paths.
Publication date: Book, 2022
Keywords: Boosting; Independent Component Analysis; Information Geometry; Kernel Method; Machine Learning
Edition: 1
DOI: https://doi.org/10.1007/978-4-431-56922-0
ISBN (eBook): 978-4-431-56922-0
Copyright: Springer Japan KK, part of Springer Nature 2022
Publication information is being updated.
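The description rests on a few concrete facts: the least-squares estimator is the projection of the response vector onto the span of the explanatory vectors, the MLE of an exponential model minimizes an empirical KL divergence, and m-projections onto an exponential family satisfy a Pythagorean identity for the KL divergence. The following is a minimal NumPy/SciPy sketch, not taken from the book, that checks these numerically; the concrete models (a Poisson family, the independence family on a 2x2 table) and all names such as neg_mean_loglik are illustrative assumptions.

# A minimal numerical sketch (not from the book) of the elementary examples in the
# description. The concrete models and all function names are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

rng = np.random.default_rng(0)

# 1) Gauss's least squares: minimize ||y - X b||^2, i.e. project y onto span(X).
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.3, size=100)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# 2) Fisher's MLE as KL minimization: for a Poisson model, the empirical KL divergence
#    between the data and Poisson(lam) equals the negative mean log-likelihood up to a
#    constant, so its minimizer is the MLE (here, the sample mean).
counts = rng.poisson(lam=4.0, size=500)

def neg_mean_loglik(lam):
    return -np.mean(counts * np.log(lam) - lam - gammaln(counts + 1))

lam_hat = minimize_scalar(neg_mean_loglik, bounds=(1e-6, 50.0), method="bounded").x

# 3) Pythagorean identity: take the exponential family of independent (product)
#    distributions on a 2x2 table. The m-projection of a joint p onto this family is
#    the product q of its marginals, and KL(p, r) = KL(p, q) + KL(q, r) for every
#    member r of the family.
def kl(a, b):
    return float(np.sum(a * np.log(a / b)))

p = np.array([[0.30, 0.20],
              [0.10, 0.40]])                # arbitrary joint distribution
q = np.outer(p.sum(axis=1), p.sum(axis=0))  # m-projection: product of the marginals
r = np.outer([0.6, 0.4], [0.25, 0.75])      # arbitrary product distribution

print("least-squares estimate:", beta_hat)
print("KL-minimizing rate:", lam_hat, " sample mean:", counts.mean())
print("Pythagorean check:", kl(p, r), "=", kl(p, q) + kl(q, r))

In the last check, q is the m-projection of p onto the independence family, so KL(p, r) splits exactly into KL(p, q) + KL(q, r) for any member r of that family; this is the right-triangle relation the description refers to.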
