Title | Improved Classification Rates for Localized Algorithms under Margin Conditions |
Author | Ingrid Karin Blaschzyk |
Video | http://file.papertrans.cn/463/462738/462738.mp4 |
Overview | A study in the natural sciences, in the field of statistical learning theory |
Description | Support vector machines (SVMs) are among the most successful algorithms on small and medium-sized data sets, but on large-scale data sets their training and prediction become computationally infeasible. The author considers a spatially defined data-chunking method for large-scale learning problems, leading to so-called localized SVMs, and carries out an in-depth mathematical analysis with theoretical guarantees, which in particular include classification rates. The statistical analysis relies on a new and simple partitioning-based technique and takes into account well-known margin conditions that describe the behavior of the data-generating distribution. It turns out that the rates outperform known rates of several other learning algorithms under suitable sets of assumptions. From a practical point of view, the author shows that a common training and validation procedure achieves the theoretical rates adaptively, that is, without knowing the margin parameters in advance. (An illustrative sketch of the localized-SVM idea follows this record.) |
Publication Date | Book 2020 |
Keywords | Classification; Learning Rates; Gaussian Kernel; Tsybakov Noise; Localized SVMs; Support Vector Machines |
Edition | 1 |
DOI | https://doi.org/10.1007/978-3-658-29591-2 |
ISBN (softcover) | 978-3-658-29590-5 |
ISBN (eBook) | 978-3-658-29591-2 |
Copyright | Springer Fachmedien Wiesbaden GmbH, part of Springer Nature 2020 |
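
The sketch below is a minimal, hypothetical illustration of the localized-SVM idea described in the book's blurb, not the author's actual construction or analysis: the input space is split into spatial cells (here via a KMeans-induced Voronoi partition, an illustrative choice), one SVM with Gaussian (RBF) kernel is trained per chunk, and a new point is classified by the SVM of the cell it falls into. The class name LocalizedSVM and the parameters n_cells, gamma, and C are assumptions made for this example.

```python
# Hypothetical sketch of a localized SVM (illustrative only, assumes scikit-learn):
# partition the input space into spatial cells, train one SVM per cell,
# and predict with the SVM of the cell a query point falls into.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC


class LocalizedSVM:
    def __init__(self, n_cells=10, gamma=1.0, C=1.0, random_state=0):
        self.n_cells = n_cells      # number of spatial chunks (assumed parameter)
        self.gamma = gamma          # Gaussian kernel width
        self.C = C                  # SVM regularization constant
        self.random_state = random_state

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        # Spatial chunking: a Voronoi partition induced by KMeans centers.
        self.partition_ = KMeans(n_clusters=self.n_cells,
                                 random_state=self.random_state).fit(X)
        cells = self.partition_.labels_
        self.models_ = {}
        for cell in np.unique(cells):
            mask = cells == cell
            y_cell = y[mask]
            if np.unique(y_cell).size == 1:
                # Degenerate cell with a single class: store the constant label.
                self.models_[cell] = int(y_cell[0])
            else:
                # Local SVM with Gaussian kernel, trained on this chunk only.
                self.models_[cell] = SVC(kernel="rbf", gamma=self.gamma,
                                         C=self.C).fit(X[mask], y_cell)
        return self

    def predict(self, X):
        X = np.asarray(X)
        cells = self.partition_.predict(X)
        y_pred = np.empty(len(X), dtype=int)  # assumes integer class labels
        for cell in np.unique(cells):
            mask = cells == cell
            model = self.models_[cell]
            y_pred[mask] = model if isinstance(model, int) else model.predict(X[mask])
        return y_pred
```

A usage call might look like `LocalizedSVM(n_cells=20, gamma=0.5).fit(X_train, y_train).predict(X_test)`; in the book, hyperparameters such as the kernel width are selected adaptively by a training and validation procedure rather than fixed in advance.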