充气女 posted on 2025-3-28 17:50:05

ISBN 978-3-540-26556-6, Springer-Verlag Berlin Heidelberg 2005

琐事 posted on 2025-3-28 21:05:43

Learning Theory, ISBN 978-3-540-31892-7, Series ISSN 0302-9743, Series E-ISSN 1611-3349

冷漠 posted on 2025-3-29 01:17:54

On the Consistency of Multiclass Classification Methods
…property of Bayes consistency. We provide a necessary and sufficient condition for consistency which applies to a large class of multiclass classification methods. The approach is illustrated by applying it to some multiclass methods proposed in the literature.

怒目而视 posted on 2025-3-29 05:56:57

Data Dependent Concentration Bounds for Sequential Prediction Algorithms
…Using some newly developed probability inequalities, we are able to bound the total generalization performance of a learning algorithm in terms of its observed total loss. Consequences of this analysis will be illustrated with examples.

防锈 posted on 2025-3-29 07:17:36

The Weak Aggregating Algorithm and Weak Mixability
…from a finite alphabet. For the bounded games, the paper introduces the Weak Aggregating Algorithm, which allows us to obtain additive terms of the form […]. A modification of the Weak Aggregating Algorithm that covers unbounded games is also described.

fastness posted on 2025-3-29 14:06:55

Tracking the Best of Many Experts
…provided that the set of experts has a certain structure allowing efficient implementations of the exponentially weighted average predictor. As an example, we work out the case where each expert is represented by a path in a directed graph and the loss of each expert is the sum of the weights over the edges in the path.
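The exponentially weighted average predictor mentioned in this abstract can be sketched in a generic form (this is the standard textbook forecaster, not the paper's structured, path-based implementation; the function name and the choice of learning rate `eta` are illustrative assumptions):

```python
import math

def exp_weighted_losses(expert_losses, eta=0.5):
    """Run the exponentially weighted average forecaster.

    expert_losses: list of T rounds, each a list of per-expert losses in [0, 1].
    Returns the forecaster's total (weight-averaged) loss. The classical
    Hoeffding-based bound guarantees total loss <= min_i L_i + eta*T/8 + ln(n)/eta.
    """
    n = len(expert_losses[0])
    weights = [1.0] * n
    total = 0.0
    for round_losses in expert_losses:
        z = sum(weights)
        # The forecaster suffers the weight-averaged loss of the experts.
        total += sum(w * l for w, l in zip(weights, round_losses)) / z
        # Multiplicative update: experts with large loss are down-weighted.
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, round_losses)]
    return total
```

The paper's contribution is doing this efficiently when the expert set is exponentially large but structured (e.g., one expert per path in a graph), where the naive per-expert loop above would be infeasible.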

posted on 2025-3-29 17:57:51

https://doi.org/10.1007/b137542
Keywords: Boosting; Support Vector Machine; classification; game theory; learning; learning theory; supervised learning

Allure posted on 2025-3-29 23:30:53

Martingale Boosting
Martingale boosting is a simple and easily understood technique with a simple and easily understood analysis. A slight variant of the approach provably achieves optimal accuracy in the presence of random misclassification noise.

custody posted on 2025-3-30 01:14:18

Sensitive Error Correcting Output Codes
We present a reduction from cost-sensitive classification to binary classification based on (a modification of) error-correcting output codes. The reduction satisfies the property that […] regret for binary classification implies […]-regret of at most 2[…] for cost estimation. This has several implications: […]
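For readers unfamiliar with the underlying machinery, here is a minimal sketch of plain error-correcting output code (ECOC) decoding — the building block this paper modifies. The cost-sensitive modification and regret analysis from the abstract are not reproduced; the codewords below are an illustrative exhaustive code for 4 classes, not taken from the paper:

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit vectors."""
    return sum(x != y for x, y in zip(a, b))

def ecoc_decode(bit_predictions, codewords):
    """Map a vector of binary classifier outputs to the class whose
    codeword is nearest in Hamming distance (ties broken by class index).

    In an ECOC reduction, one binary classifier is trained per bit
    position; a code with minimum distance d corrects up to
    floor((d - 1) / 2) binary classification errors.
    """
    return min(range(len(codewords)),
               key=lambda c: hamming(bit_predictions, codewords[c]))

# Illustrative 7-bit code for 4 classes (pairwise Hamming distance 4,
# so any single binary classifier error is corrected).
CODEWORDS = [
    [1, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 0, 0, 1],
    [0, 1, 0, 1, 0, 1, 0],
]
```

The paper's reduction replaces this hard decoding with a scheme whose binary regret translates into a bounded regret for cost estimation.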

Brain-Imaging posted on 2025-3-30 08:00:34

Margin-Based Ranking Meets Boosting in the Middle
…AUC and achieves the same AUC as RankBoost. This explains the empirical observations made by Cortes and Mohri, and by Caruana and Niculescu-Mizil, about the excellent performance of AdaBoost as a ranking algorithm, as measured by the AUC.
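The AUC referred to throughout this abstract has a simple pairwise-ranking interpretation, which is what connects it to ranking losses like RankBoost's. A small sketch (the function name is ours; this is the standard definition, not code from the paper):

```python
def auc(scores_pos, scores_neg):
    """AUC as the fraction of (positive, negative) pairs that the
    scoring function ranks correctly, counting ties as one half.

    Equivalent to the area under the ROC curve of the scorer.
    """
    correct = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                correct += 1.0
            elif sp == sn:
                correct += 0.5
    return correct / (len(scores_pos) * len(scores_neg))
```

Because AUC counts misordered pairs, a booster that drives down a pairwise ranking loss is implicitly optimizing AUC — the link the paper makes precise for AdaBoost and RankBoost.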
View full version: Titlebook: Learning Theory; 18th Annual Conference; Peter Auer, Ron Meir; Conference proceedings 2005; Springer-Verlag Berlin Heidelberg 2005; Boosting; Suppo…