PSA-velocity posted on 2025-4-1 19:38:51

A minimax lower bound for empirical quantizer design: the expected distortion of an empirically designed vector quantizer is at least Ω(1/√n) away from the optimal distortion for some distribution on a bounded subset of R^d, where n is the number of i.i.d. data points that are used to train the empirical quantizer.
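
Below is a minimal sketch, not taken from the paper, of the setup this abstract refers to: a k-point quantizer is trained by plain Lloyd/k-means iterations on n i.i.d. points from a bounded source, and its test distortion is compared with that of a quantizer trained on a much larger sample, used as a stand-in for the optimal quantizer. The function names, the uniform source, and the values of d, k, and n are illustrative assumptions; the sketch only makes the quantities in the Ω(1/√n) statement concrete and does not reproduce the worst-case construction behind the lower bound.

import numpy as np

def design_quantizer(data, k, iters=50, seed=0):
    # Empirical quantizer design by plain Lloyd / k-means iterations.
    rng = np.random.default_rng(seed)
    codebook = data[rng.choice(len(data), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign every training point to its nearest code point
        d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(k):
            members = data[labels == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook

def distortion(data, codebook):
    # Mean squared distortion of a codebook on a sample.
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean()

if __name__ == "__main__":
    d, k, n = 2, 4, 200                                      # dimension, codebook size, training size (assumed)
    rng = np.random.default_rng(1)
    source = lambda m: rng.uniform(-1.0, 1.0, size=(m, d))   # some bounded source
    train, test = source(n), source(50_000)
    q_n = design_quantizer(train, k)                         # quantizer trained on n points
    q_ref = design_quantizer(source(50_000), k)              # large-sample proxy for the optimal quantizer
    redundancy = distortion(test, q_n) - distortion(test, q_ref)
    print(f"estimated distortion redundancy: {redundancy:.4f}")

Averaging this redundancy over many independent training samples and several values of n would trace out its decay with n; for a benign source like the uniform one above the decay may well be faster than for the worst-case distributions the theorem constructs.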

GULP posted on 2025-4-2 01:55:59

Vapnik-Chervonenkis dimension of recurrent neural networks: recurrent networks are widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, polynomial, piecewise-polynomial and sigmoidal functions.
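
To make concrete what kind of object such VC-dimension bounds apply to, here is a minimal sketch (illustrative, not taken from the paper) of a one-unit recurrent classifier: the hidden state is updated through an activation function at each time step, and the binary label is the sign of a linear readout of the final state. The parameter layout, the particular activations, and the sequence length are assumptions of this sketch; the paper's bounds depend on the chosen activation (threshold, polynomial, piecewise-polynomial, sigmoidal) and on the architecture, which this toy class only hints at.

import numpy as np

def threshold(x):
    # hard threshold activation
    return np.where(x >= 0.0, 1.0, 0.0)

def sigmoid(x):
    # standard logistic sigmoid activation
    return 1.0 / (1.0 + np.exp(-x))

def recurrent_classifier(sequence, params, activation=sigmoid):
    # One-unit recurrent network: h_t = activation(w_in*x_t + w_rec*h_{t-1} + b);
    # the binary label is the sign of a linear readout of the final state.
    w_in, w_rec, b, w_out, b_out = params
    h = 0.0
    for x_t in sequence:
        h = activation(w_in * x_t + w_rec * h + b)
    return int(w_out * h + b_out >= 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    params = rng.normal(size=5)      # five real parameters define one classifier
    seq = rng.normal(size=10)        # an input sequence of length 10
    print(recurrent_classifier(seq, params, activation=threshold))
    print(recurrent_classifier(seq, params, activation=sigmoid))

The class obtained by varying these parameters (for inputs of a fixed length) is the kind of function class whose VC dimension the paper bounds from above and below.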
View the full version: Titlebook: Computational Learning Theory; Third European Conference Shai Ben-David Conference proceedings 1997 Springer-Verlag Berlin Heidelberg 1997 Algor