risky-drinking posted on 2025-3-21 17:13:53

Bibliometric links for "Artificial Neural Networks - ICANN 2001":

Impact factor: http://figure.impactfactor.cn/if/?ISSN=BK0162701
Impact factor subject ranking: http://figure.impactfactor.cn/ifr/?ISSN=BK0162701
Online visibility: http://figure.impactfactor.cn/at/?ISSN=BK0162701
Online visibility subject ranking: http://figure.impactfactor.cn/atr/?ISSN=BK0162701
Citation count: http://figure.impactfactor.cn/tc/?ISSN=BK0162701
Citation count subject ranking: http://figure.impactfactor.cn/tcr/?ISSN=BK0162701
Annual citations: http://figure.impactfactor.cn/ii/?ISSN=BK0162701
Annual citations subject ranking: http://figure.impactfactor.cn/iir/?ISSN=BK0162701
Reader feedback: http://figure.impactfactor.cn/5y/?ISSN=BK0162701
Reader feedback subject ranking: http://figure.impactfactor.cn/5yr/?ISSN=BK0162701

Herbivorous posted on 2025-3-21 21:20:15

Brent Kawahara, Hector Estrada, Luke S. Lee
…tional invariance can be based, and try to delimit the conditions under which each of them acts. We find out that, surprisingly, some of the most popular neural learning methods, such as weight decay and input noise addition, exhibit this interesting property.

STALE posted on 2025-3-22 11:37:30

https://doi.org/10.1057/978-1-349-93268-9
…weight in the output layer is derived as a nonlinear function of the training-data moments. The experimental results, using one- and two-dimensional simulated data and different polynomial orders, show that the classification rate of the polynomial densities is very close to the optimum rate.

仪式 posted on 2025-3-22 14:55:31

Neural Learning Invariant to Network Size Changes
…tional invariance can be based, and try to delimit the conditions under which each of them acts. We find out that, surprisingly, some of the most popular neural learning methods, such as weight decay and input noise addition, exhibit this interesting property.

Etymology posted on 2025-3-22 21:21:02

Discriminative Dimensionality Reduction Based on Generalized LVQ
…ionality reduction in feature extraction. Experimental results reveal that the training of both a feature transformation matrix and reference vectors by GLVQ is superior to that by principal component analysis in terms of dimensionality reduction.
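For context on the GLVQ training the abstract refers to: generalized LVQ (Sato & Yamada) minimizes a sigmoid of the relative distance μ = (d1 − d2)/(d1 + d2) between a sample and its nearest correct and nearest incorrect prototype. The sketch below is a minimal single-sample update derived from that cost, not the paper's implementation (which additionally learns a feature transformation matrix); all names and the learning rate are my own choices, and it assumes both classes have at least one prototype.

```python
import numpy as np

def glvq_step(x, label, protos, proto_labels, lr=0.1):
    """One GLVQ update for a single sample x.

    Moves the nearest same-class prototype toward x and the nearest
    other-class prototype away, weighted by the derivative of the
    sigmoid cost sigma(mu), mu = (d1 - d2) / (d1 + d2).
    Assumes proto_labels contains both label and some other class.
    """
    d = np.sum((protos - x) ** 2, axis=1)        # squared distances
    same = proto_labels == label
    i = int(np.argmin(np.where(same, d, np.inf)))   # nearest correct
    j = int(np.argmin(np.where(~same, d, np.inf)))  # nearest incorrect
    d1, d2 = d[i], d[j]
    mu = (d1 - d2) / (d1 + d2)                   # relative distance in [-1, 1]
    s = 1.0 / (1.0 + np.exp(-mu))
    g = s * (1.0 - s)                            # sigmoid'(mu)
    denom = (d1 + d2) ** 2
    protos = protos.copy()
    # chain rule: dmu/dd1 = 2*d2/denom, dmu/dd2 = -2*d1/denom
    # (constant factors folded into lr)
    protos[i] += lr * g * (d2 / denom) * (x - protos[i])  # attract correct
    protos[j] -= lr * g * (d1 / denom) * (x - protos[j])  # repel incorrect
    return protos
```

One step should pull the correct prototype toward the sample and push the incorrect one away, which is easy to check on a two-prototype toy problem.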

甜得发腻 posted on 2025-3-23 08:32:00

Fast Curvature Matrix-Vector Products
…Fisher information matrices with arbitrary vectors, using techniques similar to, but even cheaper than, the fast Hessian-vector product [.]. The stability of SMD [.,.,.,.], a learning-rate adaptation method that uses curvature matrix-vector products, improves when the extended Gauss-Newton matrix is substituted for the Hessian.
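The abstract's central point, that a curvature matrix-vector product can be computed without ever forming the matrix, can be sketched on a toy model. Assuming a one-layer model f(w) = tanh(Xw) under squared-error loss (my choice of example, not the paper's setup), the Gauss-Newton product Gv = JᵀJv reduces to one forward and one reverse sweep:

```python
import numpy as np

def gauss_newton_vec(X, w, v):
    """Gauss-Newton matrix-vector product G v = J^T J v for the model
    f(w) = tanh(X @ w) under squared-error loss, without forming G.

    J = diag(tanh'(X w)) @ X is the Jacobian of the outputs w.r.t. w,
    so the product costs two matrix-vector sweeps instead of the
    O(p^2) storage of the full curvature matrix.
    """
    u = X @ w
    d = 1.0 / np.cosh(u) ** 2        # tanh'(u), elementwise
    jv = d * (X @ v)                 # forward sweep: J v
    return X.T @ (d * jv)            # reverse sweep: J^T (J v)
```

Because G = JᵀJ here, the result can be checked against the explicitly formed matrix, and vᵀGv ≥ 0 reflects the positive semi-definiteness that makes the Gauss-Newton matrix a more stable curvature estimate than the (possibly indefinite) Hessian.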
View full version: Titlebook: Artificial Neural Networks - ICANN 2001; International Conference…; Georg Dorffner, Horst Bischof, Kurt Hornik; Conference proceedings, 2001, Springer