Title: Artificial Neural Networks - ICANN 2001; International Confer; Georg Dorffner, Horst Bischof, Kurt Hornik; Conference proceedings 2001; Springer

Posted 2025-3-21 17:13:53
Full title: Artificial Neural Networks - ICANN 2001
Abbreviated title: International Confer
Editors: Georg Dorffner, Horst Bischof, Kurt Hornik
Note: Includes supplementary material
Series: Lecture Notes in Computer Science
Description: This book is based on the papers presented at the International Conference on Artificial Neural Networks, ICANN 2001, from August 21–25, 2001 at the Vienna University of Technology, Austria. The conference is organized by the Austrian Research Institute for Artificial Intelligence in cooperation with the Pattern Recognition and Image Processing Group and the Center for Computational Intelligence at the Vienna University of Technology. The ICANN conferences were initiated in 1991 and have become the major European meeting in the field of neural networks. From about 300 submitted papers, the program committee selected 171 for publication. Each paper has been reviewed by three program committee members/reviewers. We would like to thank all the members of the program committee and the reviewers for their great effort in the reviewing process and helping us to set up a scientific program of high quality. In addition, we have invited eight speakers; three of their papers are also included in the proceedings. We would like to thank the European Neural Network Society (ENNS) for their support. We acknowledge the financial support of Austrian Airlines, Austrian Science Foundation (FWF) under the c
Pindex: Conference proceedings 2001
Publication information is being updated.

https://doi.org/10.1057/978-1-349-93268-9
…weight in the output layer is derived as a nonlinear function of the training data moments. The experimental results, using one- and two-dimensional simulated data and different polynomial orders, show that the classification rate of the polynomial densities is very close to the optimum rate.
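The excerpt above describes classification with polynomial approximations of class-conditional densities. A minimal sketch of that general idea, not the paper's exact moment-based estimator: fit a polynomial to each class's empirical density and assign the class with the larger value (toy Gaussian data; equal priors assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
x0 = rng.normal(-1.0, 1.0, 2000)   # class 0 samples (toy data)
x1 = rng.normal(+1.0, 1.0, 2000)   # class 1 samples

def poly_density(samples, degree=6, bins=40):
    # Fit a polynomial to the normalized histogram of the samples.
    counts, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.polynomial.Polynomial.fit(centers, counts, degree)

p0, p1 = poly_density(x0), poly_density(x1)

def classify(x):
    # Larger estimated density wins (equal class priors assumed).
    return int(p1(x) > p0(x))
```

In higher polynomial orders the fit tracks the true density more closely, which is consistent with the excerpt's observation that the classification rate approaches the optimum rate.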
Neural Learning Invariant to Network Size Changes
…tional invariance can be based, and try to delimit the conditions under which each of them acts. We find out that, surprisingly, some of the most popular neural learning methods, such as weight-decay and input noise addition, exhibit this interesting property.
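The abstract names weight decay and input noise addition as the learning methods exhibiting the property. A minimal sketch of what those two regularizers look like in one gradient-descent loop for a linear model (toy data and hypothetical hyperparameters, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

w = np.zeros(3)
lr, lam, sigma = 0.05, 1e-3, 0.1   # step size, weight-decay strength, input-noise level
for _ in range(500):
    Xn = X + sigma * rng.normal(size=X.shape)   # input noise addition
    grad = Xn.T @ (Xn @ w - y) / len(y)         # MSE gradient on the noisy inputs
    w -= lr * (grad + lam * w)                  # weight-decay term lam * w
```

Both mechanisms shrink the solution slightly toward zero; for linear models, input noise of variance sigma² acts much like an L2 penalty, which is one way to see why the two methods often behave alike.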
Discriminative Dimensionality Reduction Based on Generalized LVQ
…ionality reduction in feature extraction. Experimental results reveal that the training of both a feature transformation matrix and reference vectors by GLVQ is superior to that by principal component analysis in terms of dimensionality reduction.
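GLVQ trains reference vectors (prototypes) with attraction/repulsion updates driven by the relative distances to the closest correct and closest wrong prototype. A minimal sketch of the plain GLVQ prototype update on toy 2-D data; the paper additionally learns a feature transformation matrix, which is omitted here:

```python
import numpy as np

rng = np.random.default_rng(2)
# Two toy Gaussian classes in 2-D, one prototype per class
X = np.vstack([rng.normal(-1, 0.5, (100, 2)), rng.normal(1, 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

protos = np.array([[-0.5, -0.5], [0.5, 0.5]])
lr = 0.05
for _ in range(30):
    for x, c in zip(X, y):
        dp = ((protos[c] - x) ** 2).sum()        # distance to correct prototype
        dm = ((protos[1 - c] - x) ** 2).sum()    # distance to wrong prototype
        s = (dp + dm) ** 2
        protos[c]     += lr * (dm / s) * (x - protos[c])      # attract correct
        protos[1 - c] -= lr * (dp / s) * (x - protos[1 - c])  # repel wrong

# Nearest-prototype classification
pred = np.array([((protos - x) ** 2).sum(axis=1).argmin() for x in X])
accuracy = (pred == y).mean()
```

The dm/s and dp/s factors come from differentiating the GLVQ cost (dp - dm)/(dp + dm), so updates are largest near the decision boundary, which is what makes the learned representation discriminative rather than variance-preserving like PCA.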
Fast Curvature Matrix-Vector Products
…Fisher information matrices with arbitrary vectors, using techniques similar to but even cheaper than the fast Hessian-vector product [.]. The stability of SMD [.,.,.,.], a learning rate adaptation method that uses curvature matrix-vector products, improves when the extended Gauss-Newton matrix is substituted for the Hessian.
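The point of such products is that a curvature-matrix-vector product can be formed without ever materializing the p-by-p matrix. For linear least squares the Gauss-Newton matrix is XᵀX/n, so its product with a vector costs just two matrix-vector multiplies; a sketch of this special case, not the paper's general algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))   # Jacobian of a linear model is X itself
v = rng.normal(size=5)

def gauss_newton_vec(X, v):
    # (X^T X / n) v computed as two matrix-vector products: O(n*p) time,
    # never forming the p x p curvature matrix explicitly.
    return X.T @ (X @ v) / len(X)

Gv = gauss_newton_vec(X, v)
```

The same two-pass structure (a forward Jacobian-vector product followed by a transposed one) is what makes curvature-vector products for general networks comparable in cost to a gradient evaluation.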