爆发 posted on 2025-3-21 18:00:53

Title: Artificial Neural Networks and Machine Learning -- ICANN 2012

Impact Factor: http://impactfactor.cn/if/?ISSN=BK0162633
Impact Factor (subject ranking): http://impactfactor.cn/ifr/?ISSN=BK0162633
Online attention: http://impactfactor.cn/at/?ISSN=BK0162633
Online attention (subject ranking): http://impactfactor.cn/atr/?ISSN=BK0162633
Times cited: http://impactfactor.cn/tc/?ISSN=BK0162633
Times cited (subject ranking): http://impactfactor.cn/tcr/?ISSN=BK0162633
Annual citations: http://impactfactor.cn/ii/?ISSN=BK0162633
Annual citations (subject ranking): http://impactfactor.cn/iir/?ISSN=BK0162633
Reader feedback: http://impactfactor.cn/5y/?ISSN=BK0162633
Reader feedback (subject ranking): http://impactfactor.cn/5yr/?ISSN=BK0162633

盖他为秘密 posted on 2025-3-21 21:32:12

Theoretical Analysis of Function of Derivative Term in On-Line Gradient Descent Learning
…such as by using the natural gradient, has been proposed for speeding up convergence. Besides this sophisticated method, a "simple method" that replaces the derivative term with a constant has been proposed and shown to greatly increase convergence speed. Although this phenomenon has been analyzed…
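The "simple method" the abstract refers to can be illustrated on a toy teacher-student problem (our own construction, not the paper's exact model): a single tanh unit trained by on-line gradient descent, where the derivative factor g'(u) in the update is either kept or replaced by the constant 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(u):                         # sigmoidal activation
    return np.tanh(u)

def g_prime(u):                   # its true derivative
    return 1.0 - np.tanh(u) ** 2

# Toy teacher-student setup: the student must learn a random teacher vector.
N = 20
teacher = rng.standard_normal(N) / np.sqrt(N)

def train(constant_derivative, steps=20000, eta=0.5):
    J = np.zeros(N)               # student weights
    for _ in range(steps):
        x = rng.standard_normal(N)
        delta = g(teacher @ x) - g(J @ x)
        # "simple method": replace g'(J @ x) by the constant 1
        deriv = 1.0 if constant_derivative else g_prime(J @ x)
        J += (eta / N) * delta * deriv * x
    return J

# Compare generalization error of both variants on fresh inputs.
Xtest = rng.standard_normal((2000, N))
err_true = np.mean((g(Xtest @ teacher) - g(Xtest @ train(False))) ** 2)
err_const = np.mean((g(Xtest @ teacher) - g(Xtest @ train(True))) ** 2)
```

Both variants converge here; the paper's claim is about the speed of convergence, which this sketch lets one probe by varying `steps`.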

刚毅 posted on 2025-3-22 02:15:26

http://reply.papertrans.cn/17/1627/162633/162633_3.png

记忆 posted on 2025-3-22 05:14:37

http://reply.papertrans.cn/17/1627/162633/162633_4.png

Goblet-Cells posted on 2025-3-22 10:56:40

Electricity Load Forecasting: A Weekday-Based Approach
…selection using autocorrelation analysis for each day of the week, and build a separate prediction model using linear regression and backpropagation neural networks. We used two years of 5-minute electricity load data for the state of New South Wales in Australia to evaluate performance. Our results sh…
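The autocorrelation-based feature selection described above can be sketched as follows. This is a minimal stand-in, not the paper's pipeline: the load series is synthetic (a daily-periodic signal standing in for the NSW data), lag selection is done once globally rather than per weekday, and only the linear-regression model is shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for 5-minute load data: 288 samples/day, 60 days.
period = 288
t = np.arange(period * 60)
load = 10 + 3 * np.sin(2 * np.pi * t / period) + 0.3 * rng.standard_normal(t.size)

def autocorr(x, lag):
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Select the most autocorrelated lags as features.
candidate_lags = range(1, 2 * period)
lags = sorted(candidate_lags, key=lambda L: -abs(autocorr(load, L)))[:5]

# Build a lagged design matrix and fit linear regression by least squares.
max_lag = max(lags)
X = np.column_stack([load[max_lag - L:-L] for L in lags])
X = np.column_stack([np.ones(X.shape[0]), X])   # intercept column
y = load[max_lag:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
mae = np.mean(np.abs(X @ coef - y))
```

With a strongly daily-periodic series, the selected lags cluster around lag 1 and the daily period, which is exactly the behaviour the weekday-based approach exploits.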

Meager posted on 2025-3-22 15:42:06

Adaptive Exploration Using Stochastic Neurons
…model-free temporal-difference learning using discrete actions. The advantage is in particular memory efficiency, because memorizing exploratory data is required only for starting states. Hence, if a learning problem consists of only one starting state, the exploratory data can be considered global.
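As a point of reference for stochastic action selection in model-free TD learning, here is a common Boltzmann (softmax) exploration scheme on a toy chain MDP. This is not the paper's stochastic-neuron architecture; the MDP, temperature `tau`, and all parameters are our own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, tau = 0.1, 0.9, 0.5   # step size, discount, temperature

def step(s, a):
    # Toy chain: action 1 moves right, action 0 moves left;
    # reaching the last state pays reward 1 and ends the episode.
    s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
    r = 1.0 if s2 == n_states - 1 else 0.0
    return s2, r, s2 == n_states - 1

def softmax_action(s):
    # Stochastic choice driven by current action values.
    p = np.exp(Q[s] / tau)
    p /= p.sum()
    return rng.choice(n_actions, p=p)

s = 0
for _ in range(5000):
    a = softmax_action(s)
    s2, r, done = step(s, a)
    target = r if done else r + gamma * Q[s2].max()
    Q[s, a] += alpha * (target - Q[s, a])   # Q-learning TD update
    s = 0 if done else s2
```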

Tinea-Capitis posted on 2025-3-22 21:01:49

Comparison of Long-Term Adaptivity for Neural Networks
…Problems occur if the system dynamics change over time (concept drift). We survey different approaches to handling concept drift and ensuring good prognosis quality over long time ranges. Two main approaches - data accumulation and ensemble learning - are explained and implemented. We compare the c…
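The data-accumulation trade-off under concept drift can be shown with a tiny toy experiment (our own construction; the ensemble-learning approach the abstract also names is not shown). A predictor that accumulates all history adapts slowly after the drift, while a sliding window forgets the old concept.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stream whose mean jumps halfway through: a minimal concept drift.
stream = np.concatenate([rng.normal(0.0, 0.1, 500),   # concept 1
                         rng.normal(2.0, 0.1, 500)])  # concept 2

def post_drift_error(window=None):
    errs, hist = [], []
    for x in stream:
        if hist:
            errs.append(abs(np.mean(hist) - x))  # predict with current model
        hist.append(x)
        if window and len(hist) > window:
            hist.pop(0)                          # forget oldest sample
    return np.mean(errs[-200:])                  # error well after the drift

err_accumulate = post_drift_error(window=None)   # keeps all data
err_window = post_drift_error(window=50)         # sliding window
```

The windowed model tracks the new concept almost immediately, while the accumulating model is still dragged toward the old mean long after the drift.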

雪崩 posted on 2025-3-22 22:48:28

http://reply.papertrans.cn/17/1627/162633/162633_8.png

TAP posted on 2025-3-23 03:36:35

A Modified Artificial Fish Swarm Algorithm for the Optimization of Extreme Learning Machines
…suffer from generalization loss caused by overfitting, so the learning process is highly biased. In this work we use the Extreme Learning Machine, an algorithm for training single-hidden-layer neural networks, and propose a novel swarm-based method for optimizing its weights and improving…
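The Extreme Learning Machine baseline being optimized can be sketched in a few lines: hidden weights are drawn at random and fixed, and only the output weights are solved in closed form by least squares. The fish-swarm optimization of the hidden weights is not shown; the regression target here is our own toy example.

```python
import numpy as np

rng = np.random.default_rng(4)

def elm_fit(X, y, n_hidden=50):
    # Random, fixed hidden layer; only beta is trained.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                        # random hidden features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression problem for illustration.
X = rng.uniform(-1, 1, (400, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
W, b, beta = elm_fit(X, y)
mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

Because only the linear output layer is fitted, training is a single least-squares solve; this speed is what makes a swarm-based outer loop over the random hidden weights affordable.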

Corral posted on 2025-3-23 09:11:09

http://reply.papertrans.cn/17/1627/162633/162633_10.png
View full version: Titlebook: Artificial Neural Networks and Machine Learning -- ICANN 2012; 22nd International C Alessandro E. P. Villa,Włodzisław Duch,Günther Pal Conf