1
Front Matter

2
Novelty Analysis in Dynamic Scene for Autonomous Mental Development
Sang-Woo Ban, Minho Lee

Abstract
We propose a new biologically motivated novelty analysis model that gives robust performance for natural scenes with an affine-transformed field of view, as well as for noisy scenes, in a dynamic visual environment, and that can play an important role in autonomous mental development. The proposed model, based on the biological visual pathway, uses topology matching of a visual scan path, obtained from a low-level top-down attention model in conjunction with a bottom-up saliency map model, to detect novelty in an input scene. In addition, the energy signature of the corresponding visual scan path is considered in deciding whether a novelty has occurred in an input scene. Computer experiments show that the proposed model successfully indicates novelty for natural color input scenes in a dynamic visual environment.

3
The Computational Model to Simulate the Progress of Perceiving Patterns in Neuron Population
Wen-Chuang Chou, Tsung-Ying Sun

Abstract
In an effort to extend recent neurobiological evidence and theories, we propose a computational framework that gradually accumulates and focuses transmitted energy as a distribution of excitation in the cortex, by means of interaction and communication between nerve cells with different attributes. In our attempts to simulate the human neural system, we found that the framework reproduces the perception pattern corresponding to what is sensed by the brain. The model successfully projects a high-dimensional signal sequence onto a lower-dimensional unique pattern, and it indicates that nerve cell bodies play a significantly active role in the central processing of the neural network, rather than acting merely as passive nonlinear input-output functions.

4
Short Term Memory and Pattern Matching with Simple Echo State Networks
Georg Fette, Julian Eggert

Abstract
Two approaches to recognizing temporal patterns have recently been proposed: the Echo State Network (ESN) by Jäger and the Liquid State Machine (LSM) by Maass. The ESN approach treats the network as a kind of “black box” and claims broad applicability to several different problems using the same principle. Here we propose a simplified version of ESNs, which we call the Simple Echo State Network (SESN), that achieves good results on memory capacity and pattern matching tasks and allows a better understanding of the capabilities and restrictions of ESNs.

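As a concrete point of reference for the echo state principle discussed above, the following is a minimal ESN sketch: a fixed random reservoir whose recurrent weights are scaled to a spectral radius below one, plus a linear readout trained by ridge regression, which is the only trained part. All sizes, scalings, and the delay-line memory task are illustrative assumptions, not details taken from the paper.

    # Minimal echo state network: fixed random reservoir, ridge-regression
    # readout. Sizes and scalings are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n_res, n_in = 100, 1

    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

    def run_reservoir(u):
        """Collect reservoir states for an input sequence u of shape (T, n_in)."""
        x, states = np.zeros(n_res), []
        for u_t in u:
            x = np.tanh(W @ x + W_in @ u_t)
            states.append(x.copy())
        return np.array(states)

    def train_readout(states, targets, ridge=1e-6):
        """Ridge regression: the readout is the only trained part of an ESN."""
        return np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                               states.T @ targets)

    # Toy short-term-memory task: reproduce the input from k steps back.
    u = rng.uniform(-1, 1, (1000, 1))
    k = 5
    S = run_reservoir(u)
    W_out = train_readout(S[k:], u[:-k])  # predictions: S @ W_out
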
5
Analytical Solution for Dynamics of Neuronal Populations
Wentao Huang, Licheng Jiao, Shiping Ma, Yuelei Xu

Abstract
The population density approach is a viable method for describing large populations of neurons and has generated considerable interest recently. The evolution in time of the population density is determined by a partial differential equation. To date, most work has been based on the population density function itself. In this paper, we propose a new function to characterize populations of excitatory and inhibitory spiking neurons and derive a novel evolution equation, a nonhomogeneous parabolic equation. Moreover, we study the stationary solution and give the firing rate of the stationary states. We then solve for the time-dependent solution using the Fourier transform; the solution can be used to analyze various behaviors of the brain.

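For orientation, the standard diffusion-approximation form of a population density equation from this literature is sketched below, with the firing rate read off as the probability flux at threshold. This is background, not the paper's new nonhomogeneous parabolic equation, and the symbols (drift mu, diffusion sigma, threshold v_th) are the generic textbook ones.

    % Fokker-Planck form of the population density evolution, with the
    % firing rate given by the flux at the spiking threshold.
    \frac{\partial \rho(v,t)}{\partial t}
      = -\frac{\partial}{\partial v}\bigl[\mu(t)\,\rho(v,t)\bigr]
      + \frac{\sigma^{2}(t)}{2}\,\frac{\partial^{2}\rho(v,t)}{\partial v^{2}},
    \qquad
    r(t) = -\frac{\sigma^{2}(t)}{2}\,
           \left.\frac{\partial \rho(v,t)}{\partial v}\right|_{v=v_{\mathrm{th}}}
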
6
Dynamics of Cortical Columns – Sensitive Decision Making
Jörg Lücke

Abstract
Based on elementary assumptions about the interconnectivity within a cortical macrocolumn, we derive a system of differential equations that models the mean neural activities of its minicolumns. A stability analysis shows a rich diversity of stationary points and sensitive behavior with respect to an inhibition parameter. If this parameter is changed continuously, the system shows the same types of bifurcations as the macrocolumn model presented in [1], which is based on explicitly defined interconnectivity and spiking neurons. Due to this behavior, the macrocolumn is able to make very sensitive decisions with respect to external input. The decision-making process can be used to induce self-organization of receptive fields, as shown in [2].

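The abstract's inhibition parameter and bifurcation behavior are easier to picture against a generic form of such mean-activity dynamics, sketched below; this is an assumed illustrative form, not the system actually derived in the paper.

    % Generic inhibitory-coupled mean-activity dynamics: a_i is the mean
    % activity of minicolumn i, I_i its external input, f a sigmoidal
    % nonlinearity, and nu the inhibition parameter whose variation can
    % drive bifurcations of the stationary points.
    \dot{a}_{i} = -a_{i} + f\!\Big(I_{i} - \nu \sum_{j \neq i} a_{j}\Big)
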
7
Dynamics of Cortical Columns – Self-organization of Receptive Fields
Jörg Lücke, Jan D. Bouecke

Abstract
We present a system of differential equations that abstractly models the neural dynamics and synaptic plasticity of a cortical macrocolumn. The equations assume inhibitory coupling between minicolumn activities and Hebbian-type synaptic plasticity of the afferents to the minicolumns. If input in the form of activity patterns is presented, self-organization of the minicolumns’ receptive fields (RFs) is induced. Self-organization is shown to appropriately classify input patterns or to extract basic constituents from input patterns consisting of superpositions of subpatterns. The latter is demonstrated using the bars benchmark test. The dynamics was motivated by the more explicit model suggested in [1] but represents a much more compact, continuous, and easier-to-analyze dynamical description.

8
Optimal Information Transmission Through Cortico-Cortical Synapses
Marcelo A. Montemurro, Stefano Panzeri

Abstract
Neurons in visual cortex receive a large fraction of their inputs from other cortical neurons with a similar stimulus preference. Here we use models of neuronal population activity and information-theoretic tools to investigate whether this arrangement of synapses allows efficient information transmission. We find that efficient transmission requires the tuning curve of the afferent neurons to be approximately as wide as the spread of stimulus preferences of the afferents reaching a target neuron. This is compatible with current neurophysiological evidence from visual cortex. We thus suggest that the organization of V1 cortico-cortical synaptic inputs allows optimal information transmission.

9
Ensemble of SVMs for Improving Brain Computer Interface P300 Speller Performances
A. Rakotomamonjy, V. Guigue, G. Mallet, V. Alvarado

Abstract
This paper addresses the problem of signal response variability within a single subject in P300 speller brain-computer interfaces. We propose a method to cope with this variability by training a separate learner for each acquisition session, where each learner consists of a channel selection procedure and a classifier. Our algorithm has been benchmarked on the data and results of the BCI 2003 competition, and we show that our approach yields state-of-the-art results.

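To make the session-wise ensemble idea concrete, here is a minimal sketch: one learner (channel selection plus a linear SVM) per acquisition session, with test-time scores averaged across learners. The function names, the toy channel-selection criterion, and the use of scikit-learn's LinearSVC are illustrative assumptions, not the authors' actual pipeline.

    # One learner per session: toy channel selection + linear SVM;
    # the ensemble averages the per-session SVM decision values.
    import numpy as np
    from sklearn.svm import LinearSVC

    def train_session_learner(X, y, n_channels=10):
        """X: (trials, channels, samples) epochs; y: +1/-1 labels."""
        # Toy criterion: keep channels whose mean response differs most
        # between target and non-target epochs.
        diff = np.abs(X[y == 1].mean(axis=(0, 2)) - X[y == -1].mean(axis=(0, 2)))
        chans = np.argsort(diff)[-n_channels:]
        clf = LinearSVC(C=0.1).fit(X[:, chans, :].reshape(len(X), -1), y)
        return chans, clf

    def ensemble_score(learners, X):
        """Average SVM decision values over all session learners."""
        return np.mean([clf.decision_function(X[:, ch, :].reshape(len(X), -1))
                        for ch, clf in learners], axis=0)
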
10
Modelling Path Integrator Recalibration Using Hippocampal Place Cells
T. Strösslin, R. Chavarriaga, D. Sheynikhovich, W. Gerstner

Abstract
The firing activities of place cells in the rat hippocampus exhibit strong correlations with the animal’s location. External (e.g. visual) as well as internal (proprioceptive and vestibular) sensory information takes part in controlling hippocampal place fields. It has previously been observed that when rats shuttle between a movable origin and a fixed target, the hippocampus encodes position in two different frames of reference. This paper presents a new model of hippocampal place cells that explains place coding in multiple reference frames through continuous interaction between visual and self-motion information. The model is tested using a simulated mobile robot in a real-world experimental paradigm.

11
Coding of Objects in Low-Level Visual Cortical Areas
N. R. Taylor, M. Hartley, J. G. Taylor

Abstract
We develop a neural network architecture to help model the creation of visual temporal object representations. We take visual input to be hard-wired up to and including V1 (as an orientation-filtering system). We then develop architectures for afferents to V2 and thence to V4, both trained by a causal Hebbian law. We use an incremental approach, presenting sequences of increasingly complex stimuli at increasing levels of the hierarchy. The V2 representations are shown to encode angles, and V4 is found to be sensitive to shapes embedded in figures. These results are compared with recent experimental data, supporting the incremental training scheme and the associated architecture.

12
A Gradient Rule for the Plasticity of a Neuron’s Intrinsic Excitability
Jochen Triesch

Abstract
While synaptic learning mechanisms have always been a core topic of neural computation research, there has been relatively little work on intrinsic learning processes, which change a neuron’s excitability. Here, we study a single continuous-activation model neuron and derive a gradient rule for intrinsic plasticity, based on information theory, that allows the neuron to bring its firing-rate distribution into an approximately exponential regime, as observed in visual cortical neurons. In simulations, we show that the rule works efficiently.

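The abstract does not spell the rule out, so the sketch below shows an intrinsic-plasticity update of the kind it describes, for a sigmoid neuron y = 1/(1 + exp(-(a*x + b))) whose gain a and bias b are adapted so that the output distribution approaches an exponential with mean mu. The coefficients follow the published form of this rule as best reconstructed here and should be treated as an assumption rather than a verified transcription.

    # Intrinsic plasticity: gradient descent on the KL divergence between
    # the neuron's output distribution and an exponential with mean mu.
    import numpy as np

    def ip_step(x, a, b, mu=0.1, eta=1e-3):
        """One online update of gain a and bias b for input sample x."""
        y = 1.0 / (1.0 + np.exp(-(a * x + b)))
        db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
        da = eta / a + x * db  # gain update reuses the bias-update term
        return a + da, b + db
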
13
Building the Cerebellum in a Computer
Tadashi Yamazaki, Shigeru Tanaka

Abstract
We have built a realistic computational model of the cerebellum. The model simulates a 0.5 mm × 1 mm patch of cerebellar cortex consisting of several types of neurons, modeled as conductance-based leaky integrate-and-fire units with realistic parameter values adopted from known anatomical and physiological data. We demonstrate that the recurrent inhibitory circuit composed of granule and Golgi cells can represent the passage of time through the population of active granule cells, which we call “the cerebellar internal clock”. We also demonstrate that our model can explain Pavlovian eyelid conditioning, in which the cerebellar internal clock plays an important role.

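For reference, the textbook form of the conductance-based leaky integrate-and-fire unit mentioned above is given below; the symbols are generic, and the paper's actual parameter values come from anatomical and physiological data not reproduced here.

    % Conductance-based leaky integrate-and-fire unit: membrane capacitance C,
    % leak conductance g_L, synaptic conductances g_s(t) with reversal
    % potentials E_s; the membrane potential resets after reaching threshold.
    C\,\frac{dV}{dt} = -g_{L}\,(V - E_{L}) - \sum_{s} g_{s}(t)\,(V - E_{s}),
    \qquad V \ge V_{\mathrm{th}} \;\Rightarrow\; V \leftarrow V_{\mathrm{reset}}
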
14
Combining Attention and Value Maps
Stathis Kasderidis, John G. Taylor

Abstract
We present an approach that combines attention with value maps for the purpose of acquiring a decision-making policy for multiple concurrent goals. The former component is essential for dealing with an uncertain and open environment, while the latter offers a general model for building decision-making systems based on reward information. We discuss the multiple-goal policy acquisition problem and justify our approach. We provide simulation results that support our solution.

15
Neural Network with Memory and Cognitive Functions
Janusz A. Starzyk, Yue Li, David D. Vogel

Abstract
This paper provides an analysis of a new class of distributed memories known as R-nets. These networks are similar to Hebbian networks, but are relatively sparsely connected. R-nets use simple binary neurons and trained links between excitatory and inhibitory neurons, and they use inhibition to prevent neurons not associated with a recalled pattern from firing. They are shown to implement associative learning and to be able to store sequential patterns, as used in networks with higher cognitive functions. This work explores the statistical properties of such networks in terms of storage capacity as a function of R-net topology and the employed learning and recall mechanisms.

16
Associative Learning in Hierarchical Self Organizing Learning Arrays
Janusz A. Starzyk, Zhen Zhu, Yue Li

Abstract
In this paper we introduce feedback-based associative learning in self-organized learning arrays (SOLAR). SOLAR structures are hierarchically organized and can classify patterns in a network of sparsely connected neurons. These neurons may define their own functions and select their interconnections locally, thus satisfying some of the requirements for biologically plausible intelligent structures. Feed-forward processing is used to make the necessary correlations and learn the input patterns. Associations between neuron inputs are used to generate feedback signals, and these feedback signals, when propagated to the associated inputs, can establish the expected input values. This can be used for hetero- and auto-associative learning and pattern recognition.

17
A Review of Cognitive Processing in the Brain
John G. Taylor

Abstract
A review of cognitive processing in the brain is presented, using insights from brain science to extract general principles. After a discussion of the nature of cognition, the triumvirate of basic brain functions (attention control, reinforcement learning, and memory) is explored with respect to how these functions could support cognitive processes, and certain cognitive principles are developed from them. Several specific cognitive tasks are discussed, and simulations of them are considered that support the effectiveness of the approach.

18
Neuronal Behavior with Sub-threshold Oscillations and Spiking/Bursting Activity Using a Piecewise Linear Two-Dimensional Map
Carlos Aguirre, Doris Campos, Pedro Pascual, Eduardo Serrano

Abstract
A phenomenological neuronal model based on a coupled piecewise linear two-dimensional map is presented. The model mimics many neuronal features, such as spiking, bursting, and subthreshold activity. It requires less computational effort than most phenomenological or differential neuronal models, and its behavior is consistent with that of those models. The regimes of synchronization of a pair of coupled maps are also explored.

19
On-Line Real-Time Oriented Application for Neuronal Spike Sorting with Unsupervised Learning
Yoshiyuki Asai, Tetyana I. Aksenova, Alessandro E. P. Villa

Abstract
Multisite electrophysiological recordings have become a standard tool for exploring brain functions, and these techniques highlight the need for fast and reliable unsupervised spike sorting. We present an algorithm that performs on-line real-time spike sorting on data streaming from a data acquisition board, or in off-line mode on a WAV-formatted file. Spike shapes are represented in a phase space defined by the first and second derivatives of the signal trace. The output of the application is a spike data file in which the timings of spike occurrences are recorded as inter-spike intervals. This allows the application to be used for studying neuronal activity patterns in clinically recorded data.

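A small sketch of the phase-space representation described above: the signal trace is mapped to the plane spanned by its first and second derivatives, and each detected spike becomes a short trajectory in that plane. The function name, the window width, and the use of numpy.gradient are illustrative assumptions.

    # Represent spikes by their trajectories in the (dV/dt, d2V/dt2) plane.
    import numpy as np

    def spike_phase_space(trace, spike_idx, dt, half_width=16):
        """trace: voltage samples; spike_idx: interior sample indices of spikes."""
        d1 = np.gradient(trace, dt)  # first derivative of the signal trace
        d2 = np.gradient(d1, dt)     # second derivative
        return [np.stack([d1[i - half_width:i + half_width],
                          d2[i - half_width:i + half_width]])
                for i in spike_idx]
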
20
A Spiking Neural Sparse Distributed Memory Implementation for Learning and Predicting Temporal Sequences
J. Bose, S. B. Furber, J. L. Shapiro

Abstract
In this paper we present a neural sequence machine that can learn temporal sequences of discrete symbols and performs better than machines that use Elman’s context layer, time-delay nets, or shift-register-like context memories. The machine can perform sequence detection, prediction, and learning of new sequences. The network model is an associative memory with a separate store for the sequence context of a pattern. Learning is one-shot, and the model is capable of both off-line and on-line learning. The machine is based on a sparse distributed memory, which is used to store associations between the current context and the input symbol. Numerical tests have been performed to verify the machine’s properties, and we have also shown that the memory can be implemented using spiking neurons.

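To make the sparse distributed memory component concrete, below is a plain Kanerva-style SDM sketch: random hard locations activated within a Hamming radius of the address, with counter-based contents and majority-vote recall. The paper's machine adds a separate context store and a spiking-neuron implementation, which this sketch does not attempt; all constants are illustrative assumptions.

    # Kanerva-style sparse distributed memory over binary vectors.
    import numpy as np

    rng = np.random.default_rng(0)
    N_BITS, N_LOC, RADIUS = 256, 2000, 112

    addresses = rng.integers(0, 2, (N_LOC, N_BITS))  # fixed hard locations
    counters = np.zeros((N_LOC, N_BITS), dtype=int)  # trainable contents

    def active(addr):
        """Hard locations within Hamming radius RADIUS of the address."""
        return np.count_nonzero(addresses != addr, axis=1) <= RADIUS

    def write(addr, data):
        """Store a 0/1 pattern as +1/-1 counter updates at active locations."""
        counters[active(addr)] += 2 * data - 1

    def read(addr):
        """Recall by majority vote over the counters of active locations."""
        return (counters[active(addr)].sum(axis=0) > 0).astype(int)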