Results 1–10 of 11
Campenhout. Linking Non-binned Spike Train Kernels to Several Existing Spike Train Metrics
 Neurocomputing
Computing with Spiking Neuron Networks
Cited by 6 (0 self)
Abstract
Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of neural networks. Strongly inspired by natural computing in the brain and by recent advances in neuroscience, they derive their strength and interest from an accurate modeling of synaptic interactions between neurons, taking into account the time of spike firing. SNNs exceed the computational power of neural networks made of threshold or sigmoidal units. Based on dynamic event-driven processing, they open up new horizons for developing models with an exponential capacity for memorizing and a strong ability for fast adaptation. Today, the main challenge is to discover efficient learning rules that take advantage of the specific features of SNNs while keeping the nice properties (general-purpose, easy-to-use, available simulators, etc.) of traditional connectionist models. This chapter relates the history of the “spiking neuron” in Section 1 and summarizes the models of neurons and synaptic plasticity most currently in use in Section 2. The computational power of SNNs is addressed in Section 3, and the problem of learning in networks of spiking neurons is tackled in Section 4, with insights into the tracks currently explored for solving it. Finally, Section 5 discusses application domains and implementation issues and proposes several simulation frameworks.
Spike-timing error backpropagation in theta neuron networks
 Neural Comput
, 2009
Cited by 6 (1 self)
Abstract
The main contribution of this paper is the derivation of a steepest-gradient-descent learning rule for a multilayer network of theta neurons, a one-dimensional nonlinear neuron model. Central to our model is the assumption that the intrinsic neuron dynamics are sufficient to achieve consistent time coding, with no need to involve the precise shape of postsynaptic currents; this assumption departs from other related models such as SpikeProp and Tempotron learning. Our results clearly show that it is possible to perform complex computations by applying supervised learning techniques to the spike times and time-response properties of nonlinear integrate-and-fire neurons. Networks trained with our multilayer training rule are shown to have generalization abilities for spike-latency pattern classification similar to those of Tempotron learning. The rule is also able to train networks to perform complex regression tasks that neither SpikeProp nor Tempotron learning appears capable of.
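The theta neuron named in this abstract reduces the neuron's state to a single phase variable θ, with a spike registered when θ crosses π. As an illustrative sketch only (not the paper's learning rule, and with arbitrarily chosen parameter values), the model can be forward-simulated with Euler integration:

```python
import math

def simulate_theta_neuron(current, dt=1e-3, t_max=5.0, alpha=1.0):
    """Euler-integrate the theta neuron
        dθ/dt = (1 - cos θ) + α (1 + cos θ) · I
    and return the times at which the phase θ crosses π (spikes)."""
    theta = 0.0  # arbitrary initial phase
    spikes = []
    for k in range(int(t_max / dt)):
        dtheta = (1.0 - math.cos(theta)) + alpha * (1.0 + math.cos(theta)) * current
        theta += dt * dtheta
        if theta >= math.pi:           # phase crossed π: register a spike
            spikes.append((k + 1) * dt)
            theta -= 2.0 * math.pi     # wrap the phase back
    return spikes

# With positive drive the phase advances through π repeatedly (tonic firing);
# with negative drive the phase settles at a fixed point and no spike occurs.
print(simulate_theta_neuron(5.0))   # a few spike times within 5 s
print(simulate_theta_neuron(-1.0))  # []
```

For constant drive I > 0 the right-hand side is strictly positive, so the neuron fires periodically; for I < 0 the dynamics have a stable fixed point, which is the "intrinsic dynamics" property the paper builds on.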
Spiking Neural Networks for Cortical Neuronal Spike Train (Letter, communicated by Sander M. Bohte)
Abstract
Recent investigation of cortical coding and computation indicates that temporal coding is probably a more biologically plausible scheme used by neurons than the rate coding commonly used in most published work. We propose and demonstrate in this letter that spiking neural networks (SNN), consisting of spiking neurons that propagate information by the timing of spikes, are a better alternative to coding schemes based on spike frequency (histograms) alone. The SNN model analyzes cortical neural spike trains directly, without losing temporal information, to generate more reliable motor commands for cortically controlled prosthetics. In this letter, we compare the temporal pattern classification results from the SNN approach with results generated from firing-rate-based approaches: conventional artificial neural networks, support vector machines, and linear regression. The results show that the SNN algorithm can achieve higher classification accuracy and identify the spiking activity related to movement control earlier than the other methods. Both are desirable characteristics for fast neural information processing and reliable control command pattern recognition for neuroprosthetic applications.
Supervised Learning in Multilayer Spiking Neural Networks
Abstract
The current article introduces a supervised learning algorithm for multilayer feedforward spiking neural networks. The algorithm presented here overcomes some limitations of existing learning algorithms: it can be applied to neurons firing multiple spikes, and it can in principle be applied to any linearisable neuron model. The algorithm is applied successfully to various benchmarks, such as the XOR problem and the Iris data set, as well as complex classification problems. The simulations also show the flexibility of this supervised learning algorithm, which permits different encodings of the spike timing patterns, including precise spike train encoding.
Prediction of Single Neuron Spiking Activity using an Optimized Nonlinear Dynamic Model
Abstract
The increasing need for knowledge in the treatment of brain diseases has driven a huge interest in understanding the phenomenon of neural spiking. Researchers have successfully been able to create mathematical models which, with specific parameters, are able to reproduce experimental neuronal responses. The spiking activity is characterized using spike trains, and it is essential to develop methods for parameter estimation that rely solely on the spike times or inter-spike intervals (ISI). In this paper we describe a new technique for optimization of a single-neuron model using an experimental spike train from a biological neuron. We are able to fit model parameters using the gradient descent method. The optimized model is then used to predict the activity of the biological neuron, and the performance is quantified using a spike distance measure.
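The abstract quantifies prediction quality with an unnamed "spike distance measure". One common choice for comparing spike trains by their times alone is the van Rossum distance; the sketch below is illustrative of that family of measures, not necessarily the one the paper uses, and the time constant τ is an assumed parameter:

```python
import math

def van_rossum_distance(spikes_a, spikes_b, tau=0.01):
    """van Rossum distance between two spike trains (times in seconds).
    Each spike is filtered with a causal exponential kernel exp(-t/τ); the
    L2 distance between the filtered traces has the closed form
        D² = ½ Σ_{i,j∈A} e^{-|a_i - a_j|/τ} + ½ Σ_{i,j∈B} e^{-|b_i - b_j|/τ}
             - Σ_{i∈A, j∈B} e^{-|a_i - b_j|/τ}."""
    def corr(x, y):
        return sum(math.exp(-abs(s - t) / tau) for s in x for t in y)
    d2 = (0.5 * corr(spikes_a, spikes_a)
          + 0.5 * corr(spikes_b, spikes_b)
          - corr(spikes_a, spikes_b))
    return math.sqrt(max(d2, 0.0))  # clamp tiny negative rounding error

print(van_rossum_distance([0.010, 0.020], [0.010, 0.020]))  # identical trains -> 0.0
print(van_rossum_distance([0.010, 0.020], [0.012, 0.030]))  # mismatched trains -> positive
```

Small τ makes the distance sensitive to precise spike timing; large τ makes it behave more like a rate comparison, which matches the paper's emphasis on fitting from spike times rather than rates.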
unknown title
Abstract
In this paper we develop and analyze Spiking Neural Network (SNN) versions of Resilient Propagation (RProp) and QuickProp, both training methods used to speed up training in Artificial Neural Networks (ANNs) by making certain assumptions about the data and the error surface. Modifications are made to both algorithms to adapt them to SNNs. Results generated on the standard XOR and Fisher Iris data sets using the QuickProp and RProp versions of SpikeProp are shown to converge to a final error of 0.5 an average of 80% faster than using SpikeProp on its own.
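RProp, which this abstract adapts to SNNs, updates each weight from the sign of successive gradients rather than their magnitude. A minimal single-weight sketch of the plain RProp- rule (not the paper's SpikeProp-specific variant; the hyperparameters are the commonly quoted defaults):

```python
def rprop_step(w, grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
               step_max=50.0, step_min=1e-6):
    """One RProp- update for a single weight: the per-weight step size grows
    while the gradient keeps its sign and shrinks when the sign flips;
    only the sign of the gradient moves the weight."""
    if grad * prev_grad > 0:              # same sign: accelerate
        step = min(step * eta_plus, step_max)
    elif grad * prev_grad < 0:            # sign flip: overshoot, back off
        step = max(step * eta_minus, step_min)
        grad = 0.0                        # RProp-: skip the update after a flip
    if grad > 0:
        w -= step
    elif grad < 0:
        w += step
    return w, grad, step

# Minimizing f(w) = w² (gradient 2w) starting from w = 3:
w, prev_g, step = 3.0, 0.0, 0.1
for _ in range(100):
    w, prev_g, step = rprop_step(w, 2.0 * w, prev_g, step)
print(abs(w) < 0.1)  # True: w ends close to the minimum at 0
```

Because only gradient signs are used, the rule is insensitive to the scale of the error surface, which is the property that makes it attractive for speeding up SpikeProp-style training.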
unknown title
Abstract
In this paper, we train a one-layer Theta Neuron Network (TNN) to perform a Braitenberg obstacle avoidance algorithm on a Khepera robot. The theta neuron model is more biologically plausible than the leaky integrate-and-fire model typically used in spiking neural networks. Our motivation is to determine whether the dynamical properties of the theta neuron model can be leveraged to increase noise robustness in an embedded application. We compare Khepera obstacle avoidance results from traditional Artificial Neural Network and TNN implementations under different levels of sensor noise. As the noise increases, the performance of the TNN is the least affected. At high noise levels, the ANN and Braitenberg implementations calculate the incorrect turn direction 42% more often than the TNN and deviate from a straight-path trajectory over 10 times as far. The results demonstrate that TNNs warrant further development for engineering applications.
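The Braitenberg controller used as the baseline in this abstract is a direct, cross-coupled sensor-to-motor mapping. As a generic sketch of that idea only (the gains, base speed, and normalized sensor scale are hypothetical, not the paper's Khepera configuration):

```python
def braitenberg_avoid(left_prox, right_prox, base_speed=0.5, gain=1.0):
    """Minimal Braitenberg-style avoidance: each proximity sensor excites the
    same-side motor and inhibits the opposite-side motor, so the robot turns
    away from the nearer obstacle. Inputs are normalized proximity readings
    in [0, 1], where 1 means very close."""
    left_motor = base_speed + gain * left_prox - gain * right_prox
    right_motor = base_speed + gain * right_prox - gain * left_prox
    return left_motor, right_motor

# Obstacle on the left: the left motor runs faster than the right,
# so the robot turns right, away from the obstacle.
print(braitenberg_avoid(0.8, 0.1))
```

The noise sensitivity the paper measures follows directly from this structure: the turn direction is decided by the sign of a sensor difference, so sensor noise flips it easily, which is what the TNN's intrinsic dynamics are shown to mitigate.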
On Leveraging the Dynamical Properties of Nonlinear Spiking Neurons
, 2007