Results 1–10 of 50,261
Attractor Neural Networks with Hypercolumns
In LNCS Vol. 2415: Proceedings of the International Conference on Artificial Neural Networks, 2002
Cited by 4 (4 self)
Abstract: "We investigate attractor neural networks with a modular structure, where a local winner-takes-all rule acts within the modules (called hypercolumns). We make a signal-to-noise analysis of storage capacity and noise tolerance, and compare the results with those from simulations."
Conditions for the emergence of spatially asymmetric retrieval states in attractor neural networks
 Central European Journal of Physics
Abstract: "... an attractor neural network ..."
Balancing Stabilization and Inhibition in Continuous Attractor Neural Networks
Abstract: "Attractor Neural Networks (ANN) are neural nets with recurrent connections strong enough ..."
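Several of the entries above concern Hopfield-style attractor networks: recurrent nets whose Hebbian couplings make stored patterns fixed points of the dynamics, so a noisy input relaxes to the nearest memory. As a generic textbook sketch (not taken from any of the listed papers; the sizes `N`, `P` and the 10% noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of binary (+/-1) neurons (illustrative choice)
P = 5    # number of stored patterns; P/N = 0.05 is well below capacity

# Random patterns to store
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-interaction
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(state, sweeps=20):
    """Asynchronous deterministic dynamics: s_i <- sign(sum_j J_ij s_j)."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern by flipping 10% of its bits, then retrieve it
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1

retrieved = recall(probe)
overlap = retrieved @ patterns[0] / N  # overlap m = 1 means perfect retrieval
print(overlap)
```

At this low loading the corrupted probe lies well inside the basin of attraction of the stored pattern, so the final overlap is close to 1; the "strong enough" recurrent connections are exactly what creates these attracting fixed points.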
Synaptic Efficacies of an Attractor Neural Network
Abstract: "The synaptic efficacies of an attractor neural network are calculated starting off with a biologically plausible ansatz. The new method can be applied both in the case of vanishing and non-vanishing self-interactions of the neurons. In the case of non-vanishing self-interactions, our method and the well-k ..."
Finite connectivity attractor neural networks
, 2003
Abstract: "We study a family of diluted attractor neural networks with a finite average number of (symmetric) connections per neuron. As in finite connectivity spin glasses, their equilibrium properties are described by order parameter functions, for which we derive an integral equation in replica symmetric ap ..."
History-dependent Attractor Neural Networks
 Network
, 1997
Cited by 3 (3 self)
Abstract: "We present a methodological framework enabling a detailed description of the performance of Hopfield-like attractor neural networks (ANN) in the first two iterations. Using the Bayesian approach, we find that performance is improved when a history-based term is included in the neuron's dyna ..."
Optimal signalling in Attractor Neural Networks
 Network
, 1994
Cited by 3 (3 self)
Abstract: "In [Meilijson and Ruppin, 1993a] we presented a methodological framework describing the two-iteration performance of Hopfield-like attractor neural networks with history-dependent, Bayesian dynamics. We now extend this analysis in a number of directions: input patterns applied to small subsets of ..."
(1 + ∞)-dimensional attractor neural networks
, 2000
Abstract: "We solve a class of attractor neural network models with a mixture of 1D nearest-neighbour interactions and infinite-range interactions, which are both of a Hebbian-type form. Our solution is based on a combination of mean-field methods, transfer matrices, and 1D random-field techniques, an ..."
1+∞ Dimensional Attractor Neural Networks
Abstract: "We solve a class of attractor neural network models with a mixture of 1D nearest-neighbour interactions and infinite-range interactions, which are both of a Hebbian-type form. Our solution is based on a combination of mean-field methods, transfer matrices, and 1D random-field techniques, and is obtai ..."
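The two entries above study couplings that mix a 1D nearest-neighbour term with an infinite-range term, both Hebbian. A minimal sketch of how such a (1 + ∞)-dimensional coupling matrix can be built (the ring topology, sizes, and the choice of equal weighting between the two terms are illustrative assumptions, not details from the papers):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 3
xi = rng.choice([-1, 1], size=(P, N))  # P stored patterns over N spins

# Infinite-range Hebbian part: every pair interacts, strength O(1/N)
J_long = xi.T @ xi / N
np.fill_diagonal(J_long, 0.0)

# 1D nearest-neighbour Hebbian part on a ring: only i and i+1 interact
J_short = np.zeros((N, N))
for i in range(N):
    j = (i + 1) % N
    w = float(np.dot(xi[:, i], xi[:, j]))  # Hebbian weight between neighbours
    J_short[i, j] = J_short[j, i] = w

# Combined (1 + infinity)-dimensional couplings: symmetric, zero diagonal
J = J_long + J_short
```

The long-range part alone is the standard Hopfield model; the short-range part alone is a 1D chain solvable by transfer matrices, which is why the papers' solutions combine mean-field methods with transfer-matrix and random-field techniques.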
Compensatory mechanisms in an attractor neural network model of Schizophrenia
 Neural Computation
, 1994
Cited by 15 (8 self)
Abstract: "We investigate the effect of synaptic compensation on the dynamic behavior of an attractor neural network receiving its input stimuli as external fields projecting on the network. It is shown how, in the face of weakened inputs, memory performance may be preserved by strengthening internal synaptic conn ..."