Results 1 - 10 of 146,875
Distributed representations, simple recurrent networks, and grammatical structure
 Machine Learning
, 1991
Cited by 394 (17 self)
"... Abstract. In this paper three problems for a connectionist account of language are considered: 1. What is the nature of linguistic representations? 2. How can complex structural relationships such as constituent structure be represented? 3. How can the apparently open-ended nature of language be accommodated by a fixed-resource system? Using a prediction task, a simple recurrent network (SRN) is trained on multi-clausal sentences which contain multiply-embedded relative clauses. Principal component analysis of the hidden unit activation patterns reveals that the network solves the task by developing ..."
A Learning Algorithm for Continually Running Fully Recurrent Neural Networks
, 1989
Cited by 525 (4 self)
"... The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks. These algorithms have: (1) the advantage that they do not require a precisely ..."
Recurrent networks
, 2001
"... that when presented with a new pattern z, the network responds by producing whichever one of the stored patterns most closely resembles z." (Hertz et al., 1991, p. 11). The set of patterns is given by {x_1, x_2, ..., x_p}; the nodes in the network are labeled 1, 2, ..., N. A pattern of ..."
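The retrieval behaviour quoted from Hertz et al. can be sketched with a small Hopfield-style network: store the patterns with the Hebbian outer-product rule, then iterate threshold updates from the probe until the state settles on the nearest stored pattern. Pattern sizes and the synchronous update schedule here are illustrative choices, not taken from the cited text.

```python
import numpy as np

def store(patterns):
    # Hebbian outer-product rule: W = sum_mu x_mu x_mu^T, zero diagonal.
    X = np.array(patterns, dtype=float)   # shape (p, N), entries +/-1
    W = X.T @ X
    np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, z, steps=10):
    # Synchronous updates s <- sign(W s); for small p/N the state
    # settles on the stored pattern most closely resembling probe z.
    s = np.array(z, dtype=float)
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s
```

Probing with a corrupted copy of a stored pattern recovers the clean pattern, which is exactly the "responds with whichever stored pattern most closely resembles z" behaviour described above.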
Connection Reduction of the Recurrent Networks
 In Proceedings of ICONIP'95, Beijing
, 1995
Cited by 1 (1 self)
"... There are many variations on the topology of recurrent networks. Models with fully-connected recurrent weights may not be superior to those models with sparsely connected recurrent weights in terms of capacity, time for training and generalization ability. In this paper, we show that the fully-connected ..."
Holographic recurrent networks
 Advances in Neural Information Processing Systems 5
, 1993
Cited by 15 (1 self)
"... Holographic Recurrent Networks (HRNs) are recurrent networks which incorporate associative memory techniques for storing sequential structure. HRNs can be easily and quickly trained using gradient descent techniques to generate sequences of discrete outputs and trajectories through continuous space ..."
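The associative-memory primitive behind holographic representations is circular convolution: convolving a cue with an item binds them into a fixed-size trace, and correlating the cue with the trace recovers a noisy approximation of the item. The sketch below shows only that binding/unbinding primitive (via FFT), not the recurrent architecture of the paper; the dimensionality and distributions are illustrative assumptions.

```python
import numpy as np

def cconv(a, b):
    # Circular convolution: binds vectors a and b into one trace.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    # Circular correlation: approximate inverse of cconv, used to
    # unbind an approximation of b from the trace given cue a.
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))
```

Decoding is only approximate: for random vectors with elements drawn from N(0, 1/n), the unbound vector is highly correlated with the stored one, and a clean-up memory would normally snap it back to the nearest known item.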
Supervised Learning in Recurrent Networks
, 1995
"... this article is the supervised learning algorithms for training recurrent networks to perform temporal tasks. The problem setup is similar to the case of feedforward ..."
On the Generalization Ability of Recurrent Networks
 Artificial Neural Networks - ICANN'2001
, 2001
Cited by 3 (3 self)
"... The generalization ability of discrete-time partially recurrent networks is examined. It is well known that the VC dimension of recurrent networks is infinite in most interesting cases and hence the standard VC analysis cannot be applied directly. We find guarantees for specific situations where ..."
Finite State Automata and Simple Recurrent Networks
Cited by 166 (10 self)
"... Figure 1: The simple recurrent network (Elman 1988). In the SRN, the pattern of activation on the hidden units at time step t-1, together with the new input pattern, is allowed to influence the pattern of activation at time step t. This is achieved by copying the pattern of activation on the hidden ..."
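The copy mechanism this snippet describes can be sketched in a few lines: the hidden activation from step t-1 is copied into a context layer and fed back alongside the new input at step t. Dimensions, weights, and the random seed below are illustrative, not Elman's.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 3, 5
W_xh = rng.normal(scale=0.5, size=(n_hid, n_in))   # input -> hidden
W_hh = rng.normal(scale=0.5, size=(n_hid, n_hid))  # context -> hidden

def srn_states(inputs):
    """Forward pass of an Elman-style SRN; returns hidden states."""
    h = np.zeros(n_hid)             # context layer starts at zero
    states = []
    for x in inputs:
        context = h.copy()          # copy hidden activation from t-1
        h = np.tanh(W_xh @ x + W_hh @ context)
        states.append(h)
    return states
```

Because the context layer carries the previous hidden state, two sequences that end in the same input still produce different final hidden states, which is what lets the network condition its predictions on history.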
Divisive Inhibition in Recurrent Networks
 Network
, 2000
Cited by 11 (0 self)
"... Models of visual cortex suggest that response selectivity can arise from recurrent networks operating at high gain. However, such networks have a number of problematic features: 1) they operate perilously close to a point of instability, 2) small changes in synaptic strength can dramatically modify ..."