Finding structure in time (1990)

by Jeffrey L. Elman
Venue: COGNITIVE SCIENCE
Citations: 2071 (23 self)

BibTeX

@ARTICLE{Elman90findingstructure,
    author = {Jeffrey L. Elman},
    title = {Finding structure in time},
    journal = {COGNITIVE SCIENCE},
    year = {1990},
    volume = {14},
    number = {2},
    pages = {179--211}
}


Abstract

Time underlies many interesting human behaviors. Thus, the question of how to represent time in connectionist models is very important. One approach is to represent time implicitly by its effects on processing rather than explicitly (as in a spatial representation). The current report develops a proposal along these lines first described by Jordan (1986) which involves the use of recurrent links in order to provide networks with a dynamic memory. In this approach, hidden unit patterns are fed back to themselves; the internal representations which develop thus reflect task demands in the context of prior internal states. A set of simulations is reported which range from relatively simple problems (temporal version of XOR) to discovering syntactic/semantic features for words. The networks are able to learn interesting internal representations which incorporate task demands with memory demands; indeed, in this approach the notion of memory is inextricably bound up with task processing. These representations reveal a rich structure, which allows them to be highly context-dependent while also expressing generalizations across classes of items. These representations suggest a method for representing lexical categories and the type/token distinction.
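The architecture sketched in the abstract — hidden-unit activations copied into "context" units and fed back as extra inputs on the next time step — can be illustrated with a minimal numpy sketch. This is an assumption-laden reconstruction, not the paper's original code: the layer sizes, learning rate, and stream length are arbitrary choices, and training backpropagates only one step, treating the context units as frozen inputs (no backpropagation through time), in the spirit of the approach described. The task is the temporal XOR from the simulations: a bit stream in which every third bit is the XOR of the two preceding bits, with the network predicting the next bit.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_stream(n_triples):
    # Bit stream where every third bit is the XOR of the previous two.
    a = rng.integers(0, 2, n_triples)
    b = rng.integers(0, 2, n_triples)
    return np.stack([a, b, a ^ b], axis=1).reshape(-1).astype(float)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes: 1 input unit, 8 hidden units, 1 output unit.
n_in, n_hid, n_out = 1, 8, 1
W_xh = rng.normal(0, 0.5, (n_in, n_hid))   # input  -> hidden
W_ch = rng.normal(0, 0.5, (n_hid, n_hid))  # context -> hidden (recurrent copy)
W_hy = rng.normal(0, 0.5, (n_hid, n_out))  # hidden -> output
b_h = np.zeros(n_hid)
b_y = np.zeros(n_out)

lr = 0.5
stream = make_stream(3000)
context = np.zeros(n_hid)  # context units start at zero

for t in range(len(stream) - 1):
    x = stream[t:t + 1]
    target = stream[t + 1:t + 2]  # predict the next bit in the stream
    h = sigmoid(x @ W_xh + context @ W_ch + b_h)
    y = sigmoid(h @ W_hy + b_y)
    # One-step backprop: context units are treated as ordinary inputs,
    # so no gradient flows backward through earlier time steps.
    dy = (y - target) * y * (1 - y)
    dh = (dy @ W_hy.T) * h * (1 - h)
    W_hy -= lr * np.outer(h, dy)
    b_y -= lr * dy
    W_xh -= lr * np.outer(x, dh)
    W_ch -= lr * np.outer(context, dh)
    b_h -= lr * dh
    context = h  # copy hidden state into context for the next step
```

After training on such a stream, prediction error should drop on the deterministic (every third) positions while remaining near chance on the random bits, since only the XOR bit is predictable from the prior context.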

Keyphrases

task demand, recurrent link, interesting internal representation, temporal version, task processing, prior internal state, type/token distinction, syntactic/semantic feature, spatial representation, memory demand, internal representation, human behavior, rich structure, connectionist model, simple problem, hidden unit pattern, lexical category, dynamic memory
