CiteSeerX

Results 1 - 10 of 1,942

A New Statistical Parser Based on Bigram Lexical Dependencies

by Michael John Collins, 1996
"... This paper describes a new statistical parser which is based on probabilities of dependencies between head-words in the parse tree. Standard bigram probability estimation techniques are extended to calculate probabilities of dependencies between pairs of words. Tests using Wall Street Journal ..."
Abstract - Cited by 490 (4 self)
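The estimation idea this snippet describes can be made concrete with a toy. The sketch below estimates P(modifier-word | head-word) from counted head-modifier pairs with add-one smoothing; it is a hypothetical simplification, not Collins' actual model, which also conditions on part-of-speech tags, distance, and direction.

```python
from collections import defaultdict

def train_dependency_probs(parses):
    """Add-one-smoothed estimate of P(modifier-word | head-word) from
    head-modifier pairs. A toy stand-in for the paper's estimator."""
    pair_counts = defaultdict(int)
    head_counts = defaultdict(int)
    vocab = set()
    for parse in parses:                    # parse = list of (head, modifier) pairs
        for head, mod in parse:
            pair_counts[(head, mod)] += 1
            head_counts[head] += 1
            vocab.update((head, mod))
    vocab_size = len(vocab)

    def prob(head, mod):
        return (pair_counts[(head, mod)] + 1) / (head_counts[head] + vocab_size)

    return prob

# Toy usage on two tiny dependency "parses".
prob = train_dependency_probs([
    [("bought", "IBM"), ("bought", "Lotus")],
    [("bought", "shares")],
])
print(prob("bought", "IBM"))   # (1 + 1) / (3 + 4) ≈ 0.286
```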

A New Statistical Parser Based on Bigram Lexical Dependencies

by unknown authors, 1996
"... This paper describes a new statistical parser which is based on probabilities of dependencies between head-words in the parse tree. Standard bigram probability estimation techniques are extended to calculate probabilities of dependencies between pairs of words. Tests using Wall Street Journal data s ..."
Abstract

Feature-Rich Part-of-Speech Tagging with a Cyclic Dependency Network

by Kristina Toutanova, Dan Klein, Christopher D. Manning, Yoram Singer - In Proceedings of HLT-NAACL, 2003
"... We present a new part-of-speech tagger that demonstrates the following ideas: (i) explicit use of both preceding and following tag contexts via a dependency network representation, (ii) broad use of lexical features, including jointly conditioning on multiple consecutive words, (iii) effective ..."
Abstract - Cited by 693 (23 self)
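Idea (i), scoring a tag by both its preceding and following tag context, can be illustrated with a brute-force decoder. The sketch below scores a tag sequence as a product of local conditionals P(t_i | t_{i-1}, t_{i+1}, w_i) and enumerates all sequences; the toy conditional table and the exhaustive search are assumptions for illustration only (the paper decodes with dynamic programming).

```python
import itertools

def decode(words, tagset, local_prob):
    """Score each tag sequence as the product of local conditionals
    P(t_i | t_{i-1}, t_{i+1}, w_i) and return the argmax by brute force."""
    best, best_score = None, -1.0
    for seq in itertools.product(tagset, repeat=len(words)):
        score = 1.0
        for i, word in enumerate(words):
            left = seq[i - 1] if i > 0 else "<s>"
            right = seq[i + 1] if i + 1 < len(words) else "</s>"
            score *= local_prob(seq[i], left, right, word)
        if score > best_score:
            best, best_score = seq, score
    return best

def local_prob(tag, left, right, word):
    # Hypothetical conditional table over (tag, previous tag, next tag, word);
    # unseen configurations get a small default probability.
    table = {("DT", "<s>", "NN", "the"): 0.9,   # "the" likes a following noun
             ("NN", "DT", "</s>", "race"): 0.9,
             ("VB", "TO", "</s>", "race"): 0.9}
    return table.get((tag, left, right, word), 0.05)

print(decode(["the", "race"], ["DT", "NN", "VB", "TO"], local_prob))  # ('DT', 'NN')
```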

Using Linear Algebra for Intelligent Information Retrieval

by Michael W. Berry, Susan T. Dumais - SIAM Review, 1995
"... Currently, most approaches to retrieving textual materials from scientific databases depend on a lexical match between words in users' requests and those in or assigned to documents in a database. Because of the tremendous diversity in the words people use to describe the same document, lexical ..."
Abstract - Cited by 676 (18 self)
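The remedy this paper develops is latent semantic indexing: factor the term-document matrix with a truncated SVD and compare queries to documents in the low-rank space, so documents can match a query without sharing its exact words. A minimal numpy sketch with a made-up four-term, four-document matrix (raw counts instead of the log-entropy weighting usually applied):

```python
import numpy as np

# Made-up term-document count matrix (rows = terms, columns = documents).
A = np.array([
    [1, 0, 1, 0],   # "car"
    [1, 1, 0, 0],   # "auto"
    [0, 1, 0, 1],   # "flower"
    [0, 0, 1, 1],   # "petal"
], dtype=float)

k = 2                                        # latent dimensions to keep
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk, Vkt = U[:, :k], s[:k], Vt[:k, :]

doc_vecs = (np.diag(sk) @ Vkt).T             # documents in the latent space

def fold_in(query_counts):
    """Project a query's term-count vector into the latent space."""
    return query_counts @ Uk @ np.linalg.inv(np.diag(sk))

q = fold_in(np.array([1.0, 1.0, 0.0, 0.0]))  # query: "car auto"
sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
print(sims.round(3))                         # cosine similarity per document
```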

Finding structure in time

by Jeffrey L. Elman - Cognitive Science, 1990
"... Time underlies many interesting human behaviors. Thus, the question of how to represent time in connectionist models is very important. One approach is to represent time implicitly by its effects on processing rather than explicitly (as in a spatial representation). The current report develops a pro ..."
Abstract - Cited by 2071 (23 self)
"... indeed, in this approach the notion of memory is inextricably bound up with task processing. These representations reveal a rich structure, which allows them to be highly context-dependent while also expressing generalizations across classes of items. These representations suggest a method ..."
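Elman's implicit representation of time is a recurrent network whose previous hidden state is copied into context units that feed back into the hidden layer at the next step. A minimal forward-pass sketch (the tanh units and random weights are choices made here, not Elman's exact setup, which used logistic units trained by backpropagation):

```python
import numpy as np

class ElmanRNN:
    """Simple recurrent network: the previous hidden state is copied into
    context units that feed back into the hidden layer."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_xh = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.W_ch = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
        self.W_hy = rng.normal(scale=0.1, size=(n_hidden, n_out))

    def forward(self, xs):
        context = np.zeros(self.W_ch.shape[0])    # context units start at zero
        outputs = []
        for x in xs:                              # one input vector per time step
            hidden = np.tanh(x @ self.W_xh + context @ self.W_ch)
            outputs.append(hidden @ self.W_hy)
            context = hidden                      # remember this step for the next
        return outputs

net = ElmanRNN(n_in=3, n_hidden=5, n_out=2)
sequence = [np.eye(3)[i] for i in (0, 2, 1)]      # toy one-hot input sequence
print([out.round(3) for out in net.forward(sequence)])
```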

Dependent Bigram Identification

by Ted Pedersen, 1998
"... F23.16> industry 240 times, industry occurs without oil 1001 times, and bigrams other than oil industry occur 1,298,742 times. This distribution is sparse and skewed and thus violates a central assumption implicit in significance testing of contingency tables (Read & Cressie 1988). W 1 W 2 ..."
Abstract - Cited by 1 (0 self)
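These counts fill three cells of the 2x2 contingency table on which bigram significance tests operate. A sketch of the standard log-likelihood ratio statistic (Dunning 1993) over such a table follows; note that the joint oil-industry count is truncated out of the snippet, so the value 50 below is a hypothetical placeholder, not a figure from the paper.

```python
from math import log

def g_squared(n11, n12, n21, n22):
    """Log-likelihood ratio statistic (Dunning 1993) for a 2x2 bigram
    contingency table: n11 = count(w1 w2), n12 = count(w1 without w2),
    n21 = count(w2 without w1), n22 = all other bigrams."""
    total = n11 + n12 + n21 + n22
    cells = [(n11, n11 + n12, n11 + n21), (n12, n11 + n12, n12 + n22),
             (n21, n21 + n22, n11 + n21), (n22, n21 + n22, n12 + n22)]
    g2 = 0.0
    for observed, row_total, col_total in cells:
        expected = row_total * col_total / total
        if observed > 0:
            g2 += 2 * observed * log(observed / expected)
    return g2

# The snippet's "oil industry" table; 50 is a HYPOTHETICAL joint count
# standing in for the figure truncated out of the abstract.
print(g_squared(50, 240, 1001, 1_298_742))
```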

Three New Probabilistic Models for Dependency Parsing: An Exploration

by Jason M. Eisner, 1996
"... After presenting a novel O(n³) parsing algorithm for dependency grammar, we develop three contrasting ways to stochasticize it. We propose (a) a lexical affinity model where words struggle to modify each other, (b) a sense tagging model where words fluctuate randomly in their selectional prefe ..."
Abstract - Cited by 318 (14 self)
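The O(n³) dynamic program alluded to works over spans with "complete" and "incomplete" items. The sketch below is a span-based projective dependency parser in that spirit, maximizing a sum of independent arc scores; it is a generic reconstruction, not the paper's exact algorithm, and backpointer recovery is omitted.

```python
import numpy as np

def eisner_best_score(scores):
    """Span-based O(n^3) projective dependency parsing. scores[h][m] is the
    score of an arc from head h to modifier m; token 0 is an artificial ROOT.
    Returns the best total arc score (backpointers omitted for brevity)."""
    n = scores.shape[0]
    comp = np.full((n, n, 2), float("-inf"))   # complete spans
    inc = np.full((n, n, 2), float("-inf"))    # incomplete spans
    # direction d=0: head is the right endpoint j; d=1: head is the left endpoint i
    for i in range(n):
        comp[i, i, 0] = comp[i, i, 1] = 0.0
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            # add an arc between i and j over two adjacent complete halves
            best = max(comp[i, r, 1] + comp[r + 1, j, 0] for r in range(i, j))
            inc[i, j, 0] = best + scores[j, i]   # arc j -> i
            inc[i, j, 1] = best + scores[i, j]   # arc i -> j
            # grow incomplete spans into complete ones
            comp[i, j, 0] = max(comp[i, r, 0] + inc[r, j, 0] for r in range(i, j))
            comp[i, j, 1] = max(inc[i, r, 1] + comp[r, j, 1] for r in range(i + 1, j + 1))
    return comp[0, n - 1, 1]

# Toy: ROOT plus two words; -1e9 marks disallowed arcs.
arc_scores = np.array([
    [-1e9,  2.0,  1.0],   # ROOT -> w1, ROOT -> w2
    [-1e9, -1e9,  3.0],   # w1 -> w2
    [-1e9,  0.5, -1e9],   # w2 -> w1
])
print(eisner_best_score(arc_scores))   # 5.0: ROOT -> w1, w1 -> w2
```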

Partial parsing via finite-state cascades

by Steven Abney - Natural Language Engineering, 1996
"... Finite-state cascades represent an attractive architecture for parsing unrestricted text. Deterministic parsers specified by finite-state cascades are fast and reliable. They can be extended at modest cost to construct parse trees with finite feature structures. Finally, such deterministic parsers d ..."
Abstract - Cited by 340 (4 self)
"... text corpora. The work described here is a step along the way toward a bootstrapping scheme that involves inducing a tagger from word distributions, a low-level “chunk” parser from a tagged corpus, and lexical dependencies from a chunked corpus. In particular, I describe a chunk parsing technique based ..."
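A finite-state cascade can be approximated in a few lines: each level rewrites the symbol sequence produced by the level below, replacing matched spans with a single chunk symbol. The two-level grammar here (NP, then PP/VP) and the use of Python regular expressions in place of deterministic longest-match transducers are simplifications for illustration:

```python
import re

# Each level rewrites a space-delimited symbol string, replacing a matched
# span with one chunk symbol, so higher levels see a shorter sequence.
CASCADE = [
    [("NP", r"(DT )?(JJ )*(NN )+")],              # level 1: noun chunks over POS tags
    [("PP", r"IN (NP )"),                         # level 2: PPs over level-1 chunks,
     ("VP", r"VB (NP )(PP )*")],                  # then verb chunks
]

def run_cascade(tags):
    seq = "".join(tag + " " for tag in tags)      # e.g. "DT JJ NN "
    for level in CASCADE:
        for chunk, pattern in level:
            seq = re.sub(pattern, chunk + " ", seq)
    return seq.split()

print(run_cascade("DT JJ NN VB DT NN IN DT NN".split()))
# -> ['NP', 'VP']: the sentence reduces to a noun chunk and a verb chunk
```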

The importance of shape in early lexical learning

by B. Smith, S. Jones - Cognitive Development, 1988
"... We ask if certain dimensions of perceptual similarity are weighted more heavily than others in determining word extension. The specific dimensions examined were shape, size, and texture. In four experiments, subjects were asked either to extend a novel count noun to new instances or, in a nonword cl ..."
Abstract - Cited by 235 (31 self) - Add to MetaCart
classification task, to put together objects that go together. The subjects were 2-year-olds, 3-year-olds, and adults. The results of all four experiments indicate that 2- and 3-year-olds and adults all weight shape more heavily than they do size or texture. This observed emphasis on shape, however, depends

Fast exact inference with a factored model for natural language parsing

by Dan Klein, Christopher D. Manning - In Advances in Neural Information Processing Systems, 2003
"... Abstract We present a novel generative model for natural language tree structures in which semantic (lexical dependency) and syntactic (PCFG) structures are scored with separate models. This factorization provides conceptual simplicity, straightforward opportunities for separately improving the com ..."
Abstract - Cited by 306 (9 self)
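The factorization can be shown in miniature: a candidate tree's score is the sum of a PCFG log-probability over its bracketing and a dependency log-probability over its head-modifier pairs. The candidate parses and all probabilities below are invented, and the exhaustive argmax stands in for the paper's exact A*-style search:

```python
from math import log

# Two invented candidate parses of "saw the man with the telescope": each is
# (bracketing, set of head -> modifier dependencies).
CANDIDATES = [
    ("(VP saw (NP the man (PP with the telescope)))",
     frozenset({("saw", "man"), ("man", "with")})),
    ("(VP saw (NP the man) (PP with the telescope))",
     frozenset({("saw", "man"), ("saw", "with")})),
]

# Invented component models: one scores the phrase structure, one the deps.
PCFG_LOGPROB = {CANDIDATES[0][0]: log(0.4), CANDIDATES[1][0]: log(0.6)}
DEP_LOGPROB = {("saw", "man"): log(0.5), ("man", "with"): log(0.1),
               ("saw", "with"): log(0.4)}

def factored_score(candidate):
    brackets, deps = candidate
    # The factored model: add the two component log-probabilities.
    return PCFG_LOGPROB[brackets] + sum(DEP_LOGPROB[d] for d in deps)

best = max(CANDIDATES, key=factored_score)
print(best[0])   # the attachment preferred once both factors are combined
```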