Results 1–10 of 442,015
Semantics of Context-Free Languages
In Mathematical Systems Theory, 1968
Cited by 569 (0 self)
Abstract: "Meaning" may be assigned to a string in a context-free language by defining "attributes" of the symbols in a derivation tree for that string. The attributes can be defined by functions associated with each production in the grammar. This paper examines the implications of ...
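The abstract's central idea, attributes computed by functions attached to productions, can be sketched with a toy grammar for binary numerals; the tree encoding and attribute function below are illustrative, not Knuth's notation.

```python
# Toy grammar: N -> N B | B,  B -> '0' | '1'.
# A derivation-tree node is a tuple: ("B", digit) for leaves,
# ("N", left_subtree, bit_subtree) for the N -> N B production
# (N -> B is collapsed: a B child is handled by the same recursion).

def val(tree):
    """Synthesized attribute: the numeric value of the subtree."""
    if tree[0] == "B":                # B -> '0' | B -> '1'
        return int(tree[1])          # val(B) = the digit itself
    _, left, bit = tree              # N -> N B
    return 2 * val(left) + val(bit)  # val(N) = 2 * val(N1) + val(B)

t = ("N", ("N", ("B", "1"), ("B", "0")), ("B", "1"))  # derivation tree for "101"
print(val(t))  # -> 5
```

Each production carries one attribute equation, and the attribute of the root is computed bottom-up over the derivation tree, which is exactly the "synthesized attribute" case of the paper.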
An Efficient Context-Free Parsing Algorithm
1970
Cited by 798 (0 self)
Abstract: A parsing algorithm which seems to be the most efficient general context-free algorithm known is described. It is similar to both Knuth's LR(k) algorithm and the familiar top-down algorithm. It has a time bound proportional to n^3 (where n is the length of the string being parsed) in general; it has an n^2 bound for unambiguous grammars; and it runs in linear time on a large class of grammars, which seems to include most practical context-free programming language grammars. In an empirical comparison it appears to be superior to the top-down and bottom-up algorithms studied by Griffiths ...
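The algorithm this entry describes is Earley's chart parser. A compact recognizer sketch, with predict/scan/complete steps over items (lhs, rhs, dot, origin); the toy arithmetic grammar is illustrative:

```python
GRAMMAR = {
    "S": [["S", "+", "M"], ["M"]],
    "M": [["M", "*", "T"], ["T"]],
    "T": [["a"]],
}

def recognize(words, grammar, start="S"):
    """Earley recognition: chart[i] holds items (lhs, rhs, dot, origin)."""
    n = len(words)
    chart = [set() for _ in range(n + 1)]
    for rhs in grammar[start]:
        chart[0].add((start, tuple(rhs), 0, 0))
    for i in range(n + 1):
        changed = True
        while changed:
            changed = False
            for lhs, rhs, dot, origin in list(chart[i]):
                if dot < len(rhs) and rhs[dot] in grammar:       # predict
                    for prod in grammar[rhs[dot]]:
                        item = (rhs[dot], tuple(prod), 0, i)
                        if item not in chart[i]:
                            chart[i].add(item); changed = True
                elif dot < len(rhs):                             # scan
                    if i < n and words[i] == rhs[dot]:
                        chart[i + 1].add((lhs, rhs, dot + 1, origin))
                else:                                            # complete
                    for plhs, prhs, pdot, porig in list(chart[origin]):
                        if pdot < len(prhs) and prhs[pdot] == lhs:
                            item = (plhs, prhs, pdot + 1, porig)
                            if item not in chart[i]:
                                chart[i].add(item); changed = True
    return any((start, tuple(rhs), len(rhs), 0) in chart[n]
               for rhs in grammar[start])

print(recognize("a + a * a".split(), GRAMMAR))  # -> True
print(recognize("a + * a".split(), GRAMMAR))    # -> False
```

Note the grammar is left-recursive, which Earley's method handles directly; that, plus the n^2 unambiguous and linear practical-grammar bounds, is what distinguishes it from the plain top-down algorithm.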
Statistical Parsing with a Context-free Grammar and Word Statistics
1997
Cited by 414 (18 self)
Abstract: We describe a parsing system based upon a language model for English that is, in turn, based upon assigning probabilities to possible parses for a sentence. This model is used in a parsing system by finding the parse for the sentence with the highest probability. This system outperforms previous ... explain their relative performance.
From the introduction: We present a statistical parser that induces its grammar and probabilities from a hand-parsed corpus (a treebank). Parsers induced from corpora are of interest both as exercises in machine learning and because they are often the best parsers ...
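The selection step the abstract describes, picking the parse with the highest probability, can be sketched with a toy PCFG; the rules, probabilities, and parses below are made up, and the real system also conditions on word statistics:

```python
import math

# Toy rule probabilities: P(rule) keyed by (lhs, rhs-of-child-labels).
RULE_PROB = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("she",)): 0.5,
    ("NP", ("fish",)): 0.5,
    ("VP", ("eats", "NP")): 0.5,
    ("VP", ("VP", "NP")): 0.2,
    ("VP", ("eats",)): 0.3,
}

def log_prob(parse):
    """Log probability of a parse tree (lhs, children); leaves are strings."""
    lhs, children = parse
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    lp = math.log(RULE_PROB[(lhs, rhs)])
    for child in children:
        if not isinstance(child, str):
            lp += log_prob(child)
    return lp

def best_parse(candidates):
    """Return the candidate parse with the highest probability."""
    return max(candidates, key=log_prob)

# Two analyses of "she eats fish": direct NP object vs. VP-adjunction.
p1 = ("S", (("NP", ("she",)), ("VP", ("eats", ("NP", ("fish",))))))
p2 = ("S", (("NP", ("she",)), ("VP", (("VP", ("eats",)), ("NP", ("fish",))))))
print(log_prob(p1) > log_prob(p2))  # -> True
```

The probability of a parse is the product of the probabilities of the rules it uses; the parser returns the argmax over candidate parses.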
FASTUS: A finite-state processor for information extraction from real-world text
1993
Cited by 150 (4 self)
Abstract: Approaches to text processing that rely on parsing the text with a context-free grammar tend to be slow and error-prone because of the massive ambiguity of long sentences. In contrast, FASTUS employs a nondeterministic finite-state language model that produces a phrasal decomposition of a sentence ...
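The finite-state alternative to full parsing can be sketched as regular pattern matching over part-of-speech tags; the tag set and the noun-group pattern below are illustrative, not FASTUS's actual cascade:

```python
import re

# Toy tagged sentence: (word, part-of-speech-tag) pairs.
TAGGED = [("the", "DET"), ("massive", "ADJ"), ("ambiguity", "NOUN"),
          ("of", "PREP"), ("long", "ADJ"), ("sentences", "NOUN")]

# Noun group = optional determiner, any adjectives, one or more nouns.
NOUN_GROUP = re.compile(r"(DET )?(ADJ )*(NOUN )+")

def noun_groups(tagged):
    """Return maximal noun groups found by regular matching over tags."""
    tags = "".join(tag + " " for _, tag in tagged)
    groups = []
    for m in NOUN_GROUP.finditer(tags):
        start = tags[:m.start()].count(" ")   # word index of match start
        end = tags[:m.end()].count(" ")       # word index just past match
        groups.append(" ".join(w for w, _ in tagged[start:end]))
    return groups

print(noun_groups(TAGGED))  # -> ['the massive ambiguity', 'long sentences']
```

No global parse is attempted: the phrasal decomposition falls out of greedy regular matching, which is why this style of processing is fast and avoids the ambiguity blow-up of long sentences.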
Hierarchical phrase-based translation
In Computational Linguistics, 2007
Cited by 597 (9 self)
Abstract: We present a statistical machine translation model that uses hierarchical phrases (phrases that contain subphrases). The model is formally a synchronous context-free grammar but is learned from a parallel text without any syntactic annotations. Thus it can be seen as combining fundamental ideas from ...
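A synchronous context-free rule pairs a source and a target right-hand side whose subphrase slots are linked by index. A minimal sketch, using the well-known Mandarin/English rule from this line of work (the encoding below is an assumption of mine, not the paper's data structures):

```python
# Synchronous rule X -> <"yu X1 you X2", "have X2 with X1">.
# Linked nonterminal slots are represented as ("X", index) pairs.
RULE = (("yu", ("X", 1), "you", ("X", 2)),
        ("have", ("X", 2), "with", ("X", 1)))

def apply_rule(rule, subs):
    """Substitute subphrases into both sides, respecting the slot links."""
    def fill(side):
        return " ".join(subs[it[1]] if isinstance(it, tuple) else it
                        for it in side)
    src, tgt = rule
    return fill(src), fill(tgt)

src, tgt = apply_rule(RULE, {1: "Beihan", 2: "bangjiao"})
print(src)  # -> yu Beihan you bangjiao
print(tgt)  # -> have bangjiao with Beihan
```

Because the two sides share one derivation, reordering (X1 and X2 swap between source and target) is captured by the rule itself rather than by a separate distortion model.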
A hierarchical phrase-based model for statistical machine translation
In ACL, 2005
Cited by 491 (12 self)
Abstract: We present a statistical phrase-based translation model that uses hierarchical phrases (phrases that contain subphrases). The model is formally a synchronous context-free grammar but is learned from a bitext without any syntactic information. Thus it can be seen as a shift to the formal machinery of ...
Learning Stochastic Logic Programs
2000
Cited by 1194 (81 self)
Abstract: Stochastic Logic Programs (SLPs) have been shown to be a generalisation of Hidden Markov Models (HMMs), stochastic context-free grammars, and directed Bayes' nets. A stochastic logic program consists of a set of labelled clauses p:C, where p is in the interval [0,1] and C is a first-order ...
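Since SLPs generalise stochastic context-free grammars, the labelled-clause idea p:C can be sketched by sampling clause choices according to their labels; the toy grammar-as-program below is illustrative, not Muggleton's first-order formalism:

```python
import random

# Labelled clauses for one predicate; labels for the same head sum to 1.
CLAUSES = {
    "s": [(0.4, ("a", "s")),   # 0.4 : s -> a s
          (0.6, ("b",))],      # 0.6 : s -> b
}

def sample(symbol, rng):
    """Expand a predicate by choosing a clause according to its label."""
    if symbol not in CLAUSES:
        return [symbol]            # terminal: emit as-is
    r, acc = rng.random(), 0.0
    for p, body in CLAUSES[symbol]:
        acc += p
        if r < acc:
            return [tok for part in body for tok in sample(part, rng)]
    return []                      # unreachable when labels sum to 1

print(" ".join(sample("s", random.Random(0))))
```

Every sampled derivation here yields a string of the form a*b, with probability given by the product of the clause labels used, exactly the SCFG special case of the p:C scheme.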
Automatic learning of context-free grammar
In Proc. of Conference on Computational Linguistics and Speech Processing, 2006
Cited by 1 (0 self)
Abstract: In this paper we study the problem of learning a context-free grammar from a corpus. We investigate a technique based on the notion of minimum description length of the corpus. A cost, as a function of the grammar, is defined as the sum of the number of bits required for the representation of a ...
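The cost the abstract defines, bits for the grammar plus bits for the corpus under it, can be sketched as follows; both encodings are simplified stand-ins, not the paper's exact scheme:

```python
import math
from collections import Counter

def grammar_bits(grammar):
    """Charge log2(#distinct symbols) bits per symbol occurrence in rules."""
    symbols = {s for lhs, rhs in grammar for s in (lhs, *rhs)}
    return math.log2(len(symbols)) * sum(1 + len(rhs) for _, rhs in grammar)

def corpus_bits(grammar, derivations):
    """Charge log2(#rules for lhs) bits per rule choice in each derivation."""
    choices = Counter(lhs for lhs, _ in grammar)
    return sum(math.log2(choices[lhs]) for d in derivations for lhs, _ in d)

def mdl_cost(grammar, derivations):
    """Minimum-description-length cost: grammar bits + corpus-given-grammar bits."""
    return grammar_bits(grammar) + corpus_bits(grammar, derivations)

G = [("S", ("a", "S")), ("S", ("b",))]                      # S -> a S | b
D = [("S", ("a", "S")), ("S", ("a", "S")), ("S", ("b",))]   # derives "aab"
print(mdl_cost(G, [D]))  # 5*log2(3) grammar bits + 3 corpus bits
```

A learner in this style searches over candidate grammars and keeps the one minimising the total: a larger grammar pays more in the first term but can shorten the derivations paid for in the second.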
Pfold: RNA secondary structure prediction using stochastic context-free grammars
In Nucleic Acids Res, 2003
Cited by 213 (11 self)
Abstract: RNA secondary structures are important in many biological processes and efficient structure prediction can give vital directions for experimental investigations. Many available programs for RNA secondary structure prediction only use a single sequence at a time. This may be sufficient in some applications, but often it is possible to obtain related RNA sequences with conserved secondary structure. These should be included in structural analyses to give improved results. This work presents a practical way of predicting RNA secondary structure that is especially useful when related sequences can be obtained. The method improves a previous algorithm based on an explicit evolutionary model and a probabilistic model of structures. Predictions can be done on a web server at ...
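The stochastic context-free view of RNA structure can be sketched by scoring a dot-bracket structure under a toy pair/unpaired grammar; the probabilities and grammar below are illustrative, far simpler than Pfold's model, which also includes sequence evolution:

```python
# Toy SCFG: S -> x S y S (base pair) | x S (unpaired) | end,
# with emission probabilities for Watson-Crick pairs only.
PAIR_PROB = {("a", "u"): 0.2, ("u", "a"): 0.2, ("c", "g"): 0.3, ("g", "c"): 0.3}
P_PAIR, P_UNPAIRED, P_END = 0.5, 0.4, 0.1  # rule probabilities (toy)

def structure_prob(seq, db, i=0, j=None):
    """Probability of dot-bracket structure db for seq under the toy SCFG."""
    if j is None:
        j = len(seq)
    if i == j:
        return P_END                             # S -> end
    if db[i] == ".":                             # S -> unpaired S
        return P_UNPAIRED * 0.25 * structure_prob(seq, db, i + 1, j)
    depth, k = 1, i + 1                          # find the ')' matching db[i]
    while depth:
        depth += {"(": 1, ")": -1}.get(db[k], 0)
        k += 1
    k -= 1
    return (P_PAIR * PAIR_PROB.get((seq[i], seq[k]), 0.0)   # S -> x S y S
            * structure_prob(seq, db, i + 1, k)
            * structure_prob(seq, db, k + 1, j))

print(structure_prob("gaac", "(..)"))  # 0.5 * 0.3 * (0.1*0.1*0.1) * 0.1
```

With such a grammar, prediction is the usual SCFG machinery: the highest-probability derivation (CYK-style) gives the predicted structure, and the same scores extend naturally to a column-wise model over aligned related sequences.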
Offspring-annotated probabilistic context-free grammars
Abstract: This paper describes the application of a new model to learn probabilistic context-free grammars (PCFGs) from a treebank corpus. The model estimates the probabilities according to a generalized k-gram scheme for trees. It allows for faster parsing and decreases considerably the perplexity of ...