Results 1–10 of 149
Generation and Synchronous Tree-Adjoining Grammars
, 1990
Abstract

Cited by 774 (43 self)
Tree-adjoining grammars (TAG) have been proposed as a formalism for generation based on the intuition that the extended domain of syntactic locality that TAGs provide should aid in localizing semantic dependencies as well, in turn serving as an aid to generation from semantic representations. We demonstrate that this intuition can be made concrete by using the formalism of synchronous tree-adjoining grammars. The use of synchronous TAGs for generation provides solutions to several problems with previous approaches to TAG generation. Furthermore, the semantic monotonicity requirement previously advocated for generation grammars as a computational aid is seen to be an inherent property of synchronous TAGs.
The induction of dynamical recognizers
 Machine Learning
, 1991
Abstract

Cited by 225 (14 self)
A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained" on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning process illustrates a new form of mechanical inference: induction by phase transition. A small weight adjustment causes a "bifurcation" in the limit behavior of the network. This phase transition corresponds to the onset of the network’s capacity for generalizing to arbitrary-length strings. Second, a study of the automata resulting from the acquisition of previously published training sets indicates that while the architecture is not guaranteed to find a minimal finite automaton consistent with the given exemplars, which is an NP-hard problem, the architecture does appear capable of generating non-regular languages by exploiting fractal and chaotic dynamics. I end the paper with a hypothesis relating linguistic generative capacity to the behavioral regimes of nonlinear dynamical systems.
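The "higher order" architecture the abstract describes uses multiplicative (second-order) interactions between the input and the hidden state. A minimal sketch of one such update step, with arbitrary dimensions and a hypothetical readout vector (the weight tensor `W`, readout `v`, and threshold here are illustrative, not taken from the paper):

```python
import numpy as np

def step(W, x, h):
    # Second-order recurrent update: each new hidden unit is a
    # sigmoid of a bilinear form in the input x and previous state h:
    #   h'_i = sigmoid( sum_{j,k} W[i, j, k] * x[j] * h[k] )
    z = np.einsum('ijk,j,k->i', W, x, h)
    return 1.0 / (1.0 + np.exp(-z))

def recognize(W, v, symbols, h0, threshold=0.5):
    # Run a string of one-hot symbol vectors through the network
    # and accept iff the readout v . h_final exceeds the threshold.
    h = h0
    for x in symbols:
        h = step(W, x, h)
    return float(v @ h) > threshold
```

Viewing `step` as a map on the hidden-state space is what lets the paper analyze learning as a change in the network's limit dynamics.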
A survey of statistical machine translation
, 2007
Abstract

Cited by 93 (6 self)
Statistical machine translation (SMT) treats the translation of natural language as a machine learning problem. By examining many samples of human-produced translation, SMT algorithms automatically learn how to translate. SMT has made tremendous strides in less than two decades, and many popular techniques have only emerged within the last few years. This survey presents a tutorial overview of state-of-the-art SMT at the beginning of 2007. We begin with the context of the current research, and then move to a formal problem description and an overview of the four main subproblems: translational equivalence modeling, mathematical modeling, parameter estimation, and decoding. Along the way, we present a taxonomy of some different approaches within these areas. We conclude with an overview of evaluation and notes on future directions.
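Much of the work this survey covers builds on the classic noisy-channel formulation, in which the decoding subproblem is to search for the best translation under a product of a translation model and a language model (a standard statement of the setup, not a formula quoted from the survey):

```latex
\hat{e} \;=\; \operatorname*{arg\,max}_{e} \, p(e \mid f)
       \;=\; \operatorname*{arg\,max}_{e} \, p(f \mid e)\, p(e)
```

Here $f$ is the source sentence, $e$ a candidate translation, $p(f \mid e)$ the translation model (translational equivalence and its parameters), and $p(e)$ the language model; parameter estimation fits both factors, and decoding performs the arg max.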
The language of RNA: A formal grammar that includes pseudoknots
 Bioinformatics
Abstract

Cited by 75 (1 self)
Motivation: In a previous paper, we presented a polynomial time dynamic programming algorithm for predicting optimal RNA secondary structure including pseudoknots. However, a formal grammatical representation for RNA secondary structure with pseudoknots was still lacking. Results: Here we show a one-to-one correspondence between that algorithm and a formal transformational grammar. This grammar class encompasses the context-free grammars and goes beyond to generate pseudoknotted structures. The pseudoknot grammar avoids the use of general context-sensitive rules by introducing a small number of auxiliary symbols used to reorder the strings generated by an otherwise context-free grammar. This formal representation of the residue correlations in RNA structure is important because it means we can build full probabilistic models of RNA secondary structure, including pseudoknots, and use them to optimally parse sequences in polynomial time. Contact: eddy@genetics.wustl.edu
The Computational Analysis of the Syntax and Interpretation of "Free" Word Order in Turkish
, 1995
Bilexical Grammars and Their Cubic-Time Parsing Algorithms
 In: New Developments in Natural Language Parsing
, 2000
Abstract

Cited by 64 (1 self)
This chapter introduces weighted bilexical grammars, a formalism in which individual lexical items, such as verbs and their arguments, can have idiosyncratic selectional influences on each other. Such ‘bilexicalism’ has been a theme of much current work in parsing. The new formalism can be used to describe bilexical approaches to both dependency and phrase-structure grammars, and a slight modification yields link grammars. Its scoring approach is compatible with a wide variety of probability models. The obvious parsing algorithm for bilexical grammars (used by most previous authors) takes time O(n^5). A more efficient O(n^3) method is exhibited. The new algorithm has been implemented and used in a large parsing experiment (Eisner, 1996b). We also give a useful extension to the case where the parser must undo a stochastic transduction that has altered the input.
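The O(n^3) method this abstract refers to is the span-based dynamic program now usually called the Eisner algorithm. A minimal sketch for the projective dependency case, scoring trees by summed arc weights (the score matrix and its values here are illustrative; the chapter's full formalism also handles grammar-weighted and link-grammar variants):

```python
def eisner(scores):
    """scores[h][m] = weight of the arc from head h to modifier m;
    index 0 is an artificial ROOT. Returns the maximum total score
    of a projective dependency tree rooted at 0, in O(n^3) time."""
    n = len(scores)
    NEG = float("-inf")
    # complete[s][t][d]:   span s..t, head at t (d=0) or at s (d=1)
    # incomplete[s][t][d]: span s..t with an arc between s and t pending
    complete = [[[0.0, 0.0] for _ in range(n)] for _ in range(n)]
    incomplete = [[[NEG, NEG] for _ in range(n)] for _ in range(n)]
    for k in range(1, n):            # span width
        for s in range(n - k):
            t = s + k
            # Build incomplete spans by joining two complete halves,
            # then adding the arc t->s (left) or s->t (right).
            best = max(complete[s][r][1] + complete[r + 1][t][0]
                       for r in range(s, t))
            incomplete[s][t][0] = best + scores[t][s]
            incomplete[s][t][1] = best + scores[s][t]
            # Extend incomplete spans into complete ones.
            complete[s][t][0] = max(complete[s][r][0] + incomplete[r][t][0]
                                    for r in range(s, t))
            complete[s][t][1] = max(incomplete[s][r][1] + complete[r][t][1]
                                    for r in range(s + 1, t + 1))
    return complete[0][n - 1][1]
```

The saving over the naive O(n^5) algorithm comes from never pairing two head indices in one chart item: each item records only a span and which end is its head.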
Grammatical Acquisition: Inductive Bias and Coevolution of Language and the Language Acquisition Device
 Language
, 2000
Abstract

Cited by 60 (0 self)
An account of grammatical acquisition is developed within the parameter-setting framework applied to a generalized categorial grammar (GCG). The GCG is embedded in a default inheritance network yielding a natural partial ordering (reflecting generality) of parameters, which determines a partial order for parameter setting. Computational simulation shows that several resulting acquisition procedures are effective on a parameter set expressing major typological distinctions based on constituent order, and defining 70 distinct full languages and over 200 subset languages. The effects on acquisition of inductive bias, that is, of differing initial parameter settings, are explored via computational simulation. Computational simulation of populations of language learners and users instantiating the acquisition model shows: 1) that variant acquisition procedures, with differing inductive biases, exert differing selective pressures on the evolution of language(s); 2) acquisition proc...
Derivational Minimalism is Mildly Context-Sensitive
 In Proceedings, Logical Aspects of Computational Linguistics, LACL’98
, 1998
Abstract

Cited by 57 (12 self)
In this paper we address the issue by showing that each MG as defined in [3] falls into the class of mildly context-sensitive grammars (MCSGs) as described in e.g. [1]. The proof of our claim is essentially done by converting a given MG into a linear context-free rewriting system (LCFRS) which derives the same (string) language.
Monotonic syntactic processing: a cross-linguistic study of attachment and reanalysis
 Language and Cognitive Processes
, 1996
Abstract

Cited by 52 (2 self)
Requests for reprints should be addressed to Patrick Sturt, Centre for Cognitive Science,