Commitment and flexibility in the . . .
, 2010
"... This dissertation investigates adults and children’s sentence processing mechanisms, with a special focus on how multiple levels of linguistic representation are incrementally computed in real time, and how this process affects the parser’s ability to later revise its early commitments. Using cross- ..."
Abstract
This dissertation investigates adults’ and children’s sentence processing mechanisms, with a special focus on how multiple levels of linguistic representation are incrementally computed in real time, and how this process affects the parser’s ability to later revise its early commitments. Using cross-methodological and cross-linguistic investigations of long-distance dependency processing, this dissertation demonstrates how paying explicit attention to the procedures by which linguistic representations are computed is vital to understanding both adults’ real-time linguistic computation and children’s reanalysis mechanisms. The first part of the dissertation uses time-course evidence from self-paced reading and eye-tracking studies (reading and visual world) to show that long-distance dependency processing can be decomposed into a sequence of syntactic and interpretive processes. First, the reading experiments provide evidence suggesting that filler-gap dependencies are constructed before verb information is accessed. Second, visual world experiments show that, in the absence of information that would allow hearers to predict verb content in advance, interpretive processes in filler-gap dependency computation take ...
Psychocomputational Linguistics: A Gateway to the Computational Linguistics Curriculum
"... Computational modeling of human language processes is a small but growing subfield of computational linguistics. This paper describes a course that makes use of recent research in psychocomputational modeling as a framework to introduce a number of mainstream computational linguistics concepts to an ..."
Abstract
Computational modeling of human language processes is a small but growing subfield of computational linguistics. This paper describes a course that makes use of recent research in psychocomputational modeling as a framework to introduce a number of mainstream computational linguistics concepts to an audience of linguistics, cognitive science, and computer science doctoral students. The emphasis on what I take to be the largely interdisciplinary nature of computational linguistics is particularly germane for the computer science students. Since 2002 the course has been taught three times under the auspices of the MA/PhD program in Linguistics at The City University ...
An approximation approach to the problem of the acquisition of phonotactics in Optimality Theory
"... The problem of the acquisition of phonotactics in Optimality Theory is intractable. This paper offers a way to cope with this hardness result: the problem is reformulated as a well known integer program (the Assignment problem with linear side constraints) paving the way for the application to phono ..."
Abstract
The problem of the acquisition of phonotactics in Optimality Theory is intractable. This paper offers a way to cope with this hardness result: the problem is reformulated as a well-known integer program (the Assignment problem with linear side constraints), paving the way for the application to phonotactics of approximation algorithms recently developed for integer programming. Knowledge of the phonotactics of a language is knowledge of its distinction between licit and illicit forms. The acquisition of phonotactics represents a distinguished and important stage of language acquisition. In fact, in carefully controlled experimental conditions, nine-month-old infants already react differently to licit and illicit sound combinations (Jusczyk et al., 1993). They thus display knowledge of phonotactics already at an early stage of language development. Usually, the problem of the acquisition of the phonotactics of a language given a finite set of linguistic data is formalized as the problem of finding a smallest language in the typology that is consistent with the data (Berwick, 1985; Manzini ...
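The abstract invokes the Assignment problem with linear side constraints without reproducing it. For orientation only, the standard form of that integer program is sketched below; the costs c_{ij} and the side-constraint system Ax ≤ b that would encode a particular OT phonotactics learning instance are the paper's contribution and are not reconstructed here.

\[
\begin{aligned}
\min_{x}\quad & \sum_{i=1}^{n}\sum_{j=1}^{n} c_{ij}\,x_{ij} \\
\text{s.t.}\quad & \sum_{j=1}^{n} x_{ij} = 1 \quad (i = 1,\dots,n), \qquad
                   \sum_{i=1}^{n} x_{ij} = 1 \quad (j = 1,\dots,n), \\
                 & A x \le b \quad \text{(linear side constraints)}, \qquad
                   x_{ij} \in \{0,1\}.
\end{aligned}
\]

Without the side constraints Ax ≤ b this is an ordinary assignment problem and is solvable in polynomial time; it is the added linear constraints that, in general, make the program hard and motivate the approximation algorithms the abstract refers to.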
References
"... In Kirsti Börjars, ed., Non-transformational syntax: a guide to current models. ..."
Abstract
In Kirsti Börjars, ed., Non-transformational syntax: a guide to current models.
The Subset Principle: Consequences and Conspiracies
"... The penalty for not obeying SP is very well understood. In this talk, we examine problems that emerge as we begin to think about how to apply SP in various contexts. We begin by adopting the psychologically attractive assumption that the learning mechanism (LM) is memoryless; during the course of le ..."
Abstract
The penalty for not obeying SP is very well understood. In this talk, we examine problems that emerge as we begin to think about how to apply SP in various contexts. We begin by adopting the psychologically attractive assumption that the learning mechanism (LM) is memoryless; during the course of learning, LM has no ability to recall past hypotheses that were entertained or prior sentences that were encountered. Given this memoryless assumption, we present a safe definition of SP:

SP: When LM’s current language is incompatible with a new input sentence s, LM should hypothesize a UG-compatible language which is a smallest superset of {s}.

By “smallest superset”, we mean a language that contains s and has no proper subset that also contains s. Although SP as defined above is safe (i.e., it will not lead to chronic overgeneration errors), it is problematic, since previous facts that were correctly learned may have to be abandoned if the next input does not exhibit them. Intuitively, in order for a learner with no memory for past learning events to abide by SP, each newly encountered sentence is essentially the first sentence the learner has heard. In the worst case, we prove that even a finite (e.g., parameterized) domain is not learnable unless every potential target language in the domain contains a subset-free trigger: a sentence s such that the target ...
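The definition of SP above is explicit enough to simulate. The Python sketch below is a toy illustration only: the four-language inventory UG, the one-symbol sentences, and the helper names are hypothetical, since the abstract fixes the selection rule but no particular domain. It reproduces the instability described above: a memoryless learner that always retreats to a smallest superset of {s} abandons facts it had previously acquired.

# Toy simulation of the memoryless Subset-Principle learner defined above.
# The inventory UG and the sentences are hypothetical; the abstract
# specifies only the selection rule, not a concrete domain.
UG = [
    frozenset({"a"}),
    frozenset({"b"}),
    frozenset({"a", "b"}),
    frozenset({"a", "b", "c"}),
]

def smallest_supersets(s):
    """UG languages that contain s and have no proper subset in UG that
    also contains s (the 'smallest superset' of the definition above)."""
    candidates = [L for L in UG if s in L]
    return [L for L in candidates if not any(M < L for M in candidates)]

def memoryless_learner(stream, initial=UG[0]):
    """Process one sentence at a time, keeping no record of past inputs."""
    current = initial
    for s in stream:
        if s not in current:                     # current language is incompatible
            current = smallest_supersets(s)[0]   # pick any smallest superset of {s}
        yield current

# A fact learned early ("a") is abandoned when the learner retreats to the
# smallest superset of {"b"}, and "b" is then lost again in turn:
print([sorted(L) for L in memoryless_learner(["a", "b", "a"])])
# -> [['a'], ['b'], ['a']]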
Learning in the Face of Infidelity: Evaluating the Robust Interpretive Parsing/Constraint Demotion Model of Optimality Theory Language Acquisition
, 2008
"... ..."
(Show Context)