Results 1 - 10 of 203

Learnability in Optimality Theory

by Bruce Tesar, Paul Smolensky, 1995
"... In this article we show how Optimality Theory yields a highly general Constraint Demotion principle for grammar learning. The resulting learning procedure specifically exploits the grammatical structure of Optimality Theory, independent of the content of substantive constraints defining any given gr ..."
Abstract - Cited by 529 (35 self)
efficient convergence to a correct grammar. We discuss implications for learning from overt data only, as well as other learning issues. We argue that Optimality Theory promotes confluence of the demands of more effective learnability and deeper linguistic explanation.
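The Constraint Demotion principle described in this abstract is simple enough to sketch. The following is a minimal illustration of the idea, not Tesar and Smolensky's own implementation: a ranking is a map from constraints to strata (stratum 0 highest), each datum is a winner/loser pair of constraint-violation lists, and every uncancelled winner violation is demoted just below the highest-ranked uncancelled loser violation. All names here are illustrative.

```python
from collections import Counter

def constraint_demotion(pairs, constraints):
    """Batch Constraint Demotion (sketch). Each pair holds (winner_marks,
    loser_marks): the constraints violated by the observed winner and by a
    losing competitor. Stratum 0 is the highest rank; larger is lower."""
    stratum = {c: 0 for c in constraints}
    changed = True
    while changed:
        changed = False
        for winner_marks, loser_marks in pairs:
            w, l = Counter(winner_marks), Counter(loser_marks)
            common = w & l
            w -= common          # mark cancellation: violations shared by
            l -= common          # winner and loser carry no ranking information
            if not l:
                continue         # nothing left to demote below
            top_loser = min(stratum[c] for c in l)
            for c in w:          # each uncancelled winner violation must sit
                if stratum[c] <= top_loser:   # strictly below the top loser mark
                    stratum[c] = top_loser + 1
                    changed = True
    return stratum
```

On a pair where the winner violates only B and the loser only A, this returns `{"A": 0, "B": 1}`, i.e. A outranks B; chains of pairs produce deeper strata, and on consistent data each stratum index is bounded by the number of constraints, so the loop terminates.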

Fluctuations, effective learnability and metastability in analysis

by Ulrich Kohlenbach, Pavol Safarik
"... This paper discusses what kind of quantitative information one can extract under which circumstances from proofs of convergence statements in analysis. We show that from proofs using only a limited amount of the law-of-excluded-middle, one can extract functionals (B, L), where L is a learning proced ..."
Abstract - Cited by 6 (1 self)
procedure for a rate of convergence which succeeds after at most B(a)-many mind changes. This (B, L)-learnability provides quantitative information strictly in between a full rate of convergence (obtainable in general only from semi-constructive proofs) and a rate of metastability in the sense of Tao
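For context, the "rate of metastability in the sense of Tao" mentioned in this abstract can be stated as follows (this is the standard formulation; the symbols are chosen here, not taken from the paper): a functional Φ is a rate of metastability for a sequence (x_n) when

```latex
\forall \varepsilon > 0 \;\; \forall F\colon \mathbb{N} \to \mathbb{N} \;\;
\exists n \le \Phi(\varepsilon, F) \;\;
\forall i, j \in [n,\, n + F(n)] \;\; |x_i - x_j| \le \varepsilon ,
```

i.e. one bounds where an ε-stable stretch of F-prescribed length begins, without bounding a full Cauchy rate; the (B, L)-learnability of the abstract sits strictly between this and a full rate of convergence.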

Scale-sensitive Dimensions, Uniform Convergence, and Learnability

by Noga Alon, Shai Ben-David, Nicolo Cesa-Bianchi, David Haussler, 1997
"... Learnability in Valiant's PAC learning model has been shown to be strongly related to the existence of uniform laws of large numbers. These laws define a distribution-free convergence property of means to expectations uniformly over classes of random variables. Classes of real-valued functions ..."
Abstract - Cited by 242 (2 self)
(or "agnostic") framework. Furthermore, we show a characterization of learnability in the probabilistic concept model, solving an open problem posed by Kearns and Schapire. These results show that the accuracy parameter plays a crucial role in determining the effective complexity
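The scale-sensitive dimension at issue is the fat-shattering dimension: a class F γ-shatters points x_1, …, x_d if there are witnesses r_i such that every sign pattern is realized by some f in F with margin γ around the r_i. For a finite class this can be checked by brute force; a sketch under that assumption (function and variable names are ours, not the paper's):

```python
from itertools import combinations, product

def gamma_shatters(F, points, gamma):
    """True iff the finite class F gamma-shatters `points`. Witnesses r_i
    only need to be searched over the breakpoints {f(x_i) - gamma}: any
    valid witness can be raised to the next such breakpoint without harm."""
    cands = [sorted({f(x) - gamma for f in F}) for x in points]
    for r in product(*cands):
        if all(any(all((f(x) >= ri + gamma) if s > 0 else (f(x) <= ri - gamma)
                       for x, ri, s in zip(points, r, signs))
                   for f in F)
               for signs in product((1, -1), repeat=len(points))):
            return True
    return False

def fat_shattering_dim(F, domain, gamma):
    """Largest d such that some d-point subset of `domain` is gamma-shattered."""
    dim = 0
    for k in range(1, len(domain) + 1):
        if any(gamma_shatters(F, list(s), gamma) for s in combinations(domain, k)):
            dim = k
    return dim
```

For the two constant functions x ↦ 0 and x ↦ 1, any single point is 0.5-shattered but no pair is, so the fat-shattering dimension at scale 0.5 is 1; varying γ is exactly how the accuracy parameter enters the complexity measure.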

The Learnability of Naive Bayes

by Huajie Zhang, Charles X. Ling, Zhiduo Zhao - In: Proceedings of the Canadian Artificial Intelligence Conference, 2005
"... Naive Bayes is an efficient and effective learning algorithm, but previous results show that its representation ability is severely limited since it can only represent certain linearly separable functions in the binary domain. We give necessary and sufficient conditions on linearly separable functio ..."
Abstract - Cited by 164 (0 self)
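The linear-separability restriction mentioned in this abstract is easy to see concretely: over binary features, the naive Bayes log-odds are a linear function of the input. A minimal sketch (parameter names are illustrative, not from the paper):

```python
import math

def nb_linear_weights(p_pos, p1, p0):
    """Rewrite a binary naive Bayes model as a linear classifier.
    p_pos = P(y=+), p1[i] = P(x_i=1 | +), p0[i] = P(x_i=1 | -).
    Returns (w, b) such that: predict + iff sum(w_i * x_i) + b > 0."""
    b = math.log(p_pos / (1 - p_pos))
    w = []
    for q1, q0 in zip(p1, p0):
        b += math.log((1 - q1) / (1 - q0))               # the x_i = 0 terms
        w.append(math.log(q1 / (1 - q1)) - math.log(q0 / (1 - q0)))
    return w, b
```

The score `sum(w_i * x_i) + b` equals the Bayes log-odds log P(+|x)/P(−|x) exactly, so over n binary features naive Bayes can only realize (certain) linear threshold functions, which is the representational limit the paper characterizes.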

Empirical tests of the Gradual Learning Algorithm

by Paul Boersma, Bruce Hayes - Linguistic Inquiry 32, 45–86, 2001
"... The Gradual Learning Algorithm (Boersma 1997) is a constraint ranking algorithm for learning Optimality-theoretic grammars. The purpose of this article is to assess the capabilities of the Gradual Learning Algorithm, particularly in comparison with the Constraint Demotion algorithm of Tesar and Smol ..."
Abstract - Cited by 383 (37 self)
and Smolensky (1993, 1996, 1998, 2000), which initiated the learnability research program for Optimality Theory. We argue that the Gradual Learning Algorithm has a number of special advantages: it can learn free variation, deal effectively with noisy learning data, and account for gradient wellformedness
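The core of the Gradual Learning Algorithm is a small numeric update on continuous ranking values; a sketch under simplifying assumptions (violation counts as dicts, fixed plasticity; the names are ours, not Boersma's code):

```python
import random

def noisy_ranking(rank, noise=2.0):
    """Evaluation ranking: each constraint's value plus Gaussian noise.
    The noise is what lets the learned grammar produce free variation."""
    return {c: v + random.gauss(0.0, noise) for c, v in rank.items()}

def gla_update(rank, datum_viol, learner_viol, plasticity=0.1):
    """One error-driven GLA step: promote constraints that the learner's
    (wrong) output violates more than the observed datum, and demote
    constraints that the datum violates more."""
    for c in rank:
        d, l = datum_viol.get(c, 0), learner_viol.get(c, 0)
        if d > l:
            rank[c] -= plasticity   # ranked too high: it penalizes the datum
        elif d < l:
            rank[c] += plasticity   # prefers the datum: promote
    return rank
```

Because each step nudges values by a small fixed amount rather than reordering strata outright, the algorithm tolerates noisy data and converges on probabilistic ranking distributions rather than a single total order.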

On the Relative Sizes of Learnable Sets

by Lance Fortnow, Rusins Freivalds, William I. Gasarch, Martin Kummer, Stuart A. Kurtz, Carl H. Smith
"... Measure and category (or rather, their recursion-theoretical counterparts) have been used in theoretical computer science to make precise the intuitive notion "for most of the recursive sets." We use the notions of effective measure and category to discuss the relative sizes of inferrible ..."
Abstract - Cited by 1 (1 self)

Linear Object Classes and Image Synthesis From a Single Example Image

by Thomas Vetter, Tomaso Poggio - IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997
"... Abstract—The need to generate new views of a 3D object from a single real image arises in several fields, including graphics and object recognition. While the traditional approach relies on the use of 3D models, we have recently introduced [1], [2], [3] simpler techniques that are applicable under r ..."
Abstract - Cited by 235 (25 self)
restricted conditions. The approach exploits image transformations that are specific to the relevant object class, and learnable from example views of other “prototypical ” objects of the same class. In this paper, we introduce such a technique by extending the notion of linear class proposed by Poggio
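The "linear class" idea can be illustrated in a few lines: decompose the single input image as a linear combination of prototype images in the same pose, then apply the same coefficients to the prototypes rendered in the target pose. A deliberately tiny sketch (two prototypes, two-pixel "images", so the decomposition is an exact 2x2 solve; all names hypothetical):

```python
def synthesize_view(protos_a, protos_b, image_a):
    """protos_a / protos_b: the two prototype 'images' in pose A / pose B.
    Solve image_a = a1*p1 + a2*p2 in pose A (Cramer's rule), then reuse
    the coefficients on the pose-B prototypes to synthesize the new view."""
    (p1, p2), x = protos_a, image_a
    det = p1[0] * p2[1] - p2[0] * p1[1]
    a1 = (x[0] * p2[1] - p2[0] * x[1]) / det
    a2 = (p1[0] * x[1] - x[0] * p1[1]) / det
    q1, q2 = protos_b
    return [a1 * q1[i] + a2 * q2[i] for i in range(len(q1))]
```

In practice the decomposition is an over-determined least-squares fit over many pixels or feature correspondences, but the transfer step is the same: the coefficients learned in one view are assumed valid in the other, which is exactly the linear-object-class assumption.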

Learnability-based Syntactic Annotation Design

by Roy Schwartz, Omri Abend
"... There is often more than one way to represent syntactic structures, even within a given formalism. Selecting one representation over another may affect parsing performance. Therefore, selecting between alternative syntactic representations (henceforth, syntactic selection) is an essential step in de ..."
Abstract - Cited by 5 (0 self)
in designing an annotation scheme. We present a methodology for syntactic selection and apply it to six central dependency structures. Our methodology compares pairs of annotation schemes that differ in the annotation of a single structure. It selects the more learnable scheme, namely the one that can

Sufficient conditions for agnostic active learnable

by Liwei Wang - In Advances in Neural Information Processing Systems 22
"... We study pool-based active learning in the presence of noise, i.e. the agnostic setting. Previous works have shown that the effectiveness of agnostic active learning depends on the learning problem and the hypothesis space. Although there are many cases on which active learning is very useful, it is ..."
Abstract - Cited by 11 (0 self)
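A concrete way to see the dependence on the hypothesis space: a disagreement-based agnostic active learner spends label queries only where its current hypothesis set disagrees, so the geometry of that region governs label complexity. A minimal sketch with threshold classifiers (names illustrative, not from the paper):

```python
def disagreement_region(H, pool):
    """Pool points on which the current hypothesis set H still disagrees --
    the only points a disagreement-based active learner needs to query."""
    return [x for x in pool if len({h(x) for h in H}) > 1]

# Example hypothesis set: threshold classifiers at 1, 2 and 3 on the line.
H = [lambda x, t=t: x >= t for t in (1.0, 2.0, 3.0)]
```

Here `disagreement_region(H, [0.0, 1.5, 2.5, 4.0])` returns `[1.5, 2.5]`: points far from every threshold are labeled for free, and as hypotheses are eliminated the region shrinks, which is the source of active learning's savings when the problem allows any.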

Learnability in Chinese and English

by Amy Lin, Angel Mei Yi Lin, Nobuhiko Akamatsu
"... The main purpose of this paper is to briefly review some empirical findings on the processing mechanisms of skilled readers and beginning readers in Chinese and English and to compare the learnability of reading in the two languages. In the learning processes, a similar global-to-analytic developmen ..."
Abstract