CiteSeerX

Results 1 - 10 of 6,224

Learning from ambiguously labeled examples

by Eyke Hüllermeier, Jürgen Beringer - Intell. Data Anal., 2006
"... Inducing a classification function from a set of examples in the form of labeled instances is a standard problem in supervised machine learning. In this paper, we are concerned with ambiguous label classification (ALC), an extension of this setting in which several candidate labels may be assigned t ..."
Cited by 16 (1 self)
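
The ALC setting described above is easy to make concrete: every training instance carries a set of candidate labels, exactly one of which is correct. The Python sketch below is a naive nearest-neighbor disambiguator for this setting, offered as an illustration of the problem only; it is not Hüllermeier and Beringer's algorithm, and the function name and toy data are hypothetical.

    # Ambiguous label classification (ALC): each training instance has a
    # *set* of candidate labels. A neighbor's vote is split evenly across
    # its candidates, so unambiguously labeled neighbors carry more weight.
    import numpy as np

    def alc_knn_predict(X_train, candidate_sets, x, k=5):
        dists = np.linalg.norm(X_train - x, axis=1)
        votes = {}
        for i in np.argsort(dists)[:k]:
            for label in candidate_sets[i]:
                votes[label] = votes.get(label, 0.0) + 1.0 / len(candidate_sets[i])
        return max(votes, key=votes.get)

    # Toy data: the second and fourth instances were labeled ambiguously.
    X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
    S = [{"a"}, {"a", "b"}, {"b"}, {"b", "c"}]
    print(alc_knn_predict(X, S, np.array([0.05, 0.1]), k=2))  # -> 'a'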

Filtering noisy continuous labeled examples

by José Ramón Quevedo, María Dolores García, Elena Montañés
"... Abstract. It is common in Machine Learning where rules are learned from examples that some of them could not be informative, otherwise they could be irrelevant or noisy. This type of examples makes the Machine Learning Systems produce not adequate rules. In this paper we present an algorithm that fi ..."
Abstract - Add to MetaCart
that filters noisy continuous labeled examples, whose computational cost is O(N·logN+NA 2) for N examples and A attributes. Besides, it is shown experimentally to be better than the embedded algorithms of the state-of-the art of the Machine Learning Systems. 1
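
The O(N·log N + N·A²) bound above is the paper's claim; its actual algorithm is not reproduced here. As a hedged stand-in, the sketch below shows one generic way to filter noisy continuously-labeled (regression) examples: drop every example whose target disagrees strongly with a robust estimate built from its neighbors' targets. The brute-force O(N²) neighbor search and the MAD-based threshold are illustrative choices, not the paper's method.

    import numpy as np

    def filter_noisy(X, y, k=5, z_thresh=3.0):
        """Boolean mask keeping examples whose residual against the
        mean target of their k nearest neighbors looks non-noisy."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)           # an example is not its own neighbor
        nn = np.argsort(d, axis=1)[:, :k]     # indices of k nearest neighbors
        resid = y - y[nn].mean(axis=1)        # disagreement with the neighborhood
        mad = np.median(np.abs(resid - np.median(resid)))
        return np.abs(resid - np.median(resid)) <= z_thresh * 1.4826 * mad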

Manifold regularization: A geometric framework for learning from labeled and unlabeled examples

by Mikhail Belkin, Partha Niyogi, Vikas Sindhwani - Journal of Machine Learning Research, 2006
"... We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning al ..."
Cited by 578 (16 self)
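
Manifold regularization adds a graph-smoothness penalty, computed over labeled and unlabeled points alike, to a standard RKHS objective. For the regularized least-squares special case (LapRLS) the expansion coefficients have a closed form, which the numpy sketch below implements. The RBF kernel, the dense similarity graph, and all hyperparameter values are illustrative assumptions; the paper's experiments use sparser k-NN graphs.

    import numpy as np

    def rbf_kernel(X, Z, width=1.0):
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-width * d2)

    def laprls_fit(X, y_labeled, gamma_A=1e-2, gamma_I=1e-1, width=1.0):
        """X holds all l+u points with the l labeled ones first;
        y_labeled holds their targets (e.g. +/-1)."""
        n, l = len(X), len(y_labeled)
        K = rbf_kernel(X, X, width)              # Gram matrix over all points
        W = rbf_kernel(X, X, width)              # dense similarity graph (illustrative)
        L = np.diag(W.sum(axis=1)) - W           # unnormalized graph Laplacian
        J = np.zeros((n, n)); J[:l, :l] = np.eye(l)   # selects labeled points
        y = np.zeros(n); y[:l] = y_labeled
        # Closed-form LapRLS coefficients:
        #   alpha = (J K + gamma_A l I + gamma_I l/(l+u)^2 L K)^(-1) J y
        A = J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n**2) * (L @ K)
        alpha = np.linalg.solve(A, y)
        return lambda Z: rbf_kernel(Z, X, width) @ alpha   # f(z) = sum_i alpha_i k(z, x_i)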

Combining labeled and unlabeled data with co-training

by Avrim Blum, Tom Mitchell, 1998
"... We consider the problem of using a large unlabeled sample to boost performance of a learning algorithm when only a small set of labeled examples is available. In particular, we consider a setting in which the description of each example can be partitioned into two distinct views, motivated by the ta ..."
Cited by 1633 (28 self)
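
The two-view assumption makes the training loop itself very small: fit one classifier per view on whatever is currently labeled, let each classifier promote its most confident unlabeled examples into the shared pool, and repeat. The sketch below follows that scheme; Gaussian naive Bayes and the single grow parameter are simplifying substitutions (the original work used naive Bayes over text features and promoted fixed numbers of positive and negative examples per round), so treat this as a sketch of the loop, not the paper's exact procedure.

    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    def co_train(X1, X2, y, labeled_idx, rounds=10, grow=2):
        """X1, X2: the two views of every example; y is read only at labeled_idx."""
        pseudo = {i: y[i] for i in labeled_idx}       # index -> (pseudo-)label
        c1, c2 = GaussianNB(), GaussianNB()
        for _ in range(rounds):
            idx = np.fromiter(pseudo, dtype=int)
            lab = np.array([pseudo[i] for i in idx])
            c1.fit(X1[idx], lab)
            c2.fit(X2[idx], lab)
            unlabeled = [i for i in range(len(y)) if i not in pseudo]
            if not unlabeled:
                break
            for clf, view in ((c1, X1), (c2, X2)):
                proba = clf.predict_proba(view[unlabeled])
                # Each view's classifier promotes its most confident unlabeled
                # examples; a collision between the two views simply overwrites.
                for j in np.argsort(-proba.max(axis=1))[:grow]:
                    pseudo[unlabeled[j]] = clf.classes_[proba[j].argmax()]
        return c1, c2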

Object recognition with partially labeled examples

by Andy Crane, Martin Szummer, Tomaso Poggio - Master’s thesis, Massachusetts Inst. of Technology, 2002
"... The Problem: Learning to recognize objects from very few labeled training examples, but large numbers of unlabeled examples. Motivation: Statistical object recognition techniques require large training sets to achieve good performance. It is often difficult and expensive to collect many labeled exam ..."
Cited by 2 (0 self)

On Specifying Boolean Functions by Labelled Examples

by Martin Anthony, et al.
"... ..."
Abstract - Cited by 33 (13 self) - Add to MetaCart
Abstract not found

Probability: Theory and Examples

by Rick Durrett - Cambridge U. Press, 2011
"... Some times the lights are shining on me. Other times I can barely see. Lately it occurs to me what a long strange trip its been. Grateful Dead In 1989 when the first edition of the book was completed, my sons David and Greg were 3 and 1, and the cover picture showed the Dow Jones at 2650. The last t ..."
Abstract - Cited by 1331 (16 self) - Add to MetaCart
twenty years have brought many changes but the song remains the same. The title of the book indicates that as we develop the theory, we will focus our attention on examples. Hoping that the book would be a useful reference for people who apply probability in their work, we have tried to emphasize

Object Recognition with Partially Labeled Examples

by Tomaso Poggio, Arthur C. Smith, Andrew S. Crane - Master’s thesis, Massachusetts Inst. of Technology, 2002
"... Machine learning algorithms tend to improve in performance with larger training sets, but obtaining a large amount of training data comes at a high cost. Several methods of semi-supervised learning have been introduced recently to take advantage of a larger training set without the burden of labeli ..."
Abstract - Add to MetaCart
of labeling many samples. We apply these semisupervised learning methods to a data set of cars and background images, attempting to separate the two classes. Some of the algorithms obtain very high classification accuracy and can be used towards a car-detection system.

Semi-Supervised Learning with Partially Labeled Examples

by unknown authors, 2010
"... Traditionally, machine learning community has been focused on supervised learn-ing where the source of learning is fully labeled examples including both input features and corresponding output labels. As one way to alleviate the costly effort of collecting fully labeled examples, semi-supervised lea ..."
Abstract - Add to MetaCart
Traditionally, machine learning community has been focused on supervised learn-ing where the source of learning is fully labeled examples including both input features and corresponding output labels. As one way to alleviate the costly effort of collecting fully labeled examples, semi

Unsupervised Models for Named Entity Classification

by Michael Collins, Yoram Singer - In Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora, 1999
"... This paper discusses the use of unlabeled examples for the problem of named entity classification. A large number of rules is needed for coverage of the domain, suggesting that a fairly large number of labeled examples should be required to train a classifier. However, we show that the use of unlabe ..."
Cited by 542 (4 self)
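
The rule-based flavor of this line of work can be sketched as a bootstrapping loop: a handful of seed rules label a few mentions, new high-precision rules are induced from those labels, and the process repeats. The Python below is a generic illustration of that loop, not Collins and Singer's DL-CoTrain or CoBoost procedures; the feature encoding and thresholds are hypothetical.

    from collections import Counter, defaultdict

    def bootstrap(examples, seed_rules, rounds=5, min_count=3, min_precision=0.95):
        """examples: list of feature sets, e.g. {'contains(Mr.)', 'context=said'};
        seed_rules: {feature: entity label}. Returns the grown rule set."""
        rules = dict(seed_rules)
        for _ in range(rounds):
            labels = {}
            for i, feats in enumerate(examples):
                hits = [rules[f] for f in feats if f in rules]
                if hits:                      # majority vote among firing rules
                    labels[i] = Counter(hits).most_common(1)[0][0]
            # Induce new rules: features that co-occur (almost) exclusively
            # with one label among the currently labeled examples.
            stats = defaultdict(Counter)
            for i, lab in labels.items():
                for f in examples[i]:
                    stats[f][lab] += 1
            for f, counts in stats.items():
                lab, c = counts.most_common(1)[0]
                if c >= min_count and c / sum(counts.values()) >= min_precision:
                    rules[f] = lab
        return rules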