Results 1–3 of 3
Selection of relevant features and examples in machine learning
Artificial Intelligence, 1997
Abstract

Cited by 590 (2 self)
In this survey, we review work in machine learning on methods for handling data sets containing large amounts of irrelevant information. We focus on two key issues: the problem of selecting relevant features, and the problem of selecting relevant examples. We describe the advances that have been made on these topics in both empirical and theoretical work in machine learning, and we present a general framework that we use to compare different methods. We close with some challenges for future work in this area.
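The feature-selection problem this survey addresses can be illustrated by the simplest family of methods it covers: a "filter" that scores each feature independently of the learning algorithm. The correlation-based scoring and function names below are illustrative choices of my own, not the survey's general framework.

```python
def pearson(a, b):
    """Pearson correlation of two equal-length sequences (0.0 if either is constant)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb) if sa and sb else 0.0

def select_features(X, y, k):
    """Filter-style selection: keep the indices of the k features whose
    values correlate most strongly (in absolute value) with the label y.
    X is a list of samples, each a list of feature values."""
    scores = [abs(pearson([row[j] for row in X], y)) for j in range(len(X[0]))]
    return sorted(range(len(scores)), key=lambda j: -scores[j])[:k]

# Feature 0 tracks the label perfectly; feature 1 is noise.
X = [[1, 5], [2, 3], [3, 8], [4, 1]]
y = [1, 2, 3, 4]
best = select_features(X, y, 1)  # picks feature 0
```

Filters like this score features one at a time and so miss interactions between features; the survey also treats "wrapper" methods, which evaluate whole feature subsets using the learner's own accuracy.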
Relevant examples and relevant features: Thoughts from computational learning theory
In AAAI-94 Fall Symposium, Workshop on Relevance, 1994
Abstract

Cited by 32 (2 self)
It can be said that nearly all results in machine learning, whether experimental or theoretical, deal with problems of separating relevant from irrelevant information in some way. In this paper I will attempt to …
A Technique for Upper Bounding the Spectral Norm with Applications to Learning
In Proceedings of the Fifth Annual Workshop on Computational Learning Theory, 1992
Abstract

Cited by 13 (0 self)
We present a general technique to upper bound the spectral norm of an arbitrary function. At the heart of our technique is a theorem which shows how to obtain an upper bound on the spectral norm of a decision tree given the spectral norms of the functions in the nodes of this tree. The theorem applies to trees whose nodes may compute any Boolean functions. Applications are to the design of efficient learning algorithms and the construction of small-depth threshold circuits (or neural nets). In particular, we present polynomial-time algorithms for learning O(log n)-term DNF formulas and various classes of decision trees, all under the uniform distribution with membership queries.
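For concreteness: the spectral norm referred to in this abstract is the sum of the absolute values of a Boolean function's Fourier coefficients. A brute-force computation for tiny n can make the definition tangible — this is exponential-time and purely illustrative, not the paper's bounding technique, and the function names are my own.

```python
from itertools import product

def spectral_norm(f, n):
    """L1 norm of the Fourier spectrum of f: {0,1}^n -> {-1,+1},
    computed by brute force over all 2^n inputs and all 2^n subsets S.
    The coefficient for S is f_hat(S) = E_x[f(x) * chi_S(x)], where
    chi_S(x) = (-1)^(sum of x_i for i in S)."""
    points = list(product([0, 1], repeat=n))
    total = 0.0
    for s in range(1 << n):  # each subset S of {0,...,n-1} as a bitmask
        coeff = sum(
            f(x) * (-1) ** sum(x[i] for i in range(n) if s >> i & 1)
            for x in points
        ) / len(points)
        total += abs(coeff)
    return total

parity = lambda x: (-1) ** sum(x)        # spectrum is one coefficient: norm 1
and2   = lambda x: 1 if all(x) else -1   # four coefficients of size 1/2: norm 2
```

Parity concentrates all its Fourier weight on a single coefficient, giving spectral norm 1, while AND spreads weight across four coefficients of magnitude 1/2, giving norm 2. Functions of small spectral norm are exactly the ones for which Fourier-based learning algorithms of the kind the paper builds on are efficient.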