Results 1–10 of 14
Learning Decision Trees using the Fourier Spectrum
, 1991
Abstract

Cited by 205 (10 self)
This work gives a polynomial time algorithm for learning decision trees with respect to the uniform distribution. (This algorithm uses membership queries.) The decision tree model that is considered is an extension of the traditional Boolean decision tree model that allows linear operations in each node (i.e., summation of a subset of the input variables over GF(2)). This paper shows how to learn in polynomial time any function that can be approximated (in the L2 norm) by a polynomially sparse function (i.e., a function with only polynomially many nonzero Fourier coefficients). The authors demonstrate that any function f whose L1 norm (i.e., the sum of the absolute values of the Fourier coefficients) is polynomial can be approximated by a polynomially sparse function, and prove that Boolean decision trees with linear operations are a subset of this class of functions. Moreover, it is shown that the functions with polynomial L1 norm can be learned deterministically. The algorithm can a...
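The sparse-representation idea in this abstract can be made concrete. Below is a minimal sketch (not the paper's algorithm, which uses membership queries) that computes a single Fourier coefficient of a ±1-valued Boolean function exactly, by averaging the function against the parity character chi_S; the function and set names are illustrative only. A depth-1 decision tree has exactly one nonzero coefficient, so it is as sparse as possible.

```python
from itertools import product

def fourier_coefficient(f, S, n):
    """Exact Fourier coefficient f_hat(S) = E_x[f(x) * chi_S(x)] under the
    uniform distribution on {0,1}^n, where chi_S(x) = (-1)^(sum of x_i, i in S).
    Exhaustive over all 2^n inputs, so feasible only for small n."""
    total = 0
    for x in product([0, 1], repeat=n):
        chi = (-1) ** sum(x[i] for i in S)
        total += f(x) * chi
    return total / 2 ** n

# A depth-1 decision tree branching on x_0 with +1/-1 leaves equals the
# single character chi_{{0}}: one nonzero Fourier coefficient.
stump = lambda x: -1 if x[0] else 1
print(fourier_coefficient(stump, (0,), 3))     # → 1.0
print(fourier_coefficient(stump, (0, 1), 3))   # → 0.0
```

Summing over all 2^n points is of course exponential; the paper's contribution is recovering the large coefficients without that exhaustive pass.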
Learning intersections and thresholds of halfspaces
, 2004
Abstract

Cited by 86 (32 self)
We give the first polynomial time algorithm to learn any function of a constant number of halfspaces under the uniform distribution on the Boolean hypercube to within any constant error parameter. We also give the first quasipolynomial time algorithm for learning any Boolean function of a polylogarithmic number of polynomial-weight halfspaces under any distribution on the Boolean hypercube. As special cases of these results we obtain algorithms for learning intersections and thresholds of halfspaces. Our uniform distribution learning algorithms involve a novel non-geometric approach to learning halfspaces; we use Fourier techniques together with a careful analysis of the noise sensitivity of functions of halfspaces. Our algorithms for learning under any distribution use techniques from real approximation theory to construct low-degree polynomial threshold functions. Finally, we also observe that any function of a constant number of polynomial-weight halfspaces can be learned in polynomial time in the model of exact learning from membership and equivalence queries.
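The noise-sensitivity quantity this abstract relies on can be estimated empirically. A hedged sketch, with majority standing in as the sample halfspace (function names and parameter values are illustrative, not taken from the paper):

```python
import random

def majority(x):
    """The simplest halfspace: the sign of the centered bit sum."""
    return 1 if 2 * sum(x) > len(x) else -1

def noise_sensitivity(f, n, eps, trials=20000, seed=0):
    """Monte-Carlo estimate of NS_eps(f) = Pr[f(x) != f(y)], where x is
    uniform on {0,1}^n and y flips each bit of x independently w.p. eps."""
    rng = random.Random(seed)
    disagree = 0
    for _ in range(trials):
        x = [rng.randint(0, 1) for _ in range(n)]
        y = [b ^ 1 if rng.random() < eps else b for b in x]
        disagree += f(x) != f(y)
    return disagree / trials

# Majority is noise-stable: small eps rarely changes the output sign.
print(noise_sensitivity(majority, 7, 0.1))
print(noise_sensitivity(majority, 7, 0.4))
```

Low noise sensitivity is what lets Fourier mass concentrate on low-degree coefficients, which is the lever the learning algorithm pulls.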
On the Fourier Spectrum of Monotone Functions
, 1996
Abstract

Cited by 60 (0 self)
In this paper, monotone Boolean functions are studied using harmonic analysis on the cube.
On learning monotone DNF under product distributions
 In Proceedings of the Fourteenth Annual Conference on Computational Learning Theory
, 2001
Abstract

Cited by 32 (11 self)
We show that the class of monotone 2^(O(√log n))-term DNF formulae can be PAC learned in polynomial time under the uniform distribution from random examples only. This is an exponential improvement over the best previous polynomial-time algorithms in this model, which could learn monotone o(log² n)-term DNF. We also show that various classes of small constant-depth circuits which compute monotone functions are PAC learnable in polynomial time under the uniform distribution. All of our results extend to learning under any constant-bounded product distribution.
Learning DNF from random walks
, 2003
Abstract

Cited by 21 (4 self)
We consider a model of learning Boolean functions from examples generated by a uniform random walk on {0,1}^n. We give a polynomial time algorithm for learning decision trees and DNF formulas in this model. This is the first efficient algorithm for learning these classes in a natural passive learning model where the learner has no influence over the choice of examples used for learning.
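The example oracle in this model is easy to picture: instead of i.i.d. uniform draws, consecutive examples differ in a single coordinate. A minimal sketch of such an oracle, assuming the standard walk that flips one uniformly chosen bit per step (helper names are illustrative):

```python
import random

def random_walk_examples(f, n, steps, seed=0):
    """Generate labeled examples (x, f(x)) along a random walk on {0,1}^n:
    start at a uniform point, then flip one uniformly chosen coordinate
    per step, labeling every point visited."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    examples = [(tuple(x), f(x))]
    for _ in range(steps):
        i = rng.randrange(n)   # coordinate to flip this step
        x[i] ^= 1
        examples.append((tuple(x), f(x)))
    return examples

# Label the walk with a 2-term conjunction; consecutive points differ
# in exactly one bit, unlike i.i.d. uniform examples.
walk = random_walk_examples(lambda x: x[0] & x[1], 5, 10)
```

Seeing both f(x) and f(x with one bit flipped) is what gives the learner leverage: correlated example pairs expose influence and Fourier information that independent samples hide.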
Boolean functions with small spectral norm. Geometric and Functional Analysis
Implementation issues in the Fourier Transform algorithm
 Proceedings of the Neural Information Processing Systems
, 1995
Abstract

Cited by 8 (0 self)
The Fourier transform of functions with Boolean inputs has received considerable attention in the last few years in the Computational Learning Theory community, and has come to play an important role in proving many important learnability results. The aim of this work is to demonstrate that the Fourier transform techniques also yield a useful and practical algorithm, in addition to having many interesting theoretical properties. One of the benefits we present is the confidence level the algorithm produces in addition to the predictions. The confidence level measures the likelihood that a prediction is correct. In order to keep the algorithm's runtime reasonable while still producing accurate hypotheses, we had to perform many optimizations. In the paper we discuss the more prominent optimizations, ones that were crucial and without which the performance of the algorithm would severely deteriorate.

1 Introduction

Over the last few years the Fourier transform representation of boolean fun...
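One way to picture the confidence level mentioned here: once a sparse set of Fourier coefficients has been estimated, the hypothesis value h(x) is a signed sum of characters; its sign gives the prediction, and its magnitude is a natural confidence score. This is only a hedged sketch of that idea (the paper's actual confidence computation may differ, and all names are illustrative):

```python
def predict_with_confidence(coeffs, x):
    """coeffs: dict mapping a tuple of indices S to an estimated Fourier
    coefficient. Evaluates h(x) = sum_S coeffs[S] * chi_S(x) and returns
    (sign(h(x)), |h(x)|): the prediction and a confidence score.
    |h(x)| near 1 means the sparse approximation votes decisively."""
    h = sum(c * (-1) ** sum(x[i] for i in S) for S, c in coeffs.items())
    return (1 if h >= 0 else -1), abs(h)

# Hypothetical hypothesis dominated by the single character chi_{{0}}:
coeffs = {(0,): 0.75, (1, 2): 0.25}
print(predict_with_confidence(coeffs, (0, 0, 0)))  # → (1, 1.0)
print(predict_with_confidence(coeffs, (1, 0, 0)))  # → (-1, 0.5)
```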
Evaluating Spectral Norms for Constant Depth Circuits with Symmetric Gates
 J. computational complexity
, 1995
Abstract

Cited by 4 (2 self)
 Implications of our results and technique are discussed, for estimating the spectral norms of any function in a constant depth circuit class, using the coding theoretic concept of weight distributions. Evaluating the spectral norms for any such function reduces to estimating certain nontrivial weight distributions of simple, linear codes.
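For intuition about the quantity being estimated: the spectral (L1) norm of f is the sum of the absolute values of all 2^n Fourier coefficients. A brute-force sketch for small n follows; it is illustrative only, since the paper's point is precisely to evaluate such norms without exhaustive computation, via weight distributions of linear codes.

```python
from itertools import combinations, product

def spectral_norm(f, n):
    """L1 spectral norm: sum over all 2^n subsets S of |f_hat(S)|,
    computed exhaustively (feasible only for small n)."""
    points = list(product([0, 1], repeat=n))
    norm = 0.0
    for r in range(n + 1):
        for S in combinations(range(n), r):
            coef = sum(f(x) * (-1) ** sum(x[i] for i in S)
                       for x in points) / 2 ** n
            norm += abs(coef)
    return norm

parity = lambda x: (-1) ** (x[0] ^ x[1])         # a single character
and_gate = lambda x: 1 if x[0] and x[1] else -1  # ±1-valued AND
print(spectral_norm(parity, 2))    # → 1.0
print(spectral_norm(and_gate, 2))  # → 2.0
```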
Learning Fuzzy Decision Trees
, 1996
Abstract

Cited by 3 (1 self)
We present a recurrent neural network which learns to suggest the next move during the descent along the branches of a decision tree. More precisely, given a decision instance represented by a node in the decision tree, the network provides the degree of membership of each possible move to the fuzzy set «good move». These fuzzy values constitute the core of the probability of selecting the move out of the set of the children of the current node. This results in a natural way of driving the sharp discrete-state process running along the decision tree by means of incremental methods on the continuous-valued parameters of the neural network. The bulk of the learning problem consists in stating useful links between the local decisions about the next move and the global decisions about the suitability of the final solution. The peculiarity of the learning task is that the network has to deal explicitly with the twofold charge of lighting up the best solution and generating the move sequence...
Learning Boolean Functions
 Theoretical Advances in Neural Computation and Learning
, 1994
Abstract
We survey learning algorithms that are based on the Fourier Transform representation. In many cases we simplify the original proofs and integrate the proofs of related results. We hope that this gives the reader a complete and comprehensive understanding of both the results and the techniques.

1 Introduction

The importance of using the "right" representation of a function in order to "approximate" it has been widely recognized. The Fourier Transform representation of a function is a classic representation which is widely used to approximate real functions (i.e., functions whose inputs are real numbers). However, the Fourier Transform representation for functions whose inputs are boolean has been far less studied. On the other hand, it seems that the Fourier Transform representation can be used to learn many classes of boolean functions. At this point it is worthwhile to say a few words about the Fourier Transform of functions whose inputs are boolean. The basis functions are ...
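The truncated last sentence refers to the parity characters chi_S(x) = (-1)^(sum of x_i for i in S), which form an orthonormal basis under the uniform distribution, so every Boolean function is a unique linear combination of them. A quick exhaustive check of that orthonormality (a sketch, feasible only for small n):

```python
from itertools import product

def chi(S, x):
    """Parity character chi_S(x) = (-1)^(sum of x_i for i in S)."""
    return (-1) ** sum(x[i] for i in S)

def inner(S, T, n):
    """<chi_S, chi_T> = E_x[chi_S(x) * chi_T(x)] over uniform {0,1}^n."""
    return sum(chi(S, x) * chi(T, x)
               for x in product([0, 1], repeat=n)) / 2 ** n

print(inner((0,), (0,), 3))    # → 1.0: a character has unit norm
print(inner((0,), (1, 2), 3))  # → 0.0: distinct characters are orthogonal
```

Orthogonality is why chi_S * chi_T = chi_{S symmetric-difference T} averages to zero whenever S ≠ T, and it is what makes the Fourier coefficients of a Boolean function well defined and recoverable by averaging.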