Results 11–20 of 1,051,405
Learning Talagrand DNF Formulas
"... Consider the following version of Talagrand’s probabilistic construction of a monotone function f: {0, 1} n → {0, 1}. Let f be an nterm monotone DNF formula where each term is selected independently and uniformly at random (with replacement) from the set of all n Ω(log log n) possible terms of leng ..."
Abstract
 Add to MetaCart
Consider the following version of Talagrand’s probabilistic construction of a monotone function f: {0, 1} n → {0, 1}. Let f be an nterm monotone DNF formula where each term is selected independently and uniformly at random (with replacement) from the set of all n Ω(log log n) possible terms
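The random model described in this entry can be sketched in a few lines: terms are drawn independently, with replacement, from the monotone terms of a fixed length. This is an illustrative sketch only; the function names and the specific parameter values are not from the paper, and the paper's term length is tied to its n^{Ω(log log n)} term count rather than fixed here.

```python
import random

def random_monotone_dnf(n, num_terms, term_length):
    """Sample a monotone DNF: each term is an independent, uniform draw
    (with replacement) from all monotone terms of the given length,
    i.e. all size-`term_length` subsets of the variables x_0..x_{n-1}."""
    return [random.sample(range(n), term_length) for _ in range(num_terms)]

def evaluate(dnf, x):
    """A monotone DNF is satisfied iff some term has all its variables set to 1."""
    return any(all(x[i] for i in term) for term in dnf)

f = random_monotone_dnf(n=10, num_terms=10, term_length=3)
print(evaluate(f, [1] * 10))  # the all-ones input satisfies every monotone term
```

Since every literal in a monotone term is positive, the all-ones input always evaluates to true and the all-zeros input to false, which is a quick sanity check on any sampled formula.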
Projective DNF Formulae and Their Revision
"... Combinatorial and learnability results are proven for projective disjunctive normal forms, a class of DNF expressions introduced by Valiant. Dedicated to Prof. Peter Hammer on the occasion of his 70th birthday. 1 ..."
Abstract
 Add to MetaCart
Combinatorial and learnability results are proven for projective disjunctive normal forms, a class of DNF expressions introduced by Valiant. Dedicated to Prof. Peter Hammer on the occasion of his 70th birthday. 1
Projective DNF formulae and their revision
 In Learning Theory and Kernel Machines, 16th Annual Conference on Learning Theory and 7th Kernel Workshop, COLT/Kernel 2003
"... Valiant argued that biology imposes various constraints on learnability, and, motivated by these constraints, introduced his model of projection learning [14]. Projection learning aims to learn a target concept over some large domain, in this paper {0, 1} n, by learning some of its projections to a ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
Valiant argued that biology imposes various constraints on learnability, and, motivated by these constraints, introduced his model of projection learning [14]. Projection learning aims to learn a target concept over some large domain, in this paper {0, 1} n, by learning some of its projections to a class of smaller domains,
Learning Decision Lists
, 2001
"... This paper introduces a new representation for Boolean functions, called decision lists, and shows that they are efficiently learnable from examples. More precisely, this result is established for \kDL" { the set of decision lists with conjunctive clauses of size k at each decision. Since k ..."
Abstract

Cited by 427 (0 self)
 Add to MetaCart
kDL properly includes other wellknown techniques for representing Boolean functions such as kCNF (formulae in conjunctive normal form with at most k literals per clause), kDNF (formulae in disjunctive normal form with at most k literals per term), and decision trees of depth k, our result
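The k-DL representation in the entry above is an ordered list of (clause, output) pairs: the output of the first conjunctive clause satisfied by the input wins, with a default value if none fire. A minimal illustrative evaluator, assuming literals are encoded as (variable index, required sign) pairs (an encoding choice made here, not taken from the paper):

```python
def eval_decision_list(decision_list, default, x):
    """Evaluate a decision list on input x: scan the (clause, value) pairs
    in order and return the value attached to the first conjunctive clause
    satisfied by x; return `default` if no clause fires."""
    for clause, value in decision_list:
        if all(x[i] == sign for i, sign in clause):
            return value
    return default

# A 2-DL: "if x0 and not x1 then 1; elif x2 then 0; else 1"
dl = [([(0, 1), (1, 0)], 1), ([(2, 1)], 0)]
print(eval_decision_list(dl, 1, [1, 0, 0]))  # first clause fires, output 1
```

With clauses restricted to at most k literals this evaluates a k-DL; a k-DNF is the special case where every pair's output value is 1 and the default is 0.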
On learning monotone DNF under product distributions
 In Proceedings of the Fourteenth Annual Conference on Computational Learning Theory
, 2001
"... We show that the class of monotone 2 O( √ log n)term DNF formulae can be PAC learned in polynomial time under the uniform distribution from random examples only. This is an exponential improvement over the best previous polynomialtime algorithms in this model, which could learn monotone o(log 2 n) ..."
Abstract

Cited by 32 (11 self)
 Add to MetaCart
We show that the class of monotone 2 O( √ log n)term DNF formulae can be PAC learned in polynomial time under the uniform distribution from random examples only. This is an exponential improvement over the best previous polynomialtime algorithms in this model, which could learn monotone o(log 2 n)term
On Learning Visual Concepts and DNF Formulae
, 1993
"... We consider the problem of learning DNF formulae in the mistakebound and the PAC models. We develop a new approach, which is called polynomial explainability, that is shown to be useful for learning some new subclasses of DNF (and CNF) formulae that were not known to be learnable before. Unlike pre ..."
Abstract

Cited by 21 (5 self)
 Add to MetaCart
previous learnability results for DNF (and CNF) formulae, these subclasses are not limited in the number of terms or in the number of variables per term; yet, they contain the subclasses of kDNF and ktermDNF (and the corresponding classes of CNF) as special cases. We apply our DNF results to the problem
Molecular learning of wDNF formulae
 In Proc. 11th Int. Meeting on DNA Computing (DNA)
, 2005
"... Abstract. We introduce a class of generalized DNF formulae called wDNF or weighted disjunctive normal form, and present a molecular algorithm that learns a wDNF formula from training examples. Realized in DNA molecules, the wDNF machines have a natural probabilistic semantics, allowing for their app ..."
Abstract

Cited by 2 (1 self)
 Add to MetaCart
Abstract. We introduce a class of generalized DNF formulae called wDNF or weighted disjunctive normal form, and present a molecular algorithm that learns a wDNF formula from training examples. Realized in DNA molecules, the wDNF machines have a natural probabilistic semantics, allowing
Mansour’s Conjecture is True for Random DNF Formulas
, 2010
"... In 1994, Y. Mansour conjectured that for every DNF formula on n variables with t terms there exists a polynomial p with t O(log(1/ǫ)) nonzero coefficients such that E x∈{0,1} n[(p(x) − f(x)) 2] ≤ ǫ. We make the first progress on this conjecture and show that it is true for randomly chosen DNF for ..."
Abstract

Cited by 7 (1 self)
 Add to MetaCart
In 1994, Y. Mansour conjectured that for every DNF formula on n variables with t terms there exists a polynomial p with t O(log(1/ǫ)) nonzero coefficients such that E x∈{0,1} n[(p(x) − f(x)) 2] ≤ ǫ. We make the first progress on this conjecture and show that it is true for randomly chosen DNF
On learning random DNF formulas under the uniform distribution
 In Proc. 9th Internat. Workshop on Randomization and Computation (RANDOM'05)
, 2005
"... Abstract: We study the averagecase learnability of DNF formulas in the model of learning from uniformly distributed random examples. We define a natural model of random monotone DNF formulas and give an efficient algorithm which with high probability can learn, for any fixed constant γ> 0, a ran ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
Abstract: We study the averagecase learnability of DNF formulas in the model of learning from uniformly distributed random examples. We define a natural model of random monotone DNF formulas and give an efficient algorithm which with high probability can learn, for any fixed constant γ> 0, a
A path integral approach to the Kontsevich quantization formula
, 1999
"... We give a quantum field theory interpretation of Kontsevich’s deformation quantization formula for Poisson manifolds. We show that it is given by the perturbative expansion of the path integral of a simple topological bosonic open string theory. Its Batalin–Vilkovisky quantization yields a supercon ..."
Abstract

Cited by 306 (21 self)
 Add to MetaCart
We give a quantum field theory interpretation of Kontsevich’s deformation quantization formula for Poisson manifolds. We show that it is given by the perturbative expansion of the path integral of a simple topological bosonic open string theory. Its Batalin–Vilkovisky quantization yields a