Results 1 – 6 of 6
On the Complexity of Recognizing Iterated Differences of Polyhedra
, 1997
"... The iterated difference of polyhedra V = P1n(P2n(:::Pk):::) has been proposed independently in [11] and [7] as a sufficient condition for V to be exactly computable by a two layered neural network. An algorithm checking whether V IR d is an iterated difference of polyhedra is proposed in [11]. Howev ..."
Abstract

Cited by 2 (2 self)
The iterated difference of polyhedra V = P1 \ (P2 \ (... Pk)...) has been proposed independently in [11] and [7] as a sufficient condition for V to be exactly computable by a two-layered neural network. An algorithm checking whether V ⊆ R^d is an iterated difference of polyhedra is proposed in [11]. However, this algorithm is not practically usable because it has a high computational complexity, and it was only conjectured to stop with a negative answer when applied to a region which is not an iterated difference of polyhedra. This paper sheds some light on the nature of iterated differences of polyhedra. The outcomes are: (i) an algorithm which always stops after a small number of iterations, (ii) sufficient conditions for this algorithm to be polynomial, and (iii) the proof that an iterated difference of polyhedra can be exactly computed by a two-layered neural network using only essential hyperplanes.
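As a quick illustration of the construction this abstract describes (the helper names and the 1-D interval example below are mine, not from the paper), membership in an iterated difference V = P1 \ (P2 \ (... Pk)...) can be tested by direct recursion; for nested polyhedra, x lies in V exactly when an odd number of the Pi contain it:

```python
# Membership test for an iterated difference of polyhedra
# V = P1 \ (P2 \ (... Pk)...). Each polyhedron is given as a
# membership predicate (hypothetical representation for illustration).

def in_iterated_difference(polys, x):
    """polys = [P1, ..., Pk] as predicates; returns True iff x is in V."""
    if not polys:
        return False
    # x is in P1 \ (inner difference) iff x is in P1 but not in the inner part.
    return polys[0](x) and not in_iterated_difference(polys[1:], x)

# Nested 1-D "polyhedra" (intervals): P3 inside P2 inside P1.
P1 = lambda x: 0 <= x <= 10
P2 = lambda x: 2 <= x <= 8
P3 = lambda x: 4 <= x <= 6
polys = [P1, P2, P3]

# For nested polyhedra, x is in V exactly when the number of P_i
# containing x is odd.
for x in [1, 3, 5, 7, 9, 11]:
    parity_odd = sum(p(x) for p in polys) % 2 == 1
    assert in_iterated_difference(polys, x) == parity_odd

print(in_iterated_difference(polys, 5))   # True: inside P1, P2 and P3
print(in_iterated_difference(polys, 3))   # False: lies in P2 \ P3
```

The recursion makes the alternating inclusion/exclusion structure explicit; the papers' contribution is deciding whether an arbitrary region admits such a decomposition at all.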
Advances for Exact Resolution of Polyhedral Dichotomies By Multilayer Neural Networks
"... We study the number of hidden layers required by a multilayer neural network with threshold units to compute a dichotomy from R d to f0; 1g, defined by a finite set of hyperplanes. We show that this question is far more intricate than computing Boolean functions, although this wellknown problem i ..."
Abstract
We study the number of hidden layers required by a multilayer neural network with threshold units to compute a dichotomy from R^d to {0, 1}, defined by a finite set of hyperplanes. We show that this question is far more intricate than computing Boolean functions, although this well-known problem underlies our research. We present new advances on the characterization of dichotomies from R^2 to {0, 1} which require two hidden layers to be exactly realized.
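To make the objects of study concrete (this example is mine, not the paper's, and makes no claim about minimal depth), here is a two-hidden-layer threshold network exactly realizing one polyhedral dichotomy of R^2: the regions where x and y have the same sign, defined by the two hyperplanes x = 0 and y = 0:

```python
# Sketch of a threshold-unit network with two hidden layers realizing
# the dichotomy f(x, y) = 1 iff x and y are both >= 0 or both < 0.
# Unit names and the boundary convention (>= 0 counts as positive side)
# are illustrative choices.

def step(z):
    """Heaviside threshold unit."""
    return 1 if z >= 0 else 0

def f(x, y):
    # First hidden layer: one threshold unit per defining hyperplane.
    h1 = step(x)        # fires iff x >= 0
    h2 = step(y)        # fires iff y >= 0
    # Second hidden layer: AND and NOR of the first-layer activities.
    g1 = step(h1 + h2 - 2)    # both units fire
    g2 = step(-h1 - h2)       # neither unit fires
    # Output unit: OR of the second-layer units.
    return step(g1 + g2 - 1)

print(f(1.0, 2.0), f(-1.0, -2.0), f(1.0, -2.0), f(-1.0, 2.0))  # 1 1 0 0
```

The interesting question the papers address is which such dichotomies cannot be realized with a single hidden layer, making the second layer genuinely necessary.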
Multilayer Neural Networks and Polyhedral Dichotomies
, 1997
"... We study the number of hidden layers required by a multilayer neural network with threshold units to compute a dichotomy f from R d to f0; 1g, defined by a finite set of hyperplanes. We show that this question is far more intricate than computing Boolean functions, although this wellknown problem ..."
Abstract
We study the number of hidden layers required by a multilayer neural network with threshold units to compute a dichotomy f from R^d to {0, 1}, defined by a finite set of hyperplanes. We show that this question is far more intricate than computing Boolean functions, although this well-known problem underlies our research. We present recent advances on the characterization of dichotomies from R^2 to {0, 1} which require two hidden layers to be exactly realized.
On the Complexity of Recognising Iterated Differences of Polyhedra
, 1997
"... The iterated difference of polyhedra V = P1n(P2n(: : : Pk ) : : :) has been proposed independently in [11] and [7] as a sufficient condition for V to be exactly computable by a twolayered neural network. An algorithm checking whether V ae IR d is an iterated difference of polyhedra is proposed in ..."
Abstract
The iterated difference of polyhedra V = P1 \ (P2 \ (... Pk)...) has been proposed independently in [11] and [7] as a sufficient condition for V to be exactly computable by a two-layered neural network. An algorithm checking whether V ⊆ R^d is an iterated difference of polyhedra is proposed in [11]. However, this algorithm is not practically usable because it has a high computational complexity, and it was only conjectured to stop with a negative answer when applied to a region which is not an iterated difference of polyhedra. This paper sheds some light on the nature of iterated differences of polyhedra. The outcomes are: (i) an algorithm which always stops after a small number of iterations, (ii) sufficient conditions for this algorithm to be polynomial, and (iii) the proof that an iterated difference of polyhedra can be exactly computed by a two-layered neural network using only essential hyperplanes.
On the Complexity of Recognising Regions Computable by Two-Layered Perceptrons
, 1998
"... This work is concerned with the computational complexity of the recognition of LP 2 , the class of regions of the Euclidian space that can be classified exactly by a twolayered perceptron. Some subclasses of LP 2 of particular interest are also studied, such as the class of iterated differences of ..."
Abstract
This work is concerned with the computational complexity of recognizing LP_2, the class of regions of Euclidean space that can be classified exactly by a two-layered perceptron. Some subclasses of LP_2 of particular interest are also studied, such as the class of iterated differences of polyhedra, or the class of regions V that can be classified by a two-layered perceptron whose only hidden units are those associated with the (d − 1)-dimensional facets of V. In this paper, we show that the recognition problem for LP_2, as well as for most other subclasses considered here, is NP-hard in the most general case. We then identify special cases that admit polynomial-time algorithms.
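The simplest members of the facet-based subclass mentioned above are convex polyhedra: one hidden threshold unit per facet hyperplane, and an output unit computing their AND. A minimal sketch for the unit square in R^2 (the function names and this particular region are my illustrative choices, not from the paper):

```python
# Two-layered perceptron classifying the unit square [0,1] x [0,1],
# using only hidden units associated with its four facet hyperplanes.

def step(z):
    """Heaviside threshold unit."""
    return 1 if z >= 0 else 0

def in_unit_square(x, y):
    # One hidden threshold unit per facet hyperplane of the square.
    h = [step(x),       # x >= 0
         step(1 - x),   # x <= 1
         step(y),       # y >= 0
         step(1 - y)]   # y <= 1
    # Output unit fires iff every facet constraint is satisfied (AND).
    return step(sum(h) - 4)

print(in_unit_square(0.5, 0.5))  # 1
print(in_unit_square(1.5, 0.5))  # 0
```

For non-convex regions the output weights are no longer a plain AND, and deciding whether facet hyperplanes alone suffice is part of the recognition problem the paper shows to be NP-hard in general.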
On the Complexity of Recognizing Regions of R^d Computable by Two-Layered Perceptrons
"... This work is concerned with the computational complexity of the recognition of LP 2 , the class of regions of the Euclidian space that can be classified exactly by a twolayered perceptron. Some subclasses of LP 2 of particular interest are also studied, such as the class of iterated differences of ..."
Abstract
This work is concerned with the computational complexity of recognizing LP_2, the class of regions of Euclidean space that can be classified exactly by a two-layered perceptron. Some subclasses of LP_2 of particular interest are also studied, such as the class of iterated differences of polyhedra, or the class of regions V that can be classified by a two-layered perceptron whose only hidden units are those associated with the (d − 1)-dimensional facets of V. In this paper, we show that the recognition problem for LP_2, as well as for most other subclasses considered here, is NP-hard in the most general case. We then identify special cases that admit polynomial-time algorithms.