### Citations

1893 | Multilayer feedforward networks are universal approximators
- Hornik, Stinchcombe, et al.
- 1989
Citation Context: ...functions with binary networks or binary multilayer networks. On the one hand, more recent work focused on approximately realizing real functions with multilayer neural networks with one hidden layer [7, 8, 13] or with two hidden units [2]. On the other hand, some authors [1, 14] were interested in finding bounds on the architecture of multilayer networks for exact realization of a finite set of points. Ano...

1248 | Approximation by Superpositions of a Sigmoidal Function
- Cybenko
- 1989
Citation Context: (same context as quoted above)

482 | On the approximate realization of continuous mappings by neural networks
- Funahashi
- 1989
Citation Context: (same context as quoted above)
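The universal-approximation results cited above ([7, 8, 13]) can be illustrated with a minimal sketch: a sum of sigmoid units, c·σ(wx + b), fitted to a continuous target. For brevity the hidden weights below are drawn at random and only the output weights are solved by least squares; this is an illustrative assumption, not the construction used in the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Target: a continuous function on [0, 1].
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)

# One hidden layer of 50 sigmoid units. Transition points u are spread
# over [0, 1] so the random features cover the whole domain.
n_hidden = 50
u = rng.uniform(0.0, 1.0, size=n_hidden)
w = rng.uniform(-20.0, 20.0, size=n_hidden)
b = -w * u
H = sigmoid(np.outer(x, w) + b)          # hidden activations, shape (200, 50)

# Solve for the output weights c minimizing ||H c - y||_2.
c, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ c

max_err = np.max(np.abs(y_hat - y))
print(f"max |error| = {max_err:.4f}")
```

Increasing `n_hidden` drives the error down further, which is the qualitative content of the universality theorems.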

216 | Threshold Logic and Its Applications
- Muroga
- 1971
Citation Context: ...ahashi, Tomita and Kawabata have presented a notion of "cyclicity", in the same context of research, but with a different point of view. They start from the notion of summability of Boolean functions [15], and n-cyclicity can be viewed as a reinterpretation of n-summability. Given the hyperplanes associated to the hidden units of a fixed network (essential hyperplanes, plus redundant hyperplanes), fi...

59 | Geometric Algorithms and Combinatorial Optimization
- Grötschel, Lovász, et al.
- 1988

51 | Approximation theory and feedforward networks
- Blum, Li
- 1991
Citation Context: ...nary multilayer networks. On the one hand, more recent work focused on approximately realizing real functions with multilayer neural networks with one hidden layer [7, 8, 13] or with two hidden units [2]. On the other hand, some authors [1, 14] were interested in finding bounds on the architecture of multilayer networks for exact realization of a finite set of points. Another approach is to search th...

44 | On the capabilities of multilayer perceptrons
- Baum
- 1988
Citation Context: ...hand, more recent work focused on approximately realizing real functions with multilayer neural networks with one hidden layer [7, 8, 13] or with two hidden units [2]. On the other hand, some authors [1, 14] were interested in finding bounds on the architecture of multilayer networks for exact realization of a finite set of points. Another approach is to search the minimal architecture of multilayer netw...

33 | Bounds on the number of hidden neurons in multilayer perceptrons
- Huang, Huang
- 1991
Citation Context: (same context as quoted above)

25 | On the Decision Regions of Multilayer Perceptrons
- Gibson, Cowan
- 1990
Citation Context: ...dral dichotomy f, from R^d to {0, 1}, can be realized by a one-hidden-layer network, then it cannot be in an XOR-situation, nor in an XOR-bow-tie, nor in an XOR-at-infinity. The proof can be found in [11, 6] for the XOR-situation, in [17] for the XOR-bow-tie, and in [6] for the XOR-at-infinity. The sketch of these proofs is always the same: the four regions (two in each class) and their respective labell...
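The "inconsistency in the system of inequalities" argument behind the XOR-situation can be seen in miniature with threshold units: no single threshold unit can realize the four XOR-labelled points, while one hidden layer of two threshold units realizes them exactly. The weights below are hand-picked for illustration and are not taken from the cited papers.

```python
import itertools

def threshold(z):
    return 1 if z >= 0 else 0

points = [(0, 0), (0, 1), (1, 0), (1, 1)]
xor    = [0, 1, 1, 0]

# (a) No single threshold unit w1*x + w2*y + b realizes XOR: a coarse
# search over small integer weights finds no consistent assignment
# (the four inequalities it induces contradict each other).
single_ok = any(
    all(threshold(w1 * x + w2 * y + b) == t
        for (x, y), t in zip(points, xor))
    for w1, w2, b in itertools.product(range(-5, 6), repeat=3)
)
print("single threshold unit realizes XOR:", single_ok)   # False

# (b) One hidden layer of two threshold units realizes XOR exactly:
# h1 fires when x + y >= 1, h2 fires when x + y >= 2, and the output
# unit fires when h1 - h2 >= 1.
def net(x, y):
    h1 = threshold(x + y - 1)
    h2 = threshold(x + y - 2)
    return threshold(h1 - h2 - 1)

outputs = [net(x, y) for (x, y) in points]
print("one-hidden-layer outputs:", outputs)               # [0, 1, 1, 0]
```

The contradiction in case (a) is exactly the proof sketch described in the context above: the four labelled regions induce an inconsistent system of inequalities for a single-unit solution.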

7 | A combinatorial approach to understanding perceptron decision regions
- Gibson
- 1993
Citation Context: ...nother approach is to search the minimal architecture of multilayer networks for exactly realizing real functions, from R^d to {0, 1}. Our work, of the latter kind, is a continuation of the effort of [5, 6, 9, 10] towards characterizing the real dichotomies which can be exactly realized with a single hidden layer neural network composed of threshold units. We show how this research is related to geometric algo...

7 | Exact classification with two-layer neural nets in n dimensions
- Sweatman, Gibson, et al.
- 1998
Citation Context: (same context as quoted above)

6 | Multilayer neural networks: one or two hidden layers?
- Brightwell, Kenyon, et al.
- 1997
Citation Context: ...DVANCES 3.1 Local realization in R^2 The next two theorems prove that, in R^2, the XOR-bow-tie and the XOR-at-infinity are the only restrictions to local realizability. Their proofs can be found in [4, 3]. Theorem 3 Let f be a polyhedral dichotomy on R^2 and let P be a point of multiple intersection. Let C_P be a neighborhood of P which does not intersect any essential hyperplane other than those goin...

5 | Separability of internal representations in multilayer perceptrons with application to learning
- Takahashi, Tomita, et al.
- 1993
Citation Context: ...respective labellings induce an inconsistency in the system of inequalities associated to a one-hidden-layer solution. 2.2 Network approach In contrast with our geometrical definitions, note that in [16], Takahashi, Tomita and Kawabata have presented a notion of "cyclicity", in the same context of research, but with a different point of view. They start from the notion of summability of Boolean funct...

4 | Complexity issues in neural network computations
- Cosnard, Koiran, et al.
- 1992
Citation Context: (same context as quoted under Gibson 1993 above)

3 | A step towards the frontier between one-hidden-layer and two-hidden-layer neural networks
- Cosnard, Koiran, et al.
- 1993

3 | The complexity of multi-layered perceptrons
- Zwietering
- 1994
Citation Context: ...{0, 1}, can be realized by a one-hidden-layer network, then it cannot be in an XOR-situation, nor in an XOR-bow-tie, nor in an XOR-at-infinity. The proof can be found in [11, 6] for the XOR-situation, in [17] for the XOR-bow-tie, and in [6] for the XOR-at-infinity. The sketch of these proofs is always the same: the four regions (two in each class) and their respective labellings induce an inconsistency in...