Results 1 – 10 of 118,270
On The Computational Power Of Neural Nets
Journal of Computer and System Sciences, 1995
Abstract: "This paper deals with finite-size networks which consist of interconnections of synchronously evolving processors. Each processor updates its state by applying a 'sigmoidal' function to a linear combination of the previous states of all units. We prove that one may simulate all Turing Machines by such nets. In particular, one can simulate any multi-stack Turing Machine in real time, and there is a net made up of 886 processors which computes a universal partial-recursive function. Products (high-order nets) are not required, contrary to what had been stated in the literature. ..."
Cited by 172 (23 self)
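The synchronous update rule described in the abstract can be sketched as follows. This is a minimal illustration with arbitrary weights; the paper's actual construction relies on a saturated-linear 'sigmoid' and carefully chosen rational weights, which this toy net does not reproduce.

```python
import numpy as np

def saturated_linear(x):
    """Piecewise-linear 'sigmoid': 0 below 0, identity on [0, 1], 1 above 1."""
    return np.clip(x, 0.0, 1.0)

def step(W, b, state):
    """One synchronous update: every processor applies the activation to a
    linear combination of the previous states of all units."""
    return saturated_linear(W @ state + b)

# Toy 3-unit net; these weight values are purely illustrative.
W = np.array([[0.5, 0.0, 0.5],
              [1.0, -1.0, 0.0],
              [0.0, 0.5, 0.5]])
b = np.array([0.0, 0.5, 0.0])
x = np.array([1.0, 0.0, 0.5])
x = step(W, b, x)  # → array([0.75, 1.0, 0.25])
```

All processors update in lockstep from the same previous state vector, which is what "synchronously evolving" means here.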
On Combining Artificial Neural Nets
Connection Science, 1996
Abstract: "This paper reviews research on combining artificial neural nets, and provides an overview of, and an introduction to, the papers contained in this Special Issue and its companion (Connection Science, 9, 1). Two main approaches, ensemble-based and modular, are identified and considered. An ensemble ..."
Cited by 101 (3 self)
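A minimal sketch of the ensemble-based approach, assuming each trained net exposes a callable that returns class probabilities; the stand-in nets and their output values here are illustrative, not taken from the paper:

```python
import numpy as np

def ensemble_predict(nets, x):
    """Combine several trained nets by averaging their class-probability
    outputs, then pick the class with the highest mean probability."""
    probs = np.mean([net(x) for net in nets], axis=0)
    return int(np.argmax(probs))

# Stand-in 'nets': callables returning probabilities for two classes.
net_a = lambda x: np.array([0.7, 0.3])
net_b = lambda x: np.array([0.6, 0.4])
net_c = lambda x: np.array([0.1, 0.9])
pred = ensemble_predict([net_a, net_b, net_c], None)  # mean = [0.466, 0.533] → class 1
```

Simple averaging is only one combination rule; weighted averaging and majority voting are common alternatives in this literature.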
and neural nets
"... A possible alternative to fine topology tuning for Neural Network (NN) optimization is to use Echo State Networks (ESNs), recurrent NNs built upon a large reservoir of sparsely randomly connected neurons. The promises of ESNs have been fulfilled for supervised learning tasks, but unsupervised learni ..."
Abstract
 Add to MetaCart
A possible alternative to fine topology tuning for Neural Network (NN) optimization is to use Echo State Networks (ESNs), recurrent NNs built upon a large reservoir of sparsely randomly connected neurons. The promises of ESNs have been fulfilled for supervised learning tasks, but unsupervised
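A sketch of such a reservoir, using roughly 10% connectivity and spectral-radius rescaling as a common heuristic for the echo state property; all sizes and constants below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_reservoir, n_in = 100, 1

# Sparse random recurrent weights, rescaled so the spectral radius is
# below 1 (a standard heuristic for the echo state property).
W = rng.uniform(-1, 1, (n_reservoir, n_reservoir))
W[rng.random(W.shape) > 0.1] = 0.0             # keep ~10% of connections
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius ≈ 0.9
W_in = rng.uniform(-1, 1, (n_reservoir, n_in))

def run_reservoir(inputs):
    """Drive the fixed reservoir with an input sequence and collect states.
    In an ESN only a linear readout on these states is trained; the
    reservoir weights stay untouched."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

states = run_reservoir(np.sin(np.linspace(0, 6, 50)))
```

This is what makes ESNs an alternative to topology tuning: the recurrent part is random and fixed, so only the readout is optimized.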
Neural Networks Neural Nets
"... The use of artificial neural systems (ANS) for various recognition tasks is well founded in the literature [Aarts,1989; Ansari,1993; Carpenter,1988; Fukushima,1983; Mui,1994; Shang,1994; Touretzky,1989; Yong,1988]. The advantage of using neural nets, as they are commonly called, is not that the solu ..."
Abstract
 Add to MetaCart
The use of artificial neural systems (ANS) for various recognition tasks is well founded in the literature [Aarts,1989; Ansari,1993; Carpenter,1988; Fukushima,1983; Mui,1994; Shang,1994; Touretzky,1989; Yong,1988]. The advantage of using neural nets, as they are commonly called
Using mutual information for selecting features in supervised neural net learning
IEEE Transactions on Neural Networks, 1994
Abstract: "This paper investigates the application of the mutual information criterion to evaluate a set of candidate features and to select an informative subset to be used as input data for a neural network classifier. Because the mutual information measures arbitrary dependencies between random variables, it is ..."
Cited by 339 (1 self)
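For discrete variables, the criterion can be sketched directly from the definition I(X;Y) = Σ p(x,y) · log₂[p(x,y) / (p(x)p(y))]; the toy features below are illustrative, not data from the paper:

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """I(X;Y) for two discrete sequences, in bits."""
    n = len(x)
    pxy = Counter(zip(x, y))          # joint counts
    px, py = Counter(x), Counter(y)   # marginal counts
    return sum((c / n) * np.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

# f1 determines the class; f2 is independent noise.
cls = [0, 0, 1, 1, 0, 0, 1, 1]
f1  = [0, 0, 1, 1, 0, 0, 1, 1]
f2  = [0, 1, 0, 1, 0, 1, 0, 1]
mi_relevant = mutual_information(f1, cls)  # 1.0 bit
mi_noise    = mutual_information(f2, cls)  # 0.0 bits
```

A feature-selection loop would rank candidate features by such scores and keep an informative subset as the classifier's input; unlike correlation, mutual information also captures non-linear dependencies.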
Turing Computability With Neural Nets
Applied Mathematics Letters, 1991
Abstract: "This paper shows the existence of a finite neural network, made up of sigmoidal neurons, which simulates a universal Turing machine. It is composed of fewer than 10^5 synchronously evolving processors, interconnected linearly. High-order connections are not required. This paper addresses the question: What ultimate limitations, if any, are imposed by the use of neural nets as computing devices? In particular, and ignoring issues of training and practicality of implementation, one would like to know if every problem that can be solved by a digital computer is also solvable in ..."
Cited by 81 (14 self)
Combining Diverse Neural Nets
The Knowledge Engineering Review, 1997
Abstract: "An appropriate use of neural computing techniques is to apply them to problems such as condition monitoring, fault diagnosis, control and sensing, where conventional solutions can be hard to obtain. However, when neural computing techniques are used, it is important that they are employed so as ... Sets of neural nets are combined in ensembles, and ensembles can be viewed as an example of the reliability-through-redundancy approach recommended for conventional software and hardware in safety-critical or safety-related applications. Although there has been recent interest in the use ..."
Cited by 38 (1 self)
On the Identification of Recurrent Neural Nets
"... In this paper observational equivalence for socalled Jordan networks, which are a special class of recurrent networks, is analysed. We show this type of neural nets to belong to a wider class of mixed networks and use the description of observational equivalence available for the latter class for o ..."
Abstract
 Add to MetaCart
In this paper observational equivalence for socalled Jordan networks, which are a special class of recurrent networks, is analysed. We show this type of neural nets to belong to a wider class of mixed networks and use the description of observational equivalence available for the latter class
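A minimal sketch of a Jordan-network step, in which the context units hold a decayed trace of past outputs (Elman nets, by contrast, feed back the hidden state); the decay constant, sizes, and random weights are assumptions for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 2, 4, 1
W_in  = rng.normal(size=(n_hidden, n_in))   # input → hidden
W_c   = rng.normal(size=(n_hidden, n_out))  # context → hidden
W_out = rng.normal(size=(n_out, n_hidden))  # hidden → output

def jordan_step(u, context):
    """One step of a Jordan network: the hidden layer sees the current
    input plus a context built from past *outputs*."""
    hidden = np.tanh(W_in @ u + W_c @ context)
    output = np.tanh(W_out @ hidden)
    return output, 0.5 * context + output   # 0.5 = assumed context decay

context = np.zeros(n_out)
for u in [np.array([1.0, 0.0]), np.array([0.0, 1.0])]:
    y, context = jordan_step(u, context)
```

Two such networks are observationally equivalent when they produce the same output sequence for every input sequence, which is the relation the paper analyses.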
Building Neural Net Software
1999
Abstract: "In a recent paper [Neto et al. 97] we showed that programming languages can be translated onto recurrent (analog, rational-weighted) neural nets. The goal was not efficiency but simplicity. Indeed, we used a number-theoretic approach to machine programming, where (integer) numbers were coded in a una ..."
Cited by 1 (1 self)