Results 1–10 of 50,129
Feed Forward Neural Networks
"... this contribution briefly reviews the application of neural network methods to medical problems and characterizes their advantages and problems in the context of the medical background. Various research shows that the diagnostic capabilities of humans are worse than those of the neural network strategy to diagnose ..."
"... propagation algorithm approach is presented. Finally, as a case study of neural rule-based diagnosis, septic shock diagnosis is described, on the one hand by a growing neural network and on the other hand by a rule-based system. Keywords: Medical Diagnosis, Artificial Feed Forward Neural Networks, Back Propagation ..."
Feed Forward Neural Network Entities
 Lecture Notes in Computer Science: Biological and Artificial Computation: From Neuroscience to Technology
, 1997
"... Feed Forward Neural Networks (FFNNs) are computational techniques inspired by the physiology of the brain and used in the approximation of general mappings from one finite dimensional space to another. They present a practical application of the theoretical resolution of Hilbert's 13th pro ..."
Cited by 2 (0 self)
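The mapping view in the snippet above (an FFNN approximating a map from one finite-dimensional space to another) can be sketched in a few lines of pure Python; the weights below are arbitrary values chosen for illustration, not from any of the cited papers:

```python
import math

def forward(x, W1, b1, W2, b2):
    """One-hidden-layer feed-forward pass mapping R^2 -> R^1.

    h = tanh(W1 x + b1);  y = W2 h + b2.
    """
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = [sum(w * hi for w, hi in zip(row, h)) + b
         for row, b in zip(W2, b2)]
    return y

# Hypothetical fixed weights, for demonstration only.
W1 = [[0.5, -0.25], [0.1, 0.3]]
b1 = [0.0, 0.1]
W2 = [[1.0, -1.0]]
b2 = [0.05]

out = forward([1.0, 2.0], W1, b1, W2, b2)
```

With trainable weights and enough hidden units, the same shape of computation underlies the universal-approximation results the abstract alludes to.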
Parallelization of Backpropagation Training for Feed-Forward Neural Networks
, 1996
"... The main objective of the work presented herein is to speed up neural network training using parallel processing. The backpropagation-trained feedforward neural network was selected for this research, since it has attracted the most interest among neural network researchers. ..."
Second Differentials in Arbitrary Feed-Forward Neural Networks
, 1996
"... We extend here a general mathematical model for feedforward neural networks. Such a network is represented as a vectorial function f of two variables, x (the input of the network) and w (the weight vector). We have already shown that the differential of f can be computed with an extended backpropa ..."
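The differential of f with respect to the weight vector w, as described above, can be illustrated on a toy two-weight network; the hand-unrolled chain rule below stands in for the paper's extended backpropagation (a simplifying assumption), and a central finite-difference check verifies it:

```python
import math

def f(x, w):
    # Toy network f(x, w): one tanh unit followed by a linear output weight.
    return w[1] * math.tanh(w[0] * x)

def grad_w(x, w):
    # Analytic differential with respect to w, by the chain rule:
    # df/dw0 = w1 * (1 - tanh^2(w0 x)) * x,  df/dw1 = tanh(w0 x).
    t = math.tanh(w[0] * x)
    return [w[1] * (1.0 - t * t) * x, t]

def grad_w_numeric(x, w, eps=1e-6):
    # Central finite differences, to check the analytic result.
    g = []
    for i in range(len(w)):
        wp, wm = list(w), list(w)
        wp[i] += eps
        wm[i] -= eps
        g.append((f(x, wp) - f(x, wm)) / (2.0 * eps))
    return g

w = [0.7, -1.3]
ga = grad_w(2.0, w)
gn = grad_w_numeric(2.0, w)
```

A finite-difference check like this is the standard sanity test for any hand-derived first differential before building second differentials on top of it.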
Efficient Training of Feed-Forward Neural Networks
, 1997
"... Since the discovery of the backpropagation method, many modified and new algorithms have been proposed for training of feedforward neural networks. The problem with slow convergence rate has, however, not been solved when the training is on large scale problems. There is still a need for more efficient ..."
Cited by 18 (0 self)
Periodic Symmetric Functions with Feed-Forward Neural Networks
, 1995
"... This technical report presents a new theoretical approach to the problem of switching network synthesis with McCulloch-Pitts feedforward neural networks. It is shown that any n-input periodic symmetric Boolean function F_p with period T and the first positive transition at x = a can be impl ..."
Cited by 6 (6 self)
Boltzmann Learning in a Feed-Forward Neural Network
, 1995
"... We show how a feedforward neural network can be successfully trained by using a simulated annealing (or Monte Carlo) technique. The network is initialized randomly. Then the configurations (weights of the network) are generated according to a Boltzmann distribution. By lowering the temperature of th ..."
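The annealing procedure sketched in the Boltzmann/Metropolis abstracts (random initialization, weight configurations accepted according to a Boltzmann distribution, temperature lowered over time) might look like the following toy sketch; the network shape, proposal width, cooling schedule, and data are all illustrative assumptions, not the papers' settings:

```python
import math
import random

def net(x, w):
    # Minimal feed-forward net: two tanh hidden units, linear output.
    h1 = math.tanh(w[0] * x + w[1])
    h2 = math.tanh(w[2] * x + w[3])
    return w[4] * h1 + w[5] * h2 + w[6]

def energy(w, data):
    # "Energy" of a weight configuration: sum of squared errors.
    return sum((net(x, w) - y) ** 2 for x, y in data)

def metropolis_train(data, steps=2000, t0=1.0, cooling=0.999, seed=0):
    rng = random.Random(seed)
    w = [rng.uniform(-1.0, 1.0) for _ in range(7)]   # random initialization
    e = energy(w, data)
    best_w, best_e = list(w), e
    t = t0
    for _ in range(steps):
        cand = list(w)
        cand[rng.randrange(len(w))] += rng.gauss(0.0, 0.1)  # propose a move
        e_cand = energy(cand, data)
        # Metropolis rule: always accept downhill moves; accept uphill
        # moves with Boltzmann probability exp(-(e_cand - e) / t).
        if e_cand < e or rng.random() < math.exp(-(e_cand - e) / t):
            w, e = cand, e_cand
            if e < best_e:
                best_w, best_e = list(w), e
        t *= cooling  # lower the temperature (annealing schedule)
    return best_w, best_e

# Toy regression targets, made up for illustration.
data = [(-1.0, -0.5), (0.0, 0.0), (1.0, 0.5)]
w, e = metropolis_train(data)
```

Because acceptance depends only on energy differences, this scheme needs no gradients at all, which is what distinguishes it from the backpropagation-based training in the other entries.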
Metropolis Learning in a Feed-Forward Neural Network
"... We show how a feedforward neural network can be successfully trained by using a simulated annealing (or Monte Carlo) technique. The network weights are initialized randomly. Then the configurations (weights of the network) are generated according to a Boltzmann distribution using the Metropolis algo ..."
Benchmarking Feed-Forward Neural Networks: Models and Measures
, 1992
"... Existing metrics for the learning performance of feedforward neural networks do not provide a satisfactory basis for comparison because the choice of the training epoch limit can determine the results of the comparison. I propose new metrics which have the desirable property of being independent of ..."
Cited by 2 (0 self)