Results 1–10 of 431,042
Feed Forward Neural Networks
"... this contribution briefly reviews the application of neural network methods to medical problems and characterizes their advantages and problems in the context of the medical background. Various research shows that the diagnostic capabilities of humans are inferior to those of the neural network strategy to diagnose ..."
A back-propagation algorithm approach is presented. Finally, as a case study of neural rule-based diagnosis, septic shock diagnosis is described, on the one hand by a growing neural network and on the other by a rule-based system. Keywords: Medical Diagnosis, Artificial Feed Forward Neural Networks, Back Propagation
Feed Forward Neural Network Entities
 Lecture Notes in Computer Science: Biological and Artificial Computation: From Neuroscience to Technology
, 1997
"... Feed Forward Neural Networks (FFNNs) are computational techniques inspired by the physiology of the brain and used in the approximation of general mappings from one finite-dimensional space to another. They present a practical application of the theoretical resolution of Hilbert's 13th pro ..."
Cited by 1 (0 self)
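The "mapping from one finite-dimensional space to another" that this entry describes can be sketched as a minimal two-layer forward pass (the architecture, weights, and sizes below are illustrative, not taken from the paper):

```python
import math

def ffnn_forward(x, W1, b1, W2, b2):
    """Two-layer feed-forward network mapping R^n -> R^m.
    Hidden layer uses tanh; output layer is linear."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = [sum(w * hi for w, hi in zip(row, h)) + b
         for row, b in zip(W2, b2)]
    return y

# Illustrative parameters: a 2-3-1 network with hand-picked weights.
W1 = [[0.5, -0.2], [0.1, 0.9], [-0.7, 0.3]]
b1 = [0.0, 0.1, -0.1]
W2 = [[1.0, -0.5, 0.25]]
b2 = [0.05]

print(ffnn_forward([1.0, 2.0], W1, b1, W2, b2))
```

With enough hidden units, such a network can approximate general continuous mappings; here the weights are fixed only to make the computation concrete.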
Parallelization of Backpropagation Training for Feed-Forward Neural Networks
, 1996
"... The main objective of the work presented herein is to speed up neural network training using parallel processing. The back-propagation-trained feedforward neural network was selected for this research, since it has attracted the most interest among neural network researchers. ..."
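The snippet does not reproduce the paper's specific parallelization scheme; one common pattern for parallelizing back-propagation training, computing gradients over data shards concurrently and combining them into a single update, can be sketched as follows (the linear model and all names are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def shard_gradient(w, shard):
    """Gradient of mean squared error for a linear model y = w*x
    over one shard of the training data."""
    g = 0.0
    for x, y in shard:
        g += 2.0 * (w * x - y) * x
    return g / len(shard)

def parallel_step(w, shards, lr=0.01):
    """One training step: each worker computes its shard's gradient,
    the results are averaged, and a single weight update is applied."""
    with ThreadPoolExecutor(max_workers=len(shards)) as pool:
        grads = list(pool.map(lambda s: shard_gradient(w, s), shards))
    return w - lr * sum(grads) / len(grads)

# Illustrative data: y = 3x, split across two workers.
data = [(x, 3.0 * x) for x in (1.0, 2.0, 3.0, 4.0)]
shards = [data[:2], data[2:]]
w = 0.0
for _ in range(200):
    w = parallel_step(w, shards, lr=0.05)
print(w)  # converges toward 3.0
```

In a real speedup setting the workers would run on separate processors; the thread pool here only illustrates the shard-and-average structure of the update.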
Second Differentials in Arbitrary Feed-Forward Neural Networks
, 1996
"... We extend here a general mathematical model for feedforward neural networks. Such a network is represented as a vectorial function f of two variables, x (the input of the network) and w (the weight vector). We have already shown that the differential of f can be computed with an extended backpropa ..."
Periodic Symmetric Functions with Feed-Forward Neural Networks
, 1995
"... This technical report presents a new theoretical approach to the problem of switching-network synthesis with McCulloch-Pitts feed-forward neural networks. It is shown that any n-input periodic symmetric Boolean function F_p with period T and first positive transition at x = a can be impl ..."
Cited by 6 (6 self)
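As an illustration of the kind of construction this entry describes: parity is a periodic symmetric Boolean function with period T = 2, and it can be realized by a two-layer network of McCulloch-Pitts threshold units (this particular construction is a standard one, not necessarily the paper's):

```python
def step(z, theta):
    """McCulloch-Pitts unit: fire iff weighted input reaches threshold."""
    return 1 if z >= theta else 0

def parity_network(bits):
    """Two-layer threshold network computing n-input parity.
    Hidden unit k fires when at least k inputs are on; the output
    unit combines them with alternating weights +1, -1, +1, ..."""
    n = len(bits)
    s = sum(bits)
    hidden = [step(s, k) for k in range(1, n + 1)]
    out = sum(((-1) ** k) * h for k, h in enumerate(hidden))
    return step(out, 1)

for bits in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
    print(bits, parity_network(list(bits)))
```

Because the function is symmetric, only the count of active inputs matters, which is what lets a small fixed network of threshold units implement it exactly.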
Response Analysis of Feed-Forward Neural Network Predictors
 in Proc. 1997 International Conference on Acoustics, Speech, and Signal Processing
, 1997
"... In this paper, we investigate the characteristics of some one-step-ahead nonlinear predictors based on a two-layer feed-forward neural network (2LFNN). The behavior of neural networks (NNs) is investigated in the frequency domain using two frequency-response estimation techniques, and in the time dom ..."
Cited by 2 (0 self)
Benchmarking Feed-Forward Neural Networks: Models and Measures
, 1992
"... Existing metrics for the learning performance of feedforward neural networks do not provide a satisfactory basis for comparison because the choice of the training epoch limit can determine the results of the comparison. I propose new metrics which have the desirable property of being independent of ..."
Cited by 2 (0 self)
Maximizing the Margin with Feedforward Neural Networks
, 2002
"... Feedforward Neural Networks (FNNs) and Support Vector Machines (SVMs) are two machine learning frameworks developed from very different points of view. In this work a new learning model for FNNs is proposed such that, in the linearly separable case, it tends to obtain the same solution as S ..."
Cited by 2 (1 self)
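The snippet does not give the paper's learning model; the connection it draws between FNNs and SVMs can be illustrated by gradient descent on a soft-margin objective (hinge loss plus weight decay) for a single linear unit, which in the linearly separable case drives the unit toward the maximum-margin separator (all data and parameters below are illustrative):

```python
def svm_objective_grad(w, b, data, lam):
    """Gradient of the soft-margin objective
    lam * w^2 + mean(max(0, 1 - y*(w*x + b))) for 1-D inputs."""
    gw, gb = 2 * lam * w, 0.0
    for x, y in data:
        if y * (w * x + b) < 1:      # point inside the margin
            gw -= y * x / len(data)
            gb -= y / len(data)
    return gw, gb

# Illustrative linearly separable data on the line; closest points at x = -1, +1.
data = [(-2.0, -1), (-1.0, -1), (1.0, 1), (2.0, 1)]
w, b, lam, lr = 0.0, 0.0, 0.01, 0.1
for _ in range(2000):
    gw, gb = svm_objective_grad(w, b, data, lam)
    w, b = w - lr * gw, b - lr * gb
print(w, b)
```

With small weight decay, the learned boundary settles near x = 0 with w close to 1, i.e. the margin constraints become tight exactly at the closest points, which is the SVM solution for this data.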
Metropolis Learning in a Feed-Forward Neural Network
"... We show how a feedforward neural network can be successfully trained by using a simulated annealing (or Monte Carlo) technique. The network weights are initialized randomly. Then the configurations (weights of the network) are generated according to a Boltzmann distribution using the Metropolis algo ..."
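The training loop described here, random initialization, Metropolis-accepted weight perturbations, and a gradually lowered temperature, can be sketched for a single tanh neuron (the model, schedule, and parameters are illustrative, not the paper's):

```python
import math, random

def mse(w, data):
    """Mean squared error of a one-neuron model y = tanh(w0*x + w1)."""
    return sum((math.tanh(w[0] * x + w[1]) - y) ** 2
               for x, y in data) / len(data)

def metropolis_train(data, steps=5000, temp=0.5, cooling=0.999,
                     scale=0.1, seed=0):
    """Simulated annealing: propose a random weight perturbation,
    accept it with the Metropolis rule exp(-dE / T), lower T gradually."""
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1), rng.uniform(-1, 1)]  # random initialization
    e = mse(w, data)
    for _ in range(steps):
        cand = [wi + rng.gauss(0, scale) for wi in w]
        e_cand = mse(cand, data)
        d = e_cand - e
        if d <= 0 or rng.random() < math.exp(-d / temp):
            w, e = cand, e_cand       # accepted configuration
        temp *= cooling               # lower the temperature
    return w, e

# Illustrative target: data generated by tanh(2x + 0.5).
data = [(x, math.tanh(2.0 * x + 0.5)) for x in (-1.0, -0.5, 0.0, 0.5, 1.0)]
w, e = metropolis_train(data)
print(w, e)
```

No gradients are computed anywhere: the Boltzmann acceptance rule alone drives the weights toward low-error configurations as the temperature falls.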
Boltzmann Learning in a Feed-Forward Neural Network
, 1995
"... We show how a feedforward neural network can be successfully trained by using a simulated annealing (or Monte Carlo) technique. The network is initialized randomly. Then the configurations (weights of the network) are generated according to a Boltzmann distribution. By lowering the temperature of th ..."