Results 1 - 10 of 1,666
Face Recognition with Multi-Layer Perceptrons
"... We present a face recognition system (still in development) with principal component analysis for feature extraction and multi-layered perceptrons for classification. For single multi-layered perceptrons, the average performance of our best classifier is 89.1% on a test set of 200 face images, while ..."
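The pipeline this abstract describes (PCA for feature extraction, followed by an MLP for classification) can be sketched in outline. The data, dimensions, and function name below are illustrative assumptions, not the authors' setup; only the PCA feature-extraction step is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a face dataset: 200 flattened "images" of 64 pixels each.
# (Dataset and dimensions are hypothetical, not from the paper.)
X = rng.normal(size=(200, 64))

def pca_project(X, n_components):
    """Project the rows of X onto the top principal components via SVD."""
    X_centered = X - X.mean(axis=0)
    # Rows of Vt are the principal directions ("eigenfaces" for face images).
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:n_components].T

features = pca_project(X, n_components=10)
print(features.shape)  # (200, 10): compact features to feed an MLP classifier
```

The projected features, rather than raw pixels, would then be the inputs to the multi-layered perceptron.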
Multi-layer Perceptrons with Discrete Weights
- Proc. of IJCNN International Joint Conference on Neural Networks, San Diego (CA, USA), pp. II-623–II-630, 1990
Cited by 3 (1 self)
"... Digital VLSI neural networks are currently very promising for implementing high-performance versions of such networks. In such digital circuits the multiply phase of neuron activity is crucial for yielding high performance. This paper studies the feasibility of restricting the weight values in multi-layer perceptrons to powers of two or sums of powers of two. Multipliers could thus be substituted by shifters and adders in digital hardware, saving both time and chip area, under the assumption that the neuron activation function is computed through a look-up table (LUT), and that a LUT may ..."
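The restriction the abstract describes, rounding each weight to a signed power of two so that a multiply becomes a bit shift, can be sketched roughly as follows. The function name, the exponent range, and the log-domain rounding rule are assumptions for illustration, not the paper's scheme:

```python
import numpy as np

def quantize_pow2(w, min_exp=-8, max_exp=0):
    """Round each weight to the nearest signed power of two (in log space).

    A power-of-two weight lets hardware replace w * x with a bit shift
    of x plus a sign flip; this sketches the restriction, not the
    paper's training procedure.
    """
    sign = np.sign(w)
    mag = np.abs(w)
    exp = np.clip(np.round(np.log2(np.maximum(mag, 2.0 ** min_exp))),
                  min_exp, max_exp)
    return sign * (2.0 ** exp)

w = np.array([0.3, -0.7, 0.055, 1.0])
print(quantize_pow2(w))  # [ 0.25  -0.5    0.0625  1.    ]
```

A "sums of powers of two" variant would keep two or more such terms per weight, trading one extra shift-and-add for better precision.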
Internal Representations of Multi-Layered Perceptrons
Cited by 1 (0 self)
"... Feedforward neural networks make incomprehensible decisions resulting from mappings learned from training examples defined in high-dimensional feature spaces. What kind of internal representations are developed by multi-layered perceptrons (MLPs), and how do they change during training? Scatterograms ..."
Performance of Multi-Layer Perceptron with Neurogenesis
"... Neurogenesis is the process by which new neurons are generated in the human brain. The new neurons create new networks. It is known that neurogenesis improves memory, learning, and thinking ability by combining new neurons with the biological neural network. We consider that neurogenesis can be applied to an artificial neural network. In this study, we propose the Multi-Layer Perceptron (MLP) with neurogenesis and apply it to pattern recognition. In the MLP with neurogenesis, some neurons are generated in a hidden layer. We propose random, periodic and chaotic timing methods ..."
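The core mechanism the abstract names, generating new hidden neurons during training, can be sketched as follows. The class name, initialization scale, and growth probability are illustrative assumptions; only the random-timing variant is shown, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

class GrowingMLP:
    """One-hidden-layer MLP whose hidden layer can grow ("neurogenesis").

    Sketch only: new neurons get small random weights; a real scheme
    would interleave growth with weight training.
    """

    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))

    def add_neuron(self):
        # "Generate" one new hidden neuron: a new row of input weights
        # and a new column of output weights.
        self.W1 = np.vstack([self.W1,
                             rng.normal(scale=0.1, size=(1, self.W1.shape[1]))])
        self.W2 = np.hstack([self.W2,
                             rng.normal(scale=0.1, size=(self.W2.shape[0], 1))])

    def forward(self, x):
        h = np.tanh(self.W1 @ x)
        return self.W2 @ h

net = GrowingMLP(n_in=4, n_hidden=3, n_out=2)
# Random-timing neurogenesis: at each step, grow with probability 0.5.
for _ in range(10):
    if rng.random() < 0.5:
        net.add_neuron()
print(net.W1.shape[0])  # hidden-layer size after growth
```

Periodic timing would call `add_neuron` every fixed number of steps, and chaotic timing would drive the decision from a chaotic map instead of a uniform draw.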
Investigation of Characteristics of Multi-Layer Perceptron
"... It is said that there are about 10 billion neurons in the human brain. A network is formed by connecting more than one neuron. However, until several years ago neurons were considered only to be lost with age; it was thought impossible to generate new neurons in the adult brain. This process is called ... an artificial network model which applied neurogenesis to the Recurrent Neural Network (RNN) [4] and the Multi-Layer Perceptron (MLP) [5]. In this study, we investigate in more detail the influences of neurogenesis. We apply the behavior of neurogenesis to the Multi-Layer Perceptron (MLP), which is one of the feed ..."
The design and complexity of exact multi-layered perceptrons
- International Journal of Neural Systems, 1991
Cited by 10 (2 self)
"... We investigate the network complexity of multi-layered perceptrons for solving exactly a given problem. We limit our study to the class of combinatorial optimization problems. It is shown how these problems can be reformulated as binary classification problems and how they can be solved by multi-layered ..."
An Extension of Multi-Layer Perceptron Based on Layer-Topology
Cited by 2 (2 self)
"... There are a lot of extensions made to the classic model of the multi-layer perceptron (MLP). A notable number of them have been designed to hasten the learning process without considering the quality of generalization. The paper proposes a new MLP extension based on exploiting the topology of the in ..."
A Greedy Learning Approach for Multi-Layer Perceptrons
"... Neural networks as a general mechanism for learning and adaptation became increasingly popular in recent years, mainly due to the development of the backpropagation learning procedure, which made it possible to train Multi-Layer Perceptrons. Unfortunately, backpropagation is well known for its particularly l ..."