
CiteSeerX
Results 1 - 10 of 1,666

The complexity of multi-layered perceptrons

by P. J. Zwietering , 1994
Abstract - Cited by 5 (0 self)
Abstract not found

Face Recognition with Multi-Layer Perceptrons

by Erik Hjelmås, Jørn Wroldsen
Abstract
We present a face recognition system (still in development) with principal component analysis for feature extraction and multi-layered perceptrons for classification. For single multi-layered perceptrons, the average performance of our best classifier is 89.1% on a test set of 200 face images.
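The PCA-then-MLP pipeline this abstract describes can be sketched as follows. This is a minimal illustration only; the function names, array shapes, feature count, and the use of SVD are my assumptions, not details taken from the paper:

```python
import numpy as np

def pca_fit(X, k):
    """Fit PCA on X (n_samples x n_features): return the mean and the
    top-k principal directions, computed via SVD of the centered data."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def pca_transform(X, mu, components):
    """Project X onto the k principal directions; these projections are
    the features a classifier such as an MLP would be trained on."""
    return (X - mu) @ components.T

# Hypothetical usage: 200 face images flattened to 1024-dim vectors,
# reduced to 32 PCA features before classification.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1024))
mu, comps = pca_fit(X, 32)
Z = pca_transform(X, mu, comps)
print(Z.shape)  # (200, 32)
```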

Multi-layer Perceptrons with Discrete Weights

by M. Marchesi, G. Orlandi, F. Piazza, L. Pollonara, A. Uncini - Proc. of IJCNN International Joint Conference on Neural Networks, San Diego (CA, USA), pp. II-623 to II-630, 1990
Abstract - Cited by 3 (1 self)
Digital VLSI neural networks are currently very promising for implementing high-performance versions of such networks. In such digital circuits the multiply phase of neuron activity is crucial for yielding high performance. This paper studies the feasibility of restricting the weight values in multi-layer perceptrons to powers-of-two or sums of powers-of-two. Multipliers could thus be substituted by shifters and adders on digital hardware, saving both time and chip area, under the assumption that the neuron activation function is computed through a look-up table (LUT), and that a LUT may ...
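The weight restriction described above (each weight a signed power of two, so that a hardware multiply reduces to a bit shift) can be illustrated with a small quantization helper. This is my own sketch under assumed exponent bounds; the paper's actual training scheme is not reproduced here:

```python
import math

def quantize_pow2(w, min_exp=-8, max_exp=0):
    """Round a weight to the nearest signed power of two in
    [2**min_exp, 2**max_exp]. Multiplying an activation by such a
    weight reduces to a bit shift in digital hardware."""
    if w == 0.0:
        return 0.0
    sign = 1.0 if w > 0 else -1.0
    exp = round(math.log2(abs(w)))       # nearest exponent
    exp = max(min_exp, min(max_exp, exp))  # clamp to hardware range
    return sign * 2.0 ** exp

print(quantize_pow2(0.3))   # 0.25 (= 2**-2)
print(quantize_pow2(-0.9))  # -1.0 (= -2**0)
```

Sums of powers-of-two, also mentioned in the abstract, would extend this to two or more shift-and-add terms per weight at the cost of extra adders.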

Internal Representations of Multi-Layered Perceptrons

by Włodzisław Duch
Abstract - Cited by 1 (0 self)
Feedforward neural networks make incomprehensible decisions resulting from mappings learned from training examples defined in high-dimensional feature spaces. What kind of internal representations are developed by multi-layered perceptrons (MLPs), and how do they change during training? Scatterograms ...

Performance of Multi-Layer Perceptron with Neurogenesis

by Yuta Yokoyama, Chihiro Ikuta, Yoko Uwate, Yoshifumi Nishio
Abstract
Neurogenesis is the process by which new neurons are generated in the human brain. The new neurons create new networks. It is known that neurogenesis improves memory, learning, and thinking ability by combining new neurons with the biological neural network. We consider that neurogenesis can be applied to an artificial neural network. In this study, we propose the Multi-Layer Perceptron (MLP) with neurogenesis and apply it to pattern recognition. In the MLP with neurogenesis, some neurons are generated in a hidden layer. We propose random, periodic and chaotic timing methods ...
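Generating a neuron in a hidden layer mid-training, as this abstract describes, amounts to growing the layer's weight matrices. The sketch below shows one way to do that; the matrix layout and near-zero initialization scale are my assumptions, not the authors' method:

```python
import numpy as np

def add_hidden_neuron(W1, b1, W2, rng, scale=0.01):
    """Grow a one-hidden-layer MLP by one hidden neuron.

    W1: (hidden, inputs), b1: (hidden,), W2: (outputs, hidden).
    The new neuron's weights start near zero, so the network's
    function is almost unchanged at the moment of insertion."""
    W1 = np.vstack([W1, rng.normal(0.0, scale, size=(1, W1.shape[1]))])
    b1 = np.append(b1, 0.0)
    W2 = np.hstack([W2, rng.normal(0.0, scale, size=(W2.shape[0], 1))])
    return W1, b1, W2

# Hypothetical usage: a 3-4-2 network grows to 3-5-2.
rng = np.random.default_rng(0)
W1, b1, W2 = rng.normal(size=(4, 3)), np.zeros(4), rng.normal(size=(2, 4))
W1, b1, W2 = add_hidden_neuron(W1, b1, W2, rng)
print(W1.shape, b1.shape, W2.shape)  # (5, 3) (5,) (2, 5)
```

The random, periodic, or chaotic timing methods the abstract mentions would decide *when* during training such an insertion happens.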

Investigation of Characteristics of Multi-Layer Perceptron

by Yuta Yokoyama, Yoko Uwate, Yoshifumi Nishio
Abstract
It is said that there are about 10 billion neurons in the human brain. The network is formed by connecting more than one neuron. However, until several years ago neurons were considered only to be lost with age; it was thought impossible to generate new neurons in the adult brain. This process is called ... an artificial network model which applied neurogenesis to the Recurrent Neural Network (RNN) [4] and the Multi-Layer Perceptron (MLP) [5]. In this study, we investigate in more detail the influences of neurogenesis. We apply the behavior of neurogenesis to the Multi-Layer Perceptron (MLP), which is one of the feed ...

The design and complexity of exact multi-layered perceptrons

by P. J. Zwietering, E. H. L. Aarts, J. Wessels - International Journal of Neural Systems, 1991
Abstract - Cited by 10 (2 self)
We investigate the network complexity of multi-layered perceptrons for solving exactly a given problem. We limit our study to the class of combinatorial optimization problems. It is shown how these problems can be reformulated as binary classification problems and how they can be solved by multi-layered ...

An Extension of Multi-Layer Perceptron Based on Layer-Topology

by Jānis Zuters
Abstract - Cited by 2 (2 self)
There are a lot of extensions made to the classic model of the multi-layer perceptron (MLP). A notable number of them have been designed to hasten the learning process without considering the quality of generalization. The paper proposes a new MLP extension based on exploiting topology of the in ...


A Greedy Learning Approach for Multi-Layer Perceptrons

by Achim G. Hoffmann
Abstract
Neural networks as a general mechanism for learning and adaptation have become increasingly popular in recent years, mainly due to the development of the backpropagation learning procedure, which made it possible to train Multi-Layer Perceptrons. Unfortunately, backpropagation is well known for its particularly l ...

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University