Results 1 – 8 of 8
Training Digital Circuits with Hamming Clustering
 IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS, 2000
Abstract

Cited by 19 (15 self)
A new algorithm, called Hamming Clustering (HC), for the solution of classification problems with binary inputs is proposed. It builds a logical network containing only AND, OR and NOT ports which, besides satisfying all the input-output pairs included in a given finite consistent training set, is able to reconstruct the underlying Boolean function. The basic
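The abstract sketches the core idea: cover the positive training patterns with AND terms (cubes) that exclude every negative pattern, then OR the terms together. The following Python sketch illustrates that cube-expansion idea; it is not the authors' actual HC procedure (which clusters patterns by Hamming distance), and all function names are illustrative:

```python
def covers(cube, x):
    """A cube is a tuple with None meaning 'don't care'; it covers x
    when every fixed literal matches."""
    return all(c is None or c == b for c, b in zip(cube, x))

def consistent(cube, neg):
    """True if the cube covers no negative pattern."""
    return not any(covers(cube, x) for x in neg)

def hamming_cluster(pos, neg):
    """Greedily expand each positive pattern into a cube (AND term)
    by turning bits into don't-cares while the cube stays consistent
    with the negative examples; the cubes are then ORed together."""
    cubes = []
    for x in pos:
        cube = list(x)
        for i in range(len(cube)):
            trial = cube.copy()
            trial[i] = None
            if consistent(tuple(trial), neg):
                cube = trial
        cube = tuple(cube)
        if cube not in cubes:
            cubes.append(cube)
    return cubes

def evaluate(cubes, x):
    """OR of AND terms: the reconstructed Boolean function."""
    return any(covers(c, x) for c in cubes)
```

On the XOR truth table, for instance, neither bit of a positive pattern can be generalized without covering a negative one, so the two minterms are kept as-is and the network reproduces the function exactly.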
Learning highly non-separable Boolean functions using Constructive Feedforward Neural Network
Abstract

Cited by 11 (8 self)
Abstract. Learning problems with inherently non-separable Boolean logic are still a challenge that has not been addressed by neural or kernel classifiers. The k-separability concept, introduced recently, allows for characterization of the complexity of non-separable learning problems. A simple constructive feedforward network that uses a modified form of the error function and window-like functions to localize outputs after projection on a line has been tested on such problems with quite good results. The computational cost of training is low because most nodes and connections are fixed and only the weights of one node are modified at each training step. Several examples of learning Boolean functions and results of classification tests on real-world multi-class datasets are presented.
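The k-separability idea can be illustrated on parity: projecting all n-bit vectors on w = (1, ..., 1) splits them into n + 1 pure clusters whose labels alternate, so a window-like node on the projected line isolates each cluster. A small sketch of this, with illustrative function names not taken from the paper:

```python
from itertools import product

def projection_clusters(n):
    """Project all n-bit vectors on w = (1, ..., 1) and record which
    parity labels land at each projected value.  Parity is
    (n + 1)-separable: each projected value hosts exactly one class."""
    clusters = {}
    for x in product((0, 1), repeat=n):
        label = sum(x) % 2                  # parity of the input
        clusters.setdefault(sum(x), set()).add(label)
    return clusters

def window(a, b, w, x):
    """Window-like output node: fires iff the projection w.x
    falls inside the interval [a, b]."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return int(a <= s <= b)
```

Each interval on the line then needs only one window node, which is why most weights can stay fixed during training.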
Projection Pursuit Constructive Neural Networks Based on Quality of Projected Clusters
 Lecture Notes in Computer Science 5164 (2008) 754–762
Abstract

Cited by 10 (4 self)
Abstract. A linear projection pursuit index measuring the quality of projected clusters (QPC) is used to discover non-local clusters in high-dimensional multi-class data, and for dimensionality reduction, feature selection, visualization of data and classification. Constructive neural networks that optimize the QPC index are able to discover the simplest models of complex data, solving problems that standard networks based on error minimization are not able to handle. Tests on problems with complex Boolean logic and a few real-world datasets show the high efficiency of this approach.
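A toy version of a QPC-style index conveys the idea: reward pairs from the same class that project close together, and penalize pairs from different classes that do. This is a simplified sketch under that assumption, not the exact index from the paper:

```python
import math

def qpc_index(w, X, y, sigma=1.0):
    """Simplified QPC-style projection index (a sketch, not the
    authors' exact formula).  G is a Gaussian window on the distance
    between projected points; same-class proximity adds to the score,
    different-class proximity subtracts from it."""
    G = lambda d: math.exp(-(d / sigma) ** 2)
    proj = [sum(wi * xi for wi, xi in zip(w, x)) for x in X]
    n, q = len(X), 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            g = G(proj[i] - proj[j])
            q += g if y[i] == y[j] else -g
    return q / (n * (n - 1))
```

On XOR-labeled data the diagonal direction w = (1, 1), which yields pure projected clusters, scores higher than the axis direction w = (1, 0), which mixes the classes.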
A Comparison of Methods for Learning of Highly Non-Separable Problems
Abstract

Cited by 2 (2 self)
Abstract. Learning in cases that are almost linearly separable is easy, but for highly non-separable problems all standard machine learning methods fail. Many strategies to build adaptive systems are based on the “divide-and-conquer” principle. Constructive neural network architectures with novel training methods make it possible to overcome some drawbacks of standard backpropagation MLP networks. They are able to handle complex multidimensional problems in reasonable time, creating models with a small number of neurons. In this paper a comparison of our new constructive c3sep algorithm, based on the k-separability idea, with several sequential constructive learning methods is reported. Tests have been performed on the parity function, the three artificial Monk's problems, and a few benchmark problems. Simple and accurate solutions have been discovered by the c3sep algorithm even in highly non-separable cases.
Constructive Neural Network Algorithms that Solve Highly Non-Separable Problems
Abstract

Cited by 1 (1 self)
Abstract. Learning from data with complex non-local relations and multimodal class distributions is still very hard for widely used classification algorithms. Even if an accurate solution is found, the resulting model may be too complex for the given data and will not generalize well. New types of learning algorithms are needed to extend the capabilities of standard machine learning systems. Projection pursuit methods can avoid the “curse of dimensionality” by discovering interesting structures in low-dimensional subspaces. This paper introduces constructive neural architectures based on projection pursuit techniques that are able to discover the simplest models of data with inherently complex logical structure.
A Unified Approach to Sequential Constructive Methods
 WIRN’98 – THE 10TH ITALIAN WORKSHOP ON NEURAL NETS, 1998
Abstract

Cited by 1 (1 self)
A general treatment of a particular class of learning techniques for neural networks, called sequential constructive methods, is proposed. These methods subsequently add units to the hidden layer until all the input-output relations contained in a given training set are satisfied. Every addition
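The sequential constructive loop described above can be sketched in a few lines: keep adding hidden units until every training example is satisfied. In this sketch each unit covers a Hamming ball around an uncovered positive example, with a radius just small enough to exclude all negatives; this is a simplified stand-in for the various unit-construction rules such methods use, assuming binary inputs and a consistent, non-trivial training set:

```python
def make_unit(prototype, radius):
    """Hidden unit: fires for inputs strictly within `radius` of its
    prototype in Hamming distance (binary vectors assumed)."""
    def unit(x):
        return sum(a != b for a, b in zip(prototype, x)) < radius
    return unit

def sequential_construct(pos, neg):
    """Sequential constructive loop (a sketch): add hidden units until
    every positive example is covered.  Each unit is built around an
    uncovered positive example; its radius is the smallest Hamming
    distance to a negative example, so no negative is ever covered."""
    units, uncovered = [], list(pos)
    while uncovered:
        p = uncovered[0]
        radius = min(sum(a != b for a, b in zip(p, x)) for x in neg)
        unit = make_unit(p, radius)
        units.append(unit)
        uncovered = [x for x in uncovered if not unit(x)]
    return units

def net(units, x):
    """Network output: OR of the hidden units."""
    return any(u(x) for u in units)
```

Because each new unit covers at least its own prototype, the loop is guaranteed to terminate on a consistent training set.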
A Constructive Technique Based on Linear Programming for Training Switching Neural Networks
Abstract
Abstract. A general constructive approach for training neural networks in classification problems is presented. This approach is used to construct a particular connectionist model, named Switching Neural Network (SNN), based on the conversion of the original problem into a Boolean lattice domain. The training of an SNN can be performed through a constructive algorithm, called Switch Programming (SP), based on the solution of a proper linear programming problem. Simulation results obtained on the StatLog benchmark show the good quality of SNNs trained with SP.
Fast Projection Pursuit Based on Quality of Projected Clusters
Abstract
Abstract. The projection pursuit index measuring the quality of projected clusters (QPC), introduced recently, optimizes projection directions by minimizing the leave-one-out error while searching for pure localized clusters. The QPC index has been used in constructive neural networks to discover non-local clusters in high-dimensional multi-class data, reduce dimensionality, aggregate features, and visualize and classify data. However, for n training instances such optimization requires O(n²) calculations. A fast approximate version of QPC introduced here obtains results of similar quality with O(n) effort, as illustrated in a number of classification and data visualization problems.
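One way to see the O(n²) → O(n) trade-off: an exact pairwise index compares every pair of projected points, while an approximation can compare each point only to a small set of summaries, for instance per-class projection means. The sketch below illustrates that summary-based idea; it is an assumption for illustration, not the paper's actual approximation scheme:

```python
import math

def fast_qpc(w, X, y, sigma=1.0):
    """O(n) approximation of a QPC-style index (a sketch): instead of
    all O(n^2) pairwise terms, each projected point is compared only
    to the per-class means of the projections."""
    G = lambda d: math.exp(-(d / sigma) ** 2)
    proj = [sum(wi * xi for wi, xi in zip(w, x)) for x in X]
    # one pass to accumulate per-class sums and counts
    acc = {}
    for p, c in zip(proj, y):
        s, n = acc.get(c, (0.0, 0))
        acc[c] = (s + p, n + 1)
    means = {c: s / n for c, (s, n) in acc.items()}
    # one pass to score each point against the class means
    q = 0.0
    for p, c in zip(proj, y):
        for c2, m in means.items():
            q += G(p - m) if c == c2 else -G(p - m)
    return q / len(X)
```

On data whose classes separate along the first coordinate, the direction w = (1, 0) scores clearly higher than w = (0, 1), at a cost linear in the number of instances.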