Citations
6206 | Fuzzy sets - Zadeh - 1965
Citation Context (citing [56]): ...ta, rule insertion, rule extraction, adaptation and reasoning. KBNNs have been developed either as a combination of symbolic AI systems and NN [3,25,49,50], or as a combination of fuzzy logic systems [56] and NN [18,22–32,40,55], or as other hybrid systems [24,40,55]. Rule insertion and rule extraction operations are typical operations for a KBNN to accommodate existing knowledge along with data, and ...
4698 | Self-organizing maps - Kohonen - 1997
Citation Context (citing [35,36]): ...ithms are not suitable for adaptive, on-line learning, that include: multi-layer perceptrons trained with the backpropagation algorithm, radial basis function networks [13], self-organising maps SOMs [35,36], and fuzzy neural networks [22,40]. These models usually operate on a fixed size connectionist structure, that limits its ability to accommodate new data; they may require both new data and the pre...
2195 | An Introduction to Genetic Algorithms - Mitchell - 1998
799 | The cascade-correlation learning architecture - Fahlman, Lebiere - 1990
Citation Context (citing [12]): ...e a data set of classification examples for Leukemia cancer disease, that consists of two classes and a large input space – the expression values of 6,817 genes monitored by Affymetrix arrays is used [12]. The initial number of examples is 72, but more examples are continuously being collected, so the classification system should be able to accommodate them and improve its performance. The two types o...
444 | Neural Darwinism: The Theory of Neuronal Group Selection - Edelman - 1989
Citation Context: ...the W2 connections) - as a geometrical center of the three nodes: W1(ragg) = (W1(r1) + W1(r2) + W1(r3))/3 (9); - as a weighted statistical center: W2(ragg) = (W2(r1)*Nex(r1) + W2(r2)*Nex(r2) + W2(r3)*Nex(r3))/Nsum (10), where Nex(ragg) = Nsum = Nex(r1) + Nex(r2) + Nex(r3); Ragg = d(W1(ragg), W1(rj)) + Rj <= Rmax, where rj is the rule node among the three nodes that has a maximum distance from the new node ragg and Rj is ...
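The node aggregation described in the context above can be sketched in a few lines; the tuple layout and the `aggregate_nodes` helper are illustrative assumptions, only the two centering formulas follow Eqs. (9) and (10):

```python
# A minimal sketch (assumed data layout) of aggregating three EFuNN rule
# nodes r1, r2, r3 into one node ragg. Each node carries a condition vector
# w1 (its W1 connections), an output vector w2 (its W2 connections), and an
# example count nex.

def aggregate_nodes(nodes):
    """nodes: list of three (w1, w2, nex) tuples; returns (w1_agg, w2_agg, n_sum)."""
    n_sum = sum(nex for _, _, nex in nodes)  # Nex(ragg) = Nsum
    dim1 = len(nodes[0][0])
    dim2 = len(nodes[0][1])
    # Eq. (9): geometrical center of the three condition vectors
    w1_agg = [sum(w1[i] for w1, _, _ in nodes) / len(nodes) for i in range(dim1)]
    # Eq. (10): statistical center of the output vectors, weighted by Nex
    w2_agg = [sum(w2[i] * nex for _, w2, nex in nodes) / n_sum for i in range(dim2)]
    return w1_agg, w2_agg, n_sum

r1 = ([0.2, 0.4], [1.0, 0.0], 2)
r2 = ([0.4, 0.6], [0.0, 1.0], 1)
r3 = ([0.3, 0.5], [0.5, 0.5], 1)
w1_agg, w2_agg, n_sum = aggregate_nodes([r1, r2, r3])
```

The aggregated node would then replace the three originals only if the resulting radius Ragg stays within Rmax, as the quoted condition requires.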
399 | A growing neural gas network learns topologies - Fritzke - 1995
394 | A course in fuzzy systems and control - Wang - 1996 |
328 | Molecular classification of cancer: class discovery and class prediction by gene expression monitoring - Golub, Slonim, et al. - 1999
310 | A survey and critique of techniques for extracting rules from trained artificial neural networks - Andrews, Diederich, et al. - 1995
Citation Context: ... (b) only the rules extracted from the GDP model are shown. Similar rules for the CPI, Interest rates, and the Unemployment rate, are extracted from the corresponding EFuNN models. Inp var [1] [2] [3] [4] [5] [6] [7] [8] Cluster Output Numb; CPI(t–1) Inter(t–1) Unem(t–1) GDP(t–1) CPI(t) Inter(t) Unem(t) GDP(t) radius GDP(t+1) examp.; Rule# 1 (1 0.7) (2 0.8) (2 0.7) (2 0.8) (1 0.7) (2 0.8) (2 0.7) (2...
239 | Intelligent agents: theory and practice, Knowledge Engineering Review - Wooldridge, Jennings - 1995
Citation Context (citing [54]): ...ures. This is expected to result in more precise models. Further applications include: adaptive speech and language processing [30]; more applications in Bioinformatics; intelligent agents on the WWW [54]; financial and economic analysis and prediction; adaptive mobile robot control; adaptive process control; adaptive expert systems; adaptive artificial life systems. Acknowledgements: This research is ...
234 | Pruning algorithms: a survey - Reed - 1993
204 | Second order derivatives for network pruning: optimal brain surgeon - Hassibi, Stork - 1992
196 | The neural basis of cognitive development: a constructivist manifesto - Quartz, Sejnowski - 1997 |
190 | Pattern Recognition by Self-Organizing Neural Networks - Carpenter, Grossberg - 1991
Citation Context: ...e rules extracted from the GDP model are shown. Similar rules for the CPI, Interest rates, and the Unemployment rate, are extracted from the corresponding EFuNN models. Inp var [1] [2] [3] [4] [5] [6] [7] [8] Cluster Output Numb; CPI(t–1) Inter(t–1) Unem(t–1) GDP(t–1) CPI(t) Inter(t) Unem(t) GDP(t) radius GDP(t+1) examp.; Rule# 1 (1 0.7) (2 0.8) (2 0.7) (2 0.8) (1 0.7) (2 0.8) (2 0.7) (2 0.9) 0.15 (...
187 | Learning and tuning fuzzy logic controllers through reinforcements - Berenji, Khedkar - 1992
Citation Context: ...the system's operation, e.g., the system creates "on the fly" new inputs, new outputs, new modules and connections; (4) memorize data exemplars for a further refinement, or for information retrieval; (5) learn and improve through active interaction with other IS and with the environment in a multi-modular, hierarchical fashion; (6) adequately represent space and time in their different scales; have p...
161 | Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering - Kasabov - 1996
Citation Context (citing [24,40,55]): ...ning. KBNNs have been developed either as a combination of symbolic AI systems and NN [3,25,49,50], or as a combination of fuzzy logic systems [56] and NN [18,22–32,40,55], or as other hybrid systems [24,40,55]. Rule insertion and rule extraction operations are typical operations for a KBNN to accommodate existing knowledge along with data, and to produce an explanation on what the system has learned. Many ...
106 | ANFIS: Adaptive-network-based fuzzy inference system - Jang - 1993
Citation Context (citing [22,40]): ..., on-line learning, that include: multi-layer perceptrons trained with the backpropagation algorithm, radial basis function networks [13], self-organising maps SOMs [35,36], and fuzzy neural networks [22,40]. These models usually operate on a fixed size connectionist structure, that limits its ability to accommodate new data; they may require both new data and the previously used ones in order to adjus...
69 | Feature Space Mapping as a universal adaptive system - Duch, Diercksen - 1995
Citation Context (citing [9,13,20]): ...e of the above seven issues have already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as t...
65 | Self-Organizing Maps (Second Edition) - Kohonen - 1997
Citation Context (citing [35,36]): ...ithms are not suitable for adaptive, on-line learning, that include: multi-layer perceptrons trained with the backpropagation algorithm, radial basis function networks [13], self-organising maps SOMs [35,36], and fuzzy neural networks [22,40]. These models usually operate on a fixed size connectionist structure, that limits its ability to accommodate new data; they may require both new data and the pre...
51 | Learning Fuzzy Rules and Approximate Reasoning in Fuzzy Neural Networks and Hybrid Systems - Kasabov - 1996 |
47 | FuNN - a fuzzy neural network architecture for adaptive learning and knowledge acquisition in multi-modular distributed environments - Kasabov, Kim, et al. - 1997
Citation Context (citing [3,5,17,32,34,40,49,50,55]): ...e already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as the system operates (usually in real time) and ...
47 | Neural Fuzzy Systems - Lin, Lee - 1996
Citation Context (citing [3,5,17,32,34,40,49,50,55]): ...e already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as the system operates (usually in real time) and ...
39 | On-line learning processes in artificial neural networks, in: Math. Foundations of Neural Networks - Heskes, Kappen - 1993
Citation Context (citing [9,13,20]): ...e of the above seven issues have already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as t...
36 | ECOS: A framework for evolving connectionist systems and the ECO learning paradigm - Kasabov - 1998
Citation Context (citing [28]): ...and Knowledge Processing with the Use of Hybrid Connectionist-based Techniques. Intelligent information systems (IS) for many real-world complex problems should meet some requirements as listed below [28]: (1) learn fast from a large amount of data, e.g. through one-pass training; (2) adapt in an on-line mode where new data is incrementally accommodated; (3) have an 'open' structure where new features...
35 | Growing and pruning neural tree networks - Sankar, Mammone - 1993 |
31 | Structural learning with forgetting - Ishikawa - 1996 |
29 | GAL: Networks that grow when they learn and shrink when they forget - Alpaydin - 1991
Citation Context (citing [1,9,13,20,34,55]): ...-line learning is concerned with learning data as the system operates (usually in real time) and data might exist only for a short time. NN models for on-line learning are introduced and studied in [1,9,13,20,34,55]. Several investigations proved that the most popular neural network models and algorithms are not suitable for adaptive, on-line learning, that include: multi-layer perceptrons trained with the backp...
27 | Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multidimensional maps - Carpenter, Markuzon, et al. - 1991
Citation Context: ...les extracted from the GDP model are shown. Similar rules for the CPI, Interest rates, and the Unemployment rate, are extracted from the corresponding EFuNN models. Inp var [1] [2] [3] [4] [5] [6] [7] [8] Cluster Output Numb; CPI(t–1) Inter(t–1) Unem(t–1) GDP(t–1) CPI(t) Inter(t) Unem(t) GDP(t) radius GDP(t+1) examp.; Rule# 1 (1 0.7) (2 0.8) (2 0.7) (2 0.8) (1 0.7) (2 0.8) (2 0.7) (2 0...
25 | Rule-based neural networks for classification and probability estimation - Goodman, Higgins, et al. - 1992
Citation Context (citing [3,5,17,32,34,40,49,50,55]): ...e already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as the system operates (usually in real time) and ...
25 | Connectionist Models of Commonsense Reasoning, in: Neural Networks for Knowledge Representation and Inference - Sun - 1994
Citation Context (citing [3,5,17,32,34,40,49,50,55]): ...e already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as the system operates (usually in real time) and ...
24 | The ECOS Framework and the ECO Learning Method for Evolving Connectionist Systems - Kasabov - 1998
Citation Context (citing [29]): ...n on-line mode, with a goal of having their optimal values at each time of the functioning of the system according to a given set of criteria, is a challenging task. Here three approaches have been used [29]: (a) a statistically-based approach: statistical parameters are allocated to the rule nodes and their values are used for optimization purposes; (b) an evolutionary approach with the use of a genet...
21 | Online learning in radial basis function networks - Freeman, Saad - 1997
Citation Context (citing [9,13,20]): ...e of the above seven issues have already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as t...
19 | Adaptable connectionist production systems - Kasabov - 1996
Citation Context (citing [3,25,49,50]): ...ata and knowledge manipulation, including learning from data, rule insertion, rule extraction, adaptation and reasoning. KBNNs have been developed either as a combination of symbolic AI systems and NN [3,25,49,50], or as a combination of fuzzy logic systems [56] and NN [18,22–32,40,55], or as other hybrid systems [24,40,55]. Rule insertion and rule extraction operations are typical operations for a KBNN to acc...
19 | A simple weight decay can improve generalisation - Krogh, Hertz - 1992 |
18 | A decision making model using a fuzzy neural network, in: - Hashiyama, Furuhashi, et al. - 1992
Citation Context (citing [18,24,32]): ...0] (e.g., IF x1 is A AND x2 is B THEN y is a.x1 + b.x2 + c, where A, B and C are fuzzy values and a, b and c are constants); (5) Fuzzy rules of type (3) with degrees of importance and certainty degrees [18,24,32] (e.g., IF x1 is A (DI1) AND x2 is B (DI2) THEN y is C (CFc), where DI1 and DI2 represent the importance of each of the condition elements for the rule output, and the CFc represents the strength of t...
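The weighted rule form quoted in this context can be illustrated with a small sketch; the weighting of membership degrees by DI1/DI2 and the min-based AND are assumptions chosen for illustration, not the paper's exact inference procedure:

```python
# A small sketch of firing a type-(5) rule
# "IF x1 is A (DI1) AND x2 is B (DI2) THEN y is C (CFc)".
# The importance weighting and min-based conjunction below are assumptions.

def fire_rule(mu_a_x1, mu_b_x2, di1, di2, cf_c):
    # importance-weighted matching degree of each condition element
    m1 = di1 * mu_a_x1
    m2 = di2 * mu_b_x2
    # AND as min, then scale by the rule's certainty factor CFc
    return cf_c * min(m1, m2)

# x1 matches A to degree 0.8, x2 matches B to degree 0.6
degree_c = fire_rule(mu_a_x1=0.8, mu_b_x2=0.6, di1=1.0, di2=0.5, cf_c=0.9)
```

With this reading, a condition of low importance (small DI) cannot dominate the rule's output, and CFc bounds how strongly the rule can ever fire.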
16 | Genetic algorithms for the design of fuzzy neural networks - Watts, Kasabov - 1998 |
13 | Circuits of Production Rule GenNets – the genetic programming of nervous systems - DeGaris - 1993
Citation Context: ...extracted from the GDP model are shown. Similar rules for the CPI, Interest rates, and the Unemployment rate, are extracted from the corresponding EFuNN models. Inp var [1] [2] [3] [4] [5] [6] [7] [8] Cluster Output Numb; CPI(t–1) Inter(t–1) Unem(t–1) GDP(t–1) CPI(t) Inter(t) Unem(t) GDP(t) radius GDP(t+1) examp.; Rule# 1 (1 0.7) (2 0.8) (2 0.7) (2 0.8) (1 0.7) (2 0.8) (2 0.7) (2 0.9) 0.15 (2 0.9) 1...
11 | Vector quantization with growing and splitting elastic net, in: ICANN'93 - Fritzke - 1993
11 | Investigating the adaptation and forgetting in fuzzy neural networks by using the method of training and zeroing - Kasabov - 1996 |
9 | Brain-like Computing and Intelligent Information Systems - Amari, Kasabov (eds)
Citation Context: ...telligent information systems (IS) for many real-world complex problems should meet some requirements as listed below [28]: (1) learn fast from a large amount of data, e.g. through one-pass training; (2) adapt in an on-line mode where new data is incrementally accommodated; (3) have an 'open' structure where new features (relevant to the task) can be introduced at any stage of the system's operation,...
8 | Evolving fuzzy neural networks—algorithms, applications, and biological motivation - Kasabov - 1998 |
5 | Adaptive learning system and method - Kasabov - 2000
Citation Context (citing [30]): ... is in the output cluster defined by its center (y is C, to a membership degree of MDc) and by its radius Rj-out, with Nex(j) examples represented by this rule] (see [5,9,34,36]). (7) Temporal rules [30] (e.g., IF x1 is present at a time moment t1 (with a certainty degree and/or importance factor of DI1) AND x2 is present at a time moment t2 (with a certainty degree/importance factor DI2) THEN y is C...
5 | Pruning via Dynamic Adaptation of the Forgetting Rate in Structural Learning - Miller, Zurada, et al. - 1996 |
4 | Calcium regulation of the neuronal growth cone, Trends in Neurosciences - Kater, Mattson, et al.
3 | An adaptive classification scheme to approximate decision boundaries using local Bayes criteria - Encarnacao, Gross - 1992
Citation Context: ...sis of the weights W3 of an evolved EFuNN, temporal correlation between time consecutive exemplars can be expressed in terms of rules and conditional probabilities, e.g.: IF r1(t–1) THEN r2(t) (0.3) (11). The meaning of the above rule is that some examples that belong to the rule (prototype) r2 follow in time examples from the rule prototype r1 with a relative conditional probability of 0.3. 3.4. EFuN...
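Reading temporal rules such as (11) off the W3 weights can be sketched as follows; the row normalization and the `temporal_rules` helper are assumptions about how the relative conditional probabilities are derived from W3, not the paper's stated algorithm:

```python
# Sketch: assume W3[i][j] accumulates how often an example of rule node r(j+1)
# follows, at the next time step, an example of rule node r(i+1). Normalizing
# each row gives relative conditional probabilities, from which rules like
# "IF r1(t-1) THEN r2(t) (0.3)" can be read off.

def temporal_rules(W3, threshold=0.2):
    rules = []
    for i, row in enumerate(W3):
        total = sum(row)
        if total == 0:
            continue  # node never observed as an antecedent
        for j, w in enumerate(row):
            p = w / total
            if p >= threshold:  # keep only sufficiently probable transitions
                rules.append((f"r{i+1}", f"r{j+1}", round(p, 2)))
    return rules  # list of (antecedent, consequent, probability)

W3 = [[1, 3, 6],   # transitions observed after r1
      [0, 2, 2]]   # transitions observed after r2
rules = temporal_rules(W3)
```

The threshold simply prunes weak transitions so that only interpretable rules survive, mirroring how rule extraction elsewhere in the paper keeps dominant connections.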
3 | Rules of chaotic behaviour extracted from the fuzzy neural network FuNN - Kozma, Kasabov - 1998 |
3 | A technique for trimming the fat from a network via relevance assessment, in: D. Touretzky (ed.) - Mozer, Smolensky - 1989
3 | Application of Genetic Algorithms to the Construction of Topologies for Multilayer Perceptrons - unknown authors - 1993 |
2 | Spatio-temporal evolving fuzzy neural networks and their applications for on-line, adaptive phoneme recognition - Kasabov, Watts |
2 | HyFIS: adaptive hybrid connectionist fuzzy inference systems - Kim, Kasabov
Citation Context (citing [3,5,17,32,34,40,49,50,55]): ...e already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as the system operates (usually in real time) and ...
2 | Refinement of approximate domain theories by knowledge-based neural networks - Towell, Shavlik, et al. - 1990
Citation Context (citing [3,5,17,32,34,40,49,50,55]): ...e already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as the system operates (usually in real time) and ...
1 | A new Effective Algorithm for Neo Fuzzy Neuron Model - Yamakawa, Kusanagi, Uchino, Miki - 1993
Citation Context (citing [3,5,17,32,34,40,49,50,55]): ...e already been addressed in different connectionist and fuzzy connectionist systems. Such systems can successfully perform incremental learning [7–15], on-line learning [9,13,20]; can deal with rules [3,5,17,32,34,40,49,50,55]. The latter class of neural networks (NN) are also called knowledge-based neural networks (KBNN). On-line learning is concerned with learning data as the system operates (usually in real time) and ...