
## Experiments with repeating weighted boosting search for optimization in signal processing applications (2005)

Venue: | IEEE Trans. Syst. Man Cybern. B, Cybern |

Citations: | 23 (17 self) |

### Citations

13233 | Statistical Learning Theory - Vapnik - 1998 |

Citation Context: ...B, which is depicted in Fig. 12. C. Novel Kernel Classifier Design Approach The state-of-the-art kernel modeling techniques, such as the support vector machine (SVM) and relevant vector machine (RVM) [38]–[41], have widely been adopted in classification applications. Typically, a kernel classification technique considers every training input pattern as a candidate kernel center and uses a single fixed... |

10047 | Genetic Algorithms - Goldberg - 1989 |

4844 | Pattern classification and scene analysis - Duda, Hart - 1973 |

Citation Context: ...ion, with (21) and (25) (26) respectively, where for and for . The Fisher ratio, which is defined as the ratio of the interclass difference and the intraclass spread, in the direction of is given by [45] (27) At the th stage of incremental modeling, the th kernel term is constructed by maximizing the Fisher ratio (27) with respect to the kernel mean vector and the diagonal covariance matrix. The alg... |
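
The Fisher ratio named in this context, the squared interclass mean difference over the summed intraclass variances along a projection direction, can be sketched as follows (a minimal illustration; the function and variable names are ours, not the paper's):

```python
def fisher_ratio(x1, x2):
    """Fisher ratio of two classes of scalar projections:
    (m1 - m2)**2 / (v1 + v2), i.e. the interclass difference
    over the intraclass spread along the chosen direction."""
    m1 = sum(x1) / len(x1)
    m2 = sum(x2) / len(x2)
    v1 = sum((x - m1) ** 2 for x in x1) / len(x1)
    v2 = sum((x - m2) ** 2 for x in x2) / len(x2)
    return (m1 - m2) ** 2 / (v1 + v2)
```

In an incremental (orthogonal forward selection) setting, each candidate kernel's mean vector and covariance would be chosen to maximize this ratio over the projections of the training set.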

3886 | Adaptation in Natural and Artificial Systems - Holland - 1975 |

Citation Context: ... optimization techniques have been developed, see for example [1]-[14]. Within the wide field of engineering, two well-known classes of such global optimization methods are the genetic algorithm (GA) [7]-[10] and adaptive simulated annealing (ASA) [11]-[14]. Both the GA and ASA have attracted considerable attention in signal processing applications, see for example [15]-[23]. The GA and ASA belong to... |

3498 | A Decision-theoretic Generalization of On-line Learning and an Application to Boosting - Freund, Schapire - 1995 |

1548 | Handbook of genetic algorithms. - Davis - 1991 |

1475 | Pattern Recognition and Neural Networks - Ripley - 1996 |

Citation Context: ...uare solution for the corresponding sparse classifier weight vector is readily available, given the least square solution of . The synthetic two-class problem and Diabetes in Pima Indians, taken from [46], were used to investigate this proposed kernel classifier design approach, which is referred to as the OFS-RWBS, and to compare the results with those obtained using the existing state-of-the-art met... |

965 | Sparse Bayesian learning and the relevance vector machine - Tipping - 2001 |

Citation Context: ...ich is depicted in Fig. 12. C. Novel Kernel Classifier Design Approach The state-of-the-art kernel modeling techniques, such as the support vector machine (SVM) and relevant vector machine (RVM) [38]–[41], have widely been adopted in classification applications. Typically, a kernel classification technique considers every training input pattern as a candidate kernel center and uses a single fixed kern... |

871 | The Strength of Weak Learnability - Schapire - 1990 |

Citation Context: ...e population with it until the process converges. The weightings used in the convex combination are adapted to reflect the “goodness” of corresponding potential solutions using the idea from boosting [25]-[28]. The process is repeated a number of times or “generations” to improve the probability of finding a global optimal solution. An elitist strategy is adopted by retaining the best solution found i... |
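
The boosting-style weighting of candidate solutions described in this context can be illustrated with a toy update. This is our own minimal reading of the idea, not the actual RWBS update: costs are normalized to [0, 1] and each candidate's weight is shrunk by a factor `beta` raised to its normalized loss, so the convex combination is pulled toward the low-cost solutions; all names and the choice of `beta` are illustrative.

```python
def boosted_combination(points, costs, weights, beta=0.5):
    """Boosting-flavoured convex combination of candidate solutions.

    points  -- candidate solutions (each a list of floats)
    costs   -- cost of each candidate (lower is better)
    weights -- current positive weights of the candidates
    beta    -- shrink factor in (0, 1); worse candidates lose weight
    Returns (combined_point, updated_normalized_weights).
    """
    lo, hi = min(costs), max(costs)
    span = (hi - lo) or 1.0
    losses = [(c - lo) / span for c in costs]   # 0 = best, 1 = worst
    w = [wi * beta ** li for wi, li in zip(weights, losses)]
    total = sum(w)
    w = [wi / total for wi in w]                # keep the combination convex
    dim = len(points[0])
    combined = [sum(wi * p[d] for wi, p in zip(w, points))
                for d in range(dim)]
    return combined, w
```

Repeating this while replacing the worst member of the population with the combined point mimics the "converge, then restart a new generation with the elite solution retained" loop the context describes.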

685 | Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond - Scholkopf, Smola - 2001 |

496 | Simulated Annealing: Theory and Applications - Laarhoven, Aarts - 1987 |

Citation Context: ... a single solution in the parameter space with certain guiding principles that imitate the random behavior of molecules during the annealing process. Unlike the conventional simulated annealing [11], [24], the ASA adopts an important mechanism called the reannealing scheme, which not only speeds up the search process but also makes the optimization process robust to different problems. The motivation ... |
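
The reannealing scheme mentioned here builds on ASA's very fast annealing schedule, T(k) = T0 · exp(-c · k^(1/D)) for annealing index k in a D-dimensional parameter space; reannealing periodically rescales each parameter's index from its current (sensitivity-adjusted) temperature, so insensitive parameters regain a higher effective temperature. A sketch of the schedule and its inversion (parameter names are illustrative):

```python
import math

def asa_temperature(k, T0, c, D):
    """ASA-style annealing schedule in D dimensions:
    T(k) = T0 * exp(-c * k**(1/D))."""
    return T0 * math.exp(-c * k ** (1.0 / D))

def reanneal_index(T, T0, c, D):
    """Invert the schedule: recover the annealing index that yields
    temperature T. Reannealing resets a parameter's index this way
    after rescaling its temperature by a sensitivity ratio."""
    return (math.log(T0 / T) / c) ** D
```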

258 | Minimizing Multimodal Functions of Continuous Variables with the "Simulated Annealing" Algorithm - Corana, Marchesi, et al. - 1987 |

Citation Context: ... for example, [1]–[14]. Within the wide field of engineering, two well-known classes of such global optimization methods are the genetic algorithm (GA) [7]–[10] and adaptive simulated annealing (ASA) [11]–[14]. Both the GA and ASA have attracted considerable attention in signal processing applications; see, for example, [15]–[23]. The GA and ASA belong to a class of so-called guided random search meth... |

223 | Simulated Annealing: Practice Versus Theory, - Ingber - 1993 |

183 | Comparing support vector machines with Gaussian kernels to radial basis function classifiers,” - Scholkopf, Kah-Kay, et al. - 1997 |

169 | Prediction games and arcing algorithms - Breiman - 1997 |

158 | Eds., Handbook of Global Optimization. - Pardalos, Romeijn - 2002 |

138 | An introduction to boosting and leveraging - Meir, Ratsch - 2003 |

Citation Context: ...ulation with it until the process converges. The weightings used in the convex combination are adapted to reflect the “goodness” of corresponding potential solutions using the idea from boosting [25]–[28]. The process is repeated a number of times or “generations” to improve the probability of finding a global optimal solution. An elitist strategy is adopted by retaining the... |

122 | Genetic Algorithms and Very Fast Simulated Reannealing: A Comparison, - Ingber, Rosen - 1992 |

110 | Global Optimization in Action. - Pinter - 1996 |

98 | Adaptive Simulated Annealing (ASA): Lessons Learned - Ingber - 1996 |

Citation Context: ...s research communities have always been interested in the topic of global optimization, due to its importance, and a variety of global optimization techniques have been developed, see for example [1]-[14]. Within the wide field of engineering, two well-known classes of such global optimization methods are the genetic algorithm (GA) [7]-[10] and adaptive simulated annealing (ASA) [11]-[14]. Both the GA... |

94 | Genetic algorithms: Concepts and designs - Man, Tang, et al. - 1999 |

Citation Context: ...ization techniques have been developed; see, for example, [1]–[14]. Within the wide field of engineering, two well-known classes of such global optimization methods are the genetic algorithm (GA) [7]–[10] and adaptive simulated annealing (ASA) [11]–[14]. Both the GA and ASA have attracted considerable attention in signal processing applications; see, for example, [15]–[23]. The GA and ASA belong to a ... |

94 | Learning with kernels: Support vector machines, regularization, optimization, and beyond,” - Smola - 2002 |

82 | Adaptive IIR Filtering - Shynk - 1989 |

Citation Context: ...of the proposed RWBS algorithm. A. Infinite-duration Impulse Response (IIR) Filter Design The adaptive IIR filter is a classical research area, and many properties of IIR filters are well known [29], [30]. Because the cost function of IIR filters is generally multimodal with respect to the filter coefficients and the usual gradient-based algorithm can easily be stuck at local minima, global optimizati... |
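
The multimodal cost surface referred to here arises when the IIR model's output is compared with the noisy plant observation in system identification. A minimal sketch of that cost (our own illustration, using the direct-form recursion with a[0] = 1 assumed):

```python
def iir_filter(b, a, x):
    """Direct-form IIR filter with a[0] = 1:
    y(k) = sum_i b[i]*x(k-i) - sum_{j>=1} a[j]*y(k-j)."""
    y = []
    for k in range(len(x)):
        acc = sum(b[i] * x[k - i] for i in range(len(b)) if k - i >= 0)
        acc -= sum(a[j] * y[k - j] for j in range(1, len(a)) if k - j >= 0)
        y.append(acc)
    return y

def mse_cost(b, a, x, d):
    """Mean squared error between the noisy plant observation d(k) and
    the IIR model output. This surface is generally multimodal in the
    coefficients, which is why gradient descent can stall in a local
    minimum and a global search such as GA, ASA, or RWBS is used."""
    y = iir_filter(b, a, x)
    return sum((di - yi) ** 2 for di, yi in zip(d, y)) / len(d)
```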

63 | Joint data and channel estimation using blind trellis search algorithms - Seshadri - 1994 |

Citation Context: ...te, however, is too expensive to compute except for the simplest case. In practice, suboptimal solutions are sought for computational purposes. The algorithm based on a blind trellis search technique [36] is such an example. The joint minimization process (15) can also be performed using an iterative loop first over the data sequences and then over all the possible channels (16) The inner optimization... |
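
The iterative loop described in this context, inner optimization over data sequences and outer optimization over channels, can be sketched with a toy alternating scheme. Here an exhaustive search over short BPSK sequences stands in for the Viterbi algorithm in the inner loop, and a least-squares fit stands in for the outer GA/ASA/RWBS channel search; all names are illustrative and this is only feasible for tiny block lengths.

```python
import itertools

def model_error(r, s, h):
    """Sum of squared errors for r(k) ~ sum_i h[i] * s(k - i)."""
    err = 0.0
    for k in range(len(r)):
        pred = sum(h[i] * s[k - i] for i in range(len(h)) if k - i >= 0)
        err += (r[k] - pred) ** 2
    return err

def ls_channel(r, s, L):
    """Least-squares channel estimate of L taps given the data sequence
    (solves the L x L normal equations by Gaussian elimination)."""
    rows = [[s[k - i] if k - i >= 0 else 0.0 for i in range(L)]
            for k in range(len(r))]
    A = [[sum(row[i] * row[j] for row in rows) for j in range(L)]
         for i in range(L)]
    b = [sum(row[i] * rk for row, rk in zip(rows, r)) for i in range(L)]
    for col in range(L):                       # forward elimination, pivoted
        piv = max(range(col, L), key=lambda m: abs(A[m][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for m in range(col + 1, L):
            f = A[m][col] / A[col][col]
            A[m] = [am - f * ac for am, ac in zip(A[m], A[col])]
            b[m] -= f * b[col]
    h = [0.0] * L
    for i in reversed(range(L)):               # back substitution
        h[i] = (b[i] - sum(A[i][j] * h[j] for j in range(i + 1, L))) / A[i][i]
    return h

def joint_estimate(r, L, iters=3):
    """Alternate: best BPSK sequence given h (Viterbi stand-in),
    then least-squares channel given the sequence."""
    h = [1.0] + [0.0] * (L - 1)                # crude initial channel guess
    for _ in range(iters):
        s = min(itertools.product([-1.0, 1.0], repeat=len(r)),
                key=lambda seq: model_error(r, seq, h))
        h = ls_channel(r, list(s), L)
    return h, list(s)
```

Note the usual blind-estimation sign ambiguity: (-s, -h) produces the same received sequence as (s, h), so only the product is identifiable without a reference symbol.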


50 | Adaptive simulated annealing for optimization in signal processing applications - Chen, Luk - 1999 |

Citation Context: ...orithm. It is self-evident that the RWBS is extremely simple, requiring a minimum programming effort. The GA is anything but simple, in terms of programming efforts. The ASA, in the form presented in [21], is much easier to programme than the GA but still cannot compete with the simplicity of the RWBS. The difficulty with programming a GA can be circumvented by simply using some existing GA software p... |

41 | Stochastic techniques for global optimization: a survey of recent advances - Schoen - 1991 |

Citation Context: ...ous research communities have always been interested in the topic of global optimization, due to its importance, and a variety of global optimization techniques have been developed; see, for example, [1]–[14]. Within the wide field of engineering, two well-known classes of such global optimization methods are the genetic algorithm (GA) [7]–[10] and adaptive simulated annealing (ASA) [11]–[14]. Both t... |

33 | Deterministic Global Optimization: Theory, Methods and Applications (Nonconvex Optimization and Its Applications) - Floudas - 2000 |

31 | Terminal Repeller Unconstrained Subenergy Tunneling (TRUST) for Fast Global Optimization, - Cetin, Barhe, et al. - 1993 |

31 | Digital lattice and ladder filter synthesis - Gray, Markel - 1973 |

Citation Context: ...re that all the have magnitudes less than 1. Thus, the filter coefficient vector used in optimization is Converting the reflection coefficients back to the direct-form coefficients is straightforward [35]. Example 1: This example is taken from [30]. The system and filter transfer functions are, respectively (8) (9) Fig. 4. Convergence performance averaged over 100 experiments for IIR filter design Exa... |
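
The conversion from reflection coefficients back to direct-form coefficients mentioned here is the standard step-up (Levinson-style) recursion; as long as every reflection coefficient has magnitude below 1, the resulting denominator A(z) = 1 + a1·z^-1 + ... + ap·z^-p is stable, which is exactly why the optimization is parameterized this way. A sketch:

```python
def reflection_to_direct(k):
    """Step-up recursion: reflection coefficients k[0..p-1] (each with
    |k[m]| < 1 for stability) -> direct-form coefficients [a1, ..., ap]
    of A(z) = 1 + a1*z^-1 + ... + ap*z^-p.
    Order-m update: a_j^(m) = a_j^(m-1) + k_m * a_{m-j}^(m-1)."""
    a = []
    for m, km in enumerate(k, start=1):
        a = [a[i] + km * a[m - 2 - i] for i in range(m - 1)] + [km]
    return a
```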

26 | Deterministic global optimization: Theory, algorithms and applications. - Floudas - 2000 |

23 | Low implementation cost IIR digital filter design using genetic algorithms - Wilson, Macleod - 1993 |

21 | Genetic and learning automata algorithms for adaptive digital filters - Nambiar, Tang, et al. - 1992 |

Citation Context: ...lter for system identification, where x(k) is the system input, y(k) the filter output, and d(k) the noisy plant observation. have been applied to IIR filter design; see, for example, [15], [16], and [31]–[34]. Consider the use of IIR filter in system identification application, as depicted in Fig. 3, where the IIR filter with the model transfer function (6) is used to model the unknown plant with the... |

20 | The genetic search approach: A new learning algorithm for adaptive IIR filtering - Ng, Leung, et al. - 1996 |

Citation Context: ...ive IIR filter for system identification, where x(k) is the system input, y(k) the filter output, and d(k) the noisy plant observation. have been applied to IIR filter design; see, for example, [15], [16], and [31]–[34]. Consider the use of IIR filter in system identification application, as depicted in Fig. 3, where the IIR filter with the model transfer function (6) is used to model the unknown plan... |

18 | Maximum likelihood joint channel and data estimation using genetic algorithms - Chen, Wu - 1998 |

Citation Context: ...en over all the possible channels (16) The inner optimization can be carried out using the standard Viterbi algorithm (VA). The previous research has used the quantized channel algorithm [37], the GA [18], and the ASA [21] to perform the outer optimization. In this study, we apply the RWBS algorithm to perform the outer optimization. Specifically, given the channel estimate , let the data sequence dec... |

16 | Using genetic algorithms for album page layouts - Geigel, Loui - 2003 |

15 | Fast kernel classifier construction using orthogonal forward selection to minimise leave-one-out misclassification rate - Hong, Chen, et al. - 2006 |

Citation Context: ... and has to be learned via cross validation. This subsection reports an alternative kernel classifier design approach that incrementally constructs a sparse kernel classifier using the RWBS algorithm [42]. Unlike most... where the diagonal covariance matrix has the form of diag . Given the pairs of tra... |

15 | RBF neural network center selection based on Fisher ratio class separability measure - Mao - 2002 |

Citation Context: ...as (24) where the “new” weight vector , satisfying the triangular system . A sparse -term classifier model can be selected by incrementally maximizing a class separability measure in an OFS procedure [43], [44]. Define the two class sets , and let the numbers of points in be , respectively, with . The means and variances of training samples belonging to classes in the direction of basis are given by F... |

14 | Optimizing stability bounds of finite-precision PID controller structures,” - Chen, Wu, et al. - 2002 |

14 | A quantized channel approach to blind equalization - Zervas, Proakis, et al. - 1992 |

Citation Context: ...uences and then over all the possible channels (16) The inner optimization can be carried out using the standard Viterbi algorithm (VA). The previous research has used the quantized channel algorithm [37], the GA [18], and the ASA [21] to perform the outer optimization. In this study, we apply the RWBS algorithm to perform the outer optimization. Specifically, given the channel estimate , let the data... |

14 | Kernel-based nonlinear beamforming construction using orthogonal forward selection with Fisher ratio class separability measure - Chen, Hanzo, et al. - 2004 |

Citation Context: ...) where the “new” weight vector , satisfying the triangular system . A sparse -term classifier model can be selected by incrementally maximizing a class separability measure in an OFS procedure [43], [44]. Define the two class sets , and let the numbers of points in be , respectively, with . The means and variances of training samples belonging to classes in the direction of basis are given by Fig. 12... |

11 | Genetic algorithm optimization for blind channel identification with higher order cumulant fitting - Chen, Wu, et al. - 1997 |

11 | Digital IIR filter design using adaptive simulated annealing - Chen, Istepanian, et al. - 2001 |

Citation Context: ...for system identification, where x(k) is the system input, y(k) the filter output, and d(k) the noisy plant observation. have been applied to IIR filter design; see, for example, [15], [16], and [31]–[34]. Consider the use of IIR filter in system identification application, as depicted in Fig. 3, where the IIR filter with the model transfer function (6) is used to model the unknown plant with the syst... |

9 | IIR model identification using batch-recursive adaptive simulated annealing algorithm,” - Chen - 2000 |

5 | Efficient global optimization using SPSA - Maryak, Chin - 1999 |

5 | Adaptive simulated annealing for the optimal design of electromagnetic devices - Wang, Yan, et al. - 1996 |


4 | Genetic algorithms for digital signal processing - White, Flockton - 1994 |

Citation Context: ...re the genetic algorithm (GA) [7]–[10] and adaptive simulated annealing (ASA) [11]–[14]. Both the GA and ASA have attracted considerable attention in signal processing applications; see, for example, [15]–[23]. The GA and ASA belong to a class of so-called guided random search methods. The underlying mechanisms for guiding optimization search process are, however, very... (Manuscript received June 7, 2004...) |

1 | Using the adaptive simulated annealing algorithm to estimate ocean-bottom geoacoustic properties from measured and synthetic transmission loss data - Neumann, Muncill - 2004 |

Citation Context: ...e genetic algorithm (GA) [7]–[10] and adaptive simulated annealing (ASA) [11]–[14]. Both the GA and ASA have attracted considerable attention in signal processing applications; see, for example, [15]–[23]. The GA and ASA belong to a class of so-called guided random search methods. The underlying mechanisms for guiding optimization search process are, however, very... |