
## Hybrid Wavelet Model Construction Using Orthogonal Forward Selection with Boosting Search

Citations: 3 (0 self)

### Citations

3207 | A Wavelet Tour of Signal Processing - Mallat - 1999 |
Citation Context: ...al applications require that the kernel function have good locality to describe the local character of the original function. Wavelet techniques have shown promise for nonstationary function estimation [12, 13]. Since the locality of wavelets makes the estimation of functions with local character efficient, it is worthwhile for us to study the combination of wavelets and OLSR. In order to obtain an ... |

2602 | Ten Lectures on Wavelets - Daubechies - 1992 |
Citation Context: ...al applications require that the kernel function have good locality to describe the local character of the original function. Wavelet techniques have shown promise for nonstationary function estimation [12, 13]. Since the locality of wavelets makes the estimation of functions with local character efficient, it is worthwhile for us to study the combination of wavelets and OLSR. In order to obtain an ... |

435 | Orthogonal least squares learning algorithm for radial basis function networks - Chen, C, et al. - 1991 |

269 | Orthogonal least squares methods and their application to nonlinear system identification - Chen, Billings, et al. - 1989 |

64 | Regression estimation with support vector learning machines - Smola - 1996 |
Citation Context: ...an kernel. 2. Theory Consider the problem of fitting the N pairs of training data {(x(l), y(l))}_{l=1}^{N} with the regression model y(l) = ŷ(l) + e(l) = Σ_{i=1}^{M} w_i φ_i(l) + e(l), l = 1, 2, …, N (1), where ŷ(l) denotes the "approximated" model output, w_i the model weights, e(l) the modeling error at x(l), and φ_i(l) = k(x(l), c_i) the regressors generated from a given kernel f... |
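
The regression model (1) quoted above is the usual kernel expansion. The sketch below shows how the regressor matrix Φ can be built, assuming a Gaussian kernel and centres placed at the training points; both are illustrative choices, not taken from the paper.

```python
import numpy as np

# Build the regressor matrix Phi[l, i] = phi_i(l) = k(x(l), c_i) of model (1).
# A Gaussian kernel of width sigma is assumed here purely for illustration.
def regressor_matrix(x, centres, sigma=1.0):
    return np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2.0 * sigma ** 2))

x = np.linspace(0.0, 1.0, 50)      # training inputs x(l)
centres = x.copy()                 # one candidate centre c_i per training point
Phi = regressor_matrix(x, centres)
w = np.zeros(len(centres))         # model weights w_i, still to be estimated
y_hat = Phi @ w                    # yhat(l) = sum_i w_i * phi_i(l)
```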

42 | Least Squares Support Vector - Suykens, Vandewalle - 1999 |
Citation Context: ...vector w = [w_1, …, w_M]^T, output vector y = [y(1), …, y(N)]^T, and error vector e = [e(1), …, e(N)]^T. Then the regression model (1) can be written in the matrix form y = Φw + e (2). The goal of modeling the data is to find the best linear combination of the columns of Φ (i.e., the best value for w) to explain y according to some criterion. The popular criterion is to minimize the sum o... |
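
The matrix form (2) can be exercised directly: the sketch below generates synthetic data y = Φw + e and recovers the weights with ordinary least squares, the sum-of-squared-errors criterion the context alludes to. All sizes, seeds, and the noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 100, 10
Phi = rng.normal(size=(N, M))                  # regressor matrix, columns phi_i
w_true = rng.normal(size=M)
y = Phi @ w_true + 0.1 * rng.normal(size=N)    # y = Phi w + e

# least-squares weights minimising ||y - Phi w||^2
w_ls, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```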

41 | Linear programs for automatic accuracy control in regression - Smola, Schölkopf, et al. - 1999 |
Citation Context: ...lar matrix with unit diagonal elements and H = [H_1, H_2, …, H_M] has orthogonal columns satisfying H_i^T H_j = 0 if i ≠ j. The regression model (2) can alternatively be expressed as y = Hθ + e (3), where the new weight vector θ = [θ_1, …, θ_M]^T satisfies the triangular system θ = Aw. Although the problem is converted to finding the best solution in the linear space spanned by the columns of H (i.e., the best value ... |
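
One way to realise the decomposition behind (3) is via a QR factorisation, rescaled so that A has a unit diagonal and H keeps orthogonal (not orthonormal) columns. This is an illustrative route, not necessarily the construction used in the paper.

```python
import numpy as np

# Orthogonal decomposition Phi = H A used by OLS-type methods: A is unit
# upper triangular, the columns of H are mutually orthogonal.
rng = np.random.default_rng(2)
Phi = rng.normal(size=(100, 5))

Q, R = np.linalg.qr(Phi)       # Phi = Q R, Q orthonormal, R upper triangular
D = np.diag(np.diag(R))
H = Q @ D                      # orthogonal (not orthonormal) columns
A = np.linalg.solve(D, R)      # upper triangular with unit diagonal

# With theta = A w, model (2) y = Phi w + e becomes (3) y = H theta + e.
```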

40 | Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks - Chen, Wu, et al. - 1999 |
Citation Context: ...ed in our experiment. In all the following experiments, we assign P_s = 10, N_G = 50, N_b = 9, and ς = 0.02. The example [4] is the highly oscillating function r(x) = sin(11π/(0.35x + 1)) on the fixed domain x ∈ [0, 10]. We generated 30 training sets of size l = 100 from this function by adding independent Gaussian noise n_i ~ N(0, 0.1²). As described in Subsection 3.2, one can readily select th... |
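
The quoted experimental setup can be reproduced roughly as follows. Note that the functional form of r(x) is copied from the extraction-damaged context, so the exact expression in [4] may differ.

```python
import numpy as np

# Benchmark from the quoted context: r(x) = sin(11*pi / (0.35*x + 1)) on
# [0, 10]; 30 training sets of l = 100 points with N(0, 0.1^2) noise added.
rng = np.random.default_rng(3)

def r(x):
    return np.sin(11.0 * np.pi / (0.35 * x + 1.0))

train_sets = []
for _ in range(30):
    x = rng.uniform(0.0, 10.0, size=100)
    y = r(x) + rng.normal(0.0, 0.1, size=100)
    train_sets.append((x, y))
```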

25 | Wavelet support vector machine - Zhang, Zhou, et al. - 2004 |
Citation Context: ...ultidimensional wavelet function can be written as the product of 1-d wavelet functions, h(x) = ∏_{i=1}^{N} h(x_i) with x = (x_1, …, x_N) ∈ R^N. In this paper, we use the same mother wavelet as in [14], that is, h(x) = cos(1.75x) exp(−x²/2). In order to obtain an even sparser model, this paper also constructs a novel hybrid wavelet as the kernel function: k(x, c) = cos[1.75(x − c)] exp[−((x − c)/d)²/2] if |x − c| ≤ ... |
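
A sketch of the mother wavelet and its tensor-product kernel from the quoted context. The scale parameter d below is an assumed free parameter (matching the d_k mentioned in a later context); the truncated hybrid variant is omitted.

```python
import numpy as np

# Mother wavelet h(x) = cos(1.75 x) exp(-x^2 / 2) and its tensor-product
# extension to N-dimensional inputs.
def h(x):
    return np.cos(1.75 * x) * np.exp(-x ** 2 / 2.0)

def wavelet_kernel(x, c, d=1.0):
    # k(x, c) = prod_i h((x_i - c_i) / d); d is an assumed scale parameter
    x = np.atleast_1d(np.asarray(x, dtype=float))
    c = np.atleast_1d(np.asarray(c, dtype=float))
    return float(np.prod(h((x - c) / d)))
```

Since h is even, the resulting kernel is symmetric in its arguments, as a translation-invariant kernel should be.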

24 | Using wavelet transform and fuzzy neural network for VPC detection from the Holter ECG. - Shyu, Wu, et al. - 2004 |

23 | Experiments with repeating weighted boosting search for optimization in signal processing applications - Chen, Wang, et al. - 2005 |
Citation Context: ...the parameters of the k-th wavelet or hybrid wavelet regressor, that is, c_k and d_k, such as the genetic algorithm and adaptive simulated annealing. RWBS is a recently proposed global search algorithm [11]. It is extremely simple and easy to implement, involving minimal programming effort, so we perform this optimization by RWBS. Let the vector u_k contain both center parameters and scale parameter... |
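
A heavily simplified, boosting-flavoured search in the spirit of RWBS: a small population, boosting-style weights favouring low-cost points, and a weighted convex combination plus its mirror about the current best. The exact weight-update schedule and restart rules of the published algorithm [11] differ, so treat this purely as an illustration of the idea.

```python
import numpy as np

def weighted_boosting_search(cost, bounds, Ps=10, NG=50, Nb=9, seed=0):
    """Simplified 1-D boosting-style global search, loosely after RWBS [11].
    Illustrative sketch only; the published weight-update rule differs."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    best_u, best_J = None, np.inf
    for _ in range(NG):
        U = rng.uniform(lo, hi, size=Ps)      # fresh population each generation
        if best_u is not None:
            U[0] = best_u                     # keep the best point found so far
        delta = np.full(Ps, 1.0 / Ps)         # boosting weights
        for _ in range(Nb):
            J = np.array([cost(u) for u in U])
            i_best, i_worst = int(np.argmin(J)), int(np.argmax(J))
            if J[i_best] < best_J:
                best_u, best_J = U[i_best], J[i_best]
            # reweight so that low-cost points gain influence (assumed rule)
            eta = J / J.sum() if J.sum() > 0 else np.full(Ps, 1.0 / Ps)
            beta = float(delta @ eta)
            delta = delta * beta ** eta
            delta = delta / delta.sum()
            # weighted convex combination and its mirror about the best point
            u1 = float(delta @ U)
            u2 = float(np.clip(2.0 * U[i_best] - u1, lo, hi))
            U[i_worst] = u1 if cost(u1) < cost(u2) else u2
    return best_u, best_J

best_u, best_J = weighted_boosting_search(lambda u: (u - 3.0) ** 2, (0.0, 10.0))
```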

12 | Kernel basis pursuit - Guigue, Rakotomamonjy, et al. - 2005 |
Citation Context: ...t of kernels with different scales. MSSVR outperforms traditional methods in terms of precision and sparseness, which will also be illustrated in our experiments. The kernel basis pursuit (KBP) algorithm [6] is another possible solution, which enables us to build an l1-regularized multiple-kernel estimator for regression. However, KBP is prone to over-fitting noisy data. We will compare its performance w... |
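
The l1-regularized estimation that KBP performs can be imitated with a generic proximal-gradient (ISTA) solver. This is a stand-in for illustration, not the KBP algorithm of [6] itself; the data and regularization weight are arbitrary.

```python
import numpy as np

def ista(Phi, y, lam=0.1, n_iter=500):
    """Minimise 0.5*||y - Phi w||^2 + lam*||w||_1 by proximal gradient
    descent (a generic l1 solver, not KBP itself)."""
    L = np.linalg.norm(Phi, 2) ** 2            # Lipschitz const. of the gradient
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = w - Phi.T @ (Phi @ w - y) / L      # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return w

# tiny demo: recover a sparse weight vector from noiseless data
rng = np.random.default_rng(4)
Phi = rng.normal(size=(50, 10))
w_true = np.zeros(10)
w_true[0], w_true[3] = 5.0, -5.0
y = Phi @ w_true
w_hat = ista(Phi, y)
```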

4 | Building meaningful representations for nonlinear modelling of 1D- and 2D-signals: Applications to biomedical signals - Dubois, Quenet, et al. - 2006 |

2 | Non-flat function estimation with a multiscale support vector regression. Neurocomputing - Zheng, Wang, et al. - 2006 |
Citation Context: ... variance for all kernel regressors and estimate both the steep and the smooth variations at a single, unchanged scale. Recently, a revised version of SVR, namely multiscale support vector regression (MSSVR) [4, 5], was proposed; it combines several feature spaces rather than the single feature space of standard SVR. The constructed multi-feature space is induced by a set of kernels with different scales. MSSVR ou... |
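
A multi-feature space induced by kernels at several scales can be illustrated by simply averaging Gaussian kernels with different widths. This is in the spirit of MSSVR [4, 5], not the authors' exact construction; the widths are arbitrary choices.

```python
import numpy as np

# A multiscale kernel as an average of Gaussian kernels at several widths,
# so both steep (small sigma) and smooth (large sigma) variations are covered.
def multiscale_kernel(x, z, sigmas=(0.1, 1.0, 10.0)):
    return sum(np.exp(-(x - z) ** 2 / (2.0 * s ** 2)) for s in sigmas) / len(sigmas)
```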

2 | ECG Signal Analysis through Hidden Markov Models - Rodrigo, Bernadette, et al. |

1 | Training Sparse MS-SVR with an Expectation-Maximization Algorithm - Zheng, Wang, et al. |
Citation Context: ... variance for all kernel regressors and estimate both the steep and the smooth variations at a single, unchanged scale. Recently, a revised version of SVR, namely multiscale support vector regression (MSSVR) [4, 5], was proposed; it combines several feature spaces rather than the single feature space of standard SVR. The constructed multi-feature space is induced by a set of kernels with different scales. MSSVR ou... |

1 | An Input-Delay Neural-Network-Based Approach for Piecewise ECG - Amitava, Patrick - 2005 |