Results 1–10 of 1,714,362
Maximum likelihood from incomplete data via the EM algorithm
Journal of the Royal Statistical Society, Series B, 1977
"... A broadly applicable algorithm for computing maximum likelihood estimates from incomplete data is presented at various levels of generality. Theory showing the monotone behaviour of the likelihood and convergence of the algorithm is derived. Many examples are sketched, including missing value situations, applications to grouped, censored or truncated data, finite mixture models, variance component estimation, hyperparameter estimation, iteratively reweighted least squares and factor analysis."
Cited by 11807 (17 self)
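The first result describes the EM algorithm and, in particular, the monotone behaviour of the likelihood across iterations. As a minimal illustrative sketch (not the paper's general formulation), here is EM for a two-component one-dimensional Gaussian mixture; all function and variable names are ours, and the synthetic data is purely for demonstration:

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch)."""
    # Crude initialisation: equal weights, means at the data extremes.
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    log_liks = []
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to w_k * N(x_i | mu_k, var_k).
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        joint = w * dens                                  # shape (n, 2)
        log_liks.append(np.log(joint.sum(axis=1)).sum())  # current log-likelihood
        r = joint / joint.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var, log_liks

# Synthetic data: mixture of N(-2, 1) and N(3, 0.25).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
w, mu, var, ll = em_gmm_1d(x)
# The recorded log-likelihood sequence is monotonically non-decreasing,
# as the paper's theory guarantees.
assert all(b >= a - 1e-9 for a, b in zip(ll, ll[1:]))
```

The monotonicity assertion at the end checks exactly the property the abstract highlights: each E/M pair can never decrease the observed-data likelihood.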
Local Regularization Assisted Orthogonal Least Squares Regression
IEEE Transactions on Neural Networks (submitted), 2001
"... A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability for the orthogonal least ..."
Cited by 36 (10 self)
Orthogonal Least Squares Algorithm Applied to the Initialization of
"... An efficient procedure is proposed for initializing two-layer perceptrons and for determining the optimal number of hidden neurons. This is based on the Orthogonal Least Squares method, which is typical of RBF as well as Wavelet networks. Some experiments are discussed, in which the proposed ..."
Orthogonal Least Squares Regression with Tunable Kernels
"... A novel technique is proposed to construct sparse regression models based on the orthogonal least squares method with tunable kernels. The proposed technique tunes the centre vector and diagonal covariance matrix of each individual regressor by incrementally minimising the training mean square error using ..."
Orthogonal Least Square with Boosting for Regression Modeling
"... A novel technique is presented to construct sparse regression models based on the orthogonal least square method with boosting. This technique tunes the mean vector and diagonal covariance matrix of each individual regressor by incrementally minimizing the training mean square error. An efficient weighted ..."
Cited by 1 (1 self)
Orthogonal Least Squares Solutions for Linear Operators
"... This paper solves the problem of finding, in a least squares sense, the coefficients of a series expansion of a function in terms of a chosen orthogonal basis from the knowledge not of the function itself but from the action of a linear operator upon it. The coefficients are evaluated by inner products ..."
On the Difference Between Orthogonal Matching Pursuit and Orthogonal Least Squares
, 2007
"... Greedy algorithms are often used to solve underdetermined inverse problems when the solution is constrained to be sparse, i.e. the solution is only expected to have a relatively small number of nonzero elements. Two different algorithms have been suggested to solve such problems in the signal processing and control communities: Orthogonal Matching Pursuit and Orthogonal Least Squares, respectively. In the current literature there exists a great deal of confusion between the two strategies. For example, the latter strategy has often been called Orthogonal Matching Pursuit and has repeatedly ..."
Cited by 20 (1 self)
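The distinction that this entry draws between the two greedy strategies is easy to state in code: at each step, OMP picks the dictionary atom most correlated with the current residual, while OLS picks the atom that most reduces the residual after a full least-squares refit. The following sketch, with names of our own choosing, implements both selection rules side by side:

```python
import numpy as np

def greedy_select(A, y, k, rule):
    """Select k columns of A greedily; rule is 'omp' or 'ols' (sketch)."""
    n, m = A.shape
    S = []          # indices of selected atoms
    r = y.copy()    # current residual
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in range(m):
            if j in S:
                continue
            if rule == "omp":
                # OMP: atom most correlated with the residual.
                score = abs(A[:, j] @ r) / np.linalg.norm(A[:, j])
            else:
                # OLS: atom giving the largest drop in residual norm
                # after a full least-squares refit on S plus {j}.
                cols = A[:, S + [j]]
                coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
                score = -np.linalg.norm(y - cols @ coef)
            if score > best_score:
                best, best_score = j, score
        S.append(best)
        # Re-fit on the selected set and update the residual (OMP and OLS
        # both orthogonalise the residual against the selected atoms).
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        r = y - A[:, S] @ coef
    return S

# On an orthonormal dictionary the two rules coincide; on a coherent
# (correlated) dictionary they can select different atoms, which is the
# source of the confusion the paper discusses.
A = np.eye(5)
y = np.array([0.1, 3.0, -2.0, 0.5, 1.0])
omp_sel = greedy_select(A, y, 2, "omp")
ols_sel = greedy_select(A, y, 2, "ols")
```

The inner OLS refit makes each candidate evaluation more expensive than OMP's single inner product, which is precisely why the two are worth distinguishing.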
Fast orthogonal least squares algorithm for efficient subset model selection
, 1995
"... An efficient implementation of the orthogonal least squares algorithm for subset model selection is derived in this correspondence. The computational complexity of the algorithm is examined, and the result shows that this new fast orthogonal least squares algorithm significantly reduces computat ..."
Cited by 34 (9 self)
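For readers unfamiliar with the baseline this correspondence speeds up: classic OLS subset selection orthogonalises each candidate regressor against those already chosen and picks the one with the largest error-reduction ratio (the fraction of the output variance it explains). A minimal, unoptimised sketch of that baseline, with names of our own choosing, looks like this:

```python
import numpy as np

def ols_err_selection(P, y, k):
    """Classic OLS forward subset selection via Gram-Schmidt and the
    error-reduction ratio (ERR); illustrative, unoptimised sketch."""
    P = P.astype(float)
    n, m = P.shape
    selected, W = [], []   # chosen indices and their orthogonalised columns
    yty = y @ y
    for _ in range(k):
        best, best_err, best_w = None, -1.0, None
        for j in range(m):
            if j in selected:
                continue
            # Orthogonalise the candidate column against the chosen ones.
            w = P[:, j].copy()
            for u in W:
                w -= (u @ P[:, j]) / (u @ u) * u
            if w @ w < 1e-12:
                continue   # candidate is (numerically) dependent; skip it
            # ERR: fraction of the output energy this regressor explains.
            err = (w @ y) ** 2 / ((w @ w) * yty)
            if err > best_err:
                best, best_err, best_w = j, err, w
        selected.append(best)
        W.append(best_w)
    return selected

# If y is exactly one dictionary column, that column has ERR = 1
# and is picked first.
rng = np.random.default_rng(0)
P = rng.normal(size=(20, 5))
y = P[:, 2]
sel = ols_err_selection(P, y, 1)
```

The repeated re-orthogonalisation of every remaining candidate at every step is the cost that the fast algorithm in this entry reduces.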
using Orthogonal Least Squares Application to a depollution
"... an interpretable fuzzy rule base from data ..."