Results 1–10 of 12,634
SVMTorch: Support Vector Machines for Large-Scale Regression Problems
Journal of Machine Learning Research, 2001
Cited by 312 (10 self)
"... Support Vector Machines (SVMs) for regression problems are trained by solving a quadratic optimization problem which needs on the order of l² memory and time resources to solve, where l is the number of training examples. In this paper, we propose a decomposition algorithm, SVMTorch, which ..."
Regression Problems
"... Type-2 fuzzy systems have been under investigation for a while, and the projection of type-2 understanding of uncertainty management onto connectionist models, i.e. neural networks, seems an interesting field of research. This paper considers neurons having multiple bias values, defining a new structure that resembles the uncertainty-handling capability of type-2 fuzzy models. Such a neuron provides many activation levels that are combined to obtain the neuron response. A neural network with this new model is presented. Several simulation results are shown and the universal approximation property is emphasized. Keywords: type-2 neuron model, type-2 neural networks ..."
Boosting Methodology for Regression Problems
The Seventh International Workshop on Artificial Intelligence and Statistics (Uncertainty '99), 1999
Cited by 33 (1 self)
"... Classification problems have dominated research on boosting to date. The application of boosting to regression problems, on the other hand, has received little investigation. In this paper we develop a new boosting method for regression problems. We cast the regression problem as a classification ..."
Least Angle Regression
2004
Cited by 1326 (37 self)
"... The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to ..."
"... implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS ..."
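The Lasso constraint described in this entry — bounding the sum of absolute regression coefficients — has a particularly transparent solution when the design matrix has orthonormal columns: ordinary least squares followed by soft-thresholding. A minimal NumPy sketch (the data and threshold value are illustrative assumptions, not from the paper):

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: shrinks z toward zero by t, clipping at zero."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(100, 4)))   # design with orthonormal columns
beta_true = np.array([3.0, -2.0, 0.0, 0.0])
y = Q @ beta_true + rng.normal(scale=0.01, size=100)

beta_ols = Q.T @ y                     # OLS solution under an orthonormal design
beta_lasso = soft_threshold(beta_ols, 0.5)
print(beta_lasso)                      # small coefficients are driven exactly to zero
```

The example makes visible why the L1 constraint produces sparse models: coefficients whose OLS estimate falls below the threshold are set exactly to zero, while the rest are shrunk.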
Additive Logistic Regression: A Statistical View of Boosting
Annals of Statistics, 1998
Cited by 1750 (25 self)
"... Boosting (Freund & Schapire 1996, Schapire & Singer 1998) is one of the most important recent developments in classification methodology. The performance of many classification algorithms can often be dramatically improved by sequentially applying them to reweighted versions of the input data, and taking a weighted majority vote of the sequence of classifiers thereby produced. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can ..."
Regularization Paths for Generalized Linear Models via Coordinate Descent
2009
Cited by 724 (15 self)
"... We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial regression problems, while the penalties include ℓ1 (the lasso), ℓ2 (ridge regression) and mixtures of the two (the elastic net) ..."
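The coordinate descent idea behind this entry can be sketched for its simplest case, the lasso with squared-error loss: cycle over coefficients, and update each one by soft-thresholding its univariate fit to the partial residual. This is a bare-bones illustration, not the paper's implementation, and the data and penalty value are assumptions:

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=200):
    """Cyclic coordinate descent for min_beta 0.5*||y - X beta||^2 + lam*||beta||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(p):
            # partial residual with feature j's current contribution added back
            r_j = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r_j
            # univariate soft-thresholded update for coefficient j
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
beta_true = np.array([2.0, 0.0, -3.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(scale=0.1, size=200)

beta_hat = lasso_cd(X, y, lam=5.0)
print(np.round(beta_hat, 2))
```

Each coordinate update is closed-form and cheap, which is the source of the speed the abstract claims; warm-starting along a grid of penalties then yields the full regularization path.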
General Problem: Regression
Working Group on Statistical Learning, 2005
"... The aim is to find f(X) for predicting Y given the values of X. Linear model: Y = f(X) + ε, where E(ε) = 0 and ε is independent of X; f(X) = X^T β for a set of parameters β. Another approach is to use linear basis expansions: replace X with a transformation of it, and subsequently use a linear model in the new space of input features. Let h_m(X): R^p → R be the m-th transformation of X. Then f(X) = Σ_{m=1}^{M} h_m(X) β_m. Examples of h_m(X) are polynomial and trigonometric expansions. ..."
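The basis-expansion recipe in this entry — map X through transformations h_m, then fit f(X) = Σ_m h_m(X) β_m by least squares in the expanded feature space — can be sketched in a few lines of NumPy. The quadratic target and noise level are assumptions for illustration:

```python
import numpy as np

def polynomial_basis(x, degree):
    """Map a 1-D input to polynomial features h_m(x) = x^m, m = 0..degree."""
    return np.vstack([x ** m for m in range(degree + 1)]).T

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
y = 1.0 - 3.0 * x + 0.5 * x ** 2 + rng.normal(scale=0.05, size=x.shape)

H = polynomial_basis(x, degree=2)              # n x (degree+1) design matrix
beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # ordinary least squares in the new space
print(np.round(beta, 2))
```

The model stays linear in β even though it is nonlinear in x, which is exactly why the standard linear-model machinery carries over unchanged.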
Training Linear SVMs in Linear Time
2006
Cited by 549 (6 self)
"... Linear Support Vector Machines (SVMs) have become one of the most prominent machine learning techniques for high-dimensional sparse data commonly encountered in applications like text classification, word-sense disambiguation, and drug design. These applications involve a large number of examples n as well as a large number of features N, while each example has only s << N non-zero features. This paper presents a Cutting-Plane Algorithm for training linear SVMs that provably has training time O(sn) for classification problems and O(sn log(n)) for ordinal regression problems. The algorithm ..."
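The paper's cutting-plane solver is beyond a short sketch, but the objective it optimizes — the L2-regularized hinge loss of a linear SVM — can be minimized on toy data with a Pegasos-style stochastic subgradient method. This is a deliberately different, much simpler solver than the one in the paper, and the data and hyperparameters are illustrative assumptions:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Stochastic subgradient descent on the linear-SVM objective
    lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * (w @ x_i))."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)            # decaying Pegasos step size
            if y[i] * (w @ X[i]) < 1:        # margin violated: hinge subgradient active
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                            # margin satisfied: only shrink w
                w = (1 - eta * lam) * w
    return w

# Toy data: two well-separated Gaussian clusters, labels in {-1, +1}.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(3.0, 1.0, size=(50, 2)),
               rng.normal(-3.0, 1.0, size=(50, 2))])
y = np.array([1.0] * 50 + [-1.0] * 50)

w = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
print("training accuracy:", acc)
```

Each update touches only one example's non-zero features, which hints at why per-iteration cost can be made proportional to sparsity s rather than to the full dimension N.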
Runtime Guarantees for Regression Problems
Cited by 2 (1 self)
"... We study theoretical runtime guarantees for a class of optimization problems that occur in a wide variety of inference problems. These problems are motivated by the LASSO framework and have applications in machine learning and computer vision. Our work shows a close connection between these problems and ..."
Regression Problems for Magnitudes
2006
"... Least squares linear regression is so popular that it is sometimes applied without checking that its basic requirements are satisfied. In particular, in studying earthquake phenomena, the conditions (a) that the uncertainty on the independent variable is at least one order of magnitude smaller than the ..."
"... be estimated and is arbitrarily set to 1. We apply these results to magnitude scale conversion, which is a common problem in seismology, with important implications in seismic hazard evaluation, and analyze it through specific tests. Our analysis concludes that the commonly used standard regression may induce ..."