Results 1-10 of 3,146,303
Valuing American options by simulation: A simple least-squares approach
 Review of Financial Studies
, 2001
"... This article presents a simple yet powerful new approach for approximating the value of American options by simulation. The key to this approach is the use of least squares to estimate the conditional expected payoff to the option holder from continuation. This makes this approach readily applicable ..."
Cited by 511 (9 self)
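The regression step is easy to sketch. Below is a minimal least-squares Monte Carlo pricer for an American put under geometric Brownian motion; the parameter values (S0 = 36, K = 40, etc.) and the quadratic regression basis are illustrative choices, not taken from the article:

```python
import numpy as np

def lsm_american_put(S0=36.0, K=40.0, r=0.06, sigma=0.2, T=1.0,
                     steps=50, paths=20000, seed=0):
    """Least-squares Monte Carlo price of an American put under GBM."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    # Simulate geometric Brownian motion paths (times dt, 2dt, ..., T).
    z = rng.standard_normal((paths, steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma ** 2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    cash = np.maximum(K - S[:, -1], 0.0)      # exercise value at maturity
    for t in range(steps - 2, -1, -1):        # backward induction
        cash *= np.exp(-r * dt)               # discount one step
        itm = K - S[:, t] > 0.0               # regress on in-the-money paths only
        if itm.sum() < 4:
            continue
        x = S[itm, t]
        # Least squares estimates the conditional continuation value.
        A = np.column_stack([np.ones_like(x), x, x * x])
        coef, *_ = np.linalg.lstsq(A, cash[itm], rcond=None)
        exercise = K - x
        ex = exercise > A @ coef              # exercise where payoff beats continuation
        cash[np.where(itm)[0][ex]] = exercise[ex]
    return float(np.exp(-r * dt) * cash.mean())

price = lsm_american_put()
```

With these (standard test-case) parameters the estimate should land in the neighborhood of 4.5, slightly below the true early-exercise value because the regression basis is finite.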
Least-Squares Policy Iteration
 JOURNAL OF MACHINE LEARNING RESEARCH
, 2003
"... We propose a new approach to reinforcement learning for control problems which combines value-function approximation with linear architectures and approximate policy iteration. This new approach ..."
Cited by 461 (12 self)
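The least-squares fixed-point machinery underlying this approach can be shown in miniature with LSTD policy evaluation; the two-state chain, rewards, and discount factor below are invented for illustration:

```python
import numpy as np

# Toy two-state chain under a fixed policy (transition matrix P, rewards r,
# and features Phi are invented for illustration).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
r = np.array([0.0, 1.0])
gamma = 0.9
Phi = np.eye(2)                    # tabular one-hot features: LSTD is exact here

# LSTD fixed point: solve A w = b with A = Phi^T (Phi - gamma P Phi), b = Phi^T r.
A = Phi.T @ (Phi - gamma * (P @ Phi))
b = Phi.T @ r
w = np.linalg.solve(A, b)

# Sanity check target: the exact value function (I - gamma P)^{-1} r.
V_exact = np.linalg.solve(np.eye(2) - gamma * P, r)
```

With non-tabular features the same linear solve gives the best linear fixed-point approximation; LSPI applies the state-action analogue (LSTDQ) inside a policy-iteration loop.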
Least Median of Squares Regression
 JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
, 1984
LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares
 ACM Trans. Math. Software
, 1982
"... An iterative method is given for solving Ax = b and min ‖Ax − b‖₂, where the matrix A is large and sparse. The method is based on the bidiagonalization procedure of Golub and Kahan. It is analytically equivalent to the standard method of conjugate gradients, but possesses more favorable numerical ..."
Cited by 649 (21 self)
gradient algorithms, indicating that LSQR is the most reliable algorithm when A is ill-conditioned. Categories and Subject Descriptors: G.1.2 [Numerical Analysis]: Approximation: least squares approximation; G.1.3 [Numerical Analysis]: Numerical Linear Algebra: linear systems (direct and
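SciPy ships an implementation of this algorithm as `scipy.sparse.linalg.lsqr`; a small sketch on a random sparse system (problem size and density are arbitrary):

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
# A sparse 100x20 problem min ||Ax - b||_2 with a known consistent right-hand side.
A = sparse_random(100, 20, density=0.1, format="csr", random_state=0)
x_true = rng.standard_normal(20)
b = A @ x_true

x = lsqr(A, b, atol=1e-12, btol=1e-12)[0]   # first return value is the solution
residual = np.linalg.norm(A @ x - b)
```

Because b lies in the range of A here, the residual converges to (numerical) zero; for inconsistent systems LSQR returns the least-squares minimizer instead.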
Direct Least Square Fitting of Ellipses
, 1998
Cited by 421 (3 self)
This work presents a new efficient method for fitting ellipses to scattered data. Previous algorithms either fitted general conics or were computationally expensive. By minimizing the algebraic distance subject to the constraint 4ac − b² = 1, the new method incorporates the ellipticity constraint into the normalization factor. The proposed method combines several advantages: (i) it is ellipse-specific, so that even bad data will always return an ellipse; (ii) it can be solved naturally by a generalized eigensystem; and (iii) it is extremely robust, efficient and easy to implement.
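A plain-NumPy sketch of the constrained formulation (scatter matrix, constraint matrix, generalized eigenproblem); the test ellipse and noise level below are illustrative, and this is a sketch of the published method, not the authors' code:

```python
import numpy as np

def fit_ellipse(x, y):
    """Direct least-squares ellipse fit: minimize the algebraic distance ||D a||^2
    subject to 4ac - b^2 = 1, posed as a generalized eigenproblem."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    S = D.T @ D                       # scatter matrix
    C = np.zeros((6, 6))              # encodes the constraint 4ac - b^2 = 1
    C[0, 2] = C[2, 0] = 2.0
    C[1, 1] = -1.0
    # S a = lambda C a; the ellipse solution is the eigenvector belonging to
    # the single positive eigenvalue of S^{-1} C.
    w, v = np.linalg.eig(np.linalg.solve(S, C))
    i = int(np.argmax(w.real))
    return v[:, i].real               # conic coefficients (a, b, c, d, e, f)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
x = 3.0 * np.cos(t) + 0.01 * rng.standard_normal(t.size)  # noisy points on
y = 2.0 * np.sin(t) + 0.01 * rng.standard_normal(t.size)  # x^2/9 + y^2/4 = 1
a, b, c, d, e, f = fit_ellipse(x, y)
```

The small noise keeps S nonsingular (exact conic data makes D rank-deficient); the returned coefficients always satisfy the ellipticity condition 4ac − b² > 0.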
Benchmarking Least Squares Support Vector Machine Classifiers
 NEURAL PROCESSING LETTERS
, 2001
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of eq ..."
Cited by 446 (46 self)
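The resulting linear system is small enough to sketch directly; the RBF kernel, hyperparameters, and toy two-cluster data below are illustrative assumptions:

```python
import numpy as np

def rbf(X1, X2, s=1.0):
    """Gaussian (RBF) kernel matrix between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * s * s))

def lssvm_fit(X, y, gamma=10.0, s=1.0):
    """LS-SVM training: the SVM's QP is replaced by one linear system
    [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf(X, X, s)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], np.ones(n)]))
    return sol[0], sol[1:]            # bias b, coefficients alpha

def lssvm_predict(X, y, b, alpha, Xnew, s=1.0):
    return np.sign(rbf(Xnew, X, s) @ (alpha * y) + b)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)), rng.normal(2.0, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, y, b, alpha, X)
```

Unlike the standard SVM, the equality constraints make every training point a support vector; sparsity is traded for a single dense linear solve.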
The approximation power of moving least-squares
 Math. Comp
, 1998
"... A general method for near-best approximations to functionals on R^d, using scattered-data information, is discussed. The method is actually the moving least-squares method, presented by the Backus-Gilbert approach. It is shown that the method works very well for interpolation, smoothing and ..."
Cited by 159 (7 self)
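In one dimension the method reduces to a weighted local polynomial fit at each evaluation point; the Gaussian weight, quadratic local basis, and test function below are illustrative choices:

```python
import numpy as np

def mls_eval(x_eval, x_data, f_data, h=0.1):
    """Moving least-squares in 1-D: at each point x0, fit a local quadratic by
    weighted least squares with a Gaussian weight of width h, evaluate at x0."""
    out = np.empty_like(x_eval)
    for k, x0 in enumerate(x_eval):
        dx = x_data - x0
        w = np.sqrt(np.exp(-(dx / h) ** 2))          # sqrt of locality weights
        A = np.column_stack([np.ones_like(dx), dx, dx * dx])
        coef, *_ = np.linalg.lstsq(A * w[:, None], f_data * w, rcond=None)
        out[k] = coef[0]                             # local polynomial at x0
    return out

x_data = np.linspace(0.0, 1.0, 50)                   # scattered-data sites
f_data = np.sin(2.0 * np.pi * x_data)
x_eval = np.linspace(0.2, 0.8, 13)
approx = mls_eval(x_eval, x_data, f_data)
```

Because the weights move with the evaluation point, the approximant is smooth even though each fit is only a low-degree polynomial.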
Regularized Least-Squares Classification
"... We consider the solution of binary classification problems via Tikhonov regularization in a Reproducing Kernel Hilbert Space using the square loss, and denote the resulting algorithm Regularized Least-Squares Classification (RLSC). We sketch ..."
Cited by 100 (1 self)
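With the square loss, training reduces to a single linear solve; the RBF kernel, regularization value, and toy data below are illustrative assumptions:

```python
import numpy as np

def rbf(X1, X2, s=1.0):
    """Gaussian (RBF) kernel matrix between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * s * s))

# RLSC: square-loss Tikhonov regularization in an RKHS. The representer theorem
# gives f(x) = sum_i c_i k(x, x_i), and training is the solve (K + lam*n*I) c = y.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)), rng.normal(2.0, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])      # labels in {-1, +1}

n, lam = len(y), 1e-2
K = rbf(X, X)
c = np.linalg.solve(K + lam * n * np.eye(n), y)      # one dense linear solve
pred = np.sign(rbf(X, X) @ c)                        # classify by sign of f(x)
```

The appeal of the scheme is exactly this simplicity: no QP solver is needed, only a regularized kernel system.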