Results 1–10 of 1,358,718
Quasi-Newton methods on Grassmannians and multilinear approximations of tensors
, 2009
"... Abstract. In this paper we proposed quasi-Newton and limited memory quasi-Newton methods for objective functions defined on Grassmann manifolds or a product of Grassmann manifolds. Specifically we defined BFGS and L-BFGS updates in local and global coordinates on Grassmann manifolds or a product of ..."
Cited by 17 (3 self)
Keywords: product of Grassmannians, Grassmann quasi-Newton, Grassmann BFGS, Grassmann L-BFGS, multilinear rank, symmetric multilinear rank, tensor, symmetric tensor, approximations
Relaxation of Crystals with the Quasi-Newton Method
, 1998
"... A quasi-Newton method is used to simultaneously relax the internal coordinates and lattice parameters of crystals under pressure. The symmetry of the crystal structure is preserved during the relaxation. From the inverse of the Hessian matrix, elastic properties and some optical phonon frequencies a ..."
Cited by 10 (0 self)
Quasi-Newton Methods for Image Restoration
"... Many iterative methods that are used to solve Ax = b can be derived as quasi-Newton methods for minimizing the quadratic function (1/2) xᵀAᵀAx − xᵀAᵀb. In this paper, several such methods are considered, including conjugate gradient least squares (CGLS), Barzilai-Borwein (BB), residual norm stee ..."
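The Barzilai-Borwein iteration named in this snippet can be sketched as follows. This is a hypothetical minimal implementation for the quadratic stated above, not code from the paper; the problem sizes and seed are invented.

```python
import numpy as np

# Sketch of the Barzilai-Borwein (BB) gradient iteration for the
# quadratic f(x) = (1/2) x^T A^T A x - x^T A^T b, whose minimizer
# solves the normal equations A^T A x = A^T b.
def barzilai_borwein(A, b, x0, iters=200):
    x = np.asarray(x0, dtype=float)
    g = A.T @ (A @ x - b)              # gradient of f at x
    step = 1e-4                        # small first step before BB applies
    for _ in range(iters):
        x_new = x - step * g
        g_new = A.T @ (A @ x_new - b)
        if np.linalg.norm(g_new) < 1e-12:
            return x_new               # converged
        s, y = x_new - x, g_new - g
        step = (s @ s) / (s @ y)       # BB1 step length
        x, g = x_new, g_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))       # invented least-squares problem
b = rng.standard_normal(20)
x_bb = barzilai_borwein(A, b, np.zeros(5))
```

The BB step length reuses only gradient differences, so the method needs no Hessian and no line search, which is why it fits the quasi-Newton framing of the abstract.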
SCALING METHODS FOR QUASI-NEWTON METHODS
"... Abstract. This paper presents two new self-scaling variable-metric algorithms. The first is based on a known two-parameter family of rank-two updating formulae, the second employs an initial scaling of the estimated inverse Hessian which modifies the first self-scaling algorithm. The algorithms are ..."
are compared with similar published algorithms, notably those due to Oren, Shanno and Phua, and Biggs, and with BFGS (the best known quasi-Newton method). The best of these new and published algorithms are also modified to employ inexact line searches with marginal effect. The new algorithms are superior
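For reference, the plain BFGS inverse-Hessian update that this abstract treats as its baseline can be sketched on a quadratic with an exact line search. This is an illustrative toy, not the paper's self-scaling variants; the test problem is invented.

```python
import numpy as np

# Plain BFGS applied to f(x) = (1/2) x^T Q x - b^T x.  Because f is
# quadratic, the exact minimizing step along each direction is available
# in closed form, so no line-search code is needed.
def bfgs_quadratic(Q, b, x0, iters=50):
    n = len(b)
    H = np.eye(n)                          # inverse-Hessian estimate
    I = np.eye(n)
    x = np.asarray(x0, dtype=float)
    g = Q @ x - b                          # gradient of f
    for _ in range(iters):
        p = -H @ g                         # quasi-Newton direction
        alpha = -(g @ p) / (p @ Q @ p)     # exact step for a quadratic
        s = alpha * p
        x = x + s
        g_new = Q @ x - b
        y = g_new - g
        g = g_new
        if np.linalg.norm(g) < 1e-12:
            break                          # converged; skip the update
        rho = 1.0 / (y @ s)
        # BFGS update of the inverse-Hessian estimate
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
    return x

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
Q = M @ M.T + 4.0 * np.eye(4)              # symmetric positive definite
b = rng.standard_normal(4)
x_star = bfgs_quadratic(Q, b, np.zeros(4))
```

Self-scaling methods of the kind the abstract proposes modify exactly this update by rescaling H before it is applied.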
Quasi-Newton particle Metropolis-Hastings
"... Abstract: Particle Metropolis-Hastings enables Bayesian parameter inference in general nonlinear state space models (SSMs). However, in many implementations a random walk proposal is used and this can result in poor mixing if not tuned correctly using tedious pilot runs. Therefore, we consider a new ..."
new proposal inspired by quasi-Newton algorithms that may achieve better mixing with less tuning. An advantage compared to other Hessian-based proposals is that it only requires estimates of the gradient of the log-posterior. A possible application of this new proposal is parameter inference
Constructing Free Energy Approximations and Generalized Belief Propagation Algorithms
 IEEE Transactions on Information Theory
, 2005
"... Important inference problems in statistical physics, computer vision, error-correcting coding theory, and artificial intelligence can all be reformulated as the computation of marginal probabilities on factor graphs. The belief propagation (BP) algorithm is an efficient way to solve these problems t ..."
Cited by 574 (13 self)
to generate valid approximations: the “Bethe method,” the “junction graph method,” the “cluster variation method,” and the “region graph method.” Finally, we explain how to tell whether a region-based approximation, and its corresponding GBP algorithm, is likely to be accurate, and describe empirical
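The property that makes BP a useful starting point for the region-based methods above is that on tree-structured graphs its marginals are exact. That can be checked on a toy chain; the potentials below are invented for illustration and are not from the paper.

```python
import numpy as np

# Sum-product belief propagation on a 3-variable chain x1 - x2 - x3
# of binary variables with made-up pairwise potentials.
psi12 = np.array([[1.0, 0.5],
                  [0.5, 2.0]])            # potential psi12(x1, x2)
psi23 = np.array([[1.5, 1.0],
                  [0.3, 1.0]])            # potential psi23(x2, x3)

# Messages into x2 from its two neighbours.
m_1_to_2 = psi12.sum(axis=0)              # sum over x1
m_3_to_2 = psi23.sum(axis=1)              # sum over x3

belief2 = m_1_to_2 * m_3_to_2
belief2 = belief2 / belief2.sum()         # BP marginal p(x2)

# Brute-force check: marginalize the full joint over x1 and x3.
joint = psi12[:, :, None] * psi23[None, :, :]
p2 = joint.sum(axis=(0, 2))
p2 = p2 / p2.sum()
```

On loopy graphs the same message passing is only approximate, which is what motivates the region-based free energy constructions the abstract describes.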
Active Learning with Statistical Models
, 1995
"... For many types of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992; Cohn, 1994]. We then show how the same principles may be used to select data for two alternative, statist ..."
Cited by 667 (10 self)
statistically-based learning architectures: mixtures of Gaussians and locally weighted regression. While the techniques for neural networks are expensive and approximate, the techniques for mixtures of Gaussians and locally weighted regression are both efficient and accurate.
Training Products of Experts by Minimizing Contrastive Divergence
, 2002
"... It is possible to combine multiple latent-variable models of the same data by multiplying their probability distributions together and then renormalizing. This way of combining individual “expert” models makes it hard to generate samples from the combined model but easy to infer the values of the l ..."
Cited by 823 (75 self)
derivatives with regard to the parameters can be approximated accurately and efficiently. Examples are presented of contrastive divergence learning using several types of expert on several types of data.
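A minimal CD-1 sketch for one kind of product of experts, a tiny Bernoulli restricted Boltzmann machine, is shown below. The sizes, seed, and function names are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_gradient(W, v0):
    """CD-1 approximation to the log-likelihood gradient w.r.t. W:
    data-driven statistics minus statistics after one Gibbs sweep."""
    ph0 = sigmoid(v0 @ W)                     # p(h=1 | v0), positive phase
    h0 = (rng.random(ph0.shape) < ph0) * 1.0  # sample hidden units
    pv1 = sigmoid(h0 @ W.T)                   # reconstruct visibles
    v1 = (rng.random(pv1.shape) < pv1) * 1.0
    ph1 = sigmoid(v1 @ W)                     # negative phase
    return (v0.T @ ph0 - v1.T @ ph1) / len(v0)

W = 0.01 * rng.standard_normal((6, 3))        # 6 visible, 3 hidden units
v0 = (rng.random((4, 6)) < 0.5) * 1.0         # mini-batch of 4 binary vectors
grad = cd1_gradient(W, v0)                    # ascend: W += lr * grad
```

The key point of the abstract is that this single reconstruction step replaces the intractable sampling from the combined model's equilibrium distribution.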
Fronts propagating with curvature dependent speed: algorithms based on Hamilton–Jacobi formulations
, 1988
"... We devise new numerical algorithms, called PSC algorithms, for following fronts propagating with curvature-dependent speed. The speed may be an arbitrary function of curvature, and the front also can be passively advected by an underlying flow. These algorithms approximate the equations of motion, w ..."
Cited by 1167 (60 self)
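The central idea, evolving a front as the zero level set of a function phi governed by a Hamilton-Jacobi equation, can be illustrated with a first-order upwind scheme in one dimension. This is a toy under the stated assumptions (constant speed, 1-D), not the paper's PSC algorithms.

```python
import numpy as np

# 1-D level-set equation phi_t + F |phi_x| = 0 with constant speed F.
# The front is the zero crossing of phi and should move at speed F.
dx, F = 0.01, 1.0
dt = 0.5 * dx / F                            # CFL-stable time step
steps = 40                                   # total time T = steps * dt = 0.2
x = np.arange(0.0, 1.0 + dx / 2, dx)
phi = x - 0.3                                # signed distance; front at x = 0.3

for _ in range(steps):
    d_minus = np.diff(phi, prepend=phi[0] - dx) / dx   # backward difference
    d_plus = np.diff(phi, append=phi[-1] + dx) / dx    # forward difference
    # Godunov upwind approximation of |phi_x| for F > 0
    grad_mag = np.sqrt(np.maximum(np.maximum(d_minus, 0.0) ** 2,
                                  np.minimum(d_plus, 0.0) ** 2))
    phi = phi - dt * F * grad_mag

# Locate the front by linear interpolation of the zero crossing.
i = int(np.searchsorted(phi, 0.0))
front = x[i - 1] - phi[i - 1] * dx / (phi[i] - phi[i - 1])
```

After time T = 0.2 the front should sit near x = 0.3 + F·T = 0.5; the upwind differencing is what keeps the scheme stable when the front develops corners.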
Bayesian Model Selection in Social Research (with Discussion by Andrew Gelman & Donald B. Rubin, and Robert M. Hauser, and a Rejoinder)
 SOCIOLOGICAL METHODOLOGY 1995, EDITED BY PETER V. MARSDEN. CAMBRIDGE, MASS.: BLACKWELLS.
, 1995
"... It is argued that P-values and the tests based upon them give unsatisfactory results, especially in large samples. It is shown that, in regression, when there are many candidate independent variables, standard variable selection procedures can give very misleading results. Also, by selecting a singl ..."
Cited by 548 (21 self)
single model, they ignore model uncertainty and so underestimate the uncertainty about quantities of interest. The Bayesian approach to hypothesis testing, model selection and accounting for model uncertainty is presented. Implementing this is straightforward using the simple and accurate BIC
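The BIC comparison the abstract alludes to can be sketched on invented data, assuming the standard Gaussian least-squares form BIC = n·ln(RSS/n) + k·ln(n) with additive constants dropped; smaller is better.

```python
import numpy as np

def bic(n, k, rss):
    """BIC for a least-squares fit with n points, k parameters, and
    residual sum of squares rss, under a Gaussian likelihood."""
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.standard_normal(50)   # truth: a linear trend

scores = {}
for deg in (0, 1):                            # constant vs. linear model
    coeffs = np.polyfit(x, y, deg)
    rss = float(((np.polyval(coeffs, x) - y) ** 2).sum())
    scores[deg] = bic(len(x), deg + 1, rss)
```

Because the data really do carry a linear trend, the ln(n) penalty on the extra slope parameter is easily repaid, so the linear model scores lower; selecting among several candidate models this way is what the abstract means by accounting for model uncertainty via BIC.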