Results 1-10 of 923
Constructive updating/downdating of oblique projectors
2006
Abstract: "... A generalization of the Gram-Schmidt procedure is achieved by providing equations for updating and downdating oblique projectors. The work is motivated by the problem of adaptive signal representation outside the orthogonal basis setting. The proposed techniques are shown to be relevant to the probl ..."
Cited by 7 (7 self)
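The abstract does not give the update equations themselves, but the object being updated has a standard closed form: the oblique projector onto range(V) along the orthogonal complement of range(W) is P = V (W^T V)^{-1} W^T, which reduces to the orthogonal projector when W = V. A minimal NumPy sketch of that formula (the random V and W are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((5, 2))   # columns span the target subspace
W = rng.standard_normal((5, 2))   # columns define the oblique measurement directions

# Oblique projector onto range(V) along the orthogonal complement of range(W):
# P = V (W^T V)^{-1} W^T
P = V @ np.linalg.solve(W.T @ V, W.T)

assert np.allclose(P @ P, P)      # idempotent, as every projector must be
assert np.allclose(P @ V, V)      # fixes the target subspace
```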
A rank-revealing method with updating, downdating and applications
SIAM J. Matrix Anal. Appl.
Abstract: "... A new rank-revealing method is proposed. For a given matrix and a threshold for near-zero singular values, by employing a globally convergent iterative scheme as well as a deflation technique the method calculates approximate singular values below the threshold one by one and returns the approximate rank of the matrix along with an orthonormal basis for the approximate null space. When a row or column is inserted or deleted, algorithms for updating/downdating the approximate rank and null space are straightforward, stable and efficient. Numerical results exhibiting the advantages of our code ..."
Cited by 29 (7 self)
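For comparison, the SVD baseline that such methods aim to avoid is easy to state: threshold the singular values and read the approximate null space off the trailing right singular vectors. A sketch of that baseline (the 6x6 test matrix and tolerance are illustrative; the paper's iterative deflation scheme is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
# Build a 6x6 matrix whose two smallest singular values are (near) zero.
Q1, _ = np.linalg.qr(rng.standard_normal((6, 6)))
Q2, _ = np.linalg.qr(rng.standard_normal((6, 6)))
A = Q1 @ np.diag([5.0, 3.0, 2.0, 1.0, 1e-12, 0.0]) @ Q2.T

tol = 1e-8
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > tol))        # approximate rank: singular values above the threshold
null_basis = Vt[rank:].T           # orthonormal basis of the approximate null space

assert rank == 4
assert np.allclose(A @ null_basis, 0.0, atol=1e-6)
```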
Algorithm 887: CHOLMOD, supernodal sparse Cholesky factorization and update/downdate
ACM Transactions on Mathematical Software, 2008
Abstract: "... CHOLMOD is a set of routines for factorizing sparse symmetric positive definite matrices of the form A or AA^T, updating/downdating a sparse Cholesky factorization, solving linear systems, updating/downdating the solution to the triangular system Lx = b, and many other sparse matrix functions for b ..."
Cited by 109 (8 self)
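CHOLMOD itself operates on sparse matrices, but the dense analogue of one of its core operations, a rank-1 update of a Cholesky factor in O(n^2) instead of a fresh O(n^3) factorization, can be sketched in a few lines. This is the classic Givens-style dense scheme, not CHOLMOD's sparse supernodal code:

```python
import numpy as np

def chol_update(L, x):
    """Given lower-triangular L with A = L L^T, return L' with
    A + x x^T = L' L'^T, via a sweep of Givens-like rotations."""
    L, x = L.copy(), x.copy()
    n = x.size
    for k in range(n):
        r = np.hypot(L[k, k], x[k])            # new diagonal entry
        c, s = r / L[k, k], x[k] / L[k, k]     # rotation parameters
        L[k, k] = r
        if k + 1 < n:
            L[k+1:, k] = (L[k+1:, k] + s * x[k+1:]) / c
            x[k+1:] = c * x[k+1:] - s * L[k+1:, k]
    return L

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)                    # symmetric positive definite
L = np.linalg.cholesky(A)
x = rng.standard_normal(4)

L1 = chol_update(L, x)
assert np.allclose(L1 @ L1.T, A + np.outer(x, x))
```

The downdate A - x x^T is analogous but uses hyperbolic rotations and, as the stability literature below discusses, is the numerically delicate direction.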
Dynamic supernodes in sparse Cholesky update/downdate and triangular solves
ACM Trans. Math. Software, 2006
Abstract: "... The supernodal method for sparse Cholesky factorization represents the factor L as a set of supernodes, each consisting of a contiguous set of columns of L with identical nonzero pattern. A conventional supernode is stored as a dense submatrix. While this is suitable for sparse Cholesky factorization where the nonzero pattern of L does not change, it is not suitable for methods that modify a sparse Cholesky factorization after a low-rank change to A (an update/downdate, A = A ± WW^T). Supernodes merge and split apart during an update/downdate. Dynamic supernodes are introduced, which allow a ..."
Cited by 30 (10 self)
A RANK-REVEALING METHOD WITH UPDATING, DOWNDATING, AND APPLICATIONS. PART II
Abstract: "... As one of the basic problems in matrix computation, rank-revealing arises in a wide variety of applications in scientific computing. Although the singular value decomposition is the standard rank-revealing method, it is costly in both computing time and storage when the rank or the nullity is low, and it is inefficient in updating and downdating when rows and columns are inserted or deleted. Consequently, alternative methods are in demand in those situations. Following up on a recent rank-revealing algorithm by Li and Zeng for the low nullity case, we present a new rank-revealing ..."
Cited by 1 (1 self)
Parallel Methods in Updating/Downdating Problems of the Latent Semantic Indexing
2006
Abstract: "... In the case of large databases, which are encoded on some sort of a parallel computer (e.g., a supercomputer or a cluster of personal computers), it is frequently needed to update or downdate either documents or terms. This task can be done in parallel based on the theory of the Singular V ..."
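The abstract is cut off mid-sentence, but the operation in question is the SVD-based update of a Latent Semantic Indexing model. The simplest sequential variant, "folding in" a new document without recomputing the SVD, looks like this (the term-document matrix is a random placeholder, and this is the classic approximation, not the paper's parallel algorithm):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.random((8, 5))                   # term-document matrix: 8 terms, 5 documents
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk = U[:, :k], s[:k]                 # rank-k LSI concept space

# Fold in a new document d: project it into the k-dimensional concept space
# via d_hat = Sigma_k^{-1} U_k^T d, without touching the existing factorization.
d = rng.random(8)
d_hat = (Uk.T @ d) / sk

assert d_hat.shape == (k,)
```

Folding in is cheap but lets the factorization drift from the true SVD as documents accumulate, which is why genuine updating/downdating algorithms are studied.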
On the Stability of Sequential Updates and Downdates
IEEE Transactions on Signal Processing, 1994
Abstract: "... The updating and downdating of QR decompositions has important applications in a number of areas. There is essentially one standard updating algorithm, based on plane rotations, which is backwards stable. Three downdating algorithms have been treated in the literature: the LINPACK algorithm, the met ..."
Cited by 12 (0 self)
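SciPy exposes this pair of operations directly: `scipy.linalg.qr_insert` performs a rotation-based update and `scipy.linalg.qr_delete` a downdate. The example below assumes those SciPy routines, which implement one particular choice among the algorithms such stability analyses compare:

```python
import numpy as np
from scipy.linalg import qr, qr_insert, qr_delete

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 3))
Q, R = qr(A)

# Update: append a new observation (row) u without refactorizing from scratch.
u = rng.standard_normal(3)
Q1, R1 = qr_insert(Q, R, u, 6, which='row')
assert np.allclose(Q1 @ R1, np.vstack([A, u]))

# Downdate: remove the first row again.
Q2, R2 = qr_delete(Q1, R1, 0, which='row')
assert np.allclose(Q2 @ R2, np.vstack([A, u])[1:])
```

As the abstract notes, updating is backward stable while downdating is the hazardous direction; the deleted row's information must be "subtracted" from R, which can lose accuracy when R is ill-conditioned.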
Bundle Adjustment - A Modern Synthesis
Vision Algorithms: Theory and Practice, LNCS, 2000
Abstract: "... This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics covered include: the choice of cost function and robustness; numerical optimization including sparse Newton methods, linearly convergent approximations, updating and recursive methods; gauge (datum) invariance; and quality control. The theory is developed for general robust cost functions rather than ..."
Cited by 555 (12 self)
Least angle regression
Ann. Statist.
Abstract: "... The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising ..."
Cited by 1308 (43 self)
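Property (1), the Lasso-via-LARS connection, is implemented directly in scikit-learn's `LassoLars`. A small sketch on synthetic data with a two-variable true support (the data and penalty value are illustrative, not from the paper):

```python
import numpy as np
from sklearn.linear_model import LassoLars

rng = np.random.default_rng(5)
n, p = 50, 6
X = rng.standard_normal((n, p))
beta = np.array([3.0, -2.0, 0.0, 0.0, 0.0, 0.0])   # only two active covariates
y = X @ beta + 0.01 * rng.standard_normal(n)

# The LARS-based solver traces the whole Lasso regularization path;
# with a moderate penalty it recovers the sparse support here.
model = LassoLars(alpha=0.1).fit(X, y)
active = np.flatnonzero(model.coef_)

assert set(active) == {0, 1}
```

`LassoLars(alpha=0.0)` gives the full path down to ordinary least squares, which is the "all possible Lasso estimates" property the abstract highlights.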
UPDATING AND DOWNDATING TECHNIQUES FOR OPTIMIZING NETWORK COMMUNICABILITY
Abstract: "... The total communicability of a network (or graph) is defined as the sum of the entries in the exponential of the adjacency matrix of the network, possibly normalized by the number of nodes. This quantity offers a good measure of how easily information spreads across the network, and can be ... sparse, and yet have a large total communicability. These methods are based on updating, downdating and rewiring techniques that take into account the change in total communicability resulting from the addition or deletion of an edge. To this end, we introduce new edge centrality measures which can ..."
Cited by 2 (2 self)
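The quantity itself is easy to compute on a small graph: total communicability is the sum of the entries of the matrix exponential of the adjacency matrix, and an edge "update" can only increase it, since every entry of e^A counts weighted walks and a new edge adds walks. A sketch (the 4-node path graph is illustrative):

```python
import numpy as np
from scipy.linalg import expm

# Path graph 1-2-3-4 as an adjacency matrix.
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

def total_communicability(A):
    return expm(A).sum()          # sum of all entries of e^A

before = total_communicability(A)

# Update: add the edge (1, 4), closing the path into a cycle.
A2 = A.copy()
A2[0, 3] = A2[3, 0] = 1.0
after = total_communicability(A2)

assert after > before             # adding an edge increases total communicability
```

The updating/downdating question the paper studies is which edge to add (or delete) for the largest change, without recomputing the full matrix exponential each time.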