Results 1–10 of 109
A multilinear singular value decomposition
 SIAM J. Matrix Anal. Appl.
, 2000
Abstract

Cited by 472 (22 self)
Abstract. We discuss a multilinear generalization of the singular value decomposition. There is a strong analogy between several properties of the matrix and the higher-order tensor decomposition; uniqueness, link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed. We investigate how tensor symmetries affect the decomposition and propose a multilinear generalization of the symmetric eigenvalue decomposition for pairwise symmetric tensors.
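The higher-order SVD described in this abstract can be sketched numerically: SVD each mode-n unfolding to get the factor matrices, then form the core tensor by mode multiplications. This is a minimal illustration (the helper names and the toy tensor are not from the paper):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: mode-`mode` fibers become the columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, U, mode):
    """Mode-n product T x_n U, applied along axis `mode`."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T):
    """Higher-order SVD: orthogonal factor per mode plus an all-orthogonal core."""
    Us = [np.linalg.svd(unfold(T, n))[0] for n in range(T.ndim)]
    S = T
    for n, U in enumerate(Us):
        S = mode_mult(S, U.T, n)      # core S = T x1 U1^T x2 U2^T x3 U3^T
    return S, Us

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 4, 5))
S, Us = hosvd(T)

# exact reconstruction: T = S x1 U1 x2 U2 x3 U3
R = S
for n, U in enumerate(Us):
    R = mode_mult(R, U, n)
print(np.allclose(R, T))
```

The "all-orthogonal" property mentioned in the paper shows up here as each unfolding of the core having mutually orthogonal rows.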
The Complex Structured Singular Value
, 1993
Abstract

Cited by 192 (14 self)
A tutorial introduction to the complex structured singular value (µ) is presented, with an emphasis on the mathematical aspects of µ. The µ-based methods discussed here have been useful for analyzing the performance and robustness properties of linear feedback systems. Several tests ...
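µ itself is hard to compute, but the standard bounds ρ(M) ≤ µ(M) ≤ min_D σ̄(D M D⁻¹) from the µ literature are easy to illustrate. A coarse numerical sketch (the matrix and the one-parameter D-scaling grid search are illustrative stand-ins for the convex D-iteration, not the paper's method):

```python
import numpy as np

# Bounds on mu for a diagonal uncertainty structure (two independent
# complex scalar blocks): spectral radius below, scaled norm above.
M = np.array([[1.0, 2.0], [0.5, -1.0]])

rho = max(abs(np.linalg.eigvals(M)))   # lower bound rho(M) <= mu(M)
sigma = np.linalg.norm(M, 2)           # unscaled upper bound sigma_max(M)

# crude search over D = diag(d, 1), d > 0, for the scaled upper bound
ds = np.logspace(-3, 3, 601)
scaled = [np.linalg.norm(np.diag([d, 1.0]) @ M @ np.diag([1.0 / d, 1.0]), 2)
          for d in ds]
upper = min(scaled)
print(rho, upper, sigma)  # rho <= mu <= upper <= sigma
```

For this particular M the scaled upper bound nearly meets the lower bound, pinning µ down tightly; in general the gap can be nonzero for three or more blocks.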
Eigenvalues, and instabilities of solitary waves.
 Philos. Trans. R. Soc. Lond. Ser. A
, 1992
Balancing For Nonlinear Systems
 Systems & Control Letters
, 1993
Abstract

Cited by 97 (17 self)
We present a method of balancing for nonlinear systems which is an extension of balancing for linear systems in the sense that it is based on the input and output energy of a system. We deal with the input and output energy functions of a stable nonlinear system and propose a method to use these functions to get a balanced form for a stable nonlinear system. It is a local result, but gives 'broader' results than we obtain by just linearizing the system.
Keywords: balancing, nonlinear systems, energy functions, Hamilton-Jacobi equations, Hankel singular values.
1 Introduction
Balancing for linear systems is a well-known subject on which there has been a lot of research in the last decade. It started with a paper of Moore [6] in 1981, where balancing is introduced with the aim of using it as a tool for model reduction. If a linear system is in balanced form, the Hankel singular values are a measure for the importance of state components. This means that the influence of the correspondin...
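The linear case the abstract builds on (Moore's balancing) can be sketched directly: solve the two Lyapunov equations for the controllability and observability Gramians, and take the Hankel singular values as the square roots of the eigenvalues of their product. A minimal example with an illustrative stable (A, B, C):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Stable 2-state system (toy values, chosen for illustration only)
A = np.array([[-1.0, 0.0], [1.0, -2.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.0, 1.0]])

# Gramians: A P + P A^T + B B^T = 0  and  A^T Q + Q A + C^T C = 0
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

# Hankel singular values: sqrt of eigenvalues of P Q
hsv = np.sqrt(np.linalg.eigvals(P @ Q).real)
print(np.sort(hsv)[::-1])  # largest first; small values mark reducible states
```

In balanced coordinates both Gramians equal diag(hsv), which is what makes truncating the states with small Hankel singular values a principled model-reduction step.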
Level set approach to mean curvature flow in arbitrary codimension
 J. Differential Geom
, 1996
Large-Scale Optimization of Eigenvalues
 SIAM J. Optimization
, 1991
Abstract

Cited by 84 (3 self)
Optimization problems involving eigenvalues arise in many applications. Let x be a vector of real parameters and let A(x) be a continuously differentiable symmetric matrix function of x. We consider a particular problem which occurs frequently: the minimization of the maximum eigenvalue of A(x), subject to linear constraints and bounds on x. The eigenvalues of A(x) are not differentiable at points x where they coalesce, so the optimization problem is said to be nonsmooth. Furthermore, it is typically the case that the optimization objective tends to make eigenvalues coalesce at a solution point. There are three main purposes of the paper. The first is to present a clear and self-contained derivation of the Clarke generalized gradient of the max eigenvalue function in terms of a "dual matrix". The second purpose is to describe a new algorithm, based on the ideas of a previous paper by the author (SIAM J. Matrix Anal. Appl. 9 (1988) 256–268), which is suitable for solving l...
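At a point where the maximum eigenvalue of A(x) is simple, the "dual matrix" in the generalized gradient reduces to u u^T for the top unit eigenvector u, giving the ordinary gradient ∂λmax/∂x_i = u^T A_i u for an affine family A(x) = A0 + x1 A1 + x2 A2. A sketch with illustrative matrices (A0, A1, A2 are random stand-ins, not from the paper), checked against finite differences:

```python
import numpy as np

rng = np.random.default_rng(1)
def sym(M):
    return (M + M.T) / 2

A0, A1, A2 = (sym(rng.standard_normal((4, 4))) for _ in range(3))

def lam_max(x):
    """Maximum eigenvalue of the affine family A(x) = A0 + x1*A1 + x2*A2."""
    return np.linalg.eigvalsh(A0 + x[0] * A1 + x[1] * A2)[-1]

x = np.array([0.3, -0.7])
w, V = np.linalg.eigh(A0 + x[0] * A1 + x[1] * A2)
u = V[:, -1]                               # eigenvector of the max eigenvalue
grad = np.array([u @ A1 @ u, u @ A2 @ u])  # dual-matrix gradient formula

# central finite-difference check of both components
h = 1e-6
fd = np.array([(lam_max(x + h * e) - lam_max(x - h * e)) / (2 * h)
               for e in np.eye(2)])
print(np.allclose(grad, fd, atol=1e-4))
```

Where eigenvalues coalesce, the single outer product u u^T is replaced by a set of dual matrices, which is exactly the nonsmoothness the paper's algorithm is built to handle.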
Optimality Conditions and Duality Theory for Minimizing Sums of the Largest Eigenvalues of Symmetric Matrices
, 1993
Abstract

Cited by 69 (4 self)
This paper gives max characterizations for the sum of the largest eigenvalues of a symmetric matrix. The elements which achieve the maximum provide a concise characterization of the generalized gradient of the eigenvalue sum in terms of a dual matrix. The dual matrix provides the information required to either verify first-order optimality conditions at a point or to generate a descent direction for the eigenvalue sum from that point, splitting a multiple eigenvalue if necessary. A model minimization algorithm is outlined, and connections with the classical literature on sums of eigenvalues are explained. Sums of the largest eigenvalues in absolute value are also addressed.
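One classical max characterization behind this abstract is that the sum of the k largest eigenvalues of a symmetric A equals max{ tr(V^T A V) : V has k orthonormal columns }, attained when V spans the leading invariant subspace. A small numerical check (random A and random trial subspaces, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 6, 2
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

w, U = np.linalg.eigh(A)       # eigenvalues in ascending order
sum_k = w[-k:].sum()           # sum of the k largest eigenvalues

# the maximizer: the k leading eigenvectors attain the bound exactly
V_star = U[:, -k:]
print(np.trace(V_star.T @ A @ V_star), sum_k)

# random orthonormal k-frames never exceed it
for _ in range(200):
    V = np.linalg.qr(rng.standard_normal((n, k)))[0]
    assert np.trace(V.T @ A @ V) <= sum_k + 1e-10
```

The maximizing V (or, with a multiple k-th eigenvalue, the whole set of maximizers) is what packages into the paper's dual matrix.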
Derivatives of Spectral Functions
, 1996
Abstract

Cited by 67 (13 self)
A spectral function of a Hermitian matrix X is a function which depends only on the eigenvalues of X, λ1(X) ≥ λ2(X) ≥ … ≥ λn(X), and hence may be written f(λ1(X), λ2(X), …, λn(X)) for some symmetric function f. Such functions appear in a wide variety of matrix optimization problems. We give a simple proof that this spectral function is differentiable at X if and only if the function f is differentiable at the vector λ(X), and we give a concise formula for the derivative. We then apply this formula to deduce an analogous expression for the Clarke generalized gradient of the spectral function. A similar result holds for real symmetric matrices.
1 Introduction and notation
Optimization problems involving a symmetric matrix variable, X say, frequently involve symmetric functions of the eigenvalues of X in the objective or constraints. Examples include the maximum eigenvalue of X, or log(det X) (for positive definite X), or eigenvalue constraints such as positive semidefinit...
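The concise derivative formula the abstract refers to is ∇F(X) = U diag(∇f(λ(X))) U^T, where X = U diag(λ) U^T. It is easy to sanity-check on a case where the gradient is known in closed form: with f(λ) = Σ λi², F(X) = tr(X²) and ∇F(X) = 2X. A sketch with an illustrative random symmetric X:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 4))
X = (X + X.T) / 2              # symmetric test matrix

lam, U = np.linalg.eigh(X)     # X = U diag(lam) U^T
grad_f = 2 * lam               # gradient of f(lam) = sum(lam_i^2)

# spectral-function derivative formula: conjugate grad_f back by U
G = U @ np.diag(grad_f) @ U.T
print(np.allclose(G, 2 * X))   # agrees with the direct gradient of tr(X^2)
```

The same recipe gives, e.g., the familiar gradient X⁻¹ for log(det X) by taking grad_f = 1/λ instead.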
A general method for errors-in-variables problems in computer vision.
 Proc. CVPR IEEE, 2:2018
, 2021