Results 1–10 of 54
Structured polynomial eigenvalue problems: Good vibrations from good linearizations
SIAM J. Matrix Anal. Appl.
Abstract

Cited by 73 (23 self)
Abstract. Many applications give rise to nonlinear eigenvalue problems with an underlying structured matrix polynomial. In this paper several useful classes of structured polynomials (e.g., palindromic, even, odd) are identified and the relationships between them explored. A special class of linearizations that reflect the structure of these polynomials, and therefore preserve symmetries in their spectra, is introduced and investigated. We analyze the existence and uniqueness of such linearizations, and show how they may be systematically constructed.
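As a small numerical illustration of the spectral symmetry mentioned in this abstract (the example is not from the paper): a T-palindromic quadratic P(λ) = λ²A₂ + λA₁ + A₀ with A₂ = A₀ᵀ and A₁ symmetric has a spectrum invariant under λ ↦ 1/λ, which we can verify by solving a random instance via the standard first companion linearization.

```python
import numpy as np
from scipy.linalg import eig

# T-palindromic quadratic P(lam) = lam^2*A.T + lam*B + A with B symmetric,
# so that rev P(lam)^T = P(lam); its spectrum is symmetric under lam -> 1/lam.
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
B = B + B.T                                    # middle coefficient must be symmetric
A0, A1, A2 = A, B, A.T                         # palindromic: A2 = A0^T

# first companion linearization L(lam) = lam*X + Y, with eigenvector z = [lam*x; x]
Z0 = np.zeros((n, n))
X = np.block([[A2, Z0], [Z0, np.eye(n)]])
Y = np.block([[A1, A0], [-np.eye(n), Z0]])
lams = eig(-Y, X, right=False)                 # L(lam)z = 0  <=>  -Y z = lam X z

# check the (lam, 1/lam) pairing of the computed spectrum
pair_err = max(np.min(np.abs(lams - 1 / lam)) for lam in lams)
print(pair_err)
```

Note that this companion linearization does not itself preserve the palindromic structure; constructing linearizations that do is precisely the subject of the paper.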
Symmetric linearizations for matrix polynomials
SIAM J. Matrix Anal. Appl.
, 2006
Abstract

Cited by 54 (19 self)
A standard way of treating the polynomial eigenvalue problem P(λ)x = 0 is to convert it into an equivalent matrix pencil—a process known as linearization. Two vector spaces of pencils L1(P) and L2(P), and their intersection DL(P), have recently been defined and studied by Mackey, Mackey, Mehl, and Mehrmann. The aim of our work is to gain new insight into these spaces and the extent to which their constituent pencils inherit structure from P. For arbitrary polynomials we show that every pencil in DL(P) is block symmetric and we obtain a convenient basis for DL(P) built from block Hankel matrices. This basis is then exploited to prove that the first deg(P) pencils in a sequence constructed by Lancaster in the 1960s generate DL(P). When P is symmetric, we show that the symmetric pencils in L1(P) comprise DL(P), while for Hermitian P the Hermitian pencils in L1(P) form a proper subset of DL(P) that we explicitly characterize. Almost all pencils in each of these subsets are shown to be linearizations. In addition to obtaining new results, this work provides a self-contained treatment of some of the key properties of DL(P) together with some new, more concise proofs.
Backward error of polynomial eigenproblems solved by linearization
Manchester Institute for Mathematical Sciences, The University of Manchester
, 2006
Abstract

Cited by 43 (11 self)
The most widely used approach for solving the polynomial eigenvalue problem P(λ)x = (∑_{i=0}^{m} λ^i A_i)x = 0 with n × n matrices A_i is to linearize to produce a larger order pencil L(λ) = λX + Y, whose eigensystem is then found by any method for generalized eigenproblems. For a given polynomial P, infinitely many linearizations L exist and approximate eigenpairs of P computed via linearization can have widely varying backward errors. We show that if a certain one-sided factorization relating L to P can be found, then a simple formula permits recovery of right eigenvectors of P from those of L, and the backward error of an approximate eigenpair of P can be bounded in terms of the backward error for the corresponding approximate eigenpair of L. A similar factorization has the same implications for left eigenvectors. We use this technique to derive backward error bounds depending only on the norms of the A_i for the companion pencils and for the vector space DL(P) of pencils recently identified by Mackey, Mackey, Mehl, and Mehrmann. In all cases, sufficient conditions are identified for an optimal backward error for P. These results are shown to be entirely consistent with those of Higham, Mackey, and Tisseur on the conditioning of linearizations of P. Other contributions of this work are a block scaling of the companion pencils
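The linearize-then-solve approach described here is easy to sketch. The following (an illustration, not the paper's code) builds the standard first companion pencil for a degree-m matrix polynomial, solves the generalized eigenproblem with SciPy, recovers a right eigenvector of P from the bottom block of the pencil eigenvector, and evaluates a normwise relative residual of the kind the backward-error analysis bounds.

```python
import numpy as np
from scipy.linalg import eig

def companion_pencil(coeffs):
    """First companion linearization of P(lam) = sum_i lam^i * A_i.

    coeffs = [A_0, ..., A_m] (each n x n); returns (X, Y) with
    L(lam) = lam*X + Y of size mn x mn, whose eigenvector for a finite
    eigenvalue is z = [lam^(m-1)*x; ...; lam*x; x].
    """
    m, n = len(coeffs) - 1, coeffs[0].shape[0]
    X = np.zeros((m * n, m * n))
    Y = np.zeros((m * n, m * n))
    X[:n, :n] = coeffs[m]                           # leading coefficient A_m
    for k in range(1, m):
        X[k*n:(k+1)*n, k*n:(k+1)*n] = np.eye(n)
        Y[k*n:(k+1)*n, (k-1)*n:k*n] = -np.eye(n)
    for i in range(m):                              # top block row: A_{m-1}, ..., A_0
        Y[:n, i*n:(i+1)*n] = coeffs[m-1-i]
    return X, Y

rng = np.random.default_rng(1)
n, m = 3, 2
coeffs = [rng.standard_normal((n, n)) for _ in range(m + 1)]
X, Y = companion_pencil(coeffs)
lams, Z = eig(-Y, X)                                # L(lam)z = 0 <=> -Y z = lam X z
# right eigenvectors of P are read off from the bottom n entries of z
lam, x = lams[0], Z[-n:, 0]
P = sum(lam**i * Ai for i, Ai in enumerate(coeffs))
backward_err = np.linalg.norm(P @ x) / (
    sum(abs(lam)**i * np.linalg.norm(Ai, 2) for i, Ai in enumerate(coeffs))
    * np.linalg.norm(x))
print(backward_err)
```

For a well-scaled random problem this residual sits at roundoff level; the point of the paper is to bound it for any pencil in the families considered, not just the companion forms.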
Linearizations of singular matrix polynomials and the recovery of minimal indices
, 2009
Abstract

Cited by 23 (10 self)
A standard way of dealing with a regular matrix polynomial P(λ) is to convert it into an equivalent matrix pencil – a process known as linearization. Two vector spaces of pencils L1(P) and L2(P) that generalize the first and second companion forms have recently been introduced by Mackey, Mackey, Mehl and Mehrmann. Almost all of these pencils are linearizations for P(λ) when P is regular. The goal of this work is to show that most of the pencils in L1(P) and L2(P) are still linearizations when P(λ) is a singular square matrix polynomial, and that these linearizations can be used to obtain the complete eigenstructure of P(λ), comprised not only of the finite and infinite eigenvalues but also, for singular polynomials, of the left and right minimal indices and minimal bases. We show explicitly how to recover the minimal indices and bases of the polynomial P(λ) from the minimal indices and bases of linearizations in L1(P) and L2(P). As a consequence of the recovery formulae for minimal indices, we prove that the vector space DL(P) = L1(P) ∩ L2(P) will never contain any linearization for a square singular polynomial P(λ). Finally, the results are extended to other linearizations of singular polynomials defined in terms of more general polynomial bases.
Scaling, sensitivity and stability in the numerical solution of quadratic eigenvalue problems
Internat. J. Numer. Methods Eng.
, 2006
Abstract

Cited by 22 (9 self)
The most common way of solving the quadratic eigenvalue problem (QEP) (λ²M + λD + K)x = 0 is to convert it into a linear problem (λX + Y)z = 0 of twice the dimension and solve the linear problem by the QZ algorithm or a Krylov method. In doing so, it is important to understand the influence of the linearization process on the accuracy and stability of the computed solution. We discuss these issues for three particular linearizations: the standard companion linearization and two linearizations that preserve symmetry in the problem. For illustration we employ a model QEP describing the motion of a beam simply supported at both ends and damped at the midpoint. We show that the above linearizations lead to poor numerical results for the beam problem, but that a two-parameter scaling proposed by Fan, Lin and Van Dooren cures the instabilities. We also show that half of the eigenvalues of the beam QEP are pure imaginary and are eigenvalues of the undamped problem. Our analysis makes use of recently developed theory explaining the sensitivity and stability of linearizations, the main conclusions of which are summarized. As well as arguing that scaling should routinely be used, we give guidance on how to choose a linearization and illustrate the practical value of condition numbers and backward errors. Key words: quadratic eigenvalue problem, sensitivity, condition number, backward error, stability
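A sketch of the workflow this abstract describes, with the two-parameter scaling applied before linearization. The parameter formulas below (γ = √(‖K‖/‖M‖), δ = 2/(‖K‖ + γ‖D‖), with λ = γμ) are the commonly quoted form of the Fan, Lin and Van Dooren scaling and should be checked against the original paper; the rest is a standard companion-pencil solve.

```python
import numpy as np
from scipy.linalg import eig, norm

def solve_qep_scaled(M, D, K):
    """Solve (lam^2 M + lam D + K)x = 0 via companion linearization,
    after a two-parameter scaling in the spirit of Fan, Lin and Van Dooren
    (parameter formulas assumed, see lead-in): the quadratic in mu with
    lam = gamma*mu and coefficients scaled by delta has coefficient norms
    balanced near 1, which improves the stability of the QZ solve.
    """
    n = M.shape[0]
    gamma = np.sqrt(norm(K, 2) / norm(M, 2))        # eigenvalue scaling lam = gamma*mu
    delta = 2.0 / (norm(K, 2) + gamma * norm(D, 2)) # overall coefficient scaling
    Ms, Ds, Ks = gamma**2 * delta * M, gamma * delta * D, delta * K
    Z0 = np.zeros((n, n))
    X = np.block([[Ms, Z0], [Z0, np.eye(n)]])       # first companion pencil
    Y = np.block([[Ds, Ks], [-np.eye(n), Z0]])
    mus, Z = eig(-Y, X)                             # L(mu)z = 0
    return gamma * mus, Z[n:, :]                    # eigenvalues and eigenvectors of P

rng = np.random.default_rng(2)
n = 4
M, D, K = (rng.standard_normal((n, n)) for _ in range(3))
lams, V = solve_qep_scaled(M, D, K)
lam, x = lams[0], V[:, 0]
rel_res = norm((lam**2 * M + lam * D + K) @ x) / (
    (abs(lam)**2 * norm(M, 2) + abs(lam) * norm(D, 2) + norm(K, 2)) * norm(x))
print(rel_res)
```

The relative residual computed at the end is exactly the backward-error measure the paper uses to compare scaled and unscaled linearizations.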
Detecting and solving hyperbolic quadratic eigenvalue problems
, 2007
Fiedler companion linearizations and the recovery of minimal indices
, 2010
Abstract

Cited by 19 (11 self)
A standard way of dealing with a matrix polynomial P(λ) is to convert it into an equivalent matrix pencil – a process known as linearization. For any regular matrix polynomial, a new family of linearizations generalizing the classical first and second Frobenius companion forms has recently been introduced by Antoniou and Vologiannidis, extending some linearizations previously defined by Fiedler for scalar polynomials. We prove that these pencils are linearizations even when P(λ) is a singular square matrix polynomial, and show explicitly how to recover the left and right minimal indices and minimal bases of the polynomial P(λ) from the minimal indices and bases of these linearizations. In addition, we provide a simple way to recover the eigenvectors of a regular polynomial from those of any of these linearizations, without any computational cost. The existence of an eigenvector recovery procedure is essential for a linearization to be relevant for applications.
Definite matrix polynomials and their linearization by definite pencils
 Manchester Institute for Mathematical Sciences, The University of Manchester
, 2008
Abstract

Cited by 14 (7 self)
Abstract. Hyperbolic matrix polynomials are an important class of Hermitian matrix polynomials that contain overdamped quadratics as a special case. They share with definite pencils the spectral property that their eigenvalues are real and semisimple. We extend the definition of hyperbolic matrix polynomial in a way that relaxes the requirement of definiteness of the leading coefficient matrix, yielding what we call definite polynomials. We show that this class of polynomials has an elegant characterization in terms of definiteness intervals on the extended real line, and that it includes definite pencils as a special case. A fundamental question is whether a definite matrix polynomial P can be linearized in a structure-preserving way. We show that the answer to this question is affirmative: P is definite if and only if it has a definite linearization in H(P), a certain vector space of Hermitian pencils; and for definite P we give a complete characterization of all the linearizations in H(P) that are definite. For the important special case of quadratics, we show how a definite quadratic polynomial can be transformed into a definite linearization with a positive definite leading coefficient matrix—a form that is particularly attractive numerically.
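The overdamped case mentioned in this abstract is easy to demonstrate numerically (a constructed example, not from the paper): a quadratic with M, D, K symmetric positive definite is overdamped when (xᵀDx)² > 4(xᵀMx)(xᵀKx) for all x ≠ 0, which is guaranteed by taking D = cI with c² > 4‖M‖‖K‖. Its 2n eigenvalues are then real and negative, as a companion-pencil solve confirms.

```python
import numpy as np
from scipy.linalg import eig, norm

rng = np.random.default_rng(3)
n = 4

def spd():
    # random symmetric positive definite matrix
    Q = rng.standard_normal((n, n))
    return Q @ Q.T + np.eye(n)

M, K = spd(), spd()
c = 2.2 * np.sqrt(norm(M, 2) * norm(K, 2))     # safely past the overdamping threshold
D = c * np.eye(n)

# companion linearization of lam^2*M + lam*D + K
Z0 = np.zeros((n, n))
X = np.block([[M, Z0], [Z0, np.eye(n)]])
Y = np.block([[D, K], [-np.eye(n), Z0]])
lams = eig(-Y, X, right=False)

max_imag = np.max(np.abs(lams.imag))           # hyperbolic => all eigenvalues real
max_real = np.max(lams.real)                   # M, D, K > 0  => all in left half-plane
print(max_imag, max_real)
```

The companion pencil used here is not Hermitian; producing a definite (hence Hermitian) linearization for such a polynomial is what the paper characterizes.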
Trimmed linearizations for structured matrix polynomials
, 2008
Abstract

Cited by 14 (3 self)
Dedicated to Richard S. Varga on the occasion of his 80th birthday. We discuss the eigenvalue problem for general and structured matrix polynomials which may be singular and may have eigenvalues at infinity. We derive condensed forms that allow (partial) deflation of the infinite eigenvalue and singular structure of the matrix polynomial. The remaining reduced order staircase form leads to new types of linearizations which determine the finite eigenvalues and corresponding eigenvectors. The new linearizations also simplify the construction of structure preserving linearizations.