CiteSeerX

Results 1 - 10 of 1,762

The mathematics of eigenvalue optimization

by A. S. Lewis - MATHEMATICAL PROGRAMMING
"... Optimization problems involving the eigenvalues of symmetric and nonsymmetric matrices present a fascinating mathematical challenge. Such problems arise often in theory and practice, particularly in engineering design, and are amenable to a rich blend of classical mathematical techniques and contemp ..."
Abstract - Cited by 115 (11 self) - Add to MetaCart
Optimization problems involving the eigenvalues of symmetric and nonsymmetric matrices present a fascinating mathematical challenge. Such problems arise often in theory and practice, particularly in engineering design, and are amenable to a rich blend of classical mathematical techniques

Distance metric learning with eigenvalue optimization

by Yiming Ying, Peng Li, Sören Sonnenburg, Francis Bach, Cheng Soon Ong - Journal of Machine Learning Research (Special Topics on Kernel and Metric Learning), 2012
"... The main theme of this paper is to develop a novel eigenvalue optimization framework for learning a Mahalanobis metric. Within this context, we introduce a novel metric learning approach called DML-eig which is shown to be equivalent to a well-known eigenvalue optimization problem called minimizing ..."
Cited by 46 (2 self)
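
A minimal sketch of the quantity being learned, assuming NumPy: evaluating a Mahalanobis distance for a given positive semidefinite matrix M. This is illustrative only and is not the DML-eig training procedure; the matrix M and the sample points are placeholders.

    # Sketch: Mahalanobis distance d_M(x, y) = sqrt((x - y)^T M (x - y))
    # for a given PSD matrix M (placeholder; DML-eig would learn M).
    import numpy as np

    def mahalanobis(x, y, M):
        """Distance induced by the PSD matrix M."""
        d = x - y
        return float(np.sqrt(d @ M @ d))

    M = np.eye(3)                       # identity recovers the Euclidean distance
    x = np.array([1.0, 0.0, 2.0])
    y = np.array([0.0, 1.0, 2.0])
    print(mahalanobis(x, y, M))         # sqrt(2) under the identity metric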

On Eigenvalue Optimization

by Michael K. H. Fan, Alexander Shapiro
"... Abstract. In this paper we study optimization problems involving eigenvalues of symmetric matrices. One of the difficulties with numerical analysis of such problems is that the eigenvalues, considered as functions of a symmetric matrix, are not differentiable at those points where they coalesce. We ..."
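
The coalescence issue the abstract describes can be seen in a two-by-two example (an illustration, not taken from the paper): the largest eigenvalue of A(t) = [[1, t], [t, 1]] equals 1 + |t|, which is not differentiable at t = 0, exactly where the two eigenvalues meet.

    # Numeric illustration of the nonsmoothness of the largest eigenvalue
    # at a point where eigenvalues coalesce (t = 0).
    import numpy as np

    def lambda_max(t):
        A = np.array([[1.0, t], [t, 1.0]])
        return np.linalg.eigvalsh(A)[-1]    # eigenvalues in ascending order

    for t in (-0.1, -0.01, 0.0, 0.01, 0.1):
        print(f"t = {t:+.2f}  lambda_max = {lambda_max(t):.4f}")  # kink at t = 0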

On the eigenvalues optimization of beams with damping patches

by Veturia Chiroiu
"... Abstract:- The paper discusses the behavior of beams with external nonlocal damping patches made from traditional and auxetic materials. Unlike ordinary local damping models, the nonlocal damping force is modeled as a weighted average of the velocity field over the spatial domain, determined by a ke ..."
Abstract - Cited by 1 (0 self) - Add to MetaCart
kernel function based on distance measures. The performance with respect to eigenvalues is discussed in order to avoid resonance. The optimization is performed by determining the location of patches from maximizing eigenvalues or gap between them. Key-Words:- eigenvalues, optimization, damping patches

Column Subset Selection, Matrix Factorization, and Eigenvalue Optimization

by Joel A. Tropp, 2008
"... Given a fixed matrix, the problem of column subset selection requests a column submatrix that has favorable spectral properties. Most research from the algorithms and numerical linear algebra communities focuses on a variant called rank-revealing QR, which seeks a well-conditioned collection of columns that spans the (numerical) range of the matrix. The functional analysis literature contains another strand of work on column selection whose algorithmic implications have not been explored. In particular, a celebrated result of Bourgain and Tzafriri demonstrates that each matrix with normalized columns contains a large column submatrix that is exceptionally well conditioned. Unfortunately, standard proofs of this result cannot be regarded as algorithmic. This paper presents ..."
Cited by 20 (1 self)
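
A hedged sketch of the rank-revealing-QR style of column selection that the abstract contrasts with the Bourgain-Tzafriri approach, using SciPy's column-pivoted QR; the matrix, the subset size k, and the conditioning check are illustrative and not the paper's algorithm.

    # Pick k columns of a column-normalized matrix via pivoted QR and
    # inspect how well conditioned the resulting submatrix is.
    import numpy as np
    from scipy.linalg import qr

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 10))
    A /= np.linalg.norm(A, axis=0)          # normalize the columns

    k = 4
    _, _, piv = qr(A, pivoting=True)        # column order chosen by pivoting
    cols = piv[:k]
    print("selected columns:", cols)
    print("condition number:", np.linalg.cond(A[:, cols]))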

Eigenvalue Optimization in C^2 Subdivision and Boundary Subdivision

by Sara M. Grundel, 2011
"... who have given me the necessary joy and energy every day. ..."
Cited by 3 (1 self)

Improved Approximation Algorithms for Maximum Cut and Satisfiability Problems Using Semidefinite Programming

by M. X. Goemans, D. P. Williamson - Journal of the ACM, 1995
"... We present randomized approximation algorithms for the maximum cut (MAX CUT) and maximum 2-satisfiability (MAX 2SAT) problems that always deliver solutions of expected value at least .87856 times the optimal value. These algorithms use a simple and elegant technique that randomly rounds the solution ..."
Cited by 1211 (13 self)
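
A minimal sketch of the random-rounding step the abstract refers to, assuming NumPy. Solving the MAX CUT semidefinite program is taken as given; random unit vectors stand in for the SDP solution, and the toy graph is illustrative.

    # Random-hyperplane rounding: assign each vertex the sign of its
    # vector's projection onto a random direction, then count cut edges.
    import numpy as np

    rng = np.random.default_rng(1)
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # toy unit-weight graph
    n = 4

    V = rng.standard_normal((n, n))                    # placeholder for SDP vectors
    V /= np.linalg.norm(V, axis=1, keepdims=True)      # one unit vector per vertex

    r = rng.standard_normal(n)                         # random hyperplane normal
    x = np.sign(V @ r)                                 # +/-1 side assignments

    cut_value = sum(1 for i, j in edges if x[i] != x[j])
    print("cut value after rounding:", cut_value)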

Eigenvalue Optimization of Structures via Polynomial Semidefinite Programming

by Yoshihiro Kanno, Makoto Ohsaki, 2007
Abstract not found

Randomized Gossip Algorithms

by Stephen Boyd, Arpita Ghosh, Balaji Prabhakar, Devavrat Shah - IEEE TRANSACTIONS ON INFORMATION THEORY, 2006
"... Motivated by applications to sensor, peer-to-peer, and ad hoc networks, we study distributed algorithms, also known as gossip algorithms, for exchanging information and for computing in an arbitrarily connected network of nodes. The topology of such networks changes continuously as new nodes join a ..."
Cited by 532 (5 self)
method that solves the optimization problem over the network. The relation of averaging time to the second largest eigenvalue naturally relates it to the mixing time of a random walk with transition probabilities derived from the gossip algorithm. We use this connection to study the performance
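
A small simulation of the pairwise-averaging idea behind gossip algorithms, assuming NumPy; the ring topology, step count, and update rule are illustrative rather than the authors' algorithm.

    # Each step: a random node averages its value with a random ring
    # neighbor; the global mean is preserved and all values converge to it.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 20
    x = rng.standard_normal(n)
    true_mean = x.mean()

    for _ in range(5000):
        i = int(rng.integers(n))
        j = (i + rng.choice([-1, 1])) % n
        x[i] = x[j] = 0.5 * (x[i] + x[j])

    print("max deviation from the mean:", np.abs(x - true_mean).max())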

Laplacian eigenmaps and spectral techniques for embedding and clustering.

by Mikhail Belkin, Partha Niyogi - Proceedings of Neural Information Processing Systems, 2001
"... Abstract Drawing on the correspondence between the graph Laplacian, the Laplace-Beltrami operator on a manifold, and the connections to the heat equation, we propose a geometrically motivated algorithm for constructing a representation for data sampled from a low dimensional manifold embedded in ..."
Cited by 668 (7 self)
of the manifold on which the data may possibly reside. Recently, there has been some interest (Tenenbaum et al, 2000; ...). The core algorithm is very simple, has a few local computations and one sparse eigenvalue problem. The solution reflects the intrinsic geometric structure of the manifold. The justification
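
A hedged sketch of the pipeline the abstract outlines, assuming NumPy and SciPy: heat-kernel weights on a neighborhood graph, the graph Laplacian L = D - W, and the generalized eigenvalue problem L y = lambda D y. The dense affinity matrix, kernel width, and embedding dimension are illustrative choices, not the authors' settings.

    # Build a heat-kernel affinity matrix, form L = D - W, and take the
    # eigenvectors after the constant one as 2-D embedding coordinates.
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(3)
    X = rng.standard_normal((50, 5))            # toy data: 50 points in R^5
    t = 1.0                                     # heat-kernel width (assumed)

    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / t)                         # dense for brevity; k-NN in practice
    np.fill_diagonal(W, 0.0)

    D = np.diag(W.sum(axis=1))
    L = D - W

    vals, vecs = eigh(L, D)                     # generalized problem L y = lambda D y
    embedding = vecs[:, 1:3]                    # skip the constant eigenvector
    print(embedding.shape)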