Results 11–20 of 1,369
Learning a kernel matrix for nonlinear dimensionality reduction
In Proceedings of the Twenty-First International Conference on Machine Learning (ICML 2004), 2004
Cited by 152 (9 self)
Abstract: We investigate how to learn a kernel matrix for high-dimensional data that lies on or near a low-dimensional manifold. Noting that the kernel matrix implicitly maps the data into a nonlinear feature space, we show how to discover a mapping that “unfolds” the underlying manifold from which the data was sampled. The kernel matrix is constructed by maximizing the variance in feature space subject to local constraints that preserve the angles and distances between nearest neighbors. The main optimization involves an instance of semidefinite programming, a fundamentally different computation from previous algorithms for manifold learning such as Isomap and locally linear embedding. The optimized kernels perform better than polynomial and Gaussian kernels for problems in manifold learning, but worse for problems in large-margin classification. We explain these results in terms of the geometric properties of different kernels and comment on various interpretations of other manifold learning algorithms as kernel methods.
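The final embedding step shared by this method and kernel PCA can be sketched briefly. The snippet below is our illustration, not code from the paper: once a centered kernel (Gram) matrix is available, d-dimensional coordinates are read off its top-d eigenvectors scaled by the square roots of the eigenvalues. Here the Gram matrix is built directly from known low-dimensional points as a stand-in for the SDP-optimized kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))          # latent low-dimensional points
X -= X.mean(axis=0)                   # center so the Gram matrix has zero row sums
K = X @ X.T                           # Gram matrix standing in for the learned kernel

def embed(K, d):
    """Recover d coordinates from a PSD Gram matrix K."""
    w, V = np.linalg.eigh(K)          # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]     # indices of the top-d eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

Y = embed(K, 2)
# Y reproduces X up to rotation/reflection, so the Gram matrices agree:
print(np.allclose(Y @ Y.T, K))        # True
```

Because K has rank 2, the top two eigencomponents reconstruct it exactly; for a kernel learned from data, the eigenvalue decay instead indicates how many dimensions the unfolded manifold needs.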
GloptiPoly: Global Optimization over Polynomials with Matlab and SeDuMi
ACM Trans. Math. Soft., 2002
Cited by 141 (22 self)
Abstract: GloptiPoly is a Matlab/SeDuMi add-on to build and solve convex linear matrix inequality relaxations of the (generally nonconvex) global optimization problem of minimizing a multivariable polynomial function subject to polynomial inequality, equality, or integer constraints. It generates a series of lower bounds monotonically converging to the global optimum. Global optimality is detected, and isolated optimal solutions are extracted automatically. Numerical experiments show that for most of the small- and medium-scale problems described in the literature, the global optimum is reached at low computational cost.
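The lower-bound idea can be illustrated on a one-variable toy problem (our sketch, not GloptiPoly itself): the best lower bound gamma on p(x) = x^2 - 2x + 3 is the largest gamma for which p - gamma is a sum of squares, i.e. for which its Gram matrix in the monomial basis [1, x] is positive semidefinite. Bisection stands in here for the SDP solver.

```python
import numpy as np

def gram(gamma):
    # p(x) - gamma = [1 x] Q [1 x]^T with the Gram matrix Q below
    return np.array([[3.0 - gamma, -1.0],
                     [-1.0,         1.0]])

def is_psd(Q):
    return np.linalg.eigvalsh(Q).min() >= -1e-9

# Bisection for the largest feasible gamma (a stand-in for the SDP solver):
lo, hi = -10.0, 10.0                  # lo feasible, hi infeasible
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if is_psd(gram(mid)) else (lo, mid)

print(round(lo, 6))   # 2.0 -- the true global minimum, since p(x) = (x-1)^2 + 2
```

The feasible gammas form an interval bounded above by the global minimum, which is why the bounds in the abstract converge monotonically from below.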
SOSTOOLS: Sum of squares optimization toolbox for MATLAB
Version 2.00, 2004
Sums of Squares and Semidefinite Programming Relaxations for Polynomial Optimization Problems with Structured Sparsity
SIAM Journal on Optimization, 2006
Cited by 122 (29 self)
Abstract: Unconstrained and inequality-constrained sparse polynomial optimization problems (POPs) are considered. A correlative sparsity pattern graph is defined to find a certain sparse structure in the objective and constraint polynomials of a POP. Based on this graph, sets of supports for sums-of-squares (SOS) polynomials that lead to efficient SOS and semidefinite programming (SDP) relaxations are obtained. Numerical results from various test problems are included to show the improved performance of the SOS and SDP relaxations.
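A toy sketch of the correlative sparsity pattern (our simplified illustration, not the paper's implementation): variables i and j are linked exactly when they appear together in some monomial. For f(x) = x1^2*x2^2 + x2^2*x3^2, monomials couple (x1, x2) and (x2, x3) but never (x1, x3), so the relaxation can use two small SOS blocks instead of one dense one.

```python
import numpy as np

# Each monomial is recorded by the (0-based) indices of the variables it uses;
# this encoding is our assumption for the example polynomial above.
monomials = [(0, 1), (1, 2)]
n = 3

adj = np.zeros((n, n), dtype=bool)    # correlative sparsity pattern graph
for vars_ in monomials:
    for i in vars_:
        for j in vars_:
            if i != j:
                adj[i, j] = True

print(adj[0, 1], adj[1, 2], adj[0, 2])  # True True False
```

The missing (x1, x3) edge is what lets the SDP relaxation split into blocks indexed by the graph's maximal cliques, shrinking the matrices the solver must handle.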
Learning the Kernel with Hyperkernels
2003
Cited by 115 (2 self)
Abstract: This paper addresses the problem of choosing a kernel suitable for estimation with a Support Vector Machine, hence further automating machine learning. This goal is achieved by defining a Reproducing Kernel Hilbert Space on the space of kernels itself. Such a formulation leads to a statistical estimation problem very much akin to the problem of minimizing a regularized risk functional. We state the ...
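A much-simplified stand-in for the kernel-selection problem (our assumption; the paper optimizes over a full RKHS of kernels rather than this two-kernel mixture): choose the convex combination of two base Gaussian kernels that minimizes a regularized least-squares risk on training labels, via a crude grid search instead of the paper's estimation machinery.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 1))
y = np.sign(X[:, 0])                  # simple separable labels

def gauss(X, s):
    # Gaussian kernel matrix with bandwidth s
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s ** 2))

K1, K2 = gauss(X, 0.5), gauss(X, 5.0)
lam = 1e-2                            # regularization weight (assumed)

def risk(a):
    # Kernel ridge regression under the mixed kernel, plus an RKHS penalty
    K = a * K1 + (1 - a) * K2
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return np.mean((K @ alpha - y) ** 2) + lam * alpha @ K @ alpha

best = min(np.linspace(0, 1, 21), key=risk)
print("selected mixing weight:", best)
```

The point of the sketch is only the shape of the problem: kernel choice becomes one more parameter of a regularized risk, which the hyperkernel formulation handles in far greater generality.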
Semidefinite programming based algorithms for sensor network localization
ACM Transactions on Sensor Networks, 2006
Cited by 113 (7 self)
Abstract: An SDP relaxation-based method is developed to solve the localization problem in sensor networks using incomplete and inaccurate distance information. The problem is set up to find a set of sensor positions such that the given distance constraints are satisfied. The nonconvex constraints in the formulation are then relaxed to yield a semidefinite program that can be solved efficiently. The basic model is extended to account for noisy distance information. In particular, a maximum-likelihood-based formulation and an interval-based formulation are discussed. The SDP solution can then also be used as a starting point for steepest-descent-based local optimization techniques that can further refine the SDP solution. We also describe the extension of the basic method to an iterative distributed SDP method for solving very large-scale semidefinite programs that arise from localization problems for large dense networks and are intractable using centralized methods. The performance of the technique with regard to estimation accuracy and computation time is also evaluated by means of extensive simulations. Our SDP scheme also appears applicable to other Euclidean geometry problems where points are locally connected.
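The local-refinement step mentioned in the abstract can be sketched as follows (our illustration with an assumed setup, not the paper's code): given measured pairwise distances, run gradient descent on the stress function sum_ij (||x_i - x_j|| - d_ij)^2 from a rough initial guess standing in for the SDP solution.

```python
import numpy as np

rng = np.random.default_rng(2)
true = rng.uniform(size=(5, 2))                         # true sensor positions
D = np.linalg.norm(true[:, None] - true[None, :], axis=-1)  # measured distances

X = true + 0.1 * rng.normal(size=true.shape)            # rough starting positions

def stress(X):
    E = np.linalg.norm(X[:, None] - X[None, :], axis=-1) - D
    return (E ** 2).sum()

def grad(X):
    diff = X[:, None] - X[None, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, 1.0)        # avoid 0/0 on the diagonal
    E = dist - D
    np.fill_diagonal(E, 0.0)           # a point has no residual with itself
    return 4 * ((E / dist)[..., None] * diff).sum(axis=1)

s0 = stress(X)
for _ in range(500):
    X -= 0.01 * grad(X)                # steepest descent on the stress

print(stress(X) < s0)                  # True: refinement reduces the stress
```

In the paper's pipeline the starting point comes from the SDP relaxation rather than a perturbed ground truth, which is what makes the local search reliable in practice.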
A robust minimax approach to classification
Journal of Machine Learning Research, 2002
Cited by 104 (7 self)
Abstract: When constructing a classifier, the probability of correct classification of future data points should be maximized. We consider a binary classification problem where the mean and covariance matrix of each class are assumed to be known. No further assumptions are made with respect to the class-conditional distributions. Misclassification probabilities are then controlled in a worst-case setting: that is, over all possible choices of class-conditional densities with the given mean and covariance matrix, we minimize the worst-case (maximum) probability of misclassification of future data points. For a linear decision boundary, this desideratum translates in a very direct way into a (convex) second-order cone optimization problem, with complexity similar to that of a support vector machine problem. The minimax problem can be interpreted geometrically as minimizing the maximum of the Mahalanobis distances to the two classes. We address the issue of robustness with respect to estimation errors (in the means and covariances of the ...
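The worst-case bound behind this minimax setting can be computed directly (our sketch with an assumed example): by the one-sided multivariate Chebyshev inequality, a class with mean mu and covariance S puts at most 1 / (1 + kappa^2) of its probability mass on the wrong side of the hyperplane a^T x = b, where kappa = (b - a^T mu) / sqrt(a^T S a) is the Mahalanobis-type margin.

```python
import numpy as np

def worst_case_misclass(a, b, mu, S):
    """Chebyshev bound on P(a^T x >= b) over all densities with mean mu, cov S."""
    margin = b - a @ mu                # signed distance scale to the boundary
    kappa = max(margin, 0.0) / np.sqrt(a @ S @ a)
    return 1.0 / (1.0 + kappa ** 2)

a = np.array([1.0, 0.0])               # decision direction (assumed example)
S = np.eye(2)
# Class centered 3 Mahalanobis units from the boundary a^T x = 0:
p = worst_case_misclass(a, b=0.0, mu=np.array([-3.0, 0.0]), S=S)
print(round(p, 3))                     # 0.1 -- at most 10% of mass crosses over
```

The classifier in the abstract chooses a and b to minimize the larger of the two classes' bounds, which is what yields the second-order cone program.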
Complete search in continuous global optimization and constraint satisfaction
Acta Numerica 13, 2004
Interior-point method for nuclear norm approximation with application to system identification
Proving Program Invariance and Termination by Parametric Abstraction, Lagrangian Relaxation and Semidefinite Programming
In VMCAI 2005: Verification, Model Checking, and Abstract Interpretation, volume 3385 of LNCS, 2005
Cited by 90 (1 self)
Abstract: In order to verify semialgebraic programs, we automate the Floyd/Naur/Hoare proof method. The main task is to automatically infer valid invariants and rank functions. First we express the program semantics in polynomial form. Then the unknown rank function and invariants are abstracted in parametric form. The implication in the Floyd/Naur/Hoare verification conditions is handled by abstraction into numerical constraints by Lagrangian relaxation. The remaining universal quantification is handled by semidefinite programming relaxation. Finally, the parameters are computed using semidefinite programming solvers. This new approach exploits recent progress in the numerical resolution of linear or bilinear matrix inequalities by semidefinite programming, using efficient polynomial primal/dual interior-point methods that generalize those well known in linear programming to convex optimization. The framework is applied to invariance and termination proofs of sequential, nondeterministic, concurrent, and fair parallel imperative polynomial programs, and can easily be extended to other safety and liveness properties.
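A toy illustration of the rank-function idea (our sketch, far simpler than the paper's SDP machinery, and checked on sampled states rather than proved symbolically): for the loop `while x >= 0: x = x - 1`, the function r(x) = x is bounded below on the loop guard and strictly decreases on every iteration, which certifies termination.

```python
def r(x):
    # Candidate rank function; the paper infers such functions automatically.
    return x

def check_rank_function(samples):
    """Check the two rank-function conditions on sampled integer states."""
    ok = True
    for x in samples:
        if x >= 0:                       # state satisfies the loop guard
            x_next = x - 1               # effect of the loop body
            ok &= r(x) >= 0              # bounded below while the guard holds
            ok &= r(x_next) <= r(x) - 1  # strictly decreases each iteration
    return ok

print(check_rank_function(range(0, 100)))   # True
```

The paper replaces this finite sampling with a parametric rank function whose validity conditions, after Lagrangian relaxation, become matrix inequalities that an SDP solver can discharge for all states at once.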