Results 1-10 of 23
Vector diffusion maps and the connection Laplacian
Comm. Pure Appl. Math.
Cited by 48 (13 self)
Abstract. We introduce vector diffusion maps (VDM), a new mathematical framework for organizing and analyzing massive high-dimensional data sets, images, and shapes. VDM is a mathematical and algorithmic generalization of diffusion maps and other nonlinear dimensionality reduction methods, such as LLE, ISOMAP, and Laplacian eigenmaps. While existing methods are either directly or indirectly related to the heat kernel for functions over the data, VDM is based on the heat kernel for vector fields. VDM provides tools for organizing complex data sets, embedding them in a low-dimensional space, and interpolating and regressing vector fields over the data. In particular, it equips the data with a metric, which we refer to as the vector diffusion distance. In the manifold-learning setup, where the data set is distributed on (or near) a low-dimensional manifold M^d embedded in R^p, we prove the relation between VDM and the connection Laplacian operator for vector fields over the manifold. Key words. Dimensionality reduction, vector fields, heat kernel, parallel transport, local principal component analysis, alignment.
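The vector diffusion distance can be sketched numerically. The sketch below is illustrative, not the authors' code: it assumes a symmetric block matrix S whose (i, j) block is w_ij O_ij (an edge weight times the d x d orthogonal alignment between local bases, the quantities VDM builds from local PCA and alignment), and it omits the degree normalization applied in the paper. The helper name `vector_diffusion_distances` is ours.

```python
import numpy as np

def vector_diffusion_distances(S, n, d, t=1):
    """Squared vector diffusion distances from the block matrix S at time t."""
    P = np.linalg.matrix_power(S, 2 * t)     # S^{2t}
    # hs[i, j] = squared Frobenius norm of the (i, j) d x d block of S^{2t}
    blocks = P.reshape(n, d, n, d)
    hs = np.einsum('iajb,iajb->ij', blocks, blocks)
    # d^2(i, j) = hs(i, i) + hs(j, j) - 2 hs(i, j)
    diag = np.diag(hs)
    return diag[:, None] + diag[None, :] - 2 * hs

# Toy example: 3 points, d = 2, identity alignments, arbitrary weights.
n, d = 3, 2
W = np.array([[0., 1., 0.2], [1., 0., 1.], [0.2, 1., 0.]])
S = np.kron(W, np.eye(d))                    # block (i, j) = W[i, j] * I_d
D2 = vector_diffusion_distances(S, n, d, t=1)
```

Because hs(i, j) is an inner product of embedding vectors, D2 is a genuine squared Euclidean distance matrix: symmetric, nonnegative, with zero diagonal.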
A Cheeger inequality for the graph connection Laplacian
, 2012
Cited by 24 (14 self)
The O(d) synchronization problem consists of estimating a set of n unknown orthogonal d x d matrices O_1, ..., O_n from noisy measurements of a subset of the pairwise ratios O_i O_j^{-1}. We formulate and prove a Cheeger-type inequality that relates a measure of how well the O(d) synchronization problem can be solved to the spectrum of an operator, the graph connection Laplacian. We also show how this inequality provides a worst-case performance guarantee for a spectral method that solves this problem.
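In the noiseless, complete-graph case the spectral approach admits a compact sketch (illustrative NumPy under our own toy setup, not the paper's algorithm): assemble the block measurement matrix, take the eigenvectors of its d largest eigenvalues, and round each d x d block to the nearest orthogonal matrix via SVD.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 3

def random_orthogonal(d, rng):
    """Random orthogonal matrix via QR with sign-fixed diagonal."""
    Q, R = np.linalg.qr(rng.standard_normal((d, d)))
    return Q * np.sign(np.diag(R))

O = [random_orthogonal(d, rng) for _ in range(n)]

# Noiseless pairwise measurements O_i O_j^T on the complete graph,
# assembled into the (n*d) x (n*d) block measurement matrix C.
C = np.zeros((n * d, n * d))
for i in range(n):
    for j in range(n):
        C[i*d:(i+1)*d, j*d:(j+1)*d] = O[i] @ O[j].T

# Spectral method: top-d eigenvectors of C, then SVD-round each block.
vals, vecs = np.linalg.eigh(C)
V = vecs[:, -d:]                    # eigenvectors of the d largest eigenvalues
O_hat = []
for i in range(n):
    U, _, Vt = np.linalg.svd(V[i*d:(i+1)*d, :])
    O_hat.append(U @ Vt)

# Recovery holds up to one global orthogonal transform, so compare ratios.
err = max(np.abs(O_hat[i] @ O_hat[j].T - O[i] @ O[j].T).max()
          for i in range(n) for j in range(n))
```

With no noise, C has rank d and the rounded blocks reproduce every pairwise ratio exactly; the Cheeger-type inequality quantifies how this degrades on sparse, noisy graphs.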
Exact and Stable Recovery of Rotations for Robust Synchronization
, 1211
Cited by 22 (9 self)
Abstract. The synchronization problem over the special orthogonal group SO(d) consists of estimating a set of unknown rotations R_1, R_2, ..., R_n from noisy measurements of a subset of their pairwise ratios R_i^{-1} R_j. The problem has found applications in computer vision, computer graphics, and sensor network localization, among others. Its least-squares solution can be approximated by either spectral relaxation or semidefinite programming followed by a rounding procedure, analogous to the approximation algorithms for Max-Cut. The contribution of this paper is threefold: first, we introduce a robust penalty function involving the sum of unsquared deviations and derive a relaxation that leads to a convex optimization problem; second, we apply the alternating direction method to minimize the penalty function; finally, under a specific model of the measurement noise and the measurement graph, we prove that the rotations are exactly and stably recovered, exhibiting a phase-transition behavior in terms of the proportion of noisy measurements. Numerical simulations confirm the phase-transition behavior of our method as well as its improved accuracy compared to existing methods. Key words. Synchronization of rotations; least unsquared deviation; semidefinite relaxation; alternating direction method.
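The rounding procedure mentioned above, projecting a matrix onto the nearest rotation in SO(d), has a standard SVD-based form; a minimal sketch (the helper name `nearest_rotation` and the toy data are ours):

```python
import numpy as np

def nearest_rotation(M):
    """The rotation R in SO(d) minimizing ||R - M||_F, via SVD of M."""
    U, _, Vt = np.linalg.svd(M)
    S = np.eye(M.shape[0])
    S[-1, -1] = np.sign(np.linalg.det(U @ Vt))   # force det(R) = +1
    return U @ S @ Vt

rng = np.random.default_rng(1)
R_true = nearest_rotation(rng.standard_normal((3, 3)))  # a random rotation
M = R_true + 0.05 * rng.standard_normal((3, 3))         # noisy measurement
R = nearest_rotation(M)                                 # rounded estimate
```

The sign correction on the last singular direction is what distinguishes rounding into SO(d) from rounding into O(d).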
Eigenvector synchronization, graph rigidity and the molecule problem
, 2012
Global registration of multiple point clouds using semidefinite programming. arXiv:1306.5226 [cs.CV]
, 2013
Cited by 8 (4 self)
Abstract. Consider N points in R^d and M local coordinate systems that are related through unknown rigid transforms. For each point, we are given (possibly noisy) measurements of its local coordinates in some of the coordinate systems. Alternatively, for each coordinate system, we observe the coordinates of a subset of the points. The problem of estimating the global coordinates of the N points (up to a rigid transform) from such measurements arises in distributed approaches to molecular conformation and sensor network localization, and also in computer vision and graphics. The least-squares formulation, though nonconvex, has a well-known closed-form solution for the case M = 2 (based on the singular value decomposition). However, no closed-form solution is known for M ≥ 3. In this paper, we propose a semidefinite relaxation of the least-squares formulation and prove conditions for exact and stable recovery both for this relaxation and for a previously proposed spectral relaxation. In particular, using results from rigidity theory and the theory of semidefinite programming, we prove that the semidefinite relaxation can guarantee recovery under more adversarial measurements than its spectral counterpart. We perform numerical experiments on simulated data to confirm the theoretical findings. We empirically demonstrate that (a) unlike the spectral relaxation, the relaxation gap is mostly zero for the semidefinite program (i.e., we are able to solve the original nonconvex problem) up to a certain noise threshold, and (b) the semidefinite program performs significantly better than spectral and manifold-optimization methods, particularly at large noise levels.
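The well-known closed-form M = 2 solution referenced above is the SVD-based orthogonal-Procrustes registration; a minimal sketch on synthetic noiseless data (the helper name `register_two` is ours):

```python
import numpy as np

def register_two(P, Q):
    """Closed-form least-squares rigid transform (R, t) with Q = R P + t,
    via SVD of the cross-covariance of the centered point sets."""
    p, q = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((Q - q) @ (P - p).T)
    S = np.eye(P.shape[0])
    S[-1, -1] = np.sign(np.linalg.det(U @ Vt))   # keep det(R) = +1
    R = U @ S @ Vt
    return R, q - R @ p

rng = np.random.default_rng(2)
P = rng.standard_normal((3, 10))                 # 10 points in R^3
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])             # rotation about the z-axis
t_true = np.array([[1.0], [2.0], [3.0]])
R, t = register_two(P, R_true @ P + t_true)      # exact recovery (no noise)
```

For M ≥ 3 no such closed form exists, which is what motivates the semidefinite and spectral relaxations studied in the paper.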
Distributed Maximum Likelihood Sensor Network Localization
 IEEE Transactions on Signal Processing
, 2014
Cited by 7 (3 self)
Abstract—We propose a class of convex relaxations to solve the sensor network localization problem, based on a maximum likelihood (ML) formulation. This class, as well as the tightness of the relaxations, depends on the noise probability density function (PDF) of the collected measurements. We derive a computationally efficient edge-based version of this ML convex relaxation class and design a distributed algorithm that enables the sensor nodes to solve these edge-based convex programs locally by communicating only with their close neighbors. This algorithm relies on the alternating direction method of multipliers (ADMM); it converges to the centralized solution, can run asynchronously, and is resilient to computation errors. Finally, we compare our proposed distributed scheme with other available methods, both analytically and numerically, and argue the added value of ADMM, especially for large-scale networks. Index Terms—Distributed optimization, convex relaxations, sensor network localization, distributed algorithms, ADMM, distributed localization, sensor networks, maximum likelihood.
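The ADMM machinery behind such distributed schemes can be illustrated on a toy consensus problem far simpler than the paper's edge-based localization programs (the sketch and its data are our own): N agents, each holding a private value a_i, jointly minimize sum_i (x - a_i)^2 / 2, whose optimum is the mean of the a_i.

```python
import numpy as np

a = np.array([1.0, 4.0, 7.0, 10.0])   # private data, one value per agent
N, rho = len(a), 1.0
x = np.zeros(N)                       # local copies of the shared variable
z = 0.0                               # consensus variable
u = np.zeros(N)                       # scaled dual variables

for _ in range(100):
    x = (a + rho * (z - u)) / (1.0 + rho)  # local x-updates (run in parallel)
    z = np.mean(x + u)                     # consensus step (gather/average)
    u = u + x - z                          # dual updates (run in parallel)
```

The same x/z/u pattern underlies the paper's algorithm, with each x-update replaced by a small local convex program and the averaging replaced by neighbor-only communication.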
Euclidean Distance Geometry and Applications
Cited by 6 (1 self)
Abstract. Euclidean distance geometry is the study of Euclidean geometry based on the concept of distance. This is useful in several applications where the input data consists of an incomplete set of distances and the output is a set of points in Euclidean space that realizes the given distances. We survey some of the theory of Euclidean distance geometry and some of its most important applications, including molecular conformation, localization of sensor networks, and statics. Key words. Matrix completion, bar-and-joint framework, graph rigidity, inverse problem, protein conformation, sensor network.
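When the distance set is complete and exactly Euclidean, the realization problem has a classical closed-form solution (classical multidimensional scaling); a minimal sketch under that assumption, with our own helper name `classical_mds`:

```python
import numpy as np

def classical_mds(D, d):
    """Recover n points in R^d (up to rigid motion) from an n x n matrix
    of exact Euclidean distances, via the double-centered Gram matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering projector
    G = -0.5 * J @ (D ** 2) @ J                  # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(G)
    vals, vecs = vals[::-1][:d], vecs[:, ::-1][:, :d]  # top-d eigenpairs
    return vecs * np.sqrt(np.maximum(vals, 0.0))       # points as rows

rng = np.random.default_rng(3)
X = rng.standard_normal((8, 3))                  # ground-truth points in R^3
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D, 3)                          # realization of D
D_hat = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
```

The surveyed applications are harder precisely because their distance data is incomplete and noisy, which is where matrix completion and rigidity theory enter.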
The Spectrum of Random Inner-product Kernel Matrices
, 1202
Cited by 6 (0 self)
Abstract: We consider n-by-n matrices whose (i, j)th entry is f(X_i^T X_j), where X_1, ..., X_n are i.i.d. standard Gaussian random vectors in R^p and f is a real-valued function. The eigenvalue distribution of these random kernel matrices is studied in the "large p, large n" regime. It is shown that, when p, n → ∞ with p/n = γ held constant, and f is properly scaled so that Var(f(X_i^T X_j)) is O(p^{-1}), the spectral density converges weakly to a limiting density on R. The limiting density is characterized by a cubic equation involving its Stieltjes transform. While for smooth kernel functions the limiting spectral density has previously been shown to be the Marcenko-Pastur distribution, our analysis is applicable to non-smooth kernel functions, resulting in a new family of limiting densities.
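The regime can be explored by direct Monte Carlo simulation; the sketch below (our own choice of example) uses the non-smooth kernel f = sign, scaled by 1/sqrt(p) so that Var(f(X_i^T X_j)) = 1/p, and zeroes the diagonal to keep only the off-diagonal kernel entries.

```python
import numpy as np

rng = np.random.default_rng(4)
n, gamma = 400, 2.0
p = int(gamma * n)                   # p/n = gamma held constant

X = rng.standard_normal((p, n))      # columns are i.i.d. Gaussians in R^p
K = np.sign(X.T @ X) / np.sqrt(p)    # non-smooth kernel f = sign, scaled
np.fill_diagonal(K, 0.0)             # keep off-diagonal entries only
eigs = np.linalg.eigvalsh(K)         # empirical spectrum (real, K symmetric)
```

A histogram of `eigs` approximates the limiting density; for a non-smooth f such as this one, the limit departs from the Marcenko-Pastur shape seen for smooth kernels.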
Cramér-Rao bounds for synchronization of rotations
 CoRR
Cited by 5 (5 self)
Synchronization of rotations is the problem of estimating a set of rotations R_i ∈ SO(n), i = 1, ..., N, from noisy measurements of relative rotations R_i R_j^T. This fundamental problem has found many recent applications, most importantly in structural biology. We provide a framework to study synchronization as estimation on Riemannian manifolds for arbitrary n under a large family of noise models. The noise models we address encompass zero-mean isotropic noise, and we develop tools for Gaussian-like as well as heavy-tailed types of noise in particular. As a main contribution, we derive the Cramér-Rao bounds of synchronization, that is, lower bounds on the variance of unbiased estimators. We find that these bounds are structured by the pseudoinverse of the measurement-graph Laplacian, where edge weights are proportional to measurement quality. We leverage this to provide interpretations in terms of random walks and visualization tools for these bounds in both the anchored and anchor-free scenarios. Similar bounds previously established were limited to rotations in the plane and Gaussian-like noise. Key words. Synchronization of rotations, estimation on manifolds, estimation on graphs, graph Laplacian, Fisher information, Cramér-Rao bounds, distributions on the rotation group, Langevin. 2000 Math Subject Classification: 62F99, 94C15, 22C05, 05C12.
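The graph-Laplacian structure of the bounds can be illustrated on a small weighted measurement graph (edge weights below are arbitrary stand-ins for measurement quality; this is a sketch of the structure, not the paper's bound computation):

```python
import numpy as np

# Measurement graph: edges (i, j, w) with weight w proportional to quality.
edges = [(0, 1, 2.0), (1, 2, 1.0), (2, 3, 0.5), (3, 0, 1.0), (0, 2, 1.5)]
n = 4
L = np.zeros((n, n))
for i, j, w in edges:                 # assemble the weighted graph Laplacian
    L[i, i] += w; L[j, j] += w
    L[i, j] -= w; L[j, i] -= w

L_pinv = np.linalg.pinv(L)            # pseudoinverse structures the bounds
# In the anchor-free picture, the diagonal of L_pinv indicates per-node
# uncertainty: better-connected, better-measured nodes get smaller entries.
per_node = np.diag(L_pinv)
```

Here node 3, attached by the weakest edges, receives the largest diagonal entry, matching the random-walk intuition that poorly measured nodes are hardest to pin down.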
Ranking and sparsifying a connection graph
Cited by 2 (0 self)
Many problems arising in dealing with high-dimensional data sets involve connection graphs, in which each edge is associated with both an edge weight and a d-dimensional linear transformation. We consider vectorized versions of PageRank and effective resistance that can be used as basic tools for organizing and analyzing complex data sets. For example, the generalized PageRank and effective resistance can be utilized to derive and modify diffusion distances for vector diffusion maps in data and image processing. Furthermore, the edge ranking of connection graphs determined by the vectorized PageRank and effective resistance is an essential part of sparsification algorithms that simplify and preserve the global structure of connection graphs.
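The scalar (d = 1) effective resistance that the paper vectorizes is computed from the Laplacian pseudoinverse; a minimal sketch on a unit-weight path graph, where resistances simply add in series (the vectorized version replaces each scalar weight by a weight times a d x d orthogonal transform and works with the connection Laplacian instead):

```python
import numpy as np

def effective_resistance(L_pinv, u, v):
    """Effective resistance between nodes u and v from the pseudoinverse
    of the graph Laplacian: (e_u - e_v)^T L^+ (e_u - e_v)."""
    return L_pinv[u, u] + L_pinv[v, v] - 2 * L_pinv[u, v]

# Path graph 0 - 1 - 2 with unit edge weights.
L = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
L_pinv = np.linalg.pinv(L)
r01 = effective_resistance(L_pinv, 0, 1)   # 1.0 (single unit edge)
r02 = effective_resistance(L_pinv, 0, 2)   # 2.0 (two unit edges in series)
```

Edges with high effective resistance are the structurally important ones, which is why this quantity drives the sparsification algorithms mentioned above.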