Results 1-10 of 20
A Cheeger inequality for the graph connection Laplacian. Available online, 2012.
Cited by 23 (13 self)
Abstract. The O(d) synchronization problem consists of estimating a set of n unknown orthogonal d × d matrices O_1, ..., O_n from noisy measurements of a subset of the pairwise ratios O_i O_j^{-1}. We formulate and prove a Cheeger-type inequality that relates a measure of how well it is possible to solve the O(d) synchronization problem to the spectrum of an operator, the graph connection Laplacian. We also show how this inequality provides a worst-case performance guarantee for a spectral method to solve this problem.
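The spectral method this abstract alludes to fits in a few lines: assemble the graph connection Laplacian from the pairwise blocks, take the d eigenvectors of smallest eigenvalue, and round each d × d block row back to the orthogonal group. The sketch below is a noiseless, complete-graph toy under illustrative sizes, not the paper's algorithm verbatim:

```python
import numpy as np

def random_rotation(d, rng):
    # Haar-distributed orthogonal matrix via QR with sign fix
    q, r = np.linalg.qr(rng.standard_normal((d, d)))
    return q * np.sign(np.diag(r))

def spectral_sync(n, d, blocks):
    """blocks[(i, j)] holds the measured ratio O_i O_j^{-1}."""
    # Connection Laplacian L = D - A on the measurement graph (unit weights).
    L = np.zeros((n * d, n * d))
    for (i, j), Rij in blocks.items():
        L[i*d:(i+1)*d, j*d:(j+1)*d] = -Rij
        L[j*d:(j+1)*d, i*d:(i+1)*d] = -Rij.T
        L[i*d:(i+1)*d, i*d:(i+1)*d] += np.eye(d)
        L[j*d:(j+1)*d, j*d:(j+1)*d] += np.eye(d)
    # The d eigenvectors of smallest eigenvalue stack the estimates.
    _, V = np.linalg.eigh(L)
    U = V[:, :d]
    # Round each d x d block row back onto the orthogonal group via SVD.
    est = []
    for i in range(n):
        u, _, vt = np.linalg.svd(U[i*d:(i+1)*d, :])
        est.append(u @ vt)
    return est

rng = np.random.default_rng(0)
n, d = 6, 2
O = [random_rotation(d, rng) for _ in range(n)]
blocks = {(i, j): O[i] @ O[j].T for i in range(n) for j in range(i + 1, n)}
est = spectral_sync(n, d, blocks)
# Estimates agree with the truth up to a single global orthogonal transform G.
G = est[0].T @ O[0]
err = max(np.linalg.norm(e @ G - o) for e, o in zip(est, O))
```

In the noiseless case the null space of the connection Laplacian is exactly d-dimensional and spanned by the stacked ground-truth blocks, which is why the rounding step recovers each O_i up to one global transform.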
Consistent shape maps via semidefinite programming. In Computer Graphics Forum.
Cited by 21 (3 self)
Recent advances in shape matching have shown that jointly optimizing the maps among the shapes in a collection can lead to significant improvements when compared to estimating maps between pairs of shapes in isolation. These methods typically invoke a cycle-consistency criterion: compositions of maps along a cycle of shapes should approximate the identity map. This condition regularizes the network and allows for the correction of errors and imperfections in individual maps. In particular, it encourages the estimation of maps between dissimilar shapes by compositions of maps along a path of more similar shapes. In this paper, we introduce a novel approach for obtaining consistent shape maps in a collection that formulates the cycle-consistency constraint as the solution to a semidefinite program (SDP). The proposed approach is based on the observation that, if the ground-truth maps between the shapes are cycle-consistent, then the matrix that stores all pairwise maps in blocks is low-rank and positive semidefinite. Motivated by recent advances in techniques for low-rank matrix recovery via semidefinite programming, we formulate the problem of estimating cycle-consistent maps as finding the closest positive semidefinite matrix to an input matrix that stores all the initial maps. By analyzing the Karush-Kuhn-Tucker (KKT) optimality conditions of this program, we derive theoretical guarantees for the proposed algorithm, ensuring the correctness of the recovery when the errors in the input maps do not exceed certain thresholds. Besides this theoretical guarantee, experimental results on benchmark datasets show that the proposed approach outperforms state-of-the-art multiple shape matching methods.
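The "closest positive semidefinite matrix" ingredient has a simple unconstrained analogue: projecting a symmetric matrix onto the PSD cone by clipping negative eigenvalues. The sketch below shows only that ingredient (the paper's SDP carries additional block constraints); the toy sizes and the permutation-map construction are hypothetical:

```python
import numpy as np

def nearest_psd(X):
    """Frobenius-norm projection of a symmetric matrix onto the PSD cone,
    obtained by zeroing out the negative eigenvalues."""
    w, V = np.linalg.eigh((X + X.T) / 2)
    return V @ np.diag(np.clip(w, 0, None)) @ V.T

rng = np.random.default_rng(1)
m = 4   # points per shape (illustrative)
n = 5   # number of shapes (illustrative)
# Cycle-consistent maps stack into a rank-m PSD Gram matrix X = Y Y^T,
# whose (i, j) block is the map from shape j to shape i.
perms = [np.eye(m)[rng.permutation(m)] for _ in range(n)]
Y = np.vstack(perms)
X = Y @ Y.T
noisy = X + 0.1 * rng.standard_normal(X.shape)
denoised = nearest_psd(noisy)
```

Because projection onto a convex set is non-expansive toward points of the set, the projected matrix is never farther from the clean cycle-consistent matrix than the noisy input was.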
Decoding binary node labels from censored edge measurements: Phase transition and efficient recovery. Available at arXiv:1404.4749 [cs.IT], 2014.
Cited by 9 (4 self)
Abstract. We consider the problem of clustering a graph G into two communities by observing a subset of the vertex correlations. Specifically, we consider the inverse problem with observed variables Y = B_G x ⊕ Z, where B_G is the incidence matrix of a graph G, x is the vector of unknown vertex variables (with a uniform prior) and Z is a noise vector with Bernoulli(ε) i.i.d. entries. All variables and operations are Boolean. This model is motivated by coding, synchronization, and community detection problems. In particular, it corresponds to a stochastic block model or a correlation clustering problem with two communities and censored edges. Without noise, exact recovery (up to global flip) of x is possible if and only if the graph G is connected, with a sharp threshold at the edge probability log(n)/n for Erdős-Rényi random graphs. The first goal of this paper is to determine how the edge probability p needs to scale to allow exact recovery in the presence of noise. Defining the degree (oversampling) rate of the graph by α = np/log(n), it is shown that exact recovery is possible if and only if α > 2/(1 − 2ε)^2 + o(1/(1 − 2ε)^2). In other words, 2/(1 − 2ε)^2 is the information-theoretic threshold for exact recovery at low SNR. In addition, an efficient recovery algorithm based on semidefinite programming is proposed and shown to succeed in the threshold regime up to twice the optimal rate. For a deterministic graph G, defining the degree rate as α = d/log(n), where d is the minimum degree of the graph, it is shown that the proposed method achieves the rate α > 4((1 + λ)/(1 − λ)^2)/(1 − 2ε)^2 + o(1/(1 − 2ε)^2), where 1 − λ is the spectral gap of the graph G. A preliminary version of this paper appeared in ISIT 2014 [ABBS14].
Convex recovery from interferometric measurements. arXiv preprint arXiv:1307.6864, 2013.
Cited by 9 (0 self)
This note formulates a deterministic recovery result for vectors x from quadratic measurements of the form (Ax)_i (Ax)_j for some left-invertible A. Recovery is exact, or stable in the noisy case, when the couples (i, j) are chosen as edges of a well-connected graph. One possible way of obtaining the solution is as a feasible point of a simple semidefinite program. Furthermore, we show how the proportionality constant in the error estimate depends on the spectral gap of a data-weighted graph Laplacian. Such quadratic measurements have found applications in phase retrieval, angular synchronization, and more recently interferometric waveform inversion. Acknowledgments. The authors would like to thank Amit Singer for interesting discussions.
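In the noiseless case with every pair (i, j) observed, the mechanism is easy to demo: the matrix of all products (Ax)_i conj((Ax)_j) is rank one, and its leading eigenvector recovers Ax up to a global phase, after which a left inverse of A would give x. The sketch below takes A = I for brevity; the semidefinite program and the partial-graph, noisy case are where the note's actual content lies:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
x = np.exp(1j * rng.uniform(0, 2 * np.pi, n))  # unknown unit-modulus vector
A = np.eye(n)                                  # a trivially left-invertible A
b = A @ x

# All pairwise interferometric measurements (Ax)_i * conj((Ax)_j),
# collected into the rank-one Hermitian matrix M = b b^H.
M = np.outer(b, b.conj())

# The leading eigenpair of M recovers b up to a single global phase.
w, V = np.linalg.eigh(M)
est = V[:, -1] * np.sqrt(w[-1])
g = b[0] / est[0]                 # fix the global phase against one entry
err = np.max(np.abs(est * g - b))
```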
Near-optimal joint object matching via convex relaxation. arXiv preprint arXiv:1402.1473, 2014.
Cited by 8 (1 self)
Joint object matching aims at aggregating information from a large collection of similar instances (e.g. images, graphs, shapes) to improve the correspondences computed between pairs of objects, typically by exploiting global map compatibility. Despite some practical advances on this problem, from the theoretical point of view the error-correction ability of existing algorithms is limited by a constant barrier: none of them can provably recover the correct solution when more than a constant fraction of input correspondences are corrupted. Moreover, prior approaches focus mostly on fully similar objects, while it is practically more demanding and realistic to match instances that are only partially similar to each other. In this paper, we propose an algorithm to jointly match multiple objects that exhibit only partial similarities, where the provided pairwise feature correspondences can be densely corrupted. By encoding a consistent partial map collection into a 0-1 semidefinite matrix, we attempt recovery via a two-step procedure, that is, a spectral technique followed by a parameter-free convex program called MatchLift. Under a natural randomized model, MatchLift exhibits near-optimal error-correction ability, i.e. it guarantees the recovery of the ground-truth maps even when a dominant fraction of the inputs are randomly corrupted. We evaluate the proposed algorithm on various benchmark data sets, including synthetic examples and real-world examples, all of which confirm the practical applicability of the proposed algorithm.
Global registration of multiple point clouds using semidefinite programming. arXiv:1306.5226 [cs.CV], 2013.
Cited by 8 (4 self)
Abstract. Consider N points in R^d and M local coordinate systems that are related through unknown rigid transforms. For each point, we are given (possibly noisy) measurements of its local coordinates in some of the coordinate systems. Alternatively, for each coordinate system, we observe the coordinates of a subset of the points. The problem of estimating the global coordinates of the N points (up to a rigid transform) from such measurements comes up in distributed approaches to molecular conformation and sensor network localization, and also in computer vision and graphics. The least-squares formulation, though non-convex, has a well-known closed-form solution for the case M = 2 (based on the singular value decomposition). However, no closed-form solution is known for M ≥ 3. In this paper, we propose a semidefinite relaxation of the least-squares formulation, and prove conditions for exact and stable recovery for both this relaxation and for a previously proposed spectral relaxation. In particular, using results from rigidity theory and the theory of semidefinite programming, we prove that the semidefinite relaxation can guarantee recovery under more adversarial measurements compared to the spectral counterpart. We perform numerical experiments on simulated data to confirm the theoretical findings. We empirically demonstrate that (a) unlike the spectral relaxation, the relaxation gap is mostly zero for the semidefinite program (i.e., we are able to solve the original non-convex problem) up to a certain noise threshold, and (b) the semidefinite program performs significantly better than spectral and manifold-optimization methods, particularly at large noise levels.
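The well-known M = 2 closed form mentioned in the abstract is the SVD-based least-squares rigid alignment (often called orthogonal Procrustes or the Kabsch algorithm). A minimal sketch on hypothetical noiseless data:

```python
import numpy as np

def register_pair(P, Q):
    """Closed-form least-squares rigid transform: the (R, t) minimizing
    ||R P + t - Q||_F, via the SVD (the M = 2 case)."""
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (Q - cq) @ (P - cp).T           # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(P.shape[0])
    D[-1, -1] = np.sign(np.linalg.det(U @ Vt))  # enforce det(R) = +1
    R = U @ D @ Vt
    t = cq - R @ cp
    return R, t

rng = np.random.default_rng(4)
d, N = 3, 10
P = rng.standard_normal((d, N))                 # points in local frame 1
R_true, _ = np.linalg.qr(rng.standard_normal((d, d)))
if np.linalg.det(R_true) < 0:                   # make it a proper rotation
    R_true[:, 0] *= -1
t_true = rng.standard_normal((d, 1))
Q = R_true @ P + t_true                         # the same points in frame 2
R, t = register_pair(P, Q)
err = np.linalg.norm(R @ P + t - Q)
```

With exact measurements the recovery is exact; with noise this remains the least-squares optimum for two frames, and it is precisely the step that has no known closed-form generalization for M ≥ 3.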
Linear inverse problems on Erdős-Rényi graphs: Information-theoretic limits and efficient recovery
Cited by 6 (4 self)
Abstract. This paper considers the inverse problem with observed variables Y = B_G X ⊕ Z, where B_G is the incidence matrix of a graph G, X is the vector of unknown vertex variables with a uniform prior, and Z is a noise vector with Bernoulli(ε) i.i.d. entries. All variables and operations are Boolean. This model is motivated by coding, synchronization, and community detection problems. In particular, it corresponds to a stochastic block model or a correlation clustering problem with two communities and censored edges. Without noise, exact recovery of X is possible if and only if the graph G is connected, with a sharp threshold at the edge probability log(n)/n for Erdős-Rényi random graphs. The first goal of this paper is to determine how the edge probability p needs to scale to allow exact recovery in the presence of noise. Defining the degree (oversampling) rate of the graph by α = np/log(n), it is shown that exact recovery is possible if and only if α > 2/(1 − 2ε)^2 + o(1/(1 − 2ε)^2). In other words, 2/(1 − 2ε)^2 is the information-theoretic threshold for exact recovery at low SNR. In addition, an efficient recovery algorithm based on semidefinite programming is proposed and shown to succeed in the threshold regime up to twice the optimal rate. Full version available in [1].
Cramér-Rao bounds for synchronization of rotations. CoRR.
Cited by 4 (4 self)
Synchronization of rotations is the problem of estimating a set of rotations R_i ∈ SO(n), i = 1, ..., N, based on noisy measurements of relative rotations R_i R_j^⊤. This fundamental problem has found many recent applications, most importantly in structural biology. We provide a framework to study synchronization as estimation on Riemannian manifolds for arbitrary n under a large family of noise models. The noise models we address encompass zero-mean isotropic noise, and we develop tools for Gaussian-like as well as heavy-tailed types of noise in particular. As a main contribution, we derive the Cramér-Rao bounds of synchronization, that is, lower bounds on the variance of unbiased estimators. We find that these bounds are structured by the pseudoinverse of the measurement graph Laplacian, where edge weights are proportional to measurement quality. We leverage this to provide interpretation in terms of random walks and visualization tools for these bounds in both the anchored and anchor-free scenarios. Similar bounds previously established were limited to rotations in the plane and Gaussian-like noise. Keywords: synchronization of rotations, estimation on manifolds, estimation on graphs, graph Laplacian, Fisher information, Cramér-Rao bounds, distributions on the rotation group, Langevin. 2000 Math Subject Classification: 62F99, 94C15, 22C05, 05C12.
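The structural object in these bounds is concrete to compute for any given measurement graph: the weighted Laplacian, with edge weights proportional to measurement quality, and its Moore-Penrose pseudoinverse. The graph and weights below are made up for illustration, and the noise-model constants that multiply the bound in the paper are omitted:

```python
import numpy as np

# Hypothetical weighted measurement graph: edge weights w_ij proportional
# to measurement quality (larger = more informative relative rotation).
edges = {(0, 1): 2.0, (1, 2): 1.0, (2, 3): 0.5, (3, 0): 1.5, (0, 2): 1.0}
n = 4

# Weighted graph Laplacian L (rows sum to zero).
L = np.zeros((n, n))
for (i, j), w in edges.items():
    L[i, i] += w
    L[j, j] += w
    L[i, j] -= w
    L[j, i] -= w

# Anchor-free CRB-style quantity: the Moore-Penrose pseudoinverse of L.
# Its diagonal indicates how hard each node is to estimate relative to
# the rest of the network (larger = harder).
Ldag = np.linalg.pinv(L)
per_node = np.diag(Ldag)
```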
Vector diffusion maps and random matrices with random blocks, 2014.
Cited by 2 (2 self)
Vector diffusion maps (VDM) is a modern data analysis technique that is starting to be applied to the analysis of high-dimensional and massive datasets. Motivated by this technique, we study matrices that are akin to the ones appearing in the null case of VDM, i.e., the case where there is no structure in the dataset under investigation. Developing this understanding is important in making sense of the output of the VDM algorithm, whether there is signal or not. We hence develop a theory explaining the behavior of the spectral distribution of a large class of random matrices, in particular random matrices with random block entries. Numerical work shows that the agreement between our theoretical predictions and numerical simulations is generally very good.
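The null-case object is easy to simulate: a symmetric matrix whose off-diagonal d × d blocks are independent random orthogonal matrices, with no common alignment across the data. The sizes below are small and illustrative; the paper's interest is the limiting spectral distribution of such matrices, which the histogram only gestures at:

```python
import numpy as np

rng = np.random.default_rng(5)
n, d = 60, 2

def rand_orth(rng, d):
    # Haar-distributed orthogonal matrix via QR with sign fix
    q, r = np.linalg.qr(rng.standard_normal((d, d)))
    return q * np.sign(np.diag(r))

# Null-case block matrix: each off-diagonal d x d block is an independent
# random orthogonal matrix, symmetrized by setting the mirror block to R^T.
A = np.zeros((n * d, n * d))
for i in range(n):
    for j in range(i + 1, n):
        R = rand_orth(rng, d)
        A[i*d:(i+1)*d, j*d:(j+1)*d] = R
        A[j*d:(j+1)*d, i*d:(i+1)*d] = R.T

# Empirical spectral distribution of A / sqrt(n); in the null case no
# outlier eigenvalue separates from the bulk.
w = np.linalg.eigvalsh(A / np.sqrt(n))
hist, _ = np.histogram(w, bins=20, range=(-3, 3))
```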