Results 1 - 10 of 20
An online algorithm for separating sparse and low-dimensional signal sequences from their sum
- IEEE Trans. Signal Process
Cited by 10 (8 self)
Abstract—This paper designs and extensively evaluates an online algorithm, called practical recursive projected compressive sensing (Prac-ReProCS), for recovering a time sequence of sparse vectors St and a time sequence of dense vectors Lt from their sum, Mt := St + Lt, when the Lt's lie in a slowly changing low-dimensional subspace of the full space. A key application where this problem occurs is in real-time video layering, where the goal is to separate a video sequence on-the-fly into a slowly changing background sequence and a sparse foreground sequence that consists of one or more moving regions/objects. Prac-ReProCS is a practical modification of its theoretical counterpart, which was analyzed in our recent work. An extension to the undersampled case is also developed. Extensive experimental comparisons demonstrating the advantage of the approach over existing batch and recursive methods, for both simulated and real videos, are shown. Index Terms—Online robust PCA, recursive sparse recovery, large but structured noise, compressed sensing.
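The core separation step described in this abstract can be sketched in a few lines. This is an illustrative simplification: the function name is mine, and the hard-threshold stands in for the paper's actual l1-based sparse recovery step.

```python
import numpy as np

def reprocs_step(m, P, tau):
    """One ReProCS-style separation step (illustrative sketch).

    m   : observed vector m_t = s_t + l_t
    P   : n x r orthonormal basis for the current background subspace estimate
    tau : threshold for the (stand-in) sparse-recovery step
    """
    # Project m onto the orthogonal complement of span(P); this nearly
    # nullifies l_t when the subspace estimate is accurate.
    y = m - P @ (P.T @ m)
    # Recover the sparse component; the real algorithm solves an l1
    # problem here, simple hard-thresholding stands in for that step.
    s_hat = np.where(np.abs(y) > tau, y, 0.0)
    l_hat = m - s_hat
    return s_hat, l_hat
```

When the background truly lies in span(P) and the foreground support is outside it, the projection removes the background exactly and the threshold picks out the foreground.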
Subspace learning and imputation for streaming big data matrices and tensors
- IEEE Trans. Signal Process
, 2015
Cited by 5 (2 self)
Abstract—Extracting latent low-dimensional structure from high-dimensional data is of paramount importance in timely inference tasks encountered with “Big Data” analytics. However, increasingly noisy, heterogeneous, and incomplete datasets, as well as the need for real-time processing of streaming data, pose major challenges to this end. In this context, the present paper brings the benefits of rank minimization to scalable imputation of missing data, via tracking low-dimensional subspaces and unraveling latent (possibly multi-way) structure from incomplete streaming data. For low-rank matrix data, a subspace estimator is proposed based on an exponentially weighted least-squares criterion regularized with the nuclear norm. After recasting the nonseparable nuclear norm into a form amenable to online optimization, real-time algorithms with complementary strengths are developed, and their convergence is established under simplifying technical assumptions. In a stationary setting, the asymptotic estimates obtained offer the well-documented performance guarantees of the batch nuclear-norm regularized estimator. Under the same unifying framework, a novel online (adaptive) algorithm is developed to obtain multi-way decompositions of low-rank tensors with missing entries and perform imputation as a byproduct. Simulated tests with both synthetic as well as real Internet and cardiac magnetic resonance imaging (MRI) data confirm the efficacy of the proposed algorithms and their superior performance relative to state-of-the-art alternatives. Index Terms—Low rank, matrix and tensor completion, missing data, streaming analytics, subspace tracking.
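The subspace-tracking-with-missing-entries ingredient can be illustrated with a minimal stochastic update. This GROUSE/PETRELS-flavored sketch is an assumption on my part, not the paper's exact nuclear-norm-regularized recursion; the names and the ridge surrogate `lam` are illustrative.

```python
import numpy as np

def ewls_subspace_step(U, x, mask, lam=0.1):
    """One stochastic update of a subspace estimate from a partially
    observed vector (illustrative sketch only).

    U    : p x d current basis estimate
    x    : length-p data vector, trusted only where mask is True
    mask : boolean array marking observed entries
    lam  : ridge weight standing in for the online regularizer
    """
    Uo, xo = U[mask], x[mask]
    d = U.shape[1]
    # Least-squares fit of the observed entries onto the current subspace.
    w = np.linalg.solve(Uo.T @ Uo + lam * np.eye(d), Uo.T @ xo)
    # Rank-one correction of the basis from the observed residual;
    # unobserved coordinates contribute nothing to the update.
    r = np.zeros_like(x)
    r[mask] = xo - Uo @ w
    U = U + np.outer(r, w) / (lam + w @ w)
    return U, w
```

Fed a stream of vectors from a fixed subspace, the basis estimate rotates toward that subspace while touching only the observed coordinates of each sample.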
Robust PCA with partial subspace knowledge
- in IEEE Intl. Symp. on Information Theory (ISIT),
, 2014
Cited by 5 (2 self)
Abstract—In recent work, robust Principal Components Analysis (PCA) has been posed as the problem of recovering a low-rank matrix L and a sparse matrix S from their sum, M := L + S, and a provably exact convex optimization solution called PCP has been proposed. This work studies the following problem. Suppose that we have partial knowledge about the column space of the low-rank matrix L. Can we use this information to improve the PCP solution, i.e., allow recovery under weaker assumptions? We propose here a simple but useful modification of the PCP idea, called modified-PCP, that allows us to use this knowledge. We derive a correctness result which shows that, when the available subspace knowledge is accurate, modified-PCP indeed requires significantly weaker incoherence assumptions than PCP. Extensive simulations are also used to illustrate this. Comparisons with PCP and other existing work are shown for a stylized real application as well. Finally, we explain how this problem naturally occurs in many applications involving time series data, i.e., in what is called the online or recursive robust PCA problem. A corollary for this case is also given.
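To make the M = L + S decomposition concrete, here is a toy alternating scheme combining singular-value thresholding for L and entrywise soft-thresholding for S. It is a hedged sketch of the general idea, not the paper's modified-PCP or the exact PCP/ALM solver; all parameter names are illustrative.

```python
import numpy as np

def pcp_sketch(M, lam, mu, n_iter=200):
    """Toy alternating low-rank + sparse splitting of M (a sketch, not
    the exact PCP/ALM algorithm from the literature)."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank update: shrink singular values of the residual M - S.
        U, sig, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - mu, 0.0)) @ Vt
        # Sparse update: entrywise soft-threshold the residual M - L.
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam * mu, 0.0)
    return L, S
```

Because the relative shrinkage weights make a large isolated entry cheaper to store in S than in L, the scheme tends to route spikes into the sparse term while the smooth rank-deficient part lands in L.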
PRACTICAL REPROCS FOR SEPARATING SPARSE AND LOW-DIMENSIONAL SIGNAL SEQUENCES FROM THEIR SUM – PART 1
Cited by 3 (3 self)
This paper designs and evaluates a practical algorithm, called Prac-ReProCS, for recovering a time sequence of sparse vectors St and a time sequence of dense vectors Lt from their sum, Mt := St + Lt, when any subsequence of the Lt's lies in a slowly changing low-dimensional subspace. A key application where this problem occurs is in video layering, where the goal is to separate a video sequence into a slowly changing background sequence and a sparse foreground sequence that consists of one or more moving regions/objects. Prac-ReProCS is the practical analog of its theoretical counterpart that was studied in our recent work. Index Terms — robust PCA, robust matrix completion, sparse recovery, compressed sensing
ROML: A robust feature correspondence approach for matching objects in a set of images
, 2014
Cited by 3 (2 self)
Feature-based object matching is a fundamental problem for many applications in computer vision, such as object recognition, 3D reconstruction, tracking, and motion segmentation. In this work, we consider simultaneously matching object instances in a set of images, where both inlier and outlier features are extracted. The task is to identify the inlier features and establish their consistent correspondences across the image set. This is a challenging combinatorial problem, and the problem complexity grows exponentially
Robust Stochastic Principal Component Analysis
Cited by 2 (1 self)
We consider the problem of finding lower dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and run-time complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA – the stochastic power method, incremental PCA and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sub-linear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods and the advantage of robust methods over their non-robust counterparts in the presence of outliers in artificial and real scenarios.
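A minimal robust variant of the stochastic power method can be sketched as follows. The residual-based downweighting is my stand-in for the paper's robustification, not its exact MEG/online algorithms, and all names are illustrative.

```python
import numpy as np

def robust_oja_update(U, x, eta, thresh):
    """One Oja-style stochastic PCA update with a crude robustness
    tweak: samples with large out-of-subspace residuals are
    downweighted as suspected outliers (illustrative sketch)."""
    r = x - U @ (U.T @ x)          # residual outside current subspace
    nr = np.linalg.norm(r)
    w = 1.0 if nr < thresh else thresh / nr
    U = U + eta * w * np.outer(x, x) @ U   # stochastic gradient step
    Q, _ = np.linalg.qr(U)                 # re-orthonormalize the basis
    return Q
```

Run on a stream of inlier samples, the basis aligns with the dominant direction; a grossly corrupted sample only moves it by a capped amount.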
Learning Structured Low-Rank Representation via Matrix Factorization
Cited by 1 (1 self)
Abstract A vast body of recent work in the literature has shown that exploring structures beyond data low-rankness can boost the performance of subspace clustering methods such as Low-Rank Representation (LRR). It has also been well recognized that the matrix factorization framework might offer more flexibility on pursuing underlying structures of the data. In this paper, we propose to learn structured LRR by factorizing the nuclear norm regularized matrix, which leads to our proposed non-convex formulation NLRR. Interestingly, this formulation of NLRR provides a general framework for unifying a variety of popular algorithms including LRR, dictionary learning, robust principal component analysis, sparse subspace clustering, etc. Several variants of NLRR are also proposed, for example, to promote sparsity while preserving low-rankness. We design a practical algorithm for NLRR and its variants, and establish theoretical guarantees for the stability of the solution and the convergence of the algorithm. Perhaps surprisingly, the computational and memory cost of NLRR can be reduced by roughly one order of magnitude compared to the cost of LRR. Experiments on extensive simulations and real datasets confirm the robustness and efficiency of NLRR and its variants.
Online low-rank subspace clustering by basis dictionary pursuit. arXiv preprint arXiv:1503.08356,
, 2015
Cited by 1 (1 self)
Abstract Low-Rank Representation (LRR) has been a significant method for segmenting data that are generated from a union of subspaces. It is also known that solving LRR is challenging in terms of time complexity and memory footprint, in that the size of the nuclear norm regularized matrix is n-by-n (where n is the number of samples). In this paper, we thereby develop a novel online implementation of LRR that reduces the memory cost from O(n^2) to O(pd), with p being the ambient dimension and d being some estimated rank (d < p ≪ n). We also establish the theoretical guarantee that the sequence of solutions produced by our algorithm converges to a stationary point of the expected loss function asymptotically. Extensive experiments on synthetic and realistic datasets further substantiate that our algorithm is fast, robust and memory efficient.
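The O(n^2)-to-O(pd) reduction in online methods of this kind rests on the standard variational characterization of the nuclear norm, ||X||_* = min over factorizations X = UV^T of (||U||_F^2 + ||V||_F^2)/2, which replaces the n-by-n regularized matrix with small factors. The snippet below checks this identity numerically; sizes and variable names are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
d, p, n = 3, 10, 50
# A rank-d matrix standing in for the nuclear-norm regularized variable.
X = rng.standard_normal((p, d)) @ rng.standard_normal((d, n))

nuc = np.linalg.norm(X, ord="nuc")   # direct nuclear norm via SVD

# The minimum of (||U||_F^2 + ||V||_F^2)/2 over X = U V^T is attained at
# U = A sqrt(S), V = B sqrt(S), where X = A S B^T is the SVD.
A, s, Bt = np.linalg.svd(X, full_matrices=False)
U = A @ np.diag(np.sqrt(s))
V = Bt.T @ np.diag(np.sqrt(s))
surrogate = 0.5 * (np.linalg.norm(U) ** 2 + np.linalg.norm(V) ** 2)

assert np.isclose(nuc, surrogate)    # identity holds to rounding error
```

Since U is p-by-d and V is n-by-d, an online solver only ever stores and updates these factors, never the full n-by-n matrix.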
Robust PCA with Partial Subspace Knowledge
Abstract—In recent work, robust Principal Components Analysis (PCA) has been posed as the problem of recovering a low-rank matrix L and a sparse matrix S from their sum, M := L + S, and a provably exact convex optimization solution called PCP has been proposed. This work studies the following problem. Suppose that we have partial knowledge about the column space of the low-rank matrix L. Can we use this information to improve the PCP solution, i.e., allow recovery under weaker assumptions? We propose here a simple but useful modification of the PCP idea, called modified-PCP, that allows us to use this knowledge. We derive a correctness result which shows that, when the available subspace knowledge is accurate, modified-PCP indeed requires significantly weaker incoherence assumptions than PCP. Extensive simulations are also used to illustrate this. Comparisons with PCP and other existing work are shown for a stylized real application as well. Finally, we explain how this problem naturally occurs in many applications involving time series data, i.e., in what is called the online or recursive robust PCA problem. A corollary for this case is also given.