Results 1–10 of 20
An online algorithm for separating sparse and low-dimensional signal sequences from their sum
IEEE Trans. Signal Process.
"... Abstract—This paper designs and extensively evaluates an online algorithm, called practical recursive projected compressive sensing (PracReProCS), for recovering a time sequence of sparse vectors and a time sequence of dense vectors from their sum, , when the ’s lie in a slowly changing lowdimens ..."
Abstract

Cited by 10 (8 self)
Abstract—This paper designs and extensively evaluates an online algorithm, called practical recursive projected compressive sensing (Prac-ReProCS), for recovering a time sequence of sparse vectors St and a time sequence of dense vectors Lt from their sum, Mt := St + Lt, when the Lt's lie in a slowly changing low-dimensional subspace of the full space. A key application where this problem occurs is real-time video layering, where the goal is to separate a video sequence, on the fly, into a slowly changing background sequence and a sparse foreground sequence consisting of one or more moving regions/objects. Prac-ReProCS is a practical modification of its theoretical counterpart, which was analyzed in our recent work. An extension to the undersampled case is also developed. Extensive experimental comparisons on both simulated and real videos demonstrate the advantage of the approach over existing batch and recursive methods. Index Terms—Online robust PCA, recursive sparse recovery, large but structured noise, compressed sensing.
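The projected-CS step at the heart of the ReProCS idea can be sketched in a few lines: given an estimate of the low-dimensional subspace (taken as exact here), projecting the measurement onto its orthogonal complement nulls Lt and leaves a compressed view of the sparse St. The sketch below is a single-frame toy, assuming the sparsity level k is known and substituting simple top-k support estimation plus least-squares debiasing for the paper's full recovery and subspace-update machinery:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, k = 60, 3, 3                     # ambient dim, subspace rank, sparsity level

# Ground truth for one frame m_t = s_t + l_t, with l_t in a low-dimensional subspace
P = np.linalg.qr(rng.standard_normal((n, r)))[0]   # orthonormal subspace basis
l_t = P @ rng.standard_normal(r)
s_t = np.zeros(n)
s_t[[5, 17, 42]] = [8.0, -6.0, 10.0]               # sparse "foreground" vector
m_t = s_t + l_t

# Projected-CS step: projecting onto the complement of the subspace estimate
# (exact here) nulls l_t and leaves a compressed view of s_t
Phi = np.eye(n) - P @ P.T
y = Phi @ m_t                                      # equals Phi @ s_t exactly here

# Support estimation (top-k entries, k assumed known) + least-squares debiasing
support = np.argsort(np.abs(y))[-k:]
s_hat = np.zeros(n)
s_hat[support] = np.linalg.lstsq(Phi[:, support], y, rcond=None)[0]
l_hat = m_t - s_hat                                # low-dimensional part by subtraction
```

Since y lies exactly in the span of the selected columns of Phi once the support is found, the least-squares step recovers s_t to numerical precision in this noiseless toy.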
Subspace learning and imputation for streaming big data matrices and tensors
IEEE Trans. Signal Process., 2015
"... Abstract—Extracting latent lowdimensional structure from highdimensional data is of paramount importance in timely inference tasks encountered with “Big Data ” analytics. However, increasingly noisy, heterogeneous, and incomplete datasets, as well as the need for realtime processing of streaming ..."
Abstract

Cited by 5 (2 self)
Abstract—Extracting latent low-dimensional structure from high-dimensional data is of paramount importance in timely inference tasks encountered with “Big Data” analytics. However, increasingly noisy, heterogeneous, and incomplete datasets, as well as the need for real-time processing of streaming data, pose major challenges to this end. In this context, the present paper brings the benefits of rank minimization to scalable imputation of missing data, by tracking low-dimensional subspaces and unraveling latent (possibly multi-way) structure from incomplete streaming data. For low-rank matrix data, a subspace estimator is proposed based on an exponentially weighted least-squares criterion regularized with the nuclear norm. After recasting the non-separable nuclear norm into a form amenable to online optimization, real-time algorithms with complementary strengths are developed, and their convergence is established under simplifying technical assumptions. In a stationary setting, the asymptotic estimates obtained offer the well-documented performance guarantees of the batch nuclear-norm-regularized estimator. Under the same unifying framework, a novel online (adaptive) algorithm is developed to obtain multi-way decompositions of low-rank tensors with missing entries and to perform imputation as a by-product. Simulated tests with synthetic as well as real Internet and cardiac magnetic resonance imaging (MRI) data confirm the efficacy of the proposed algorithms and their superior performance relative to state-of-the-art alternatives. Index Terms—Low rank, matrix and tensor completion, missing data, streaming analytics, subspace tracking.
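The general idea of tracking a subspace from incomplete streaming vectors, with imputation as a by-product, can be illustrated with a GROUSE-style stochastic-gradient sketch. The step size, observation rate, and plain SGD update below are illustrative assumptions, not the paper's exponentially weighted nuclear-norm estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
p, d, T = 40, 2, 300        # ambient dimension, subspace rank, stream length

# Stationary ground-truth subspace generating the stream
U_true = np.linalg.qr(rng.standard_normal((p, d)))[0]

U = np.linalg.qr(rng.standard_normal((p, d)))[0]   # running subspace estimate
mu = 0.1                                           # step size (illustrative choice)

for t in range(T):
    x = U_true @ rng.standard_normal(d)            # streamed data vector
    omega = rng.random(p) < 0.7                    # ~70% of entries observed
    # Fit coefficients from the observed entries only (least squares)
    w = np.linalg.lstsq(U[omega], x[omega], rcond=None)[0]
    resid = np.zeros(p)
    resid[omega] = x[omega] - U[omega] @ w         # residual on observed entries
    U = np.linalg.qr(U + mu * np.outer(resid, w))[0]   # gradient step, re-orthonormalize

# Imputation of a new incomplete vector as a by-product of the tracked subspace
x = U_true @ rng.standard_normal(d)
omega = rng.random(p) < 0.7
w = np.linalg.lstsq(U[omega], x[omega], rcond=None)[0]
x_hat = U @ w                                      # fills in the unobserved entries
```

After a few hundred noiseless updates the estimate aligns with the true subspace, so projecting a partially observed vector onto it recovers the missing entries well.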
Robust PCA with partial subspace knowledge
in IEEE Intl. Symp. on Information Theory (ISIT), 2014
"... AbstractIn recent work, robust Principal Components Analysis (PCA) has been posed as a problem of recovering a lowrank matrix L and a sparse matrix S from their sum, M := L + S and a provably exact convex optimization solution called PCP has been proposed. This work studies the following problem. ..."
Abstract

Cited by 5 (2 self)
Abstract—In recent work, robust Principal Components Analysis (PCA) has been posed as the problem of recovering a low-rank matrix L and a sparse matrix S from their sum, M := L + S, and a provably exact convex optimization solution called PCP has been proposed. This work studies the following problem. Suppose that we have partial knowledge about the column space of the low-rank matrix L. Can we use this information to improve the PCP solution, i.e., allow recovery under weaker assumptions? We propose here a simple but useful modification of the PCP idea, called modified-PCP, that allows us to use this knowledge. We derive a correctness result which shows that, when the available subspace knowledge is accurate, modified-PCP indeed requires significantly weaker incoherence assumptions than PCP. Extensive simulations are also used to illustrate this. Comparisons with PCP and other existing work are shown for a stylized real application as well. Finally, we explain how this problem naturally occurs in many applications involving time series data, i.e., in what is called the online or recursive robust PCA problem. A corollary for this case is also given.
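For reference, the PCP baseline that modified-PCP builds on can be prototyped with the standard inexact augmented-Lagrangian iteration: singular-value thresholding for L, entrywise soft thresholding for S. The choice λ = 1/√max(m, n) is the standard one from the PCP literature; the μ initialization, growth factor, and iteration budget below are heuristic assumptions:

```python
import numpy as np

def svt(X, tau):
    """Singular-value thresholding: prox of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def shrink(X, tau):
    """Entrywise soft thresholding: prox of tau * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def pcp(M, lam=None, tol=1e-7, max_iter=200):
    """Principal Component Pursuit via an inexact augmented-Lagrangian iteration."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))  # standard choice
    mu = 1.25 / np.linalg.norm(M, 2)    # initialization and growth are heuristics
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        Y = Y + mu * (M - L - S)
        mu = min(mu * 1.5, 1e7)
        if np.linalg.norm(M - L - S) < tol * np.linalg.norm(M):
            break
    return L, S

# Synthetic check: low-rank plus sparse, well within the PCP recovery regime
rng = np.random.default_rng(2)
m = n = 50
L0 = rng.standard_normal((m, 2)) @ rng.standard_normal((2, n))  # rank 2
S0 = np.zeros((m, n))
mask = rng.random((m, n)) < 0.05                                # 5% gross corruptions
S0[mask] = rng.choice([-5.0, 5.0], size=int(mask.sum()))
L_hat, S_hat = pcp(L0 + S0)
```

Modified-PCP changes this program by exploiting a known part of the column space of L, which is what weakens the required incoherence assumptions.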
PRACTICAL REPROCS FOR SEPARATING SPARSE AND LOW-DIMENSIONAL SIGNAL SEQUENCES FROM THEIR SUM – PART 1
"... This paper designs and evaluates a practical algorithm, called PracReProCS, for recovering a time sequence of sparse vectors St and a time sequence of dense vectors Lt from their sum, Mt: = St + Lt, when any subsequence of the Lt’s lies in a slowly changing lowdimensional subspace. A key applica ..."
Abstract

Cited by 3 (3 self)
This paper designs and evaluates a practical algorithm, called Prac-ReProCS, for recovering a time sequence of sparse vectors St and a time sequence of dense vectors Lt from their sum, Mt := St + Lt, when any subsequence of the Lt's lies in a slowly changing low-dimensional subspace. A key application where this problem occurs is video layering, where the goal is to separate a video sequence into a slowly changing background sequence and a sparse foreground sequence that consists of one or more moving regions/objects. Prac-ReProCS is the practical analog of its theoretical counterpart that was studied in our recent work. Index Terms—robust PCA, robust matrix completion, sparse recovery, compressed sensing
ROML: A robust feature correspondence approach for matching objects in a set of images
2014
"... Featurebased object matching is a fundamental problem for many applications in computer vision, such as object recognition, 3D reconstruction, tracking, and motion segmentation. In this work, we consider simultaneously matching object instances in a set of images,where both inlier and outlier feat ..."
Abstract

Cited by 3 (2 self)
Feature-based object matching is a fundamental problem in many computer vision applications, such as object recognition, 3D reconstruction, tracking, and motion segmentation. In this work, we consider simultaneously matching object instances in a set of images, where both inlier and outlier features are extracted. The task is to identify the inlier features and establish their consistent correspondences across the image set. This is a challenging combinatorial problem, and the problem complexity grows exponentially ...
Robust Stochastic Principal Component Analysis
"... We consider the problem of finding lower dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three n ..."
Abstract

Cited by 2 (1 self)
We consider the problem of finding lower-dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA: the stochastic power method, incremental PCA, and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sublinear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods, and the advantage of robust methods over their non-robust counterparts in the presence of outliers, in both artificial and real scenarios.
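The non-robust baseline the paper extends, the stochastic power method, is an Oja-style update. The sketch below pairs it with a crude residual-norm gate standing in for the paper's robust extensions; the step size, gate threshold, and outlier model are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
p, d, T = 30, 2, 2000       # dimension, target rank, number of streamed samples

U_true = np.linalg.qr(rng.standard_normal((p, d)))[0]
U = np.linalg.qr(rng.standard_normal((p, d)))[0]    # running estimate
eta = 0.05                                          # step size (illustrative)
gate = 4.0                                          # residual gate (illustrative)

for t in range(T):
    x = U_true @ rng.standard_normal(d) + 0.05 * rng.standard_normal(p)
    if rng.random() < 0.1:                          # 10% gross outliers
        x = 5.0 * rng.standard_normal(p)
    # Crude robustification: drop samples with a large out-of-subspace residual
    resid = x - U @ (U.T @ x)
    if np.linalg.norm(resid) > gate:
        continue
    # Oja / stochastic power-method update, then re-orthonormalize
    U = np.linalg.qr(U + eta * np.outer(x, U.T @ x))[0]
```

Without the gate, the heavy outliers would dominate the second-moment statistics and pull the estimate off the true subspace; with it, the plain power-method iteration converges on the inliers.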
Learning Structured Low-Rank Representation via Matrix Factorization
"... Abstract A vast body of recent works in the literature have shown that exploring structures beyond data lowrankness can boost the performance of subspace clustering methods such as LowRank Representation (LRR). It has also been well recognized that the matrix factorization framework might offer mo ..."
Abstract

Cited by 1 (1 self)
Abstract—A vast body of recent work has shown that exploiting structures beyond data low-rankness can boost the performance of subspace clustering methods such as Low-Rank Representation (LRR). It is also well recognized that the matrix factorization framework may offer more flexibility in pursuing the underlying structure of the data. In this paper, we propose to learn structured LRR by factorizing the nuclear-norm-regularized matrix, which leads to our proposed non-convex formulation NLRR. Interestingly, this formulation of NLRR provides a general framework that unifies a variety of popular algorithms, including LRR, dictionary learning, robust principal component analysis, sparse subspace clustering, etc. Several variants of NLRR are also proposed, for example, to promote sparsity while preserving low-rankness. We design a practical algorithm for NLRR and its variants, and establish theoretical guarantees for the stability of the solution and the convergence of the algorithm. Perhaps surprisingly, the computational and memory cost of NLRR can be reduced by roughly one order of magnitude compared to the cost of LRR. Experiments on extensive simulations and real datasets confirm the robustness and efficiency of NLRR and its variants.
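Factorizing a nuclear-norm-regularized matrix rests on the standard variational identity ‖Z‖_* = min over Z = U Vᵀ of ½(‖U‖_F² + ‖V‖_F²), attained by splitting the SVD symmetrically. A small numerical check of this identity:

```python
import numpy as np

rng = np.random.default_rng(4)
Z = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 10))   # rank-3 matrix

nuc = np.linalg.norm(Z, ord='nuc')                               # nuclear norm of Z

# An attaining factorization: split the SVD Z = A diag(s) B^T symmetrically
A, s, Bt = np.linalg.svd(Z, full_matrices=False)
U = A * np.sqrt(s)            # A @ diag(sqrt(s))
V = Bt.T * np.sqrt(s)         # B @ diag(sqrt(s)), so Z = U @ V.T
half_sum = 0.5 * (np.linalg.norm(U, 'fro')**2 + np.linalg.norm(V, 'fro')**2)
```

This is what lets a non-convex factorized formulation like NLRR stand in for the convex nuclear-norm program while working with much smaller factors.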
Online low-rank subspace clustering by basis dictionary pursuit. arXiv preprint arXiv:1503.08356, 2015
"... Abstract LowRank Representation (LRR) has been a significant method for segmenting data that are generated from a union of subspaces. It is also known that solving LRR is challenging in terms of time complexity and memory footprint, in that the size of the nuclear norm regularized matrix is nbyn ..."
Abstract

Cited by 1 (1 self)
Abstract—Low-Rank Representation (LRR) has been a significant method for segmenting data generated from a union of subspaces. It is also known that solving LRR is challenging in terms of time complexity and memory footprint, since the size of the nuclear-norm-regularized matrix is n-by-n (where n is the number of samples). In this paper, we therefore develop a novel online implementation of LRR that reduces the memory cost from O(n^2) to O(pd), with p being the ambient dimension and d being some estimated rank (d < p ≪ n). We also establish the theoretical guarantee that the sequence of solutions produced by our algorithm converges to a stationary point of the expected loss function asymptotically. Extensive experiments on synthetic and realistic datasets further substantiate that our algorithm is fast, robust, and memory-efficient.
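The memory saving is already visible in the noiseless closed form of LRR: with the data as its own dictionary, the minimizer of ‖Z‖_* subject to X = XZ is the shape-interaction matrix Z* = V Vᵀ from the skinny SVD of X, so the n-by-n representation never needs to be stored; the n-by-d factor carries the same information. A small sketch (single-subspace data for simplicity; the paper's basis-dictionary-pursuit machinery is not shown):

```python
import numpy as np

rng = np.random.default_rng(5)
p, d, n = 20, 3, 500          # ambient dimension, rank, number of samples

X = rng.standard_normal((p, d)) @ rng.standard_normal((d, n))    # rank-d data

# Noiseless LRR with the data as its own dictionary has the closed-form
# minimizer Z* = V V^T, where X = U S V^T is the skinny SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt[s > 1e-8 * s[0]].T     # n x d right factor (numerical rank cut)
Z = V @ V.T                   # n x n representation matrix, satisfying X = X Z

# The online reformulation's point: store the n x d factor V (O(nd) memory)
# instead of ever materializing the n x n matrix Z
```

Here V occupies n·d floats versus n² for Z, a factor of n/d, which is the kind of reduction the online algorithm achieves without forming Z at all.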
Robust PCA with Partial Subspace Knowledge
"... Abstract—In recent work, robust Principal Components Analysis (PCA) has been posed as a problem of recovering a lowrank matrix L and a sparse matrix S from their sum, M: = L + S and a provably exact convex optimization solution called PCP has been proposed. This work studies the following problem. ..."
Abstract
Abstract—In recent work, robust Principal Components Analysis (PCA) has been posed as the problem of recovering a low-rank matrix L and a sparse matrix S from their sum, M := L + S, and a provably exact convex optimization solution called PCP has been proposed. This work studies the following problem. Suppose that we have partial knowledge about the column space of the low-rank matrix L. Can we use this information to improve the PCP solution, i.e., allow recovery under weaker assumptions? We propose here a simple but useful modification of the PCP idea, called modified-PCP, that allows us to use this knowledge. We derive a correctness result which shows that, when the available subspace knowledge is accurate, modified-PCP indeed requires significantly weaker incoherence assumptions than PCP. Extensive simulations are also used to illustrate this. Comparisons with PCP and other existing work are shown for a stylized real application as well. Finally, we explain how this problem naturally occurs in many applications involving time series data, i.e., in what is called the online or recursive robust PCA problem. A corollary for this case is also given.