Results 1–10 of 21
Dynamic anomalography: Tracking network anomalies via sparsity and low rank
, 2013
Abstract

Cited by 24 (10 self)
In the backbone of large-scale networks, origin-to-destination (OD) traffic flows experience abrupt unusual changes known as traffic volume anomalies, which can result in congestion and limit the extent to which end-user quality-of-service requirements are met. As a means of maintaining a seamless end-user experience in dynamic environments, as well as for ensuring network security, this paper deals with a crucial network monitoring task termed dynamic anomalography. Given link traffic measurements (noisy superpositions of unobserved OD flows) periodically acquired by backbone routers, the goal is to construct an estimated map of anomalies in real time, and thus summarize the network ‘health state’ along both the flow and time dimensions. Leveraging the low intrinsic dimensionality of OD flows and the sparse nature of anomalies, a novel online estimator is proposed based on an exponentially-weighted least-squares criterion regularized with the sparsity-promoting ℓ1-norm of the anomalies, and the nuclear norm of the nominal traffic matrix. After recasting the non-separable nuclear norm into a form amenable to online optimization, a real-time algorithm for dynamic anomalography is developed and its convergence established under simplifying technical assumptions. For operational conditions where computational complexity reductions are at a premium, a lightweight stochastic gradient algorithm based on Nesterov’s acceleration technique is developed as well. Comprehensive numerical tests with both synthetic and real network data corroborate the effectiveness of the proposed online algorithms and their tracking capabilities, and demonstrate that they outperform state-of-the-art approaches developed to diagnose traffic anomalies.
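The recasting step this abstract mentions (turning the non-separable nuclear norm into something amenable to online optimization) typically rests on the factorization identity ‖X‖* = min over X = PQᵀ of ½(‖P‖F² + ‖Q‖F²). The NumPy sketch below is purely illustrative, not the authors' code; the names P, Q and the matrix sizes are my own choices. It checks the identity numerically on a random low-rank matrix:

```python
import numpy as np

# Illustrative check of the separable bound on the nuclear norm:
#   ||X||_* = min_{X = P Q^T} 0.5 * (||P||_F^2 + ||Q||_F^2),
# the reformulation that makes per-time-slot (online) updates tractable.
rng = np.random.default_rng(0)
m, n, r = 8, 6, 2
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r matrix

nuc = np.linalg.norm(X, ord="nuc")  # nuclear norm = sum of singular values

# A balanced factorization attains the bound: P = U sqrt(S), Q = V sqrt(S).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
P = U * np.sqrt(s)       # scale each left singular vector by sqrt(sigma_i)
Q = Vt.T * np.sqrt(s)    # scale each right singular vector by sqrt(sigma_i)
bound = 0.5 * (np.linalg.norm(P, "fro") ** 2 + np.linalg.norm(Q, "fro") ** 2)

print(abs(nuc - bound))  # should be ~0: the factored form matches the norm
```

Minimizing over P and Q of fixed small width replaces one coupled nuclear-norm term by two Frobenius-norm terms that separate across time slots.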
Recursive robust PCA or recursive sparse recovery in large but structured noise
 in IEEE Intl. Symp. on Information Theory (ISIT)
, 2013
Abstract

Cited by 22 (17 self)
This Dissertation is brought to you for free and open access by the Graduate College at Digital Repository @ Iowa State University. It has been accepted for inclusion in Graduate Theses and Dissertations by an authorized administrator of Digital Repository @ Iowa State University. For more information, please contact
An online algorithm for separating sparse and low-dimensional signal sequences from their sum
 IEEE Trans. Signal Process.
Abstract

Cited by 10 (8 self)
Abstract—This paper designs and extensively evaluates an online algorithm, called practical recursive projected compressive sensing (Prac-ReProCS), for recovering a time sequence of sparse vectors St and a time sequence of dense vectors Lt from their sum, Mt := St + Lt, when the Lt’s lie in a slowly changing low-dimensional subspace of the full space. A key application where this problem occurs is in real-time video layering, where the goal is to separate a video sequence, on-the-fly, into a slowly changing background sequence and a sparse foreground sequence that consists of one or more moving regions/objects. Prac-ReProCS is a practical modification of its theoretical counterpart, which was analyzed in our recent work. An extension to the undersampled case is also developed. Extensive experimental comparisons demonstrating the advantage of the approach for both simulated and real videos, over existing batch and recursive methods, are shown. Index Terms—Online robust PCA, recursive sparse recovery, large but structured noise, compressed sensing.
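The projection idea behind ReProCS-style methods can be caricatured in a few lines: given a basis estimate U for the slowly changing subspace, projecting the measurement onto the orthogonal complement of span(U) annihilates the dense part and leaves a projected sparse-recovery problem. The sketch below is my own toy construction, not the paper's algorithm; in particular it uses an oracle support set T in place of the compressive-sensing recovery step the paper actually performs:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 20, 3
U, _ = np.linalg.qr(rng.standard_normal((n, r)))  # orthonormal subspace basis

L = U @ rng.standard_normal(r)        # dense part, lies in span(U)
S = np.zeros(n)
S[2], S[11] = 5.0, -4.0               # sparse part with support {2, 11}
M = S + L                             # observed sum

P = np.eye(n) - U @ U.T               # projector onto span(U)-perp
y = P @ M                             # equals P @ S: L is annihilated

# Oracle-support stand-in for sparse recovery: solve y = P[:, T] s_T.
T = [2, 11]
S_T, *_ = np.linalg.lstsq(P[:, T], y, rcond=None)
print(S_T)                            # recovers the nonzero values of S
```

With noiseless data the projection removes Lt exactly, which is why the residual system determines St despite the "large but structured noise".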
Performance guarantees for undersampled recursive sparse recovery in large but structured noise (long version).” [Online]. Available: http://www.public.iastate.edu/%7Eblois/ReProModCSLong.pdf
Abstract

Cited by 8 (7 self)
Abstract—We study the problem of recursively reconstructing a time sequence of sparse vectors St from measurements of the form Mt = A St + B Lt, where A and B are known measurement matrices and Lt lies in a slowly changing low-dimensional subspace. We assume that the signal of interest, St, is sparse and has support which is correlated over time. We introduce a solution which we call Recursive Projected Modified Compressed Sensing (ReProMoCS), which exploits the correlated support change of St. We show that, under weaker assumptions than previous work, with high probability, ReProMoCS will exactly recover the support set of St, and the reconstruction error of St is upper bounded by a small time-invariant value. A motivating application where the above problem occurs is in functional MRI imaging of the brain, to detect regions that are “activated” in response to stimuli. In this case both measurement matrices are the same (i.e., A = B). The active region image constitutes the sparse vector St, and this region changes slowly over time. The background brain image changes are global, but the amount of change is very small, and hence it can be well modeled as lying in a slowly changing low-dimensional subspace; this constitutes Lt.
Recursive sparse recovery in large but structured noise – part 2,” arXiv:1211.3754 [cs.IT]
, 2013
Abstract

Cited by 7 (7 self)
Abstract—We study the problem of recursively recovering a time sequence of sparse vectors, St, from measurements Mt := St + Lt that are corrupted by structured noise Lt which is dense and can have large magnitude. The structure that we require is that Lt should lie in a low-dimensional subspace that is either fixed or changes “slowly enough”, and that the eigenvalues of its covariance matrix are “clustered”. We do not assume any model on the sequence of sparse vectors. Their support sets and their nonzero element values may be either independent or correlated over time (in many applications they are correlated). The only thing required is that there be some support change every so often. We introduce a novel solution approach called Recursive Projected Compressive Sensing with cluster-PCA (ReProCS-cPCA) that addresses some of the limitations of earlier work. Under mild assumptions, we show that, with high probability, ReProCS-cPCA can exactly recover the support set of St at all times, and that the reconstruction errors of both St and Lt are upper bounded by a time-invariant and small value.
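The "slowly changing low-dimensional subspace" model underlying this line of work implies that stacking recent dense vectors and taking their top singular vectors recovers the subspace. The sketch below is my own minimal illustration with a fixed, noiseless subspace; the actual cluster-PCA subspace update in ReProCS-cPCA is considerably more refined than the plain batch SVD used here:

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, frames = 30, 2, 50
U, _ = np.linalg.qr(rng.standard_normal((n, r)))  # true subspace basis

# Stack dense vectors l_t = U a_t as columns, then re-estimate the subspace
# from the top-r left singular vectors of the stacked matrix.
Lmat = U @ rng.standard_normal((r, frames))
Uhat = np.linalg.svd(Lmat, full_matrices=False)[0][:, :r]

# Subspace error || (I - Uhat Uhat^T) U ||_2 should be ~0 for noiseless data.
err = np.linalg.norm(U - Uhat @ (Uhat.T @ U), 2)
print(err)
```

Periodically redoing such an update on recent frames is what lets these recursive methods track a subspace that drifts slowly over time.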
Decentralized sparsity-regularized rank minimization: Algorithms and applications
 IEEE Trans. Signal Process.
, 2013
Abstract

Cited by 7 (5 self)
Abstract—Given a limited number of entries from the superposition of a low-rank matrix plus the product of a known compression matrix times a sparse matrix, recovery of the low-rank and sparse components is a fundamental task subsuming compressed sensing, matrix completion, and principal components pursuit. This paper develops algorithms for decentralized sparsity-regularized rank minimization over networks, when the nuclear and ℓ1-norms are used as surrogates to the rank and nonzero entry counts of the sought matrices, respectively. While nuclear-norm minimization has well-documented merits when centralized processing is viable, non-separability of the singular-value sum challenges its decentralized minimization. To overcome this limitation, leveraging an alternative characterization of the nuclear norm yields a separable, yet nonconvex, cost minimized via the alternating-direction method of multipliers. Interestingly, if the decentralized (nonconvex) estimator converges, under certain conditions it provably attains the global optimum of its centralized counterpart. As a result, this paper bridges the performance gap between centralized and in-network decentralized sparsity-regularized rank minimization. This, in turn, facilitates (stable) recovery of the low-rank and sparse model matrices through reduced-complexity per-node computations, and affordable message passing among single-hop neighbors. Several application domains are outlined to highlight the generality and impact of the proposed framework. These include unveiling traffic anomalies in backbone networks, and predicting network-wide path latencies. Simulations with synthetic and real network data confirm the convergence of the novel decentralized algorithm, and its centralized performance guarantees. Index Terms—Decentralized optimization, sparsity, nuclear norm, low rank, networks, Lasso, matrix completion.
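The ℓ1-norm surrogate for the nonzero entry count mentioned in this abstract is usually handled through its proximal operator, elementwise soft-thresholding, which is the standard sparse-update step inside ADMM iterations of this kind. A minimal sketch, illustrative only and not the decentralized algorithm itself:

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of lam * ||.||_1: shrink each entry toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

x = np.array([3.0, -0.4, 0.0, -2.5, 0.9])
print(soft_threshold(x, 1.0))  # small entries go to zero, large ones shrink
```

Because it acts entrywise, this prox step separates across network nodes, which is one reason ℓ1 regularization is friendly to decentralized optimization.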
SEPARATING SPARSE AND LOW-DIMENSIONAL SIGNAL SEQUENCES FROM TIME-VARYING UNDERSAMPLED PROJECTIONS OF THEIR SUMS
Abstract

Cited by 4 (3 self)
The goal of this work is to recover a sequence of sparse vectors, st, and a sequence of dense vectors, ℓt, that lie in a “slowly changing” low-dimensional subspace, from time-varying undersampled linear projections of their sum. This type of problem typically occurs when the quantity being imaged can be split into a sum of two layers, one of which is sparse and the other low-dimensional. A key application where this problem occurs is in undersampled functional magnetic resonance imaging (fMRI) to detect brain activation patterns in response to a stimulus. The brain image at time t can be modeled as the sum of the active region image, st (equal to the activation in the active region and zero everywhere else), and the background brain image, ℓt, which can be accurately modeled as lying in a slowly changing low-dimensional subspace. We introduce a novel solution approach called matrix completion projected compressive sensing, or MatComProCS. Significantly improved performance of MatComProCS over existing work is shown for the undersampled fMRI-based brain active region detection problem. Index Terms — matrix completion, compressive sensing, fMRI
PRACTICAL REPROCS FOR SEPARATING SPARSE AND LOW-DIMENSIONAL SIGNAL SEQUENCES FROM THEIR SUM – PART 1
Abstract

Cited by 3 (3 self)
This paper designs and evaluates a practical algorithm, called Prac-ReProCS, for recovering a time sequence of sparse vectors St and a time sequence of dense vectors Lt from their sum, Mt := St + Lt, when any subsequence of the Lt’s lies in a slowly changing low-dimensional subspace. A key application where this problem occurs is in video layering, where the goal is to separate a video sequence into a slowly changing background sequence and a sparse foreground sequence that consists of one or more moving regions/objects. Prac-ReProCS is the practical analog of its theoretical counterpart that was studied in our recent work. Index Terms — robust PCA, robust matrix completion, sparse recovery, compressed sensing
Semi-blind source separation via sparse representations and online dictionary learning
 In Proc. Asilomar Conf. on Signals, Systems, and Computers
, 2013
Abstract

Cited by 2 (1 self)
This work examines a semi-blind source separation problem where the aim is to separate one source, whose local (nominally periodic) structure is partially or approximately known, from another a priori unspecified but structured source, given only a single linear combination of the two sources. We propose a novel separation technique based on local sparse approximations; a key feature of our procedure is the online learning of dictionaries (using only the data itself) which sparsely model the a priori unknown source. We demonstrate the performance of our proposed approach via simulation on a stylized audio source separation problem. Index Terms — Semi-blind source separation, sparse representations, online dictionary learning
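The local sparse approximations this abstract builds on can be illustrated with greedy sparse coding over a fixed dictionary. The sketch below is my own toy example: the cosine atoms and the matching-pursuit routine are illustrative stand-ins, whereas the paper learns its dictionaries online from the data itself:

```python
import numpy as np

def matching_pursuit(y, D, k):
    """Greedy k-step sparse approximation of y over unit-norm dictionary D."""
    r = y.copy()
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        j = np.argmax(np.abs(D.T @ r))  # atom most correlated with residual
        c = D[:, j] @ r
        coef[j] += c
        r -= c * D[:, j]                # deflate the residual
    return coef, r

n = 32
t = np.arange(n)
# Toy dictionary: unit-norm cosine atoms at distinct integer frequencies
D = np.stack([np.cos(2 * np.pi * f * t / n) for f in range(1, 9)], axis=1)
D /= np.linalg.norm(D, axis=0)

y = 2.0 * D[:, 1] - 1.5 * D[:, 5]      # exactly 2-sparse in this dictionary
coef, resid = matching_pursuit(y, D, 2)
print(np.flatnonzero(coef), np.linalg.norm(resid))
```

For a signal that really is sparse in the dictionary, two greedy steps recover the correct atoms and drive the residual to (numerically) zero, which is the property the separation procedure exploits locally.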
Recursive projected modified compressed sensing for undersampled measurements,” http://www.public.iastate.edu/~blois/ReProModCSLong.pdf
Abstract

Cited by 1 (1 self)
Abstract—We study the problem of recursively reconstructing a time sequence of sparse vectors St from measurements of the form Mt = A St + B Lt, where A and B are known measurement matrices and Lt lies in a slowly changing low-dimensional subspace. We assume that the signal of interest, St, is sparse and has support which is correlated over time. We introduce a solution which we call Recursive Projected Modified Compressed Sensing (ReProMoCS), which exploits the correlated support change of St. We show that, under weaker assumptions than previous work, with high probability, ReProMoCS will exactly recover the support set of St, and the reconstruction error of St is upper bounded by a small time-invariant value. A motivating application where the above problem occurs is in functional MRI imaging of the brain, to detect regions that are “activated” in response to stimuli. In this case both measurement matrices are the same (i.e., A = B). The active region image constitutes the sparse vector St, and this region changes slowly over time. The background brain image changes are global, but the amount of change is very small, and hence it can be well modeled as lying in a slowly changing low-dimensional subspace; this constitutes Lt.