
## Simultaneous codeword optimization (SimCO) for dictionary update and learning (2012)

Venue: IEEE Trans. Signal Process.

Citations: 14 (8 self)

### Citations

4185 | Regression shrinkage and selection via the lasso
- Tibshirani
- 1996
Citation Context: ...of algorithms including basis pursuit (BP) [22], matching pursuit (MP) [23], orthogonal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOCUSS) [29], and gradient pursuit (GP) [30]. Sparse decompositions of a signal, however, rely greatly on the degree of fitting between the data and the diction...

3302 | Numerical optimization
- Nocedal, Wright
- 2006
Citation Context: ...means optimal. Other ways to do a gradient descent efficiently can be found in [64, Chapter 3]. Algorithm 2 looks more complicated than popular gradient descent methods in standard textbooks, e.g., [64]. We choose this implementation because it mimics the ideal gradient descent with infinitesimal steps more authentically than other optimization methods, of which the step size may be so large that loc...
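The footnote's point about step size can be seen on the one-dimensional toy objective f(x) = x² (an illustrative sketch, not the paper's Algorithm 2): a small fixed step behaves like the ideal descent with infinitesimal steps, while a too-large step makes the iterates overshoot and diverge.

```python
def gradient_descent(grad, x0, step, iters=50):
    """Plain fixed-step gradient descent: x <- x - step * grad(x)."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

# f(x) = x^2, so grad f(x) = 2x; the unique minimizer is x = 0.
grad = lambda x: 2 * x
x_small = gradient_descent(grad, x0=1.0, step=0.1)  # |x| shrinks toward 0
x_large = gradient_descent(grad, x0=1.0, step=1.1)  # |x| grows every step
```

With step 0.1 each iteration multiplies x by 0.8; with step 1.1 it multiplies x by -1.2, so the iterates oscillate with growing magnitude.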

2707 | Atomic decomposition by basis pursuit
- Chen, Donoho, et al.
Citation Context: ...is, to find the sparse linear decompositions of a signal for a given dictionary. Efforts dedicated to this problem have resulted in the creation of a number of algorithms including basis pursuit (BP) [22], matching pursuit (MP) [23], orthogonal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOC...

1665 | Matching pursuits with time-frequency dictionaries
- Mallat, Zhang
- 1993
Citation Context: ...r decompositions of a signal for a given dictionary. Efforts dedicated to this problem have resulted in the creation of a number of algorithms including basis pursuit (BP) [22], matching pursuit (MP) [23], orthogonal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOCUSS) [29], and gradient purs...

1386 | Decoding by linear programming
- Candès, Tao
Citation Context: ...edure. In the sparse coding stage, the goal is to find a sparse representation that minimizes the approximation error for a given dictionary. In practice, the sparse coding problem is often approximately solved by using either ℓ1-minimization [61] or greedy algorithms, for example, the OMP [25] and SP [26] algorithms. Algorithm 1: A Typical Dictionary Learning Algorithm Ta...

1302 | Emergence of simple-cell receptive field properties by learning a sparse code for natural images
- Olshausen, Field
- 1996
Citation Context: ...nents, called codewords or atoms, that are chosen from a dictionary (i.e., the whole collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Tw...

955 | Sparse coding with an overcomplete basis set: a strategy employed by V1?, Vision Res. 37
- Olshausen, Field
- 1997
Citation Context: ...[34]. Such dictionaries are relatively easy to obtain and more suitable for generic signals. In learning-based approaches, however, the dictionaries are adapted from a set of training data [5], [10], [35]–[42]. Although this may involve higher computational complexity, learned dictionaries have the potential to offer improved performance as compared with predefined dictionaries, since the atoms are de...

934 | Robust face recognition via sparse representation
- Wright, Yang, et al.
Citation Context: ...have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representations. The first one is sparse coding, that is, to find t...

928 | K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation
- Aharon, Elad, et al.
- 2006
Citation Context: ...tionary (i.e., the whole collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or joi...

796 | Signal recovery from random measurements via orthogonal matching pursuit
- Tropp, Gilbert
- 2007
Citation Context: ...onary. Efforts dedicated to this problem have resulted in the creation of a number of algorithms including basis pursuit (BP) [22], matching pursuit (MP) [23], orthogonal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOCUSS) [29], and gradient pursuit (GP) [30]. Sparse decompositions of a sign...

763 | CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Needell, Tropp
- 2009
Citation Context: ...roblem have resulted in the creation of a number of algorithms including basis pursuit (BP) [22], matching pursuit (MP) [23], orthogonal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOCUSS) [29], and gradient pursuit (GP) [30]. Sparse decompositions of a signal, however, rely greatly on the d...

636 | The geometry of algorithms with orthogonality constraints
- Edelman, Arias, et al.
- 1999
Citation Context: ...) relies on the notion of Stiefel and Grassmann manifolds; the notations follow the convention in [62], [63]. Note that each element of the Stiefel manifold is a unit-norm vector, while each element of the Grassmann manifold is a one-dimensional subspace. Any given unit-norm vector generates a one-dimensional subspace U; meanwhile, any given U c...
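The excerpt's distinction, unit-norm vectors as Stiefel points versus one-dimensional subspaces as Grassmann points, can be checked numerically (a small illustration with an arbitrary dimension, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(4)
u /= np.linalg.norm(u)  # a unit-norm vector: a point on the Stiefel manifold

# u and -u are distinct unit vectors, yet they span the same one-dimensional
# subspace, i.e. the same point on the Grassmann manifold. A sign-invariant
# representative of that subspace is the orthogonal projector u u^T.
P_plus = np.outer(u, u)
P_minus = np.outer(-u, -u)
assert np.allclose(P_plus, P_minus)  # same subspace, different unit vectors
```

The projector is rank one with unit trace, which is exactly the "one-dimensional subspace" the excerpt refers to.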

631 | Orthogonal Matching Pursuit: Recursive function approximation with applications to wavelet decomposition - Pati, Rezaiifar, et al. - 1993

598 | Image denoising via sparse and redundant representations over learned dictionary
- Elad, Aharon
- 2006
Citation Context: ...ctionary learning performance by combining the dictionary update and sparse coding stages. For sparse coding, we adopt the OMP algorithm [25] as it has been used for testing the K-SVD method in [10], [67]. The overall dictionary learning procedure is given in Algorithm 1. We refer to the iterations between sparse coding and dictionary learning stages as outer-iterations, and the iterations within the ...
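The outer/inner iteration structure this excerpt describes can be sketched as an alternating loop. The sketch below is a simplified stand-in, not the paper's SimCO update: sparse coding is a toy OMP with a fixed sparsity level k, and the dictionary update is a plain normalized gradient step rather than SimCO's manifold-based update.

```python
import numpy as np

def sparse_code(D, Y, k):
    """Toy OMP: for each column of Y, greedily pick k atoms of D and
    least-squares fit the coefficients on the selected support."""
    X = np.zeros((D.shape[1], Y.shape[1]))
    for j in range(Y.shape[1]):
        y = Y[:, j]
        r, support = y.copy(), []
        for _ in range(k):
            support.append(int(np.argmax(np.abs(D.T @ r))))
            coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
            r = y - D[:, support] @ coef
        X[support, j] = coef
    return X

def dictionary_learning(Y, K, k, outer_iters=10, lr=0.1):
    """Outer iterations: alternate the sparse coding stage and a
    dictionary update that keeps every codeword unit-norm."""
    rng = np.random.default_rng(0)
    D = rng.standard_normal((Y.shape[0], K))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(outer_iters):
        X = sparse_code(D, Y, k)          # sparse coding stage
        D -= lr * (D @ X - Y) @ X.T       # gradient of 0.5 * ||Y - DX||_F^2
        D /= np.linalg.norm(D, axis=0)    # re-normalize the codewords
    X = sparse_code(D, Y, k)              # final coefficients for learned D
    return D, X
```

Iterations of the outer loop are the "outer-iterations" of the excerpt; the greedy atom-selection passes inside `sparse_code` (and, in SimCO itself, the dictionary-update steps) play the role of the inner iterations.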

513 | The contourlet transform: an efficient directional multiresolution image representation
- Do, Vetterli
Citation Context: ...l approach generates the dictionary based on a predefined mathematical transform, such as discrete Fourier transform (DFT), discrete cosine transform (DCT), wavelets [31], curvelets [32], contourlets [33], and bandelets [34]. Such dictionaries are relatively easy to obtain and more suitable for generic signals. In learning-based approaches, however, the dictionaries are adapted from a set of training ...

394 | Curvelets - a surprisingly effective nonadaptive representation for objects with edges, Curves and
- Candès, Donoho
- 1999
Citation Context: ...ach. The analytical approach generates the dictionary based on a predefined mathematical transform, such as discrete Fourier transform (DFT), discrete cosine transform (DCT), wavelets [31], curvelets [32], contourlets [33], and bandelets [34]. Such dictionaries are relatively easy to obtain and more suitable for generic signals. In learning-based approaches, however, the dictionaries are adapted from ...

365 | Sparse signal reconstruction from limited data using focuss: A re-weighted norm minimization algorithm
- Gorodnitsky, Rao
- 1997
Citation Context: ...ching pursuit (MP) [23], orthogonal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOCUSS) [29], and gradient pursuit (GP) [30]. Sparse decompositions of a signal, however, rely greatly on the degree of fitting between the data and the dictionary, which leads to the second problem, i.e., the is...

353 | Learning overcomplete representations
- Lewicki, Sejnowski
Citation Context: ...ed. I. INTRODUCTION Sparse signal representations have recently received extensive research interests across several communities including signal processing, information theory, and optimization [1], [2], [3], [4]. The basic assumption underlying this technique is that a natural signal can be approximated by the combination of only a small number of elementary components, called codewords or atoms, t...

334 | Sparse representations in unions of bases
- Gribonval, Nielsen
- 2003
Citation Context: ...the desired structures of the dictionaries, such as the translation-invariant or shift-invariant characteristics of the atoms imposed in [49]–[53], the orthogonality between subspaces enforced in [54], and the de-correlation between the atoms promoted in [55]. An advantage of a parametric dictionary lies in its potential for reducing the number of free parameters, thereby leading to a more efficie...

326 | Online learning for matrix factorization and sparse coding
- Mairal, Bach, et al.
Citation Context: ...tation and better convergence of dictionary learning algorithms [43]. Other recent efforts in dictionary learning include the search for robust and computationally efficient algorithms, such as [56], [57], and [11], and learning dictionaries from multimodal data [58], [59]. Comprehensive reviews of dictionary learning algorithms can be found in recent survey papers, e.g., [43] and [60]. In this paper, s...

294 | Optimization Algorithms on Matrix Manifolds - Absil, Mahony, et al. - 2008

288 | Subspace pursuit for compressive sensing signal reconstruction
- Dai, Milenkovic
- 2009
Citation Context: ...this problem have resulted in the creation of a number of algorithms including basis pursuit (BP) [22], matching pursuit (MP) [23], orthogonal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOCUSS) [29], and gradient pursuit (GP) [30]. Sparse decompositions of a signal, however, rely greatly on...

272 | Blind source separation by sparse decomposition in a signal dictionary
- Zibulevsky, Pearlmutter
- 2001
Citation Context: ...toms, that are chosen from a dictionary (i.e., the whole collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been...

270 | The dual-tree complex wavelet transform.
- Selesnick, Baraniuk, et al.
- 2005
Citation Context: ...ning-based approach. The analytical approach generates the dictionary based on a predefined mathematical transform, such as discrete Fourier transform (DFT), discrete cosine transform (DCT), wavelets [31], curvelets [32], contourlets [33], and bandelets [34]. Such dictionaries are relatively easy to obtain and more suitable for generic signals. In learning-based approaches, however, the dictionaries a...

193 | Sparse geometric image representation with bandelets
- Le Pennec, Mallat
Citation Context: ...the dictionary based on a predefined mathematical transform, such as discrete Fourier transform (DFT), discrete cosine transform (DCT), wavelets [31], curvelets [32], contourlets [33], and bandelets [34]. Such dictionaries are relatively easy to obtain and more suitable for generic signals. In learning-based approaches, however, the dictionaries are adapted from a set of training data [5], [10], [35]...

190 | Forming sparse representations by local anti-hebbian learning.
- Foldiak
- 1990
Citation Context: ...g speed. I. INTRODUCTION Sparse signal representations have recently received extensive research interests across several communities including signal processing, information theory, and optimization [1], [2], [3], [4]. The basic assumption underlying this technique is that a natural signal can be approximated by the combination of only a small number of elementary components, called codewords or ato...

183 | Dictionary learning algorithms for sparse representation
- Kreutz-Delgado, Murray, et al.
Citation Context: ...dant dictionary. The sparse approximation step in the ML algorithm [5] which involves probabilistic inference is computationally expensive. In a similar probabilistic framework, Kreutz-Delgado et al. [37] proposed a maximum a posteriori (MAP) dictionary learning algorithm, where the maximization of the likelihood fu...

168 | Discriminative learned dictionaries for local image analysis
- Mairal, Bach, et al.
- 2008
Citation Context: ...Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representations. The first one is sparse co...

149 | Sparse Bayesian learning for basis selection - Wipf, Rao - 2004

107 | Dictionaries for Sparse Representation Modeling
- Rubinstein, Bruckstein, et al.
Citation Context: ...optimal directions (MOD), in which a closed-form solution for the dictionary update has been proposed. This method is one of the earliest methods that implements the concept of the sparsification process [43]. Several variants of this algorithm, such as the iterative least squares (ILS) method, have also been developed, which were summarized in [44]. A recursive least squares (RLS) dictionary learning algo...

82 | Gradient pursuits.
- Blumensath, Davies
- 2008
Citation Context: ...nal matching pursuit (OMP) [24], [25], subspace pursuit (SP) [26], [27], regression shrinkage and selection (LASSO) [28], focal under-determined system solver (FOCUSS) [29], and gradient pursuit (GP) [30]. Sparse decompositions of a signal, however, rely greatly on the degree of fitting between the data and the dictionary, which leads to the second problem, i.e., the issue of dictionary design. An ove...

77 | Sparse representation for signal classification.
- Huang, Aviyente
- 2007
Citation Context: ...ords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representations. The first one is spa...

73 | Method of optimal directions for frame design
- Engan, Aase, et al.
- 1999
Citation Context: ...ed by the maximization of the posterior probability that a given signal can be synthesized by a dictionary and the sparse coefficients. Based on the same ML objective function as in [5], Engan et al. [36] developed a more efficient algorithm, called the method of optimal directions (MOD), in which a closed-form solution for the dictionary update has been proposed. This method is one of the earliest me...

65 | Quantization bounds on Grassmann manifolds and applications to MIMO communications, Information Theory
- Dai, Liu, et al.
- 2008
Citation Context: ...rom the uniform distribution on the Stiefel manifold. Define the set of "bad" starting points as the set containing all unit vectors that are orthogonal to the target. According to [68], under the uniform measure on the Stiefel manifold, this set has measure zero. As a result, the starting point avoids it with probability one. The reason that we refer to this as the set of "bad" starting points is explained by t...

48 | Topics in Sparse Approximation - Tropp - 2004

47 | Dictionary learning for sparse approximations with majorization method
- Yaghoobi, Blumensath, et al.
- 2009
Citation Context: ...ion approach to MOD, but updates the dictionary on an atom-by-atom basis, without having to compute matrix inversion as required in the original MOD algorithm. The majorization method was proposed by [46], in which the original objective function is substituted by a surrogate function in each step of the optimization process. In contrast to the generic dictionaries described above, learning structure-o...

46 | Sparse and shift-invariant representations of music - Blumensath, Davies

42 | Learning sparse multiscale image representations,”
- Sallee, Olshausen
- 2002
Citation Context: ...g structure-oriented parametric dictionaries has also attracted attention. For example, a Gammatone generating function has been used by Yaghoobi et al. [47] to learn dictionaries from audio data. In [48], a pyramidal wavelet-like transform was proposed to learn a multiscale structure in the dictionary. Other constraints have also been considered in the learning process to favor the desired structures...

37 | Sparse representations in audio and music: from coding to source separation
- Plumbley, Blumensath, et al.
- 2010
Citation Context: ...tion of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representations....

35 | Sparse decomposition of stereo signals with matching pursuit and application to blind separation of more than two sources from a stereo mixture
- Gribonval
- 1993
Citation Context: ...that are chosen from a dictionary (i.e., the whole collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been stud...

35 | Sparse and shiftinvariant feature extraction from non-negative data
- Smaragdis, Raj, et al.
- 2008
Citation Context: ...ave also been considered in the learning process to favor the desired structures of the dictionaries, such as the translation-invariant or shift-invariant characteristics of the atoms imposed in [49]–[53], the orthogonality between subspaces enforced in [54], and the de-correlation between the atoms promoted in [55]. An advantage of a parametric dictionary lies in its potential for reducing the num...

33 | MoTIF: an efficient algorithm for learning translation invariant dictionaries
- Jost, Vandergheynst, et al.
- 2006
Citation Context: ...anslation-invariant or shift-invariant characteristics of the atoms imposed in [49]–[53], the orthogonality between subspaces enforced in [54], and the de-correlation between the atoms promoted in [55]. An advantage of a parametric dictionary lies in its potential for reducing the number of free parameters, thereby leading to a more efficient implementation and better convergence of dictionary lear...

28 | Dictionary identification - sparse matrix factorisation via ℓ1 minimisation - Gribonval, Schnass - 2010

26 | Sparse and redundant modeling of image content using an image-signature-dictionary - Aharon, Elad - 2008

25 | Shift-invariant dictionary learning for sparse representations: extending K-SVD - Mailhé, Lesage, et al.

25 | Blind audiovisual source separation based on sparse redundant representations.
- Casanovas, Monaci, et al.
- 2010
Citation Context: ...Other recent efforts in dictionary learning include the search for robust and computationally efficient algorithms, such as [56], [57], and [11], and learning dictionaries from multimodal data [58], [59]. Comprehensive reviews of dictionary learning algorithms can be found in recent survey papers, e.g., [43] and [60]. In this paper, similar to MOD and K-SVD methods, we focus on the dictionary update step for gen...


22 | Numerical Recipes: The Art of Scientific Computing
- Press, Teukolsky, et al.
Citation Context: ...tational details of the gradient and line search path are presented in Sections V-B and V-C respectively, where the parameter is a step size. For proof-of-concept, we use the method of golden section search (see [65] for a detailed description). The idea is to use the golden ratio to successively narrow the search range inside which a local minimum exists. To implement this idea, we design a two-step procedure...
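The golden-section procedure this excerpt describes, successively narrowing a bracket that contains a local minimum using the golden ratio, can be sketched as follows (the quadratic objective stands in for the actual line-search function, and the tolerance is illustrative):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Locate a local minimum of f on [a, b] by golden-section search.

    Each iteration shrinks the bracket [a, b] by the factor
    1/phi ~ 0.618, so convergence is linear.
    """
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.6180339887
    c = b - inv_phi * (b - a)         # left interior point
    d = a + inv_phi * (b - a)         # right interior point
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                   # minimum lies in [a, d]
            b, d, fd = d, c, fc       # reuse c as the new right point
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                         # minimum lies in [c, b]
            a, c, fc = c, d, fd       # reuse d as the new left point
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Example: the minimum of (t - 2)^2 on [0, 5] is at t = 2.
t_star = golden_section_search(lambda t: (t - 2.0) ** 2, 0.0, 5.0)
```

Because each new bracket reuses one interior evaluation, only one fresh evaluation of f is needed per iteration, which suits expensive line-search objectives.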

20 | Parametric dictionary design for sparse coding
- Yaghoobi, Daudet, et al.
- 2009
Citation Context: ...e generic dictionaries described above, learning structure-oriented parametric dictionaries has also attracted attention. For example, a Gammatone generating function has been used by Yaghoobi et al. [47] to learn dictionaries from audio data. In [48], a pyramidal wavelet-like transform was proposed to learn a multiscale structure in the dictionary. Other constraints have also been considered in the l...

19 | A family of iterative LS-based dictionary learning algorithms, ILS-DLA, for sparse signal representation
- Engan, Skretting, et al.
- 2005
Citation Context: ...hods that implements the concept of the sparsification process [43]. Several variants of this algorithm, such as the iterative least squares (ILS) method, have also been developed, which were summarized in [44]. A recursive least squares (RLS) dictionary learning algorithm was recently presented in [45] where the dictionary is continuously updated as each training vector is being processed, which is differe...

14 | Greedy dictionary selection for sparse representation, Selected Topics in Signal Processing
- Cevher, Krause
- 2011
Citation Context: ...sful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representations. The first one is sparse coding, that is, to find the sparse linear ...

14 | Dictionary learning for L1-exact sparse coding, Lect - Plumbley - 2007

11 | Semantic coding by supervised dimensionality reduction
- Kokiopoulou, Frossard
- 2008
Citation Context: ...collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representa...

11 | Learning bimodal structure in audio-visual data,”
- Monaci, Vandergheynst, et al.
- 2009
Citation Context: ...[43]. Other recent efforts in dictionary learning include the search for robust and computationally efficient algorithms, such as [56], [57], and [11], and learning dictionaries from multimodal data [58], [59]. Comprehensive reviews of dictionary learning algorithms can be found in recent survey papers, e.g., [43] and [60]. In this paper, similar to MOD and K-SVD methods, we focus on the dictionary update step ...

11 | A geometric approach to low-rank matrix completion
- Dai, Kerman, et al.
- 2012
Citation Context: ...es on the notion of Stiefel and Grassmann manifolds; the notations follow the convention in [62], [63]. Note that each element of the Stiefel manifold is a unit-norm vector, while each element of the Grassmann manifold is a one-dimensional subspace. Any given unit-norm vector generates a one-dimensional subspace U; meanwhile, any given U can be ...

10 | Audio inpainting
- Adler, Emiya, et al.
- 2012
Citation Context: ...pplications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representations. The first one is sparse coding, that is, to find the sparse linear decomp...

10 | Learning transformational invariants from time-varying natural images
- Cadieu, Olshausen
Citation Context: ...nts have also been considered in the learning process to favor the desired structures of the dictionaries, such as the translation-invariant or shift-invariant characteristics of the atoms imposed in [49]–[53], the orthogonality between subspaces enforced in [54], and the de-correlation between the atoms promoted in [55]. An advantage of a parametric dictionary lies in its potential for reducing th...

9 | On the local correctness of L-1 minimization for dictionary learning, arXiv:1101.5672 - Geng, Wang, et al. - 2011

8 | Applications of sparse representation and compressive sensing
- Baraniuk, Candès, et al.
- 2010
Citation Context: ...source separation [7]–[9], signal denoising [10], [11], coding [12]–[14], classification [15]–[17], recognition [18], inpainting [19], [20] and many more (see e.g., [21]). Two related problems have been studied either separately or jointly in sparse representations. The first one is sparse coding, that is, to find the sparse linear decompositions of a signal for a gi...

7 | Learning real and complex overcomplete representations from the statistics of natural images
- Olshausen, Cadieu, et al.
- 2009
Citation Context: ...RODUCTION Sparse signal representations have recently received extensive research interests across several communities including signal processing, information theory, and optimization [1], [2], [3], [4]. The basic assumption underlying this technique is that a natural signal can be approximated by the combination of only a small number of elementary components, called codewords or atoms, that are ch...

7 | Fast dictionary learning for sparse representations of speech signals
- Jafari, Plumbley
- 2011
Citation Context: ...source separation [7]–[9], signal denoising [10], [11], coding [12]–[14], classification [15]–[17], recognition [18], inpainting [19], [20] and many more (see e.g., [21]). Two related problems have been studied either separately or jointly in sparse repr...

7 | Dictionary design for matching pursuit and application to motion-compensated video coding
- Schmid-Saugeon, Zakhor
- 2004
Citation Context: ...whole collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse repr...


6 | Dictionary learning: what is the right representation for my signal
- Tošić, Frossard
- 2011
Citation Context: ...s, such as [56], [57], and [11], and learning dictionaries from multimodal data [58], [59]. Comprehensive reviews of dictionary learning algorithms can be found in recent survey papers, e.g., [43] and [60]. In this paper, similar to MOD and K-SVD methods, we focus on the dictionary update step for generic dictionary learning. A novel optimization framework is proposed, where an arbitrary subset of the codewords are ...

5 | Methods for learning adaptive dictionary in underdetermined speech separation
- Xu, Wang
- 2011
Citation Context: ...are chosen from a dictionary (i.e., the whole collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied e...

3 | Dictionary learning for stereo image representation
- Tošić, Frossard
- 2011
Citation Context: ..., called codewords or atoms, that are chosen from a dictionary (i.e., the whole collection of all the codewords). Sparse representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two rel...

2 | Robust and fast learning of sparse codes with stochastic gradient descent
- Labusch, Barth, et al.
- 2011
Citation Context: ...plementation and better convergence of dictionary learning algorithms [43]. Other recent efforts in dictionary learning include the search for robust and computationally efficient algorithms, such as [56], [57], and [11], and learning dictionaries from multimodal data [58], [59]. Comprehensive reviews of dictionary learning algorithms can be found in recent survey papers, e.g., [43] and [60]. In this pa...

1 | A union of incoherent spaces model for classification
- Schnass, Vandergheynst
Citation Context: ...e representations have found successful applications in data interpretation [5], [6], source separation [7], [8], [9], signal denoising [10], [11], coding [12], [13], [14], classification [15], [16], [17], recognition [18], inpainting [19], [20] and many more (see e.g. [21]). Two related problems have been studied either separately or jointly in sparse representations. The first one is sparse coding, ...

1 | Speech denoising based on a greedy adaptive dictionary algorithm - Jafari, Plumbley

1 | Recursive least squares dictionary learning algorithm
- Skretting, Engan
- 2010
Citation Context: ...ithm, such as the iterative least squares (ILS) method, have also been developed, which were summarized in [44]. A recursive least squares (RLS) dictionary learning algorithm was recently presented in [45] where the dictionary is continuously updated as each training vector is being processed, which is different from the ILS dictionary learning method. Aharon, Elad and Bruckstein developed the K-SVD al...