
## Sparse representation for signal classification (2006)

Venue: Adv. NIPS

Citations: 78 (0 self)

### Citations

2679 | Atomic decomposition by basis pursuit.
- Chen, Donoho, et al.
- 1999
Citation Context ... Research has focused on three aspects of the sparse representation: pursuit methods for solving the optimization problem, such as matching pursuit [1], orthogonal matching pursuit [2], basis pursuit [3], LARS/homotopy methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tasks, such as signal separation, denoising, coding, im...

2250 | Eigenfaces vs. fisherfaces: Recognition using class specific linear projection.
- Belhumeur, Hespanha, et al.
- 1997
Citation Context ...signals from different classes. While both methods have broad applications in classification, the discriminative methods have often outperformed the reconstructive methods for the classification task [15, 16]. However, this comparison between the two types of methods assumes that the signals being classified are ideal, i.e., noiseless, complete (without missing data), and without outliers. When this assumpti...

1640 | Matching pursuits with time-frequency dictionaries." Signal Processing,
- Mallat, Zhang
- 1993
Citation Context ...ce with its capacity for efficient signal modelling. Research has focused on three aspects of the sparse representation: pursuit methods for solving the optimization problem, such as matching pursuit [1], orthogonal matching pursuit [2], basis pursuit [3], LARS/homotopy methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tas...

1240 | A study of cross-validation and bootstrap for accuracy estimation and model selection
- Kohavi
Citation Context ...database of USPS handwritten digits [22]. The database contains 8-bit grayscale images of “0” through “9” with a size of 16 × 16, and there are 1100 examples of each digit. Following the conclusion of [23], 10-fold stratified cross-validation is adopted. Classification is conducted with the decomposition coefficients ('X' in equation (10)) as features and a support vector machine (SVM) as the classifier. In t...

947 | Sparse Bayesian learning and the relevance vector machine
- Tipping
- 2001
Citation Context ...rmine optimal values for the weighting factors are being conducted, following methods similar to those introduced in [12]. It is interesting to compare SRSC with the relevance vector machine (RVM) [24]. RVM has shown performance comparable to the widely used support vector machine (SVM), but with a substantially smaller number of relevance/support vectors. Both SRSC and RVM incorporate sparsity and re...

923 | K-SVD: An algorithm for designing overcomplete dictionaries for sparse representations.
- Aharon, Elad, et al.
- 2006
Citation Context ...s for solving the optimization problem, such as matching pursuit [1], orthogonal matching pursuit [2], basis pursuit [3], LARS/homotopy methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tasks, such as signal separation, denoising, coding, image inpainting [6, 7, 8, 9, 10]. For instance, in [6], sparse representation is use...
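The K-SVD method cited here updates a learned dictionary one atom at a time: the atom and its coefficients are refit jointly by a rank-1 SVD of the residual restricted to the signals that currently use that atom. A minimal numpy sketch of one such atom update (the function name and toy data are illustrative, not taken from the paper):

```python
import numpy as np

def ksvd_atom_update(D, X, Y, k):
    """One K-SVD step: refit atom k and its coefficients via a rank-1 SVD
    of the reconstruction residual, restricted to signals using atom k."""
    using = np.flatnonzero(X[k, :])
    if using.size == 0:
        return D, X
    # residual with atom k's own contribution added back, on the signals that use it
    E = Y[:, using] - D @ X[:, using] + np.outer(D[:, k], X[k, using])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D[:, k] = U[:, 0]                # updated atom (unit norm by construction)
    X[k, using] = s[0] * Vt[0, :]    # updated coefficients
    return D, X

# toy check: the rank-1 refit can only reduce the Frobenius reconstruction error
rng = np.random.default_rng(0)
Y = rng.standard_normal((8, 12))
D = rng.standard_normal((8, 5))
D /= np.linalg.norm(D, axis=0)
X = np.where(rng.random((5, 12)) < 0.4, rng.standard_normal((5, 12)), 0.0)
err_before = np.linalg.norm(Y - D @ X)
D, X = ksvd_atom_update(D, X, Y, 0)
err_after = np.linalg.norm(Y - D @ X)
```

Because the previous atom/coefficient pair is itself a feasible rank-1 candidate, the SVD refit never increases the error on the affected signals.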

620 | Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition.
- Pati, Rezaiifar, et al.
- 1993
Citation Context ...t signal modelling. Research has focused on three aspects of the sparse representation: pursuit methods for solving the optimization problem, such as matching pursuit [1], orthogonal matching pursuit [2], basis pursuit [3], LARS/homotopy methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tasks, such as signal separation, de...
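Orthogonal matching pursuit, cited here, greedily selects the atom most correlated with the current residual and then re-fits all selected coefficients by least squares. A minimal numpy sketch (names and the orthonormal toy dictionary are illustrative, not from the paper):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k atoms, re-fitting the
    coefficients on the selected support by least squares at every step."""
    support, residual = [], y.copy()
    for _ in range(k):
        corr = np.abs(A.T @ residual)
        corr[support] = 0.0                  # never re-pick a selected atom
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# with an orthonormal dictionary, OMP recovers a k-sparse signal exactly
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
x_true = np.zeros(8)
x_true[[2, 5]] = [1.5, -0.7]
x_hat = omp(Q, Q @ x_true, k=2)
```

With an orthonormal basis the correlations equal the residual's exact coefficients, so each greedy pick lands in the true support; for general overcomplete dictionaries recovery holds only under coherence conditions.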

573 | Uncertainty principles and ideal atomic decomposition
- Donoho, X
- 2001
Citation Context ...hogonal matching pursuit. An approximate solution is obtained by replacing the ℓ0 norm in equation (1) with the ℓ1 norm, as follows: x = min_{x′} ‖x′‖₁ s.t. y = Ax′, (2) where ‖x‖₁ is the ℓ1 norm. In [11], it is proved that if certain conditions on the sparsity are satisfied, i.e., the solution is sparse enough, the solution of equation (1) is equivalent to the solution of equation (2), which can be ef...
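The ℓ1 problem in equation (2) is a linear program: splitting x = u − v with u, v ≥ 0 makes the objective linear. A hedged sketch using `scipy.optimize.linprog` (one standard reduction, not necessarily the solver used in the cited work):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. y = Ax as a linear program via the split
    x = u - v with u, v >= 0, so the optimum has ||x||_1 = sum(u) + sum(v)."""
    n = A.shape[1]
    c = np.ones(2 * n)                         # objective: sum(u) + sum(v)
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
    return res.x[:n] - res.x[n:]

# toy overcomplete system: 4 equations, 8 unknowns, with a sparse feasible point
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 8))
x_true = np.zeros(8)
x_true[3] = 2.0
y = A @ x_true
x_hat = basis_pursuit(A, y)
```

Since x_true is feasible, the LP solution satisfies the constraint exactly and has ℓ1 norm no larger than ‖x_true‖₁.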

460 | PCA versus LDA
- Martínez, Kak
Citation Context ...signals from different classes. While both methods have broad applications in classification, the discriminative methods have often outperformed the reconstructive methods for the classification task [15, 16]. However, this comparison between the two types of methods assumes that the signals being classified are ideal, i.e., noiseless, complete (without missing data), and without outliers. When this assumpti...

357 | Algorithms for simultaneous sparse approximation. part i: Greedy pursuit
- Tropp, Gilbert, et al.
Citation Context ...optimization of J2(X, λ) and J3(X, λ1, λ2). In this paper, we propose an algorithm similar to orthogonal matching pursuit and inspired by the simultaneous sparse approximation algorithm described in [20, 21]. Taking the optimization of J3(X, λ1, λ2) as an example, the pursuit algorithm can be summarized as follows: 1. Initialize the residue matrix R0 = Y and the iteration counter t = 0. 2. Choose the atom from...
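The pursuit outlined in this context (initialize the residue R0 = Y, then greedily choose atoms) resembles simultaneous OMP from the cited work, where all signals share one support. A minimal numpy sketch under that shared-support assumption (names and toy data are illustrative, not the paper's exact algorithm):

```python
import numpy as np

def somp(A, Y, k):
    """Simultaneous pursuit sketch: pick one shared support of k atoms for all
    columns of Y, scoring atoms by total correlation with the residuals."""
    support = []
    R = Y.copy()
    for _ in range(k):
        scores = np.abs(A.T @ R).sum(axis=1)
        scores[support] = 0.0                # skip atoms already selected
        support.append(int(np.argmax(scores)))
        # joint least-squares refit of all signals on the shared support
        C, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ C
    X = np.zeros((A.shape[1], Y.shape[1]))
    X[support, :] = C
    return X

# with an orthonormal dictionary and a genuinely shared support, recovery is exact
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
X_true = np.zeros((6, 3))
X_true[[1, 4], :] = rng.standard_normal((2, 3))
X_hat = somp(Q, Q @ X_true, k=2)
```

Summing correlation magnitudes across signals is what couples the columns: an atom is selected only if it helps explain the residuals jointly.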

215 | Image decomposition via the combination of sparse representations and a variational approach
- Starck, Elad, et al.
- 2005
Citation Context ... methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tasks, such as signal separation, denoising, coding, image inpainting [6, 7, 8, 9, 10]. For instance, in [6], sparse representation is used for image separation. The overcomplete dictionary is generated by combining multiple standard transforms, including the curvelet transform, ridgelet t...

215 | Simultaneous cartoon and texture image inpainting using morphological component analysis
- Elad, Starck, et al.
- 2005
Citation Context ... hand, reconstructive methods have shown successful performance in addressing these problems. In [9], the sparse representation is shown to achieve state-of-the-art performance in image denoising. In [18], missing pixels in images are successfully recovered by an inpainting method based on sparse representation. In [17, 19], a PCA method with subsampling effectively detects and excludes outliers for the fo...

143 | Sparse Bayesian learning for basis selection
- Wipf, Rao
- 2004
Citation Context ...s obtaining a sparse factorization that minimizes signal reconstruction error, the problem formulated in equation (3) has an equivalent interpretation in the framework of Bayesian decision as follows [13]. The signal y is assumed to be generated by the following model: y = Ax + ε, (4) where ε is white Gaussian noise. Moreover, the prior distribution of x is assumed to be super-Gaussian: p(x) ∼ ...
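For one standard concrete choice of super-Gaussian prior, a Laplacian p(x) ∝ exp(−α‖x‖₁), the MAP estimate under model (4) reduces to the ℓ1-regularized objective; this is a textbook derivation, not quoted from the paper:

```latex
\hat{x}_{\mathrm{MAP}}
  = \arg\max_x \, p(x \mid y)
  = \arg\max_x \, p(y \mid x)\, p(x)
  = \arg\min_x \left[ -\log p(y \mid x) - \log p(x) \right]
  = \arg\min_x \, \frac{1}{2\sigma^2}\,\|y - Ax\|_2^2 \;+\; \alpha \|x\|_1 ,
```

where ε ~ N(0, σ²I). Multiplying through by 2σ² gives ‖y − Ax‖₂² + λ‖x‖₁ with λ = 2σ²α, matching the regularized form discussed elsewhere in these contexts.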

111 | Robust recognition using eigenimages
- Leonardis, Bischof
- 2000
Citation Context ...epresentation is shown to achieve state-of-the-art performance in image denoising. In [18], missing pixels in images are successfully recovered by an inpainting method based on sparse representation. In [17, 19], a PCA method with subsampling effectively detects and excludes outliers for the following LDA analysis. All of these examples motivate the design of a new signal representation that combines the advan...

69 | Image denoising via learned dictionaries and sparse representation
- Elad, Aharon
- 2006
Citation Context ... methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tasks, such as signal separation, denoising, coding, image inpainting [6, 7, 8, 9, 10]. For instance, in [6], sparse representation is used for image separation. The overcomplete dictionary is generated by combining multiple standard transforms, including the curvelet transform, ridgelet t...

52 | Analysis of sparse representation and blind source separation
- Li, Cichocki, et al.
- 2004
Citation Context ... methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tasks, such as signal separation, denoising, coding, image inpainting [6, 7, 8, 9, 10]. For instance, in [6], sparse representation is used for image separation. The overcomplete dictionary is generated by combining multiple standard transforms, including the curvelet transform, ridgelet t...

46 | Combining reconstructive and discriminative subspace methods for robust classification and regression by subsampling
- Fidler, Skocaj, et al.
- 2006
Citation Context ... which is necessary for removing noise, recovering missing data, and detecting outliers. This performance degradation of discriminative methods on corrupted signals is evident in the examples shown in [17]. On the other hand, reconstructive methods have shown successful performance in addressing these problems. In [9], the sparse representation is shown to achieve state-of-the-art performance in image ...

38 | Image denoising with shrinkage and redundant representations
- Elad, Matalon, et al.
- 2006
Citation Context ...t atom selected by: J1(X, λ) (the left figure) and J2(X, λ) (the right figure). The scalar g1 is uniformly distributed in the interval [0, 5], and the scalar g2 is uniformly distributed in the interval [5, 10]. The scalars h1 and h2 are uniformly distributed in the interval [10, 20]. Therefore, most of the energy of the signal can be described by the sine function, and most of the discrimination power is in ...

32 | Learning sparse image codes using a wavelet pyramid architecture
- Olshausen, Sallee, et al.

8 | Bayesian l1-norm sparse learning
- Lin, Lee
- 2006
Citation Context ...objective function is minimized: J1(x; λ) = ‖y − Ax‖₂² + λ‖x‖₁, where the parameter λ > 0 is a scalar regularization parameter that balances the tradeoff between reconstruction error and sparsity. In [12], a Bayesian approach is proposed for learning the optimal value of λ. Apart from the intuitive interpretation as obtaining a sparse factorization that minimizes signal reconstruction error, the prob...
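The objective J1(x; λ) = ‖y − Ax‖₂² + λ‖x‖₁ appearing in this context can be minimized by iterative soft-thresholding (ISTA), a standard proximal-gradient solver; the sketch below is illustrative and is not claimed to be the paper's own algorithm:

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=200):
    """ISTA for J1(x; lam) = ||y - Ax||_2^2 + lam * ||x||_1: a gradient step
    on the quadratic term followed by the prox of the l1 term."""
    L = 2.0 * np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# sanity check: for an orthonormal A the minimizer has the closed form
# soft_threshold(A.T @ y, lam / 2), which ISTA reaches exactly
rng = np.random.default_rng(4)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
y = rng.standard_normal(6)
lam = 0.5
x_hat = ista(Q, y, lam)
```

The step size 1/L is the classic conservative choice; larger λ drives more coefficients exactly to zero, which is the sparsity/reconstruction tradeoff the context describes.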

4 | Solution of l1 minimization problems by LARS/homotopy methods
- Drori, Donoho
- 2006
Citation Context ...ree aspects of the sparse representation: pursuit methods for solving the optimization problem, such as matching pursuit [1], orthogonal matching pursuit [2], basis pursuit [3], LARS/homotopy methods [4]; design of the dictionary, such as the K-SVD method [5]; the applications of the sparse representation for different tasks, such as signal separation, denoising, coding, image inpainting [6, 7, 8, 9,...