
## Semi-supervised Discriminant Analysis Based on UDP Regularization

Citations: 3 (0 self)

### Citations

3879 | Eigenfaces for recognition
- Turk, Pentland
- 1991
Citation Context: ...t the low-dimensional data manifold, and do dimensionality reduction by linear projection. Additional objectives/constraints are proposed to find "optimal" projections in some sense. For example, PCA [7] finds a projection in which the projected data has maximum variance, LDA [2] finds the projection by minimizing the Fisher criterion, and ICA [1] tries to make each projected dimension as independent a...
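The maximum-variance objective this context attributes to PCA can be sketched in a few lines of NumPy; `pca_projection` is a hypothetical helper name, not from the cited paper:

```python
import numpy as np

def pca_projection(X, k):
    """Return the k directions along which the data has maximum variance."""
    Xc = X - X.mean(axis=0)                 # center the data
    cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :k]          # top-k eigenvectors = projection W

X = np.random.RandomState(0).randn(100, 5)
W = pca_projection(X, 2)                    # (5, 2) projection matrix
Z = X @ W                                   # projected data, maximum variance
```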

2309 | Eigenfaces vs. fisherfaces: Recognition using class specific linear projection.
- Belhumeur, Hespanha, et al.
- 1997
Citation Context: ...r projection. Additional objectives/constraints are proposed to find "optimal" projections in some sense. For example, PCA [7] finds a projection in which the projected data has maximum variance, LDA [2] finds the projection by minimizing the Fisher criterion, and ICA [1] tries to make each projected dimension as independent as possible. Though these methods are efficient and achieve great success, the...

782 | Semi-supervised learning literature survey
- Zhu
- 2008
Citation Context: ...ion is usually a supervised learning problem since it makes use of the label information, while discovering the structure of data points can be unsupervised. Recent research in semi-supervised learning [9] shows that, under reasonable assumptions, the structure of both labeled and unlabeled data points can help to improve the classification quality, by assuming label consistency on the data manifolds, t...

668 | Laplacian eigenmaps and spectral techniques for embedding and clustering
- Belkin, Niyogi
- 2001
Citation Context: ...2 Previous Work. 2.1 LPP. The Locality Preserving Projection (LPP) algorithm proposed by He Xiaofei et al. [6] is a linear extension of Laplacian Eigenmap (LE) [3]. LE employs a weighted graph to describe the structure of data points, and tries to find a lower-dimensional representation preserving the graph relations. LE does not find an explicit map for embed...
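The weighted-graph embedding this context ascribes to Laplacian Eigenmap can be sketched as follows; a minimal version assuming a fully connected heat-kernel graph, with `laplacian_eigenmap` and the `sigma` bandwidth as illustrative choices, not the cited paper's exact construction:

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmap(X, k=2, sigma=1.0):
    """Embed points via the graph Laplacian of a heat-kernel weighted graph."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    W = np.exp(-sq / (2 * sigma ** 2))                   # heat-kernel weights
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))       # degree matrix
    L = D - W                        # unnormalized graph Laplacian
    vals, vecs = eigh(L, D)          # generalized problem L f = lambda D f
    return vecs[:, 1:k + 1]          # skip the trivial constant eigenvector

X = np.random.RandomState(1).randn(30, 4)
Y = laplacian_eigenmap(X, k=2)       # (30, 2) embedding of the 30 points
```

Because the embedding is defined only at the training points, LE has no explicit map for new samples, which is exactly the gap the linear extension (LPP) fills.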

414 | Locality preserving projections.
- He, Niyogi
- 2003
Citation Context: ...is beneficial, and successful achievements can be seen in manifold regularization and some manifold linear extension algorithms. Among those methods, linear extensions of manifold algorithms, such as LPP [6] and UDP [8], are ways of building bridges between linear and non-linear methods. Classification is usually a supervised learning problem since it makes use of the label information, while discovering the ...

102 | Semi-supervised discriminant analysis.
- Cai, He, et al.
- 2007
Citation Context: ...ssumption that samples have a unified distribution. Because of this relation, our method of regularization using UDP is also closely connected to the method of regularization using LPP, which is proposed in [4]. However, we argue that the motivations of the two methods are different, as we show in this paper. We employ both the local scatter matrix S_L and the non-local scatter matrix S_N to regularize the Fisher criti...

45 | Globally maximizing, locally minimizing: unsupervised discriminant projection with applications to face and palm biometrics
- Yang, Zhang, et al.
- 2007
Citation Context: ...cial, and successful achievements can be seen in manifold regularization and some manifold linear extension algorithms. Among those methods, linear extensions of manifold algorithms, such as LPP [6] and UDP [8], are ways of building bridges between linear and non-linear methods. Classification is usually a supervised learning problem since it makes use of the label information, while discovering the structure...

8 | Comments on 'Globally Maximizing, Locally Minimizing: Unsupervised Discriminant Projection with Application to Face and Palm Biometrics'
- Deng, Hu, et al.
- 2008
Citation Context: ...ion of (8) is given by the eigenvectors corresponding to the largest eigenvalues of the following eigenproblem: S_N w = λ S_L w (9). Notice that the solutions of (4) and (9) are similar; in fact, as pointed out in [5], LPP and UDP are equivalent under the assumption that data points have a uniform distribution. However, this assumption does not usually hold, so they are different. 2.3 LDA. Linear Discriminant Analysis...
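The generalized eigenproblem S_N w = λ S_L w quoted in this context can be solved directly with SciPy; a sketch with stand-in scatter matrices (the real S_L and S_N would come from the UDP locality graph, and `udp_directions` is a hypothetical helper name):

```python
import numpy as np
from scipy.linalg import eigh

def udp_directions(S_N, S_L, k):
    """Eigenvectors of S_N w = lambda * S_L w with the k largest eigenvalues."""
    vals, vecs = eigh(S_N, S_L)      # generalized symmetric eigenproblem
    order = np.argsort(vals)[::-1]   # sort eigenvalues in descending order
    return vecs[:, order[:k]]

# Stand-in scatter matrices (S_L must be positive definite for eigh):
rng = np.random.RandomState(0)
A, B = rng.randn(6, 6), rng.randn(6, 6)
S_L = A @ A.T + np.eye(6)            # mock local scatter matrix
S_N = B @ B.T                        # mock non-local scatter matrix
Wp = udp_directions(S_N, S_L, 2)     # (6, 2) matrix of projection directions
```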

1 | Video-based face recognition using probabilistic appearance manifolds
- Bartlett, Movellan, et al.
Citation Context: ...d "optimal" projections in some sense. For example, PCA [7] finds a projection in which the projected data has maximum variance, LDA [2] finds the projection by minimizing the Fisher criterion, and ICA [1] tries to make each projected dimension as independent as possible. Though these methods are efficient and achieve great success, they are intrinsically not suitable for datasets that have non-linear ...