Results 1 - 10 of 6,643

Sliced inverse regression for dimension reduction

by Ker-chau Li - J. Amer. Statist. Assoc., 1991
Cited by 381 (3 self)
Abstract not available.

Dimension Reduction

by Pádraig Cunningham, 2007
"... When data objects that are the subject of analysis using machine learning techniques are described by a large number of features (i.e. the data is high-dimensional), it is often beneficial to reduce the dimension of the data. Dimension reduction can be beneficial not only for reasons of computational efficiency ..."
Cited by 10 (0 self)

On directional regression for dimension reduction

by Hansheng Wang, Yingcun Xia - J. Amer. Statist. Assoc., 2007
"... By slicing the region of the response (Li, 1991, SIR) and applying local kernel regression (Xia et al., 2002, MAVE) to each slice, a new dimension reduction method is proposed. Compared with the traditional inverse regression methods, e.g. sliced inverse regression (Li, 1991), the new method is fre ..."
Cited by 40 (3 self)
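
The snippet describes two ingredients: slicing the response and fitting a local kernel regression within each slice. The Python/NumPy sketch below only illustrates that slice-then-local-fit idea under simplifying assumptions (Gaussian kernel weights, a single weighted linear fit per slice, arbitrary bandwidth); it is not the estimator of Wang and Xia, and the function name is mine.

    import numpy as np

    def slicewise_gradient_directions(X, y, n_slices=5, bandwidth=1.0):
        """Slice y, fit a kernel-weighted linear regression of y on x within
        each slice, and pool the fitted slopes as candidate directions."""
        n, p = X.shape
        order = np.argsort(y)
        slopes = []
        for idx in np.array_split(order, n_slices):
            Xs, ys = X[idx], y[idx]
            d = Xs - Xs.mean(axis=0)                      # centre within the slice
            w = np.exp(-0.5 * (d ** 2).sum(axis=1) / bandwidth ** 2)
            sw = np.sqrt(w)                               # weighted least squares
            Z = np.hstack([np.ones((len(idx), 1)), d])    # intercept + slope terms
            beta, *_ = np.linalg.lstsq(Z * sw[:, None], ys * sw, rcond=None)
            slopes.append(beta[1:])                       # local gradient estimate
        B = np.array(slopes)
        # leading eigenvectors of the pooled outer products span the estimated subspace
        vals, vecs = np.linalg.eigh(B.T @ B)
        return vecs[:, ::-1]                              # ordered by decreasing eigenvalue

In practice the number of slices and the bandwidth would need tuning; the point is only to show the two building blocks side by side.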

Dimension Reduction of Image Manifolds

by Arian Maleki
"... Dimension reduction of datasets is very useful in different application including classification, compression, feature extraction etc.; Linear methods such as principal Component Analysis, have been used for a long time and ..."
Abstract - Add to MetaCart
Dimension reduction of datasets is very useful in different application including classification, compression, feature extraction etc.; Linear methods such as principal Component Analysis, have been used for a long time and
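
Since the snippet points to Principal Component Analysis as the standard linear method, here is a minimal, generic PCA-via-SVD sketch (not taken from the paper); X is assumed to be an n × p array with one image per row, and k is the target dimension.

    import numpy as np

    def pca_reduce(X, k):
        """Project the rows of X onto the top-k principal components."""
        Xc = X - X.mean(axis=0)                          # centre each feature
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T                             # n x k reduced coordinates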

Classes of Dimension Reduction Methods

by Zhishen Ye, Robert E. Weiss, 2000
"... Dimension reduction in regression analysis reduces the dimension of the predictor vector x without specifying a parametric model and without loss of information about the distribution of y given x. We study three existing methods, SIR (Li, 1991), SAVE (Cook and Weisberg, 1991) and pHd (Li, 1992) in ..."
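
For reference, the classical SIR recipe named in the snippet (standardize x, slice y, eigendecompose the weighted covariance of the slice means) can be sketched as follows; this assumes a nonsingular sample covariance and is only an illustration, not code from the paper.

    import numpy as np

    def sir_directions(X, y, n_slices=10):
        """Classical SIR: directions come from the covariance of the
        slice-wise means of the standardized predictors."""
        n, p = X.shape
        mu = X.mean(axis=0)
        Sigma = np.cov(X, rowvar=False)
        evals, evecs = np.linalg.eigh(Sigma)             # Sigma assumed positive definite
        Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
        Z = (X - mu) @ Sigma_inv_sqrt                    # standardized predictors
        order = np.argsort(y)                            # slice on the response
        M = np.zeros((p, p))
        for idx in np.array_split(order, n_slices):
            m = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m, m)         # weighted slice means
        vals, vecs = np.linalg.eigh(M)
        return Sigma_inv_sqrt @ vecs[:, ::-1]            # back to the original x-scale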

Supervised dimension reduction mappings

by Kerstin Bunte, Michael Biehl, Barbara Hammer
"... Abstract. We propose a general principle to extend dimension reduction tools to explicit dimension reduction mappings and we show that this can serve as an interface to incorporate prior knowledge in the form of class labels. We explicitly demonstrate this technique by combining locally linear mappi ..."
Abstract - Add to MetaCart
Abstract. We propose a general principle to extend dimension reduction tools to explicit dimension reduction mappings and we show that this can serve as an interface to incorporate prior knowledge in the form of class labels. We explicitly demonstrate this technique by combining locally linear
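
One generic way to obtain an explicit dimension reduction mapping (so that new points can be projected without rerunning the embedding) is to fit a simple parametric map from the data to an embedding produced by any dimension reduction tool. The sketch below uses a plain least-squares linear map and a PCA embedding purely for illustration; it is not the locally linear, label-aware construction of the paper, and all names are my own.

    import numpy as np

    def fit_explicit_linear_map(X, embedding):
        """Least-squares linear map W with (X - mean) @ W approximating the
        embedding; W then extends the embedding to unseen data."""
        Xc = X - X.mean(axis=0)
        W, *_ = np.linalg.lstsq(Xc, embedding, rcond=None)
        return W

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))                       # toy training data
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    E = Xc @ Vt[:2].T                                    # a 2-D embedding of the training data
    W = fit_explicit_linear_map(X, E)
    X_new = rng.normal(size=(5, 20))
    E_new = (X_new - X.mean(axis=0)) @ W                 # explicit out-of-sample mapping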

A Review on Dimension Reduction

by Yanyuan Ma, Liping Zhu - International Statistical Review, 81(1), 134–150, 2013
"... Summarizing the effect of many covariates through a few linear combinations is an effective way of reducing covariate dimension and is the backbone of (sufficient) dimension reduction. Because the replacement of high-dimensional covariates by low-dimensional linear combinations is performed with a m ..."
Cited by 3 (0 self)
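
For context, the "few linear combinations" idea behind (sufficient) dimension reduction is usually stated as a conditional-independence requirement; a standard formulation (not quoted from this review) is

    Y \perp\!\!\!\perp X \mid B^{\top} X,
    \qquad B \in \mathbb{R}^{p \times d},\quad d \ll p,

so that the d linear combinations B^\top X carry all the information the covariates X contain about the response Y.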

Dimension reduction regression in R

by Sanford Weisberg - Journal of Statistical Software, 7, 2002
"... Regression is the study of the dependence of a response variable on a collection of predictors. In dimension reduction regression, we seek to find a few linear combinations such that all the information about the regression is contained in these linear combinations. If the number of combinations is very small, perh ..."
Cited by 12 (1 self)

Dimension-Reduction and Discrimination of Neuronal Multi-channel Signals

Doctoral thesis (German title: Dimensionsreduktion und Trennung neuronaler Multikanal-Signale); reviewers: Prof. Dr. Reinhard Eckhorn and Prof. Dr. Ad Aertsen
"... The cover illustrates the two-class problem in two dimensions and the functioning of the dimension reduction approach based on radial basis functions (RBF). Randomly selected measurements (centres) serve as construction aids for a non-linear contour map. In a classification task, unlabeled measureme ..."
Abstract - Add to MetaCart
The cover illustrates the two-class problem in two dimensions and the functioning of the dimension reduction approach based on radial basis functions (RBF). Randomly selected measurements (centres) serve as construction aids for a non-linear contour map. In a classification task, unlabeled
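
The RBF construction described in the snippet can be illustrated generically: pick a handful of training measurements as centres and map every point to its Gaussian similarity to those centres, giving a low-dimensional, non-linear representation for a subsequent classifier. A minimal sketch (not the thesis code; the width and the number of centres are arbitrary choices of mine):

    import numpy as np

    def rbf_features(X, centres, width):
        """Map each row of X to Gaussian similarities to the centres."""
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)  # squared distances
        return np.exp(-d2 / (2.0 * width ** 2))                        # n x n_centres features

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))                             # toy high-dimensional data
    centres = X[rng.choice(len(X), size=10, replace=False)]    # random measurements as centres
    Phi = rbf_features(X, centres, width=5.0)                  # 10-dimensional representation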

Transformed sufficient dimension reduction

by Tao Wang, Xu Guo, Peirong Xu, Lixing Zhu
"... A novel general framework is proposed in this paper for dimension reduction in regression to fill the gap between linear and fully nonlinear dimension reduction. The main idea is to transform first each of the raw predictors monotonically, and then search for a low-dimensional projection in the spac ..."
Abstract - Add to MetaCart
A novel general framework is proposed in this paper for dimension reduction in regression to fill the gap between linear and fully nonlinear dimension reduction. The main idea is to transform first each of the raw predictors monotonically, and then search for a low-dimensional projection
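
The transform-then-project idea in the snippet can be made concrete with a simple stand-in: apply a monotone, rank-based normal-score transform to each raw predictor, then look for a low-dimensional projection of the transformed predictors (PCA here, only as a placeholder for the paper's search). This is a generic sketch under those assumptions, not the authors' estimator.

    import numpy as np
    from scipy.stats import norm

    def normal_scores(x):
        """Monotone rank-based transform of one predictor to normal scores."""
        ranks = np.argsort(np.argsort(x)) + 1
        return norm.ppf(ranks / (len(x) + 1.0))

    def transform_then_project(X, k):
        """Transform each column monotonically, then project to k dimensions."""
        T = np.column_stack([normal_scores(X[:, j]) for j in range(X.shape[1])])
        Tc = T - T.mean(axis=0)
        U, S, Vt = np.linalg.svd(Tc, full_matrices=False)
        return Tc @ Vt[:k].T                             # k-dimensional projection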