CiteSeerX
Results 11 - 20 of 46,597

Exact Matrix Completion via Convex Optimization

by Emmanuel J. Candès, Benjamin Recht, 2008
"... We consider a problem of considerable practical interest: the recovery of a data matrix from a sampling of its entries. Suppose that we observe m entries selected uniformly at random from a matrix M. Can we complete the matrix and recover the entries that we have not seen? We show that one can perfe ..."
Abstract - Cited by 873 (26 self)
We consider a problem of considerable practical interest: the recovery of a data matrix from a sampling of its entries. Suppose that we observe m entries selected uniformly at random from a matrix M. Can we complete the matrix and recover the entries that we have not seen? We show that one can
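
The convex program this abstract refers to is nuclear-norm minimization subject to agreement with the observed entries. As a rough illustration only (a soft-impute-style heuristic for that relaxation, not the authors' formulation or their recovery analysis; the matrix size, rank, sampling rate, and shrinkage level below are arbitrary choices):

import numpy as np

def soft_impute(M, mask, lam=1.0, iters=200):
    # Heuristic for  min ||X||_*  s.t.  X agrees with M on observed entries:
    # alternate singular-value soft-thresholding with re-imposing the data.
    X = np.where(mask, M, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U * np.maximum(s - lam, 0.0)) @ Vt   # shrink singular values
        X[mask] = M[mask]                          # keep observed entries exact
    return X

# Toy check: a rank-2 matrix with roughly half of its entries observed.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
mask = rng.random(M.shape) < 0.5
print(np.linalg.norm(soft_impute(M, mask, lam=0.1) - M) / np.linalg.norm(M))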

High dimensional graphs and variable selection with the Lasso

by Nicolai Meinshausen, Peter Bühlmann - Annals of Statistics, 2006
"... The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a ..."
Abstract - Cited by 736 (22 self)
show that the proposed neighborhood selection scheme is consistent for sparse high-dimensional graphs. Consistency hinges on the choice of the penalty parameter. The oracle value for optimal prediction does not lead to a consistent neighborhood estimate. Controlling instead the probability of falsely
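
To make the neighborhood-selection idea concrete (an illustrative sketch, not the paper's estimator or its asymptotic conditions; the penalty level alpha is an arbitrary placeholder, and as the snippet notes, consistency hinges on how it is chosen):

import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.1):
    # X: (n_samples, p_variables). Lasso-regress each variable on all the
    # others; a nonzero coefficient proposes an edge in the graph.
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = np.delete(np.arange(p), j)
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        adj[j, others] = coef != 0        # estimated neighborhood of variable j
    return adj | adj.T                     # combine the two estimates per edge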

Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties

by Jianqing Fan, Runze Li, 2001
"... Variable selection is fundamental to high-dimensional statistical modeling, including nonparametric regression. Many approaches in use are stepwise selection procedures, which can be computationally expensive and ignore stochastic errors in the variable selection process. In this article, penalized ..."
Abstract - Cited by 948 (62 self)
Variable selection is fundamental to high-dimensional statistical modeling, including nonparametric regression. Many approaches in use are stepwise selection procedures, which can be computationally expensive and ignore stochastic errors in the variable selection process. In this article, penalized
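
For reference, the SCAD penalty proposed in this line of work has a simple piecewise closed form; the function below uses the conventional choice a = 3.7 and is included purely to illustrate the penalty shape.

import numpy as np

def scad_penalty(theta, lam, a=3.7):
    # Piecewise SCAD penalty p_lambda(|theta|): linear near zero (Lasso-like),
    # a quadratic transition, then constant so large coefficients are not shrunk.
    t = np.abs(theta)
    small = lam * t
    mid = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))
    large = lam**2 * (a + 1) / 2
    return np.where(t <= lam, small, np.where(t <= a * lam, mid, large))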

Lag length selection and the construction of unit root tests with good size and power

by Serena Ng, Pierre Perron - Econometrica, 2001
"... It is widely known that when there are errors with a moving-average root close to −1, a high order augmented autoregression is necessary for unit root tests to have good size, but that information criteria such as the AIC and the BIC tend to select a truncation lag (k) that is very small. We conside ..."
Abstract - Cited by 558 (14 self)
framework in which the moving-average root is local to −1 to document how the MIC performs better in selecting appropriate values of k. In Monte Carlo experiments, the MIC is found to yield huge size improvements to the DF GLS and the feasible point optimal PT test developed in Elliott, Rothenberg and Stock

A Fast Elitist Non-Dominated Sorting Genetic Algorithm for Multi-Objective Optimization: NSGA-II

by Kalyanmoy Deb, Samir Agrawal, Amrit Pratap, T. Meyarivan, 2000
"... Multi-objective evolutionary algorithms which use non-dominated sorting and sharing have been mainly criticized for their (i) -4 computational complexity (where is the number of objectives and is the population size), (ii) non-elitism approach, and (iii) the need for specifying a sharing ..."
Abstract - Cited by 662 (15 self)
complexity is presented. Second, a selection operator is presented which creates a mating pool by combining the parent and child populations and selecting the best (with respect to fitness and spread) solutions. Simulation results on five difficult test problems show that the proposed NSGA-II is able
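
A compact sketch of the fast non-dominated sorting step named in the title (minimization assumed; the crowding-distance measure and the elitist parent-plus-child selection described in the snippet are omitted for brevity):

import numpy as np

def fast_nondominated_sort(F):
    # F: (N, M) NumPy array of objective values to minimize.
    # Returns a list of fronts (lists of indices); front 0 is non-dominated.
    N = F.shape[0]
    dominates = lambda a, b: np.all(F[a] <= F[b]) and np.any(F[a] < F[b])
    n_dom = np.zeros(N, dtype=int)          # how many solutions dominate i
    dominated_by = [[] for _ in range(N)]   # solutions that i dominates
    for i in range(N):
        for j in range(N):
            if i != j and dominates(i, j):
                dominated_by[i].append(j)
            elif i != j and dominates(j, i):
                n_dom[i] += 1
    fronts, current = [], [i for i in range(N) if n_dom[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated_by[i]:
                n_dom[j] -= 1
                if n_dom[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts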

An extensive empirical study of feature selection metrics for text classification

by George Forman, Isabelle Guyon, André Elisseeff - J. of Machine Learning Research, 2003
"... Machine learning for text classification is the cornerstone of document categorization, news filtering, document routing, and personalization. In text domains, effective feature selection is essential to make the learning task efficient and more accurate. This paper presents an empirical comparison ..."
Abstract - Cited by 496 (15 self)
Machine learning for text classification is the cornerstone of document categorization, news filtering, document routing, and personalization. In text domains, effective feature selection is essential to make the learning task efficient and more accurate. This paper presents an empirical comparison

MediaBench: A Tool for Evaluating and Synthesizing Multimedia and Communications Systems

by Chunho Lee, Miodrag Potkonjak, William H. Mangione-Smith
"... Over the last decade, significant advances have been made in compilation technology for capitalizing on instruction-level parallelism (ILP). The vast majority of ILP compilation research has been conducted in the context of generalpurpose computing, and more specifically the SPEC benchmark suite. At ..."
Abstract - Cited by 966 (22 self)
Conventional wisdom, and a history of hand optimization of inner-loops, suggests that ILP compilation techniques are well suited to these applications. Unfortunately, there currently exists a gap between the compiler community and embedded applications developers. This paper presents MediaBench, a benchmark

Software pipelining: An effective scheduling technique for VLIW machines

by Monica Lam, 1988
"... This paper shows that software pipelining is an effective and viable scheduling technique for VLIW processors. In software pipelining, iterations of a loop in the source program are continuously initiated at constant intervals, before the preceding iterations complete. The advantage of software pipe ..."
Abstract - Cited by 581 (3 self)
This paper shows that software pipelining is an effective and viable scheduling technique for VLIW processors. In software pipelining, iterations of a loop in the source program are continuously initiated at constant intervals, before the preceding iterations complete. The advantage of software
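
The overlap described here, new iterations starting at a fixed initiation interval before earlier ones finish, can be visualized with a toy schedule. This is purely illustrative and is not the paper's scheduling algorithm, which must also respect resource and dependence constraints.

# Toy visualization of software-pipelined overlap: each iteration runs three
# single-cycle stages, and a new iteration is initiated every II cycles.
def pipelined_timeline(n_iters, stages=("load", "mul", "store"), II=1):
    timeline = {}
    for i in range(n_iters):
        for offset, op in enumerate(stages):
            timeline.setdefault(i * II + offset, []).append(f"{op}[{i}]")
    return timeline

for cycle, ops in sorted(pipelined_timeline(4).items()):
    print(f"cycle {cycle}: {', '.join(ops)}")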

Coverage Control for Mobile Sensing Networks

by Jorge Cortes, Sonia Martínez, Timur Karatas, Francesco Bullo, 2002
"... This paper presents control and coordination algorithms for groups of vehicles. The focus is on autonomous vehicle networks performing distributed sensing tasks where each vehicle plays the role of a mobile tunable sensor. The paper proposes gradient descent algorithms for a class of utility functio ..."
Abstract - Cited by 582 (49 self)
functions which encode optimal coverage and sensing policies. The resulting closed-loop behavior is adaptive, distributed, asynchronous, and verifiably correct.
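
One simple member of this family of coverage behaviors acts like a Lloyd iteration: each vehicle repeatedly moves toward the centroid of the region it currently owns. The discretized sketch below (uniform density on the unit square, parameters of my own choosing) is an illustration, not the paper's continuous-time controller or its convergence analysis.

import numpy as np

def lloyd_coverage(agents, n_steps=50, n_samples=20000, step=1.0, seed=0):
    # agents: (k, 2) positions in the unit square. Each step, assign sample
    # points to their nearest agent (a discrete Voronoi partition) and move
    # every agent toward the centroid of the points it owns.
    rng = np.random.default_rng(seed)
    pts = rng.random((n_samples, 2))
    agents = np.array(agents, dtype=float)
    for _ in range(n_steps):
        d = np.linalg.norm(pts[:, None, :] - agents[None, :, :], axis=2)
        owner = d.argmin(axis=1)
        for k in range(len(agents)):
            mine = pts[owner == k]
            if len(mine):
                agents[k] += step * (mine.mean(axis=0) - agents[k])
    return agents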

Active Learning with Statistical Models

by David A. Cohn, Zoubin Ghahramani, Michael I. Jordan, 1995
"... For manytypes of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992# Cohn, 1994]. We then showhow the same principles may be used to select data for two alternative, statist ..."
Abstract - Cited by 679 (10 self)
For many types of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992; Cohn, 1994]. We then show how the same principles may be used to select data for two alternative
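
As one concrete instance of variance-minimizing data selection in the spirit of this abstract, specialized to ordinary linear regression (a simplification of mine; the paper treats mixtures of Gaussians and locally weighted regression): choose the candidate query whose addition most reduces the learner's average predictive variance, which for linear regression can be evaluated before the label is observed.

import numpy as np

def pick_query(X_train, X_pool, X_ref, ridge=1e-6):
    # For linear regression, predictive variance at x is proportional to
    # x^T (X^T X)^{-1} x, and adding a query x_q changes X^T X by x_q x_q^T
    # regardless of its label. Pick the pool point that minimizes the
    # average predictive variance over a reference set X_ref.
    d = X_train.shape[1]
    base = X_train.T @ X_train + ridge * np.eye(d)
    scores = []
    for x_q in X_pool:
        Ainv = np.linalg.inv(base + np.outer(x_q, x_q))
        scores.append(np.mean(np.einsum("ij,jk,ik->i", X_ref, Ainv, X_ref)))
    return int(np.argmin(scores))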