CiteSeerX
Results 1 - 10 of 36,988

Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems

by Mário A. T. Figueiredo, Robert D. Nowak, Stephen J. Wright - IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2007
"... Many problems in signal processing and statistical inference involve finding sparse solutions to under-determined, or ill-conditioned, linear systems of equations. A standard approach consists in minimizing an objective function which includes a quadratic (squared ℓ2) error term combined with a spa ..."
Abstract - Cited by 539 (17 self)
bound-constrained quadratic programming (BCQP) formulation of these problems. We test variants of this approach that select the line search parameters in different ways, including techniques based on the Barzilai-Borwein method. Computational experiments show that these GP approaches perform well in a wide range
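
The snippet above refers to a gradient projection (GP) treatment of the ℓ1-regularized least-squares problem via its bound-constrained split. Below is a minimal sketch in that spirit, assuming the standard split x = u - v with u, v >= 0 and a fixed safe step size instead of the Barzilai-Borwein rules the paper studies; the toy problem sizes and the value of tau are arbitrary.

```python
import numpy as np

def gp_sparse_reconstruction(A, y, tau, n_iter=500):
    """Projected-gradient sketch for min_x 0.5*||y - Ax||^2 + tau*||x||_1,
    rewritten with the split x = u - v, u >= 0, v >= 0 (a bound-constrained
    quadratic program), then solved by gradient steps plus projection."""
    n = A.shape[1]
    u = np.zeros(n)
    v = np.zeros(n)
    # Fixed safe step: the split problem's gradient is 2*||A^T A||-Lipschitz.
    step = 1.0 / (2.0 * np.linalg.norm(A.T @ A, 2))
    for _ in range(n_iter):
        g = A.T @ (A @ (u - v) - y)                  # gradient of the quadratic term
        u = np.maximum(0.0, u - step * (g + tau))    # project onto u >= 0
        v = np.maximum(0.0, v - step * (-g + tau))   # project onto v >= 0
    return u - v

# Toy compressed-sensing demo: a sparse signal from a few random measurements.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = gp_sparse_reconstruction(A, y, tau=0.05)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```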

Suffix arrays: A new method for on-line string searches

by Udi Manber, Gene Myers, 1991
"... A new and conceptually simple data structure, called a suffix array, for on-line string searches is intro-duced in this paper. Constructing and querying suffix arrays is reduced to a sort and search paradigm that employs novel algorithms. The main advantage of suffix arrays over suffix trees is that ..."
Abstract - Cited by 835 (0 self)
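
As a companion to the abstract, here is a deliberately naive sketch of the data structure: suffix start positions sorted lexicographically, then binary search over them to report every occurrence of a pattern. The O(n² log n) construction is for illustration only; the paper's contribution is building and querying the array far more efficiently.

```python
def build_suffix_array(text):
    """Naive construction: sort suffix start positions by the suffixes they
    name. O(n^2 log n) at worst; the paper's algorithms are far faster."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_occurrences(text, sa, pattern):
    """Two binary searches bracket the block of suffixes starting with
    `pattern`; the entries in that block are the match positions."""
    lo, hi = 0, len(sa)
    while lo < hi:                               # leftmost suffix >= pattern
        mid = (lo + hi) // 2
        if text[sa[mid]:] < pattern:
            lo = mid + 1
        else:
            hi = mid
    start, hi = lo, len(sa)
    while lo < hi:                               # leftmost suffix whose prefix > pattern
        mid = (lo + hi) // 2
        if text[sa[mid]:sa[mid] + len(pattern)] <= pattern:
            lo = mid + 1
        else:
            hi = mid
    return sorted(sa[start:lo])

text = "banana"
sa = build_suffix_array(text)
print(sa)                                        # [5, 3, 1, 0, 4, 2]
print(find_occurrences(text, sa, "ana"))         # [1, 3]
```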

Greedy Randomized Adaptive Search Procedures

by Mauricio G. C. Resende, Celso C. Ribeiro, 2002
"... GRASP is a multi-start metaheuristic for combinatorial problems, in which each iteration consists basically of two phases: construction and local search. The construction phase builds a feasible solution, whose neighborhood is investigated until a local minimum is found during the local search phas ..."
Abstract - Cited by 647 (82 self)
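
A minimal GRASP skeleton following the two phases described above, applied to minimum vertex cover purely as an illustrative target (not a problem drawn from this survey); the alpha parameter governing the restricted candidate list and the iteration count are assumed values, not recommendations from the paper.

```python
import random

def grasp_vertex_cover(edges, n_vertices, iterations=200, alpha=0.3, seed=0):
    """GRASP sketch: greedy randomized construction plus local search,
    repeated from scratch each iteration, keeping the best cover found."""
    rng = random.Random(seed)
    best = set(range(n_vertices))
    for _ in range(iterations):
        # Construction phase: repeatedly pick a vertex from a restricted
        # candidate list (RCL) of high-degree vertices among uncovered edges.
        uncovered = set(edges)
        cover = set()
        while uncovered:
            degree = {}
            for u, v in uncovered:
                degree[u] = degree.get(u, 0) + 1
                degree[v] = degree.get(v, 0) + 1
            dmax, dmin = max(degree.values()), min(degree.values())
            rcl = [w for w, d in degree.items() if d >= dmax - alpha * (dmax - dmin)]
            chosen = rng.choice(rcl)
            cover.add(chosen)
            uncovered = {e for e in uncovered if chosen not in e}
        # Local search phase: drop redundant vertices until no single removal
        # keeps the cover feasible, i.e. a local minimum is reached.
        improved = True
        while improved:
            improved = False
            for w in list(cover):
                if all((a if b == w else b) in cover - {w}
                       for a, b in edges if w in (a, b)):
                    cover.remove(w)
                    improved = True
        if len(cover) < len(best):
            best = set(cover)
    return best

# Small example graph: a 6-cycle with two chords.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3), (1, 4)]
cover = grasp_vertex_cover(edges, n_vertices=6)
print(len(cover), sorted(cover))
```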

The Effect of Line-Search Parameters on the Numerical Performance of Multi-Step Quasi-Newton Methods

by J. A. Ford
"... Over the past twelve years, multi-step quasi-Newton methods for the unconstrained optimization of a nonlinear function f (with gradient denoted by g) have been developed to the point where they exhibit substantial improvements in numerical performance when compared with the “industry standard ” BFGS ..."
Abstract
BFGS method; see [1-4], for example. These methods are based on the use of simple interpolatory curves in the variable space. Until recently, the multi-step methods had always been implemented under the same line-search conditions as those commonly recommended for the BFGS method (here, si = xi+1 – xi
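
For context on the line-search conditions mentioned in the snippet, here is a sketch of the ordinary single-step BFGS method, with si = xi+1 - xi and yi = gi+1 - gi, the two Wolfe conditions written out explicitly, and a crude Armijo backtracking rule for the step length. It illustrates the standard setup only; the multi-step variants and the carefully tuned line searches studied in the paper are not reproduced here.

```python
import numpy as np

def wolfe_conditions(f, g, x, p, t, c1=1e-4, c2=0.9):
    """The line-search conditions commonly recommended for BFGS:
    sufficient decrease (Armijo, c1) and the curvature condition (c2)."""
    fx, gTp = f(x), g(x) @ p
    sufficient_decrease = f(x + t * p) <= fx + c1 * t * gTp
    curvature = g(x + t * p) @ p >= c2 * gTp
    return sufficient_decrease, curvature

def bfgs(f, g, x0, iters=200):
    """Textbook single-step BFGS (s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i)
    with simple Armijo backtracking; production Wolfe searches, and the
    multi-step variants the paper studies, are more elaborate than this."""
    x = np.asarray(x0, float)
    H = np.eye(len(x))                       # inverse-Hessian approximation
    for _ in range(iters):
        grad = g(x)
        if np.linalg.norm(grad) < 1e-8:
            break
        p = -H @ grad                        # quasi-Newton search direction
        t = 1.0
        while not wolfe_conditions(f, g, x, p, t)[0] and t > 1e-12:
            t *= 0.5                         # backtrack until sufficient decrease
        s = t * p                            # s_i = x_{i+1} - x_i
        y = g(x + s) - grad                  # y_i = g_{i+1} - g_i
        x = x + s
        if s @ y > 1e-12:                    # safeguard: keep H positive definite
            rho = 1.0 / (s @ y)
            I = np.eye(len(x))
            H = ((I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s))
                 + rho * np.outer(s, s))
    return x

# Rosenbrock test function, its gradient, and the standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                        200 * (x[1] - x[0]**2)])
print(bfgs(f, g, [-1.2, 1.0]))               # converges toward (1, 1)
```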

Choosing multiple parameters for support vector machines

by Olivier Chapelle, Vladimir Vapnik, Olivier Bousquet, Sayan Mukherjee - MACHINE LEARNING, 2002
"... The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing para ..."
Abstract - Cited by 470 (17 self)
parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Some experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
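
A rough sketch of the idea of tuning kernel parameters by gradient descent on an estimate of generalization error. It assumes scikit-learn is available, uses cross-validated log-loss as a smooth proxy for the error, and differentiates it numerically; the paper instead derives analytic gradients of estimates such as the radius/margin and span bounds.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def smooth_cv_loss(log_params, X, y):
    """A smooth generalization-error estimate: cross-validated log-loss of an
    RBF-kernel SVM at the given (log C, log gamma). The paper differentiates
    bounds such as the radius/margin and span estimates analytically; this
    proxy, and the numerical gradient below, are simplifications."""
    C, gamma = np.exp(log_params)
    clf = SVC(C=C, gamma=gamma, probability=True, random_state=0)
    return -cross_val_score(clf, X, y, cv=3, scoring="neg_log_loss").mean()

def tune_by_gradient_descent(X, y, steps=15, lr=0.5, eps=0.1):
    """Gradient descent in (log C, log gamma) using central finite differences.
    Only two parameters here, but the same loop scales to many more, which is
    exactly where exhaustive grid search breaks down."""
    theta = np.zeros(2)                              # start at C = gamma = 1
    for _ in range(steps):
        grad = np.zeros_like(theta)
        for i in range(len(theta)):
            e = np.zeros_like(theta)
            e[i] = eps
            grad[i] = (smooth_cv_loss(theta + e, X, y)
                       - smooth_cv_loss(theta - e, X, y)) / (2 * eps)
        theta -= lr * grad                           # descend on the estimate
    return np.exp(theta)

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
C, gamma = tune_by_gradient_descent(X, y)
print(f"C = {C:.3f}, gamma = {gamma:.4f}, "
      f"final CV log-loss = {smooth_cv_loss(np.log([C, gamma]), X, y):.3f}")
```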

Visual Information Seeking: Tight Coupling of Dynamic Query Filters with Starfield Displays

by Christopher Ahlberg, Ben Shneiderman, 1994
"... This paper offers new principles for visual information seeking (VIS). A key concept is to support browsing, which is distinguished from familiar query composition and information retrieval because of its emphasis on rapid filtering to reduce result sets, progressive refinement of search parameters, ..."
Abstract - Cited by 631 (51 self)
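
A toy illustration of a dynamic query filter: range predicates are re-applied to the whole record set on every adjustment, so the visible subset updates immediately. The film-style records and field names are invented for the example, loosely in the spirit of the authors' FilmFinder; the starfield display itself is not reproduced.

```python
# Every time a slider-style filter changes, the predicate is re-applied and
# the visible set updates at once (tight coupling with the display).
films = [
    {"title": "A", "year": 1982, "length": 115, "rating": 7.9},
    {"title": "B", "year": 1994, "length": 142, "rating": 8.8},
    {"title": "C", "year": 1968, "length": 149, "rating": 8.3},
    {"title": "D", "year": 2001, "length": 178, "rating": 8.9},
]

def dynamic_query(records, **ranges):
    """Return the records whose fields fall inside every (lo, hi) range."""
    return [r for r in records
            if all(lo <= r[field] <= hi for field, (lo, hi) in ranges.items())]

# Progressive refinement: each call simulates the user nudging one slider.
print(len(dynamic_query(films, year=(1980, 2010))))                      # 3
print(len(dynamic_query(films, year=(1980, 2010), rating=(8.5, 10.0))))  # 2
```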

Hierarchical mixtures of experts and the EM algorithm

by Michael I. Jordan, Robert A. Jacobs, 1993
"... We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIM’s). Learning is treated as a max-imum likelihood ..."
Abstract - Cited by 885 (21 self)
problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
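
To make the EM structure concrete, the sketch below fits a much-simplified relative of the model: k linear-regression experts mixed by an input-independent gate, alternating responsibilities (E-step) with weighted least squares (M-step). A real HME uses GLIM gates that depend on the input, arranged in a nested tree; the flat, constant-gate version here is an assumption made to keep the sketch short.

```python
import numpy as np

def em_mixture_of_linear_experts(X, y, k=2, iters=100, seed=0):
    """EM for k linear-regression 'experts' mixed by an input-independent
    gate. (A real HME uses GLIM gates that depend on x, arranged in a tree;
    this flat, constant-gate version only illustrates the EM structure.)"""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((k, d))          # expert weight vectors
    sigma2 = np.ones(k)                      # expert noise variances
    gate = np.full(k, 1.0 / k)               # mixing probabilities
    for _ in range(iters):
        # E-step: responsibility of each expert for each data point.
        resid = y[:, None] - X @ W.T                                 # (n, k)
        log_post = (np.log(gate) - 0.5 * np.log(2 * np.pi * sigma2)
                    - 0.5 * resid**2 / sigma2)
        log_post -= log_post.max(axis=1, keepdims=True)              # stabilize
        R = np.exp(log_post)
        R /= R.sum(axis=1, keepdims=True)
        # M-step: weighted least squares per expert, then variances and gate.
        for j in range(k):
            sw = np.sqrt(R[:, j])
            W[j] = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
            err = y - X @ W[j]
            sigma2[j] = (R[:, j] @ err**2) / max(R[:, j].sum(), 1e-12)
        gate = R.mean(axis=0)
    return W, sigma2, gate

# Toy data generated from two different linear regimes (slope and intercept).
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 400)
X = np.column_stack([x, np.ones_like(x)])
z = rng.integers(0, 2, 400)
y = np.where(z == 0, 3 * x + 1, -2 * x - 1) + 0.1 * rng.standard_normal(400)
W, sigma2, gate = em_mixture_of_linear_experts(X, y)
print(W)    # typically recovers rows near (3, 1) and (-2, -1), up to ordering
```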

The Ant System: Optimization by a colony of cooperating agents

by Marco Dorigo, Vittorio Maniezzo, Alberto Colorni - IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS-PART B, 1996
"... An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call Ant System. We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation ..."
Abstract - Cited by 1300 (46 self)
methodology to the classical Traveling Salesman Problem (TSP), and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing using TSP. To demonstrate the robustness of the approach, we show how the Ant System (AS
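
A compact sketch of the basic Ant System loop on a random Euclidean TSP instance: probabilistic tour construction weighted by pheromone and inverse distance, evaporation, and per-ant pheromone deposits. The parameter values (alpha, beta, rho, Q) are common defaults, not the settings tuned in the paper.

```python
import numpy as np

def ant_system_tsp(dist, n_ants=20, n_iters=100, alpha=1.0, beta=5.0,
                   rho=0.5, Q=100.0, seed=0):
    """Basic Ant System for the TSP: ants build tours using pheromone (alpha)
    and inverse distance (beta); pheromone evaporates (rho) and each ant
    deposits Q / tour_length on the edges it used."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    eta = 1.0 / (dist + np.eye(n))           # heuristic visibility, avoid div by 0
    tau = np.ones((n, n))                    # pheromone trails
    best_tour, best_len = None, np.inf
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour = [rng.integers(n)]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                cand = np.array(sorted(unvisited))
                w = tau[i, cand]**alpha * eta[i, cand]**beta
                j = int(rng.choice(cand, p=w / w.sum()))
                tour.append(j)
                unvisited.remove(j)
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1.0 - rho)                   # evaporation
        for tour, length in tours:           # pheromone deposit on used edges
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a, b] += Q / length
                tau[b, a] += Q / length
    return best_tour, best_len

# Random symmetric instance: points in the unit square, Euclidean distances.
rng = np.random.default_rng(1)
pts = rng.random((15, 2))
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
tour, length = ant_system_tsp(dist)
print(length, tour)
```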

Hidden Markov models in computational biology: applications to protein modeling

by Anders Krogh, Michael Brown, I. Saira Mian, Kimmen Sjölander, David Haussler - JOURNAL OF MOLECULAR BIOLOGY, 1994
"... Hidden.Markov Models (HMMs) are applied t.0 the problems of statistical modeling, database searching and multiple sequence alignment of protein families and protein domains. These methods are demonstrated the on globin family, the protein kinase catalytic domain, and the EF-hand calcium binding moti ..."
Abstract - Cited by 655 (39 self)
motif. In each case the parameters of an HMM are estimated from a training set of unaligned sequences. After the HMM is built, it is used to obtain a multiple alignment of all the training sequences. It is also used to search the SWISS-PROT 22 database for other sequences that are members of the given
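
Underlying the database search described above is the forward algorithm, which scores a sequence against an HMM. The sketch below computes that log-likelihood for a made-up two-state HMM with per-step scaling; the paper's profile HMMs add match/insert/delete state structure and EM training on unaligned sequences, which are not shown here.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Forward algorithm with per-step scaling: log P(observations | HMM)."""
    alpha = pi * B[:, obs[0]]                # initialization, t = 0
    log_p = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:                        # induction over the sequence
        alpha = (alpha @ A) * B[:, o]
        log_p += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return log_p

# A made-up two-state HMM over a three-letter alphabet.
pi = np.array([0.6, 0.4])                    # initial state probabilities
A = np.array([[0.7, 0.3],                    # state-transition probabilities
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],               # per-state emission probabilities
              [0.1, 0.3, 0.6]])
seq = [0, 1, 2, 2, 1, 0]                     # an observed symbol sequence
print(forward_log_likelihood(seq, pi, A, B))
```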

Where the REALLY Hard Problems Are

by Peter Cheeseman, Bob Kanefsky, William M. Taylor - IN J. MYLOPOULOS AND R. REITER (EDS.), PROCEEDINGS OF 12TH INTERNATIONAL JOINT CONFERENCE ON AI (IJCAI-91), VOLUME 1, 1991
"... It is well known that for many NP-complete problems, such as K-Sat, etc., typical cases are easy to solve; so that computationally hard cases must be rare (assuming P != NP). This paper shows that NP-complete problems can be summarized by at least one "order parameter", and that the hard p ..."
Abstract - Cited by 683 (1 self)
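
The order-parameter picture can be reproduced on a toy scale: generate random 3-SAT instances at several clause-to-variable ratios and watch the fraction of satisfiable instances collapse as the ratio grows through the critical region. The brute-force satisfiability check below is viable only for the tiny instances used here, and the trial counts are arbitrary.

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-SAT: each clause has 3 distinct variables, each literal
    negated with probability 1/2 (sign True means the positive literal)."""
    return [[(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), 3)]
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Brute-force satisfiability check (viable only for tiny n)."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[v] == sign for v, sign in clause) for clause in clauses):
            return True
    return False

# Sweep the order parameter (clauses per variable): the fraction of
# satisfiable instances falls from near 1 to near 0 across this range.
rng = random.Random(0)
n_vars, trials = 10, 30
for ratio in (2.0, 3.0, 4.0, 4.3, 5.0, 6.0):
    m = int(ratio * n_vars)
    sat = sum(satisfiable(n_vars, random_3sat(n_vars, m, rng))
              for _ in range(trials))
    print(f"m/n = {ratio:3.1f}   fraction satisfiable = {sat / trials:.2f}")
```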