Results 1 - 10 of 36,988
Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems
- IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2007
"... Many problems in signal processing and statistical inference involve finding sparse solutions to under-determined, or ill-conditioned, linear systems of equations. A standard approach consists in minimizing an objective function which includes a quadratic (squared ℓ2) error term combined with a spa ..."
Cited by 539 (17 self)
bound-constrained quadratic programming (BCQP) formulation of these problems. We test variants of this approach that select the line search parameters in different ways, including techniques based on the Barzilai-Borwein method. Computational experiments show that these GP approaches perform well in a wide range
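A minimal sketch of the gradient-projection idea with a Barzilai-Borwein step length for min 0.5*||Ax - b||^2 + tau*||x||_1, reformulated as a BCQP via the split x = u - v with u, v >= 0. This is an illustrative simplification, not the authors' GPSR code; the problem sizes, tau, and iteration budget below are arbitrary assumptions.

```python
import numpy as np

def gpsr_bb(A, b, tau, iters=200, alpha=1.0, alpha_min=1e-8, alpha_max=1e8):
    """Gradient projection with a Barzilai-Borwein step for the BCQP
    reformulation of min 0.5*||A x - b||^2 + tau*||x||_1 (simplified sketch)."""
    n = A.shape[1]
    u = np.zeros(n)
    v = np.zeros(n)

    def grad(u, v):
        r = A @ (u - v) - b                 # residual
        g = A.T @ r
        return g + tau, -g + tau            # gradients w.r.t. u and v

    gu, gv = grad(u, v)
    for _ in range(iters):
        # projected gradient step onto the nonnegative orthant
        u_new = np.maximum(u - alpha * gu, 0.0)
        v_new = np.maximum(v - alpha * gv, 0.0)
        gu_new, gv_new = grad(u_new, v_new)

        # Barzilai-Borwein step from s = z_{k+1} - z_k, y = grad_{k+1} - grad_k
        s = np.concatenate([u_new - u, v_new - v])
        y = np.concatenate([gu_new - gu, gv_new - gv])
        sy = s @ y
        alpha = np.clip((s @ s) / sy, alpha_min, alpha_max) if sy > 0 else alpha_max

        u, v, gu, gv = u_new, v_new, gu_new, gv_new
    return u - v

# toy compressed-sensing style example (sizes, sparsity level and tau are arbitrary)
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 8, replace=False)] = rng.standard_normal(8)
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = gpsr_bb(A, b, tau=0.1)
print("nonzeros recovered:", np.sum(np.abs(x_hat) > 1e-3))
```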
Suffix arrays: A new method for on-line string searches, 1991
"... A new and conceptually simple data structure, called a suffix array, for on-line string searches is intro-duced in this paper. Constructing and querying suffix arrays is reduced to a sort and search paradigm that employs novel algorithms. The main advantage of suffix arrays over suffix trees is that ..."
Cited by 835 (0 self)
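A minimal illustration of the sort-and-search idea described in the snippet: build the array by sorting suffix start positions, then answer substring queries with binary search over the sorted suffixes. The naive O(n^2 log n) construction below is for clarity only; the paper's algorithms are considerably more efficient.

```python
def build_suffix_array(text):
    # naive construction: sort suffix start positions by the suffix they begin
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_occurrences(text, sa, pattern):
    # binary search over the sorted suffixes for the block starting with `pattern`
    m = len(pattern)

    def lower():
        lo, hi = 0, len(sa)
        while lo < hi:
            mid = (lo + hi) // 2
            if text[sa[mid]:sa[mid] + m] < pattern:
                lo = mid + 1
            else:
                hi = mid
        return lo

    def upper():
        lo, hi = 0, len(sa)
        while lo < hi:
            mid = (lo + hi) // 2
            if text[sa[mid]:sa[mid] + m] <= pattern:
                lo = mid + 1
            else:
                hi = mid
        return lo

    return sorted(sa[lower():upper()])      # starting positions of all matches

text = "mississippi"
sa = build_suffix_array(text)
print(find_occurrences(text, sa, "issi"))   # [1, 4]
```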
Greedy Randomized Adaptive Search Procedures, 2002
"... GRASP is a multi-start metaheuristic for combinatorial problems, in which each iteration consists basically of two phases: construction and local search. The construction phase builds a feasible solution, whose neighborhood is investigated until a local minimum is found during the local search phas ..."
Cited by 647 (82 self)
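A compact sketch of the two-phase GRASP loop from the snippet, applied here to max-cut on a small random graph purely for illustration; the problem choice, the restricted-candidate-list rule, and the flip-based local search are assumptions, not details taken from the paper.

```python
import random

def grasp_maxcut(n, edges, iterations=50, rcl_size=3, seed=0):
    """GRASP sketch: each iteration runs a greedy randomized construction
    followed by local search; the best cut found over all starts is kept."""
    rng = random.Random(seed)
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    def cut_value(side):
        return sum(1 for u, v in edges if side[u] != side[v])

    best_side, best_val = None, -1
    for _ in range(iterations):
        # construction phase: place one vertex at a time, picking at random
        # from a restricted candidate list (RCL) of the best-looking placements
        side, unplaced = {}, set(range(n))
        while unplaced:
            candidates = []
            for v in unplaced:
                for s in (0, 1):
                    gain = sum(1 for u in adj[v] if u in side and side[u] != s)
                    candidates.append((gain, v, s))
            candidates.sort(reverse=True)
            _, v, s = rng.choice(candidates[:rcl_size])
            side[v] = s
            unplaced.remove(v)

        # local search phase: flip single vertices while the cut improves
        improved = True
        while improved:
            improved = False
            for v in range(n):
                same = sum(1 for u in adj[v] if side[u] == side[v])
                if same > len(adj[v]) - same:   # flipping v enlarges the cut
                    side[v] = 1 - side[v]
                    improved = True

        val = cut_value(side)
        if val > best_val:
            best_side, best_val = dict(side), val
    return best_side, best_val

rng = random.Random(1)
edges = [(u, v) for u in range(12) for v in range(u + 1, 12) if rng.random() < 0.3]
print("best cut value:", grasp_maxcut(12, edges)[1])
```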
The Effect of Line-Search Parameters on the Numerical Performance of Multi-Step Quasi-Newton Methods
"... Over the past twelve years, multi-step quasi-Newton methods for the unconstrained optimization of a nonlinear function f (with gradient denoted by g) have been developed to the point where they exhibit substantial improvements in numerical performance when compared with the “industry standard ” BFGS ..."
BFGS method; see [1-4], for example. These methods are based on the use of simple interpolatory curves in the variable space. Until recently, the multi-step methods had always been implemented under the same line-search conditions as those commonly recommended for the BFGS method (here, s_i = x_{i+1} – x_i
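For context on the notation in the snippet (s_i = x_{i+1} – x_i and the usual Wolfe line-search conditions), here is a plain single-step BFGS sketch, i.e. the baseline the multi-step methods are compared against; the interpolatory multi-step updates themselves are not implemented here, and the test function and step-size fallback are arbitrary assumptions.

```python
import numpy as np
from scipy.optimize import line_search

def bfgs(f, grad, x0, iters=100, tol=1e-8):
    """Standard (single-step) BFGS with a Wolfe-condition line search.
    The multi-step methods in the paper replace the secant pair (s_i, y_i)
    below with data taken from interpolating curves."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # inverse Hessian approximation
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        alpha = line_search(f, grad, x, p, gfk=g)[0]
        if alpha is None:               # line search failed: small fallback step
            alpha = 1e-3
        x_new = x + alpha * p
        g_new = grad(x_new)
        s = x_new - x                   # s_i = x_{i+1} - x_i
        y = g_new - g                   # y_i = g_{i+1} - g_i
        sy = s @ y
        if sy > 1e-12:                  # curvature condition holds: update H
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Rosenbrock function as an arbitrary test problem
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, [-1.2, 1.0]))
```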
Choosing multiple parameters for support vector machines
- MACHINE LEARNING, 2002
"... The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing para ..."
Cited by 470 (17 self)
parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Some experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
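The snippet's point is that gradient-based tuning scales where exhaustive search does not. The sketch below follows that spirit only: gradient descent in log-parameter space on a cross-validation error estimate with finite-difference gradients, which is a crude stand-in for the differentiable error bounds used in the paper; the dataset, step size, and use of scikit-learn are all assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def cv_error(log_params):
    """Estimated generalization error for given log10(C), log10(gamma)."""
    C, gamma = 10.0 ** log_params
    scores = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5)
    return 1.0 - scores.mean()

theta = np.array([0.0, -1.0])           # start at C = 1, gamma = 0.1
step, eps = 0.5, 0.1
for it in range(20):
    # finite-difference gradient of the error estimate w.r.t. the log-parameters
    grad = np.zeros_like(theta)
    for j in range(len(theta)):
        e = np.zeros_like(theta)
        e[j] = eps
        grad[j] = (cv_error(theta + e) - cv_error(theta - e)) / (2 * eps)
    theta -= step * grad                # gradient descent over the parameters
    print(f"iter {it:2d}  C=10^{theta[0]:.2f}  gamma=10^{theta[1]:.2f}  "
          f"cv error={cv_error(theta):.3f}")
```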
Visual Information Seeking: Tight Coupling of Dynamic Query Filters with Starfield Displays, 1994
"... This paper offers new principles for visual information seeking (VIS). A key concept is to support browsing, which is distinguished from familiar query composition and information retrieval because of its emphasis on rapid filtering to reduce result sets, progressive refinement of search parameters, ..."
Cited by 631 (51 self)
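A tiny non-GUI sketch of the dynamic-query idea behind the snippet: every change to a filter "slider" immediately re-applies all range predicates and yields the reduced result set. The record attributes and values below are invented for illustration and have no connection to the paper's FilmFinder system beyond the general idea.

```python
# toy dataset; the attributes and values are made up
films = [
    {"title": "A", "year": 1974, "length": 120, "rating": 7.9},
    {"title": "B", "year": 1985, "length": 95,  "rating": 6.4},
    {"title": "C", "year": 1991, "length": 140, "rating": 8.3},
    {"title": "D", "year": 1993, "length": 88,  "rating": 5.1},
]

class DynamicQuery:
    def __init__(self, items):
        self.items = items
        self.ranges = {}                    # attribute -> (low, high)

    def set_range(self, attr, low, high):
        # called on every slider movement; the result set is recomputed at once
        self.ranges[attr] = (low, high)
        return self.results()

    def results(self):
        return [it for it in self.items
                if all(lo <= it[a] <= hi for a, (lo, hi) in self.ranges.items())]

dq = DynamicQuery(films)
print([f["title"] for f in dq.set_range("year", 1980, 1995)])   # ['B', 'C', 'D']
print([f["title"] for f in dq.set_range("rating", 6.0, 9.0)])   # ['B', 'C']
```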
Hierarchical mixtures of experts and the EM algorithm, 1993
"... We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIM’s). Learning is treated as a max-imum likelihood ..."
Cited by 885 (21 self)
problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
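A flattened, two-expert version of the idea in the snippet: linear experts combined by a softmax gate, fit by EM with responsibilities in the E-step and weighted least squares plus a few gradient steps on the gate in the M-step. This is a single-level simplification for illustration, not the hierarchical architecture or the exact algorithm of the paper; the synthetic data, learning rates, and iteration counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic piecewise-linear data: two regimes, so two linear experts suffice
x = rng.uniform(-3, 3, size=400)
y = np.where(x < 0, 1.5 * x + 1, -2.0 * x + 1) + 0.2 * rng.standard_normal(400)
X = np.column_stack([x, np.ones_like(x)])       # add a bias column
K = 2

W = rng.standard_normal((K, 2))                 # expert (GLIM) weights
V = rng.standard_normal((K, 2))                 # gate weights
sigma2 = 1.0

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for _ in range(50):
    # E-step: responsibility of each expert for each data point
    gate = softmax(X @ V.T)                                         # (N, K)
    mu = X @ W.T                                                    # expert means
    lik = np.exp(-(y[:, None] - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
    R = gate * lik
    R /= R.sum(axis=1, keepdims=True)

    # M-step: weighted least squares for each expert, plus noise variance
    for k in range(K):
        Wk = X * R[:, k:k + 1]
        W[k] = np.linalg.solve(Wk.T @ X + 1e-6 * np.eye(2), Wk.T @ y)
    sigma2 = np.sum(R * (y[:, None] - X @ W.T) ** 2) / len(y)

    # M-step for the gate: a few gradient steps toward the responsibilities
    for _ in range(10):
        gate = softmax(X @ V.T)
        V += 0.1 / len(y) * (R - gate).T @ X

print("expert weights:\n", W)
```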
The Ant System: Optimization by a colony of cooperating agents
- IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS-PART B, 1996
"... An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call Ant System. We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation ..."
Cited by 1300 (46 self)
methodology to the classical Traveling Salesman Problem (TSP), and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing using TSP. To demonstrate the robustness of the approach, we show how the Ant System (AS
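A compact sketch of the basic Ant System loop on a random TSP instance: ants build tours using pheromone and inverse-distance heuristics, then pheromone evaporates and is reinforced along the tours (the positive feedback mentioned in the snippet). The parameter values and instance size are arbitrary, and none of the paper's later refinements are included.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 15
pts = rng.uniform(0, 1, size=(n, 2))
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2) + np.eye(n)  # eye avoids /0
eta = 1.0 / dist                                 # heuristic visibility
tau = np.ones((n, n))                            # pheromone trails

alpha, beta, rho, Q = 1.0, 2.0, 0.5, 1.0         # arbitrary AS parameters
best_tour, best_len = None, np.inf

for _ in range(100):
    tours = []
    for ant in range(n):                         # one ant per city, each starting there
        tour = [ant]
        unvisited = set(range(n)) - {ant}
        while unvisited:
            i = tour[-1]
            cand = list(unvisited)
            # transition rule: probability proportional to pheromone^alpha * visibility^beta
            w = (tau[i, cand] ** alpha) * (eta[i, cand] ** beta)
            nxt = int(rng.choice(cand, p=w / w.sum()))
            tour.append(nxt)
            unvisited.remove(nxt)
        tours.append(tour)

    # pheromone update: evaporation, then deposit along each ant's tour
    tau *= (1 - rho)
    for tour in tours:
        length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
        if length < best_len:
            best_tour, best_len = tour, length
        for k in range(n):
            a, b = tour[k], tour[(k + 1) % n]
            tau[a, b] += Q / length
            tau[b, a] += Q / length

print("best tour length:", round(best_len, 3))
```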
Hidden Markov models in computational biology: applications to protein modeling
- JOURNAL OF MOLECULAR BIOLOGY, 1994
"... Hidden.Markov Models (HMMs) are applied t.0 the problems of statistical modeling, database searching and multiple sequence alignment of protein families and protein domains. These methods are demonstrated the on globin family, the protein kinase catalytic domain, and the EF-hand calcium binding moti ..."
Cited by 655 (39 self)
motif. In each case the parameters of an HMM are estimated from a training set of unaligned sequences. After the HMM is built, it is used to obtain a multiple alignment of all the training sequences. It is also used to search the SWISS-PROT 22 database for other sequences that are members of the given
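A minimal discrete HMM with the forward algorithm, to illustrate how a trained model can score whether a new sequence looks like a member of a family, as the snippet describes for database search. This is a generic toy, not the profile-HMM architecture or the training procedure of the paper; the states, alphabet, and probabilities below are invented.

```python
import numpy as np

# toy two-state HMM over a two-letter alphabet; all numbers are invented
symbols = {"A": 0, "B": 1}
start = np.array([0.6, 0.4])            # P(initial state): [match, background]
trans = np.array([[0.8, 0.2],           # P(next state | current state)
                  [0.3, 0.7]])
emit = np.array([[0.9, 0.1],            # P(symbol | state): "match" prefers A
                 [0.5, 0.5]])

def log_likelihood(seq):
    """Forward algorithm in plain probability space (fine for short sequences)."""
    obs = [symbols[c] for c in seq]
    f = start * emit[:, obs[0]]          # forward variables at position 0
    for o in obs[1:]:
        f = (f @ trans) * emit[:, o]     # propagate one step, then emit
    return float(np.log(f.sum()))

# sequences that resemble the "family" (A-rich) score higher per residue
for s in ["AAABAA", "BBABBB"]:
    print(s, round(log_likelihood(s) / len(s), 3))
```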
Where the REALLY Hard Problems Are
- IN J. MYLOPOULOS AND R. REITER (EDS.), PROCEEDINGS OF 12TH INTERNATIONAL JOINT CONFERENCE ON AI (IJCAI-91), VOLUME 1, 1991
"... It is well known that for many NP-complete problems, such as K-Sat, etc., typical cases are easy to solve; so that computationally hard cases must be rare (assuming P != NP). This paper shows that NP-complete problems can be summarized by at least one "order parameter", and that the hard p ..."
Cited by 683 (1 self)
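A small experiment in the spirit of the snippet: generate random 3-SAT instances at different clause-to-variable ratios (the "order parameter") and measure how much work a simple DPLL solver does; the hardest instances are known to cluster near the satisfiability threshold, around ratio 4.3 for 3-SAT. The solver, instance sizes, and sampling choices below are a toy setup, not the paper's experiments.

```python
import random

def random_3sat(n_vars, n_clauses, rng):
    # each clause: three distinct variables with random signs
    return [tuple(rng.choice([v, -v]) for v in rng.sample(range(1, n_vars + 1), 3))
            for _ in range(n_clauses)]

def simplify(clauses, assignment):
    out = []
    for c in clauses:
        kept, satisfied = [], False
        for lit in c:
            v = assignment.get(abs(lit))
            if v is None:
                kept.append(lit)
            elif (lit > 0) == v:
                satisfied = True
                break
        if satisfied:
            continue
        if not kept:
            return None                   # empty clause: conflict
        out.append(tuple(kept))
    return out

def dpll(clauses, assignment, counter):
    counter[0] += 1                       # search effort: nodes visited
    clauses = simplify(clauses, assignment)
    if clauses is None:
        return False
    if not clauses:
        return True
    for c in clauses:                     # unit propagation
        if len(c) == 1:
            return dpll(clauses, {**assignment, abs(c[0]): c[0] > 0}, counter)
    var = abs(clauses[0][0])              # branch on a variable from the first clause
    return (dpll(clauses, {**assignment, var: True}, counter) or
            dpll(clauses, {**assignment, var: False}, counter))

rng = random.Random(0)
n = 25
for ratio in [2.0, 3.0, 4.0, 4.3, 4.6, 5.5, 7.0]:
    calls = 0
    for _ in range(20):                   # average over 20 random instances
        counter = [0]
        dpll(random_3sat(n, int(ratio * n), rng), {}, counter)
        calls += counter[0]
    print(f"clause/variable ratio {ratio:.1f}: avg DPLL calls {calls / 20:.0f}")
```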