Results 1 - 10 of 76,168
Optimal Aggregation Algorithms for Middleware - PODS, 2001
"... Assume that each object in a database has m grades, or scores, one for each of m attributes. For example, an object can have a color grade, that tells how red it is, and a shape grade, that tells how round it is. For each attribute, there is a sorted list, which lists each object and its grade under ..."
Abstract - Cited by 717 (4 self)
under that attribute, sorted by grade (highest grade first). There is some monotone aggregation function, or combining rule, such as min or average, that combines the individual grades to obtain an overall grade. To determine the top k objects (that have the best overall grades), the naive algorithm
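The excerpt cuts off right at "the naive algorithm", so as a rough illustration of the setting it describes, here is a minimal Python sketch of that naive baseline: score every object with a monotone combining rule such as min, then keep the k best. The function and variable names are hypothetical, not from the paper.

    import heapq

    def naive_top_k(grades, combine=min, k=10):
        # grades maps object id -> list of its m attribute grades.
        # Score every object with the monotone combining rule, then keep
        # the k objects with the best overall grade.
        overall = {obj: combine(gs) for obj, gs in grades.items()}
        return heapq.nlargest(k, overall.items(), key=lambda kv: kv[1])

    # Example: two attributes per object (say, a color grade and a shape grade).
    grades = {"a": [0.9, 0.4], "b": [0.7, 0.8], "c": [0.2, 0.95]}
    print(naive_top_k(grades, combine=min, k=2))  # [('b', 0.7), ('a', 0.4)]

The point of the paper's algorithms is to avoid this full scan by reading the sorted per-attribute lists only as far as needed.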
The Ant System: Optimization by a colony of cooperating agents - IEEE Transactions on Systems, Man, and Cybernetics-Part B, 1996
"... An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call Ant System. We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation ..."
Abstract - Cited by 1300 (46 self)
An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call Ant System. We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed
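To make the "positive feedback" characteristic concrete, here is a minimal Python sketch of a pheromone-update step in the spirit of the Ant System: pheromone evaporates everywhere, and each ant reinforces the edges of its tour in proportion to how good the tour was. The data layout and parameter names (rho, Q) are illustrative, not quoted from the paper.

    def update_pheromone(tau, tours, rho=0.5, Q=1.0):
        # tau maps an edge to its pheromone level; tours is a list of
        # (edges_of_tour, tour_length) pairs built by the ants.
        # Evaporation keeps the positive feedback from saturating.
        tau = {edge: (1.0 - rho) * t for edge, t in tau.items()}
        # Reinforcement: shorter tours deposit more pheromone on their edges,
        # which makes those edges more attractive in later iterations.
        for edges, length in tours:
            for edge in edges:
                tau[edge] = tau.get(edge, 0.0) + Q / length
        return tau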
Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms - Evolutionary Computation, 1994
"... In trying to solve multiobjective optimization problems, many traditional methods scalarize the objective vector into a single objective. In those cases, the obtained solution is highly sensitive to the weight vector used in the scalarization process and demands the user to have knowledge about t ..."
Abstract - Cited by 539 (5 self)
In trying to solve multiobjective optimization problems, many traditional methods scalarize the objective vector into a single objective. In those cases, the obtained solution is highly sensitive to the weight vector used in the scalarization process and demands the user to have knowledge about
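In contrast to the weight-sensitive scalarization the snippet criticizes, nondominated sorting works directly with Pareto dominance. A minimal Python sketch of the dominance test and of extracting a nondominated set, the building block of nondominated sorting (illustrative only, not the paper's exact procedure):

    def dominates(a, b):
        # a Pareto-dominates b if it is no worse in every objective and
        # strictly better in at least one (minimization assumed).
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def nondominated(points):
        # The first "front": solutions no other solution dominates.
        return [a for a in points if not any(dominates(b, a) for b in points)]

    # Two objectives to minimize; no weight vector is needed to separate these.
    points = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.5), (3.0, 3.0)]
    print(nondominated(points))  # [(1.0, 4.0), (2.0, 2.0), (3.0, 1.5)]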
Genetic Algorithms for Multiobjective Optimization: Formulation, Discussion and Generalization, 1993
"... The paper describes a rank-based fitness assignment method for Multiple Objective Genetic Algorithms (MOGAs). Conventional niche formation methods are extended to this class of multimodal problems and theory for setting the niche size is presented. The fitness assignment method is then modified to a ..."
Abstract - Cited by 633 (15 self)
to allow direct intervention of an external decision maker (DM). Finally, the MOGA is generalised further: the genetic algorithm is seen as the optimizing element of a multiobjective optimization loop, which also comprises the DM. It is the interaction between the two that leads to the determination of a
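A minimal sketch of one rank-based fitness assignment of the kind the snippet mentions, under the common MOGA convention that an individual's rank is one plus the number of population members that dominate it; this convention is an assumption here, and the niche-formation and decision-maker aspects are not shown.

    def moga_ranks(population, dominates):
        # Rank = 1 + number of individuals that dominate this one
        # (assumed MOGA-style convention; nondominated individuals get rank 1).
        ranks = []
        for i, ind in enumerate(population):
            dominated_by = sum(1 for j, other in enumerate(population)
                               if j != i and dominates(other, ind))
            ranks.append(1 + dominated_by)
        return ranks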
A training algorithm for optimal margin classifiers - Proceedings of the 5th Annual ACM Workshop on Computational Learning Theory, 1992
"... A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classifiaction functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjust ..."
Abstract - Cited by 1865 (43 self)
is adjusted automatically to match the complexity of the problem. The solution is expressed as a linear combination of supporting patterns. These are the subset of training patterns that are closest to the decision boundary. Bounds on the generalization performance based on the leave-one-out method and the VC
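The continuation states that the solution is a linear combination of supporting patterns, so the trained classifier can be written as below. This is a minimal Python sketch with the quantities produced by training (the support set, the coefficients alphas, the offset b, and the kernel) taken as given; the names are illustrative.

    import math

    def decision(x, support_patterns, support_labels, alphas, b, kernel):
        # Linear combination over the supporting patterns only: the training
        # points closest to the decision boundary.
        s = sum(a * y * kernel(sv, x)
                for a, y, sv in zip(alphas, support_labels, support_patterns))
        return 1 if s + b >= 0 else -1

    # Kernels matching the snippet's examples: dot product (Perceptron-like),
    # polynomial, and a radial basis function.
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    poly = lambda u, v, d=3: (dot(u, v) + 1.0) ** d
    rbf = lambda u, v, gamma=1.0: math.exp(-gamma * sum((ui - vi) ** 2 for ui, vi in zip(u, v)))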
Unrealistic optimism about future life events - Journal of Personality and Social Psychology, 1980
"... Two studies investigated the tendency of people to be unrealistically optimistic about future life events. In Study 1, 258 college students estimated how much their own chances of experiencing 42 events differed from the chances of their classmates. Overall, they rated their own chances to be above ..."
Abstract - Cited by 535 (0 self)
Two studies investigated the tendency of people to be unrealistically optimistic about future life events. In Study 1, 258 college students estimated how much their own chances of experiencing 42 events differed from the chances of their classmates. Overall, they rated their own chances
Depth-first Iterative-Deepening: An Optimal Admissible Tree Search - Artificial Intelligence, 1985
"... The complexities of various search algorithms are considered in terms of time, space, and cost of solution path. It is known that breadth-first search requires too much space and depth-first search can use too much time and doesn't always find a cheapest path. A depth-first iteratiw-deepening a ..."
Abstract - Cited by 527 (24 self)
-first iterative-deepening algorithm is the only known algorithm that is capable of finding optimal solutions to randomly generated instances of the Fifteen Puzzle within practical resource limits.
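A minimal Python sketch of the idea behind depth-first iterative deepening: run a depth-limited depth-first search with limits 0, 1, 2, ..., so memory use stays depth-first while the shallowest solution is still found (uniform edge costs and illustrative names assumed).

    def depth_limited(node, goal, children, limit):
        # Depth-first search that refuses to descend below the given limit.
        if node == goal:
            return [node]
        if limit == 0:
            return None
        for child in children(node):
            path = depth_limited(child, goal, children, limit - 1)
            if path is not None:
                return [node] + path
        return None

    def iterative_deepening(root, goal, children, max_depth=50):
        # Re-run the bounded search with a growing limit; the first success
        # is at the shallowest depth at which a goal exists.
        for limit in range(max_depth + 1):
            path = depth_limited(root, goal, children, limit)
            if path is not None:
                return path
        return None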
Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization - SIAM Review, 2010
"... Abstract The affine rank minimization problem consists of finding a matrix of minimum rank that satisfies a given system of linear equality constraints. Such problems have appeared in the literature of a diverse set of fields including system identification and control, Euclidean embedding, and col ..."
Abstract - Cited by 562 (20 self)
for the linear transformation defining the constraints, the minimum rank solution can be recovered by solving a convex optimization problem, namely the minimization of the nuclear norm over the given affine space. We present several random ensembles of equations where the restricted isometry property holds
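To make the convex relaxation mentioned in the snippet concrete, here is a minimal sketch of nuclear-norm minimization over an affine constraint set, using cvxpy as an assumed off-the-shelf modeling layer; the constraint matrices As and right-hand side b are hypothetical inputs.

    import cvxpy as cp  # assumed to be available

    def min_nuclear_norm(As, b, shape):
        # Convex surrogate for affine rank minimization:
        # minimize ||X||_* subject to <A_i, X> = b_i for each constraint.
        X = cp.Variable(shape)
        constraints = [cp.trace(A.T @ X) == bi for A, bi in zip(As, b)]
        problem = cp.Problem(cp.Minimize(cp.normNuc(X)), constraints)
        problem.solve()
        return X.value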
Interior Point Methods in Semidefinite Programming with Applications to Combinatorial Optimization - SIAM Journal on Optimization, 1993
"... We study the semidefinite programming problem (SDP), i.e the problem of optimization of a linear function of a symmetric matrix subject to linear equality constraints and the additional condition that the matrix be positive semidefinite. First we review the classical cone duality as specialized to S ..."
Abstract - Cited by 547 (12 self)
to SDP. Next we present an interior point algorithm which converges to the optimal solution in polynomial time. The approach is a direct extension of Ye's projective method for linear programming. We also argue that most known interior point methods for linear programs can be transformed in a
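The problem the snippet describes can be written in the standard primal form (the notation here is the usual one, not quoted from the paper):

    \min_{X}\ \langle C, X \rangle
    \quad \text{s.t.} \quad \langle A_i, X \rangle = b_i,\ i = 1, \dots, m,
    \qquad X \succeq 0,

where C and the A_i are symmetric data matrices, \langle A, B \rangle = \mathrm{trace}(A^{\top} B), and X \succeq 0 means X is symmetric positive semidefinite. The interior point methods discussed here converge to the optimal solution of this problem in polynomial time.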
Greedy Randomized Adaptive Search Procedures, 2002
"... GRASP is a multi-start metaheuristic for combinatorial problems, in which each iteration consists basically of two phases: construction and local search. The construction phase builds a feasible solution, whose neighborhood is investigated until a local minimum is found during the local search phas ..."
Abstract - Cited by 647 (82 self)
phase. The best overall solution is kept as the result. In this chapter, we first describe the basic components of GRASP. Successful implementation techniques and parameter tuning strategies are discussed and illustrated by numerical results obtained for different applications. Enhanced or alternative
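The snippet already describes the whole control flow, so a minimal Python sketch of the multi-start loop is straightforward; the construction and local-search procedures are problem-specific and passed in as callables (names are illustrative).

    import random

    def grasp(construct_greedy_randomized, local_search, cost,
              iterations=100, seed=0):
        # Each iteration: build a feasible solution with a greedy randomized
        # construction, then improve it with local search; keep the best
        # solution found over all iterations.
        rng = random.Random(seed)
        best, best_cost = None, float("inf")
        for _ in range(iterations):
            solution = local_search(construct_greedy_randomized(rng))
            c = cost(solution)
            if c < best_cost:
                best, best_cost = solution, c
        return best, best_cost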