Results 1 - 10 of 35,986
Optimal Aggregation Algorithms for Middleware
- IN PODS, 2001
"... Assume that each object in a database has m grades, or scores, one for each of m attributes. For example, an object can have a color grade, that tells how red it is, and a shape grade, that tells how round it is. For each attribute, there is a sorted list, which lists each object and its grade under ..."
Abstract
-
Cited by 717 (4 self)
- Add to MetaCart
under that attribute, sorted by grade (highest grade first). There is some monotone aggregation function, or combining rule, such as min or average, that combines the individual grades to obtain an overall grade. To determine the top k objects (that have the best overall grades), the naive algorithm
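This setting admits a threshold-style top-k search: scan the m sorted lists in parallel, fetch each newly seen object's remaining grades by random access, and stop once k seen objects score at least the aggregate of the current scan frontier. A minimal Python sketch in the spirit of the paper's setting (the data layout and stopping bookkeeping here are illustrative, not taken from the paper):

```python
def top_k(sorted_lists, grades, agg, k):
    """Threshold-style top-k over m sorted lists.

    sorted_lists: m lists of (object, grade) pairs, each sorted by
                  grade descending (sorted access).
    grades:       dict object -> tuple of all m grades (random access).
    agg:          monotone combining rule, e.g. min or an average.
    """
    seen = {}                                  # object -> overall grade
    for depth in range(max(len(l) for l in sorted_lists)):
        frontier = []                          # last grade seen per list
        for lst in sorted_lists:
            obj, g = lst[min(depth, len(lst) - 1)]
            frontier.append(g)
            if obj not in seen:                # random access: all grades
                seen[obj] = agg(grades[obj])
        threshold = agg(frontier)              # best any unseen object can do
        top = sorted(seen.values(), reverse=True)[:k]
        if len(top) == k and top[-1] >= threshold:
            break                              # the top k can no longer change
    return sorted(seen.items(), key=lambda kv: -kv[1])[:k]
```

With agg = min and k = 1 this returns the object whose worst grade is highest, typically without scanning the lists to the bottom.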
A training algorithm for optimal margin classifiers
- PROCEEDINGS OF THE 5TH ANNUAL ACM WORKSHOP ON COMPUTATIONAL LEARNING THEORY, 1992
"... A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classifiaction functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjust ..."
Abstract
-
Cited by 1865 (43 self)
- Add to MetaCart
is adjusted automatically to match the complexity of the problem. The solution is expressed as a linear combination of supporting patterns. These are the subset of training patterns that are closest to the decision boundary. Bounds on the generalization performance based on the leave-one-out method and the VC
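The "linear combination of supporting patterns" has a direct computational form: the trained classifier evaluates a weighted kernel sum over the supporting patterns only. A hedged sketch (variable names and the polynomial kernel are illustrative; the paper's training procedure that produces the coefficients is not shown):

```python
import numpy as np

def classify(x, support_patterns, alphas, labels, b, kernel):
    """f(x) = sign( sum_i alpha_i * y_i * K(x_i, x) + b ), where the
    sum runs only over the supporting patterns (alpha_i > 0)."""
    s = sum(a * y * kernel(xi, x)
            for xi, a, y in zip(support_patterns, alphas, labels))
    return int(np.sign(s + b))

# One of the classifier families the abstract mentions: a polynomial
# decision surface obtained through a polynomial kernel.
def poly_kernel(u, v, degree=2):
    return (1.0 + np.dot(u, v)) ** degree
```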
Cluster Ensembles - A Knowledge Reuse Framework for Combining Multiple Partitions
- Journal of Machine Learning Research, 2002
"... This paper introduces the problem of combining multiple partitionings of a set of objects into a single consolidated clustering without accessing the features or algorithms that determined these partitionings. We first identify several application scenarios for the resultant 'knowledge reuse&ap ..."
Abstract
-
Cited by 603 (20 self)
- Add to MetaCart
' framework that we call cluster ensembles. The cluster ensemble problem is then formalized as a combinatorial optimization problem in terms of shared mutual information. In addition to a direct maximization approach, we propose three effective and efficient techniques for obtaining high-quality combiners
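The "shared mutual information" objective can be illustrated with the normalized mutual information between two labelings, the quantity a consensus clustering tries to maximize on average against all input partitionings. A sketch (the geometric-mean normalization and integer label encoding are assumptions of this example):

```python
import numpy as np

def nmi(a, b):
    """Normalized mutual information between two labelings of the
    same objects: I(a; b) / sqrt(H(a) * H(b))."""
    a, b = np.asarray(a), np.asarray(b)
    n = len(a)

    def entropy(x):
        counts = np.unique(x, return_counts=True)[1]
        p = counts / n
        return -np.sum(p * np.log(p))

    mi = 0.0
    for ca in np.unique(a):
        for cb in np.unique(b):
            p_ab = np.mean((a == ca) & (b == cb))
            if p_ab > 0:
                p_a, p_b = np.mean(a == ca), np.mean(b == cb)
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    denom = np.sqrt(entropy(a) * entropy(b))
    return mi / denom if denom > 0 else 0.0
```

A consensus clustering is then a labeling with maximal average NMI against every input partitioning.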
Improved algorithms for optimal winner determination in combinatorial auctions and generalizations
2000
"... Combinatorial auctions can be used to reach efficient resource and task allocations in multiagent systems where the items are complementary. Determining the winners is NP-complete and inapproximable, but it was recently shown that optimal search algorithms do very well on average. This paper present ..."
Cited by 582 (53 self)
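Winner determination is easy to state: choose the revenue-maximizing set of bids whose item sets do not overlap. A naive exhaustive sketch for intuition (the paper's contribution is search that prunes this space far more aggressively; the names here are illustrative):

```python
def winner_determination(bids):
    """Exhaustive take-or-skip search over bids.
    bids: list of (frozenset_of_items, price) pairs.
    Returns (best_revenue, indices_of_winning_bids)."""
    best = [0.0, []]

    def search(i, taken, revenue, chosen):
        if revenue > best[0]:
            best[0], best[1] = revenue, list(chosen)
        if i == len(bids):
            return
        items, price = bids[i]
        if not (items & taken):                # branch 1: accept bid i
            chosen.append(i)
            search(i + 1, taken | items, revenue + price, chosen)
            chosen.pop()
        search(i + 1, taken, revenue, chosen)  # branch 2: reject bid i

    search(0, frozenset(), 0.0, [])
    return best[0], best[1]
```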
Depth-first Iterative-Deepening: An Optimal Admissible Tree Search
- Artificial Intelligence, 1985
"... The complexities of various search algorithms are considered in terms of time, space, and cost of solution path. It is known that breadth-first search requires too much space and depth-first search can use too much time and doesn't always find a cheapest path. A depth-first iteratiw-deepening a ..."
Abstract
-
Cited by 527 (24 self)
- Add to MetaCart
-deepening algorithm is shown to be asymptotically optimal along all three dimensions for exponential pee searches. The algorithm has been used successfully in chess programs, has been eflectiuely combined with bi-directional search, and has been applied to best-first heuristic search as well. This heuristic depth
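The core of depth-first iterative deepening fits in a few lines: run a depth-limited DFS, and restart with a larger bound until a goal appears, so memory stays linear in the depth while the asymptotic node count on exponential trees matches breadth-first search. A sketch (the callback names are illustrative):

```python
def dfid(root, is_goal, children):
    """Depth-first iterative deepening over a tree; returns a path."""
    def limited_dfs(node, limit):
        if is_goal(node):
            return [node]
        if limit == 0:
            return None
        for child in children(node):
            path = limited_dfs(child, limit - 1)
            if path is not None:
                return [node] + path
        return None

    depth = 0
    while True:                    # deepen until a goal is reached
        path = limited_dfs(root, depth)
        if path is not None:
            return path
        depth += 1
```

The heuristic variant (IDA*) keeps the same loop but bounds the cost estimate f = g + h instead of the raw depth.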
Optimally sparse representation in general (non-orthogonal) dictionaries via ℓ¹ minimization
- PROC. NATL ACAD. SCI. USA 100, 2197–2202, 2002
"... Given a ‘dictionary’ D = {dk} of vectors dk, we seek to represent a signal S as a linear combination S = ∑ k γ(k)dk, with scalar coefficients γ(k). In particular, we aim for the sparsest representation possible. In general, this requires a combinatorial optimization process. Previous work considered ..."
Abstract
-
Cited by 633 (38 self)
- Add to MetaCart
Given a ‘dictionary’ D = {dk} of vectors dk, we seek to represent a signal S as a linear combination S = ∑ k γ(k)dk, with scalar coefficients γ(k). In particular, we aim for the sparsest representation possible. In general, this requires a combinatorial optimization process. Previous work
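The ℓ¹ relaxation the title refers to replaces the combinatorial search for the sparsest γ with a linear program: split γ into nonnegative parts u − v and minimize their sum subject to Dγ = S. A sketch using scipy.optimize.linprog (the solver choice is an assumption of this example):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, S):
    """min ||gamma||_1  subject to  D @ gamma = S,
    solved as an LP with gamma = u - v, u >= 0, v >= 0."""
    m = D.shape[1]
    c = np.ones(2 * m)                 # objective: sum(u) + sum(v)
    A_eq = np.hstack([D, -D])          # encodes D @ (u - v) = S
    res = linprog(c, A_eq=A_eq, b_eq=S,
                  bounds=[(0, None)] * (2 * m))
    u, v = res.x[:m], res.x[m:]
    return u - v
```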
Optimal combination of density forecasts
- NATIONAL INSTITUTE OF ECONOMIC AND SOCIAL RESEARCH DISCUSSION PAPER NO., 2005
"... This paper brings together two important but hitherto largely unrelated areas of the forecasting literature, density forecasting and forecast combination. It proposes a simple data-driven approach to direct combination of density forecasts using optimal weights. These optimal weights are those weigh ..."
Abstract
-
Cited by 44 (9 self)
- Add to MetaCart
This paper brings together two important but hitherto largely unrelated areas of the forecasting literature, density forecasting and forecast combination. It proposes a simple data-driven approach to direct combination of density forecasts using optimal weights. These optimal weights are those
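A data-driven version of such optimal weights can be sketched as maximizing the average log score of the combined (mixture) density over past realizations; the simplex parameterization and the optimizer below are assumptions of this example, not the paper's exact procedure:

```python
import numpy as np
from scipy.optimize import minimize

def optimal_weights(P):
    """P[t, i] = density forecast i evaluated at the realized y_t.
    Returns simplex weights w maximizing the average log score of
    the combined density sum_i w_i * p_i."""
    m = P.shape[1]

    def neg_log_score(z):
        w = np.exp(z - z.max())        # softmax keeps w on the simplex
        w /= w.sum()
        return -np.mean(np.log(P @ w))

    res = minimize(neg_log_score, np.zeros(m), method="Nelder-Mead")
    w = np.exp(res.x - res.x.max())
    return w / w.sum()
```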
A Fast Elitist Non-Dominated Sorting Genetic Algorithm for Multi-Objective Optimization: NSGA-II
2000
"... Multi-objective evolutionary algorithms which use non-dominated sorting and sharing have been mainly criticized for their (i) O(MN³) computational complexity (where M is the number of objectives and N is the population size), (ii) non-elitism approach, and (iii) the need for specifying a sharing parameter. In this paper, a fast non-dominated sorting approach with O(MN²) computational complexity is presented. Second, a selection operator is presented which creates a mating pool by combining the parent and child populations and selecting the best (with respect to fitness and spread) solutions. Simulation results on five difficult test problems show that the proposed NSGA-II is able ..."
Cited by 662 (15 self)
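The non-dominated sorting the abstract refers to partitions a population into successive Pareto fronts. A naive sketch for intuition (this is the straightforward version, not the paper's fast O(MN²) procedure; minimization of all objectives is assumed):

```python
def non_dominated_sort(objectives):
    """Partition objective vectors into Pareto fronts (minimization).
    objectives: list of equal-length tuples; returns lists of indices,
    front 0 first."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and a != b

    fronts, remaining = [], list(range(len(objectives)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objectives[j], objectives[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```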
Averaging and the optimal combination of forecasts
2011
"... The optimal combination of forecasts, detailed in Bates and Granger (1969), has empirically often been overshadowed in practice by using the simple average instead. Explanations of why averaging might in practice work better than constructing the optimal combination have centered on estimation error ..."
Cited by 1 (0 self)
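The Bates and Granger (1969) optimal combination has a closed form for two forecasts: the weight on forecast 1 depends only on the error variances and covariance. A sketch (estimating these moments from sample errors is the usual data-driven step, and the source of the estimation error the abstract mentions):

```python
import numpy as np

def bates_granger_weight(e1, e2):
    """Weight w* on forecast 1 in the combination w*f1 + (1 - w)*f2
    that minimizes the combined error variance:
        w* = (s22 - s12) / (s11 + s22 - 2*s12),
    with sij the error (co)variances, estimated here from samples."""
    cov = np.cov(e1, e2)               # 2x2 sample covariance matrix
    s11, s22, s12 = cov[0, 0], cov[1, 1], cov[0, 1]
    return (s22 - s12) / (s11 + s22 - 2 * s12)
```

The simple average is the special case w = 1/2, which is why it tends to win when these moments are estimated poorly.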
Evolutionary Algorithms for Multiobjective Optimization
2002
"... Multiple, often conflicting objectives arise naturally in most real-world optimization scenarios. As evolutionary algorithms possess several characteristics due to which they are well suited to this type of problem, evolution-based methods have been used for multiobjective optimization for more than a decade. Meanwhile evolutionary multiobjective optimization has become established as a separate subdiscipline combining the fields of evolutionary computation and classical multiple criteria decision making. In this paper, the basic principles of evolutionary multiobjective optimization ..."
Cited by 450 (13 self)