Results 1 - 10 of 172
Metaheuristics in combinatorial optimization: Overview and conceptual comparison
- ACM COMPUTING SURVEYS, 2003
"... The field of metaheuristics for the application to combinatorial optimization problems is a rapidly growing field of research. This is due to the importance of combinatorial optimization problems for the scientific as well as the industrial world. We give a survey of the nowadays most important meta ..."
Abstract
-
Cited by 314 (17 self)
The application of metaheuristics to combinatorial optimization problems is a rapidly growing field of research, owing to the importance of combinatorial optimization problems in both the scientific and the industrial world. We give a survey of the most important metaheuristics in use today from a conceptual point of view, outlining the components and concepts used in the different methods in order to analyze their similarities and differences. Two very important concepts in metaheuristics are intensification and diversification: these are the two forces that largely determine the behaviour of a metaheuristic, and they are in some ways contrary to, but also complementary to, each other. We introduce a framework, which we call the I&D frame, to relate different intensification and diversification components to each other. After outlining the advantages and disadvantages of the different metaheuristic approaches, we conclude by pointing out the importance of hybridizing metaheuristics and of integrating metaheuristics with other optimization methods.
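To make the intensification/diversification trade-off concrete, here is a minimal, illustrative sketch (not taken from the survey) of a Metropolis-style acceptance rule, one common mechanism for balancing the two forces: improving moves are always kept, while worsening moves are accepted with a probability controlled by a temperature parameter. All names are hypothetical.

```python
import math
import random

def accept(candidate_cost, current_cost, temperature):
    """Metropolis-style acceptance rule: one common way to balance
    intensification (keeping improvements) against diversification
    (occasionally accepting worse solutions to escape local optima)."""
    if candidate_cost <= current_cost:
        return True  # intensification: never reject an improvement
    # diversification: accept a worsening move with a probability that
    # shrinks as the cost gap grows or the temperature is lowered
    return random.random() < math.exp((current_cost - candidate_cost) / temperature)
```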
Automatic Algorithm Configuration based on Local Search
- IN AAAI ’07: PROC. OF THE TWENTY-SECOND CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2007
"... The determination of appropriate values for free algorithm parameters is a challenging and tedious task in the design of effective algorithms for hard problems. Such parameters include categorical choices (e.g., neighborhood structure in local search or variable/value ordering heuristics in tree sea ..."
Abstract
-
Cited by 76 (31 self)
The determination of appropriate values for free algorithm parameters is a challenging and tedious task in the design of effective algorithms for hard problems. Such parameters include categorical choices (e.g., neighborhood structure in local search or variable/value ordering heuristics in tree search), as well as numerical parameters (e.g., noise or restart timing). In practice, tuning of these parameters is largely carried out manually by applying rules of thumb and crude heuristics, while more principled approaches are only rarely used. In this paper, we present a local search approach for algorithm configuration and prove its convergence to the globally optimal parameter configuration. Our approach is very versatile: it can, e.g., be used for minimising run-time in decision problems or for maximising solution quality in optimisation problems. It further applies to arbitrary algorithms, including heuristic tree search and local search algorithms, with no limitation on the number of parameters. Experiments in four algorithm configuration scenarios demonstrate that our automatically determined parameter settings always outperform the algorithm defaults, sometimes by several orders of magnitude. Our approach also shows better performance and greater flexibility than the recent CALIBRA system. Our ParamILS code, along with instructions on how to use it for tuning your own algorithms, is available on-line at
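As a rough illustration of the kind of search in configuration space described above, the sketch below shows a stripped-down one-exchange local search over parameter configurations. It is not the authors' ParamILS implementation; the `evaluate` callback, the parameter names, and the domains are assumptions made for the example.

```python
import random

def one_exchange_search(initial, domains, evaluate, max_iters=200):
    """Stripped-down local search in configuration space: repeatedly
    change a single parameter to another value from its domain and keep
    the change if the measured cost (e.g., mean run-time over a benchmark
    set) improves.  Lower cost is assumed to be better."""
    current = dict(initial)
    current_cost = evaluate(current)
    for _ in range(max_iters):
        name = random.choice(list(domains))
        alternatives = [v for v in domains[name] if v != current[name]]
        if not alternatives:
            continue
        candidate = dict(current)
        candidate[name] = random.choice(alternatives)
        cost = evaluate(candidate)
        if cost < current_cost:
            current, current_cost = candidate, cost
    return current, current_cost

# Hypothetical usage: tune two parameters of some target algorithm.
# domains = {"noise": [0.1, 0.3, 0.5], "restarts": [10, 100, 1000]}
# best, cost = one_exchange_search({"noise": 0.3, "restarts": 100}, domains, run_benchmark)
```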
An Adaptive Noise Mechanism for WalkSAT
2002
"... Stochastic local search algorithms based on the WalkSAT architecture are among the best known methods for solving hard and large instances of the propositional satisfiability problem (SAT). The performance and behaviour of these algorithms critically depends on the setting of the noise parameter ..."
Abstract
-
Cited by 75 (13 self)
Stochastic local search algorithms based on the WalkSAT architecture are among the best known methods for solving hard and large instances of the propositional satisfiability problem (SAT). The performance and behaviour of these algorithms critically depend on the setting of the noise parameter, which controls the greediness of the search process. The optimal setting for the noise parameter varies considerably between different types and sizes of problem instances; consequently, considerable manual tuning is typically required to obtain peak performance. In this paper, we characterise the impact of the noise setting on the behaviour of WalkSAT and introduce a simple adaptive noise mechanism for WalkSAT that does not require manual adjustment for different problem instances. We present experimental results indicating that by using this self-tuning noise mechanism, various WalkSAT variants (including WalkSAT/SKC and Novelty+) achieve performance levels close to their peak performance for instance-specific, manually tuned noise settings.
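The sketch below gives a simplified version of such an adaptive noise rule: raise the noise after the search has stagnated for a while, and lower it again as soon as the objective (the number of unsatisfied clauses) improves. The function name and the constants are illustrative, not necessarily the published ones.

```python
def adapt_noise(noise, steps_since_improvement, num_clauses, phi=0.2, theta=1/6):
    """Simplified adaptive-noise update: increase the noise (more random walk,
    less greediness) when no improvement in the number of unsatisfied clauses
    has been seen for a while, and decrease it right after an improvement.
    phi and theta are illustrative constants."""
    if steps_since_improvement > theta * num_clauses:
        # stagnation detected: diversify by moving the noise towards 1.0
        return noise + (1.0 - noise) * phi
    if steps_since_improvement == 0:
        # an improvement was just found: intensify by lowering the noise
        return noise - noise * phi / 2.0
    return noise
```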
A simple and effective iterated greedy algorithm for the permutation flowshop scheduling problem
- European Journal of Operational Research, 2006
"... Over the last decade many metaheuristics have been applied to the flowshop scheduling problem, ranging from Simulated Annealing or Tabu Search to complex hybrid techniques. Some of these methods provide excellent effectiveness and efficiency at the expense of being utterly complicated. In fact, seve ..."
Abstract
-
Cited by 67 (14 self)
Over the last decade many metaheuristics have been applied to the flowshop scheduling problem, ranging from Simulated Annealing or Tabu Search to complex hybrid techniques. Some of these methods provide excellent effectiveness and efficiency at the expense of being utterly complicated. In fact, several published methods require substantial implementation effort, exploit problem-specific speed-up techniques that cannot be applied to slight variations of the original problem, and re-implementations of these methods by other researchers often produce results that are quite different from the original ones. In this work we present a new iterated greedy algorithm that applies two phases iteratively: destruction, where some jobs are eliminated from the incumbent solution, and construction, where the eliminated jobs are reinserted into the sequence using the well-known NEH construction heuristic. Optionally, a local search can be applied after the construction phase. Our iterated greedy algorithm is both very simple to implement and, as shown by experimental results, highly effective when compared to state-of-the-art methods.
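A skeleton of the destruction/construction loop described above might look like the following. `makespan` and `greedy_insert` (an NEH-style best-position reinsertion) are assumed helper functions, and the "accept if not worse" criterion is a deliberate simplification rather than the acceptance rule used in the paper.

```python
import random

def iterated_greedy(sequence, makespan, greedy_insert, d=4, iterations=1000):
    """Skeleton of an iterated greedy loop: destruction removes d jobs at
    random, construction reinserts them one by one at their best position
    using a greedy (NEH-style) insertion, and the candidate is kept if it
    is not worse than the incumbent."""
    current = list(sequence)
    best = list(sequence)
    for _ in range(iterations):
        removed = random.sample(current, d)               # destruction
        partial = [job for job in current if job not in removed]
        for job in removed:                               # construction
            partial = greedy_insert(partial, job)
        if makespan(partial) <= makespan(current):        # simple acceptance
            current = partial
        if makespan(current) < makespan(best):
            best = list(current)
    return best
```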
Ant Colony Optimization -- Artificial Ants as a Computational Intelligence Technique
- IEEE COMPUT. INTELL. MAG., 2006
"... ..."
(Show Context)
A comparison of the performance of different metaheuristics on the timetabling problem
- IN: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON PRACTICE AND THEORY OF AUTOMATED TIMETABLING (PATAT 2002), 2002
"... ..."
(Show Context)
Ants can solve Constraint Satisfaction Problems
- IEEE Transactions on Evolutionary Computation, 2001
"... In this paper we describe a new incomplete approach for solving constraint satisfaction problems (CSPs) based on the ant colony optimization (ACO) metaheuristic. The idea is to use artificial ants to keep track of promising areas of the search space by laying trails of pheromone. This pheromone info ..."
Abstract
-
Cited by 33 (11 self)
In this paper we describe a new incomplete approach for solving constraint satisfaction problems (CSPs) based on the ant colony optimization (ACO) metaheuristic. The idea is to use artificial ants to keep track of promising areas of the search space by laying trails of pheromone. This pheromone information is used to guide the search, as a heuristic for choosing values to be assigned to variables.
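A minimal sketch of the pheromone-guided value choice for a single variable is given below; the `pheromone` table, the `heuristic` function, and the weighting exponents are assumptions made for illustration, not the exact construction rule of the paper.

```python
import random

def choose_value(variable, domain, pheromone, heuristic, alpha=1.0, beta=1.0):
    """Pick a value for one CSP variable with probability proportional to
    pheromone[(variable, value)]**alpha * heuristic(variable, value)**beta,
    i.e. learned desirability combined with a problem-specific score such
    as the number of newly violated constraints."""
    weights = [pheromone[(variable, v)] ** alpha * heuristic(variable, v) ** beta
               for v in domain]
    return random.choices(domain, weights=weights, k=1)[0]
```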
An application of Iterated Local Search to Graph Coloring Problem
- PROCEEDINGS OF THE COMPUTATIONAL SYMPOSIUM ON GRAPH COLORING AND ITS GENERALIZATIONS, 2002
"... Graph coloring is a well known problem from graph theory that, when solving it with local search algorithms, is typically treated as a series of constraint satisfaction problems: for a given number of colors k, one has to find a feasible coloring; once such a coloring is found, the number of colo ..."
Abstract
-
Cited by 32 (2 self)
Graph coloring is a well-known problem from graph theory that, when solved with local search algorithms, is typically treated as a series of constraint satisfaction problems: for a given number of colors k, one has to find a feasible coloring; once such a coloring is found, the number of colors is decreased and the local search starts again. Here we explore the application of Iterated Local Search to the graph coloring problem. Iterated Local Search is a simple and powerful metaheuristic that has shown very good results for a variety of optimization problems. In our research we investigate different perturbation schemes and present computational results on some hard instances from the DIMACS benchmark suite.
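The overall strategy can be outlined as below; `solve_k_coloring` stands in for an iterated-local-search routine (perturbation plus local search on the number of conflicting edges) that either returns a legal k-coloring or gives up within its budget. The decomposition into a series of constraint satisfaction problems is from the abstract; the helper and its interface are assumptions.

```python
def minimize_colors(graph, initial_k, solve_k_coloring):
    """Treat graph coloring as a series of constraint satisfaction problems:
    try to find a legal k-coloring, and on success decrease k and restart
    the (iterated) local search with one colour fewer."""
    k, best = initial_k, None
    while k > 0:
        coloring = solve_k_coloring(graph, k)
        if coloring is None:
            break                      # no legal k-coloring found in time
        best, k = coloring, k - 1      # success: try with one fewer colour
    return best
```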
A Review on the Ant Colony Optimization Metaheuristic: Basis, Models and New Trends
- Mathware & Soft Computing, 2002
"... Ant Colony Optimization (ACO) is a recent metaheuristic method that is inspired by the behavior of real ant colonies. In this paper, we review the underlying ideas of this approach that lead from the biological inspiration to the ACO metaheuristic, which gives a set of rules of how to apply ACO ..."
Abstract
-
Cited by 31 (2 self)
Ant Colony Optimization (ACO) is a recent metaheuristic method that is inspired by the behavior of real ant colonies. In this paper, we review the underlying ideas of this approach that lead from the biological inspiration to the ACO metaheuristic, which gives a set of rules for how to apply ACO algorithms to challenging combinatorial problems. We present some of the algorithms that were developed under this framework, give an overview of current applications, and analyze the relationship between ACO and some of the best-known metaheuristics. In addition, we describe recent theoretical developments in the field, and we conclude by pointing out several new trends and research directions in this field.
Combining metaheuristics and exact algorithms in combinatorial optimization: a survey and classification
- In: Proc. of the First International Work-Conference on the Interplay Between Natural and Artificial Computation, LNCS, 2005
"... Abstract. In this survey we discuss different state-of-the-art approaches of combining exact algorithms and metaheuristics to solve combinatorial optimization problems. Some of these hybrids mainly aim at providing optimal solutions in shorter time, while others primarily focus on getting better heu ..."
Abstract
-
Cited by 30 (3 self)
In this survey we discuss different state-of-the-art approaches for combining exact algorithms and metaheuristics to solve combinatorial optimization problems. Some of these hybrids mainly aim at providing optimal solutions in shorter time, while others primarily focus on obtaining better heuristic solutions. The two main categories into which we divide the approaches are collaborative versus integrative combinations, and we further classify the different techniques in a hierarchical way. Altogether, the surveyed work on combinations of exact algorithms and metaheuristics documents the usefulness and strong potential of this research direction.