Results 1–10 of 47
Disjoint pattern database heuristics
Artificial Intelligence, 2002
Cited by 140 (35 self)
Abstract:
We explore a method for computing admissible heuristic evaluation functions for search problems. It utilizes pattern databases (Culberson & Schaeffer, 1998), which are precomputed tables of the exact cost of solving various subproblems of an existing problem. Unlike standard pattern database heuristics, however, we partition our problems into disjoint subproblems, so that the costs of solving the different subproblems can be added together without overestimating the cost of solving the original problem. Previously (Korf & Felner, 2002) we showed how to statically partition the sliding-tile puzzles into disjoint groups of tiles to compute an admissible heuristic, using the same partition for each state and problem instance. Here we extend the method and show that it applies to other domains as well. We also present another method for additive heuristics which we call dynamically partitioned pattern databases. Here we partition the problem into disjoint subproblems for each state of the search dynamically. We discuss the pros and cons of each of these methods and apply both methods to three different problem domains: the sliding-tile puzzles, the 4-peg Towers of Hanoi problem, and finding an optimal vertex cover of a graph. We find that in some problem domains, static partitioning is most effective, while in others dynamic partitioning is a better choice. In each of these problem domains, either statically partitioned or dynamically partitioned pattern database heuristics are the best known heuristics for the problem.
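The additivity idea in this abstract can be sketched in a few lines. A real pattern database is built by breadth-first search in the abstract space; in this illustrative sketch, per-group Manhattan distance stands in for the precomputed table values (Manhattan distance is itself an additive heuristic over singleton groups), and the partition below is one possible static grouping of the 8-puzzle tiles:

```python
GROUPS = [(1, 2, 3), (4, 5, 6), (7, 8)]  # one disjoint partition of the 8-puzzle tiles

def group_cost(state, group, width=3):
    # state maps tile -> current position index; the goal position of tile t is t.
    # Per-group Manhattan distance stands in for a precomputed PDB value.
    total = 0
    for tile in group:
        pos, goal = state[tile], tile
        total += abs(pos % width - goal % width) + abs(pos // width - goal // width)
    return total

def additive_h(state):
    # Because the groups are disjoint and each move advances exactly one
    # tile (hence one group), per-group costs can be summed without
    # overestimating the true solution cost.
    return sum(group_cost(state, g) for g in GROUPS)

state = {1: 1, 2: 2, 3: 5, 4: 4, 5: 3, 6: 6, 7: 8, 8: 7}
print(additive_h(state))  # 6
```

The same lookup-and-sum structure applies when the stand-in function is replaced by real pattern database tables.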
Memory-Based Heuristics for Explicit State Spaces
Proceedings of the Twenty-First International Joint Conference on Artificial Intelligence (IJCAI-09), 2009
Cited by 32 (19 self)
Abstract:
In many scenarios, quickly solving a relatively small search problem with an arbitrary start and arbitrary goal state is important (e.g., GPS navigation). In order to speed this process, we introduce a new class of memory-based heuristics, called true distance heuristics, which store true distances between some pairs of states in the original state space; these distances can then be used to derive a heuristic between any pair of states. We provide a number of techniques for using and improving true distance heuristics such that most of the benefits of the all-pairs shortest-path computation can be gained with less than 1% of the memory. Experimental results on a number of domains show a 6- to 14-fold improvement in search speed compared to traditional heuristics.
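One simple way to turn stored true distances into a heuristic between arbitrary pairs of states is a differential heuristic from a single pivot: by the triangle inequality, |d(p,s) − d(p,t)| never overestimates d(s,t). A minimal sketch, with a toy graph and pivot chosen purely for illustration:

```python
from collections import deque

def bfs_distances(graph, source):
    # True distances from `source` in an unweighted graph.
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Toy undirected graph (adjacency lists).
graph = {
    'a': ['b'], 'b': ['a', 'c'], 'c': ['b', 'd'],
    'd': ['c', 'e'], 'e': ['d'],
}

pivot_dist = bfs_distances(graph, 'a')  # true distances stored for one pivot

def differential_h(s, t):
    # |d(p,s) - d(p,t)| <= d(s,t) by the triangle inequality, so this is
    # admissible for any start/goal pair, from one stored distance table.
    return abs(pivot_dist[s] - pivot_dist[t])

print(differential_h('b', 'e'))  # 3, and the true distance d(b,e) is also 3
```

The paper's true distance heuristics are a broader family with several refinements; this shows only the basic mechanism of reusing stored true distances for arbitrary pairs.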
Limited discrepancy beam search
In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), 2005
Cited by 30 (2 self)
Abstract:
Beam search reduces the memory consumption of best-first search at the cost of finding longer paths, but its memory consumption can still exceed the given memory capacity quickly. We therefore develop BULB (Beam search Using Limited discrepancy Backtracking), a complete memory-bounded search method that is able to solve more instances of large search problems than beam search, and does so with a reasonable runtime. At the same time, BULB tends to find shorter paths than beam search because it is able to use larger beam widths without running out of memory. We demonstrate these properties of BULB experimentally for three standard benchmark domains.
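For context, the plain beam search that BULB builds on can be sketched as follows. This keeps only the best `beam_width` nodes per layer; BULB additionally backtracks over the discarded nodes ("discrepancies"), which is what makes it complete, and that part is omitted here:

```python
from heapq import nsmallest

def beam_search(start, goal, successors, h, beam_width):
    # Layered beam search: expand the current layer, keep only the
    # beam_width children with the best heuristic values. Incomplete:
    # pruning a layer may discard every node on the path to the goal.
    layer = [(h(start), start, [start])]
    visited = {start}
    while layer:
        for _, node, path in layer:
            if node == goal:
                return path
        children = []
        for _, node, path in layer:
            for c in successors(node):
                if c not in visited:
                    visited.add(c)
                    children.append((h(c), c, path + [c]))
        layer = nsmallest(beam_width, children)
    return None

# Toy space: states are integers, moves add 1 or 2, h is distance to goal.
path = beam_search(0, 5, lambda n: [n + 1, n + 2], lambda n: abs(5 - n), 2)
print(path)  # [0, 2, 4, 5]
```

The memory bound comes from `beam_width` capping the layer size; the path-length cost comes from the greedy pruning, which is exactly the trade-off the abstract describes.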
Maximizing over multiple pattern databases speeds up heuristic search
Artificial Intelligence, 2006
Cited by 26 (13 self)
Abstract:
A pattern database (PDB) is a heuristic function stored as a lookup table. This paper considers how best to use a fixed amount (m units) of memory for storing pattern databases. In particular, we examine whether using n pattern databases of size m/n instead of one pattern database of size m improves search performance. In all the state spaces considered, the use of multiple smaller pattern databases reduces the number of nodes generated by IDA*. The paper provides an explanation for this phenomenon based on the distribution of heuristic values that occur during search.
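The mechanism being compared can be made concrete with stand-in tables (all state names and values below are illustrative, not taken from the paper). Each small PDB is a lookup under a different abstraction, and the combined heuristic is their pointwise maximum, which remains admissible and dominates each component:

```python
def make_pdb(abstract, table):
    # `abstract` projects a concrete state to its pattern; `table` holds
    # the (here, made-up) precomputed cost of solving that pattern.
    return lambda state: table[abstract(state)]

# Two small PDBs over different projections of a 4-tuple state.
h1 = make_pdb(lambda s: s[:2], {(0, 1): 0, (1, 0): 2, (2, 3): 3})
h2 = make_pdb(lambda s: s[2:], {(2, 3): 0, (3, 2): 2, (0, 1): 3})

def h_max(state):
    # The max of admissible heuristics is admissible and at least as
    # informed as either one alone.
    return max(h1(state), h2(state))

print(h_max((1, 0, 3, 2)))  # max(2, 2) = 2
```

The paper's empirical question is whether n such m/n-sized tables, maxed together, beat one m-sized table; the code only shows the lookup-and-max structure.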
Recent progress in heuristic search: A case study of the four-peg Towers of Hanoi problem
In: International Joint Conference on Artificial Intelligence (IJCAI-07)
Cited by 22 (9 self)
Abstract:
We integrate a number of recent advances in heuristic search, and apply them to the four-peg Towers of Hanoi problem. These include frontier search, disk-based search, multiple compressed disjoint additive pattern database heuristics, and breadth-first heuristic search. The main new idea we introduce here is the use of pattern database heuristics to search for any of a number of explicit goal states, with no overhead compared to a heuristic for a single goal state. We perform the first complete breadth-first searches of the 21- and 22-disc four-peg Towers of Hanoi problems, and extend the verification of a “presumed optimal solution” to this problem from 24 to 30 discs, a problem that is 4096 times larger.
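The multi-goal idea can be sketched simply: a backward breadth-first search seeded with all goal states at depth 0 produces, in a single pass, the distance to the nearest goal from every state, at the same cost as building the table for one goal. A toy illustration (the state space and goal set here are hypothetical):

```python
from collections import deque

def multi_goal_table(goals, predecessors):
    # Seeding BFS with ALL goal states at depth 0 yields distance to the
    # NEAREST goal from every reachable state -- no overhead versus a
    # single-goal table.
    dist = {g: 0 for g in goals}
    q = deque(goals)
    while q:
        u = q.popleft()
        for p in predecessors(u):
            if p not in dist:
                dist[p] = dist[u] + 1
                q.append(p)
    return dist

# Toy undirected space: states 0..9 in a line, with two goal states.
preds = lambda n: [m for m in (n - 1, n + 1) if 0 <= m <= 9]
table = multi_goal_table([0, 9], preds)
print(table[4], table[7])  # 4 (nearest goal is 0), 2 (nearest goal is 9)
```

Distance-to-nearest-goal is a lower bound on distance to any particular goal, so the resulting heuristic stays admissible for a search aimed at any one of them.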
Partial-expansion A* with selective node generation
In Proceedings of AAAI, 2012
Cited by 15 (8 self)
Abstract:
A* is often described as being ‘optimal’, in that it expands the minimum number of unique nodes. But A* may generate many extra nodes which are never expanded. This is a performance loss, especially when the branching factor is large. Partial Expansion A* (PEA*) (Yoshizumi, Miura, and Ishida 2000) addresses this problem: when expanding a node n, it generates all the children of n but only stores children with the same f-cost as n. n is then re-inserted into the OPEN list, but with the f-cost of the next best child. This paper introduces an enhanced version of PEA* (EPEA*). Given a priori domain knowledge, EPEA* generates only the children with the same f-cost as the parent. EPEA* is generalized to its iterative-deepening variant, EPE-IDA*. For some domains, these algorithms yield substantial performance improvements. State-of-the-art results were obtained for the pancake puzzle and for some multi-agent pathfinding instances. Drawbacks of EPEA* are also discussed.
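A single PEA* expansion step, as described above, might be sketched like this (a simplification that assumes a consistent heuristic, so no child f-value falls below the parent's; the function name is illustrative):

```python
def pea_star_expand(parent_f, children_f):
    # One PEA* expansion: store only the children whose f-value equals
    # the parent's; re-insert the parent into OPEN with the smallest
    # larger child f-value, or retire the parent when no child remains.
    stored = [f for f in children_f if f == parent_f]
    deferred = [f for f in children_f if f > parent_f]
    new_parent_f = min(deferred) if deferred else None
    return stored, new_parent_f

print(pea_star_expand(5, [5, 7, 5, 6]))  # ([5, 5], 6)
```

EPEA*'s refinement is to avoid ever generating the deferred children, using domain knowledge to predict each child's f-value before creating it; that prediction step is domain-specific and not shown here.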
Learning from multiple heuristics
In Proceedings of AAAI-08, 2008
Cited by 15 (5 self)
Abstract:
Heuristic functions for single-agent search applications estimate the cost of the optimal solution. When multiple heuristics exist, taking their maximum is an effective way to combine them. A new technique is introduced for combining multiple heuristic values. Inspired by the evaluation functions used in two-player games, the different heuristics in a single-agent application are treated as features of the problem domain. An ANN (artificial neural network) is used to combine these features into a single heuristic value. This idea has been implemented for the sliding-tile puzzle and the 4-peg Towers of Hanoi, two classic single-agent search domains. Experimental results show that this technique can lead to a large reduction in the search effort at a small cost in the quality of the solution obtained.
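The feature-combination idea can be sketched with a much simpler learner than the paper's ANN: treat the individual heuristics as features and fit a combiner to (features → true cost) examples. Here a single linear unit trained by SGD stands in for the network, and the training data is synthetic; note that, as the abstract's "small cost in quality" hints, a learned combination is generally not guaranteed admissible:

```python
import random

random.seed(0)
# Synthetic examples: features (h1, h2), true cost = 0.7*h1 + 0.5*h2.
data = [((h1, h2), 0.7 * h1 + 0.5 * h2)
        for h1 in range(10) for h2 in range(10)]

# One linear unit trained by stochastic gradient descent on squared error.
w = [0.0, 0.0]
lr = 0.001
for _ in range(200):
    random.shuffle(data)
    for (f1, f2), y in data:
        err = (w[0] * f1 + w[1] * f2) - y
        w[0] -= lr * err * f1
        w[1] -= lr * err * f2

print([round(x, 2) for x in w])  # recovers weights close to [0.7, 0.5]
```

The paper's actual combiner is a trained neural network over real heuristic features from the search domains; this only demonstrates the "heuristics as features, learn the combination" structure.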
Solving the 24-puzzle with instance-dependent pattern databases
In Proceedings of SARA-05, 2005
Cited by 12 (3 self)
Abstract:
A pattern database (PDB) is a heuristic function in the form of a lookup table which stores the cost of optimal solutions for instances of subproblems. These subproblems are generated by abstracting the entire search space into a smaller space called the pattern space. Traditionally, the entire pattern space is generated and each distinct pattern has an entry in the pattern database. Recently, [10] described a method for reducing pattern database memory requirements by storing only the pattern database values for a specific instance of start and goal state, thus enabling larger PDBs to be used and achieving a speedup in the search. We enhance their method by dynamically growing the pattern database until memory is full, thereby allowing any amount of available memory to be used. We also show that memory can be saved by storing a hierarchy of PDBs. Experimental results on the large 24 sliding-tile puzzle show improvements of up to a factor of 40 over previous benchmark results [8].
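Growing a PDB until a memory budget is exhausted might look like the following sketch (toy pattern space and hypothetical function names). The table is filled breadth-first from the goal pattern; any pattern not stored is at least as deep as the last expanded layer, so that depth is an admissible fallback value:

```python
from collections import deque

def grow_pdb(goal_pattern, predecessors, max_entries):
    # Backward BFS from the goal pattern, stopping when the entry budget
    # (a stand-in for "memory is full") is reached.
    table = {goal_pattern: 0}
    q = deque([goal_pattern])
    depth = 0
    while q and len(table) < max_entries:
        u = q.popleft()
        depth = table[u]  # depth of the last expanded layer
        for p in predecessors(u):
            if p not in table:
                table[p] = table[u] + 1
                q.append(p)
    return table, depth

def h(pattern, table, fallback):
    # Unstored patterns lie beyond the expanded layers, so `fallback`
    # never overestimates their true cost.
    return table.get(pattern, fallback)

# Toy pattern space: integers, with predecessors n-1 and n+1.
preds = lambda n: [n - 1, n + 1]
table, fallback = grow_pdb(0, preds, max_entries=7)
print(h(2, table, fallback), h(50, table, fallback))  # 2 2
```

The paper's method is tailored to a specific start/goal instance and adds a hierarchy of PDBs; this shows only the grow-until-full mechanism.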
The Compression Power of Symbolic Pattern Databases
Cited by 11 (3 self)
Abstract:
The heuristics used for planning and search often take the form of pattern databases generated from abstracted versions of the given state space. Pattern databases are typically stored as lookup tables, which limits the size of the abstract state space and therefore the quality of the heuristic that can be used with a given amount of memory. At the AIPS-2002 conference Stefan Edelkamp introduced an alternative representation, called symbolic pattern databases, which, for the Blocks World, required two orders of magnitude less memory than a lookup table to store a pattern database. This paper presents experimental evidence that Edelkamp’s result is not restricted to a single domain. Symbolic pattern databases, in the form of Algebraic Decision Diagrams, are one or more orders of magnitude smaller than lookup tables on a wide variety of problem domains and abstractions.
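A crude intuition for why symbolic representations compress well: a PDB has many entries but usually few distinct heuristic values, so grouping states by value already shrinks the representation. The grouping below is only a rough stand-in for the shared structure an ADD exploits; real symbolic PDBs encode these groups as decision diagrams, not explicit sets, and the toy table here is made up:

```python
# Toy PDB: 1000 abstract states, but only 5 distinct heuristic values.
pdb = {state: state % 5 for state in range(1000)}

# Invert the table: one entry per distinct value instead of per state.
by_value = {}
for state, cost in pdb.items():
    by_value.setdefault(cost, set()).add(state)

print(len(pdb), len(by_value))  # 1000 entries collapse to 5 value-groups
```

An Algebraic Decision Diagram goes further by also compressing each group's state set through shared subgraphs, which is where the orders-of-magnitude savings reported above come from.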