Iterative deepening A*
Iterative deepening A* (IDA*) is a graph traversal and [[pathfinding|path search]] algorithm that can find the shortest path between a designated start node and any member of a set of goal nodes in a weighted graph. It is a variant of iterative deepening depth-first search that borrows from the [[a*-search-algorithm]] the idea of using a heuristic function to conservatively estimate the remaining cost to reach the goal. Since it is a depth-first search algorithm, its memory usage is lower than that of A*, but unlike ordinary iterative deepening search, it concentrates on exploring the most promising nodes and thus does not go to the same depth everywhere in the search tree. Unlike A*, IDA* does not utilize dynamic programming and therefore often ends up exploring the same nodes many times.
While standard iterative deepening depth-first search uses search depth as the cutoff for each iteration, IDA* uses the more informative <math>f(n) = g(n) + h(n)</math>, where <math>g(n)</math> is the cost to travel from the root to node <math>n</math> and <math>h(n)</math> is a problem-specific heuristic estimate of the cost to travel from <math>n</math> to the goal.
The algorithm was first described by Richard E. Korf in 1985.
Description
Iterative-deepening-A* works as follows: at each iteration, perform a depth-first search, cutting off a branch when its total cost <math>f(n) = g(n) + h(n)</math> exceeds a given threshold. This threshold starts at the heuristic estimate of the cost at the initial state, and increases for each iteration of the algorithm. At each iteration, the threshold used for the next iteration is the minimum of all <math>f</math>-values that exceeded the current threshold.
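The iteration scheme above can be sketched in Python as follows. This is a minimal illustration, not a reference implementation; the graph representation (a `neighbors` function yielding `(successor, step_cost)` pairs) and all names are assumptions for the example.

```python
from math import inf

def ida_star(start, h, neighbors, is_goal):
    """Iterative deepening A*: repeated depth-first searches with an
    increasing threshold on f(n) = g(n) + h(n).

    neighbors(node) yields (successor, step_cost) pairs.
    Returns (path, cost) on success, or (None, inf) if no goal is reachable.
    """
    threshold = h(start)   # initial threshold: heuristic estimate at the root
    path = [start]         # only the current path is kept in memory

    def search(g, threshold):
        node = path[-1]
        f = g + h(node)
        if f > threshold:
            return False, f            # cutoff: report f for the next threshold
        if is_goal(node):
            return True, g             # g is the cost of the path found
        minimum = inf                  # smallest f-value that exceeded the threshold
        for succ, cost in neighbors(node):
            if succ in path:           # avoid cycles on the current path
                continue
            path.append(succ)
            found, t = search(g + cost, threshold)
            if found:
                return True, t
            minimum = min(minimum, t)
            path.pop()
        return False, minimum

    while True:
        found, t = search(0, threshold)
        if found:
            return list(path), t
        if t == inf:                   # search space exhausted, no goal found
            return None, inf
        threshold = t                  # next threshold: min f that exceeded current
```

For example, on a four-node graph with an admissible heuristic, `ida_star` returns the cheapest path even when a shorter-looking edge leads to a costlier route.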
IDA* is beneficial when the problem is memory constrained. A* search keeps a large queue of unexplored nodes that can quickly fill up memory. By contrast, because IDA* does not remember any node except the ones on the current path, it requires an amount of memory that is only linear in the length of the solution that it constructs. Its time complexity is analyzed by Korf et al. under the assumption that the heuristic cost estimate is consistent, meaning that
:<math>h(n) \le \mathrm{cost}(n, n') + h(n')</math>
for all nodes <math>n</math> and all neighbors <math>n'</math> of <math>n</math>; they conclude that compared to a brute-force tree search over an exponential-sized problem, IDA* achieves a smaller search depth (by a constant factor), but not a smaller branching factor.
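The consistency condition can be checked mechanically on a small explicit graph. The helper below is a hypothetical illustration (its name and the graph representation are assumptions, matching the `(successor, step_cost)` convention used for neighbor lists):

```python
def is_consistent(h, neighbors, nodes):
    """Check h(n) <= cost(n, n') + h(n') for every edge (n, n')."""
    return all(h(n) <= cost + h(succ)
               for n in nodes
               for succ, cost in neighbors(n))
```

A consistent heuristic is also admissible, so a heuristic that passes this check never overestimates the true remaining cost.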
Recursive best-first search is another memory-constrained version of A* search that can be faster in practice than IDA*, since it regenerates fewer nodes. Mahanti et al. (1992) further showed that this limitation is not exclusive to IDA*: no best-first limited-memory heuristic search algorithm can universally achieve <math>O(N)</math> complexity on trees due to memory constraints. They also specify that IDA* achieves <math>O(N)</math> complexity only if the number of iterations with minimal node growth is bounded, a condition not always satisfied. To avoid the worst-case scenario, Iterative Budgeted Exponential Search (IBEX) was introduced in 2019.