# Consistent heuristic

In the study of path-finding problems in artificial intelligence, a heuristic function is said to be consistent, or monotone, if its estimate is always less than or equal to the estimated distance from any neighbouring vertex to the goal, plus the cost of reaching that neighbour.

Formally, for every node N and each successor P of N, the estimated cost of reaching the goal from N is no greater than the step cost of getting to P plus the estimated cost of reaching the goal from P. That is:

$h(N)\leq c(N,P)+h(P)$ and
$h(G)=0,$

where

• h is the consistent heuristic function
• N is any node in the graph
• P is any successor of N
• G is any goal node
• c(N,P) is the cost of reaching node P from N
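
The definition above can be checked mechanically on an explicit graph. A minimal sketch, where the graph, edge costs, and heuristic values are illustrative assumptions rather than anything from the text:

```python
def is_consistent(graph, h, goals):
    """Check h(G) = 0 for every goal and h(N) <= c(N, P) + h(P) for every edge.

    graph: dict mapping node -> list of (successor, edge_cost) pairs.
    """
    if any(h[g] != 0 for g in goals):
        return False
    return all(h[n] <= cost + h[p]
               for n, edges in graph.items()
               for p, cost in edges)

# Example graph: S -> A -> G, plus a direct S -> G shortcut.
graph = {
    "S": [("A", 1), ("G", 4)],
    "A": [("G", 2)],
    "G": [],
}
h_good = {"S": 3, "A": 2, "G": 0}  # consistent on this graph
h_bad = {"S": 3, "A": 1, "G": 0}   # admissible here, but h(S) > c(S,A) + h(A)

print(is_consistent(graph, h_good, {"G"}))  # True
print(is_consistent(graph, h_bad, {"G"}))   # False
```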

A consistent heuristic is also admissible, i.e. it never overestimates the cost of reaching the goal (the converse, however, is not always true). This is proved by induction on $m$, the length of the best path from a node to the goal. Let $h^{*}(n)$ denote the cost of the shortest path from $n$ to the goal. In the base case, $N_{0}$ is a goal node, so $h(N_{0})=0=h^{*}(N_{0})$. By the inductive hypothesis, $h(N_{m})\leq h^{*}(N_{m})$. Therefore,

$h(N_{m+1})\leq c(N_{m+1},N_{m})+h(N_{m})\leq c(N_{m+1},N_{m})+h^{*}(N_{m})=h^{*}(N_{m+1})$ ,

making it admissible. ($N_{m+1}$ is any node whose best path to the goal, of length m+1, goes through some immediate child $N_{m}$ whose best path to the goal is of length m.)
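
The claim can be illustrated numerically: compute the true cost-to-goal $h^{*}$ with Dijkstra's algorithm run backwards from the goal, and verify that a consistent heuristic never exceeds it. The graph and heuristic below are illustrative assumptions:

```python
import heapq

def true_costs(graph, goal):
    """Dijkstra from the goal over reversed edges -> h*(n) for every node n."""
    reverse = {n: [] for n in graph}
    for n, edges in graph.items():
        for p, cost in edges:
            reverse[p].append((n, cost))
    dist = {goal: 0}
    queue = [(0, goal)]
    while queue:
        d, n = heapq.heappop(queue)
        if d > dist.get(n, float("inf")):
            continue  # stale queue entry
        for m, cost in reverse[n]:
            if d + cost < dist.get(m, float("inf")):
                dist[m] = d + cost
                heapq.heappush(queue, (d + cost, m))
    return dist

graph = {"S": [("A", 1), ("G", 4)], "A": [("G", 2)], "G": []}
h = {"S": 3, "A": 2, "G": 0}  # consistent on this graph

h_star = true_costs(graph, "G")
print(h_star)                                  # {'G': 0, 'A': 2, 'S': 3}
print(all(h[n] <= h_star[n] for n in graph))   # True: h is admissible
```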

## Consequences of monotonicity

Consistent heuristics are called monotone because the estimated final cost of a partial solution, $f(N_{j})=g(N_{j})+h(N_{j})$, is monotonically non-decreasing along the best path to the goal, where $g(N_{j})=\sum _{i=2}^{j}c(N_{i-1},N_{i})$ is the cost of the best path from start node $N_{1}$ to $N_{j}$. Indeed, $f(N_{j+1})-f(N_{j})=c(N_{j},N_{j+1})+h(N_{j+1})-h(N_{j})\geq 0$ by consistency. Obeying this triangle inequality, $h(N)\leq c(N,P)+h(P)$, is thus both necessary and sufficient for a heuristic to be consistent.
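
A short sketch of this monotonicity along a single path; the path, step costs, and heuristic values are illustrative assumptions:

```python
# Track f = g + h along a path under a heuristic that is consistent
# for these step costs.
path = ["S", "A", "G"]
step_cost = {("S", "A"): 1, ("A", "G"): 2}  # c(N_{i-1}, N_i)
h = {"S": 3, "A": 2, "G": 0}                # consistent heuristic values

g = 0
f_values = [g + h[path[0]]]
for prev, node in zip(path, path[1:]):
    g += step_cost[(prev, node)]
    f_values.append(g + h[node])

print(f_values)                                              # [3, 3, 3]
print(all(a <= b for a, b in zip(f_values, f_values[1:])))   # True
```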

In the A* search algorithm, using a consistent heuristic means that once a node is expanded, the cost by which it was reached is the lowest possible, under the same conditions that Dijkstra's algorithm requires in solving the shortest path problem (no negative cost cycles). In fact, if the search graph is given cost $c'(N,P)=c(N,P)+h(P)-h(N)$ for a consistent $h$, then A* is equivalent to running Dijkstra's algorithm on that reweighted graph. In the unusual event that an admissible heuristic is not consistent, a node will need repeated expansion every time a new best (so-far) cost is achieved for it.
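
The reweighting works because consistency makes every reduced cost $c'(N,P)$ non-negative, which is exactly the condition Dijkstra's algorithm needs. A small check on an assumed example graph:

```python
# For a consistent h, c'(N, P) = c(N, P) + h(P) - h(N) >= 0 on every edge,
# so Dijkstra's algorithm applies directly to the reweighted graph.
graph = {"S": [("A", 1), ("G", 4)], "A": [("G", 2)], "G": []}
h = {"S": 3, "A": 2, "G": 0}  # consistent on this graph

reduced = {(n, p): cost + h[p] - h[n]
           for n, edges in graph.items()
           for p, cost in edges}

print(reduced)  # {('S', 'A'): 0, ('S', 'G'): 1, ('A', 'G'): 0}
print(all(c >= 0 for c in reduced.values()))  # True
```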

If the given heuristic $h$ is admissible but not consistent, one can artificially force the heuristic values along a path to be monotonically non-decreasing by using

$h'(P)\gets \max(h(P),h'(N)-c(N,P))$ as the heuristic value for $P$ instead of $h(P)$, where $N$ is the node immediately preceding $P$ on the path and $h'(\mathrm{start})=h(\mathrm{start})$. This idea is due to László Mérő and is now known as pathmax. Contrary to common belief, pathmax does not turn an admissible heuristic into a consistent one. For example, if A* uses pathmax and a heuristic that is admissible but not consistent, it is not guaranteed to have found an optimal path to a node when that node is first expanded.
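
The pathmax update can be sketched along a single path; the path, costs, and heuristic values below are illustrative assumptions, with an h that is admissible but not consistent:

```python
# Pathmax: h'(P) = max(h(P), h'(N) - c(N, P)) along a path, with
# h'(start) = h(start). This repairs monotonicity of f along the path
# but does not make the heuristic consistent in general.
path = ["S", "A", "G"]
step_cost = {("S", "A"): 1, ("A", "G"): 2}
h = {"S": 3, "A": 1, "G": 0}  # admissible here, but h(S) > c(S,A) + h(A)

h_prime = {path[0]: h[path[0]]}
for prev, node in zip(path, path[1:]):
    h_prime[node] = max(h[node], h_prime[prev] - step_cost[(prev, node)])

print(h_prime)  # {'S': 3, 'A': 2, 'G': 0}: h'(A) was raised from 1 to 2
```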