If we want to find the length of the shortest path
between all pairs of vertices,
we could run Dijkstra's algorithm *n* times, once from each possible
starting vertex.
This yields a cubic time algorithm for all-pairs shortest path, since
*n · O(n²) = O(n³)*.
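As an illustrative sketch of this repeated-Dijkstra approach (the function names and graph representation are my own, not from the text), one run of Dijkstra's algorithm per starting vertex gives all pairs:

```python
import heapq

def dijkstra(adj, src):
    """Shortest path lengths from src.
    adj: dict mapping vertex -> list of (neighbor, weight) pairs."""
    dist = {v: float("inf") for v in adj}
    dist[src] = 0
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue  # stale heap entry; a shorter path was already found
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

def all_pairs_via_dijkstra(adj):
    # One Dijkstra run per starting vertex: n runs in total.
    return {v: dijkstra(adj, v) for v in adj}
```

With a heap-based Dijkstra the cost per run depends on the number of edges, but with the simple array implementation each run is *O(n²)*, giving the cubic bound above.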

Can we do better? Significantly improving the complexity is still an open question, but there is a superslick dynamic programming algorithm that also runs in *O(n³)*.

There are several ways to characterize the shortest path between two nodes
in a graph.
The Floyd-Warshall algorithm starts by numbering
the vertices of the graph from *1* to *n*.
We use these numbers here not to label the vertices,
but to order them.
Define *d^k(i,j)* to be the length of the shortest path from *i* to *j*
using only vertices numbered
from 1, 2, ..., *k* as possible intermediate vertices.

What does this mean?
When *k* = 0, we are allowed no intermediate vertices, so
that every path consists
of at most one edge.
Thus *d^0(i,j) = w(i,j)*, the weight of the edge from *i* to *j* (infinity if no such edge exists).
In general, adding a new vertex *k+1* as a possible intermediary
helps only if there is a short path that goes through it, so

*d^{k+1}(i,j) = min( d^k(i,j), d^k(i,k+1) + d^k(k+1,j) )*
This recurrence performs only a constant amount of work per cell. The following dynamic programming algorithm implements the recurrence:

    Floyd(G)
        Let d = w, the weight matrix of G
        for k = 1 to n
            for i = 1 to n
                for j = 1 to n
                    d[i,j] = min( d[i,j], d[i,k] + d[k,j] )
The Floyd-Warshall all-pairs shortest path algorithm runs in *O(n³)* time,
which asymptotically is no better than *n* calls to Dijkstra's algorithm.
However, the loops are so tight and the program so short that it runs
faster in practice.
It is also notable as one of the rare algorithms that work better on adjacency
matrices than adjacency lists.
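The pseudocode above translates almost line for line into a runnable version working directly on a weight matrix (the function name and test matrix here are my own illustration, not from the text):

```python
INF = float("inf")

def floyd_warshall(w):
    """All-pairs shortest path lengths from weight matrix w.
    w[i][j] is the weight of edge i->j, INF if absent, 0 on the diagonal."""
    n = len(w)
    d = [row[:] for row in w]  # start from d^0 = w
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # Does routing through vertex k shorten the i->j path?
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

Note that only a single matrix *d* is kept: overwriting *d^k* in place over *d^{k-1}* is safe, since paths through vertex *k* cannot use *k* as an interior vertex twice. This single dense matrix is also why the algorithm is naturally suited to adjacency matrices.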
