Here is an exercise from The Algorithm Design Manual.
Consider the problem of determining whether a given undirected graph G = (V, E) contains a triangle (a cycle of length 3).
If you have an adjacency matrix, you can find triangles by squaring the matrix and checking whether the original matrix and the squared matrix have a non-zero entry in the same position: the (i, j) entry of the square counts length-2 paths from i to j, and if there is also an edge between i and j, those paths close into triangles.
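A minimal sanity check of that squaring idea, using NumPy (the example graph and the helper name `has_triangle` are mine, for illustration):

```python
import numpy as np

def has_triangle(A):
    """A is a 0/1 symmetric adjacency matrix with a zero diagonal.
    (A @ A)[i, j] counts length-2 paths from i to j, so a triangle
    exists exactly when A and A @ A are non-zero in the same place."""
    return bool(((A @ A) * A).any())

# Triangle on nodes 0, 1, 2, plus a pendant node 3 hanging off node 2.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
])

print(has_triangle(A))  # True: nodes 0, 1, 2 form a triangle
```

Note that `*` here is element-wise multiplication, so the product is non-zero exactly where an edge and a length-2 path coincide.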
A naive matrix multiplication takes O(n^3) time, but there are fast matrix multiplication algorithms that do better. One of the best known is the Coppersmith–Winograd algorithm, which runs in O(n^2.376) time, often rounded up to O(n^2.4). That means the algorithm goes something like:
- O(V^2) time to convert to an adjacency matrix.
- O(V^2.4) time to compute the square of the adjacency matrix.
- O(V^2) time to check the matrices for coinciding non-zero entries.
- O(V) time to narrow down the third node common to the two known nodes.

So overall this takes O(V^2.4) time; more precisely, it takes however long matrix multiplication takes. You can dynamically switch between this algorithm and the do-any-edge's-endpoints-have-a-common-neighbor algorithm that amit explains in their answer to improve that to O(V · min(V^1.4, E)).
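For completeness, the edge-based common-neighbor check mentioned above can be sketched like this (the adjacency-set representation and the name `find_triangle` are my own assumptions, not taken from amit's answer):

```python
def find_triangle(adj):
    """adj maps each node to the set of its neighbors (undirected,
    no self-loops). For each edge (u, v), a triangle exists exactly
    when u and v share a neighbor, which a set intersection detects."""
    for u in adj:
        for v in adj[u]:
            if u < v:  # look at each undirected edge only once
                common = adj[u] & adj[v]
                if common:
                    return (u, v, next(iter(common)))
    return None  # graph is triangle-free

# Square 0-1-2-3-0 with one diagonal edge (1, 3): contains triangles.
square = {0: {1, 3}, 1: {0, 2, 3}, 2: {1, 3}, 3: {0, 1, 2}}
print(find_triangle(square))
```

Each intersection costs O(V) in the worst case, so this runs in O(E · V) time overall, which is why switching to it on sparse graphs gives the combined O(V · min(V^1.4, E)) bound.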
Here's a paper that goes into more depth on the problem.
It's kind of neat how dependent this problem is on theoretical discoveries.
If the conjecture that matrix multiplication can actually be done in roughly quadratic time turns out to be true, then you would get a really nice time bound of O(V^2) or O(V^2 log V) or something like that. And if quantum computers work out, we'll be able to do even better than that (something like O(V^1.3))!