This code is given in the official Python essay on graph theory. Here's the code:
def find_all_paths(graph, start, end, path=[]):
    path = path + [start]
The key to seeing that it is a DFS is that the recursion happens before the accumulation of paths. In other words, the recursion goes as deep as it needs to before putting anything on the "paths" list; all the deepest siblings are accumulated on "paths" before the list is returned.
I believe the code is correct with the "append" rather than "extend", since "paths" is the accumulator of all complete paths and each newpath is appended to it individually. It could probably also be written as
paths += find_all_paths(graph, node, end, path)
(edit) ...instead of
newpaths = find_all_paths(graph, node, end, path)
for newpath in newpaths:
    paths.append(newpath)
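For what it's worth, the three forms are interchangeable inside the loop; here is a minimal sketch using the same graph, node, end, path, and paths names as above (pick any one of the alternatives):

newpaths = find_all_paths(graph, node, end, path)
for newpath in newpaths:
    paths.append(newpath)                                # original: append each complete path one at a time
paths.extend(find_all_paths(graph, node, end, path))     # same effect in a single call
paths += find_all_paths(graph, node, end, path)          # += on a list is just extend in disguise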
Consider the following modifications and execution script:
def find_all_paths(graph, start, end, path=[]):
    # path is rebound (not mutated in place), so the mutable default is safe here
    path = path + [start]
    print('adding %d' % start)
    if start == end:
        return [path]
    if start not in graph:
        return []
    paths = []
    for node in graph[start]:
        if node not in path:
            # recurse first, then accumulate whatever complete paths come back
            paths.extend(find_all_paths(graph, node, end, path))
    print('returning ' + str(paths))
    return paths
G = {1:[2,3,4], 2:[1,4], 3:[1,4], 4:[1,2,3]}
find_all_paths(G, 1, 4)
Output:
adding 1
adding 2
adding 4
returning [[1, 2, 4]]
adding 3
adding 4
returning [[1, 3, 4]]
adding 4
returning [[1, 2, 4], [1, 3, 4], [1, 4]]
Note how the first path is returned before adding 3, and the second path is returned before adding 4.
Yes, this algorithm is indeed a DFS. Notice how it recurses right away (goes into the child) while looping over the nodes, as opposed to a breadth-first search, which would first build a list of viable nodes (everything at the same level of depth, i.e. siblings) and only recurse into them once those do not match the requirement.
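To make the contrast concrete, here is a minimal breadth-first sketch (my own illustration, not from the Python essay); it keeps a FIFO queue of partial paths and always extends the shallowest one first, using the same dict-of-lists graph G as above:

from collections import deque

def find_all_paths_bfs(graph, start, end):
    queue = deque([[start]])        # queue of partial paths, shallowest first
    paths = []
    while queue:
        path = queue.popleft()      # FIFO: take the oldest (shallowest) partial path
        node = path[-1]
        if node == end:
            paths.append(path)      # complete paths come out ordered by length
            continue
        for neighbor in graph.get(node, []):
            if neighbor not in path:            # skip cycles, same check as the recursive version
                queue.append(path + [neighbor])
    return paths

find_all_paths_bfs(G, 1, 4)   # [[1, 4], [1, 2, 4], [1, 3, 4]]

Note how the shortest path [1, 4] now comes out first, whereas the DFS above only produced it last, after exhausting the deeper branches.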