Weighted Graphs and Dijkstra's Algorithm

Weighted Graph

We can add attributes to edges.  We call the attributes weights.

 For example, suppose we use the graph as a map where the vertices are cities and the edges are highways between the cities.

  1. If we want the shortest travel distance between cities, an appropriate weight is the road mileage.

  2. If we are concerned with the dollar cost of a trip and want the cheapest trip, then an appropriate weight for the edges is the cost to travel between the cities.

In both examples we want the shortest trip in terms of the edge weights.  In other words, if we designate the weight of an edge as:


w((vi, vi+1))


then the length of a path, P, is


w(P) = ∑ w((vi, vi+1))  over all edges in P


We call d(u, v) = min w(P), taken over all paths P between u and v, the distance between u and v.

Note that weights can be negative.  If the graph contains a cycle whose total weight is negative, we could make the distance between two vertices as negative as we want by traveling around that cycle many times.  This would generate an unrealistic answer or cause the algorithm to never exit.  So in the case of negative edge weights we have to be careful.
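A tiny arithmetic sketch makes the problem concrete.  The graph below is hypothetical: a two-edge cycle a → b → a whose total weight is -1, attached to some path of weight 5.  Each extra trip around the cycle lowers the "distance" by 1, so no minimum exists.

```python
# Hypothetical negative cycle: w((a, b)) = 3, w((b, a)) = -4.
cycle_weight = 3 + (-4)          # total cycle weight: -1

base_path_weight = 5             # weight of some path that touches the cycle
distances = [base_path_weight + k * cycle_weight for k in range(4)]
print(distances)                 # [5, 4, 3, 2] -- decreases without bound
```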

Optimization and Greedy Algorithms

Our goal is to find the shortest distance from an initial vertex, v, to each vertex of the graph.  (This is not the traveling salesman problem.)  This is an optimization problem.  An optimization problem is a problem where we have a goal to achieve, but we also want to achieve the goal at a minimum cost.  If there are many solutions to the problem, we want the solution that gives the minimum cost.  We can also optimize for maximum benefit.  For example, if someone paid us to go from city to city, then naturally we would want the path that paid us the most.  Optimization is a typical class of problem for computer scientists.

An algorithm that can sometimes solve optimization problems is the greedy algorithm.  In a greedy algorithm we take several small steps toward our goal, and at each step we choose the locally optimal step, the greedy choice.  The solution is built from these small, locally optimal steps.  A greedy algorithm is not guaranteed to give us the globally optimal solution.  If a globally optimal solution can be found using a greedy algorithm, then we say that the problem possesses the greedy-choice property.

Dijkstra's Algorithm for shortest distance

We perform a weighted breadth-first search from the start vertex, v, calculating a best-guess distance for each adjacent vertex.  The vertex with the smallest distance is assured to have the correct distance.  So we can improve our guesses for all the vertices adjacent to that minimum-distance vertex.

The author calls the process of improving the guess relaxation.  He suggests a spring as a metaphor for remembering the term.  Our initial guesses are too large, like a stretched spring.  The improvements make the guesses/estimates smaller, as the spring relaxes to its proper shape.

For the algorithm we let D(u) represent our estimate of the distance of u from v.  (When the algorithm is done, D will contain the correct distances.)  Initialize D to

D(v) = 0
D(u) = inf   for u != v


Note that the distance is correct for v.  We can improve D for the nodes adjacent to v by edge relaxation:


Edge Relaxation:


if D(u) + w((u, z)) < D(z) then  D(z) = D(u) + w((u, z))
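The relaxation step is small enough to state directly in code.  A minimal sketch, assuming D is a dictionary of distance estimates and w is a dictionary mapping edge pairs to weights (both hypothetical representations, not fixed by the notes):

```python
import math

def relax(D, w, u, z):
    """Edge relaxation: try to improve the estimate D[z] via the edge (u, z)."""
    if D[u] + w[(u, z)] < D[z]:
        D[z] = D[u] + w[(u, z)]
        return True    # the estimate improved
    return False

# v is the start vertex; every other estimate begins at infinity
D = {"v": 0, "a": math.inf}
w = {("v", "a"): 7}
relax(D, w, "v", "a")
print(D["a"])   # 7
```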


We then add to the cloud the vertex with the smallest distance guess.  We will want to keep a priority queue Q of the vertices not yet in the cloud.

Algorithm: ShortestPath(G, v)  // a little misleading, since the output is only the distances

input: A simple undirected weighted graph G with non-negative edge weights and a start vertex, v.

output: D(u), the distance of u from v.

Initialize D(v) = 0 and D(u) = inf for u != v
Initialize a priority queue Q of all vertices in G, using D as the key.
while Q is not empty do
    u = Q.removeMin()
    for each vertex z adjacent to u and in Q do
        if D(u) + w((u, z)) < D(z) then
            D(z) = D(u) + w((u, z))
            update the key of z in Q
return D

Note how the use of the priority queue makes the algorithm quite easy to understand and implement.
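A runnable sketch of the pseudocode, with one caveat: Python's heapq module has no operation for updating a key in place, so instead of the "update z in Q" step this version pushes a fresh (distance, vertex) entry and skips stale entries on removal.  The adjacency-map representation G[u] = {z: w((u, z)), ...} is an assumption for the example, not something the notes prescribe.

```python
import heapq
import math

def shortest_path(G, v):
    """Distances from start vertex v in a weighted graph.

    G is an adjacency map: G[u] = {z: weight of edge (u, z), ...}.
    Because heapq lacks a decrease-key operation, relaxation pushes a
    new entry instead of updating the old one; outdated entries are
    recognized and skipped when popped.
    """
    D = {u: math.inf for u in G}
    D[v] = 0
    Q = [(0, v)]
    cloud = set()                      # vertices with finalized distances
    while Q:
        d_u, u = heapq.heappop(Q)      # removeMin
        if u in cloud:
            continue                   # stale entry -- u already finalized
        cloud.add(u)
        for z, weight in G[u].items():
            if d_u + weight < D[z]:    # edge relaxation
                D[z] = d_u + weight
                heapq.heappush(Q, (D[z], z))
    return D

G = {"v": {"a": 2, "b": 5},
     "a": {"v": 2, "b": 1},
     "b": {"v": 5, "a": 1}}
print(shortest_path(G, "v"))   # {'v': 0, 'a': 2, 'b': 3}
```

The lazy-deletion trick trades a slightly larger heap for not needing the locator structure discussed below; the asymptotic bound with a heap is unchanged.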


The algorithm is greedy because at each iteration of the while loop we assume that the vertex with the minimum distance has the correct distance, and we update the remaining distances locally.


The running time of the algorithm depends on the implementations of the graph, G, and the priority queue, Q.  We assume that G is implemented by an adjacency list structure.  This allows us to update D by iterating through the vertices adjacent to u in O(degree(u)) time.

Implementing the priority queue, Q, with a heap makes removal efficient: O(lg n), where n is the number of vertices in G.  We also keep a locator with each item = (key, element): an additional reference giving the item's Position in the heap, so we can find any item in the priority queue in O(1).  When we insert an item, the locator is returned, and we can later access the item in the heap through it.  Without the locator, searching the heap for an adjacent vertex would take O(n) instead of O(1).  With it, we can update D for the vertices adjacent to u in O(degree(u) lg n) time.

We calculate the running time using a heap: the n removeMin calls cost O(n lg n) in total, and over the whole run we perform O(m) relaxations (summing degree(u) over all vertices gives 2m), each requiring an O(lg n) heap update, for a total of O((n + m) lg n).

Now suppose we use an unsorted sequence for the priority queue, Q.  Now removeMin takes O(n), but we can update a key quickly by simply changing the key value after using the locator to find its position in the sequence.

We calculate the running time using an unsorted sequence: the n removeMin calls cost O(n²) in total, and the O(m) key updates cost O(1) each, for a total of O(n² + m).
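For contrast with the heap version, here is a sketch of the unsorted-sequence variant: removeMin becomes an O(n) scan for the minimum, while a key update is a plain O(1) dictionary assignment.  The adjacency-map form G[u] = {z: w((u, z))} is again an assumption for the example.

```python
import math

def shortest_path_unsorted(G, v):
    """Dijkstra with Q kept as an unsorted collection of vertices.

    removeMin is an O(n) scan over Q; updating a key is just an O(1)
    assignment into D, since D itself serves as the key for each vertex.
    """
    D = {u: math.inf for u in G}
    D[v] = 0
    Q = set(G)                           # vertices not yet in the cloud
    while Q:
        u = min(Q, key=lambda x: D[x])   # O(n) removeMin
        Q.remove(u)
        for z, weight in G[u].items():
            if z in Q and D[u] + weight < D[z]:
                D[z] = D[u] + weight     # O(1) key update
    return D

G = {"v": {"a": 2, "b": 5},
     "a": {"v": 2, "b": 1},
     "b": {"v": 5, "a": 1}}
print(shortest_path_unsorted(G, "v"))   # {'v': 0, 'a': 2, 'b': 3}
```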

In summary:


using Heap: O((n+m) lg n)
using Unsorted Sequence: O(n² + m)


Which to use?  It depends on whether there are many edges relative to vertices.  If m > n²/lg n, then use the unsorted sequence; otherwise use the heap.

Shortest Path

How to determine the shortest path? There are two techniques:

·      We can keep an additional map, from, which records the vertex from which we reached the current vertex.  The map from is updated whenever the current vertex's distance is relaxed by the vertex just removed from the priority queue.

·      After the algorithm finishes, we can calculate backwards how we must have reached each vertex: u precedes z on a shortest path exactly when D(z) = D(u) + w((u, z)).

In both cases the path is determined in reverse direction, from the destination vertex to the start vertex.
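The first technique can be sketched in a few lines.  Since from is a reserved word in Python, the predecessor map is named came_from here; the map contents below are hypothetical, as if produced by a finished run of the algorithm.

```python
def reconstruct_path(came_from, v, target):
    """Walk the predecessor map backwards from target to the start vertex v.

    came_from[z] = u means z's distance was last relaxed via vertex u
    (this is the map called `from` in the notes).  The walk produces the
    path in reverse, so we flip it before returning.
    """
    path = [target]
    while path[-1] != v:
        path.append(came_from[path[-1]])
    return list(reversed(path))

came_from = {"a": "v", "b": "a"}        # hypothetical result of a run from v
print(reconstruct_path(came_from, "v", "b"))   # ['v', 'a', 'b']
```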