I'm trying to fix some constraints for a graph coloring problem using networkx and Gurobi. For each i ∈ V, we define a set of intervals Ii. Each interval [l,u] ∈ Ii represents a possible pair of minimum color l and maximum color u for the set of edges incident to vertex i. Also, for each k ∈ K, we denote the set of intervals for vertex i ∈ V which include color k by Ii(k) = {[l,u] ∈ Ii : l ≤ k ≤ u}.
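To make that last definition concrete, here is a minimal Python sketch of computing Ii(k) from a list of intervals; the helper name intervals_including is mine, not part of the model:
# Keep the intervals of a vertex that include color k, i.e. the set Ii(k).
def intervals_including(Ii, k):
    return [(l, u) for (l, u) in Ii if l <= k <= u]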
This is all the code that I wrote:
import networkx as nx
import gurobipy as gb
from itertools import combinations, chain
import pygraphviz as pygv
import os
import matplotlib.pyplot as plt
from IPython.display import SVG, display
Creation of the graph, adding nodes and edges, and two lists:
G = nx.Graph()
G.add_nodes_from([0, 1, 2, 3, 4])
G.add_edge(0,1)
G.add_edge(0,2)
G.add_edge(0,3)
G.add_edge(0,4)
G.add_edge(1,2)
G.add_edge(1,3)
G.add_edge(2,4)
G.add_edge(3,4)
U = list(G.nodes)
K = G.number_of_edges()
Z = []
Creating a list with the colors. We assume the color set is K = {0, 1, …, K − 1} with K ≤ |E|:
def nrofCol():
    Z.clear()
    z = 0
    while z < K:   # enumerate colors 0 .. K-1
        Z.append(z)
        z = z + 1
    return Z
Z = nrofCol()
Z
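For reference, the whole function is equivalent to a one-liner, assuming the intended color set {0, …, K − 1}:
Z = list(range(K))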
Here I'm defining the values for the intervals (l, u), creating two lists with all the colors:
u = []
l = []
for z in Z:
    u.append(Z[z])
    l.append(Z[z])
I will be a list of tuples:
I = []
for z in Z:
    for u in range(z, len(Z)):
        I.append((z, u))
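For example, with Z = [0, 1, 2] this yields I = [(0, 0), (0, 1), (0, 2), (1, 1), (1, 2), (2, 2)], i.e. every pair with l ≤ u.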
Adding a color attribute to each edge:
for colored_arc in ((u, v, z) for u, v in G.edges() for z in Z):
    G[colored_arc[0]][colored_arc[1]][colored_arc[2]] = colored_arc
And I added variables to the model using Gurobi:
mdic = gb.Model()   # the Gurobi model
indices = []
for u, v in G.edges():
    for z in Z:
        indices.append((u, v, z))
x = mdic.addVars(indices, vtype=gb.GRB.BINARY)
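As a sanity check on the triple indexing: x is a gurobipy tupledict, so wildcard patterns can be summed later in the constraints. A tiny standalone sketch with toy indices, assuming gurobipy is available:
toy = gb.Model()
w = toy.addVars([(0, 1, 0), (0, 1, 1), (0, 2, 0)], vtype=gb.GRB.BINARY)
toy.update()
expr = w.sum(0, '*', 0)   # sums w[0,1,0] and w[0,2,0]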
indices2 = []
for u in G.nodes():
    for i in range(0, len(I)):
        indices2.append((u, I[i]))
for u in U:
    y = mdic.addVars(indices2, vtype=gb.GRB.BINARY)
mdic.update()
Constraints and objective function:
Constraint 2a ensures that each edge takes exactly one color:
for u, v in G.edges():
    mdic.addConstr(x.sum(u, v, '*') == 1, name='one_color')
mdic.update()
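Written out, constraint 2a is ∑_{z ∈ Z} x_{u,v,z} = 1 for every edge (u,v) ∈ E, i.e. each edge gets exactly one color.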
Constraint 2b chooses exactly one interval (from the set of eligible intervals) for each vertex:
for u in U:
    mdic.addConstr(y.sum(u, '*') == 1, name='used')
mdic.update()
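In formula form: ∑_{[l,u] ∈ Ii} y_{i,[l,u]} = 1 for every vertex i ∈ V, so exactly one interval is active per vertex.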
Constraint 2c guarantees not only that adjacent edges take different colors, but also that edges incident to a vertex take colors from the interval chosen for that vertex:
for u in U:
    for z in Z:
        mdic.addConstr((x.sum(u, '*', z) + x.sum('*', u, z)) <= y.sum(u, I), name='different_colors')
mdic.update()
Objective function:
expr = 0
for u in U:
    for i in range(0, len(I)):
        a = I[i][1]   # upper color u of the interval
        b = I[i][0]   # lower color l of the interval
        expr += (a - b - G.degree[u] + 1) * y[u, I[i]]
mdic.setObjective(expr, gb.GRB.MINIMIZE)
mdic.update()
mdic.optimize()
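The intended objective is therefore min ∑_{i ∈ V} ∑_{[l,u] ∈ Ii} (u − l − deg(i) + 1) · y_{i,[l,u]}: each chosen interval is charged for the colors it spans beyond the deg(i) colors its incident edges actually need.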
Using this code, the model is infeasible. I know that variable x and constraint 2a are defined the right way. I'm not sure about variable y, the other constraints and the objective function. How can I change this?
The interval list I[i] should be defined for each node separately:
I = []
for i in G.nodes():
    Ii = []
    for z in Z:
        for u in range(z, len(Z)):
            if u - z >= G.degree(i) - 1:
                Ii.append((z, u))
    I.append(Ii)
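For example, vertex 0 of the graph above has degree 4, so only intervals spanning at least four colors, such as (0, 3), survive the filter u - z >= G.degree(i) - 1.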
The usage of I must then be changed to something like this:
indices2 = []
for i in G.nodes():
    for interval in I[i]:
        indices2.append((i, interval))
Variables y should now only be created once:
y = mdic.addVars(indices2, vtype = gb.GRB.BINARY)
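(Calling addVars inside a for u in U loop, as in the question, adds a fresh copy of every y variable to the model on each iteration; only the last copy stays bound to y, while the earlier copies linger in the model unused.)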
Constraint 2c must also be changed, because in your screenshot the right-hand sum iterates only over a subset of I:
for u in U:
    for z in Z:
        y_sum = 0
        for z1, z2 in I[u]:
            if z1 <= z <= z2:
                y_sum += y[u, (z1, z2)]
        mdic.addConstr((x.sum(u, '*', z) + x.sum('*', u, z)) <= y_sum, name='different_colors')
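This implements the right-hand side ∑_{[l,u] ∈ Ii(k)} y_{i,[l,u]} from the definition of Ii(k) in the question: an edge incident to vertex u may take color z only if the interval chosen for u contains z.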
Source: https://stackoverflow.com/questions/53263559/graph-coloring-with-intervals-gurobi-constraints