I have a set S of n points in dimension d for which I can calculate all pairwise distances if need be. I need to select k points in this set so that the sum of their pairwise distances is as large as possible, i.e. the points are as far from each other as possible.
Your problem seemed similar to the weighted minimum vertex cover problem (which is NP-complete). Thanks to @Gareth Rees for the comments below clarifying that I was incorrect in understanding a vertex cover's relationship to the set you're looking for. But you may still want to investigate the vertex cover problem and its literature, because your problem might be discussed alongside it, as they do share some features.
If you're willing to work with diameter instead of summed graph weight, you could use the approach for the minimal diameter set that you linked in your question. If your current distance measure is called d (the one for which you want the points furthest from each other), then just define d' = 1/d and solve the minimum distance problem with d'.
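For instance, a minimal sketch of that transformation, assuming all_dists is your symmetric pairwise distance matrix with zeros on the diagonal (eps is just a small constant I'm introducing to avoid dividing by zero):

import numpy as np

eps = 1e-12                          # avoid division by zero
d_prime = 1.0 / (all_dists + eps)    # large distances become small ones
np.fill_diagonal(d_prime, 0.0)       # keep self-distances at zero
# Now feed d_prime to the minimal-diameter routine you linked.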
There might also be a relationship between some form of graph cutting algorithm, like say normalized cut, and the subset you seek. If your distance measure is used as the graph weight or affinity between nodes, you might be able to modify an existing graph cutting objective function to match your objective function (looking for the group of k nodes that have maximum summed weight).
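As a rough illustration of that direction (not a drop-in solution; SpectralClustering, the Gaussian affinity, and the sigma bandwidth are my own assumptions here), you could convert distances to affinities and run an off-the-shelf spectral partitioner, though you'd still have to modify the objective to enforce a subset of exactly k nodes:

import numpy as np
from sklearn.cluster import SpectralClustering

sigma = 1.0                                        # bandwidth; needs tuning
affinity = np.exp(-all_dists**2 / (2 * sigma**2))  # distances -> affinities
labels = SpectralClustering(n_clusters=2,
                            affinity='precomputed').fit_predict(affinity)
# labels partitions the nodes, but nothing constrains a group to size k.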
This seems like a combinatorially difficult problem. You might consider something simple like simulated annealing. The proposal function could just choose a point that's currently in the k-subset at random and replace it randomly with a point not currently in the k-subset.
You would need a good cooling schedule for the temperature term and may need to use reheating as a function of cost. But this sort of thing is really simple to program. As long as n is reasonably small, you can just repeatedly select random k-subsets and anneal towards a k-subset with a very large total distance.
This would only give you an approximation, but even deterministic methods will probably only solve this approximately.
Below is a first hack at what the simulated annealing code might be. Note that I'm not making guarantees about this. It could be an inefficient solution if calculating the distance is too expensive or the problem instance grows too large. I'm using very naive geometric cooling with a fixed cooling rate, and you may also want to tinker with a fancier proposal than just randomly swapping around nodes.
import numpy as np

all_nodes = np.asarray(...)  # Set of n points (n x d array).
all_dists = np.asarray(...)  # Pairwise distances (n x n matrix).
N = len(all_nodes)
k = 10  # Or however many you want.

def calculate_distance(node_indices, distances):
    # Sum of pairwise distances among a particular subset of nodes.
    # Each pair appears twice in the symmetric matrix, hence the / 2.
    return distances[np.ix_(node_indices, node_indices)].sum() / 2.0

# Initial random subset of k elements. Track indices rather than the
# points themselves, so we can index into the distance matrix.
# (np.random.shuffle shuffles in place and returns None, so use
# np.random.permutation instead.)
shuffled = np.random.permutation(N)
current_subset = shuffled[:k]
current_outsiders = shuffled[k:]

# Simulated annealing parameters.
temp = 100.0
cooling_rate = 0.95
num_iters = 10000

# Simulated annealing loop.
for ii in range(num_iters):
    proposed_subset = current_subset.copy()
    proposed_outsiders = current_outsiders.copy()

    # Propose swapping one random insider with one random outsider.
    index_to_swap = np.random.randint(k)
    outsider_to_swap = np.random.randint(N - k)
    tmp = current_subset[index_to_swap]
    proposed_subset[index_to_swap] = current_outsiders[outsider_to_swap]
    proposed_outsiders[outsider_to_swap] = tmp

    # Metropolis acceptance: always accept an improvement; accept a
    # worsening move with probability exp(delta / temp), which shrinks
    # as the temperature cools.
    delta = (calculate_distance(proposed_subset, all_dists) -
             calculate_distance(current_subset, all_dists))
    if delta > 0 or np.exp(delta / temp) >= np.random.rand():
        current_subset = proposed_subset
        current_outsiders = proposed_outsiders

    temp = cooling_rate * temp
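To try it end to end, you could build all_dists from random points with scipy before running the loop above (the sizes here are arbitrary; pdist and squareform just assemble the full n x n distance matrix):

import numpy as np
from scipy.spatial.distance import pdist, squareform

n, d = 100, 3
all_nodes = np.random.rand(n, d)          # n random points in dimension d
all_dists = squareform(pdist(all_nodes))  # n x n pairwise distances

# ... run the annealing loop, then recover the chosen points:
best_points = all_nodes[current_subset]
print(calculate_distance(current_subset, all_dists))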