For the most part, your code seems to work well. The main reason it converges slowly is that you only ever look at the two immediate neighbors of the current point (one on either side). If you expand the search to include any point in A, or even just a wider neighborhood around the current point, you'll be able to move around the search space much more quickly.
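For illustration, here is one way a widened neighborhood function could look; the radius of 100 below is an arbitrary placeholder, not a tuned value:

def get_neighbors_window(i, L, radius=100):
    # all indices within `radius` of i, clipped to the array bounds, excluding i itself
    lo = max(0, i - radius)
    hi = min(L - 1, i + radius)
    return [j for j in range(lo, hi + 1) if j != i]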
Another trick with simulated annealing is deciding how to adjust the temperature. You started with a very high temperature, at which the optimizer essentially always moves to the neighbor, regardless of the difference in objective value between the two points. That kind of purely random movement doesn't get you to a better point on average. The trick is to pick a starting temperature low enough that the optimizer moves to better points significantly more often than it moves to worse ones, but still high enough to let it explore the search space. As I mentioned in my first point, if the neighborhood you select points from is too limited, you'll never be able to explore the search space properly, even with a good temperature schedule.
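As a minimal sketch of an alternative to the linear decrement used below, you could try a geometric (multiplicative) schedule; the 0.999 factor here is just a guess and would need tuning for your problem:

def update_temperature_geometric(T, k, alpha=0.999):
    # cool multiplicatively instead of subtracting a fixed amount each step
    return alpha * T

Starting from T = 1.0 and stopping at 1e-3, that gives roughly 6,900 iterations instead of about 1,000 with the linear schedule, so the optimizer spends more time at low temperatures.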
Your original code was somewhat hard to read, both because you used a lot of conventions that Python programmers try to avoid (e.g., semicolons at the ends of lines), and because you did a few things that programmers in general try to avoid (e.g., using a lowercase L as a variable name, which looks very similar to the numeral 1). I rewrote your code to make it both more readable and more Pythonic (with the help of autopep8). Check out the PEP 8 style guide for more information.
In make_move, my rewrite picks one random point from across the whole search space. If you're interested in seeing how well it works, you can try rewriting it to look in an expanded local neighborhood of the current point instead (something between what you had done above and what I've done here); a sketch of such a variant appears after the listing.
import random
import math

LIMIT = 100000


def update_temperature(T, k):
    # linear cooling; k is unused here but kept for schedules that need it
    return T - 0.001


def get_neighbors(i, L):
    assert L > 1 and 0 <= i < L
    if i == 0:
        return [1]
    elif i == L - 1:
        return [L - 2]
    else:
        return [i - 1, i + 1]


def make_move(x, A, T):
    # nhbs = get_neighbors(x, len(A))
    # nhb = nhbs[random.choice(range(0, len(nhbs)))]
    nhb = random.choice(range(0, len(A)))  # choose from all points
    delta = A[nhb] - A[x]
    if delta < 0:
        return nhb
    else:
        p = math.exp(-delta / T)
        return nhb if random.random() < p else x


def simulated_annealing(A):
    L = len(A)
    x0 = random.choice(range(0, L))
    T = 1.
    k = 1

    x = x0
    x_best = x0

    while T > 1e-3:
        x = make_move(x, A, T)
        if A[x] < A[x_best]:
            x_best = x
        T = update_temperature(T, k)
        k += 1

    print("iterations:", k)
    return x, x_best, x0


def isminima_local(p, A):
    return all(A[p] < A[i] for i in get_neighbors(p, len(A)))


def func(x):
    return math.sin((2 * math.pi / LIMIT) * x) + 0.001 * random.random()


def initialize(L):
    return [func(i) for i in range(0, L)]


def main():
    A = initialize(LIMIT)

    local_minima = []
    for i in range(0, LIMIT):
        if isminima_local(i, A):
            local_minima.append([i, A[i]])

    x = 0
    y = A[x]
    for xi, yi in enumerate(A):
        if yi < y:
            x = xi
            y = yi
    global_minimum = x

    print("number of local minima: %d" % len(local_minima))
    print("global minimum @%d = %0.3f" % (global_minimum, A[global_minimum]))

    x, x_best, x0 = simulated_annealing(A)
    print("Solution is @%d = %0.3f" % (x, A[x]))
    print("Best solution is @%d = %0.3f" % (x_best, A[x_best]))
    print("Start solution is @%d = %0.3f" % (x0, A[x0]))


main()
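And here is a rough sketch of the expanded-local-neighborhood variant of make_move I mentioned above. It reuses the random and math imports from the listing, and the window of 1000 is an arbitrary placeholder you'd want to tune:

def make_move_local(x, A, T, window=1000):
    # pick a random point within `window` indices of x instead of anywhere in A
    lo = max(0, x - window)
    hi = min(len(A) - 1, x + window)
    nhb = random.randint(lo, hi)
    delta = A[nhb] - A[x]
    if delta < 0:
        return nhb
    return nhb if random.random() < math.exp(-delta / T) else x

Swapping this in for make_move inside simulated_annealing lets you compare how the window size trades off exploration speed against the chance of getting stuck near a local minimum.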