Multiple parameter optimization with lots of local minima

Backend · open · 5 answers · 887 views
栀梦 · 2021-02-04 11:47

I'm looking for algorithms to find a "best" set of parameter values. The function in question has a lot of local minima and changes very quickly. To make matters even worse,

5 Answers
  •  小鲜肉 (OP) · 2021-02-04 12:23

    I've tried Simulated Annealing and Particle Swarm Optimization. (As a reminder, I couldn't use gradient descent because the gradient cannot be computed).

    I've also tried an algorithm that does the following:

    • Pick a random point and a random direction
    • Evaluate the function
    • Keep moving along the random direction for as long as the result keeps improving, speeding up on every successful iteration.
    • When the result stops improving, step back and instead attempt to move in an orthogonal direction by the same distance.

    This "orthogonal direction" was generated by creating a random orthogonal matrix (adapted from this code) with the necessary number of dimensions.
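    A standard way to get such a matrix (a minimal sketch, assuming NumPy; this is not necessarily the code the answer adapted) is to take the QR decomposition of a matrix of Gaussian samples. Fixing the signs of R's diagonal makes the result uniformly distributed over the orthogonal group:

```python
import numpy as np

def random_orthogonal_matrix(n, seed=None):
    """Return a random n-by-n orthogonal matrix.

    QR decomposition of a Gaussian matrix gives an orthogonal Q;
    multiplying each column by the sign of R's corresponding diagonal
    entry makes the distribution uniform (Haar) over O(n).
    """
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    q, r = np.linalg.qr(a)
    return q * np.sign(np.diag(r))  # broadcasts over columns
```

    The columns of the returned matrix are mutually orthogonal unit vectors, so each one can serve directly as a search direction.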

    If moving in the orthogonal direction improved the result, the algorithm just continued with that direction. If none of the directions improved the result, the jump distance was halved and a new set of orthogonal directions would be attempted. Eventually the algorithm concluded it must be in a local minimum, remembered it and restarted the whole lot at a new random point.
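    The whole procedure can be sketched roughly as follows (a minimal illustration, assuming NumPy; function and parameter names, the acceleration factor, and the stopping thresholds are all illustrative, not the answer's actual code):

```python
import numpy as np

def direction_search(f, dim, bounds, n_restarts=5, init_step=1.0,
                     min_step=1e-6, accel=2.0, seed=0):
    """Accelerating direction search with random orthogonal bases.

    From a random start, move along each direction of a random
    orthonormal basis for as long as f improves, growing the stride
    by `accel` on each success.  If no direction improves, halve the
    step; once the step collapses below `min_step`, record the local
    minimum and restart from a new random point.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    best_x, best_val = None, np.inf
    for _ in range(n_restarts):
        x = rng.uniform(lo, hi, size=dim)
        val = f(x)
        step = init_step
        while step > min_step:
            # Fresh random orthonormal basis (QR of a Gaussian matrix).
            q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
            improved = False
            for col in range(dim):
                for sign in (1.0, -1.0):  # try both senses of the axis
                    d = sign * q[:, col]
                    s = step
                    nx = x + s * d
                    nval = f(nx)
                    # Keep moving while the result improves, speeding up.
                    while nval < val:
                        x, val = nx, nval
                        improved = True
                        s *= accel
                        nx = x + s * d
                        nval = f(nx)
            if not improved:
                step /= 2.0  # every direction failed: shrink the step
        if val < best_val:  # remember the best local minimum found
            best_x, best_val = x.copy(), val
    return best_x, best_val
```

    On a smooth test function such as the sphere `f(x) = sum(x**2)` this converges to the minimum quickly; on a multimodal function, the random restarts are what give it a chance of escaping local minima.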

    This approach performed considerably better than Simulated Annealing and Particle Swarm: it required fewer evaluations of the (very slow) function to achieve a result of the same quality.

    Of course my implementations of S.A. and P.S.O. could well be flawed - these are tricky algorithms with a lot of room for tweaking parameters. But I just thought I'd mention what ended up working best for me.
