How to perform non-linear optimization with scipy/numpy or sympy?

情书的邮戳 2021-02-06 07:23

I am trying to find the optimal solution to the following system of equations in Python:

(x-x1)^2 + (y-y1)^2 - r1^2 = 0
(x-x2)^2 + (y-y2)^2 - r2^2 = 0
(x-x3)^2 + (y-y3)^2 - r3^2 = 0


        
4 Answers
既然无缘 2021-02-06 07:36

    I noticed that the code in the accepted answer no longer works... I suspect scipy.optimize has changed its interface since that answer was posted, but I could be wrong. Regardless, I second the suggestion to use the algorithms in scipy.optimize, and the accepted answer does demonstrate how (or did at one time, if the interface has changed).
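
    For reference, a minimal sketch of the same least-squares idea against the current scipy.optimize interface might look like this (the starting point x0=[1, 1] and the Nelder-Mead method are my own arbitrary choices, not taken from the accepted answer):

    from scipy.optimize import minimize
    
    a = (0, 2, 0)      # x-coordinates of the circle centers
    b = (0, 0, 2)      # y-coordinates of the circle centers
    c = (.88, 1, .75)  # circle radii
    
    def objective(p):
        x, y = p
        # sum of squared residuals of the three circle equations
        return sum(((x - ai)**2 + (y - bi)**2 - ci**2)**2
                   for ai, bi, ci in zip(a, b, c))
    
    res = minimize(objective, x0=[1.0, 1.0], method='Nelder-Mead')
    print(res.x, res.fun)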

    I'm adding an additional answer here, purely to suggest an alternative package that uses the scipy.optimize algorithms at the core, but is much more robust for constrained optimization. The package is mystic. One of the big improvements is that mystic gives constrained global optimization.

    First, here's your example, done very similarly to the scipy.optimize.minimize way, but using a global optimizer.

    from mystic import reduced
    
    @reduced(lambda x,y: abs(x)+abs(y)) # the choice of reduction changes the answer
    def objective(x, a, b, c):
        x,y = x
        eqns = (
            (x - a[0])**2 + (y - b[0])**2 - c[0]**2,
            (x - a[1])**2 + (y - b[1])**2 - c[1]**2,
            (x - a[2])**2 + (y - b[2])**2 - c[2]**2)
        return eqns
    
    bounds = [(None,None),(None,None)] # unnecessary; (None,None) means unbounded
    
    a = (0,2,0)
    b = (0,0,2)
    c = (.88,1,.75)
    args = a,b,c
    
    from mystic.solvers import diffev2
    from mystic.monitors import VerboseMonitor
    mon = VerboseMonitor(10)
    
    result = diffev2(objective, args=args, x0=bounds, bounds=bounds, npop=40,
                     ftol=1e-8, disp=False, full_output=True, itermon=mon)
    
    print(result[0])
    print(result[1])
    
    

    With results looking like this:

    Generation 0 has Chi-Squared: 38868.949133
    Generation 10 has Chi-Squared: 2777.470642
    Generation 20 has Chi-Squared: 12.808055
    Generation 30 has Chi-Squared: 3.764840
    Generation 40 has Chi-Squared: 2.996441
    Generation 50 has Chi-Squared: 2.996441
    Generation 60 has Chi-Squared: 2.996440
    Generation 70 has Chi-Squared: 2.996433
    Generation 80 has Chi-Squared: 2.996433
    Generation 90 has Chi-Squared: 2.996433
    STOP("VTRChangeOverGeneration with {'gtol': 1e-06, 'target': 0.0, 'generations': 30, 'ftol': 1e-08}")
    [ 0.66667151  0.66666422]
    2.99643333334
    

    As noted in the code comment, the choice of the lambda passed to reduced affects which point the optimizer finds, since there is no exact solution to the equations.
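
    For instance (a variation of my own, not from the original answer), reducing with the maximum absolute residual instead of the sum gives a minimax merit function, which will generally settle on a different point:

    from mystic import reduced
    from mystic.solvers import diffev2
    
    @reduced(lambda x,y: max(abs(x), abs(y))) # minimize the worst-case residual
    def objective(x, a, b, c):
        x,y = x
        return ((x - a[0])**2 + (y - b[0])**2 - c[0]**2,
                (x - a[1])**2 + (y - b[1])**2 - c[1]**2,
                (x - a[2])**2 + (y - b[2])**2 - c[2]**2)
    
    bounds = [(None,None),(None,None)]
    args = (0,2,0), (0,0,2), (.88,1,.75)
    
    result = diffev2(objective, args=args, x0=bounds, bounds=bounds, npop=40,
                     ftol=1e-8, disp=False, full_output=True)
    print(result[0], result[1])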

    mystic also provides the ability to convert symbolic equations to a function, where the resulting function can be used as an objective, or as a penalty function. Here is the same problem, but using the equations as a penalty instead of the objective.

    def objective(x):
        return 0.0
    
    equations = """
    (x0 - 0)**2 + (x1 - 0)**2 - .88**2 == 0
    (x0 - 2)**2 + (x1 - 0)**2 - 1**2 == 0
    (x0 - 0)**2 + (x1 - 2)**2 - .75**2 == 0
    """
    
    bounds = [(None,None),(None,None)] # unnecessary; unbounded
    
    from mystic.symbolic import generate_penalty, generate_conditions
    from mystic.solvers import diffev2
    
    # build a penalty function from the symbolic equations; k scales the penalty
    pf = generate_penalty(generate_conditions(equations), k=1e12)
    
    result = diffev2(objective, x0=bounds, bounds=bounds, penalty=pf,
                     npop=40, gtol=50, disp=False, full_output=True)
    
    print(result[0])
    print(result[1])
    

    With results:

    [ 0.77958328  0.8580965 ]
    3.6473132399e+12
    

    The results differ from before because the penalty applied here is not the same merit function we built earlier with reduced. In mystic, you can select which penalty you want to apply.
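
    As a quick illustration (a sketch of my own; only the k multiplier is taken from the code above), lowering k softens the penalty, so the solver weighs constraint violations less heavily:

    from mystic.symbolic import generate_penalty, generate_conditions
    from mystic.solvers import diffev2
    
    def objective(x):
        return 0.0
    
    equations = """
    (x0 - 0)**2 + (x1 - 0)**2 - .88**2 == 0
    (x0 - 2)**2 + (x1 - 0)**2 - 1**2 == 0
    (x0 - 0)**2 + (x1 - 2)**2 - .75**2 == 0
    """
    
    bounds = [(None,None),(None,None)]
    
    pf = generate_penalty(generate_conditions(equations), k=1e4) # much softer than k=1e12
    result = diffev2(objective, x0=bounds, bounds=bounds, penalty=pf,
                     npop=40, gtol=50, disp=False, full_output=True)
    print(result[0], result[1])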

    The point was made that the equations have no solution. You can see from the result above that the solution is heavily penalized, which is a good indication that there is no solution. However, mystic gives you another way to see that there is no solution. Instead of applying a more traditional penalty, which penalizes the solution where the constraints are violated... mystic provides a constraint, which is essentially a kernel transformation that removes all potential solutions that don't meet the constraints.

    def objective(x):
        return 0.0
    
    equations = """
    (x0 - 0)**2 + (x1 - 0)**2 - .88**2 == 0
    (x0 - 2)**2 + (x1 - 0)**2 - 1**2 == 0
    (x0 - 0)**2 + (x1 - 2)**2 - .75**2 == 0
    """
    
    bounds = [(None,None),(None,None)] # unnecessary; unbounded
    
    from mystic.symbolic import generate_constraint, generate_solvers, simplify
    from mystic.solvers import diffev2
    
    # build a constraints function (a kernel transformation) from the equations
    cf = generate_constraint(generate_solvers(simplify(equations)))
    
    result = diffev2(objective, x0=bounds, bounds=bounds, constraints=cf,
                     npop=40, gtol=50, disp=False, full_output=True)
    
    print(result[0])
    print(result[1])
    

    With results:

    [          nan  657.17740835]
    0.0
    

    Here the nan essentially indicates that there is no valid solution.
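
    A quick sanity check, independent of mystic (the helper name residuals is my own, for illustration), is to plug a candidate point back into the three circle equations; no point can drive all three residuals to zero at once:

    # residuals of the three circle equations at a candidate point (x, y)
    def residuals(x, y, a=(0, 2, 0), b=(0, 0, 2), c=(.88, 1, .75)):
        return [(x - ai)**2 + (y - bi)**2 - ci**2 for ai, bi, ci in zip(a, b, c)]
    
    # point found by the reduced-objective run above
    print(residuals(0.66667151, 0.66666422))
    # all three residuals are nonzero, consistent with an infeasible system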

    FYI, I'm the author, so I have some bias. However, mystic has been around almost as long as scipy.optimize, is mature, and has had a more stable interface over that time. In short, if you need a much more flexible and powerful constrained nonlinear optimizer, I suggest mystic.
