Minimizing Least Squares with Algebraic Constraints and Bounds


Question


I'm attempting to minimize a sum of least squares based on some vector summations. Briefly, I'm creating an equation that takes ideal vectors, weights them with a determined coefficient, and then sums the weighted vectors. The sum of least squares comes in once this sum is compared to the actual vector measurements found for some observation.

To give an example:

# Observation A has the following measurements:
A = [0, 4.1, 5.6, 8.9, 4.3]

# How similar is A to ideal groups identified by the following:
group1 = [1, 3, 5, 10, 3]
group2 = [6, 3, 2, 1, 10]
group3 = [3, 3, 4, 2, 1]

# Let y be the predicted measurement for A with coefficients s1, s2, and s3:
y = s1 * group1 + s2 * group2 + s3 * group3

# y will be some vector of length 5, similar to A
# Now find the sum of least squares between y and A
sum((y_i - A_i) ** 2 for y_i, A_i in zip(y, A))

Necessary bounds and constraints

0 <= s1, s2, s3 <= 1

s1 + s2 + s3 = 1

y = s1 * group1 + s2 * group2 + s3 * group3

This sum of least squares for y and A is what I'd like to minimize to get the coefficients s1, s2, s3, but I'm having difficulty identifying the proper choice in scipy.optimize. The least-squares functions there don't seem to handle algebraic constraints between the variables. The data I'm working with is thousands of observations with these vectorized measurements. Any thoughts or ideas would be greatly appreciated!
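In optimization terms, this is the constrained least-squares problem

\min_{s_1, s_2, s_3} \sum_{i=1}^{5} (y_i - A_i)^2,
\quad \text{where } y = s_1\,\mathrm{group1} + s_2\,\mathrm{group2} + s_3\,\mathrm{group3},
\quad \text{subject to } s_1 + s_2 + s_3 = 1,\; 0 \le s_1, s_2, s_3 \le 1.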


Answer 1:


For your case, you can use minimize() from scipy.optimize like this:

minimize(fun=obj_fun, args=argtpl, x0=xinit, bounds=bnds, constraints=cons)

where obj_fun(x, *args) is your objective function, argtpl a tuple of (optional) extra arguments for your objective function, xinit an initial point, bnds a list of tuples for the bounds of your variables, and cons a list of dicts for your constraints.

import numpy as np
from scipy.optimize import minimize

# Observation A has the following measurements:
A = np.array([0, 4.1, 5.6, 8.9, 4.3])
# How similar is A to ideal groups identified by the following:
group1 = np.array([1, 3, 5, 10, 3])
group2 = np.array([6, 3, 2, 1, 10])
group3 = np.array([3, 3, 4, 2, 1])

# Define the objective function
# x is the array containing your wanted coefficients
def obj_fun(x, A, g1, g2, g3):
    y = x[0] * g1 + x[1] * g2 + x[2] * g3
    return np.sum((y-A)**2)

# Bounds for the coefficients
bnds = [(0, 1), (0, 1), (0, 1)]
# Constraint: x[0] + x[1] + x[2] - 1 = 0
cons = [{"type": "eq", "fun": lambda x: x[0] + x[1] + x[2] - 1}]

# Initial guess
xinit = np.array([1, 1, 1])
res = minimize(fun=obj_fun, args=(A, group1, group2, group3), x0=xinit, bounds=bnds, constraints=cons)
print(res.x)

Solution for your example:

array([9.25609756e-01, 7.43902439e-02, 6.24242179e-12])
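Since the question mentions thousands of observations, one option is to run the same constrained minimization independently for each observation row. The sketch below assumes the observations are stacked as rows of a 2-D array (the second row here is made-up data purely for illustration):

import numpy as np
from scipy.optimize import minimize

# Ideal group vectors stacked as rows of a (3, 5) matrix
groups = np.array([[1, 3, 5, 10, 3],
                   [6, 3, 2, 1, 10],
                   [3, 3, 4, 2, 1]], dtype=float)

# Hypothetical stack of observations, one per row (only two shown here)
observations = np.array([[0.0, 4.1, 5.6, 8.9, 4.3],
                         [2.0, 3.5, 4.0, 6.0, 2.5]])

def obj_fun(x, A, G):
    # Predicted vector is the coefficient-weighted sum of the group vectors
    y = x @ G
    return np.sum((y - A) ** 2)

bnds = [(0, 1)] * 3
cons = [{"type": "eq", "fun": lambda x: np.sum(x) - 1}]
xinit = np.full(3, 1 / 3)  # feasible starting point

coeffs = np.array([minimize(fun=obj_fun, args=(A, groups), x0=xinit,
                            bounds=bnds, constraints=cons).x
                   for A in observations])
print(coeffs)  # one row of (s1, s2, s3) per observation

Each per-observation problem has only three variables, so looping over thousands of rows should still be manageable, and the loop is independent per row if you later want to parallelize it.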


Source: https://stackoverflow.com/questions/52087336/minimizing-least-squares-with-algebraic-constraints-and-bounds
