CQM squares objective function expansion bug
BehrouzSohrabi opened this issue · 2 comments
When I was trying to write an objective function where the sum of squares needed to be minimized, I realized the energy was close to the square of sums instead. Then I looked at cqm.objective and noticed that the coefficients of the linear variables are added to their quadratic coefficients. It actually works like this: 5x^2 - 3x ends up as 2x.
Here is a simple example that reproduces the problem. Let's say we have this objective function:
(2x_1 + 3x_2 + 4x_3 - 5)^2 + (5x_1 + 6x_2 + 7x_3 - 10)^2
The expanded form will be:
29x_1^2 + 45x_2^2 + 65x_3^2 + 72x_1x_2 + 86x_1x_3 + 108x_2x_3 - 120x_1 - 150x_2 - 180x_3 + 125
import numpy as np
from dimod import ConstrainedQuadraticModel, Binary, quicksum
# variables
x = np.array([Binary(f"x_{n}") for n in [1,2,3]])
# objective function defined as sum of squares of the residuals between LHS and RHS
RHS = [5, 10]
LHS = [(2*x[0] + 3*x[1] + 4*x[2]), (5*x[0] + 6*x[1] + 7*x[2])]
Residuals = np.array(LHS) - np.array(RHS)
cqm = ConstrainedQuadraticModel()
cqm.set_objective(quicksum(Residuals**2))
print(cqm.objective)
# Expected:
# {'x_1': -120, 'x_2': -150, 'x_3': -180},
# {('x_1', 'x_1'): 29, ('x_2', 'x_2'): 45, ('x_3', 'x_3'): 65}
# {('x_2', 'x_1'): 72, ('x_3', 'x_1'): 86, ('x_3', 'x_2'): 108}, 125,
# {'x_1': 'BINARY', 'x_2': 'BINARY', 'x_3': 'BINARY'}
# Output:
# {'x_1': -91, 'x_2': -105, 'x_3': -115},
# {('x_2', 'x_1'): 72, ('x_3', 'x_1'): 86, ('x_3', 'x_2'): 108}, 125,
# {'x_1': 'BINARY', 'x_2': 'BINARY', 'x_3': 'BINARY'}
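A quick brute-force check (pure Python, no dimod needed; coefficients copied from the outputs above) suggests the two forms agree on every binary assignment, i.e. the "merged" linear coefficients in the actual output give the same energies as the original sum of squared residuals:

```python
from itertools import product

def sum_of_squares(a, b, c):
    # The original objective: sum of squared residuals between LHS and RHS.
    return (2*a + 3*b + 4*c - 5)**2 + (5*a + 6*b + 7*c - 10)**2

def merged_polynomial(a, b, c):
    # The polynomial cqm.objective actually prints: the x_i^2 diagonal
    # terms have been folded into the linear coefficients.
    return -91*a - 105*b - 115*c + 72*a*b + 86*a*c + 108*b*c + 125

# Check all 2^3 binary assignments.
for a, b, c in product((0, 1), repeat=3):
    assert sum_of_squares(a, b, c) == merged_polynomial(a, b, c)
print("all 8 assignments agree")
```

So the stored coefficients differ from the textbook expansion, but the energies do not.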
Hi @BehrouzSohrabi,
if in the expanded form above you use x_i^2 = x_i (because a binary variable squared is equal to itself), you'll notice that the coefficients obtained are correct! For example, 'x_1': -120 and ('x_1', 'x_1'): 29: the sum of the two coefficients is -91, which is exactly what you obtained.
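A minimal arithmetic sketch of the substitution described in this reply, using the coefficients from the expected and observed outputs above:

```python
# Textbook expansion: separate linear and diagonal (x_i^2) coefficients.
linear = {"x_1": -120, "x_2": -150, "x_3": -180}
diagonal = {"x_1": 29, "x_2": 45, "x_3": 65}

# Substituting x_i^2 = x_i folds each diagonal coefficient into the
# corresponding linear one, reproducing what cqm.objective prints.
merged = {v: linear[v] + diagonal[v] for v in linear}
print(merged)  # {'x_1': -91, 'x_2': -105, 'x_3': -115}
```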
Thanks for your comment @alexzucca90.
You're right, that shouldn't be the cause of the problem. I'll keep looking.