Categorical dimensions with >2 levels break sum_equals constraints
To reproduce:

```python
from ProcessOptimizer import Optimizer
from ProcessOptimizer.utils import expected_minimum
from ProcessOptimizer.space.constraints import SumEquals

dimensions = [
    (0.0, 1.0),
    (0.0, 1.0),
    (0.0, 1.0),
    ("A", "B", "C", "D"),  # categorical dimension with more than two levels (triggers the bug)
]
seed = 42

# Build optimizer
opt = Optimizer(
    dimensions=dimensions,
    lhs=False,
    acq_func="EI",
    n_initial_points=5,
    random_state=seed,
)

# Create relevant constraint
constraints = [SumEquals(dimensions=[0, 1, 2], value=1.5)]
opt.set_constraints(constraints)

x = opt.ask(5)
y = [2, 3, 1, 0, 1]
# Get initial results and feed to model
res = opt.tell(x, y)

[x_min, val] = expected_minimum(res)
```
This results in the following error:

```
*** ValueError: operands could not be broadcast together with shapes (1,4) (7,)
```
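For reference, the mismatch is easy to see on the result object itself. A quick diagnostic sketch, assuming the skopt-style `res.x_iters`, `Space.transform` and `Space.transformed_n_dims` behave here as they do in scikit-optimize:

```python
# Diagnostic only: compare the raw and transformed dimensionality
print(res.space.n_dims)              # expected: 4 (three reals + one categorical)
print(res.space.transformed_n_dims)  # expected: 7 (the 4-level categorical expands)
print(res.space.transform([res.x_iters[0]]).shape)  # expected: (1, 7)
```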
The problem has to do with how we handle categoricals with more than two levels. For reasons that are not entirely obvious at first glance, these are expanded into a greater number of transformed dimensions (one transformed dimension per level, i.e. one-hot encoding, so the 4-level categorical here becomes four transformed columns), which breaks a small part of the expected_minimum function on lines 294-310:
```python
cons = None
# Prepare a linear constraint, if applicable
if hasattr(res.constraints, "sum_equals"):
    A = np.zeros((1, res.space.n_dims))
    value = res.constraints.sum_equals[0].value
    for dim in res.constraints.sum_equals[0].dimensions:
        # Normalization rescales the ratio that the constrained dimensions
        # need to be added together, by an amount that depends on the length
        # of each dimension
        dim_length = res.space.bounds[dim][1] - res.space.bounds[dim][0]
        A[0, dim] = dim_length / value
    # The value we have to sum to has also been changed by the normalization
    # so we calculate the new value by just applying the scaled sum of the
    # corresponding factors in the first candidate point
    new_value = np.sum(A * xs[0])
    # Create the constraint object
    cons = lin_constraint(A, lb=new_value, ub=new_value)
```
Specifically, we die on the line that computes `new_value`. One would expect `xs[0]` to have the same length as a row of `A`. One would be wrong. For this example, `xs[0]` has seven entries, not four like the original space. I believe the fix is fairly simple: change the line that defines `A` to `A = np.zeros((1, res.space.transformed_n_dims))`, but we'll see if this bricks the tests.
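For illustration, a minimal sketch of the patched block, under the assumption that `xs[0]` lives in transformed coordinates and that the constrained real dimensions occupy the leading columns of the transformed space (true in this example, but worth double-checking in general):

```python
cons = None
# Prepare a linear constraint, if applicable
if hasattr(res.constraints, "sum_equals"):
    # Size A to the transformed space so it broadcasts against xs[0],
    # which has transformed_n_dims entries (7 here), not n_dims (4 here)
    A = np.zeros((1, res.space.transformed_n_dims))
    value = res.constraints.sum_equals[0].value
    for dim in res.constraints.sum_equals[0].dimensions:
        dim_length = res.space.bounds[dim][1] - res.space.bounds[dim][0]
        # NOTE: indexing by the original dimension number only stays correct
        # while no expanded categorical columns precede the constrained dims
        A[0, dim] = dim_length / value
    new_value = np.sum(A * xs[0])
    cons = lin_constraint(A, lb=new_value, ub=new_value)
```

Since every coefficient outside the three constrained columns stays zero, the constraint itself is unchanged; only the broadcasting is fixed.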