Bounds not applied for BoundedMixedIntegerLP
castrong opened this issue · 0 comments
In `encode_relu` (`NeuralVerification.jl/src/optimization/utils/constraints.jl`, lines 141 to 156 at 237f092) we have:
```julia
function encode_relu(::BoundedMixedIntegerLP, model, ẑᵢⱼ, zᵢⱼ, δᵢⱼ, l̂ᵢⱼ, ûᵢⱼ)
    if l̂ᵢⱼ >= 0.0
        @constraint(model, zᵢⱼ == ẑᵢⱼ)
        @constraint(model, δᵢⱼ == 1)
    elseif ûᵢⱼ <= 0.0
        @constraint(model, zᵢⱼ == 0.0)
        @constraint(model, δᵢⱼ == 0)
    else
        @constraints(model, begin
            zᵢⱼ >= 0.0
            zᵢⱼ >= ẑᵢⱼ
            zᵢⱼ <= ûᵢⱼ * δᵢⱼ
            zᵢⱼ <= ẑᵢⱼ - l̂ᵢⱼ * (1 - δᵢⱼ)
        end)
    end
end
```
The upper and lower bounds are never actually applied to the variable zᵢⱼ, and I think this can significantly slow down solving the MIP. I believe we should add those bounds here. Additionally, at

`NeuralVerification.jl/src/optimization/utils/constraints.jl`, lines 86 to 90 at 237f092,

should we add the bounds for the appropriate subtypes of `AbstractLinearProgram` that have bounds? Or perhaps the optimizer would infer the bounds anyway, since ẑᵢ is an affine combination of zᵢ₋₁? I'm unsure, so it may be safer to add the bounds here explicitly.
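For concreteness, here is a minimal, untested sketch of the change I have in mind for the unbounded (`else`) branch of `encode_relu`. It assumes the `set_lower_bound`/`set_upper_bound` names from JuMP ≥ 0.19; older JuMP versions spell these `setlowerbound`/`setupperbound`:

```julia
# Sketch only: same constraints as before, plus the variable bounds.
# In this branch we know l̂ᵢⱼ < 0 < ûᵢⱼ, so zᵢⱼ = relu(ẑᵢⱼ) ∈ [0, ûᵢⱼ].
@constraints(model, begin
    zᵢⱼ >= 0.0
    zᵢⱼ >= ẑᵢⱼ
    zᵢⱼ <= ûᵢⱼ * δᵢⱼ
    zᵢⱼ <= ẑᵢⱼ - l̂ᵢⱼ * (1 - δᵢⱼ)
end)
set_lower_bound(zᵢⱼ, 0.0)   # ReLU output is nonnegative
set_upper_bound(zᵢⱼ, ûᵢⱼ)   # ûᵢⱼ > 0 in this branch
```

Attaching the bounds directly to the variables should let the solver use them during presolve and branching instead of having to rediscover them from the constraints.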
Empirically I saw a 3x difference in solve time on some tests I ran, so if people are using the BoundedMixedIntegerLP encoding it could be helpful to integrate this. My workaround for now, for some linear optimization queries, has been to add the bounds after encoding the network, as in this function:
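The original workaround function was not reproduced in this issue; a hypothetical reconstruction of what such a post-encoding pass might look like is below. The names `add_bounds!`, `z_vars`, and `bounds` are assumptions (not from the source): `z_vars[i]` is taken to hold the JuMP variables for layer `i`, and `bounds[i]` a matching `Hyperrectangle` of pre-activation bounds, using the `low`/`high` accessors from LazySets:

```julia
# Hypothetical sketch, not the author's actual function.
function add_bounds!(model, z_vars, bounds)
    for (zᵢ, bᵢ) in zip(z_vars, bounds)
        lows  = low(bᵢ)   # LazySets: lower corner of the hyperrectangle
        highs = high(bᵢ)  # LazySets: upper corner of the hyperrectangle
        for (zᵢⱼ, l, u) in zip(zᵢ, lows, highs)
            # For post-ReLU variables the output lies in [max(l, 0), max(u, 0)].
            set_lower_bound(zᵢⱼ, max(l, 0.0))
            set_upper_bound(zᵢⱼ, max(u, 0.0))
        end
    end
    return model
end
```

Note that for the input layer (no ReLU applied) the raw `l`/`u` would be used instead of their clamped values.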