ARM-software/mango

Issue with Constrained Optimization

mhdrake opened this issue · 4 comments

Hello,

We are working on a project that involves constrained optimization, and I ran into the following error when using a constraint. Here is a minimal example:

from mango import scheduler, Tuner
from scipy.stats import uniform
import numpy as np

np.random.seed(42)

# Search space
param_space = dict(x=uniform(-10, 20))

x_min = -2
x_max = 2

# Quadratic objective function
@scheduler.serial
def objective(x):
    return x * x

def constraint(samples):
    '''
    Given a list of samples (each sample is a dict with parameter names as keys),
    returns a list of True/False elements indicating whether the corresponding
    sample satisfies the constraints or not.
    '''
    x = np.array([s['x'] for s in samples])
    return (x <= x_max) & (x >= x_min)

conf_dict = dict(
    constraint=constraint,
    domain_size=1000,
    initial_random=1,
    num_iteration=15
)

# Initialize and run Tuner
tuner = Tuner(param_space, objective, conf_dict)

for i in range(100):
    result = tuner.minimize()
    print(f'Optimal value of parameters: {result["best_params"]}.')


The optimizer runs successfully for a number of iterations but then exits with the following error:

----------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-19-739aa7b673c8> in <cell line: 39>()
     39 for i in range(100):
     40 
---> 41     result = tuner.minimize()
     42     print(f'Optimal value of parameters: {result["best_params"]}.')
     43 

2 frames
/usr/local/lib/python3.10/dist-packages/mango/tuner.py in minimize(self)
    158     def minimize(self):
    159         self.maximize_objective = False
--> 160         return self.run()
    161 
    162     def process_initial_custom(self):

/usr/local/lib/python3.10/dist-packages/mango/tuner.py in run(self)
    145     def run(self):
    146         if self.config.is_bayesian:
--> 147             self.results = self.runBayesianOptimizer()
    148         elif self.config.is_random:
    149             self.results = self.runRandomOptimizer()

/usr/local/lib/python3.10/dist-packages/mango/tuner.py in runBayesianOptimizer(self)
    310 
    311             results["best_objective"] = np.max(results["objective_values"])
--> 312             results["best_params"] = results["params_tried"][
    313                 np.argmax(results["objective_values"])
    314             ]

IndexError: index 8 is out of bounds for axis 0 with size 8
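For reference, here is a standalone snippet (no mango involved) that raises the same kind of IndexError when the two result arrays get out of sync; whether such a mismatch between params_tried and objective_values is exactly what happens internally is only my guess:

import numpy as np

# Hypothetical illustration: objective_values ends up one entry longer than
# params_tried, so the argmax index falls outside params_tried.
params_tried = np.arange(8)        # 8 parameter sets recorded
objective_values = np.arange(9)    # 9 objective values recorded

best_idx = np.argmax(objective_values)   # -> 8
print(params_tried[best_idx])            # IndexError: index 8 is out of bounds for axis 0 with size 8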

Thank you for your help; we appreciate any suggestions you have for resolving this issue.

Hi @mhdrake,

I have pushed a fix for this issue (version 1.4.2), and the example you provided runs through without errors on my end. Please check on your end and let me know if it fixes the issue.
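If you want to confirm that you are picking up the new release, something along these lines should work (assuming the package is installed from PyPI under the name arm-mango):

# Check the installed mango version; the fix above is in 1.4.2.
# Assumes the distribution is published on PyPI as "arm-mango".
from importlib.metadata import version, PackageNotFoundError

try:
    print("arm-mango version:", version("arm-mango"))
except PackageNotFoundError:
    print("arm-mango is not installed in this environment")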

Thank you, @tihom, for your prompt response. That does fix the error on our end.

However, we noticed that constrained optimization now runs much more slowly (even when the constraint does not restrict the domain space). Do you know why this is? I have attached an example below:

from mango import scheduler, Tuner
from scipy.stats import uniform
import numpy as np
import time
from IPython.utils.capture import capture_output

np.random.seed(42)

# Search space
param_space = dict(x=uniform(-10, 20))

# Quadratic objective function
@scheduler.serial
def objective(x):
    return x * x

options = [
    {"x_min": -10, "x_max": 10, "use_constraint": True},
    {"x_min": -2, "x_max": 2, "use_constraint": True},
    {"use_constraint": False}
]

for option in options:
    x_min = option.get("x_min")
    x_max = option.get("x_max")
    use_constraint = option["use_constraint"]

    conf_dict = dict(
        domain_size=1000,
        initial_random=1,
        num_iteration=10
    )

    if use_constraint:
        def constraint(samples, xmin=x_min, xmax=x_max):
            '''
            Given a list of samples (each sample is a dict with parameter names as keys),
            returns a list of True/False elements indicating whether the corresponding
            sample satisfies the constraints or not.
            '''
            x = np.array([s['x'] for s in samples])
            return (x <= xmax) & (x >= xmin)

        conf_dict["constraint"] = constraint

    # Initialize Tuner
    tuner = Tuner(param_space, objective, conf_dict)

    total_time = 0
    N = 50
    for i in range(N):
        t0 = time.time()
        with capture_output():
            result = tuner.minimize()
        t1 = time.time()
        total_time += (t1 - t0)

    avg_time = total_time / N
    print(f"Option with x_min = {x_min}, x_max = {x_max}, use_constraint = {use_constraint}: Average time per iteration = {avg_time}")

Example output is shown below.

Option with x_min = -10, x_max = 10, use_constraint = True: Average time per iteration = 2.431731390953064
Option with x_min = -2, x_max = 2, use_constraint = True: Average time per iteration = 2.6352290105819702
Option with x_min = None, x_max = None, use_constraint = False: Average time per iteration = 0.9084393453598022
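For what it is worth, my guess (purely an assumption about the internals, I have not read the new sampling code) is that constrained sampling behaves like rejection sampling: draw a batch from the parameter distributions, keep only the points the constraint accepts, and redraw until enough points are collected. Even a constraint that accepts everything then pays for the extra dict construction and filtering on every draw. A generic sketch of that pattern:

import numpy as np
from scipy.stats import uniform

def sample_with_constraint(dist, constraint, n_needed, batch_size=1000, seed=0):
    '''Generic rejection-sampling sketch; NOT mango's actual implementation.'''
    rng = np.random.default_rng(seed)
    accepted = []
    while len(accepted) < n_needed:
        batch = dist.rvs(size=batch_size, random_state=rng)
        samples = [{'x': float(v)} for v in batch]          # same dict format as above
        mask = np.asarray(constraint(samples), dtype=bool)  # filter with the user constraint
        accepted.extend(batch[mask])
    return np.array(accepted[:n_needed])

# Example: the overhead exists even for a constraint that rejects nothing.
always_true = lambda samples: [True] * len(samples)
points = sample_with_constraint(uniform(-10, 20), always_true, n_needed=1000)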

@mhdrake the change did slow down sampling; thanks for catching this and providing a reproducible snippet. I have pushed an update that makes the sampling more adaptive. The run times for the constrained cases should now be only slightly slower:

Option with x_min = -10, x_max = 10, use_constraint = True: Average time per iteration = 0.40442839622497556
Option with x_min = -2, x_max = 2, use_constraint = True: Average time per iteration = 0.4306916332244873
Option with x_min = None, x_max = None, use_constraint = False: Average time per iteration = 0.39583759784698486
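To give a rough idea of what "more adaptive" means here (a simplified illustration only, not the exact code in the update): rather than drawing a fixed-size batch on every attempt, the batch size can be scaled to the acceptance rate observed so far, so a constraint that rejects little costs little extra:

import math

def next_batch_size(n_needed, n_accepted, n_drawn, min_batch=100, max_batch=100_000):
    # Acceptance rate observed so far; assume everything passes before the first draw.
    rate = n_accepted / n_drawn if n_drawn else 1.0
    remaining = n_needed - n_accepted
    # Draw roughly enough to cover the rest at the current rate, within sane bounds.
    return max(min_batch, min(max_batch, math.ceil(remaining / max(rate, 1e-3))))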

Thank you, @mohit-bn! The speed looks back to normal. I appreciate your help in resolving this so quickly.