Out-of-design trials may cause repeated trials
ailitw opened this issue · 6 comments
I have a setting where parameter constraints combined with the `digits` rounding make the feasible search space quite narrow, and as a result some trials from ax_client.get_next_trial() end up being out-of-design. Once that happens, every subsequent trial is the exact same point and I'm unable to get new trials.
I suspect this has something to do with the filtering of out-of-design trials: once those points are filtered out, the acquisition function optimization completes exactly as in the step that produced the out-of-design trial. I have tried marking the trials as abandoned with ax_client.abandon_trial(trial_index=trial_index), but it seems abandoned trials are also filtered, so the out-of-design points are still excluded.
Below is a minimal example to reproduce the issue:
import numpy as np
from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.utils.measurement.synthetic_functions import hartmann6


def evaluate(parameters):
    x = np.array([parameters.get(f"x{i+1}") for i in range(6)])
    return {
        "hartmann6": (hartmann6(x), 0.0),
    }


ax_client = AxClient()
parameters = [
    {
        "name": f"x{i+1}",
        "type": "range",
        "bounds": [0.0, 1.0],
        "value_type": "float",
        "digits": 1,
    }
    for i in range(6)
]
experiment_kwargs = {
    "name": "hartmann_test_experiment",
    "parameters": parameters,
    "objectives": {"hartmann6": ObjectiveProperties(minimize=True)},
    "parameter_constraints": [
        "x1 + x2 + x3 + x4 + x5 + x6 <= 1.0",
        "x1 + x2 + x3 + x4 + x5 + x6 >= 0.999",
    ],
}
ax_client.create_experiment(**experiment_kwargs)

for i in range(20):
    new_point, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(trial_index=trial_index, raw_data=evaluate(new_point))
[INFO 04-04 14:10:01] ax.service.ax_client: Starting optimization with verbose logging. To disable logging, set the `verbose_logging` argument to `False`. Note that float values in the logs are rounded to 6 decimal points.
[INFO 04-04 14:10:01] ax.service.utils.instantiation: Created search space: SearchSpace(parameters=[RangeParameter(name='x1', parameter_type=FLOAT, range=[0.0, 1.0], digits=1), RangeParameter(name='x2', parameter_type=FLOAT, range=[0.0, 1.0], digits=1), RangeParameter(name='x3', parameter_type=FLOAT, range=[0.0, 1.0], digits=1), RangeParameter(name='x4', parameter_type=FLOAT, range=[0.0, 1.0], digits=1), RangeParameter(name='x5', parameter_type=FLOAT, range=[0.0, 1.0], digits=1), RangeParameter(name='x6', parameter_type=FLOAT, range=[0.0, 1.0], digits=1)], parameter_constraints=[ParameterConstraint(1.0*x1 + 1.0*x2 + 1.0*x3 + 1.0*x4 + 1.0*x5 + 1.0*x6 <= 1.0), ParameterConstraint(-1.0*x1 + -1.0*x2 + -1.0*x3 + -1.0*x4 + -1.0*x5 + -1.0*x6 <= -0.999)]).
[INFO 04-04 14:10:01] ax.modelbridge.dispatch_utils: Using Models.GPEI since there are more ordered parameters than there are categories for the unordered categorical parameters.
[INFO 04-04 14:10:01] ax.modelbridge.dispatch_utils: Calculating the number of remaining initialization trials based on num_initialization_trials=None max_initialization_trials=None num_tunable_parameters=6 num_trials=None use_batch_trials=False
[INFO 04-04 14:10:01] ax.modelbridge.dispatch_utils: calculated num_initialization_trials=12
[INFO 04-04 14:10:01] ax.modelbridge.dispatch_utils: num_completed_initialization_trials=0 num_remaining_initialization_trials=12
[INFO 04-04 14:10:01] ax.modelbridge.dispatch_utils: Using Bayesian Optimization generation strategy: GenerationStrategy(name='Sobol+GPEI', steps=[Sobol for 12 trials, GPEI for subsequent trials]). Iterations after 12 will take longer to generate due to model-fitting.
[INFO 04-04 14:10:02] ax.service.ax_client: Generated new trial 0 with parameters {'x1': 0.2, 'x2': 0.1, 'x3': 0.3, 'x4': 0.1, 'x5': 0.2, 'x6': 0.1}.
[INFO 04-04 14:10:02] ax.service.ax_client: Completed trial 0 with data: {'hartmann6': (-0.157493, 0.0)}.
[INFO 04-04 14:10:02] ax.service.ax_client: Generated new trial 1 with parameters {'x1': 0.0, 'x2': 0.2, 'x3': 0.4, 'x4': 0.2, 'x5': 0.1, 'x6': 0.1}.
[INFO 04-04 14:10:02] ax.service.ax_client: Completed trial 1 with data: {'hartmann6': (-0.118905, 0.0)}.
[INFO 04-04 14:10:02] ax.service.ax_client: Generated new trial 2 with parameters {'x1': 0.2, 'x2': 0.1, 'x3': 0.1, 'x4': 0.3, 'x5': 0.1, 'x6': 0.2}.
[INFO 04-04 14:10:02] ax.service.ax_client: Completed trial 2 with data: {'hartmann6': (-0.237511, 0.0)}.
[INFO 04-04 14:10:02] ax.service.ax_client: Generated new trial 3 with parameters {'x1': 0.1, 'x2': 0.1, 'x3': 0.7, 'x4': 0.0, 'x5': 0.0, 'x6': 0.1}.
[INFO 04-04 14:10:02] ax.service.ax_client: Completed trial 3 with data: {'hartmann6': (-0.048001, 0.0)}.
[INFO 04-04 14:10:02] ax.service.ax_client: Generated new trial 4 with parameters {'x1': 0.1, 'x2': 0.2, 'x3': 0.0, 'x4': 0.1, 'x5': 0.5, 'x6': 0.1}.
[INFO 04-04 14:10:02] ax.service.ax_client: Completed trial 4 with data: {'hartmann6': (-0.067061, 0.0)}.
[INFO 04-04 14:10:03] ax.service.ax_client: Generated new trial 5 with parameters {'x1': 0.1, 'x2': 0.2, 'x3': 0.2, 'x4': 0.3, 'x5': 0.0, 'x6': 0.2}.
[INFO 04-04 14:10:03] ax.service.ax_client: Completed trial 5 with data: {'hartmann6': (-0.111679, 0.0)}.
[INFO 04-04 14:10:03] ax.service.ax_client: Generated new trial 6 with parameters {'x1': 0.2, 'x2': 0.2, 'x3': 0.3, 'x4': 0.1, 'x5': 0.1, 'x6': 0.1}.
[INFO 04-04 14:10:03] ax.service.ax_client: Completed trial 6 with data: {'hartmann6': (-0.100623, 0.0)}.
[INFO 04-04 14:10:03] ax.service.ax_client: Generated new trial 7 with parameters {'x1': 0.0, 'x2': 0.1, 'x3': 0.4, 'x4': 0.0, 'x5': 0.3, 'x6': 0.2}.
[INFO 04-04 14:10:03] ax.service.ax_client: Completed trial 7 with data: {'hartmann6': (-0.295631, 0.0)}.
[INFO 04-04 14:10:03] ax.service.ax_client: Generated new trial 8 with parameters {'x1': 0.3, 'x2': 0.4, 'x3': 0.0, 'x4': 0.0, 'x5': 0.2, 'x6': 0.1}.
[INFO 04-04 14:10:03] ax.service.ax_client: Completed trial 8 with data: {'hartmann6': (-0.067841, 0.0)}.
[INFO 04-04 14:10:03] ax.service.ax_client: Generated new trial 9 with parameters {'x1': 0.3, 'x2': 0.3, 'x3': 0.3, 'x4': 0.1, 'x5': 0.0, 'x6': 0.0}.
[INFO 04-04 14:10:03] ax.service.ax_client: Completed trial 9 with data: {'hartmann6': (-0.033633, 0.0)}.
[INFO 04-04 14:10:03] ax.service.ax_client: Generated new trial 10 with parameters {'x1': 0.0, 'x2': 0.1, 'x3': 0.6, 'x4': 0.1, 'x5': 0.2, 'x6': 0.0}.
[INFO 04-04 14:10:03] ax.service.ax_client: Completed trial 10 with data: {'hartmann6': (-0.063682, 0.0)}.
[INFO 04-04 14:10:04] ax.service.ax_client: Generated new trial 11 with parameters {'x1': 0.2, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.2}.
[INFO 04-04 14:10:04] ax.service.ax_client: Completed trial 11 with data: {'hartmann6': (-0.350309, 0.0)}.
[INFO 04-04 14:10:08] ax.service.ax_client: Generated new trial 12 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.3}.
[INFO 04-04 14:10:08] ax.service.ax_client: Completed trial 12 with data: {'hartmann6': (-0.645269, 0.0)}.
[INFO 04-04 14:10:13] ax.service.ax_client: Generated new trial 13 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.4}.
[INFO 04-04 14:10:13] ax.service.ax_client: Completed trial 13 with data: {'hartmann6': (-1.061896, 0.0)}.
[INFO 04-04 14:10:13] ax.modelbridge.base: Leaving out out-of-design observations for arms: 13_0
[INFO 04-04 14:10:18] ax.service.ax_client: Generated new trial 14 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.4}.
[INFO 04-04 14:10:18] ax.service.ax_client: Completed trial 14 with data: {'hartmann6': (-1.061896, 0.0)}.
[INFO 04-04 14:10:18] ax.modelbridge.base: Leaving out out-of-design observations for arms: 13_0
[INFO 04-04 14:10:22] ax.service.ax_client: Generated new trial 15 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.4}.
[INFO 04-04 14:10:22] ax.service.ax_client: Completed trial 15 with data: {'hartmann6': (-1.061896, 0.0)}.
[INFO 04-04 14:10:22] ax.modelbridge.base: Leaving out out-of-design observations for arms: 13_0
[INFO 04-04 14:10:27] ax.service.ax_client: Generated new trial 16 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.4}.
[INFO 04-04 14:10:27] ax.service.ax_client: Completed trial 16 with data: {'hartmann6': (-1.061896, 0.0)}.
[INFO 04-04 14:10:27] ax.modelbridge.base: Leaving out out-of-design observations for arms: 13_0
[INFO 04-04 14:10:31] ax.service.ax_client: Generated new trial 17 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.4}.
[INFO 04-04 14:10:31] ax.service.ax_client: Completed trial 17 with data: {'hartmann6': (-1.061896, 0.0)}.
[INFO 04-04 14:10:31] ax.modelbridge.base: Leaving out out-of-design observations for arms: 13_0
[INFO 04-04 14:10:36] ax.service.ax_client: Generated new trial 18 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.4}.
[INFO 04-04 14:10:36] ax.service.ax_client: Completed trial 18 with data: {'hartmann6': (-1.061896, 0.0)}.
[INFO 04-04 14:10:36] ax.modelbridge.base: Leaving out out-of-design observations for arms: 13_0
[INFO 04-04 14:10:40] ax.service.ax_client: Generated new trial 19 with parameters {'x1': 0.1, 'x2': 0.0, 'x3': 0.2, 'x4': 0.1, 'x5': 0.3, 'x6': 0.4}.
[INFO 04-04 14:10:40] ax.service.ax_client: Completed trial 19 with data: {'hartmann6': (-1.061896, 0.0)}.
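One plausible mechanism for the out-of-design arms, sketched with a hypothetical point (not taken from the log above): the optimizer proposes a point whose coordinates sum to exactly 1.0, but rounding each coordinate to one digit (which `digits=1` enforces) can push the sum well outside the narrow 0.999–1.0 band.

```python
# Hypothetical point chosen for illustration: it satisfies the constraint
# band 0.999 <= sum(x) <= 1.0 before rounding.
x = [0.06, 0.06, 0.06, 0.06, 0.06, 0.7]
assert abs(sum(x) - 1.0) < 1e-9          # feasible before rounding

# Round each coordinate to one digit, as digits=1 does.
x_rounded = [round(v, 1) for v in x]     # -> [0.1, 0.1, 0.1, 0.1, 0.1, 0.7]
assert abs(sum(x_rounded) - 1.2) < 1e-9  # now violates sum(x) <= 1.0
```

With a band this tight, per-coordinate rounding error of up to 0.05 per parameter can shift the sum by as much as 0.3 in six dimensions, so a feasible candidate easily becomes infeasible after rounding.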
> as a result some trials from ax_client.get_next_trial() end up being out-of-design

This shouldn't really be happening in the first place. It seems candidate generation produces outputs that slightly violate the rather tight constraints.
From your constraints it looks like what you actually want is to constrain the parameters to lie on the simplex? Seems like the ideal solution would be to actually support equality constraints.
cc @mpolson64
> This shouldn't really be happening in the first place. Seems like the candidate generation produces outputs that slightly violate the rather tight constraints.

In principle, getting slightly out-of-design outputs doesn't matter for my use case, since I can simply try a new trial; I'm willing to wait a little longer to get a valid point under these narrow constraints. Getting stuck is the real problem, though: I would need some way to nudge the generator to give me a new point.
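As a stopgap for the stuck generator, one could hand-roll a rejection sampler over the one-digit grid and register the result as a manual trial. This is a sketch under the constraints from the repro above, not Ax API; `sample_in_design` is a hypothetical helper:

```python
import random

def sample_in_design(n_params=6, step=0.1, lo=0.999, hi=1.0,
                     max_tries=100_000, seed=0):
    """Rejection-sample a point on the 0.1 grid whose coordinate sum
    falls in [lo, hi]; returns None if no draw succeeds."""
    rng = random.Random(seed)
    grid = [round(i * step, 1) for i in range(11)]  # 0.0, 0.1, ..., 1.0
    for _ in range(max_tries):
        x = [rng.choice(grid) for _ in range(n_params)]
        # Small tolerance so float summation error doesn't reject sum == 1.0.
        if lo <= sum(x) <= hi + 1e-9:
            return x
    return None

point = sample_in_design()
assert point is not None
```

A point produced this way could then be attached as a manual trial with ax_client.attach_trial and completed normally, which would at least break the repetition while the underlying generation issue is open.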
> From your constraint it looks like what you'd actually want is to just constrain the parameters to live on the simplex? Seems like the ideal solution would be to actually support equality constraints?

Yes, supporting equality constraints would allow more flexible use cases without workarounds.
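Until equality constraints are supported, one workaround along the simplex suggestion might be to reparameterize: expose only x1..x5 to Ax (with the inequality constraint x1 + ... + x5 <= 1.0) and derive x6 inside the evaluation function, so the simplex equality holds exactly and no trial can be out-of-design with respect to it. A minimal sketch; `to_simplex` is a hypothetical helper, not Ax API:

```python
def to_simplex(free_params):
    """Map 5 free parameters (whose sum is <= 1) to a 6-dimensional
    point on the simplex by deriving x6 = 1 - (x1 + ... + x5)."""
    y = [free_params[f"x{i+1}"] for i in range(5)]
    return y + [1.0 - sum(y)]

x = to_simplex({"x1": 0.1, "x2": 0.2, "x3": 0.1, "x4": 0.3, "x5": 0.1})
assert abs(sum(x) - 1.0) < 1e-12  # equality satisfied by construction
```

The trade-off is that `digits` rounding then applies only to the five free parameters, so the derived x6 may not itself land on the 0.1 grid.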
I'm having very similar issues: the optimization gets stuck on an out-of-design sample and generates that same point over and over as a new trial.
I believe that the Sobol fallback on stuck optimization, which @saitcakmak is planning to work on, will help with this, so assigning this to him.