Simple sweep optimizer written in Python.
The approaches in this repo are exhaustive searches through a combination of hyperparameters (the inputs for the feasible decision space of the objective function).
The class format is based on the adaptive timestep PSO optimizer by jonathan46000, which serves as the data collection baseline. This repo does not feature any PSO optimization; instead, the format has been retained for modularity with other optimizers. Several methods using surrogate models have been moved into their own repositories (Bayesian, Gradient Based Search); see the directory for the most up to date list of optimizers.
Now featuring AntennaCAT hooks for GUI integration and user input handling.
- Sweep Optimization
- Requirements
- Implementation
- Error Handling
- Example Implementations
- References
- Publications and Integration
- How to Cite
- Licensing
A grid-based sweep optimizer, often referred to as grid search, is a simple yet effective optimization technique commonly used for hyperparameter tuning in machine learning models. This method systematically explores a specified subset of the hyperparameter space by evaluating the performance of a model with all possible combinations of the provided hyperparameter values.
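As a minimal illustration of the exhaustive evaluation described above, the sketch below enumerates every combination of the supplied hyperparameter values and keeps the best score. The function and parameter names are illustrative, not from this repo.

```python
# Minimal sketch of an exhaustive grid search (illustrative, not this repo's API).
from itertools import product

def grid_search(objective, grid):
    """Evaluate the objective at every combination of the per-parameter value lists."""
    best_point, best_score = None, float("inf")
    for point in product(*grid.values()):
        params = dict(zip(grid.keys(), point))
        score = objective(**params)
        if score < best_score:
            best_point, best_score = params, score
    return best_point, best_score

# Example: minimize a simple quadratic over a coarse grid.
best, score = grid_search(lambda x, y: (x - 1)**2 + (y + 2)**2,
                          {"x": [-2, -1, 0, 1, 2], "y": [-3, -2, -1, 0]})
```

With the grid above, the minimum lands exactly on (x, y) = (1, -2), illustrating that grid search only finds an optimum when the grid happens to contain it.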
Random search is an optimization method where solutions are randomly sampled from a defined space, evaluated, and iteratively improved based on the evaluations, aiming to find an optimal or near-optimal solution. Random search is generally not as efficient as more advanced optimization algorithms like gradient-based methods or evolutionary algorithms, especially in problems where the search space is structured or the objective function has a particular shape that can be exploited.
This project requires numpy and matplotlib.
Use `pip install -r requirements.txt` to install the following dependencies:
contourpy==1.2.1
cycler==0.12.1
fonttools==4.51.0
importlib_resources==6.4.0
kiwisolver==1.4.5
matplotlib==3.8.4
numpy==1.26.4
packaging==24.0
pillow==10.3.0
pyparsing==3.1.2
python-dateutil==2.9.0.post0
six==1.16.0
zipp==3.18.1
Users must create their own constraint function for their problem if there are constraints beyond the problem bounds; this function is then passed into the constructor. The default constraint function always returns True, meaning no additional constraints are applied.
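A hedged sketch of this pattern is below: the signature (one array-like input, boolean output) is an assumption based on the description above, and `constr_example` is a hypothetical constraint, not one from this repo.

```python
# Default constraint: no constraints beyond the problem bounds (always feasible).
def constr_default(X):
    return True

# Hypothetical user constraint: feasible only when x0 + x1 <= 4.
def constr_example(X):
    return (X[0] + X[1]) <= 4.0
```

Either function would then be passed into the optimizer's constructor in place of the default.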
More search types will be added, but for initial deployment, a standard grid search is used.
The variables from PSO_python have been reused to retain modularity with AntennaCAT and provide some consistency between optimizers.
self.M : An array of current search location(s).
self.output_size : An integer value for the output size of the objective function.
self.Active : An array indicating the activity status of each particle.
self.Gb : Global best position, initialized with a large value.
self.F_Gb : Fitness value corresponding to the global best position.
self.targets : Target values for the optimization process.
self.search_resolution : Step size (resolution) of the search in each dimension.
self.current_step : Current step increment.
self.maxit : Maximum number of iterations.
self.E_TOL : Error tolerance.
self.obj_func : Objective function to be optimized.
self.constr_func : Constraint function.
self.iter : Current iteration count.
self.current_particle : Index of the current particle being evaluated.
self.number_of_particles : Total number of particles.
self.allow_update : Flag indicating whether to allow updates.
self.search_method : search method for the optimization problem.
self.Flist : List to store fitness values.
self.Fvals : Most recently evaluated objective function values.
self.Mlast : Last search location
```python
# Constant variables
NO_OF_PARTICLES = 1              # Number of independent agents searching the space
LB = func_configs.LB             # Lower boundaries
UB = func_configs.UB             # Upper boundaries
OUT_VARS = func_configs.OUT_VARS # Number of output variables (y-values)
TARGETS = func_configs.TARGETS   # Target values for output
MIN_RES = [[0.1, 0.2, 0.3]]      # Minimum resolution for search
MAX_RES = [[0.01, 0.02, 0.01]]   # Maximum resolution for search
E_TOL = 10 ** -3                 # Convergence tolerance. For sweep, this should be a larger value
MAXIT = 5000                     # Maximum number of iterations (useful for debug)
SEARCH_METHOD = 1                # int search: 1 = basic_grid, 2 = random_search
```
The basic grid search uses the current position of a particle (or agent), and increments it one step towards the upper bounds based on the defined problem space. It can use 1 or more particles (or agents) to search a space. If one particle is used, it will start at the lower bound of the decision space, and increment based on the minimum resolution until the particle reaches the maximum boundary limit.
Resolution is a multi-dimensional vector to allow for tuning in all dimensions of the input space.
This method does not tend to converge with a small error tolerance.
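The single-agent stepping described above can be sketched as an odometer-style walk over the decision space. This is an illustration of the idea, not this repo's implementation; the function and variable names are assumptions.

```python
import numpy as np

# Illustrative single-agent grid sweep: start at the lower bound and step by
# the minimum resolution until the upper bound is exceeded.
def sweep_positions(lb, ub, res):
    pos = np.array(lb, dtype=float)
    ub = np.array(ub, dtype=float)
    res = np.array(res, dtype=float)
    points = []
    while np.all(pos <= ub):
        points.append(pos.copy())
        # Increment the first dimension; carry into the next dimension when it
        # overflows, like an odometer rolling over.
        pos[0] += res[0]
        for d in range(len(pos) - 1):
            if pos[d] > ub[d]:
                pos[d] = lb[d]
                pos[d + 1] += res[d + 1]
    return points

pts = sweep_positions([0, 0], [1, 1], [0.5, 0.5])  # 3 x 3 = 9 grid points
```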
The random search generates NO_OF_PARTICLES agents in order to search the defined problem space. Each agent is independent and does not move from its initial generated position.
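Since each agent is sampled once and never moves, the random search amounts to one uniform draw per agent inside the bounds. A minimal sketch (names are assumptions, not this repo's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random search sampling: each agent gets one uniform position
# within [lb, ub] and stays there, matching the description above.
def random_agents(lb, ub, n_agents):
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    return lb + rng.random((n_agents, len(lb))) * (ub - lb)

agents = random_agents([-5, -5], [5, 5], n_agents=10)  # 10 agents, 2 dimensions
```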
A no-preference method of multi-objective optimization is used, so a Pareto front is not calculated. Instead, the best choice (the smallest norm of the output vector) is reported as the output.
The optimizer minimizes the absolute value of the difference from the target outputs and the evaluated outputs. Future versions may include options for function minimization absent target values.
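The target-matching fitness and the norm-based "best choice" selection described above can be sketched as follows (a minimal illustration under the stated scheme, not this repo's exact code):

```python
import numpy as np

# Per-output error: absolute difference between evaluated outputs and targets.
def fitness(outputs, targets):
    return np.abs(np.asarray(outputs, float) - np.asarray(targets, float))

# No-preference selection: the candidate with the smallest Euclidean norm
# of its error vector is reported as the best choice.
def best_candidate(candidates, targets):
    errs = [np.linalg.norm(fitness(c, targets)) for c in candidates]
    return int(np.argmin(errs))

idx = best_candidate([[1.0, 2.0], [0.9, 1.1]], targets=[1.0, 1.0])  # → 1
```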
There are three functions included in the repository:
- Himmelblau's function, which takes 2 inputs and has 1 output
- A multi-objective function with 3 inputs and 2 outputs (see lundquist_3_var)
- A single-objective function with 1 input and 1 output (see one_dim_x_test)
Each function has four files in a directory:
- configs_F.py - contains imports for the objective function and constraints, CONSTANT assignments for functions and labeling, boundary ranges, the number of input variables, the number of output values, and the target values for the output
- constr_F.py - contains a function with the problem constraints, both for the function itself and for error handling in the case of under/overflow.
- func_F.py - contains a function with the objective function.
- graph.py - contains a script to graph the function for visualization.
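For a concrete sense of the func_F.py / constr_F.py pair, here is a hedged sketch for Himmelblau's function (2 inputs, 1 output). The exact file contents in this repo may differ; the `(F, noErrors)` return shape is an assumption based on the error-handling description elsewhere in this README.

```python
import numpy as np

# Sketch of a func_F.py for Himmelblau's function:
# f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2
def func_F(X, NO_OF_OUTS=1):
    F = np.zeros(NO_OF_OUTS)
    noErrors = True
    try:
        x, y = X[0], X[1]
        F[0] = (x**2 + y - 11)**2 + (x + y**2 - 7)**2
    except Exception:
        noErrors = False  # flag under/overflow or bad input for the caller
    return F, noErrors

# Sketch of a constr_F.py: Himmelblau's function has no constraints
# beyond its boundary box.
def constr_F(X):
    return True
```

Evaluating at the known global minimum f(3, 2) = 0 gives a quick sanity check of the pattern.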
Other multi-objective functions can be applied to this project by following the same format. (Several have been collected into a compatible library and will be released in a separate repo.)
Plotted Himmelblau Function with 3D Plot on the Left, and a 2D Contour on the Right
| Global Minima | Boundary | Constraints |
|---|---|---|
| f(3, 2) = 0 | | |
| f(-2.805118, 3.121212) = 0 | | |
| f(-3.779310, -3.283186) = 0 | | |
| f(3.584428, -1.848126) = 0 | | |
Plotted Multi-Objective Function Feasible Decision Space and Objective Space with Pareto Front
| Num. Input Variables | Boundary | Constraints |
|---|---|---|
| 3 | | |
Plotted Single Input, Single-objective Function Feasible Decision Space and Objective Space with Pareto Front
| Num. Input Variables | Boundary | Constraints |
|---|---|---|
| 1 | | |
Local minima at
Global minima at
In the sweep.py class, the objective function is called twice. Some optimizer/objective function/parameter combinations cause under/overflows when using numpy; as of May 2024, this is a known numpy limitation, since base numpy cannot convert Python floats to longdouble (float128) on all platforms.
The two call sites are:

- When the constraints are checked to verify that the particle is in bounds and to apply the selected boundary method. At this point, the 'noErrors' boolean confirms that the objective function resolves; if it does not resolve, or the particle is out of bounds, the boundary conditions are applied.
- When the objective function is evaluated as part of the main search loop (a structure retained from the PSO-based code layout).
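A hedged sketch of guarding an objective-function call against the under/overflow issue described above; the `noErrors` flag and the fallback behavior are assumptions about the pattern, not this repo's exact code.

```python
import numpy as np

# Evaluate an objective function, trapping numpy under/overflow instead of
# letting it crash the sweep; returns (values, noErrors) for the caller.
def safe_eval(obj_func, X):
    with np.errstate(over="raise", invalid="raise"):
        try:
            Fs = np.asarray(obj_func(X), dtype=float)
            noErrors = bool(np.all(np.isfinite(Fs)))
        except (FloatingPointError, OverflowError, ValueError):
            Fs, noErrors = None, False
    return Fs, noErrors

# An overflowing evaluation is caught rather than propagating an exception.
_, ok_bad = safe_eval(lambda X: np.exp(X[0]), [1000.0])   # ok_bad is False
F, ok_good = safe_eval(lambda X: [X[0]**2], [3.0])        # ok_good is True
```

When `noErrors` is False, the caller can apply the boundary method instead of using the (invalid) objective value.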
main_test.py provides a sample use case of the optimizer.
main_test_details.py provides an example using a parent class, and the self.suppress_output and detailedWarnings flags to control error messages that are passed back to the parent class to be printed with a timestamp. This implementation sets up the hooks for integration with AntennaCAT in order to provide the user feedback of warnings and errors.
Grid Search. Left: particle search locations. Right: fitness function results (open circles) and target (red star).
Random Search. Left: particle search locations. Right: fitness function results (open circles) and target (red star).
main_test_graph.py provides an example using a parent class, and the self.suppress_output and detailedWarnings flags to control error messages that are passed back to the parent class to be printed with a timestamp. Additionally, a real-time graph shows particle locations at every step.
The figures above are snapshots of the search. The left side shows all of the search locations of a single particle (NOTE: toggle the 'clear' boolean to turn this feature off), and the right side shows the target (marked by a star) and the fitness function results (the open circles). While the fitness of the particle gets very close to the target, it does not come within the 10E-6 tolerance, so the search does not converge.
NOTE: if you close the graph as the code is running, the code will continue to run, but the graph will not re-open.
This repo does not currently reference any code or papers for the sweep algorithm.
For the original code base, see the adaptive timestep PSO optimizer by jonathan46000.
This software works as a stand-alone implementation, and as one of the optimizers integrated into AntennaCAT.
Publications featuring the code in this repo will be added as they become public.
This is a basic sweep algorithm, and as such is not based on any particular code, repo, or publication.
If you wish to cite this work, cite either the repository or the tie-in AntennaCAT publication.
The code in this repository has been released under GPL-2.0