cat_swarm_python

An AntennaCAT-compatible cat swarm optimizer. Includes branches for cat swarm, sand cat swarm, and a 'quantum'-inspired movement model for cat swarm.

Basic cat swarm optimizer written in Python. Modified from the adaptive timestep PSO optimizer by jonathan46000 to keep a consistent format between optimizers in AntennaCAT.

Now featuring AntennaCAT hooks for GUI integration and user input handling.

Cat Swarm Optimization

Cat Swarm Optimization (CSO) is a nature-inspired optimization algorithm based on the behaviors of cats. It was introduced in "Cat Swarm Optimization" [1] in 2006. The algorithm resembles traditional Particle Swarm Optimization (PSO) in that it has location and velocity aspects, but the agents/particles have two options for movement in the update step. These two options are modeled on two primary behaviors of cats: seeking and tracing, which represent the exploration and exploitation phases of optimization, respectively.

CSO divides the population of candidate solutions (cats) into two groups: one group for the seeking mode and another for the tracing mode. Each cat can switch between these modes according to a specified probability.

  1. Seeking Mode:

The seeking mode is responsible for exploring the search space to discover new and potentially better solutions. In this mode, cats simulate a behavior where they observe their surroundings and make decisions to move to new positions based on several possible moves. This helps the algorithm avoid getting stuck in local optima.

  2. Tracing Mode:

The tracing mode is responsible for exploiting the search space by following the best solutions found so far. In this mode, cats simulate a behavior where they move towards a promising position, akin to a cat chasing prey. This helps refine solutions and converge towards the global optimum.
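To make the two modes concrete, the sketch below shows a simplified seeking/tracing update in Python. It only illustrates the general CSO idea described above, not this repository's implementation; the helper names and parameters (n_copies, srd, the PSO-style coefficients) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng()

def seeking_move(cat, objective, lb, ub, n_copies=5, srd=0.2):
    """Seeking mode (exploration): evaluate several perturbed copies of the
    cat's position and move to the best candidate."""
    candidates = [cat]
    for _ in range(n_copies):
        step = srd * (ub - lb) * rng.uniform(-1, 1, size=cat.shape)
        candidates.append(np.clip(cat + step, lb, ub))
    return min(candidates, key=objective)

def tracing_move(cat, velocity, global_best, w=0.7, c=2.05):
    """Tracing mode (exploitation): a PSO-like velocity update that pulls the
    cat toward the best position found so far."""
    velocity = w * velocity + c * rng.random(cat.shape) * (global_best - cat)
    return cat + velocity, velocity
```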

Requirements

This project requires numpy and matplotlib.

Use `pip install -r requirements.txt` to install the following dependencies:

contourpy==1.2.1
cycler==0.12.1
fonttools==4.51.0
importlib_resources==6.4.0
kiwisolver==1.4.5
matplotlib==3.8.4
numpy==1.26.4
packaging==24.0
pillow==10.3.0
pyparsing==3.1.2
python-dateutil==2.9.0.post0
six==1.16.0
zipp==3.18.1

Implementation

Constraint Handling

Users must create their own constraint function for their problem if there are constraints beyond the problem bounds; this function is then passed into the constructor. If the default constraint function is used, it always returns True, meaning no additional constraints are applied.
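A constraint function only needs to return True for feasible candidates and False otherwise, which is the contract described above. A minimal sketch follows; the feasibility rule below is hypothetical, and the exact signature expected by the constructor should be checked against constr_F.py in the example directories.

```python
def no_constraints(X):
    # Default behavior described above: every candidate is feasible.
    return True

def constr_example(X):
    # Hypothetical constraint beyond the problem bounds:
    # require the sum of the input variables to stay at or below 1.
    return sum(X) <= 1.0
```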

Boundary Types

This optimizer has four types of boundary handling:

  1. Random - particles that leave the search space are respawned at a random location
  2. Reflection - particles that hit the bounds are reflected back into the search space
  3. Absorb - particles that hit the bounds lose velocity in that direction
  4. Invisible - out-of-bounds particles are no longer evaluated

Some updates have not yet incorporated appropriate handling for all boundary conditions; this is a known bug and is being worked on. The most consistent boundary type at the moment is Random. If constraints are violated but the bounds are not, the Random boundary rules are currently used to handle the violation.
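The sketch below illustrates what the four boundary rules mean for a single particle. It is a conceptual example, not the repository's internal handling, and the function name and arguments are assumptions.

```python
import numpy as np

def apply_boundary(pos, vel, lb, ub, rule, rng=None):
    """Apply one of the four boundary rules described above to a particle."""
    rng = rng or np.random.default_rng()
    out_low, out_high = pos < lb, pos > ub
    out = out_low | out_high
    if rule == "random":        # respawn out-of-bounds coordinates randomly
        pos = np.where(out, rng.uniform(lb, ub, size=pos.shape), pos)
    elif rule == "reflection":  # reflect back off the violated bound
        pos = np.where(out_low, 2 * lb - pos, pos)
        pos = np.where(out_high, 2 * ub - pos, pos)
        vel = np.where(out, -vel, vel)
    elif rule == "absorb":      # clamp to the bound and zero that velocity component
        pos = np.clip(pos, lb, ub)
        vel = np.where(out, 0.0, vel)
    elif rule == "invisible":   # leave the particle alone; the caller skips evaluating it
        pass
    return pos, vel
```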

Multi-Objective Optimization

This optimizer uses a no-preference method of multi-objective optimization; a Pareto front is not calculated. Instead, the best choice (the solution with the smallest norm of the output vector) is reported as the output.

Objective Function Handling

The optimizer minimizes the absolute value of the difference between the target outputs and the evaluated outputs. Future versions may include options for function minimization without explicit target values.
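A brief sketch of this error measure, tying it to the no-preference selection above; the names here are illustrative, not the optimizer's internal API.

```python
import numpy as np

def error_norm(evaluated_outputs, target_outputs):
    # Absolute difference between target and evaluated outputs,
    # collapsed to a single value via the vector norm.
    abs_error = np.abs(np.asarray(target_outputs) - np.asarray(evaluated_outputs))
    return np.linalg.norm(abs_error)

# Example: among several candidates' output vectors, the one with the
# smallest error norm is reported as the best choice.
targets = [0.0, 0.0]
candidates = [[0.3, 0.9], [0.1, 0.2]]
best = min(candidates, key=lambda out: error_norm(out, targets))  # [0.1, 0.2]
```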

Internal Objective Function Example

There are three functions included in the repository:

  1. Himmelblau's function, which takes 2 inputs and has 1 output
  2. A multi-objective function with 3 inputs and 2 outputs (see lundquist_3_var)
  3. A single-objective function with 1 input and 1 output (see one_dim_x_test)

Each function has four files in a directory:

  1. configs_F.py - contains imports for the objective function and constraints, CONSTANT assignments for functions and labeling, boundary ranges, the number of input variables, the number of output values, and the target values for the output
  2. constr_F.py - contains a function with the problem constraints, both for the function and for error handling in the case of under/overflow.
  3. func_F.py - contains a function with the objective function.
  4. graph.py - contains a script to graph the function for visualization.

Other multi-objective functions can be applied to this project by following the same format, as sketched below (several have been collected into a compatible library and will be released in a separate repository).
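As a rough illustration of that format, a new problem might pair an objective function file with a constraint file along the lines below (here using the multi-objective example from this README). The function and file names match the descriptions above, but the exact signatures the optimizer expects are an assumption; compare against the included examples before use.

```python
# func_F.py - objective function: maps an input vector X to a list of outputs.
def func_F(X):
    f1 = (X[0] - 0.5) ** 2 + (X[1] - 0.1) ** 2
    f2 = (X[2] - 0.2) ** 4
    return [f1, f2]

# constr_F.py - constraint function: True if the input vector X is feasible.
def constr_F(X):
    return (X[2] > X[0] / 2) or (X[2] < 0.1)
```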

Himmelblau function

Plotted Himmelblau Function with 3D Plot on the Left, and a 2D Contour on the Right

$$f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2$$
| Global Minima | Boundary Constraints |
|---------------|----------------------|
| f(3, 2) = 0 | $-5 \leq x,y \leq 5$ |
| f(-2.805118, 3.131312) = 0 | $-5 \leq x,y \leq 5$ |
| f(-3.779310, -3.283186) = 0 | $-5 \leq x,y \leq 5$ |
| f(3.584428, -1.848126) = 0 | $-5 \leq x,y \leq 5$ |
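A quick numerical check of the function above at one of its listed global minima (illustrative snippet only; the repository provides its own func_F.py and graph.py for this function):

```python
def himmelblau(x, y):
    return (x ** 2 + y - 11) ** 2 + (x + y ** 2 - 7) ** 2

print(himmelblau(3.0, 2.0))  # 0.0
```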

Multi-Objective Function

Plotted Multi-Objective Function Feasible Decision Space and Objective Space with Pareto Front

$$\text{minimize}: \begin{cases} f_{1}(\mathbf{x}) = (x_1-0.5)^2 + (x_2-0.1)^2 \\ f_{2}(\mathbf{x}) = (x_3-0.2)^4 \end{cases}$$
| Num. Input Variables | Boundary Constraints |
|----------------------|-----------------------|
| 3 | $0.21\leq x_1\leq 1$ <br> $0\leq x_2\leq 1$ <br> $0.1 \leq x_3\leq 0.5$ <br> $x_3\gt \frac{x_1}{2}$ or $x_3\lt 0.1$ |

Single Input, Single-Objective Function

Plotted Single Input, Single-objective Function Feasible Decision Space and Objective Space with Pareto Front

$$f(x) = \sin(5x^3) + \cos(5x)\left(1 - \tanh(x^2)\right)$$
| Num. Input Variables | Boundary Constraints |
|----------------------|-----------------------|
| 1 | $0\leq x\leq 1$ |

Local minimum at $(0.444453, -0.0630916)$

Global minimum at $(0.974857, -0.954872)$

Example Implementations

Basic Swarm Example

main_test.py provides a sample use case of the optimizer.

Detailed Messages

main_test_details.py provides an example using a parent class, and the self.suppress_output and detailedWarnings flags to control error messages that are passed back to the parent class to be printed with a timestamp. This implementation sets up the hooks for integration with AntennaCAT to provide the user feedback of warnings and errors.

Realtime Graph

Example Cat Swarm Optimization

main_test_graph.py provides an example using a parent class, and the self.suppress_output and detailedWarnings flags to control error messages that are passed back to the parent class to be printed with a timestamp. Additionally, a real-time graph shows particle locations at every step. In this example, the cat swarm is not well tuned to the problem and converges slowly, but the error from the target is relatively small.

NOTE: if you close the graph as the code is running, the code will continue to run, but the graph will not re-open.

References

[1] S.-C. Chu, P. Tsai, and J.-S. Pan, “Cat Swarm Optimization,” Lecture Notes in Computer Science, pp. 854–858, 2006, doi: https://doi.org/10.1007/978-3-540-36668-3_94.

Publications and Integration

This software works as a stand-alone implementation, and as one of the optimizers integrated into AntennaCAT.

Publications featuring the code in this repo will be added as they become public.

Licensing

The code in this repository has been released under GPL-2.0.