MVMO

Python package for heuristic optimization


Mean Variance Mapping Optimization Algorithm

MVMO is a Python package for heuristic optimization of constrained and unconstrained problems whose convexity and/or linearity may not be fully known. It is based on swarm optimization principles and uses a continuously updated mean and variance of the best solutions found during the optimization process. Note: since this is a heuristic algorithm, it does not guarantee the optimal solution, but a near-optimal one. It does so, however, in far less time than traditional optimization solvers.

Installation

MVMO can be installed from PyPI using:

pip install MVMO

MVMO requires numpy and pandas to work.

Usage

Initialisation

The MVMO optimizer is initialised with the arguments iterations, num_mutation, and population_size, which set the key parameters of the algorithm.
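
For example, using the same parameter values as the full example at the end of this README:

from MVMO import MVMO

optimizer = MVMO(iterations=5000, num_mutation=1, population_size=10)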

Defining the objective function

By default, MVMO minimizes the objective function; for maximisation, the objective function must be modified accordingly (for example by negating it). Objective functions can be defined by the user as required, as shown in the sketch after the following list and in the full example later. The MVMO package also provides the following benchmark test functions from Wikipedia:

  1. Rosenbrock
  2. Matyas
  3. Booth
  4. Himmelblau
  5. Easom
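
As a minimal sketch, an objective is just a Python function of the decision vector X (the sphere function below is a hypothetical example), and the bundled benchmarks can be used directly:

from MVMO import test_functions

# Hypothetical user-defined objective: sphere function, minimised at the origin
def sphere(X):
    return X[0]**2 + X[1]**2

# Bundled benchmark, e.g. the Rosenbrock function used in the example below
function = test_functions.rosen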

Constraint definition

Constraints can be inequalities or equalities. They are passed as a dictionary with the keys ineq, eq, or func to indicate whether the specified constraint is an inequality, an equality, or a function definition. Inequality and equality constraints are specified as str expressions and must follow the convention:

g(x) <= 0    # inequality constraint
h(x) = 0     # equality constraint

Complex constraints can also be defined as Python functions. An example of how to use the MVMO package for constrained optimization is shown later; it uses all three constraint definitions.
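
For example, the constraint set used in the full example below combines all three forms; the string expressions follow the g(x) <= 0 and h(x) = 0 convention above:

def func_constr(X):
    # Functional constraint: solutions must lie inside the unit circle
    return X[0]**2 + X[1]**2 < 1

constr = {'ineq': "(X[0] - 1)**3 - X[1] + 1",   # (X[0] - 1)**3 - X[1] + 1 <= 0
          'eq': "X[0] + X[1] - 2",              # X[0] + X[1] - 2 = 0
          'func': func_constr}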

Binary and Integer variables

MVMO also makes it easy to declare binary and integer decision variables. This is done by passing the indices of those variables via the binary or integer keywords of the optimize function, as shown below and in the full example later.
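
For instance, assuming optimizer, function, bds, and constr are defined as in the full example below, the first decision variable is made binary and the second an integer like this:

# Variable 0 is binary, variable 1 is integer
res = optimizer.optimize(obj_fun=function, bounds=bds, constraints=constr,
                         binary=[0], integer=[1])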

Optimization

The optimize() method is called on the optimizer to perform the optimization. It returns a res dictionary object upon completion; a short access example follows the list below. The dictionary contains:

  1. objective: the best objective function value found, and where it was obtained.
  2. x: the optimized decision vector.
  3. convergence: the list of objective function values over the course of the optimization, which can be used to plot a convergence graph.
  4. register: a pandas DataFrame of size population_size containing the best saved objective function values and X vectors.
  5. metrics: the internal mean and variance of the stored solutions used by the optimization.
  6. scaling_factors: the list of scaling factors used over the iterations.
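
As a sketch, assuming res is the result of a completed run, individual fields are accessed by key:

print(res['objective'])    # best objective function value and where it was obtained
print(res['x'])            # optimized decision vector
print(res['register'])     # DataFrame of the best saved solutions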

The convergence graph can be plotted with MVMO.plot(res['convergence']).

The following example shows minimization of the constrained Rosenbrock function:

from MVMO import MVMO
from MVMO import test_functions
function = test_functions.rosen
optimizer = MVMO(iterations=5000, num_mutation=1, population_size=10)

def func_constr(X):
    # Functional constraint: solutions must lie inside the unit circle
    return X[0]**2 + X[1]**2 < 1

bds = [(0,1.5), (1,3.5)]
constr = {'ineq':"(X[0] - 1)**3 - X[1] + 1",
		  'eq':"X[0]+X[1]-2",
		  'func':func_constr}
res = optimizer.optimize(obj_fun=function, bounds=bds, constraints=constr, binary=[0], integer=[1])

print(res['x'])

MVMO.plot(res['convergence'])