jmejia8/Metaheuristics.jl

Issues from encapsulating `State` in `Algorithm`

jonathanfischer97 opened this issue · 2 comments

Issue

  • The Algorithm structure currently holds a State object. This design is counterintuitive: Algorithm should ideally represent static parameters, options, and optimization criteria, while State should hold the mutable information updated during the algorithm's execution.
  • Encapsulating State within Algorithm can lead to confusion about the responsibilities of the two structures, as well as unintended side effects.
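To make the coupling concrete, here is a minimal sketch of the layout described above. The struct and field names are illustrative assumptions based on this issue's description, not the package's exact definitions:

```julia
# Hypothetical sketch of the current design; names are illustrative only.
mutable struct SketchState
    best_sol::Float64   # mutable search information, updated every iteration
    iteration::Int
end

struct SketchAlgorithm
    parameters::Symbol  # stand-in for static method settings (e.g. a DE strategy)
    status::SketchState # mutable State stored inside the otherwise-static struct
end

alg = SketchAlgorithm(:best1, SketchState(Inf, 0))
alg.status.best_sol = 1.5   # the "static" alg is mutated through its status field
```

The last line is the crux: even though `SketchAlgorithm` is immutable, its `status` field is a mutable object, so the method object silently accumulates run-specific state.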

Unnecessary encapsulation

In the main optimization loop, most operations interface directly with State, suggesting that the current encapsulation within Algorithm isn't fully exploited, or even necessary.

For instance, the update_state! function is passed the fields of Algorithm as separate arguments, which seems to defeat the purpose of encapsulating State within Algorithm.

Here's an example from during.jl:

function _during_optimization!(status, parameters, problem, information, options, convergence, logger, termination)
    ...
    update_state!(status, parameters, problem, information, options)
    ...
end

This indicates that State is already treated as an independent object in the source code and doesn't need any relation to Algorithm. As such, a refactor shouldn't require extensive changes, from what I've seen.

User Impact

This issue became apparent when using the package in a typical way:

# Define method
de = DE(; N = 10000, F_min = 0.5, F_max = 1.0, strategy = :best1, options = options)

# Run optimization and return final state
de_optimization = optimize(objective_parallel, bounds, de, logger=logger)
  1. In this example, de is assumed to hold static optimization parameters that can be reused. However, it also holds a State in its status field, which is not obvious unless one examines the source code.

    • As a side note, it's also not obvious that the DE constructor returns an Algorithm rather than the type DE <: AbstractDifferentialEvolution that it shares a name with. I can understand why this construction was done, but I think this is atypical and another potential source of confusion.
  2. Because de is mutated during the optimize call, running the optimization again without reinitializing de starts the search from the previous run's best solution rather than from a clean slate.

  3. This can lead to unintended side effects for users running benchmarks or comparisons between different optimization options, as I was.
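The carried-over state can be reproduced with a toy example. This is not the real Metaheuristics.jl API, just an illustration of the pattern:

```julia
# Toy reproduction of the side effect: a method object that carries mutable
# state keeps results across calls unless it is rebuilt.
mutable struct ToyState
    best::Float64
end

struct ToyMethod
    status::ToyState
end

toy_optimize!(m::ToyMethod, x) = (m.status.best = min(m.status.best, x); m.status.best)

m = ToyMethod(ToyState(Inf))
r1 = toy_optimize!(m, 3.0)   # first run: best becomes 3.0
r2 = toy_optimize!(m, 5.0)   # second run starts from the stale best: still 3.0
```

Until a refactor lands, a practical workaround is to reconstruct the method (e.g. call `DE(...)` again) before every `optimize` call, so each run starts from a fresh State.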

Proposed Solution

Consider refactoring the code to separate these two concepts more clearly.

This could involve removing the State object from Algorithm and instead passing it as a separate argument to the functions that need to mutate it. Specifically within src/optimize:

  1. The user only passes the Algorithm method, without the status field, to optimize.

  2. optimize calls _before_optimization!, which creates a new State object, rather than mutating with initialize!(method.status, ...) as it does currently.

  3. _during_optimization! and _after_optimization! are passed status separately from the other Algorithm properties, which they already do, so no changes are needed after initialization!
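Under the proposed separation, the flow could look roughly like the following sketch, in which the method object is never mutated and a fresh State is created inside each call. The names `initialize_state` and `optimize_sketch` are hypothetical stand-ins, not the package's functions:

```julia
mutable struct FreshState
    best::Float64
end

# Stand-in for the proposed _before_optimization!: it returns a new State
# instead of mutating method.status in place.
initialize_state() = FreshState(Inf)

function optimize_sketch(f, xs, parameters)
    status = initialize_state()        # fresh State per call
    for x in xs                        # stand-in for _during_optimization!
        fx = f(x)
        fx < status.best && (status.best = fx)
    end
    return status                      # final state returned; parameters untouched
end

s1 = optimize_sketch(abs2, -1.0:0.5:1.0, :best1)
s2 = optimize_sketch(abs2, -1.0:0.5:1.0, :best1)  # identical result: no carried-over state
```

Because `status` is local to each call, repeated runs with the same method object are independent, which is exactly the behavior expected when benchmarking.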

I'd be interested to hear your thoughts on this. If you agree with this assessment, I'd be happy to help with the refactoring; as I said, there don't seem to be many dependencies on Algorithm anyway.

Hi @jonathanfischer97

Thank you for the very good issue. Of course, many components of the code base can be improved, and your contribution is more than welcome. Please feel free to send PRs for this issue, though the refactor may have to wait for v4.

Actually, the code base is planned to be updated during 2024, solving a lot of issues related to the design and performance of the package.

No problem! I love this package and have been using it extensively in my research. I have a lot of ideas for improvements.

I'll fork it and make PRs for modifications I think would be general improvements.