❤️ Found EvoX helpful? Please consider giving it a star ⭐ to show your support!
Experience the transformative power of distributed GPU acceleration in Evolutionary Computation (EC). EvoX isn't just another framework: it's a pioneering toolset crafted to redefine EC's frontiers. Dive deep into a vast collection of Evolutionary Algorithms (EAs) and engage with an expansive range of benchmark problems, tackling everything from intricate tasks to computationally intensive challenges. With EvoX, achieve unmatched speed and adaptability, ensuring your optimization journey is swift and seamless. Embrace the future of EC with EvoX!
Blazing-Fast Performance:
- Experience GPU-Accelerated optimization, achieving speeds 10x-100x faster than traditional methods.
- Leverage the power of distributed workflows for even more rapid optimization.
Versatile Optimization Suite:
- Cater to all your needs with both single-objective and multi-objective optimization capabilities.
- Dive into a comprehensive library of benchmark problems, ensuring robust testing and evaluation.
- Explore the frontier of AI with extensive tools for neuroevolution tasks.
Designed for Simplicity:
- Embrace the elegance of functional programming, simplifying complex algorithmic compositions.
- Benefit from hierarchical state management, ensuring modular and clean programming.
- Jumpstart your journey with our detailed tutorial.
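The functional style above boils down to pure step functions over explicit, immutable state objects: a step takes a state and returns a new one, never mutating anything in place. A minimal plain-Python sketch of that pattern (illustrative only; `EAState`, the toy EA, and every name here are hypothetical, not EvoX's actual API):

```python
import random
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class EAState:
    """Immutable state: the whole population plus a generation counter."""
    population: tuple
    generation: int

def step(state: EAState) -> EAState:
    """Pure update: returns a fresh state instead of mutating the old one.

    Toy elitist EA minimizing x**2: keep the better half of the
    population, refill with Gaussian-perturbed copies of the survivors.
    """
    survivors = sorted(state.population, key=lambda x: x * x)[: len(state.population) // 2]
    offspring = [x + random.gauss(0, 0.1) for x in survivors]
    return replace(state, population=tuple(survivors + offspring),
                   generation=state.generation + 1)

random.seed(0)
state = EAState(population=tuple(random.uniform(-5, 5) for _ in range(10)), generation=0)
for _ in range(200):
    state = step(state)  # old states remain valid and inspectable
best = min(state.population, key=lambda x: x * x)
print(state.generation, round(best, 3))
```

Because `step` never mutates its input, intermediate states can be freely checkpointed, replayed, or shipped across devices; applying the same idea hierarchically across algorithm, problem, and workflow components is what keeps the programming model modular and clean.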
Elevate Your Optimization Game with EvoX! Step into a meticulously crafted platform tailored for both researchers and enthusiasts. Effortlessly traverse vast optimization landscapes, confront widely acknowledged black-box optimization challenges, and venture into the intricate realms of neuroevolution. It's not merely about breadth; it's about velocity. Supercharge your projects with GPU acceleration and streamlined distributed workflows. With a foundation in functional programming and hierarchical state management, EvoX promises a seamless, modular user experience.
- Highlighted Features
- Comprehensive Evolutionary Algorithms
- Diverse Benchmark Problems
- Setting Up EvoX
- Dive Right In: Quick Start
- Explore More with Examples
- Join the EvoX Community
- Citing EvoX
For single-objective optimization:

| Category | Algorithm Names |
| --- | --- |
| Differential Evolution | CoDE, JaDE, SaDE, SHADE, IMODE, ... |
| Evolution Strategies | CMA-ES, PGPE, OpenES, CR-FM-NES, xNES, ... |
| Particle Swarm Optimization | FIPS, CSO, CPSO, CLPSO, SL-PSO, ... |
For multi-objective optimization:

| Category | Algorithm Names |
| --- | --- |
| Dominance-based | NSGA-II, NSGA-III, SPEA2, BiGE, KnEA, ... |
| Decomposition-based | MOEA/D, RVEA, t-DEA, MOEAD-M2M, EAG-MOEAD, ... |
| Indicator-based | IBEA, HypE, SRA, MaOEA-IGD, AR-MOEA, ... |
Benchmark problems:

| Category | Problem Names |
| --- | --- |
| Numerical | DTLZ, LSMOP, MaF, ZDT, CEC'22, ... |
| Neuroevolution | Brax, Gym, TorchVision Dataset, ... |
Dive deeper! For a comprehensive list and further details, explore our API Documentation for algorithms and Benchmark Problems.
Install `evox` effortlessly via `pip`:

```bash
pip install evox
```
Note: To install EvoX with JAX and hardware acceleration capabilities, please refer to our comprehensive installation guide.
Kickstart your journey with EvoX in just a few simple steps:
- Import necessary modules:
```python
import jax
import jax.numpy as jnp

import evox
from evox import algorithms, problems, workflows
```
- Configure an algorithm and define a problem:
```python
pso = algorithms.PSO(
    lb=jnp.full(shape=(2,), fill_value=-32),
    ub=jnp.full(shape=(2,), fill_value=32),
    pop_size=100,
)
ackley = problems.numerical.Ackley()
```
- Compose and initialize the workflow:
```python
workflow = workflows.StdWorkflow(pso, ackley)
key = jax.random.PRNGKey(42)
state = workflow.init(key)
```
- Run the workflow:
```python
# Execute the workflow for 100 iterations
for i in range(100):
    state = workflow.step(state)
```
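Conceptually, each `workflow.step` call asks the algorithm for a population, evaluates it on the problem, and feeds the fitness values back. The following self-contained, pure-Python sketch runs that loop with a toy PSO on the 2-D Ackley function, using the same bounds and population size as the quick start above (illustrative only, not EvoX's GPU-accelerated implementation):

```python
import math
import random

def ackley(x):
    """Standard 2-D Ackley function; global minimum 0 at the origin."""
    d = len(x)
    s1 = sum(v * v for v in x) / d
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / d
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

random.seed(42)
dim, pop_size, lb, ub = 2, 100, -32.0, 32.0
w, c1, c2 = 0.6, 1.5, 1.5  # inertia weight and attraction coefficients

# Initialize positions, velocities, and personal/global bests.
pos = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(pop_size)]
vel = [[0.0] * dim for _ in range(pop_size)]
pbest = [p[:] for p in pos]
pbest_fit = [ackley(p) for p in pos]
gbest_fit = min(pbest_fit)
gbest = pbest[pbest_fit.index(gbest_fit)][:]

for _ in range(100):  # the equivalent of calling workflow.step 100 times
    for i in range(pop_size):
        for j in range(dim):
            # Pull each particle toward its personal and the global best.
            vel[i][j] = (w * vel[i][j]
                         + c1 * random.random() * (pbest[i][j] - pos[i][j])
                         + c2 * random.random() * (gbest[j] - pos[i][j]))
            pos[i][j] = min(max(pos[i][j] + vel[i][j], lb), ub)
        fit = ackley(pos[i])       # evaluate on the problem
        if fit < pbest_fit[i]:     # feed fitness back to the algorithm
            pbest[i], pbest_fit[i] = pos[i][:], fit
            if fit < gbest_fit:
                gbest, gbest_fit = pos[i][:], fit

print(f"best fitness: {gbest_fit:.4f}")
```

EvoX performs the same ask-evaluate-tell cycle, but vectorized over the whole population with JAX so it runs on GPUs and scales across devices.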
Eager to delve deeper? The example directory is brimming with comprehensive use-cases and applications of EvoX.
- Engage in enlightening discussions and share your experiences on GitHub's discussion board.
- Welcome to join our QQ group (ID: 297969717).
We use Weblate for translation; to help us translate the documentation, please visit here.
If EvoX has propelled your research or projects, consider citing our work:
```bibtex
@article{evox,
  title   = {{EvoX}: {A} {Distributed} {GPU}-accelerated {Framework} for {Scalable} {Evolutionary} {Computation}},
  author  = {Huang, Beichen and Cheng, Ran and Li, Zhuozhao and Jin, Yaochu and Tan, Kay Chen},
  journal = {arXiv preprint arXiv:2301.12457},
  eprint  = {2301.12457},
  year    = {2023}
}
```