blackbox-optimization
There are 86 repositories under the blackbox-optimization topic.
google/vizier
Python-based research interface for blackbox and hyperparameter optimization, based on the internal Google Vizier Service.
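A hedged sketch of how a study loop with OSS Vizier's Python client typically looks. The class and method names follow the project's published quick-start, but the algorithm string and the Measurement constructor are assumptions and should be checked against the repo.

```python
# Hedged sketch of an OSS Vizier study loop (names follow the public
# quick-start; verify signatures against google/vizier before use).
from vizier.service import clients
from vizier.service import pyvizier as vz

# Define the search space and the metric to maximize.
problem = vz.ProblemStatement()
problem.search_space.root.add_float_param("x", 0.0, 1.0)
problem.metric_information.append(
    vz.MetricInformation(name="objective", goal=vz.ObjectiveMetricGoal.MAXIMIZE)
)

study_config = vz.StudyConfig.from_problem(problem)
study_config.algorithm = "GAUSSIAN_PROCESS_BANDIT"  # assumed algorithm name

study = clients.Study.from_study_config(
    study_config, owner="demo", study_id="blackbox_example"
)

# Ask-and-tell loop: request suggestions, evaluate the blackbox, report back.
for _ in range(10):
    for suggestion in study.suggest(count=1):
        x = suggestion.parameters["x"]
        value = -(x - 0.3) ** 2  # toy blackbox objective
        suggestion.complete(vz.Measurement({"objective": value}))
```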
SimonBlanke/Gradient-Free-Optimizers
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
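As a concrete picture of the ask/evaluate style these optimizers share, here is a minimal sketch using Gradient-Free-Optimizers' hill-climbing optimizer: the objective receives a parameter dict and the returned score is maximized. Class and attribute names follow the library's README but should be verified against the current release.

```python
# Minimal Gradient-Free-Optimizers sketch: maximize a score over a
# discretized numerical search space (verify class names against the repo).
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer

def objective(para):
    # Negative sphere function, so the maximum sits at x = y = 0.
    return -(para["x"] ** 2 + para["y"] ** 2)

search_space = {
    "x": np.arange(-5, 5, 0.1),
    "y": np.arange(-5, 5, 0.1),
}

opt = HillClimbingOptimizer(search_space)
opt.search(objective, n_iter=200)
print(opt.best_para, opt.best_score)
```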
chrisstroemel/Simple
Experimental Global Optimization Algorithm
PKU-DAIR/open-box
Towards Generalized and Efficient Blackbox Optimization System/Package (KDD 2021 & JMLR 2024)
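A hedged sketch of the define-and-run pattern OpenBox documents: build a search space, write an objective that returns a dict of objective values, and hand both to the Optimizer. The space helpers and the `{"objectives": [...]}` return format are assumptions based on the project's quick-start and may differ between versions.

```python
# Hedged OpenBox sketch; space helpers and the objective's return format
# follow the quick-start style and may differ across versions.
from openbox import Optimizer, space as sp

space = sp.Space()
space.add_variables([sp.Real("x1", -5.0, 10.0), sp.Real("x2", 0.0, 15.0)])

def objective(config):
    x1, x2 = config["x1"], config["x2"]
    y = (x2 - 0.1 * x1 ** 2 + x1 - 6.0) ** 2 + 10.0
    return {"objectives": [y]}  # single-objective minimization

opt = Optimizer(objective, space, max_runs=50, task_id="openbox_demo")
history = opt.run()
print(history)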
ARM-software/mango
Parallel Hyperparameter Tuning in Python
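A hedged sketch of mango's decorator-based interface: the parameter space is a plain dict, the objective is wrapped with a scheduler decorator, and the Tuner drives the (here serial) evaluation. Names and result keys follow the README examples and are assumptions to verify.

```python
# Hedged mango sketch; decorator and result keys follow the README examples.
from mango import Tuner, scheduler

param_space = {"x": range(-10, 10)}

@scheduler.serial  # evaluate one configuration at a time
def objective(x):
    return -(x ** 2)

tuner = Tuner(param_space, objective)
results = tuner.maximize()
print(results["best_params"], results["best_objective"])
```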
libprima/prima
PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation for Powell's derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA means Reference Implementation for Powell's methods with Modernization and Amelioration, P for Powell.
c-bata/goptuna
A hyperparameter optimization framework, inspired by Optuna.
bbopt/nomad
NOMAD - A blackbox optimization software
microprediction/humpday
Elo ratings for global black box derivative-free optimizers
WilliamLwj/PyXAB
PyXAB - A Python Library for X-Armed Bandit and Online Blackbox Optimization Algorithms
auto-flow/ultraopt
Distributed asynchronous hyperparameter optimization, better than HyperOpt (a distributed asynchronous hyperparameter optimization library more powerful than HyperOpt).
pdfo/pdfo
Powell's Derivative-Free Optimization solvers.
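PDFO exposes Powell's derivative-free solvers behind a SciPy-style call. A minimal sketch follows; the automatic solver selection and the result fields mirror scipy.optimize conventions, but check the package documentation for exact options.

```python
# Minimal PDFO sketch; the interface mirrors scipy.optimize.minimize.
from pdfo import pdfo

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Let PDFO pick a suitable Powell solver (e.g. NEWUOA for unconstrained problems).
res = pdfo(rosenbrock, [0.0, 0.0])
print(res.x, res.fun)
```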
logicalclocks/maggy
Distribution-transparent machine learning experiments on Apache Spark
tilleyd/cec2017-py
Python module for CEC 2017 single objective optimization test function suite.
parmoo/parmoo
Python library for parallel multiobjective simulation optimization
thomas-young-2013/open-box
Generalized and Efficient Blackbox Optimization System.
evhub/bbopt
Black box hyperparameter optimization made easy.
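bbopt's inline style differs from the ask/tell loops above: parameters are declared inside the training script itself, and the script is re-run by the optimizer to accumulate trials. A hedged sketch, with method names taken from the README and treated as assumptions:

```python
# Hedged bbopt sketch; method names follow the README (run the script
# repeatedly, e.g. via the bbopt CLI, to accumulate trials).
from bbopt import BlackBoxOptimizer

bb = BlackBoxOptimizer(file=__file__)
if __name__ == "__main__":
    bb.run()  # pick the next set of parameter values to try

x = bb.uniform("x", 0, 1)      # a tunable parameter
loss = (x - 0.3) ** 2          # toy objective
bb.minimize(loss)              # report the result for this run
```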
Hvass-Labs/swarmops
Heuristic Optimization for Python
OPTML-Group/DeepZero
[ICLR'24] "DeepZero: Scaling up Zeroth-Order Optimization for Deep Model Training" by Aochuan Chen*, Yimeng Zhang*, Jinghan Jia, James Diffenderfer, Jiancheng Liu, Konstantinos Parasyris, Yihua Zhang, Zheng Zhang, Bhavya Kailkhura, Sijia Liu
PKU-DAIR/mindware
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyperparameter tuning.
Hvass-Labs/MetaOps
Tuning the Parameters of Heuristic Optimizers (Meta-Optimization / Hyper-Parameter Optimization)
bbopt/NOMAD.jl
Julia interface to the NOMAD blackbox optimization software
nestordemeure/Simplers
Rust implementation of the Simple(x) Global Optimization algorithm
jbrea/CMAEvolutionStrategy.jl
A Julia implementation of the CMA Evolution Strategy for derivative-free optimization of potentially non-linear, non-convex or noisy functions over continuous domains.
sibirbil/marsopt
Mixed Adaptive Random Search (MARS) for Optimization
boschresearch/blackboxopt
Blackbox optimization algorithms with a common interface, along with useful helpers like parallel optimization loops, analysis and visualization scripts.
ma921/SOBER
Fast Bayesian optimization, quadrature, and inference over arbitrary domains with GPU-parallel acceleration
damon-demon/Black-Box-Defense
Robustify Black-Box Models (ICLR'22 - Spotlight)
mila-iqia/Awesome-Offline-Model-Based-Optimization
📰 Must-Read Papers on Offline Model-Based Optimization 🔥
sambo-optimization/sambo
🎯 📈 Sequential and model-based optimization with SCE-UA, SMBO, and SHGO algorithms. No dependencies; SOTA performance.
Artelys/knitro-modeling-examples
Nonlinear programming application examples solved with Artelys Knitro
bbopt/HyperNOMAD
A library for the hyperparameter optimization of deep neural networks
bbopt/solar
The SOLAR blackbox optimization problem
optuna/bboc-optuna-developers
Black-box optimizer submitted to the BBO challenge at NeurIPS 2020
rajcscw/pytorch-optimize
A simple black-box optimization framework to train your PyTorch models on non-differentiable objectives
Shahul-Rahman/SPGD-Search-Party-Gradient-Descent-algorithm
SPGD: Search Party Gradient Descent algorithm, a Simple Gradient-Based Parallel Algorithm for Bound-Constrained Optimization. Link: https://www.mdpi.com/2227-7390/10/5/800