zeroth-order-optimization

There are 23 repositories under the zeroth-order-optimization topic.
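Zeroth-order (ZO) methods, the common thread of these repositories, optimize a function using only its values, typically by estimating gradients from finite differences along random directions. A minimal illustrative sketch of a two-point random-direction estimator (all names here are hypothetical, not taken from any listed repository):

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, n_samples=20, rng=None):
    """Two-point zeroth-order gradient estimate:
    g ~ mean over u of [(f(x + mu*u) - f(x - mu*u)) / (2*mu)] * u,
    where u are random Gaussian directions."""
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_samples

def zo_descent(f, x0, lr=0.1, steps=200):
    """Plain gradient descent driven by the ZO estimate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x -= lr * zo_gradient(f, x)
    return x

# Minimize a simple quadratic without ever evaluating its true gradient.
x_star = zo_descent(lambda x: np.sum((x - 3.0) ** 2), np.zeros(4))
```

The per-step query cost is 2 × `n_samples` function evaluations, which is why much of the work in the repositories below focuses on making such estimators query-efficient (sparsity, adaptive sampling, coordinate subsets).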

  • Evolutionary-Intelligence/pypop

    PyPop7: A Pure-Python Library for POPulation-based Black-Box Optimization (BBO), especially their *Large-Scale* versions/variants. https://pypop.rtfd.io/

    Language: Python
  • max-andr/square-attack

    Square Attack: a query-efficient black-box adversarial attack via random search [ECCV 2020]

    Language: Python
  • microprediction/humpday

    Elo ratings for global black-box derivative-free optimizers

    Language: Python
  • LeCAR-Lab/CoVO-MPC

    Official implementation for the paper "CoVO-MPC: Theoretical Analysis of Sampling-based MPC and Optimal Covariance Design" accepted by L4DC 2024. CoVO-MPC is an optimal sampling-based MPC algorithm.

    Language: Python
  • pdfo/pdfo

    Powell's Derivative-Free Optimization solvers.

    Language: Fortran
  • LeCAR-Lab/model-based-diffusion

    Official implementation for the paper "Model-based Diffusion for Trajectory Optimization". Model-based diffusion (MBD) is a novel diffusion-based trajectory optimization framework that employs a dynamics model to run the reverse denoising process to generate high-quality trajectories.

    Language: Jupyter Notebook
  • as791/ZOO_Attack_PyTorch

    This repository contains the PyTorch implementation of Zeroth Order Optimization Based Adversarial Black Box Attack (https://arxiv.org/abs/1708.03999)

    Language: Python
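The core idea behind the ZOO attack above is to estimate per-coordinate gradients of the attack loss purely from model queries, updating only a small random subset of coordinates per step to keep the query count manageable. A hedged sketch of that coordinate-wise estimator (hypothetical function names, not the repository's actual API):

```python
import numpy as np

def coordinate_grad(loss, x, idx, h=1e-4):
    """ZOO-style symmetric finite difference along one coordinate:
    dL/dx_i ~ (L(x + h*e_i) - L(x - h*e_i)) / (2h)."""
    e = np.zeros_like(x)
    e.flat[idx] = h
    return (loss(x + e) - loss(x - e)) / (2 * h)

def zoo_step(loss, x, lr=0.01, n_coords=8, rng=None):
    """Update a random subset of coordinates per step; querying every
    pixel of an image on every iteration would be far too expensive."""
    rng = np.random.default_rng() if rng is None else rng
    x = x.copy()
    for idx in rng.choice(x.size, size=n_coords, replace=False):
        x.flat[idx] -= lr * coordinate_grad(loss, x, idx)
    return x

# Example: one step on a simple quadratic surrogate loss decreases it.
loss = lambda z: float(np.sum(z ** 2))
x0 = np.ones(16)
x1 = zoo_step(loss, x0, rng=np.random.default_rng(0))
```

Each selected coordinate costs two queries, so the `n_coords` parameter directly trades attack-loss progress per step against query budget.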
  • ZO-Bench/ZO-LLM

    [ICML 2024] Official code for the paper "Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark".

    Language: Python
  • OPTML-Group/DeepZero

    [ICLR'24] "DeepZero: Scaling up Zeroth-Order Optimization for Deep Model Training" by Aochuan Chen*, Yimeng Zhang*, Jinghan Jia, James Diffenderfer, Jiancheng Liu, Konstantinos Parasyris, Yihua Zhang, Zheng Zhang, Bhavya Kailkhura, Sijia Liu

    Language: Python
  • damon-demon/Black-Box-Defense

    Robustify Black-Box Models (ICLR'22 - Spotlight)

    Language: Python
  • caesarcai/ZORO

    Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling

    Language: Python
  • hassaanhashmi/pd_zdpg_plus

    Code for IEEE MLSP 2021 paper titled "Model-Free Learning of Optimal Deterministic Resource Allocations in Wireless Systems via Action-Space Exploration"

    Language: Python
  • caesarcai/SCOBO

    SCOBO: Sparsity-aware Comparison Oracle Based Optimization

    Language: Python
  • optiprofiler/optiprofiler

    Benchmarking optimization solvers.

    Language: Python
  • rebeccadf/Zeroth-order-optimization-methods

    Implementation of the algorithms described in the papers "ZO-AdaMM: Zeroth Order Adaptive Momentum" by Chen et al., "Stochastic first- and zeroth-order methods" by Ghadimi et al., and "SignSGD via zeroth-order oracle" by Liu et al.

    Language: Jupyter Notebook
  • VCL3D/nevergrad

    Nevergrad Optimizer Benchmarking for 3D Performance Capture

    Language: Python
  • blockwise-direct-search/bds

    Blockwise Direct Search

    Language: MATLAB
  • cangcn/NES-HT

    Hard-Thresholding Meets Evolution Strategies in Reinforcement Learning

    Language: Python
  • QiqiDuan257/pop-lsbbo

    A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).

    Language: MATLAB
  • StatNLP/sparse_szo

    Sparse Perturbations for Improved Convergence in Stochastic Zeroth-Order Optimization

    Language: Python
  • libprima/.github

    PRIMA: Reference Implementation for Powell's methods with Modernization and Amelioration

  • qq-me/torchzero

    Zeroth-order optimizers, gradient chaining, random gradient approximation

    Language: Python
  • ZigeW/SODA

    [NeurIPS 2023] "SODA: Robust Training of Test-Time Data Adaptors"

    Language: Python