
pytorCH OPtimize: a library for continuous and constrained optimization built on PyTorch

...with applications to adversarially attacking and training neural networks.


⚠️ This library is in early development; the API may change without notice. The examples will be kept up to date. ⚠️

Stochastic Algorithms

We define stochastic optimizers in the chop.stochastic module. These follow PyTorch Optimizer conventions, similar to the torch.optim module.
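As a quick illustration, here is a minimal training-loop sketch written in the usual torch.optim style. The optimizer name chop.stochastic.PGD, the constraint class chop.constraints.L1Ball, and their argument names are assumptions based on the module layout described above, not a verbatim copy of the API; see the examples directory for working scripts with the exact signatures.

import torch
import chop

# Toy least-squares problem: minimize ||Xw - y||^2 over a constrained w.
torch.manual_seed(0)
X, y = torch.randn(100, 10), torch.randn(100)
w = torch.zeros(10, requires_grad=True)

# Assumed API: a projection-based stochastic optimizer taking a constraint object.
# Check chop.constraints and chop.stochastic for the actual names and signatures.
constraint = chop.constraints.L1Ball(1.)                              # assumed constraint class
optimizer = chop.stochastic.PGD([w], constraint=constraint, lr=0.1)   # assumed signature

for _ in range(50):
    optimizer.zero_grad()
    loss = ((X @ w - y) ** 2).mean()
    loss.backward()
    optimizer.step()   # same zero_grad / backward / step loop as torch.optim

The point of the sketch is the interface: chop's stochastic optimizers are meant to drop into an existing PyTorch training loop, with the constraint handled inside step().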

Full Gradient Algorithms

We also define full-gradient algorithms which operate on a batch of optimization problems in the chop.optim module. These are used for adversarial attacks, using the chop.Adversary wrapper.
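For intuition, here is a sketch of the adversarial-attack use case: a full-gradient solver from chop.optim wrapped in chop.Adversary and run on a batch of inputs. The solver name minimize_pgd, the LinfBall constraint, and the perturb call signature are assumptions for illustration; the examples directory contains complete attack scripts with the real API.

import torch
import chop

model = torch.nn.Linear(784, 10)           # stand-in classifier
criterion = torch.nn.CrossEntropyLoss()
data = torch.randn(8, 784)                 # a batch of inputs
target = torch.randint(0, 10, (8,))

# Constrain perturbations to an L-inf ball (assumed constraint class and radius).
constraint = chop.constraints.LinfBall(8. / 255)

# Wrap an assumed full-gradient solver from chop.optim in the Adversary helper.
# The solver name and the .perturb signature below are assumptions.
adversary = chop.Adversary(chop.optim.minimize_pgd)
_, delta = adversary.perturb(data, target, model, criterion,
                             max_iter=20, constraint=constraint)

adv_logits = model(data + delta)           # evaluate the model on perturbed inputs

Each element of the batch defines its own constrained optimization problem, and the full-gradient solver handles all of them at once, which is what makes this setup convenient for crafting adversarial examples.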

Installing

Run the following:

pip install chop-pytorch

or

pip install git+https://github.com/openopt/chop.git

for the latest development version.

Welcome to chop!

Examples

See the examples directory and our webpage.

Tests

Run the tests with pytest tests.

Citing

If this software is useful to your research, please consider citing it as

@article{chop,
  author       = {Geoffrey Negiar and Fabian Pedregosa},
  title        = {CHOP: continuous optimization built on Pytorch},
  year         = 2020,
  url          = {https://github.com/openopt/chop}
}

Affiliations

Geoffrey Négiar is in the Mahoney lab and the El Ghaoui lab at UC Berkeley.

Fabian Pedregosa is at Google Research.