relu-or-pwl

Evaluate neural networks with ReLU activations as an alternative to piecewise linearization in mixed-integer linear programming (MILP).
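A minimal sketch of the underlying idea, not taken from this repository: a ReLU unit `y = max(0, x)` can be represented exactly in a MILP using one binary variable `z` and big-M constraints (`y >= x`, `y >= 0`, `y <= x + M*(1 - z)`, `y <= M*z`), which is how a trained ReLU network is embedded in an optimization model instead of a piecewise-linear approximation. The brute-force enumeration below checks that, for `|x| <= M`, the only feasible output is `max(0, x)`:

```python
def feasible(x, y, z, M=10.0, eps=1e-9):
    """Big-M MILP encoding of y = max(0, x); z is the binary activation indicator."""
    return (y >= x - eps and y >= -eps            # y >= x, y >= 0
            and y <= x + M * (1 - z) + eps        # if z = 1, forces y <= x
            and y <= M * z + eps)                 # if z = 0, forces y <= 0

def relu_values(x, M=10.0):
    """Enumerate y on a 0.5-step grid; keep values feasible for some z in {0, 1}."""
    grid = [i * 0.5 for i in range(-2 * int(M), 2 * int(M) + 1)]
    return sorted({y for y in grid for z in (0, 1) if feasible(x, y, z, M)})

for x in (-3.0, 0.0, 2.5):
    print(x, relu_values(x))  # the feasible set collapses to {max(0, x)}
```

In a real model the same four constraints are added per neuron to a MILP solver (e.g. via a modeling library), with `M` chosen from known activation bounds; a loose `M` weakens the LP relaxation.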

Primary Language: Jupyter Notebook

This repository is not active