This repository contains code and data for the preprint Learning cubing heuristics for SAT from DRAT proofs (submitted to AITP 2020).
- Clone the repository to `neurocuber-public`.
- Enter a virtual environment with `python>=3.6`. From inside the virtual environment, run `pip3 install -r requirements.txt`.
- Install the pybind11 wrappers around Z3 by cloning https://github.com/Z3Prover/z3 to `neurocuber-public/z3/z3/`. Adjust `CMakeLists.txt` according to your system and install using `python3 setup.py install`.
- For data generation, compile the modified version of `drat-trim` in `tools/` using `gcc`.
- Ensure that `cadical` is installed (https://github.com/arminbiere/cadical).
- Adjust the global variables in `config.py` according to your system.
- Run `bash reproduce_results.sh`. This will take several days; parallelize as required.
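The steps above can be sketched as a single shell session. This is an illustrative sketch, not a script shipped with the repository: the virtualenv name `venv`, the `gcc` flags, and the `drat-trim` source filename are assumptions you may need to adjust.

```shell
# Illustrative setup session, run from the repository root (neurocuber-public).
python3 -m venv venv                 # requires python >= 3.6
. venv/bin/activate
pip3 install -r requirements.txt

# pybind11 wrappers around Z3
git clone https://github.com/Z3Prover/z3 z3/z3
# edit z3/z3/CMakeLists.txt for your system, then:
(cd z3/z3 && python3 setup.py install)

# modified drat-trim for data generation (source filename assumed)
gcc -O2 -o tools/drat-trim tools/drat-trim.c

bash reproduce_results.sh            # takes several days; parallelize as needed
```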
Follow the same steps as above, but comment out the first two lines in `reproduce_results.sh`.
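If you prefer to script that edit rather than open the file by hand, GNU sed can comment out the first two lines in place (a convenience one-liner, not part of the repository):

```shell
# Prefix '#' to the first two lines of reproduce_results.sh (GNU sed, in-place edit)
sed -i '1,2 s/^/#/' reproduce_results.sh
```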
The folders `plot_data/top1/`, `plot_data/top3/`, and `plot_data/playout/` contain the data reported in the preprint, in the form of CSVs extracted from TensorBoard summaries. Enter the virtual environment and run `python data2.py --all` from inside `plot_data/` to reproduce all plots and tables.
Our architecture (`neurocuber.py`) is a re-implementation in TensorFlow 2 of the simplified NeuroSAT architecture used by Selsam and Bjørner in NeuroCore. There was also preliminary work by Selsam (here) on applying the same architecture to variable branching for cube-and-conquer, using importance sampling of paths through the DPLL search tree.