Aletheia

A Python package for unwrapping ReLU Neural Networks

Installation

The following Python versions and packages are required:

  • Python 3.6, 3.7, 3.8, 3.9 (Try Google Colab)
  • matplotlib>=3.1.3
  • numpy>=1.17
  • pandas>=1.1.2
  • seaborn>=0.9.0
  • scikit-learn>=0.23.0
  • statsmodels>=0.12.2
Then install the package via pip:

pip install aletheia-dnn

Usage

Load data

import numpy as np 
import pandas as pd 
import matplotlib.pyplot as plt
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split

random_state = 0

x, y = make_circles(n_samples=2000, noise=0.1, random_state=random_state)
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.2, random_state=random_state)

plt.figure(figsize=(10,8))
scatter = plt.scatter(x[:, 0], x[:, 1], c=y)
plt.legend(*scatter.legend_elements(), loc="upper right")
plt.show()

Train a ReLU Net

A ReLU MLPClassifier with four hidden layers of 40 nodes each is trained with scikit-learn; its fitted weights (mlp.coefs_) and biases (mlp.intercepts_) are what Aletheia unwraps.

from sklearn.neural_network import MLPClassifier
mlp = MLPClassifier(hidden_layer_sizes=[40] * 4, max_iter=2000, early_stopping=True, 
                    n_iter_no_change=100, validation_fraction=0.2,
                    solver='adam', activation="relu", random_state=random_state, 
                    learning_rate_init=0.001)
mlp.fit(train_x, train_y)
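
It is worth confirming that the network actually fits the data before unwrapping it; a quick check with the standard scikit-learn score method:

# Sanity check: accuracy of the trained network on the train/test splits.
print("Train accuracy:", mlp.score(train_x, train_y))
print("Test accuracy: ", mlp.score(test_x, test_y))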

UnwrapperClassifier

UnwrapperClassifier takes the fitted weights and biases and unwraps the network into its local linear models, one per activation region; summary() gives an overview of the extracted models.

from aletheia import *
clf = UnwrapperClassifier(mlp.coefs_, mlp.intercepts_)
clf.fit(train_x, train_y)
clf.summary()

[Figure: CoCircleSummary]
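
Under the hood, a ReLU network is piecewise linear: each input switches on a particular pattern of hidden units, and within one pattern the network is exactly linear. The sketch below is not part of the aletheia API; it simply computes such an activation pattern directly from the trained weights to make the idea concrete.

import numpy as np

def activation_pattern(x, coefs, intercepts):
    # Forward pass through the hidden layers, recording which ReLUs fire.
    pattern = []
    h = np.asarray(x, dtype=float)
    for W, b in zip(coefs[:-1], intercepts[:-1]):  # hidden layers only
        z = h @ W + b
        pattern.append((z > 0).astype(int))
        h = np.maximum(z, 0)
    return np.concatenate(pattern)

# Inputs sharing the same pattern lie in the same linear region of the network.
print(activation_pattern(train_x[0], mlp.coefs_, mlp.intercepts_))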

Partitioned regions

For 2D inputs, the partition of the input space into activation regions can be plotted directly:

clf.visualize2D_regions(figsize=(8, 8), meshsize=300, show_label=False)

[Figure: CoCircleRegions]
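
As a rough cross-check (reusing the activation_pattern sketch above, not the package's own bookkeeping), the number of distinct linear regions actually visited by the training data can be counted directly:

patterns = {tuple(activation_pattern(x, mlp.coefs_, mlp.intercepts_)) for x in train_x}
print(len(patterns), "activation regions hit by the training data")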

Simplification

MergerClassifier simplifies the unwrapped network by merging similar local linear models into n_clusters groups and refitting a simple model (here logistic regression) within each group; n_clusters is tuned by grid search on a predefined validation split.

from sklearn.metrics import make_scorer, roc_auc_score
from sklearn.model_selection import GridSearchCV, PredefinedSplit
from sklearn.linear_model import LogisticRegressionCV, LogisticRegression

# Build a single train/validation split for GridSearchCV: samples marked -1 stay
# in the training set; the rest form the validation fold.
datanum = train_x.shape[0]
indices = np.arange(datanum)
idx1, idx2 = train_test_split(indices, test_size=0.2, random_state=random_state)
val_fold = np.ones(len(indices))
val_fold[idx1] = -1

grid = GridSearchCV(MergerClassifier(unwrapper=None,
                                     weights=mlp.coefs_,
                                     biases=mlp.intercepts_,
                                     min_samples=30,
                                     n_neighbors=np.round(clf.nllms * 0.01).astype(int),
                                     refit_model=LogisticRegression()),
                    param_grid={"n_clusters": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 15, 20]},
                    scoring={"auc": make_scorer(roc_auc_score, needs_proba=True)},
                    cv=PredefinedSplit(val_fold), refit="auc", n_jobs=10, error_score=np.nan)
grid.fit(train_x, train_y)
clf_merge = grid.best_estimator_
clf_merge.summary()
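
The grid search above selects n_clusters by validation AUC, so the same metric gives a quick check of how much (if any) performance the simplification costs on the test set. The scorer already relies on predict_proba, so the same call should work here:

# Compare the original network and the merged (simplified) model on held-out data.
from sklearn.metrics import roc_auc_score

print("Original MLP test AUC:", roc_auc_score(test_y, mlp.predict_proba(test_x)[:, 1]))
print("Merged model test AUC:", roc_auc_score(test_y, clf_merge.predict_proba(test_x)[:, 1]))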

Local Inference

For a single sample (indexed by tmpid), the local linear model containing it can be visualized and Wald-type inference applied to its coefficients:

tmpid = 0
clf_merge.visualize2D_one_line(tmpid, figsize=(8, 8))
clf_merge.local_inference_wald(tmpid).round(4)

Citations

Agus Sudjianto, William Knauth, Rahul Singh, Zebin Yang and Aijun Zhang. 2020. Unwrapping The Black Box of Deep ReLU Networks: Interpretability, Diagnostics, and Simplification. arXiv:2011.04041

@article{sudjianto2020unwrapping,
  title={Unwrapping The Black Box of Deep ReLU Networks: Interpretability, Diagnostics, and Simplification},
  author={Sudjianto, Agus and Knauth, William and Singh, Rahul and Yang, Zebin and Zhang, Aijun},
  journal={arXiv preprint arXiv:2011.04041},
  year={2020}
}