My_investigate

XAI approaches based on the TensorFlow framework for understanding neural network decisions


My iNNvestigate

iNNvestigate can be installed with the following commands. The library is based on Keras and therefore requires a supported Keras backend (currently only the TensorFlow backend is supported; we test with Python 3.6, TensorFlow 1.12, and CUDA 9.x):

pip install innvestigate
# Installing Keras backend
pip install [tensorflow | theano | cntk]

To use the example scripts and notebooks, one additionally needs to install the matplotlib package:

pip install matplotlib

The library's tests can be executed via:

git clone https://github.com/albermax/innvestigate.git
cd innvestigate
python setup.py test

Usage and Examples

The iNNvestigate library contains implementations for the following methods (a minimal usage sketch follows this list):

  • function:
    • gradient: The gradient of the output neuron with respect to the input.
    • smoothgrad: SmoothGrad averages the gradient over a number of copies of the input with added noise.
  • signal:
    • deconvnet: DeConvNet applies a ReLU in the gradient computation instead of the gradient of a ReLU.
    • guided: Guided BackProp applies a ReLU in the gradient computation in addition to the gradient of a ReLU.
    • pattern.net: PatternNet estimates the input signal of the output neuron.
  • attribution:
    • input_t_gradient: Input * Gradient
    • deep_taylor[.bounded]: DeepTaylor computes for each neuron a root point that is close to the input but whose output value is 0, and uses this difference to estimate the attribution of each neuron recursively.
    • pattern.attribution: PatternAttribution applies Deep Taylor by searching root points along the signal direction of each neuron.
    • lrp.*: LRP recursively attributes to each neuron's inputs relevance proportional to their contribution to the neuron's output.
    • integrated_gradients: IntegratedGradients integrates the gradient along a path from the input to a reference.
    • deeplift.wrapper: DeepLIFT (wrapper around original code, slower) computes a backpropagation based on "finite" gradients.
  • miscellaneous:
    • input: Returns the input.
    • random: Returns random Gaussian noise.
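
A minimal usage sketch of how these analyzers are created and applied, based on the iNNvestigate 1.x API (innvestigate.create_analyzer, analyzer.analyze, innvestigate.utils.model_wo_softmax); the small Keras model and the random input batch are placeholders for your own trained model and data:

import numpy as np
import keras
import innvestigate
import innvestigate.utils

# Stand-in model and inputs; replace with your own trained model and data.
model = keras.models.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(784,)),
    keras.layers.Dense(10, activation="softmax"),
])
x = np.random.rand(4, 784)

# Strip the softmax so the analyzer explains the pre-softmax output neuron.
model_wo_softmax = innvestigate.utils.model_wo_softmax(model)

# Create an analyzer by one of the method names above,
# e.g. "gradient", "smoothgrad", "deep_taylor", "lrp.z".
analyzer = innvestigate.create_analyzer("gradient", model_wo_softmax)

# Analyze a batch of inputs; the result has the same shape as the input.
analysis = analyzer.analyze(x)
print(analysis.shape)  # (4, 784)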

Visualization
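
The example figures below show the analyzers' outputs rendered as heatmaps. Here is a minimal plotting sketch with matplotlib, assuming an analysis array as returned by analyzer.analyze in the usage sketch above; the plot_heatmap helper and the 28x28 MNIST shape are illustrative assumptions, not part of the library API.

import numpy as np
import matplotlib.pyplot as plt

def plot_heatmap(analysis_row, shape=(28, 28)):
    # Reshape a single analysis result into image form (sum over the channel
    # axis first if your inputs have one) and plot it with a diverging
    # colormap centered at zero, so positive and negative relevance stand out.
    heatmap = analysis_row.reshape(shape)
    vmax = np.abs(heatmap).max()
    plt.imshow(heatmap, cmap="seismic", vmin=-vmax, vmax=vmax)
    plt.axis("off")
    plt.show()

# For example, with the analysis computed in the usage sketch above:
plot_heatmap(analysis[0])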

MNIST examples


Brain examples


Counterfactual examples
