pytorch/captum

DeepLift on Transformers Based Text Classifiers


❓ Questions and Help

I see from https://captum.ai/api/deep_lift.html that DeepLift supports a limited number of non-linear activations.
My use case is explaining BERT / RoBERTa-based text classifiers.
What is not clear to me is whether I can apply DeepLift as-is, or whether I have to write some wrapper / custom functions.
As it stands, it behaves a bit oddly: the returned contributions do not always make sense, especially when I compare them with those from Integrated Gradients (IG).

Thanks a lot! Great package :)