Integrated Gradients for Llama2
oscarwzt opened this issue · 0 comments
oscarwzt commented
Hi,
I would like to ask how to use Integrated Gradients with Llama2. I tried:

```python
from captum.attr import IntegratedGradients, LLMAttribution

ig = IntegratedGradients(model)
llm_ig = LLMAttribution(ig, tokenizer)
```
but I get the error "AssertionError: LLMAttribution does not support <class 'captum.attr._core.integrated_gradients.IntegratedGradients'>", and the same error occurs when using LLMGradientAttribution.
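For what it's worth, a perturbation-based method does get past that assertion for me. Here is a rough sketch, assuming the HuggingFace Llama2 model and tokenizer are already loaded; FeatureAblation is just an example of a method LLMAttribution accepts, and the prompt/target strings are made up:

```python
from captum.attr import FeatureAblation, LLMAttribution, TextTokenInput

# Perturbation-based attribution method, wrapped for text input/output
fa = FeatureAblation(model)
llm_fa = LLMAttribution(fa, tokenizer)

# Attribute the target continuation back to the prompt tokens
inp = TextTokenInput("The movie was surprisingly", tokenizer)
res = llm_fa.attribute(inp, target=" good")
print(res.seq_attr)  # one attribution score per prompt token
```

But I specifically want a gradient-based method like Integrated Gradients, which is why I am asking about the setup below.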
From the Bert IG tutorial I tried LayerIntegratedGradients, and it works fine if I set the layer to the embedding layer (rough sketch below). However, I'm more interested in understanding the other layers, for example one of the attention layers, model.layers[30].self_attn, but that causes another error that I don't understand at all (I will post a separate question).
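Roughly what I have working with the embedding layer looks like this (a minimal sketch; the checkpoint name, prompt, and target are placeholders, and model.model.embed_tokens is the embedding module of the HuggingFace LlamaForCausalLM I'm using):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from captum.attr import LayerIntegratedGradients, LLMGradientAttribution, TextTokenInput

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Layer IG over the token embedding layer, wrapped for text input/output
lig = LayerIntegratedGradients(model, model.model.embed_tokens)
llm_lig = LLMGradientAttribution(lig, tokenizer)

inp = TextTokenInput("The movie was surprisingly", tokenizer)
res = llm_lig.attribute(inp, target=" good")
print(res.seq_attr)    # attribution per prompt token
print(res.token_attr)  # attribution per generated target token
```

Swapping the embedding layer for an attention layer in the LayerIntegratedGradients call is where I hit the second error.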
Thank you!