samiraabnar/attention_flow

Sample for Table 1

Opened this issue · 1 comment

Dear authors,

Thank you for your amazing work! I find the idea of attention flow very intriguing and would like to compute two additional rows of Table 1 from your paper, using LRP and the method from this paper, which compares against rollout (but not attention flow) using different metrics.

Could you please point me to the 2000 samples you used for computing Table 1? Also, it looks like a `util` module is missing, which prevents me from running run_bert_sv.py; would you mind releasing it, please?

Thank you!

Hi,

Thanks for your kind words. I really appreciate that.

About the examples used for Table 1: they are simply the first 2000 examples of the dataset's test set, in stored order (I read the data without shuffling and stop the computation after processing the first 2000 examples).
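The sampling described above can be sketched as follows. This is a minimal illustration, not the authors' actual code: `dataset` stands in for the unshuffled test split, and the function name is hypothetical.

```python
from itertools import islice

def first_n_examples(dataset, n=2000):
    """Take the first n examples from an iterable dataset in its stored
    order (no shuffling), as described above. `dataset` can be any
    iterable; in the paper's setting it would be the test split read
    without shuffling (names here are illustrative)."""
    return list(islice(dataset, n))

# Stand-in dataset of 5000 dummy examples:
samples = first_n_examples(range(5000), n=2000)
print(len(samples), samples[0], samples[-1])
```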
And thanks for noticing the missing util directory. I have added some of the missing utils to the repo, but to run run_bert_sv.py you need access to tasks and models that are not part of this project.

I am going to import everything needed so that all the experiments from our paper are easily replicable, but until then you can find everything related to the specific models and tasks I used in this other repository:
https://github.com/samiraabnar/Reflect
(You should look into the util, tasks, tf2_models, and tfds_data directories.)

Cheers