Issue with Attention Visualization
Arkkienkeli opened this issue · 2 comments
Arkkienkeli commented
Hello, thank you for the work that you published, very exciting!
I am experimenting with the code from the Attention Visualization notebook, and I get the following error when running create_hierarchical_heatmaps_indiv:
Traceback (most recent call last):
File "hipt4kinference_attentionvisualization.py", line 28, in <module>
create_hierarchical_heatmaps_indiv(region, model256, model4k,
File "/storage01/nikitam/HIPT/HIPT_4K/hipt_heatmap_utils.py", line 388, in create_hierarchical_heatmaps_indiv
score256_1 = concat_scores256(a256_1[:,i,:,:], size=(s//16,)*2)
TypeError: concat_scores256() missing 2 required positional arguments: 'w_256' and 'h_256'
Thank you!
Richarizardd commented
Hi @Arkkienkeli - you should be able to fix this by setting w_256 and h_256 to 16.
Anivader commented
(w_256, h_256) is basically (W/patch_size, H/patch_size), where W and H are 4096 and the patch size is 256. So you get 16 for both.
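To make the arithmetic above concrete, here is a minimal sketch. It only derives w_256 and h_256 from the region and patch sizes; the commented-out call at the end shows where the values would be passed into concat_scores256 from hipt_heatmap_utils.py (not executed here, since it requires the HIPT repo).

```python
# Deriving w_256 and h_256 for a 4096 x 4096 region, as explained above.
W, H = 4096, 4096        # width and height of the 4K region in pixels
patch_size = 256         # size of each ViT-256 patch within the region

w_256 = W // patch_size  # number of 256-px patches along the width -> 16
h_256 = H // patch_size  # number of 256-px patches along the height -> 16

print(w_256, h_256)      # prints: 16 16

# The failing call from the traceback would then become something like:
# score256_1 = concat_scores256(a256_1[:, i, :, :], size=(s // 16,) * 2,
#                               w_256=w_256, h_256=h_256)
```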