tech-srl/how_attentive_are_gats
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
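The paper's central observation is that the original GAT computes only "static" attention, because the LeakyReLU is applied *after* the dot product with the attention vector, while GATv2 applies it *before*, yielding strictly more expressive "dynamic" attention. Below is a minimal plain-Python sketch of the two scoring functions; all feature values and names are illustrative, and this is a simplification (it treats the GATv2 weight as block-diagonal), not the repository's actual implementation:

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x >= 0.0 else slope * x

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Toy transformed features Wh for a center node i and two neighbors
# (illustrative numbers, not from the repo).
wh_i  = [1.0, -0.5]
wh_j1 = [0.3, 0.8]
wh_j2 = [-1.0, 0.2]
a = [0.5, -0.25, 0.9, 0.1]  # attention vector over the concatenated pair

def gat_score(wh_i, wh_j):
    # GAT: e(i, j) = LeakyReLU(a · [Wh_i || Wh_j]); the nonlinearity comes
    # *after* the dot product, which makes the attention ranking static.
    z = wh_i + wh_j  # list concatenation = feature concatenation
    return leaky_relu(sum(ak * zk for ak, zk in zip(a, z)))

def gatv2_score(wh_i, wh_j):
    # GATv2: e(i, j) = a · LeakyReLU(W[h_i || h_j]); the nonlinearity comes
    # *before* the dot product, which makes the attention dynamic.
    z = [leaky_relu(zk) for zk in wh_i + wh_j]
    return sum(ak * zk for ak, zk in zip(a, z))

# Normalized attention coefficients over the two neighbors, per model.
alpha_gat = softmax([gat_score(wh_i, wh_j1), gat_score(wh_i, wh_j2)])
alpha_v2  = softmax([gatv2_score(wh_i, wh_j1), gatv2_score(wh_i, wh_j2)])
print("GAT  :", alpha_gat)
print("GATv2:", alpha_v2)
```

In both cases the scores are normalized with a softmax over the neighborhood; the only change GATv2 makes is the order of the nonlinearity and the dot product.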