attention-bias

There is 1 repository under the attention-bias topic.

  • LSAS

    ICME'23. The lightweight sub-attention strategy (LSAS) uses high-order sub-attention modules to improve the original self-attention modules.

    Language: Python
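
The idea of refining standard self-attention with an additional, higher-order attention pass can be sketched as follows. This is only an illustrative sketch, not the actual LSAS implementation: the function names (`self_attention`, `sub_attention_refine`) and the residual-correction scheme are assumptions made for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    # Standard scaled dot-product self-attention over a (seq_len, dim) input.
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def sub_attention_refine(x, wq, wk, wv):
    # Hypothetical higher-order refinement: apply attention a second time
    # to the first pass's output and add it back as a residual correction.
    y = self_attention(x, wq, wk, wv)
    y2 = self_attention(y, wq, wk, wv)  # second-order ("sub") attention pass
    return y + y2

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))                       # 4 tokens, dim 8
wq, wk, wv = (rng.standard_normal((8, 8)) * 0.1 for _ in range(3))
out = sub_attention_refine(x, wq, wk, wv)             # shape (4, 8)
```

For the real design, see the LSAS repository and the ICME'23 paper.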