# PyTorch Multi-Head Attention

## Install

```bash
pip install torch-multi-head-attention
```

## Usage

```python
from torch_multi_head_attention import MultiHeadAttention

MultiHeadAttention(in_features=768, head_num=12)
```
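For a sense of how the layer fits into a model, here is a minimal self-attention sketch. The constructor arguments come from the snippet above; the call signature `attention(q, k, v)` and the input shape `(batch, seq_len, in_features)` are assumptions based on standard multi-head attention conventions, so check the package source to confirm.

```python
import torch
from torch_multi_head_attention import MultiHeadAttention

# in_features must split evenly across heads: 768 / 12 = 64 features per head.
attention = MultiHeadAttention(in_features=768, head_num=12)

# Hypothetical input: a batch of 2 sequences, 16 tokens, 768 features each.
x = torch.randn(2, 16, 768)

# Self-attention: pass the same tensor as query, key, and value.
# (Assumed signature; attention layers are conventionally called as layer(q, k, v).)
y = attention(x, x, x)
print(y.shape)  # expected: torch.Size([2, 16, 768])
```

Because the layer projects back to `in_features`, the output shape matches the input, so the module can be dropped into a residual block without reshaping.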