multi-head-self-attention
There are 7 repositories under the multi-head-self-attention topic; a minimal sketch of the mechanism itself follows the list.
Altaheri/EEG-ATCNet
Attention temporal convolutional network for EEG-based motor imagery classification
TatevKaren/BabyGPT-Build_GPT_From_Scratch
BabyGPT: build your own GPT large language model from scratch, with a step-by-step guide to pre-training generative transformer models in PyTorch and Python
datnnt1997/multi-head_self-attention
A faster PyTorch implementation of multi-head self-attention
cyk1337/Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
ChristianOrr/transformers
Transformer built from scratch using JAX
duongttr/HyPepTox-Fuse
Official implementation of the HyPepTox-Fuse framework
naivoder/AttentionIsAllYouNeed
PyTorch implementation of transformers with multi-head self-attention
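
None of the descriptions above spell out the mechanism itself, so here is a minimal, self-contained sketch of multi-head self-attention in PyTorch. It is not drawn from any of the listed repositories; the class name, default dimensions, and fused QKV projection are illustrative assumptions.

```python
# Minimal multi-head self-attention sketch (illustrative, not from any repo above).
import math
import torch
import torch.nn as nn


class MultiHeadSelfAttention(nn.Module):
    """Scaled dot-product attention split across several heads."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One fused projection for queries, keys, and values.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape
        # Project once, then split into q, k, v, each of shape
        # (batch, num_heads, seq_len, head_dim).
        qkv = self.qkv(x).reshape(batch, seq_len, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)
        # Scaled dot-product attention, computed independently per head.
        scores = (q @ k.transpose(-2, -1)) / math.sqrt(self.head_dim)
        weights = scores.softmax(dim=-1)
        # Concatenate heads back into (batch, seq_len, embed_dim) and project out.
        context = (weights @ v).transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out(context)


if __name__ == "__main__":
    attn = MultiHeadSelfAttention(embed_dim=64, num_heads=8)
    tokens = torch.randn(2, 10, 64)   # (batch, seq_len, embed_dim)
    print(attn(tokens).shape)         # torch.Size([2, 10, 64])
```

Fusing the Q, K, and V projections into a single linear layer is a common efficiency choice; three separate `nn.Linear` layers would be mathematically equivalent.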