attention-mechanism

There are 1,568 repositories under the attention-mechanism topic.

  • SimGNN

    A PyTorch implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation" (WSDM 2019).

    Language: Python · ★ 749
  • keras-attention

    Visualizing RNNs using the attention mechanism

    Language: Python · ★ 747
  • OpenSTL

    OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning

    Language: Python · ★ 726
  • Deeplearning.ai-Natural-Language-Processing-Specialization

    This repository contains my full work and notes from Coursera's Natural Language Processing (NLP) Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai

    Language: Jupyter Notebook · ★ 719
  • TimeSformer-pytorch

    Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification

    Language: Python · ★ 688
  • linear-attention-transformer

    Transformer based on a variant of attention with linear complexity with respect to sequence length

    Language: Python · ★ 676
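The entry above describes the core idea behind linear attention: replace the softmax with a positive feature map φ so the product can be reassociated as φ(Q)(φ(K)ᵀV), which costs O(n) in sequence length instead of O(n²). A minimal NumPy sketch of that reassociation (single head, no masking; the elu(x)+1 feature map follows Katharopoulos et al. and is not necessarily this repo's exact variant):

```python
import numpy as np

def linear_attention(q, k, v, eps=1e-6):
    """O(n) attention: reassociate (phi(Q) phi(K)^T) V as phi(Q) (phi(K)^T V)."""
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1, a positive feature map
    q, k = phi(q), phi(k)
    kv = k.T @ v                    # (d, d_v) summary, independent of sequence length
    z = q @ k.sum(axis=0)           # per-query normalizer, shape (n,)
    return (q @ kv) / (z[:, None] + eps)

rng = np.random.default_rng(0)
n, d = 8, 4
q, k, v = rng.normal(size=(3, n, d))
out = linear_attention(q, k, v)
```

Because `kv` and the normalizer are fixed-size summaries of the keys and values, doubling the sequence length only doubles the work, rather than quadrupling it as in softmax attention.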
  • bottleneck-transformer-pytorch

    Implementation of Bottleneck Transformer in Pytorch

    Language: Python · ★ 668
  • MIRNet

    [ECCV 2020] Learning Enriched Features for Real Image Restoration and Enhancement. SOTA results for image denoising, super-resolution, and image enhancement.

    Language: Python · ★ 661
  • keras-self-attention

    Attention mechanism for processing sequential data that considers the context for each timestamp.

    Language: Python · ★ 655
  • Transformer-TTS

    A Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"

    Language: Python · ★ 651
  • GeoTransformer

    [CVPR2022] Geometric Transformer for Fast and Robust Point Cloud Registration

    Language: Python · ★ 635
  • MORAN_v2

    MORAN: A Multi-Object Rectified Attention Network for Scene Text Recognition

    Language: Python · ★ 631
  • memorizing-transformers-pytorch

    Implementation of Memorizing Transformers (ICLR 2022), attention net augmented with indexing and retrieval of memories using approximate nearest neighbors, in Pytorch

    Language: Python · ★ 623
  • neural_sp

    End-to-end ASR/LM implementation with PyTorch

    Language: Python · ★ 593
  • point-transformer-pytorch

    Implementation of the Point Transformer layer, in Pytorch

    Language: Python · ★ 587
  • transformer

    A Pytorch Implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation"

    Language: Python · ★ 546
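Since the canonical reference for this entry is "Attention Is All You Need", a minimal NumPy sketch of its central operation, scaled dot-product attention, may be useful (single head, no masking; shapes and names are illustrative and not taken from this repo):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (n_q, n_k) similarity logits
    scores -= scores.max(axis=-1, keepdims=True)    # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ v, weights

rng = np.random.default_rng(1)
q, k, v = rng.normal(size=(3, 5, 8))                # 5 positions, model dim 8
out, w = scaled_dot_product_attention(q, k, v)
```

The 1/√d_k scaling keeps the logits in a range where the softmax retains usable gradients as the model dimension grows.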
  • nuwa-pytorch

    Implementation of NÜWA, state of the art attention network for text to video synthesis, in Pytorch

    Language: Python · ★ 540
  • nmt-keras

    Neural Machine Translation with Keras

    Language: Python · ★ 532
  • parti-pytorch

    Implementation of Parti, Google's pure attention-based text-to-image neural network, in Pytorch

    Language: Python · ★ 522
  • Aini_Modules

    A PyTorch Computer Vision (CV) module library for building n-D networks flexibly ~

    Language: Python · ★ 500
  • OverlapPredator

    [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.

    Language: Python · ★ 498
  • Structured-Self-Attention

    A Structured Self-attentive Sentence Embedding

    Language: Python · ★ 494
  • YOLO-Multi-Backbones-Attention

    Model compression: YOLOv3 with multiple lightweight backbones (ShuffleNetV2, Huawei GhostNet), attention, pruning, and quantization

    Language: Python · ★ 493
  • keras-gat

    Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)

    Language: Python · ★ 473
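A GAT layer as in Veličković et al. scores each edge (i, j) as e_ij = LeakyReLU(aᵀ[Wh_i ‖ Wh_j]), normalizes the scores with a softmax over each node's neighbors, and aggregates the projected neighbor features with the resulting weights. A dense single-head NumPy sketch of that computation (illustrative shapes only; the actual Keras layer additionally handles batching, sparse adjacency, and multiple heads):

```python
import numpy as np

def gat_layer(h, adj, W, a, slope=0.2):
    """Single-head graph attention (Velickovic et al., 2017), dense sketch."""
    z = h @ W                                    # project node features, (n, d_out)
    n = z.shape[0]
    # e[i, j] = LeakyReLU(a^T [z_i || z_j]) for every ordered pair of nodes
    cat = np.concatenate([np.repeat(z, n, axis=0), np.tile(z, (n, 1))], axis=1)
    e = (cat @ a).reshape(n, n)
    e = np.where(e > 0, e, slope * e)            # LeakyReLU
    e = np.where(adj > 0, e, -1e9)               # mask non-edges before softmax
    e -= e.max(axis=-1, keepdims=True)
    att = np.exp(e)
    att /= att.sum(axis=-1, keepdims=True)       # softmax over each node's neighbors
    return att @ z, att

rng = np.random.default_rng(2)
n, d_in, d_out = 4, 3, 2
h = rng.normal(size=(n, d_in))
adj = np.eye(n) + (rng.random((n, n)) > 0.5)     # self-loops plus random edges
W = rng.normal(size=(d_in, d_out))
a = rng.normal(size=2 * d_out)
out, att = gat_layer(h, adj, W, a)
```

Masking non-edges with a large negative value before the softmax is what restricts attention to each node's graph neighborhood.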
  • PaperRobot

    Code for PaperRobot: Incremental Draft Generation of Scientific Ideas

    Language: Python · ★ 471
  • LaMDA-rlhf-pytorch

    Open-source pre-training implementation of Google's LaMDA in PyTorch. Adding RLHF similar to ChatGPT.

    Language: Python · ★ 462
  • CaraNet

    Context Axial Reverse Attention Network for Small Medical Objects Segmentation

    Language: Python · ★ 461
  • Multi-Scale-Attention

    Code for our paper "Multi-scale Guided Attention for Medical Image Segmentation"

    Language: Python · ★ 460
  • ring-attention-pytorch

    Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in Pytorch

    Language: Python · ★ 455
  • MultiModalMamba

    A novel implementation fusing ViT with Mamba into a fast, agile, high-performance multi-modal model. Powered by Zeta, the simplest AI framework ever.

    Language: Python · ★ 431
  • ChangeFormer

    [IGARSS'22]: A Transformer-Based Siamese Network for Change Detection

    Language: Python · ★ 421
  • DA-RNN

    📃 Unofficial PyTorch Implementation of DA-RNN (arXiv:1704.02971)

    Language: Jupyter Notebook · ★ 418
  • enformer-pytorch

    Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch

    Language: Python · ★ 416
  • STANet

    Official implementation of the spatial-temporal attention neural network (STANet) for remote sensing image change detection

    Language: Python · ★ 406
  • linformer-pytorch

    My take on a practical implementation of Linformer for Pytorch.

    Language: Python · ★ 403
  • triplet-attention

    Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]

    Language: Jupyter Notebook · ★ 400