ajppp's Repositories
ajppp/2001-lab1
ajppp/4042_experiments
ajppp/adapt-mnmt
Dynamic Transfer Learning for Low-Resource Neural Machine Translation
ajppp/anfis-pytorch
Implementation of ANFIS using the PyTorch framework
ajppp/cz2002-assignment
Lab Assignment for CZ2002, AY20/21 Semester 1
ajppp/CZ2002-Lab
Lab Exercises for CZ2002 2020
ajppp/cz2006-grocery-app
Coursework for CZ2006: Software Engineering, completed in AY20/21 Semester 2
ajppp/cz3005-lab
Lab Submissions for CZ3005: Artificial Intelligence, AY20/21 Semester 2
ajppp/cz4045-project2
Source Code for Project 2 of CZ4045
ajppp/dotfiles
All my dotfiles
ajppp/huggingface-experiments
Scripts for training translation models for transfer learning using Hugging Face
ajppp/dace
DaCe - Data Centric Parallel Programming
ajppp/do-you-even-need-attention
Exploring whether attention is necessary for vision transformers
ajppp/pyqsp
Python quantum signal processing
ajppp/pytorch-a3c
PyTorch implementation of Asynchronous Advantage Actor Critic (A3C) from "Asynchronous Methods for Deep Reinforcement Learning".
ajppp/QSPPACK
A toolbox for solving phase factors in quantum signal processing.
ajppp/qsvt_experiments
Experiments with the quantum singular value transformation (QSVT)
ajppp/the-story-of-heads
This is a repository with the code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the ACL 2021 paper "Analyzing Source and Target Contributions to NMT Predictions".
ajppp/xattn-transfer-for-mt
Code and data to accompany the camera-ready version of "Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation" (EMNLP 2021)