Pinned Repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
DIS
CS-423 Distributed Information Systems
AICC-I
CS-101 Advanced information, computation, communication I
auto-encoder-based-transformer-compression
[EACL 2023] Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models
INLP_neural_practical_session
Text Classification practical session (EPFL CS-431: Introduction to NLP course)
LoRA-XS
LoRA-XS: Low-Rank Adaptation with Extremely Small Number of Parameters
multilingual-code-switched-reasoning
[EMNLP 2023 - Findings] Breaking the Language Barrier: Improving Cross-Lingual Reasoning with Structured Self-Attention
orientation_based_embedding_compression
Compressing token embeddings for transformer-based language models using a novel orientation-based loss objective (accepted to RepL4NLP workshop at ACL 2021)
peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
MohammadrezaBanaei's Repositories
MohammadrezaBanaei/LoRA-XS
LoRA-XS: Low-Rank Adaptation with Extremely Small Number of Parameters
MohammadrezaBanaei/orientation_based_embedding_compression
Compressing token embeddings for transformer-based language models using a novel orientation-based loss objective (accepted to RepL4NLP workshop at ACL 2021)
MohammadrezaBanaei/auto-encoder-based-transformer-compression
[EACL 2023] Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models
MohammadrezaBanaei/INLP_neural_practical_session
Text Classification practical session (EPFL CS-431: Introduction to NLP course)
MohammadrezaBanaei/AICC-I
CS-101 Advanced information, computation, communication I
MohammadrezaBanaei/multilingual-code-switched-reasoning
[EMNLP 2023 - Findings] Breaking the Language Barrier: Improving Cross-Lingual Reasoning with Structured Self-Attention
MohammadrezaBanaei/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
MohammadrezaBanaei/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.