slowbull's Stars
donnemartin/system-design-primer
Learn how to design large-scale systems. Prep for the system design interview. Includes Anki flashcards.
CyC2018/CS-Notes
:books: Essential fundamentals for technical interviews: Leetcode, operating systems, computer networks, and system design
microsoft/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
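Not from the repo itself, but a minimal sketch of the typical DeepSpeed training loop: `deepspeed.initialize` wraps a toy model into an engine that manages distributed data parallelism and optimization. The config values and dummy data are illustrative, and a real run is normally started with the `deepspeed` launcher.

```python
import torch
import deepspeed

# Toy model and illustrative config; a real run is launched with the
# `deepspeed` launcher so the distributed backend gets initialized.
model = torch.nn.Linear(128, 10)
ds_config = {
    "train_batch_size": 32,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
}

model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

x = torch.randn(32, 128).to(model_engine.device)
y = torch.randint(0, 10, (32,)).to(model_engine.device)

loss = torch.nn.functional.cross_entropy(model_engine(x), y)
model_engine.backward(loss)   # engine-managed backward (scaling, accumulation)
model_engine.step()           # optimizer step and gradient zeroing
```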
facebookresearch/fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
mozilla/DeepSpeech
DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Raspberry Pi 4 to high-power GPU servers.
facebookresearch/ParlAI
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
flashlight/wav2letter
Facebook AI Research's Automatic Speech Recognition Toolkit
TensorSpeech/TensorFlowTTS
:stuck_out_tongue_closed_eyes: TensorFlowTTS: Real-time state-of-the-art speech synthesis for TensorFlow 2 (supports English, French, Korean, Chinese, and German; easy to adapt to other languages)
tensorflow/lingvo
Lingvo: a framework for building neural networks in TensorFlow, particularly sequence models
Tencent/PocketFlow
An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications.
mravanelli/pytorch-kaldi
pytorch-kaldi is a project for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by PyTorch, while feature extraction, label computation, and decoding are performed with the Kaldi toolkit.
HobbitLong/RepDistiller
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
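A minimal sketch (not the repo's code) of the classic soft-target distillation loss that such benchmarks compare against; the temperature and mixing weight are illustrative.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target distillation (Hinton-style) mixed with standard cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                                   # rescale gradient magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```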
mit-han-lab/once-for-all
[ICLR 2020] Once for All: Train One Network and Specialize it for Efficient Deployment
D-X-Y/AutoDL-Projects
Automated deep learning algorithms implemented in PyTorch.
mit-han-lab/proxylessnas
[ICLR 2019] ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware
HobbitLong/CMC
[arXiv 2019] "Contrastive Multiview Coding", also contains implementations for MoCo and InstDis
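A minimal sketch (not the repo's code) of an InfoNCE-style contrastive loss, the family of objectives behind CMC, MoCo, and InstDis; it uses in-batch negatives and an illustrative temperature.

```python
import torch
import torch.nn.functional as F

def info_nce(view1, view2, temperature=0.07):
    """Contrastive loss over two L2-normalized views of the same batch.
    Positives sit on the diagonal; every other pair acts as a negative."""
    logits = view1 @ view2.t() / temperature      # (N, N) similarity matrix
    targets = torch.arange(view1.size(0), device=view1.device)
    return F.cross_entropy(logits, targets)
```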
wyharveychen/CloserLookFewShot
Source code for the ICLR'19 paper "A Closer Look at Few-shot Classification"
ssnl/dataset-distillation
Open-source code for the paper "Dataset Distillation"
google-research/lottery-ticket-hypothesis
A reimplementation of "The Lottery Ticket Hypothesis" (Frankle and Carbin) on MNIST.
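A rough sketch (not the repo's code, which is TensorFlow) of iterative magnitude pruning with rewinding to the original initialization, the procedure the lottery ticket paper studies. `train` is a placeholder for a full training run that keeps the mask applied, and the prune fraction is illustrative.

```python
import copy
import torch

def magnitude_prune(model, mask, fraction=0.2):
    """Zero out the smallest-magnitude surviving weights by updating the mask."""
    for name, p in model.named_parameters():
        if name not in mask:
            continue
        alive = p.data[mask[name].bool()].abs()
        if alive.numel() == 0:
            continue
        threshold = torch.quantile(alive, fraction)
        mask[name] = mask[name] * (p.data.abs() > threshold).float()
    return mask

def find_ticket(model, train, rounds=3):
    """Train, prune, rewind to the original init, repeat (iterative magnitude pruning)."""
    init_state = copy.deepcopy(model.state_dict())
    mask = {n: torch.ones_like(p) for n, p in model.named_parameters() if "weight" in n}
    for _ in range(rounds):
        train(model, mask)                      # placeholder: train with the mask applied
        mask = magnitude_prune(model, mask)     # drop the smallest surviving weights
        model.load_state_dict(init_state)       # rewind to the original initialization
        with torch.no_grad():
            for n, p in model.named_parameters():
                if n in mask:
                    p.mul_(mask[n])             # re-apply the mask to the rewound weights
    return mask
```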
THUNLP-MT/THUMT
An open-source neural machine translation toolkit developed by the Tsinghua Natural Language Processing Group.
mit-han-lab/dlg
[NeurIPS 2019] Deep Leakage From Gradients
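A rough sketch (not the repo's code) of the gradient-matching idea behind the paper: optimize dummy inputs and soft labels until their gradients match the gradients observed from a victim. The model, shapes, and optimizer settings are illustrative.

```python
import torch
import torch.nn.functional as F

def deep_leakage(model, true_grads, input_shape, num_classes, steps=300):
    """Recover training data by matching gradients (sketch of the DLG idea)."""
    dummy_x = torch.randn(1, *input_shape, requires_grad=True)
    dummy_y = torch.randn(1, num_classes, requires_grad=True)   # soft label logits
    opt = torch.optim.LBFGS([dummy_x, dummy_y])

    def closure():
        opt.zero_grad()
        pred = model(dummy_x)
        loss = torch.sum(-F.softmax(dummy_y, dim=1) * F.log_softmax(pred, dim=1))
        grads = torch.autograd.grad(loss, model.parameters(), create_graph=True)
        grad_diff = sum(((g - t) ** 2).sum() for g, t in zip(grads, true_grads))
        grad_diff.backward()
        return grad_diff

    for _ in range(steps):
        opt.step(closure)
    return dummy_x.detach(), dummy_y.detach()
```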
cybertronai/pytorch-lamb
PyTorch implementation of the LAMB optimizer (https://arxiv.org/abs/1904.00962)
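A simplified sketch (not the repo's code) of the core LAMB update: an Adam-style step rescaled by a layer-wise trust ratio. Bias correction is omitted and the hyperparameters are illustrative.

```python
import torch

def lamb_step(param, grad, exp_avg, exp_avg_sq, lr=1e-3,
              betas=(0.9, 0.999), eps=1e-6, weight_decay=0.01):
    """One simplified LAMB update for a single parameter tensor (no bias correction)."""
    beta1, beta2 = betas
    exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1)                # first moment  m_t
    exp_avg_sq.mul_(beta2).addcmul_(grad, grad, value=1 - beta2)   # second moment v_t

    update = exp_avg / (exp_avg_sq.sqrt() + eps) + weight_decay * param

    w_norm, u_norm = param.norm(), update.norm()
    # Layer-wise trust ratio: scale the step by ||w|| / ||update||.
    trust_ratio = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    param.sub_(lr * trust_ratio * update)
```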
facebookresearch/covost
CoVoST: A Large-Scale Multilingual Speech-To-Text Translation Corpus (CC0 Licensed)
google-research/rigl
End-to-end training of sparse deep neural networks with little-to-no performance loss.
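A rough single-layer sketch (written here in PyTorch; the repo is TensorFlow) of one RigL-style connectivity update: drop the smallest-magnitude active weights and regrow the same number of connections where the dense gradient is largest.

```python
import torch

def rigl_update(weight, mask, grad, update_fraction=0.3):
    """One drop/grow step on a single weight matrix (RigL-style sketch)."""
    n_update = int(update_fraction * mask.sum().item())
    if n_update == 0:
        return mask
    was_active = mask.bool().clone()

    # Drop: deactivate the active weights with the smallest magnitude.
    drop_scores = torch.where(was_active, weight.abs(), torch.full_like(weight, float("inf")))
    drop_idx = torch.topk(drop_scores.view(-1), n_update, largest=False).indices
    mask.view(-1)[drop_idx] = 0.0

    # Grow: activate previously inactive weights with the largest dense-gradient magnitude.
    grow_scores = torch.where(was_active, torch.full_like(grad, -float("inf")), grad.abs())
    grow_idx = torch.topk(grow_scores.view(-1), n_update).indices
    mask.view(-1)[grow_idx] = 1.0
    with torch.no_grad():
        weight.view(-1)[grow_idx] = 0.0   # newly grown connections start at zero

    return mask
```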
submission2019/cnn-quantization
Quantization of convolutional neural networks.
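A minimal sketch (not the repo's code) of symmetric per-tensor int8 post-training quantization, the basic building block such quantization toolkits extend with calibration and clipping schemes.

```python
import torch

def quantize_int8(weight):
    """Symmetric per-tensor int8 quantization: w ≈ scale * q, with q in [-127, 127]."""
    scale = weight.abs().max().clamp(min=1e-8) / 127.0
    q = torch.clamp(torch.round(weight / scale), -127, 127).to(torch.int8)
    return q, scale

def dequantize(q, scale):
    return q.float() * scale

w = torch.randn(64, 32)
q, scale = quantize_int8(w)
print((dequantize(q, scale) - w).abs().max())   # per-tensor quantization error
```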
facebookresearch/stochastic_gradient_push
Stochastic Gradient Push for Distributed Deep Learning
adobe/Deep-Audio-Prior
Audio Source Separation Without Any Training Data.
PatrickZH/Improved-Deep-Leakage-from-Gradients
The code for "Improved Deep Leakage from Gradients" (iDLG).
LaraQianYang/Ouroboros
Ouroboros: On Accelerating Training of Transformer-Based Language Models
slowbull/FeaturesReplay
A PyTorch implementation of the paper "Training Neural Networks Using Features Replay"