Pinned Repositories
bert
TensorFlow code and pre-trained models for BERT
bert-as-service
Mapping a variable-length sentence to a fixed-length vector using a BERT model
bert-prune
BERT-Tickets
[NeurIPS 2020] "The Lottery Ticket Hypothesis for Pre-trained BERT Networks", Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin
cuad
CUAD (NeurIPS 2021)
EPIJudge
EPI Judge - Preview Release
finetune
Scikit-learn style model finetuning for NLP
gptq
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
QuantumGNN_molecules
Quantum graph neural network (quantum GNN) for molecular property prediction.
seizure
Epileptic seizure detection using auto-encoded features
natuan's Repositories
natuan/QuantumGNN_molecules
Quantum graph neural network (quantum GNN) for molecular property prediction.
natuan/seizure
Epileptic seizure detection using auto-encoded features
natuan/bert
TensorFlow code and pre-trained models for BERT
natuan/bert-as-service
Mapping a variable-length sentence to a fixed-length vector using a BERT model
natuan/bert-prune
natuan/BERT-Tickets
[NeurIPS 2020] "The Lottery Ticket Hypothesis for Pre-trained BERT Networks", Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin
natuan/cuad
CUAD (NeurIPS 2021)
natuan/EPIJudge
EPI Judge - Preview Release
natuan/finetune
Scikit-learn style model finetuning for NLP
natuan/gptq
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
natuan/high_performance_python
Code for the book "High Performance Python" by Micha Gorelick and Ian Ozsvald, published by O'Reilly
natuan/lime
Lime: Explaining the predictions of any machine learning classifier
natuan/natural-language-processing
Resources for the "Natural Language Processing" Coursera course.
natuan/llama-recipes
Examples and recipes for the Llama 2 model
natuan/lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
natuan/minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
natuan/nlpaug
Data augmentation for NLP
natuan/omnio
Python library for opening URIs as streaming file-like objects
natuan/optimum
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
natuan/Pretrained-Language-Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
natuan/pytorch-image-models
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXt, EfficientNet, EfficientNetV2, NFNet, Vision Transformer, MixNet, MobileNet-V3/V2, RegNet, DPN, CSPNet, and more
natuan/pytorch-tutorial
PyTorch Tutorial for Deep Learning Researchers
natuan/QNN
Tutorials on Quantized Neural Network using Tensorflow Lite
natuan/smoothquant
SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models
natuan/sparsegpt
Code for the paper "SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot".
natuan/sparseml
Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
natuan/transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
natuan/USA-cities-and-states
Full list of US states and cities
natuan/vision
Datasets, Transforms and Models specific to Computer Vision