Pinned Repositories
earth-forecasting-transformer
Official implementation of Earthformer, a space-time Transformer for Earth system forecasting
mxnet
Lightweight, portable, flexible distributed/mobile deep learning with a dynamic, mutation-aware dataflow dependency scheduler; for Python, R, Julia, Scala, Go, JavaScript, and more
tvm
Open deep learning compiler stack for CPUs, GPUs, and specialized accelerators
autogluon
Fast and Accurate ML in 3 Lines of Code (see the sketch after the pinned list)
gluon-nlp
NLP made easy
automl_multimodal_benchmark
Repository for the Multimodal AutoML Benchmark
aws-summit-2017-seoul
Demo code from our presentation about MXNet at the AWS Seoul Summit 2017
CodeBERT
CodeBERT: A Pre-Trained Model for Programming and Natural Languages
gluonnlp-gpt2
HKO-7
Source code for the NIPS 2017 paper "Deep Learning for Precipitation Nowcasting: A Benchmark and A New Model"
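
A note on the autogluon pin above: the "3 Lines of Code" tagline refers to AutoGluon's tabular quickstart. The sketch below is illustrative only, assuming the TabularPredictor API from autogluon.tabular and hypothetical CSV paths; it is not taken from the repository.

from autogluon.tabular import TabularDataset, TabularPredictor

# Load training data from a CSV file (hypothetical path).
train_data = TabularDataset("train.csv")
# Fit a predictor for a target column named "class" (assumed label name).
predictor = TabularPredictor(label="class").fit(train_data)
# Predict on held-out data (hypothetical path).
predictions = predictor.predict(TabularDataset("test.csv"))

Counting the load, fit, and predict statements (and excluding the import) gives the advertised three lines.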
sxjscience's Repositories
sxjscience/benchmark_ops
sxjscience/DeBERTa
The implementation of DeBERTa
sxjscience/horovod
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
sxjscience/apls
Python code to evaluate the APLS (Average Path Length Similarity) metric
sxjscience/autocfg
All you need is a config for AutoML
sxjscience/autogluon-benchmarking
Benchmarking Utilities for AutoGluon
sxjscience/aws-batch-config
sxjscience/aws-efa-nccl-baseami-pipeline
Packer and CodeBuild/Pipeline files for building EFA/NCCL base AMIs, plus base Docker build files to enable EFA/NCCL in containers
sxjscience/cc_net
Tools to download and clean up Common Crawl data
sxjscience/CLIP
Contrastive Language-Image Pretraining
sxjscience/d8
sxjscience/DeepLearningExamples
Deep Learning Examples
sxjscience/DeepSpeedExamples
Example models using DeepSpeed
sxjscience/dont-stop-pretraining
Code associated with the Don't Stop Pretraining ACL 2020 paper
sxjscience/DPR
Dense Passage Retriever (DPR): a set of tools and models for the open-domain Q&A task.
sxjscience/eks-kubeflow-workshop
Kubeflow workshop on EKS, mainly focused on AWS integration examples. See the Kubeflow website (http://kubeflow.org) for other examples.
sxjscience/electra
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
sxjscience/flax
Flax is a neural network ecosystem for JAX that is designed for flexibility.
sxjscience/hardware-aware-transformers
[ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
sxjscience/KDD20-tutorial
Tutorial on Automated Machine Learning at KDD 2020
sxjscience/KDD2020
sxjscience/longformer
Longformer: The Long-Document Transformer
sxjscience/neurips-2020-sevir
Code and model benchmarks for "SEVIR: A Storm Event Imagery Dataset for Deep Learning Applications in Radar and Satellite Meteorology"
sxjscience/NeuroNER
Named-entity recognition using neural networks; easy to use, with state-of-the-art results.
sxjscience/nlpcda
One-click Chinese data augmentation package; NLP data augmentation, BERT data augmentation, EDA. Install with pip install nlpcda
sxjscience/performer-pytorch
An implementation of Performer, a linear attention-based Transformer, in PyTorch
sxjscience/reformer-pytorch
Reformer, the efficient Transformer, in PyTorch
sxjscience/TextBrewer
A PyTorch-based knowledge distillation toolkit for natural language processing
sxjscience/trax
Trax: Deep Learning with Clear Code and Speed
sxjscience/tvm
Open deep learning compiler stack for CPUs, GPUs, and specialized accelerators