parallel-training
There are 4 repositories under the parallel-training topic.
explosion/spacy-ray
☄️ Parallel and distributed training with spaCy and Ray
NoteDance/Note
Machine learning library, Distributed training, Deep learning, Reinforcement learning, Models, TensorFlow, PyTorch
Tikquuss/meta_XLM
Cross-lingual Language Model (XLM) pretraining and Model-Agnostic Meta-Learning (MAML) for fast adaptation of deep networks
jiankaiwang/distributed_training
This repository is a tutorial on training deep neural network models more efficiently. It focuses on two main frameworks: Keras and TensorFlow.
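As a rough illustration of the kind of distributed training the tutorial covers, the sketch below uses tf.distribute.MirroredStrategy with the Keras API for synchronous multi-GPU training. It is not taken from the repository; the model architecture and MNIST dataset are illustrative assumptions.

```python
# Minimal sketch: synchronous data-parallel training across visible GPUs
# with tf.distribute.MirroredStrategy and the Keras API (illustrative only,
# not code from jiankaiwang/distributed_training).
import tensorflow as tf

# MirroredStrategy replicates the model on each visible GPU and averages
# gradients across replicas after every step.
strategy = tf.distribute.MirroredStrategy()
print("Number of replicas:", strategy.num_replicas_in_sync)

# Model and optimizer variables must be created inside the strategy scope.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# Standard Keras training loop; the strategy splits each batch across replicas.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
model.fit(x_train, y_train, epochs=2, batch_size=256)
```

On a single-GPU or CPU-only machine the same script still runs with one replica, which makes this pattern a low-cost default when scaling Keras training.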