parallel-training

There are 4 repositories under the parallel-training topic.

  • explosion/spacy-ray

☄️ Parallel and distributed training with spaCy and Ray (see the Ray sketch after this list)

Language: Python
  • NoteDance/Note

    Machine learning library, Distributed training, Deep learning, Reinforcement learning, Models, TensorFlow, PyTorch

Language: Python
  • Tikquuss/meta_XLM

Cross-lingual Language Model (XLM) pretraining and Model-Agnostic Meta-Learning (MAML) for fast adaptation of deep networks (see the MAML sketch after this list)

Language: Jupyter Notebook
  • jiankaiwang/distributed_training

This repository is a tutorial on training deep neural network models more efficiently. It focuses on two main frameworks, Keras and TensorFlow (see the MirroredStrategy sketch after this list).

Language: Jupyter Notebook
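
The pattern behind explosion/spacy-ray is fanning work out across Ray workers. The sketch below is a minimal, hypothetical illustration of that pattern using only core Ray (ray.remote, ray.get) and plain spaCy (spacy.blank, nlp.pipe); it is not the spacy-ray training workflow itself, and the batch contents are placeholders.

    # A minimal sketch of distributing batches of work to Ray workers.
    # Assumes only core Ray and spaCy APIs; not the spacy-ray workflow.
    import ray
    import spacy

    ray.init()

    @ray.remote
    def process_batch(texts):
        # Each worker builds its own pipeline; a blank English pipeline is
        # used so the example runs without downloading a trained model.
        nlp = spacy.blank("en")
        return [len(doc) for doc in nlp.pipe(texts)]

    batches = [["Ray schedules tasks across workers."],
               ["spaCy processes each batch independently."]]
    token_counts = ray.get([process_batch.remote(b) for b in batches])
    print(token_counts)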
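
MAML, used in Tikquuss/meta_XLM for fast adaptation, trains an initialization such that a few inner gradient steps on a new task's support set already give low loss on its query set. Below is a minimal PyTorch sketch of that inner/outer loop on a toy regression problem; the task distribution, linear model, and step sizes are illustrative assumptions and unrelated to meta_XLM's XLM setup.

    # A minimal MAML sketch on a toy regression task (illustrative only).
    import torch

    torch.manual_seed(0)
    w = torch.zeros(1, requires_grad=True)   # meta-learned initialization
    b = torch.zeros(1, requires_grad=True)
    meta_opt = torch.optim.SGD([w, b], lr=1e-2)
    inner_lr = 0.1

    def sample_task():
        # Each task is y = a * x for a task-specific slope a in [-2, 2].
        a = torch.rand(1) * 4 - 2
        x_s, x_q = torch.randn(10, 1), torch.randn(10, 1)
        return (x_s, a * x_s), (x_q, a * x_q)

    for step in range(1000):
        meta_opt.zero_grad()
        for _ in range(4):                   # meta-batch of 4 tasks
            (x_s, y_s), (x_q, y_q) = sample_task()
            # Inner loop: one gradient step on the task's support set.
            loss_s = ((x_s * w + b - y_s) ** 2).mean()
            g_w, g_b = torch.autograd.grad(loss_s, (w, b), create_graph=True)
            w_adapt, b_adapt = w - inner_lr * g_w, b - inner_lr * g_b
            # Outer loop: the query loss after adaptation drives the
            # meta-update, backpropagating through the inner step.
            loss_q = ((x_q * w_adapt + b_adapt - y_q) ** 2).mean()
            loss_q.backward()
        meta_opt.step()

Setting create_graph=True lets the outer gradient flow through the inner update (second-order MAML); dropping it gives the cheaper first-order approximation.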
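
For the kind of single-machine, multi-GPU training the jiankaiwang/distributed_training tutorial covers, TensorFlow's tf.distribute.MirroredStrategy is the standard entry point: variables are mirrored on every GPU and gradients are all-reduced each step. The sketch below uses a placeholder model and random data, not the tutorial's own example.

    # A minimal sketch of multi-GPU data parallelism with MirroredStrategy.
    # Model, data, and hyperparameters are placeholders.
    import numpy as np
    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy()   # one replica per visible GPU
    print("Replicas in sync:", strategy.num_replicas_in_sync)

    # Variables and the optimizer must be created inside the strategy scope
    # so that each replica holds a mirrored copy.
    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.Input(shape=(20,)),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")

    # Toy data; Keras shards each global batch across the replicas.
    x = np.random.rand(1024, 20).astype("float32")
    y = np.random.rand(1024, 1).astype("float32")
    model.fit(x, y, epochs=2, batch_size=64 * strategy.num_replicas_in_sync)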