Pinned Repositories
builder
Continuous builder and binary build scripts for pytorch
captum
Model interpretability and understanding for PyTorch
CLIP
Contrastive Language-Image Pretraining
ConvNeXt
Code release for ConvNeXt model
kernl
Kernl lets you run PyTorch transformer models several times faster on GPU with a single line of code, and is designed to be easily hackable.
low_cost_robot
metrics
Machine learning metrics for distributed, scalable PyTorch applications.
minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
multimodal
TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale.
edward-io's Repositories
edward-io/builder
Continuous builder and binary build scripts for pytorch
edward-io/captum
Model interpretability and understanding for PyTorch
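A minimal sketch of how an attribution call looks with Captum; the two-layer classifier and random inputs below are stand-ins for illustration, not anything from this repository.

```python
# Attribute a prediction of a small classifier to its input features
# with Integrated Gradients (model and inputs are toy placeholders).
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 3))
model.eval()

inputs = torch.randn(2, 4, requires_grad=True)  # batch of 2 examples
ig = IntegratedGradients(model)
# Attribution of the class-1 logit with respect to each input feature.
attributions, delta = ig.attribute(inputs, target=1, return_convergence_delta=True)
print(attributions.shape)  # torch.Size([2, 4])
```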
edward-io/CLIP
Contrastive Language-Image Pretraining
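A short zero-shot matching sketch in the style of the CLIP README; the image path and candidate captions are assumptions.

```python
# Score an image against a few candidate captions with a pretrained CLIP model.
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

image = preprocess(Image.open("example.jpg")).unsqueeze(0).to(device)  # assumed image file
text = clip.tokenize(["a photo of a cat", "a photo of a dog"]).to(device)

with torch.no_grad():
    logits_per_image, logits_per_text = model(image, text)
    probs = logits_per_image.softmax(dim=-1)
print(probs)  # similarity of the image to each caption
```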
edward-io/ConvNeXt
Code release for ConvNeXt model
edward-io/kernl
Kernl lets you run PyTorch transformer models several times faster on GPU with a single line of code, and is designed to be easily hackable.
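A hedged sketch of the advertised one-line speedup, assuming kernl's documented optimize_model entry point and a Hugging Face BERT checkpoint as the model to optimize.

```python
# Optimize a transformer model in place, then run it under inference mode.
import torch
from transformers import AutoModel  # assumed source of the model to optimize
from kernl.model_optimization import optimize_model

model = AutoModel.from_pretrained("bert-base-uncased").eval().cuda()
optimize_model(model)  # the advertised single line; swaps in fused Triton kernels

inputs = {
    "input_ids": torch.randint(0, 30000, (1, 128), device="cuda"),
    "attention_mask": torch.ones(1, 128, dtype=torch.long, device="cuda"),
}
with torch.inference_mode(), torch.cuda.amp.autocast():
    out = model(**inputs)
```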
edward-io/low_cost_robot
edward-io/metrics
Machine learning metrics for distributed, scalable PyTorch applications.
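A small usage sketch, assuming this fork tracks the upstream torchmetrics package; the update/compute pattern below is the library's standard interface.

```python
# Accumulate a metric over several batches, then read out the aggregate.
import torch
import torchmetrics

accuracy = torchmetrics.Accuracy(task="multiclass", num_classes=5)

for _ in range(3):  # pretend these are training or validation batches
    preds = torch.randn(8, 5).softmax(dim=-1)
    target = torch.randint(0, 5, (8,))
    accuracy.update(preds, target)

print(accuracy.compute())  # accuracy across all batches seen so far
accuracy.reset()
```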
edward-io/minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
edward-io/multimodal
TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale.
edward-io/pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
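A tiny sketch of the two ideas in the description: tensors on an accelerator when one is available, and a dynamically built autograd graph.

```python
# Build the computation graph on the fly, then differentiate through it.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(3, 3, device=device, requires_grad=True)

y = (x ** 2).sum()   # graph is recorded as the ops execute
y.backward()         # reverse-mode autodiff through that graph
print(torch.allclose(x.grad, 2 * x))  # True: d(sum(x^2))/dx = 2x
```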
edward-io/pytorch-lightning
The lightweight PyTorch wrapper for high-performance AI research. Scale your models, not the boilerplate.
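A minimal sketch of the split the description promises: the research code lives in a LightningModule while the Trainer owns the loop. The toy regression dataset is an assumption.

```python
# Define the model and optimization logic, then hand the loop to the Trainer.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class LitRegressor(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(10, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.net(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)

dataset = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))
trainer = pl.Trainer(max_epochs=1, logger=False, enable_checkpointing=False)
trainer.fit(LitRegressor(), DataLoader(dataset, batch_size=16))
```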
edward-io/recipes
Recipes are a standard, well-supported set of blueprints for machine learning engineers to rapidly train models using the latest research techniques without significant engineering overhead. Specifically, Recipes aims to provide consistent access to pre-trained SOTA models ready for production, reference implementations for SOTA research reproducibility, and infrastructure to guarantee correctness, efficiency, and interoperability.
edward-io/rfcs
PyTorch RFCs (experimental)
edward-io/rich
Rich is a Python library for rich text and beautiful formatting in the terminal.
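A short sketch of the kind of terminal output Rich produces: styled text and a formatted table. The metric names and values are made up.

```python
# Print styled text and a small summary table to the terminal.
from rich.console import Console
from rich.table import Table

console = Console()
console.print("[bold green]Training finished[/bold green] :rocket:")

table = Table(title="Run summary")
table.add_column("metric")
table.add_column("value", justify="right")
table.add_row("accuracy", "0.94")
table.add_row("loss", "0.18")
console.print(table)
```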
edward-io/stable-diffusion
edward-io/tnt
A lightweight library for PyTorch training tools and utilities
edward-io/torcheval1
A library that contains a rich collection of performant PyTorch model metrics, a simple interface to create new metrics, a toolkit to facilitate metric computation in distributed training and tools for PyTorch model evaluations.
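A minimal sketch of the metric interface described above, using MulticlassAccuracy with the update/compute/reset pattern; the logits and targets are random placeholders.

```python
# Feed batches into a metric with update(), then read the aggregate with compute().
import torch
from torcheval.metrics import MulticlassAccuracy

metric = MulticlassAccuracy()
for _ in range(3):
    logits = torch.randn(8, 4)
    target = torch.randint(0, 4, (8,))
    metric.update(logits, target)

print(metric.compute())  # accuracy over everything seen so far
metric.reset()
```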
edward-io/torchsnapshot-1
A light-weight library for adding fault tolerance to large-scale PyTorch distributed training workloads.
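A hedged sketch of taking and restoring a snapshot, assuming the Snapshot.take / restore API from the project's documentation; the model, optimizer, and path are placeholders.

```python
# Persist training state to a snapshot, then restore it in place.
import torch
import torchsnapshot

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
app_state = {"model": model, "optimizer": optimizer}

# Take a snapshot of the application state (path is an example location).
snapshot = torchsnapshot.Snapshot.take(path="/tmp/run0", app_state=app_state)

# Later, or in another process: restore parameters and optimizer state in place.
torchsnapshot.Snapshot(path="/tmp/run0").restore(app_state=app_state)
```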
edward-io/triton
Development repository for the Triton language and compiler
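A minimal Triton kernel sketch following the language's standard tutorial pattern: a JIT-compiled element-wise vector add launched over a 1-D grid.

```python
# Each program instance handles one BLOCK_SIZE chunk of the vectors.
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                     # which chunk this program handles
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                     # guard the ragged last chunk
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

x = torch.randn(4096, device="cuda")
y = torch.randn(4096, device="cuda")
out = torch.empty_like(x)
grid = (triton.cdiv(x.numel(), 1024),)
add_kernel[grid](x, y, out, x.numel(), BLOCK_SIZE=1024)
assert torch.allclose(out, x + y)
```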
edward-io/vision
Datasets, Transforms and Models specific to Computer Vision
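A short sketch of the pieces named in the description: a torchvision transform pipeline feeding a pretrained classification model. The random input image is a stand-in for real data.

```python
# Preprocess an image with a transform pipeline and classify it with a pretrained model.
import numpy as np
import torch
from PIL import Image
from torchvision import models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

img = Image.fromarray(np.uint8(np.random.rand(300, 300, 3) * 255))  # placeholder image
x = preprocess(img).unsqueeze(0)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()
with torch.no_grad():
    logits = model(x)
print(logits.argmax(dim=1))  # predicted ImageNet class index
```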