Pinned Repositories
ESM-GearNet
ESM-GearNet for Protein Structure Representation Learning (https://arxiv.org/abs/2303.06275)
ColossalAI
Making large AI models cheaper, faster and more accessible
bert-tools
A set of simple scripts to download and process the Wikitext dataset as TFRecords and make index files for NVIDIA DALI.
BERTLocRNA
Using a large language model to predict localization
ColossalAI
Colossal-AI: A Unified Deep Learning System for Large-Scale Parallel Training
ColossalAI-Examples
Examples of training models with hybrid parallelism using ColossalAI
Colossalai-LanguageModel
JointProteinFolding
KnowReQA
PDB-Struct
Source code of the paper "PDB-Struct: A Comprehensive Benchmark for Structure-based Protein Design"
WANG-CR's Repositories
WANG-CR/PDB-Struct
Source code of the paper "PDB-Struct: A Comprehensive Benchmark for Structure-based Protein Design"
WANG-CR/bert-tools
A set of simple scripts to download and process the Wikitext dataset as TFRecords and make index files for NVIDIA DALI.
WANG-CR/BERTLocRNA
Using a large language model to predict localization
WANG-CR/ColossalAI
Colossal-AI: A Unified Deep Learning System for Large-Scale Parallel Training
WANG-CR/ColossalAI-Examples
Examples of training models with hybrid parallelism using ColossalAI
WANG-CR/Colossalai-LanguageModel
WANG-CR/denoising-diffusion-pytorch
Implementation of Denoising Diffusion Probabilistic Model in Pytorch
WANG-CR/EigenFold
EigenFold: Generative Protein Structure Prediction with Diffusion Models
WANG-CR/esm-diffusion
Evolutionary Scale Modeling (esm): Pretrained language models for proteins
WANG-CR/examples
A set of examples around pytorch in Vision, Text, Reinforcement Learning, etc.
WANG-CR/folding_tools
A collection of *fold* tools
WANG-CR/JointProteinFolding
WANG-CR/KnowReQA
WANG-CR/Homework3
WANG-CR/IFT6135_H2023_Programming
WANG-CR/IFT6390-kaggle1
WANG-CR/imagenet-tools
A set of simple scripts to process the Imagenet-1K dataset as TFRecords and make index files for NVIDIA DALI.
WANG-CR/Kaggle2
WANG-CR/Megatron-LM
Ongoing research training transformer language models at scale, including: BERT & GPT-2
WANG-CR/ncsn_excercise
Noise Conditional Score Networks (NeurIPS 2019, Oral)
WANG-CR/openfold
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
WANG-CR/openwebtext
An open clone of the GPT-2 WebText dataset by OpenAI. Still WIP.
WANG-CR/PEER_Benchmark
PEER Benchmark, appeared at the NeurIPS 2022 Datasets and Benchmarks Track (https://arxiv.org/abs/2206.02096)
WANG-CR/ProjectChineseChess
A Chinese chess (Xiangqi) site for the web
WANG-CR/RefineGNN
WANG-CR/Research-Paper-Reading-Template
A markdown template for taking notes to summarize research papers.
WANG-CR/training
Reference implementations of MLPerf™ training benchmarks
WANG-CR/training_results_v1.1
WANG-CR/WANG-CR.github.io
Github Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
WANG-CR/WEB-ChineseChess