zixianwang2022's Stars
NVIDIA/Megatron-LM
Ongoing research training transformer models at scale
microsoft/tutel
Tutel MoE: An Optimized Mixture-of-Experts Implementation
lucidrains/mixture-of-experts
A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models
huggingface/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
joonspk-research/generative_agents
Generative Agents: Interactive Simulacra of Human Behavior
tatsu-lab/stanford_alpaca
Code and documentation to train Stanford's Alpaca models, and generate the data.
chenfei-wu/TaskMatrix
openai/openai-cookbook
Examples and guides for using the OpenAI API
Akshayc1/Taxi-Trip-Time-Prediction
Predicting the total travel time, in seconds, from the start of a ride to its destination.
gasolin/zhpy
Zhpy (周蟒): write Python programs using Chinese-localized syntax
mkandes/galyleo
A shell utility to help you launch Jupyter notebooks on high-performance computing systems in a simple, secure way.
ucsd-ets/cse151a-2020-sp-public
Public resources for CSE 151A in Spring 2020.
sdsc-hpc-training-org/hpc-training-2021
Repository for training material for the 2021 SDSC HPC User Training Course