Pinned Repositories
Emu
Emu Series: Generative Multimodal Models from BAAI
LLaMA-Factory
Efficiently Fine-Tune 100+ LLMs in WebUI (ACL 2024)
dont-stop-pretraining
Code associated with the Don't Stop Pretraining ACL 2020 paper
extreme-bert
ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper “ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized BERT”.
icemoon-creative
Config files for my GitHub profile.
lihang-code
Code implementation of "Statistical Learning Methods" (《统计学习方法》)
matlab_demos
Contains all the MATLAB demo code associated with my machine learning notes
TinyLlama
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
NExT-GPT
Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model
Telechat
icemoon-creative's Repositories
icemoon-creative/extreme-bert
ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper “ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized BERT”.
icemoon-creative/dont-stop-pretraining
Code associated with the Don't Stop Pretraining ACL 2020 paper
icemoon-creative/icemoon-creative
Config files for my GitHub profile.
icemoon-creative/lihang-code
Code implementation of "Statistical Learning Methods" (《统计学习方法》)
icemoon-creative/matlab_demos
Contains all the MATLAB demo code associated with my machine learning notes