Pinned Repositories
LLaMA-Factory
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Skywork
Skywork series models are pre-trained on 3.2TB of high-quality multilingual (mainly Chinese and English) and code data. We have open-sourced the model weights, training data, evaluation data, and evaluation methods.
ELMO
Long-Periodic-Data-Try
LSTNet_keras
Keras version of LSTNet
NAB
The Numenta Anomaly Benchmark
test
xDeepFM
CIF-Bench
yajunDai's Repositories
yajunDai/ELMO
yajunDai/Long-Periodic-Data-Try
yajunDai/LSTNet_keras
Keras version of LSTNet
yajunDai/NAB
The Numenta Anomaly Benchmark
yajunDai/test
yajunDai/xDeepFM