Pinned Repositories
RWKV-LM
RWKV (pronounced RwaKuv) is an RNN with great LLM performance that can also be trained directly like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". It combines the best of RNNs and transformers: great performance, linear time, constant space (no KV-cache), fast training, infinite ctx_len, and free sentence embedding.
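To illustrate why an RNN-style model gives constant-space inference (the "no kv-cache" point above), here is a minimal toy sketch. This is not the actual RWKV-7 update rule; the weights and recurrence are hypothetical placeholders that only demonstrate the fixed-size-state property.

```python
import numpy as np

# Toy dimensions and random weights -- purely illustrative, not RWKV's.
d = 8
rng = np.random.default_rng(0)
W_state = rng.standard_normal((d, d)) * 0.1  # hypothetical recurrence weights
W_in = rng.standard_normal((d, d)) * 0.1     # hypothetical input projection

def step(state, x):
    """One recurrent step: the new state depends only on the previous
    state and the current token embedding, never on the full history."""
    return np.tanh(state @ W_state + x @ W_in)

state = np.zeros(d)  # fixed-size state, regardless of context length
for token_emb in rng.standard_normal((16, d)):  # process 16 tokens
    state = step(state, token_emb)

print(state.shape)  # state stays (d,) no matter how long the context is
```

A transformer, by contrast, must keep keys and values for every past token in a KV-cache, so its per-step memory grows linearly with context length.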
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Qwen2-VL
Qwen2-VL is the multimodal large language model series developed by Qwen team, Alibaba Cloud.
AICITY2024_Track2_AliOpenTrek_CityLLaVA
Work at Aliyun, March–April 2024
CBLIP2
CTFL-Beta
Code for the paper "Contrastive time–frequency learning for radar signal sorting"
cudnn_repos
LPCL_beta
Code for the LPCL paper (beta, not yet collated)
mPLUG-Owl
mPLUG-Owl🦉: Modularization Empowers Large Language Models with Multimodality
NSExperiment
Youngluc's Repositories
Youngluc/CTFL-Beta
Code for the paper "Contrastive time–frequency learning for radar signal sorting"
Youngluc/CBLIP2
Youngluc/mPLUG-Owl
mPLUG-Owl🦉: Modularization Empowers Large Language Models with Multimodality
Youngluc/AICITY2024_Track2_AliOpenTrek_CityLLaVA
Work at Aliyun, March–April 2024
Youngluc/cudnn_repos
Youngluc/LPCL_beta
Code for the LPCL paper (beta, not yet collated)
Youngluc/NSExperiment