Pinned Repositories
gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
-BGM-
CCKS2021
Coffee-and-Binary-ree
Data structure visualization software
foggy-frost-forest.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
HIT_net_login
How to log in to the HIT (Shenzhen) campus network with Python code
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Qwen2-VL
Qwen2-VL is the multimodal large language model series developed by Qwen team, Alibaba Cloud.
foggy-frost-forest's Repositories
foggy-frost-forest/-BGM-
foggy-frost-forest/CCKS2021
foggy-frost-forest/Coffee-and-Binary-ree
Data structure visualization software
foggy-frost-forest/foggy-frost-forest.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
foggy-frost-forest/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
foggy-frost-forest/HIT_net_login
How to log in to the HIT (Shenzhen) campus network with Python code (a minimal sketch follows this list)
foggy-frost-forest/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
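
For context on the HIT_net_login entry above, here is a minimal sketch of logging into a captive-portal campus network from Python. It assumes the portal accepts a plain HTTP POST of username and password; the URL, field names, and credentials (LOGIN_URL, PAYLOAD) are illustrative placeholders, not taken from the repository, and real portals often require extra tokens or encrypted fields, so consult HIT_net_login for the actual flow.

```python
# Minimal captive-portal login sketch (assumption: the portal takes a plain
# username/password POST; LOGIN_URL and the field names below are hypothetical
# placeholders, not the actual HIT_net_login endpoint or parameters).
import requests

LOGIN_URL = "http://example-portal.campus/login"  # hypothetical portal address
PAYLOAD = {
    "username": "your_student_id",  # placeholder credentials
    "password": "your_password",
}


def login() -> bool:
    """POST the credentials and report whether the portal answered with 2xx."""
    resp = requests.post(LOGIN_URL, data=PAYLOAD, timeout=10)
    return resp.ok


if __name__ == "__main__":
    print("logged in" if login() else "login failed")
```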