Pinned Repositories
ALBEF_copy
A copy of the code for ALBEF, a vision-language pre-training method
Awesome-Multimodal-Large-Language-Models
:sparkles::sparkles: Latest papers and datasets on multimodal large language models, and their evaluation.
BackdoorBench
Bahdanau-and-Transformer-on-NMT
DDA4220, AY22-23 Spring, Bahdanau and vanilla Transformer in PyTorch for NMT
Bunny
A family of lightweight multimodal models.
Chinese-LLaMA-Alpaca
Chinese LLaMA & Alpaca large language models with local deployment (Chinese LLaMA & Alpaca LLMs)
g-h-chen.github.io
personal website
LLaVA
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
v0_g-h-chen.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes