Pinned Repositories
R1-V
Witness the aha moment of VLMs for less than $3.
ALSACE
The official implementation of "Mitigating Language-Level Performance Disparity in mPLMs via Teacher Language Selection and Cross-lingual Self-Distillation" (NAACL 2024)
clash-for-linux
Use Clash as a proxy tool on Linux
LACING
MIC
MMICL, a state-of-the-art VLM with in-context learning ability, from ICL, PKU
MIC_tool
UltraEdit
Awesome-Multimodal-Next-Token-Prediction
[Survey] Next Token Prediction Towards Multimodal Intelligence: A Comprehensive Survey
FastV
[ECCV 2024 Oral] Code for paper: An Image is Worth 1/2 Tokens After Layer 2: Plug-and-Play Inference Acceleration for Large Vision-Language Models
PCA-EVAL
[ACL 2024] PCA-Bench: Evaluating Multimodal Large Language Models in Perception-Cognition-Action Chain
HaozheZhao's Repositories
HaozheZhao/MIC
MMICL, a state-of-the-art VLM with in-context learning ability, from ICL, PKU
HaozheZhao/UltraEdit
HaozheZhao/MIC_tool
HaozheZhao/ALSACE
The official implementation of "Mitigating Language-Level Performance Disparity in mPLMs via Teacher Language Selection and Cross-lingual Self-Distillation" (NAACL 2024)
HaozheZhao/LACING
HaozheZhao/clash-for-linux
Use Clash as a proxy tool on Linux
HaozheZhao/FastV
Code for paper: An Image is Worth 1/2 Tokens After Layer 2: Plug-and-Play Inference Acceleration for Large Vision-Language Models
HaozheZhao/GunViolence_DataMining
HaozheZhao/HaozheZhao.github.io
AcadHomepage: A Modern and Responsive Academic Personal Homepage
HaozheZhao/PCA-EVAL
PCA-EVAL benchmark proposed in paper "Towards End-to-End Embodied Decision Making via Multi-modal Large Language Model: Explorations with GPT4-Vision and Beyond"
HaozheZhao/prophet
Tool for producing high quality forecasts for time series data that has multiple seasonality with linear or non-linear growth.
HaozheZhao/ShadowsocksR-Windows
[Personal use] Forked from shadowsocksr and shadowsocksrr
HaozheZhao/standard-readme
A standard style for README files
HaozheZhao/bilibili
This project is the backend source code repository of Bilibili
HaozheZhao/openbilibili-backup
Backend code of Bilibili
HaozheZhao/R1-V
Witness the aha moment of VLMs for less than $3.