yxuansu/PandaGPT

About model size

David-Zeng-Zijian opened this issue · 1 comment

Excellent work! I'm wondering whether there is a way to load the model across more than one GPU, since even the 7B model consumes over 20 GB of memory, which exceeds the memory of a single GPU.

Hi @David-Zeng-Zijian, the current model requires around 24 GB of memory to run on a single GPU. Possible solutions include model quantization and model parallelism. It would be great if you could share your solution for deploying PandaGPT across multiple GPUs with us.
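
For reference, below is a minimal sketch of both ideas using Hugging Face transformers/accelerate. This is not PandaGPT's own loading code: the model path is a placeholder for the Vicuna-7B weights the project builds on, and the PandaGPT-specific components (the ImageBind encoder and the LoRA delta weights) would still need to be loaded and placed separately.

```python
# Sketch only, not PandaGPT's actual loading code. Assumes the
# language-model backbone is a Hugging Face causal LM checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "path/to/vicuna-7b"  # placeholder for your local weights

tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="auto" lets accelerate split the layers across all
# visible GPUs (naive pipeline-style model parallelism), so no single
# GPU has to hold the full ~24 GB by itself.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Alternatively, 8-bit quantization (requires the bitsandbytes
# package) roughly halves memory relative to fp16, which may be
# enough to fit on a single GPU:
# model = AutoModelForCausalLM.from_pretrained(
#     model_name,
#     load_in_8bit=True,
#     device_map="auto",
# )

print(model.hf_device_map)  # shows which layers landed on which GPU
```

With `device_map="auto"`, inference works as usual but activations are moved between GPUs at the split points, so expect some throughput loss compared to a single-GPU run.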