Getting CUDA out of memory error
Closed this issue · 2 comments
isspek commented
Hi,
Thanks for the repository. I am running the code on a laptop, so GPU memory is limited when I use open LLMs (e.g. `extractor = MistralExtractor()`). Is there a way to set the batch size when using these models?
HuXiangkun commented
Hi @isspek , the extractors do not use batching for extraction. The Mistral extractor has 7B parameters, so it cannot be deployed on a laptop; you will need a GPU with enough memory to run the Mistral-based extractor.
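For rough sizing, the weight memory of a model can be estimated as parameter count × bytes per parameter (weights only, excluding activations and the KV cache). A minimal sketch, assuming half-precision (2 bytes per parameter) and the approximate ~7.24B parameter count commonly cited for Mistral-7B:

```python
def weight_memory_gb(n_params: float, bytes_per_param: float = 2.0) -> float:
    """Estimate GPU memory (GiB) needed just to hold the model weights."""
    return n_params * bytes_per_param / 1024**3

mistral_7b = 7.24e9  # approximate parameter count; an assumption for illustration

# fp16/bf16 weights alone need roughly 13-14 GiB, which exceeds
# the VRAM of most laptop GPUs.
print(f"fp16 weights: ~{weight_memory_gb(mistral_7b):.1f} GiB")

# 4-bit quantized weights shrink to roughly 3-4 GiB, but inference
# still adds activation and KV-cache overhead on top of this.
print(f"int4 weights: ~{weight_memory_gb(mistral_7b, 0.5):.1f} GiB")
```

This back-of-the-envelope figure explains the OOM: even before any activations are allocated, the fp16 weights alone will not fit in a typical laptop GPU's memory.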
isspek commented
Thanks for the clarification @HuXiangkun