Use with LLAMA
Opened this issue · 1 comment
ezzini commented
Can PICARD be used with LLAMA-like models?
nzw0301 commented
I suppose there are two blockers: (1) the Docker container's Python is too old (Python 3.7) for the Hugging Face `transformers` releases that support Llama 2, which require Python >= 3.8; and (2) the current code base assumes the T5 architecture (a seq2seq model) and its tokenizer, whereas LLaMA is a decoder-only causal LM.
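The first blocker can be checked programmatically. A minimal sketch (the `(3, 8)` minimum is taken from the comment above; the helper name is hypothetical):

```python
import sys

# Assumed minimum for the transformers releases that added LLaMA support,
# per the comment above (the PICARD Docker image ships Python 3.7).
REQUIRED = (3, 8)

def supports_llama(version_info=sys.version_info):
    """Return True if this interpreter meets the assumed minimum version."""
    return tuple(version_info[:2]) >= REQUIRED

# The container's Python 3.7 fails the check:
print(supports_llama((3, 7, 16)))  # False
```

The second blocker is architectural rather than environmental: PICARD's constrained decoding hooks into a seq2seq (encoder-decoder) generation loop, so pointing it at a decoder-only model would require more than a version bump.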