ypeleg/llama

AttributeError: module 'llama' has no attribute 'LLaMATokenizer' when using example

benjamin32561 opened this issue · 7 comments

I am using Google Colab to run your example code. When I run `tokenizer = llama.LLaMATokenizer.from_pretrained(MODEL)` I get an error. I tried running the same notebook on my laptop, and the same error occurs.

Installing the environment as described in the Facebook repo solves the problem: https://github.com/facebookresearch/llama

That's right!
Thanks.

@kriskrisliu
But it still requires receiving the weights from Meta via the submission form, am I right?

I am having the same problem. Any solution?

> Installing the environment as described in the Facebook repo solves the problem: https://github.com/facebookresearch/llama

I tried that, but it doesn't work.

It is because of the imports; they seem to be messed up in the current version of this repo.
Change your `import llama` to:
`from llama.tokenization_llama import LLaMATokenizer` and `from llama.modeling_llama import LLaMAForCausalLM`

And then use the classes without the `llama.` prefix:
`tokenizer = LLaMATokenizer.from_pretrained(MODEL)`
and
`model = LLaMAForCausalLM.from_pretrained(MODEL, low_cpu_mem_usage=True)`
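For anyone curious why the workaround above is needed: in Python, a class defined in a submodule is not visible as an attribute of the package unless the package's `__init__.py` re-exports it. Here is a minimal, self-contained sketch of that mechanic using a throwaway package (`fakellama` and `Tokenizer` are stand-in names, not the real llama code):

```python
import os
import sys
import tempfile

# Build a tiny package on disk whose __init__.py deliberately does NOT
# re-export anything from its submodule -- mirroring the broken repo state.
tmp = tempfile.mkdtemp()
pkg_dir = os.path.join(tmp, "fakellama")
os.makedirs(pkg_dir)
open(os.path.join(pkg_dir, "__init__.py"), "w").close()  # empty __init__.py
with open(os.path.join(pkg_dir, "tokenization.py"), "w") as f:
    f.write("class Tokenizer:\n    name = 'ok'\n")

sys.path.insert(0, tmp)

import fakellama  # the import itself succeeds...

try:
    fakellama.Tokenizer  # ...but this mirrors llama.LLaMATokenizer failing
except AttributeError as e:
    print("AttributeError:", e)

# Importing directly from the submodule works, which mirrors
# `from llama.tokenization_llama import LLaMATokenizer`:
from fakellama.tokenization import Tokenizer
print(Tokenizer.name)  # -> ok
```

So `import llama` only fails because the package never pulls `LLaMATokenizer` into its top-level namespace; importing straight from `llama.tokenization_llama` bypasses that.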