AssertionError: rotary_emb is not installed
bioone opened this issue · 5 comments
I tried the following code:
```python
from transformers import AutoConfig, AutoModelForCausalLM

model_name = 'togethercomputer/evo-1-8k-base'
model_config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
model_config.use_cache = True
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=model_config,
    trust_remote_code=True,
)
```
and it raises the error: `AssertionError: rotary_emb is not installed`.
However, rotary_emb v0.1 is installed, and flash-attention is installed as well.
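When a guarded import fails like this, probing each dependency directly usually reveals the real cause. A minimal diagnostic sketch (the module names are assumed from the errors discussed in this thread; adjust them to your environment):

```python
def probe(mod):
    """Try to import `mod`; return None on success, else the exception."""
    try:
        __import__(mod)
        return None
    except Exception as e:
        return e

# Check each dependency that flash-attention's rotary kernels pull in.
for mod in ("flash_attn", "rotary_emb", "triton"):
    err = probe(mod)
    print(f"{mod}: {'OK' if err is None else f'{type(err).__name__}: {err}'}")
```

Running this prints the underlying exception (e.g. a `ModuleNotFoundError` for a transitive dependency) instead of the bare `AssertionError`.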
Can you share more info / a stack trace? It might be the flash attention version.
I believe the source of this error is this line in the flash-attention codebase. In my case, the import statement encapsulated in the try-except block failed with the following error: `ModuleNotFoundError: No module named 'triton'`. Installing triton fixed the problem.
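This explains why the message is misleading: the optional import is wrapped in a try-except that swallows the real exception, and a later assertion only reports that the wrapper module is missing. A sketch of that pattern (not the exact flash-attention code; `missing_transitive_dep_xyz` stands in for `triton`):

```python
# Import-guard pattern: if ANY import in the block fails -- including a
# transitive dependency -- the whole feature is marked unavailable.
try:
    import missing_transitive_dep_xyz  # stands in for `triton`
    rotary_emb = object()              # stands in for the rotary_emb extension
except ImportError:
    rotary_emb = None

def apply_rotary():
    # The assertion names only the wrapper, hiding the real root cause.
    assert rotary_emb is not None, "rotary_emb is not installed"
```

So even with rotary_emb itself installed, a missing triton inside the guarded block produces `AssertionError: rotary_emb is not installed`.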