refuel-ai/autolabel

[Bug]: How to set `max_length` and `max_new_tokens`?

Opened this issue · 1 comment

Ubuntu 22.04
Python 3.10
Job: Classification
"provider": "huggingface_pipeline", "tokenizer": "google/gemma-7b",

and I got:

Error generating from LLM: Input length of input_ids is 1658, but `max_length` is set to 20. This can lead to unexpected behavior. You should consider increasing `max_length` or, better yet, setting `max_new_tokens`.
Error generating from LLM: Input length of input_ids is 1646, but `max_length` is set to 20. This can lead to unexpected behavior. You should consider increasing `max_length` or, better yet, setting `max_new_tokens`.
Error generating from LLM: Input length of input_ids is 1645, but `max_length` is set to 20. This can lead to unexpected behavior. You should consider increasing `max_length` or, better yet, setting `max_new_tokens`.
        "params": {"max_new_tokens": 512,
                    "temperature": 0.1,
                  }

add params in the config file like this.
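
For reference, a minimal sketch of where these params sit in a full classification config and how it is passed to the library, assuming the standard autolabel config layout. The task name, labels, prompt text, dataset columns, and file name below are illustrative placeholders, not taken from this issue; only the provider, model name, and the "params" block shown above come from it.

# Sketch of a classification config for the huggingface_pipeline provider.
# "params" under "model" carries the generation settings; "max_new_tokens"
# replaces the 20-token default that triggered the warning above.
from autolabel import LabelingAgent

config = {
    "task_name": "ExampleClassification",   # placeholder task name
    "task_type": "classification",
    "model": {
        "provider": "huggingface_pipeline",
        "name": "google/gemma-7b",           # model referenced in the issue
        "params": {
            "max_new_tokens": 512,
            "temperature": 0.1,
        },
    },
    "dataset": {
        "label_column": "label",             # placeholder column name
        "delimiter": ",",
    },
    "prompt": {
        "task_guidelines": "Classify the input into one of the provided labels.",
        "labels": ["label_a", "label_b"],    # placeholder labels
        "example_template": "Input: {example}\nOutput: {label}",
    },
}

agent = LabelingAgent(config)
# agent.plan("data.csv")   # dry run: inspect prompts before labeling
# agent.run("data.csv")    # label the dataset

With the generation params set this way, the pipeline stops capping output at the default max_length of 20 tokens and the warning above no longer appears.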