lyuchenyang/Macaw-LLM

Using pad_token, but it is not set yet.


Hi, when I run "preprocess_data_supervised.py" with the llama-7b-hf tokenizer, it prints the warnings "Using pad_token, but it is not set yet" and "Truncation was not explicitly activated but max_length is provided a specific value,...".

Is it ok?
[Screenshot attached: 2023-08-17 13:23:59]
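For context, the first warning comes up because the LLaMA tokenizer ships without a pad token defined, so any call that pads falls back with that message. A common workaround is to assign one before tokenizing, typically reusing the EOS token; the second warning goes away if you pass `truncation=True` whenever you pass `max_length`. A minimal sketch (the helper name `ensure_pad_token` is illustrative, not from the repo):

```python
def ensure_pad_token(tokenizer):
    """If the tokenizer has no pad token, reuse its EOS token as padding.

    This is the usual fix for the LLaMA tokenizer's
    "Using pad_token, but it is not set yet" warning.
    """
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token
    return tokenizer

# With Hugging Face transformers this would typically look like:
#   tokenizer = LlamaTokenizer.from_pretrained("llama-7b-hf")
#   ensure_pad_token(tokenizer)
#   enc = tokenizer(texts, padding=True, truncation=True, max_length=512)
# Passing truncation=True alongside max_length also silences the
# "Truncation was not explicitly activated..." warning.
```

Neither warning is fatal on its own, but without a pad token set, batched padding can fail or pad with an unintended token id, so it is safer to set it explicitly.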