RUCAIBox/HaluEval

How did you determine that the max token length for the truncated message was 2033?

Closed this issue · 2 comments

Whisht commented

Thanks for your work. I am trying to evaluate the hallucination of Vicuna-13B with your code, but the prompt exceeds the maximum context length of 2048 tokens on the summarization task. I found that the max token limit for DaVinci is 2049. After scaling the truncation length down from 2033 to 1600, the evaluation runs fine (for other values I got an error that the message exceeds the maximum context length). That value is far below the hard-coded 2033, even though the two context windows differ by only one token.
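For reference, a quick way to reproduce this check is to tokenize the assembled prompt before sending it. The sketch below uses a Hugging Face tokenizer; the model id, prompt layout, and placeholder strings are assumptions for illustration, not the repo's exact code:

```python
# Minimal sketch (not the repo's code): check whether an assembled prompt
# fits a model's context window before calling the model.
from transformers import AutoTokenizer

# Assumed model id; substitute the checkpoint you are evaluating.
tokenizer = AutoTokenizer.from_pretrained("lmsys/vicuna-13b-v1.5")
CONTEXT_LENGTH = 2048  # Vicuna-13B's window; DaVinci's is 2049

instruction = "..."  # the fixed evaluation instruction
document = "..."     # the document to be summarized

# Truncate the document to the configured budget (2033 by default).
doc_tokens = tokenizer.encode(document, add_special_tokens=False)[:2033]
truncated_document = tokenizer.decode(doc_tokens)

prompt = instruction + "\n" + truncated_document
total = len(tokenizer.encode(prompt))
if total > CONTEXT_LENGTH:
    print(f"prompt is {total} tokens, over the {CONTEXT_LENGTH}-token window")
```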

But I wonder how you determined the max token length, so that I can find a reliable way to set the truncation length myself.

Sorry for the late response. We set the truncated token length according to the token length of the instruction (prompt1 in the code). So to set the truncated token length for a specific model, you can precompute the token length of the instruction with that model's tokenizer.
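A minimal sketch of that precomputation, assuming a Hugging Face tokenizer; the model id, the reserved answer budget, and the variable names are placeholders, with `prompt1` standing for the instruction string in the evaluation script:

```python
# Minimal sketch: derive the truncation budget from the instruction length.
from transformers import AutoTokenizer

CONTEXT_LENGTH = 2048   # the target model's context window
MAX_NEW_TOKENS = 16     # assumed room reserved for the model's answer

tokenizer = AutoTokenizer.from_pretrained("lmsys/vicuna-13b-v1.5")

prompt1 = "..."  # paste the instruction from the evaluation script here
instruction_len = len(tokenizer.encode(prompt1))

# Whatever remains after the instruction and the reserved answer tokens
# is the budget available for the truncated document.
truncation_budget = CONTEXT_LENGTH - instruction_len - MAX_NEW_TOKENS
print(f"instruction: {instruction_len} tokens; "
      f"truncate documents to {truncation_budget}")
```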

Whisht commented

Thanks for your reply, I appreciate it.