AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys

About version issues encountered in running code!

LitGreenhand opened this issue · 4 comments

I want to ask about the fschat package used when running the baize and vicuna models. When I run it, the program reports:
'from fastchat.conversation import get_default_conv_template, compute_skip_echo_len
ImportError: cannot import name 'get_default_conv_template' from 'fastchat.conversation' '.
So I'd like to ask about the specific version of the fschat package and the corresponding torch version.

Hi @LitGreenhand, thanks for your interest in our work. It seems that these functions were deprecated in v0.2.28 (source); I think older versions should work fine. Please let me know if there are any other issues.
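One way to sidestep the deprecation without pinning an old release is a small compatibility shim. This is a sketch, not the authors' code: it assumes that newer FastChat releases expose `get_conversation_template` in `fastchat.model` as the replacement for the removed `get_default_conv_template`, and it degrades gracefully when fastchat is not installed.

```python
# Compatibility shim for fastchat's conversation-template helper.
# Older fschat releases: fastchat.conversation.get_default_conv_template
# Newer releases (assumption): fastchat.model.get_conversation_template
try:
    # Old API, removed/deprecated around v0.2.28
    from fastchat.conversation import get_default_conv_template
except ImportError:
    try:
        # Newer API location (assumed replacement)
        from fastchat.model import get_conversation_template as get_default_conv_template
    except ImportError:
        # fastchat not installed at all
        get_default_conv_template = None

if get_default_conv_template is None:
    print("fastchat is not installed")
else:
    print("conversation-template helper resolved")
```

Note that `compute_skip_echo_len` has no direct drop-in replacement in newer releases, so code that relies on it is easier to run against the pinned old version.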

Thank you for your reply @AaronHeee. As for the version of fschat, I have tried every release from v0.2.20 to v0.2.32, but the problem persists. Could you tell me the specific version of fschat you used?

Hi @LitGreenhand, my collaborator told me the version we used is 0.2.28. Please try this and let me know if there are any problems. If it works, I will update the README.md file with this exact version to avoid confusion.
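For anyone landing here later, pinning the exact release named in this thread is a one-liner (0.2.28 comes from the maintainer's reply above; the matching torch version was never confirmed in the thread, so it is not pinned here):

```shell
# Install the exact fschat release reported by the authors
pip install "fschat==0.2.28"
```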

As there has been no further activity, we are closing this issue.