kerthcet opened this issue a year ago · 0 comments
Generally, the API would look like:
```python
chat = ChatLLM(
    model_name_or_path="meta-llama/Llama-2-7b-chat-hf",  # required
    task="text-generation",
    adapter=<path/to/adapter>,  # optional
)
```
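For context, a hypothetical usage sketch follows. Only the constructor is described in the proposal above, so the `chat()` method name, its argument, and the return type here are assumptions for illustration, not part of the proposed API.

```python
# Sketch of how the proposed ChatLLM object might be used after construction.
# The method name `chat` and its signature are assumptions.
chat = ChatLLM(
    model_name_or_path="meta-llama/Llama-2-7b-chat-hf",
    task="text-generation",
)

# Assumed single-turn call that returns the generated text.
response = chat.chat("What does the adapter argument do?")
print(response)
```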