OSU-NLP-Group/TravelPlanner

Code/Instructions incomplete??

lihkinVerma opened this issue · 4 comments

Hi Team, I am trying to run sole-planning with the mistral-7B model, but realized the code expects an API to be up and running locally at "http://localhost:8301/v1".
What are the steps to run those local APIs for the Mistral, Mixtral, and ChatGLM3 models?
Please mention them here or in the README file.
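(For anyone hitting the same issue: the endpoint above is OpenAI-compatible, so a client request looks roughly like the sketch below. The model name is an assumption, and the port should match whatever your local server uses.)

```python
import json

# Assumed local endpoint from this thread; adjust the port to your setup.
BASE_URL = "http://localhost:8301/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completion payload for a locally served model."""
    return {
        "model": model,  # must match the name the local server registered
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,
    }

# The payload would be POSTed to f"{BASE_URL}/chat/completions".
payload = build_chat_request("mistral-7b", "Plan a 3-day trip within budget.")
print(json.dumps(payload, indent=2))
```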

Hi Nikhil,

Thank you for your interest in our work.

We use fastchat to deploy our local models, and you can see the details here.
Please feel free to adjust the local port settings in the code to suit your specific setup.
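For reference, a minimal FastChat deployment sketch (the model path and port here are assumptions chosen to match this thread; see the FastChat documentation for the full options):

```shell
# 1. Start the controller that coordinates workers.
python3 -m fastchat.serve.controller

# 2. In a second terminal, start a worker hosting the model weights.
python3 -m fastchat.serve.model_worker --model-path mistralai/Mistral-7B-Instruct-v0.2

# 3. In a third terminal, expose an OpenAI-compatible API on the port the code expects.
python3 -m fastchat.serve.openai_api_server --host localhost --port 8301
```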

I hope this can help you. Feel free to contact us if you have further questions.

Best,
Jian

Thank you, that was helpful and I was able to run the Mistral model (Open-Orca/Mistral-7B-OpenOrca). Did you use the same Mistral model or not?

Is this detail mentioned in the paper or anywhere else on the site or GitHub? The only line I could find in the paper is in Section 4.1: "For all these models, we adopt the official instruction formats whenever available."

Thanks again for the help.

Hi,

Sorry to say that we didn't use this model; we used the official model provided by Mistral. In fact, all the models mentioned in the paper were evaluated using their official parametric weights, given the abundance of unofficial versions. That said, it's totally fine if you are interested in these models; we are also glad to see more results on TravelPlanner.

Hope this can help you.

Best,
Jian

Thanks again for confirming the exact model repo, since the official Mistral releases include both v0.1 and v0.2 (the one you used).