Inference on Custom Use Case Without Fine-Tuning
Closed this issue · 1 comment
Mrs-Hudson commented
Hi,
Thanks for sharing this work.
- What's the best way to perform a quick analysis on a custom domain without fine-tuning (zero-shot or few-shot)? How should I prepare my input dataset and prompts?
- Is it possible to perform inference on the model itself, without the retriever components etc.? (My schema is already small.)
- Is it possible to use vLLM for inference?

A notebook showing dataset prep and inference for a custom use case would be super helpful!
lihaoyang-ruc commented
- For a quick analysis of a custom-domain database, the best approach is to use our fine-tuned models for zero-shot inference. We provide a demonstration at https://github.com/RUCKBReasoning/text2sql-demo that shows how to run inference over your own databases; the repository is also intended to help anyone replicating the results reported in our paper.
- Yes. To run without the retriever components, you can modify the lines referenced here to adapt (or bypass) the schema filter for your use case: https://github.com/RUCKBReasoning/text2sql-demo/blob/1f0d7b4a53a863f00434928f770053052b0615a9/text2sql.py#L139C9-L141C90
- Yes, CodeS is built upon StarCoder, which is supported by vLLM.
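As a rough illustration of the first answer, a zero-shot run boils down to serializing your schema plus question into a prompt and handing it to the model. The template below is an illustrative assumption, not the exact format CodeS was trained on; the demo repo's `text2sql.py` defines the real one.

```python
# Hedged sketch: render a custom schema and question into a zero-shot
# text-to-SQL prompt. The template is an assumption for illustration;
# the demo repository defines the actual prompt format CodeS expects.

def schema_to_prompt(db_name: str, tables: dict, question: str) -> str:
    """Build a 'schema + question' prompt string.

    tables: dict mapping table name -> list of (column, SQL type) pairs.
    """
    lines = [f"database: {db_name}"]
    for table, columns in tables.items():
        cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
        lines.append(f"table {table} ({cols})")
    lines.append(f"question: {question}")
    lines.append("sql:")
    return "\n".join(lines)

# Example: a small two-table schema (hypothetical data).
prompt = schema_to_prompt(
    "shop",
    {
        "products": [("id", "INTEGER"), ("name", "TEXT"), ("price", "REAL")],
        "orders": [("id", "INTEGER"), ("product_id", "INTEGER")],
    },
    "What is the average product price?",
)
print(prompt)
```

The resulting string would then be passed to the model's generation call (or to vLLM, as below) to produce the SQL.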
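And for the third answer, since the checkpoint is StarCoder-based, serving it with vLLM's offline API is straightforward. A minimal sketch, assuming the published CodeS checkpoint path (substitute whatever model path you actually use):

```python
# Hedged sketch: batch SQL generation with vLLM for a StarCoder-based
# checkpoint. "seeklhy/codes-7b" is an assumed model path; replace it
# with the checkpoint you actually use.

SAMPLING_KWARGS = {"temperature": 0.0, "max_tokens": 256}  # greedy decoding


def generate_sql(prompts, model_path="seeklhy/codes-7b"):
    # Import inside the function so the sketch can be inspected
    # without vLLM installed.
    from vllm import LLM, SamplingParams

    llm = LLM(model=model_path)
    params = SamplingParams(**SAMPLING_KWARGS)
    outputs = llm.generate(prompts, params)
    # Each RequestOutput holds one or more completions; take the first.
    return [out.outputs[0].text for out in outputs]
```

Calling `generate_sql([prompt1, prompt2, ...])` returns one generated SQL string per prompt.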