batch-prompting

[EMNLP 2023 Industry Track] A simple prompting approach that enables LLMs to run inference in batches.
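The idea can be sketched as follows: instead of sending one question per API call, several questions are packed into a single prompt with explicit indices, and the indexed answers are parsed back out of the completion. This is a minimal illustrative sketch, not the repository's actual implementation; the function names and the `Q[i]`/`A[i]` formatting convention here are assumptions.

```python
import re


def build_batch_prompt(exemplars, questions):
    """Combine few-shot exemplars and a batch of questions into one prompt.

    `exemplars` is a list of (question, answer) pairs; `questions` is the
    batch to be answered. Each item is indexed so answers can be matched
    back to their questions. (Illustrative format, not the repo's exact one.)
    """
    lines = []
    # Demonstration block: indexed questions followed by indexed answers.
    for i, (q, _) in enumerate(exemplars, 1):
        lines.append(f"Q[{i}]: {q}")
    for i, (_, a) in enumerate(exemplars, 1):
        lines.append(f"A[{i}]: {a}")
    # The batch of new questions; the model is expected to continue
    # with A[1], A[2], ... in the same format.
    for j, q in enumerate(questions, 1):
        lines.append(f"Q[{j}]: {q}")
    return "\n".join(lines)


def parse_batch_answers(completion, n):
    """Extract the n indexed answers A[1]..A[n] from a model completion.

    Missing indices come back as None rather than raising.
    """
    answers = {}
    for m in re.finditer(r"A\[(\d+)\]:\s*(.*)", completion):
        answers[int(m.group(1))] = m.group(2).strip()
    return [answers.get(i) for i in range(1, n + 1)]
```

For example, `build_batch_prompt([("What is 2+2?", "4")], ["What is 3+3?", "What is 5+1?"])` produces one prompt covering both new questions, and a completion such as `"A[1]: 6\nA[2]: 6"` parses back into `["6", "6"]`, amortizing one LLM call over the whole batch.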

Primary language: Python
