ray-project/langchain-ray

possible langchain prompt template error

allanwakes opened this issue · 0 comments

Maybe I am wrong, but on this line (line 72) of the example `open_source_LLM_retrieval_qa/serve.py`:

```python
result = self.chain({"input_documents": search_results, "question": query})
```

shouldn't `input_documents` be `context`? `context` is the placeholder the prompt template uses.
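
To make the mismatch concrete, here is a minimal, self-contained sketch of the pattern I mean. It is my own reconstruction, not the actual code from `serve.py`; the template text and variable names are assumptions.

```python
from langchain.prompts import PromptTemplate

# Illustrative prompt with {context} and {question} placeholders
# (my own reconstruction; the exact template in serve.py may differ).
template = """Use the following pieces of context to answer the question.

{context}

Question: {question}
Helpful answer:"""
prompt = PromptTemplate(template=template, input_variables=["context", "question"])

# The prompt only declares "context" and "question" as inputs:
print(prompt.input_variables)  # ['context', 'question']

# Yet serve.py (line 72) calls the chain with a different key:
#   result = self.chain({"input_documents": search_results, "question": query})
# Hence my question: should "input_documents" be "context" here, or does
# the chain map the documents into the {context} placeholder internally?
```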