Add new recipe for `conversation-watsonx`
jweisz opened this issue · 1 comment
jweisz commented
We should implement a new recipe to demonstrate a conversational LLM experience with TJBot. Now that there are chat-tuned models available (e.g. llama-2-70b-chat and granite-13b-chat-v1), this should work well.
Also, use langchain to drive the conversation! 😄
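Here's a rough sketch of what the conversation loop could look like with LangChain and a watsonx.ai-hosted chat model. It's Python for illustration only (the actual recipe would be built on the TJBot library), and the `langchain_ibm.WatsonxLLM` parameter names, the model id string, and the environment variable names are assumptions to verify against the current watsonx.ai and LangChain docs:

```python
# Minimal sketch of a LangChain-driven conversation with a watsonx.ai chat model.
# Assumes the langchain-ibm integration package and credentials in env vars;
# check the exact parameter names against the current docs before using.
import os

from langchain_ibm import WatsonxLLM              # assumed package: langchain-ibm
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# Chat-tuned model hosted on watsonx.ai (e.g. llama-2-70b-chat)
llm = WatsonxLLM(
    model_id="meta-llama/llama-2-70b-chat",       # model id string is illustrative
    url="https://us-south.ml.cloud.ibm.com",      # region endpoint; adjust as needed
    project_id=os.environ["WATSONX_PROJECT_ID"],  # assumed env var names
    apikey=os.environ["WATSONX_APIKEY"],
    params={"decoding_method": "greedy", "max_new_tokens": 200},
)

# The buffer memory keeps the running transcript so each turn has context
conversation = ConversationChain(llm=llm, memory=ConversationBufferMemory())

# In the real recipe this loop would be fed by TJBot's speech-to-text, and the
# reply spoken via text-to-speech; plain input()/print() stands in here.
while True:
    utterance = input("You: ")
    if utterance.lower() in ("exit", "quit"):
        break
    reply = conversation.predict(input=utterance)
    print("TJBot:", reply)
```

The point of the `ConversationBufferMemory` is that each new turn is grounded in the prior exchange, which is what makes the experience feel conversational rather than one-shot.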
modcarroll commented
The code for this is working; I'm just having trouble testing the hardware, so I'll continue to work on that.