Variable Gets Reset After Change Event
Closed this issue · 11 comments
Describe the bug
I am creating an app that lets the user switch between different LLMs for responses through a chat interface. A dropdown allows the user to choose which LLM to use, and a variable keeps track of the chosen LLM.
When the user makes a selection via the dropdown, I have verified that this variable is updated correctly. However, when my getModelResponse() function accesses the same variable, it seems to be using the old, unchanged value.
If this is not a bug, and instead a mistake on my part, please let me know what I can do to rectify it. I am brand new to gradio.
Have you searched existing issues?
- I have searched and found no existing issues
Reproduction
A lot of code irrelevant to this issue was deleted, so you might see references to a function that is not defined. Assume that function works properly.
```python
import gradio as gr

supportedLLMs = {"a": 1, "b": 2}
NAME = "a"

def updateName(newName):
    NAME = newName
    print("Changed model to " + NAME)

def getModelResponse(message, history):
    return NAME

Chatbot = gr.ChatInterface(
    getModelResponse,
    chatbot=gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width=False),
    examples=["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"],
)

with gr.Blocks() as interface:
    with gr.Row():
        with gr.Column():
            Chatbot.render()
        with gr.Column():
            with gr.Accordion("Chatbot Configuration", open=True):
                LLMChoice = gr.Dropdown(choices=list(supportedLLMs.keys()), label="Chatbot name", value="Llama7b", interactive=True, info="Cumulative chooses the best (most correct) response from all the LLM responses to a prompt.")
            LLMChoice.change(updateName, inputs=[LLMChoice], outputs=[])

interface.launch(share=True, debug=True)
```
Logs
No errors are thrown
System Info
Gradio Environment Information:
------------------------------
Operating System: Linux
gradio version: 4.29.0
gradio_client version: 0.16.1
Severity
Blocking usage of gradio
Could you modify the reproduction so we can run it locally please? We need to be able to run the reproduction in order to debug the issue.
@pngwn Thanks for the reply. The code is very, very long and I wanted to provide a MWE. Are you sure you want me to modify it?
By the way, I like your profile picture :)
Do you have a way of passing the dropdown value to the predict function? Have you tried using the `additional_inputs` parameter in your original code? If you haven't, that might be why you're seeing the value change but not seeing any effects. You can learn more about this in our Guides here: https://www.gradio.app/guides/creating-a-chatbot-fast#additional-inputs.
@yvrjsharma Thanks for the pointer. Will definitely look into it.
@pngwn Here's the full code. Let me know if you need anything else. I initially didn't share it because I thought a minimal working example would be better.
```python
import gradio as gr

supportedLLMs = {"a": 1, "b": 2}
NAME = "a"

def updateName(newName):
    NAME = newName
    print("Changed model to " + NAME)

def getModelResponse(message, history):
    return NAME

Chatbot = gr.ChatInterface(
    getModelResponse,
    chatbot=gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width=False),
    examples=["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"],
)

with gr.Blocks() as interface:
    with gr.Row():
        with gr.Column():
            Chatbot.render()
        with gr.Column():
            with gr.Accordion("Chatbot Configuration", open=True):
                LLMChoice = gr.Dropdown(choices=list(supportedLLMs.keys()), label="Chatbot name", value="Llama7b", interactive=True, info="Cumulative chooses the best (most correct) response from all the LLM responses to a prompt.")
            LLMChoice.change(updateName, inputs=[LLMChoice], outputs=[])

interface.launch(share=True, debug=True)
```
I meant a minimal reproduction that we can run, not your full code.
Could you simplify the example code please? At the very least, remove any external dependencies other than gradio.
Will close for now, can reopen if we get a minimal repro.
Sorry for the delay. Here is the minimal example, which I've also edited into the previous comments:
```python
import gradio as gr

supportedLLMs = {"a": 1, "b": 2}
NAME = "a"

def updateName(newName):
    NAME = newName
    print("Changed model to " + NAME)

def getModelResponse(message, history):
    return NAME

Chatbot = gr.ChatInterface(
    getModelResponse,
    chatbot=gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width=False),
    examples=["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"],
)

with gr.Blocks() as interface:
    with gr.Row():
        with gr.Column():
            Chatbot.render()
        with gr.Column():
            with gr.Accordion("Chatbot Configuration", open=True):
                LLMChoice = gr.Dropdown(choices=list(supportedLLMs.keys()), label="Chatbot name", value="Llama7b", interactive=True, info="Cumulative chooses the best (most correct) response from all the LLM responses to a prompt.")
            LLMChoice.change(updateName, inputs=[LLMChoice], outputs=[])

interface.launch(share=True, debug=True)
```
You need to pass the dropdown component as `additional_inputs` to your predict function, `getModelResponse`.
A couple of tweaks to your code from above got it working for me:
```python
import gradio as gr

supportedLLMs = {"a": 1, "b": 2}
NAME = "a"

def updateName(newName):
    NAME = newName
    print("Changed model to " + NAME)

def getModelResponse(message, history, llmchoice):
    NAME = llmchoice
    print("Changed model to " + NAME)
    return NAME

chatbot = gr.Chatbot(
    likeable=True,
    show_share_button=True,
    show_copy_button=True,
    bubble_full_width=False,
)

with gr.Blocks() as interface:
    with gr.Row():
        with gr.Column():
            with gr.Accordion("Chatbot Configuration", open=True):
                LLMChoice = gr.Dropdown(choices=list(supportedLLMs.keys()), label="Chatbot name", value="a", interactive=True, info="Cumulative chooses the best (most correct) response from all the LLM responses to a prompt.")
            LLMChoice.change(updateName, inputs=[LLMChoice], outputs=[])
        with gr.Column():
            bot = gr.ChatInterface(
                getModelResponse,
                chatbot=chatbot,
                additional_inputs=[LLMChoice],
                examples=[["What is the capital of France?"], ["What was John Lennon's first album?"], ["Write a rhetorical analysis of Hamlet"]],
            )

interface.launch(share=True, debug=True)
```
Thanks @yvrjsharma!
@yvrjsharma Thanks. This works. I wasn't using `additional_inputs` earlier because I was experiencing some problems with it (someone else had filed a bug report about it, if I'm not mistaken), but it looks like that problem has been fixed.
Hello @yvrjsharma,
Just a quick follow-up here: how do you make the additional-inputs accordion appear in a separate column from the chat interface (i.e. side by side)?