BAMChatLLM doesn't correctly deserialize if it was originally created via BAMChatLLM.fromPreset
Closed this issue · 3 comments
Describe the bug
When the BAMChatLLM class serializes itself, the createSnapshot function simply does a shallow copy of the config.
This is problematic if the BAMChatLLM instance was created via BAMChatLLM.fromPreset, because the messagesToPrompt function in the loaded config references a local variable defined inside a function in the BAMChatLLMPreset class, and that closure is lost when the function is serialized.
I assume the same issue exists with the WatsonXChatLLM class.
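To illustrate the underlying problem, here is a minimal sketch (with hypothetical names, not the framework's actual code) of how a function that closes over a local variable breaks when it is snapshotted by source and re-evaluated, which is the class of failure described above:

```typescript
// Hypothetical sketch: a preset factory captures a local `template`
// in the closure of the function it returns.
function makePreset() {
  const template = "User: {{input}}\nAssistant:";
  const messagesToPrompt = (input: string): string =>
    template.replace("{{input}}", input);
  return { messagesToPrompt };
}

const preset = makePreset();
console.log(preset.messagesToPrompt("Hi")); // works: the closure is intact

// A naive snapshot that stores the function as source code cannot
// capture the closed-over `template` ...
const source = preset.messagesToPrompt.toString();
// ... so re-evaluating that source later produces a function whose
// scope no longer contains `template`:
const revived = eval(`(${source})`) as (input: string) => string;

try {
  revived("Hi");
} catch (e) {
  console.log((e as Error).constructor.name); // ReferenceError ("template is not defined")
}
```

Any serializer that shallow-copies a config holding such a function hits the same wall: the function's closed-over variables are not part of the copied data.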
To Reproduce
Steps to reproduce the behavior:
- Use the following TypeScript file (it is the example examples/agent/bee_reusable.ts modified to work with BAM):
```ts
import "dotenv/config.js";
import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch";
import { UnconstrainedMemory } from "bee-agent-framework/memory/unconstrainedMemory";
import { BAMChatLLM } from "bee-agent-framework/adapters/bam/chat";

const llm = BAMChatLLM.fromPreset("meta-llama/llama-3-8b-instruct");

// We create an agent
let agent = new BeeAgent({
  llm: llm,
  tools: [new DuckDuckGoSearchTool()],
  memory: new UnconstrainedMemory(),
});

// We ask the agent
let prompt = "Who is the president of USA?";
console.info(prompt);
const response = await agent.run({
  prompt,
});
console.info(response.result.text);

// We can save (serialize) the agent
const json = agent.serialize();

// We reinitialize the agent to the exact state he was
agent = BeeAgent.fromSerialized(json);

// We continue in our conversation
prompt = "When was he born?";
console.info(prompt);
const response2 = await agent.run({
  prompt,
});
console.info(response2.result.text);
```
- Set the GENAI_API_KEY environment variable to your BAM API key
- Run: yarn start
- See the error "template is not defined"
Expected behavior
The example should run without errors: the deserialized agent should continue the conversation.
Set-up:
- Bee version: v0.0.27
- Model provider: BAM
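One common way to make such a snapshot safe is to store serializable data (e.g. the preset identifier) instead of the closure-bound function, and rebuild the function on deserialization. This is only a sketch of that general approach with hypothetical names; it is not necessarily what the actual fix in the framework does:

```typescript
// Hypothetical preset registry keyed by model name.
type Preset = { template: string };

const PRESETS: Record<string, Preset> = {
  "meta-llama/llama-3-8b-instruct": { template: "User: {{input}}\nAssistant:" },
};

// Snapshot only serializable data, never the function itself.
function snapshot(presetName: string): string {
  return JSON.stringify({ presetName });
}

// On load, re-resolve the preset and rebuild the function,
// so its closure is valid again.
function fromSnapshot(json: string) {
  const { presetName } = JSON.parse(json) as { presetName: string };
  const { template } = PRESETS[presetName];
  return {
    messagesToPrompt: (input: string) => template.replace("{{input}}", input),
  };
}

const revived = fromSnapshot(snapshot("meta-llama/llama-3-8b-instruct"));
console.log(revived.messagesToPrompt("Hi")); // "User: Hi\nAssistant:"
```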
Thanks for the report. The issue has been fixed by d7584ef.
The fix will be released in 0.0.28 (expect a release later today or tomorrow).
Thanks, I just pulled from main and the problem is fixed.
Great. Released in v0.0.28.