langfuse/langfuse-python

[Langchain Integration] Support HuggingFaceHub as LLM

Closed this issue · 1 comments

A user got the following error when using the Langchain integration with HuggingFaceHub as the LLM:

ERROR:root:'model_name'
ERROR:root:run not found

Steps

  • Investigate the issue to find the root cause
  • Implement a fix
  • Add additional tests to the Langchain test suite
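A plausible root cause (an assumption until the handler code is traced): the callback handler seems to read a `model_name` key from the serialized LLM, while `HuggingFaceHub` serializes its model under `repo_id` instead. The lookup then fails with `'model_name'`, the trace is never created, and the subsequent `run not found` follows. A minimal sketch of a defensive lookup; the function name and the key list are hypothetical, not the handler's actual internals:

```python
def extract_model_name(serialized: dict) -> str:
    """Return the best available model identifier from a serialized LLM.

    Tries `model_name` first (OpenAI-style LLMs), then falls back to keys
    used by other providers such as HuggingFaceHub's `repo_id`.
    """
    # Langchain typically nests constructor args under "kwargs"; fall back
    # to the top-level dict if that shape differs.
    kwargs = serialized.get("kwargs", serialized)
    for key in ("model_name", "model", "repo_id", "model_id"):
        if key in kwargs:
            return kwargs[key]
    return "unknown"
```

With a fallback like this, a `HuggingFaceHub` LLM would resolve to its `repo_id` (e.g. `google/flan-t5-xxl`) instead of raising.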

The user's implementation:


import os

from dotenv import load_dotenv
from langchain.chains import LLMChain
from langchain.llms import HuggingFaceHub
from langchain.prompts import PromptTemplate
from langfuse.callback import CallbackHandler


def initialize_huggingface_llm(prompt: PromptTemplate, temperature: float, max_length: int) -> LLMChain:
    repo_id = "google/flan-t5-xxl"

    # Experiment with the max_length parameter and temperature
    llm = HuggingFaceHub(
        repo_id=repo_id, model_kwargs={"temperature": temperature, "max_length": max_length}
    )
    return LLMChain(prompt=prompt, llm=llm)

def generate_prompt() -> PromptTemplate:
    # You can play around with the prompt, see the results change if you make small changes to the prompt
    template = """Given the name of the country, give the languages that are spoken in that country. 
    Start with the official languages of the country and continue with the other languages of that country.
    Country: {country}?
    Languages: 
    """

    return PromptTemplate(template=template, input_variables=["country"])


if __name__ == '__main__':
    load_dotenv()

    handler = CallbackHandler(os.getenv('LANGFUSE_PUBLIC_KEY'),
                              os.getenv('LANGFUSE_SECRET_KEY'),
                              os.getenv('LANGFUSE_HOST'))

    # Try other values to see impact on results
    country = "belgium"
    country_max_length = 100
    country_temperature = 0.1

    country_prompt = generate_prompt()

    hugging_chain = initialize_huggingface_llm(prompt=country_prompt,
                                               temperature=country_temperature,
                                               max_length=country_max_length)
    
    print("HuggingFace")
    print(hugging_chain.run(country, callbacks=[handler]))