
𝑓 Function Calling Demo Application

Demo function calling app for the YouTube video.

Watch the video πŸ‘‡

πŸ”¨ Setting up locally

Create a virtualenv and install the dependencies.

This step is not required if you are running in docker.

make setup

⚑️ Running the application

Make sure you have Ollama installed and running on your machine.

By default, the app uses the mistral-nemo model, but you can also use Llama 3.1 or Llama 3.2.

Download your chosen model before running the application (e.g. `ollama pull mistral-nemo`), and update app.py if you switch models.
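The exact contents of app.py aren't shown here, but the core of any function-calling app is the same: describe your Python functions to the model as JSON tool schemas, then route the tool calls the model returns back to the real functions. A minimal sketch of that dispatch step, with a hypothetical `get_current_weather` tool (the demo's real tools may differ):

```python
import json

# Hypothetical tool -- a real app would call an actual weather API.
def get_current_weather(city: str) -> str:
    return json.dumps({"city": city, "temperature_c": 21})

# JSON schema describing the tool to the model, in the shape
# Ollama's chat API accepts via its `tools` parameter.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Map tool names back to the Python callables.
REGISTRY = {"get_current_weather": get_current_weather}

def dispatch(tool_call: dict) -> str:
    """Route a tool call returned by the model to the matching function."""
    fn = REGISTRY[tool_call["function"]["name"]]
    return fn(**tool_call["function"]["arguments"])

# A tool call shaped like the entries of message["tool_calls"] that
# ollama.chat(model="mistral-nemo", tools=TOOLS, ...) would return:
result = dispatch({"function": {"name": "get_current_weather",
                                "arguments": {"city": "Berlin"}}})
```

In the full loop, the tool's return value is appended to the conversation as a `tool` message and the model is called again to produce the final answer.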

Running locally

make run

Running in a container

make run-docker
⚠️ Does not work on Linux 🐧

The application running inside the container uses the special DNS name host.docker.internal to reach Ollama on the host machine.

However, this DNS name does not resolve on Linux.
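One common workaround is to make the Ollama endpoint configurable instead of hard-coding host.docker.internal. OLLAMA_HOST is the environment variable Ollama's own tooling honors; whether this demo's app.py reads it is an assumption, but the pattern looks like this:

```python
import os

def ollama_base_url() -> str:
    """Pick the Ollama endpoint.

    Inside Docker on macOS/Windows, host.docker.internal resolves to the
    host machine. On Linux it does not, so allow an env var override --
    e.g. run the container with --network=host and
    OLLAMA_HOST=http://localhost:11434.
    """
    return os.environ.get("OLLAMA_HOST", "http://host.docker.internal:11434")

url = ollama_base_url()
```

With this in place, the same image works on Linux by passing the override at `docker run` time.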

✨ Linters and Formatters

Check for linting rule violations:

make check

Auto-fix linting violations:

make fix

πŸ€Έβ€β™€οΈ Getting Help

make

# OR

make help