llm_presentation_lux

Code for a presentation featuring zero-shot classification, given live on May 14th.


Zero-shot classification from scratch with llama3

Start the LLM with Docker:

docker-compose up -d
docker exec -it ollama_presentation ollama run llama3:instruct
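
For reference, the compose file behind the `docker-compose up -d` command likely resembles the minimal sketch below; the image, port, and volume names here are assumptions, and the repository's own docker-compose.yml is authoritative:

```yaml
# Hypothetical docker-compose.yml sketch (assumed layout, not the repo's file)
services:
  ollama:
    image: ollama/ollama
    container_name: ollama_presentation  # matches the name in the docker exec command
    ports:
      - "11434:11434"                    # Ollama's default API port
    volumes:
      - ollama_data:/root/.ollama        # persist downloaded models between runs
volumes:
  ollama_data:
```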

Install the Python environment:

Install conda or Mambaforge, then run:

mamba env create -f conda_env.yml

conda activate presentation

and finally run python live_demo.py
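
The actual demo script ships with the repository; as a rough illustration of the technique, zero-shot classification against the local Ollama API might be sketched as below. The label set, prompt wording, and helper names are assumptions for illustration, not the contents of live_demo.py:

```python
import json
import urllib.request

def build_prompt(text, labels):
    """Build a zero-shot prompt: ask the model to pick exactly one
    label from the candidate list, with no training examples."""
    return (
        "Classify the following text into exactly one of these "
        "categories: " + ", ".join(labels) + ".\n"
        "Answer with the category name only.\n\n"
        "Text: " + text
    )

def classify(text, labels, model="llama3:instruct",
             url="http://localhost:11434/api/generate"):
    """Send the prompt to the local Ollama server started above and
    return the model's raw answer."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(text, labels),
        "stream": False,  # request one complete JSON response
    }).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()

# The prompt alone can be inspected without a running server:
print(build_prompt("I loved this movie!", ["positive", "negative", "neutral"]))
```

Calling `classify(...)` requires the ollama_presentation container from the Docker step to be up and the llama3:instruct model pulled.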