
Functional Chatbots - DjangoCon Europe 2024

Showcase

README.md contains:

  • Setup instructions
  • Project details (tools, project structure)

PROGRESS.md includes:

  • Task at hand
  • Some context
  • Extra challenges
  • Next steps

Note: Read the PROGRESS.md file every time you switch branches.

Requirements

Tools

  • Docker (the project runs with docker compose)
  • A free GroqCloud API key (for the LLM calls)

Knowledge

This workshop will be easier for you if you're familiar with:

  • Django
  • HTMX
  • OpenAI's API
  • TailwindCSS

I tried to keep things simple, but there's a lot to cover in only 50 minutes.

Getting Started

  1. Clone the repository:
    git clone https://github.com/scriptogre/functional-chatbots.git
    
  2. Rename .env.example to .env
  3. Set GROQ_API_KEY in .env to your GroqCloud API key
  4. Run docker compose up to start the project
  5. Open your browser at http://localhost:8000

Project Structure

Warning: This project is unconventional. Enjoy the ride!

Django-Ninja

Wait, what?! You want to render templates with Django-Ninja?

Why Django-Ninja?

Why not?

  1. It's less verbose, with intuitive syntax inspired by FastAPI.
  2. It's more performant, thanks to Pydantic-powered validation and async support.
  3. It's still Django, so we can benefit from the included batteries when needed.

Besides, it uses Pydantic.

Instructor also uses Pydantic. This will come in handy later.
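To give you an idea of the syntax, here's a rough sketch (not the workshop's actual code) of a Django Ninja view rendering a template - the endpoint path and template name are made up:

# Rough sketch: a Django Ninja operation that renders an HTML fragment.
# The path and template name below are illustrative, not the workshop's.
from django.shortcuts import render
from ninja import NinjaAPI

api = NinjaAPI()


@api.get("/chat-messages")
def chat_messages(request):
    # Operations receive the regular Django request, so normal template
    # rendering works; htmx can then swap the returned fragment into the page.
    return render(request, "chat_messages.html", {"messages": []})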

htmx

We'll use htmx to easily add interactivity to our project, like updating chat messages, or creating/updating/deleting pizza orders - without writing any JavaScript.

Why htmx?

Grug Brain Developer

Grug from The Grug Brained Developer by Carson Gross (creator of htmx). Love the article.

complexity bad

JinjaX

JinjaX Logo

For our templates, we'll use JinjaX, an experimental project that's essentially Jinja2 with JSX-like syntax for components.

Why JinjaX?

Because paired with htmx, we can do stuff like:

<ChatContainer
    hx-get="/chat-messages"
    hx-trigger="chatMessagesUpdated from:body"
>
    <ChatMessage role="user">
        I personally love the simplicity of templates with JinjaX.
    </ChatMessage>
</ChatContainer>

Which is a joy to read and write.

Most importantly, it enables keeping behaviour (hx-* attributes) explicit, while abstracting structure.

I've written a blog post about JinjaX, if you're curious.

JinjaX Blog Snippet

Similar projects include:

TailwindCSS

We'll use TailwindCSS for styling.

Why TailwindCSS?

Because paired with JinjaX, we can do stuff like:

<ChatContainer class="group">
    ...
    <ChatPlaceholder class="group-has-[.chat-message]:hidden" />
</ChatContainer>

Which is very expressive.

We can hide classes that are part of the component, while keeping context-specific classes visible.

By creating custom variants (like hover: or dark:), we can also do stuff like this:

<!-- This shows only when assistant generates responses -->
<ChatMessage class="hidden htmx-request-on-[#trigger-assistant]:block">
   Typing...
</ChatMessage>

CSS is very powerful nowadays.

Smooth transitions, animations, and even conditional display can be achieved with it (e.g. group-has-[.chat-message]:hidden).

TailwindCSS makes it easier to harness that power.

Tools                  Less JavaScript
htmx                   80%
htmx + TailwindCSS     99%

GroqCloud API

Groq

We'll use GroqCloud's free API to interact with Llama 3 70B, an open-source model.

Why GroqCloud?

It's FAST.

Faster than any other LLM API I've used.

It's FREE.

Other services like OpenAI, Anthropic, or Google Gemini charge for API usage. I didn't want you to pay for a workshop.

It's ENOUGH for our needs.

Their free tier offers 30 requests per minute. That's 1 request every 2 seconds.
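If you haven't used it before, talking to Llama 3 on GroqCloud looks just like the OpenAI client. Here's a rough sketch (assuming the groq package and a GROQ_API_KEY in your environment; the model id is only an example):

# Rough sketch: a plain chat completion against GroqCloud.
# Assumes GROQ_API_KEY is set; "llama3-70b-8192" is an example model id.
from groq import Groq

client = Groq()  # picks up GROQ_API_KEY from the environment

response = client.chat.completions.create(
    model="llama3-70b-8192",
    messages=[{"role": "user", "content": "Hi! What's on the pizza menu?"}],
)
print(response.choices[0].message.content)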

Instructor

Instructor Logo

Instructor is a Python library that does the heavy lifting for getting structured responses from LLMs.

Why Instructor?

It supports Groq's API, and it will save us a lot of effort (and boilerplate).
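To give you a taste, here's a rough sketch (not the workshop's actual code) of Instructor pulling a structured pizza order out of Llama 3 on Groq. The PizzaOrder model is made up, and the from_groq helper may differ slightly between Instructor versions:

# Rough sketch: structured output with Instructor + Groq + Pydantic.
# PizzaOrder is an illustrative model; the model id is only an example.
import instructor
from groq import Groq
from pydantic import BaseModel


class PizzaOrder(BaseModel):
    size: str
    toppings: list[str]


client = instructor.from_groq(Groq())

order = client.chat.completions.create(
    model="llama3-70b-8192",
    response_model=PizzaOrder,  # Instructor validates the LLM output against this
    messages=[{"role": "user", "content": "One large pizza with mushrooms, please."}],
)
print(order)  # e.g. PizzaOrder(size='large', toppings=['mushrooms'])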

Make sure to also check out this branch's PROGRESS.md file.

Feedback

This was my first workshop.

I'd love to hear your thoughts. I'd appreciate knowing whether I should pursue this further or stop wasting people's time.

Scan this QR code to provide feedback

...please 👉👈

Feedback Form QR Code