Building a virtual assistant using LLMs takes a bit more work than just sending API calls to OpenAI.
As soon as you start implementing such a system, you quickly realize you need to spend a lot of time building a backbone of infrastructure and data-engineering boilerplate before you can even work on your business logic.
That means you spend too much time on the non-differentiating parts of your product, and not enough on the parts that set it apart.
So the question is:
Is there a faster way to build real-world apps using LLMs?
… and the answer is YES!
Pathway is an open-source framework for high-throughput and low-latency real-time data processing.
Pathway provides the backbone of services and real-time data processing, on top of which you define your business logic in Python. You focus on the business logic, and let Pathway handle the low-level details and the data-engineering boilerplate.
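As a rough illustration of that separation of concerns (plain Python, not Pathway's actual API — the event shape and function names below are hypothetical), the business logic becomes a pure function over incoming events, while the framework owns ingestion and delivery:

```python
# Illustrative sketch only: the event fields and function names are
# hypothetical, not Pathway's actual API.

def answer_event(event: dict) -> str:
    """Business logic: turn an incoming data-warehouse event into a reply."""
    return f"New event from {event['source']}: {event['payload']}"

def run_pipeline(events):
    """Stand-in for the framework's job: ingest events, apply the
    business logic, and deliver results (e.g. to a Discord webhook)."""
    return [answer_event(e) for e in events]

replies = run_pipeline([
    {"source": "warehouse", "payload": "order #1 shipped"},
])
print(replies[0])  # → New event from warehouse: order #1 shipped
```

With Pathway, the `run_pipeline` part — connectors, streaming, low-latency updates — is what the framework supplies, so only the `answer_event`-style logic is yours to write.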
So you build real-world LLMOps apps, faster.
Let’s see how to implement our virtual assistant using Pathway.
1. Install all project dependencies inside an isolated virtual environment, using Python Poetry:

```shell
$ make init
```

2. Create an `.env` file and fill in the necessary credentials. You will need an OpenAI API key and a Discord webhook to receive notifications:

```shell
$ cp .env.example .env
```

3. Run the virtual assistant:

```shell
$ make run
```

4. Send the first request:

```shell
$ make request
```

5. Simulate the push of the first event to the data warehouse:

```shell
$ make push_first_event
```

6. Simulate the push of the second event to the data warehouse:

```shell
$ make push_second_event
```
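To get a sense of what the `push_*_event` targets simulate, an event can be thought of as a small JSON document appended to the warehouse's input stream. The field names below are hypothetical, chosen only to illustrate the shape — the actual schema depends on the project:

```python
import json

# Hypothetical event shape -- the real fields depend on the project's schema.
event = {
    "event_id": 1,
    "table": "orders",
    "action": "insert",
    "data": {"order_id": 42, "status": "shipped"},
}

# Serialize it as a single JSON line, the format streaming connectors
# commonly ingest (one JSON object per line).
line = json.dumps(event)
print(line)
```

When such a line lands in the input, the pipeline picks it up and the assistant's business logic reacts to it in real time.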