Feature: Ability to configure custom integrations/APIs other than Home Assistant
It would be really neat if Willow could be configured with custom / user-defined integrations.
For example, I want to send the text that Willow infers from voice directly to a locally run LLM (AI) endpoint for processing, which will return a text response.
At present I can do this by going Willow/WAS/WIS -> Home Assistant -> Home Assistant Local AI / OpenAI custom endpoint integration -> LLM service (and back again). This works pretty well, but it adds complexity and points of failure.
Quite a few LLM servers now provide OpenAI-compatible API endpoints (see also LiteLLM); however, a way to configure a generic API integration would probably be more flexible.
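To illustrate the kind of integration being asked for, here's a minimal sketch of posting inferred speech text to an OpenAI-compatible chat completions endpoint. The URL, model name, and `ask_llm`/`build_payload` helpers are hypothetical, not part of Willow/WAS; the request/response shapes follow the OpenAI chat API that servers like LiteLLM expose.

```python
import json
import urllib.request

# Hypothetical local server URL; LiteLLM and many local LLM servers expose
# this OpenAI-compatible path.
ENDPOINT = "http://localhost:8000/v1/chat/completions"

def build_payload(transcript: str, model: str = "mistral-7b") -> dict:
    """Wrap the text Willow infers from voice in an OpenAI-style chat request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": transcript}],
    }

def ask_llm(transcript: str) -> str:
    """POST the transcript and return the assistant's text reply."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_payload(transcript)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return the reply under choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

A generic API integration in WAS could boil down to exactly this: a configurable URL, a template for the request body, and a JSON path to pull the reply text from, with the reply handed back for TTS.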
Below is an example where I've connected an ESP Box to Willow/WIS/WAS, HA and a large language model (Mistral 7b) running on my home server. It gives me the ability to converse with an AI:
Willow-Mistral.mp4
This is already possible to an extent: in the WAS general settings you can use the REST command endpoint 🙂 In the future we plan to introduce logic that will allow for things like Willow -> WIS -> HA/LLM/REST etc. -> TTS, for example.