whisper

Experiments with speech-to-text using a local OpenAI Whisper v2 model. The goal is to build a decent tool that speeds up LLM prompting and human-computer interaction as a whole via speech commands.
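
At its core this boils down to running local transcription with the openai-whisper Python package. The sketch below shows the general idea under the assumption that audio has already been recorded to a file; the model name and file name are illustrative, not the exact ones used here.

```python
# Minimal transcription sketch using the openai-whisper package
# (pip install -U openai-whisper; ffmpeg must be available on PATH).
# The model name and audio file name are assumptions for illustration.
import whisper

# Load the large-v2 model locally; weights are downloaded on first use.
model = whisper.load_model("large-v2")

# Transcribe a previously recorded clip and print the recognized text,
# which can then be fed into an LLM prompt or mapped to a command.
result = model.transcribe("recording.wav")
print(result["text"])
```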

Disclaimer

I don't know Python at all (my thing is Golang), and this code was almost entirely generated by GPT-4. Expect bugs, glitches, and issues.