Yes, this is possible. This is a very basic example, so right now it just plays a static video. But with a little extra coding you can surface emotions from different models such as Alpaca and drive different face movements from the values they return.
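Here is a minimal sketch of what that could look like, assuming the model returns an emotion tag alongside each reply; the `EMOTION_CLIPS` table and `react_to()` helper are illustrative names, not part of the current code:

```python
# Hypothetical sketch: map an emotion tag returned by the language model
# to a face-movement value and a matching video clip.
# EMOTION_CLIPS and react_to() are illustrative, not part of this repo.

EMOTION_CLIPS = {
    "neutral": {"clip": "idle.mp4",  "mouth_open": 0.1},
    "happy":   {"clip": "happy.mp4", "mouth_open": 0.6},
    "sad":     {"clip": "sad.mp4",   "mouth_open": 0.2},
}

def react_to(emotion: str) -> dict:
    """Pick the face-movement preset for an emotion, falling back to neutral."""
    return EMOTION_CLIPS.get(emotion, EMOTION_CLIPS["neutral"])

# Example: the model said it feels "happy", so switch clip and mouth value.
preset = react_to("happy")
print(preset["clip"], preset["mouth_open"])
```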
- A potato PC (very low hardware requirements)
- Python
- An OpenAI API key if you want to use ChatGPT
- Otherwise, you can connect it to other models for uncensored features
- An ElevenLabs API key so the AI can speak; otherwise you can use pyttsx3 (see the TTS sketch after this list)
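For the text-to-speech fallback, a minimal sketch (not the repo's actual code) could check for an ElevenLabs key and drop back to offline pyttsx3 when it is missing; `speak_with_elevenlabs()` is a hypothetical placeholder:

```python
# Minimal sketch: use ElevenLabs when an API key is configured,
# otherwise fall back to offline pyttsx3 (no API key needed).
import os
import pyttsx3

def speak_with_elevenlabs(text: str) -> None:
    # Hypothetical placeholder: wire up your ElevenLabs client here.
    raise NotImplementedError("add your ElevenLabs call")

def speak(text: str) -> None:
    if os.getenv("ELEVEN_API_KEY"):
        speak_with_elevenlabs(text)
    else:
        engine = pyttsx3.init()   # offline TTS engine
        engine.say(text)
        engine.runAndWait()

speak("Hello! I can talk even without an ElevenLabs key.")
```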
Just clone this repo and provide your OpenAI key (you can link your own local model instead; just edit the code, as sketched below). Install all packages from requirements.txt and run main.py. It is ultra-lightweight, so it is easy to edit, and the code is simple enough for any beginner to understand.
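If the repo talks to OpenAI through the official `openai` Python client, linking a local model is usually just a matter of pointing the client at an OpenAI-compatible server (llama.cpp, text-generation-webui, Ollama and similar all expose one). A rough sketch, with a hypothetical local endpoint and model name:

```python
# Sketch only: point the official OpenAI client at a local
# OpenAI-compatible server instead of api.openai.com.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # hypothetical local endpoint
    api_key="not-needed-locally",         # most local servers ignore the key
)

reply = client.chat.completions.create(
    model="local-alpaca",                 # whatever name your server exposes
    messages=[{"role": "user", "content": "Say hi to chat!"}],
)
print(reply.choices[0].message.content)
```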
And don't forget image generation: it can use a local Stable Diffusion instance as well as DALL-E.
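A hedged sketch of how those two backends could be toggled; it assumes a Stable Diffusion WebUI running locally with its API enabled (AUTOMATIC1111's `/sdapi/v1/txt2img` endpoint) and an `OPENAI_API_KEY` in the environment for DALL-E:

```python
# Illustrative sketch, not the repo's code: generate an image either with a
# local Stable Diffusion WebUI or with DALL-E through the OpenAI API.
import base64
import requests
from openai import OpenAI

def generate_image(prompt: str, use_local_sd: bool = True) -> bytes:
    if use_local_sd:
        # Assumes a Stable Diffusion WebUI started with --api on the default port.
        resp = requests.post(
            "http://127.0.0.1:7860/sdapi/v1/txt2img",
            json={"prompt": prompt, "steps": 20},
            timeout=300,
        )
        return base64.b64decode(resp.json()["images"][0])
    # DALL-E path: reads OPENAI_API_KEY from the environment.
    client = OpenAI()
    result = client.images.generate(
        model="dall-e-3", prompt=prompt, response_format="b64_json"
    )
    return base64.b64decode(result.data[0].b64_json)

with open("avatar_scene.png", "wb") as f:
    f.write(generate_image("anime girl streaming in a cozy room"))
```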