This is a demo platform for building AI-powered user interfaces.
- Install Ollama and pull the model weights for `phi3` and `llama3:instruct`
- Serve Ollama in the background at `http://localhost:11434`
- Install the Ruby dependencies: `bundle install`
- Start the server: `x/serve`
- Open your browser to `https://localhost:9292/`
Note: the first WebSocket handshake will fail because the browser rejects our self-signed SSL certificate. Ignore the warning for now and tell the browser to proceed to the site.
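Condensed, the setup steps above might look like the following, assuming the `ollama` CLI is installed and you are in the repo root:

```shell
# Pull the two models the demo uses
ollama pull phi3
ollama pull llama3:instruct

# Serve Ollama in the background on its default port (11434)
ollama serve &

# Install Ruby dependencies and start the app
bundle install
x/serve
```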
- Ollama: LLM service
- InternLM: Chinese function-calling LLM
- `phi3`: Microsoft LLM
- `llama3:instruct`: Meta LLM
- HTMX: page interactivity without JavaScript
- Web Awesome: UI web components
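Since Ollama is the LLM service behind the UI, a minimal sketch of talking to it from Ruby may help. The endpoint and payload shape follow Ollama's `/api/generate` HTTP API; the helper names here are illustrative and not part of this repo:

```ruby
require "json"
require "net/http"

# Ollama's default local endpoint for one-shot completions.
OLLAMA_URL = URI("http://localhost:11434/api/generate")

# Build the JSON payload Ollama expects; stream: false asks for a
# single JSON response instead of a stream of chunks.
def ollama_payload(model, prompt)
  { model: model, prompt: prompt, stream: false }.to_json
end

# POST the prompt and return the generated text from the
# "response" field of Ollama's reply.
def ollama_generate(model, prompt)
  res = Net::HTTP.post(OLLAMA_URL, ollama_payload(model, prompt),
                       "Content-Type" => "application/json")
  JSON.parse(res.body).fetch("response")
end
```

With the server running, `ollama_generate("phi3", "Say hello")` returns the model's completion as a plain string.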