A web app with a React frontend and an Express backend. It uses node-llama-cpp to run an LLM locally and generate financial predictions based on data from different APIs. MongoDB serves as the data store.
You need to set up the LLM you want to use in the backend. The LLM should be saved as a .gguf file in the backend/models folder.
- Validate the LLM:
npx --no node-llama-cpp chat --model PATH-TO-MODEL-FILE
npx --no node-llama-cpp chat --model c:/projects/finai/backend/models/codellama-13b.Q3_K_M.gguf
- Or do a test request:
curl --location 'localhost:9000/api/llm' \
--header 'Content-Type: application/json' \
--data '{ "messages": "Hello there" }'
- Or call the Express API endpoint localhost:9000/api/llm directly to see the result (a rough sketch of such an endpoint is shown below).
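As a rough illustration of what such an endpoint might look like, the sketch below loads a .gguf model from backend/models with node-llama-cpp and answers POST requests shaped like the curl example above. It assumes node-llama-cpp v3's getLlama / LlamaChatSession API, an ESM setup (for top-level await), and the example model file name; the actual route in this repository may be structured differently.

```typescript
import express from "express";
import path from "path";
import { getLlama, LlamaChatSession } from "node-llama-cpp";

const app = express();
app.use(express.json());

// Load the .gguf model from backend/models (file name is only an example).
const llama = await getLlama();
const model = await llama.loadModel({
  modelPath: path.resolve("models", "codellama-13b.Q3_K_M.gguf"),
});
const context = await model.createContext();
const session = new LlamaChatSession({ contextSequence: context.getSequence() });

// POST /api/llm with a body like { "messages": "Hello there" }.
app.post("/api/llm", async (req, res) => {
  const { messages } = req.body;
  const answer = await session.prompt(messages);
  res.json({ answer });
});

app.listen(9000, () => console.log("LLM API listening on port 9000"));
```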
Some example models and formats are listed in LLMTypeDefinitions.json.
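The exact schema of LLMTypeDefinitions.json is defined by the project itself; purely as a hypothetical illustration, an entry could pair a display name with the expected .gguf file and prompt format:

```typescript
// Hypothetical shape of one entry in LLMTypeDefinitions.json -- the field names
// are illustrative; check the actual file for the real schema.
interface LLMTypeDefinition {
  name: string;   // human-readable model name, e.g. "CodeLlama 13B (Q3_K_M)"
  file: string;   // .gguf file expected under backend/models
  format: string; // chat/prompt format the model was trained with
}

const exampleModel: LLMTypeDefinition = {
  name: "CodeLlama 13B (Q3_K_M)",
  file: "codellama-13b.Q3_K_M.gguf",
  format: "llama",
};
```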
Then you can run the commands below from the FinAI (main) directory to start the project:
npm i
npm start
This installs and starts both the frontend and the backend using the npm tool 'concurrently'. Alternatively, you can run the commands separately in the frontend / backend folders to have them running in separate terminals.
- [✔] React (Vite, TypeScript)
- [✔] Express API
- [✔] Node LlamaCPP
- [✔] TradingView API / Widgets
- [✔] Axios
- [✔] MongoDB
- Free Chat / Text Inputs
- Analysis Prompts - specific prompts for analyzing financial data
- Price / Ticker Inputs - a structured way of serving financial data to the LLM (see the sketch after this list)
- Data Visualisation for Stocks / Crypto - Charts / Diagrams / Tickers with live price updates from the TradingView API
- Auth0 - Auth and User Management
- LLM Fine-Tuning and general Model-related options
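The analysis prompt formats are defined by the project itself; purely as a hedged sketch of how structured price/ticker data could be served to the LLM, the data might be serialized into the prompt along these lines (the types and field names are illustrative, not the project's actual ones):

```typescript
// Illustrative only: one possible way to turn ticker data into an analysis prompt.
interface TickerSnapshot {
  symbol: string;        // e.g. "BTCUSD" or "AAPL"
  price: number;         // latest price from a market-data API
  change24hPct: number;  // 24h percentage change
  volume: number;        // traded volume over the same window
}

function buildAnalysisPrompt(snapshot: TickerSnapshot): string {
  // Serialize the structured data so the LLM receives consistent, parseable context.
  return [
    `Analyze the following market data for ${snapshot.symbol}:`,
    `- Price: ${snapshot.price}`,
    `- 24h change: ${snapshot.change24hPct}%`,
    `- Volume: ${snapshot.volume}`,
    "Give a short assessment of the current trend and notable risks.",
  ].join("\n");
}

// Example usage:
// const prompt = buildAnalysisPrompt({ symbol: "BTCUSD", price: 64321.5, change24hPct: -2.1, volume: 1.8e9 });
```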