LLM auto-encoder for multiple languages (Rust, Python, and PHP)
Evensi's business model involves a heavy volume of server calls, so a memory-efficient language such as Rust is beneficial.
Makes a structured call to the LLM and returns the result.
command: cargo test tests_call_to_openai -- --nocapture
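Below is a minimal sketch of what the structured call could look like in Rust, using the standard OpenAI chat completions endpoint via reqwest and serde. The struct names, the `call_openai` function, and the model name are assumptions for illustration; only the test module name matches the `tests_call_to_openai` command above, and an `OPENAI_API_KEY` environment variable is assumed to be set.

```rust
// Assumed Cargo.toml dependencies:
// reqwest = { version = "0.12", features = ["blocking", "json"] }
// serde = { version = "1", features = ["derive"] }
use serde::{Deserialize, Serialize};

// Request body for the OpenAI chat completions endpoint.
#[derive(Serialize)]
struct ChatRequest {
    model: String,
    messages: Vec<ChatMessage>,
}

#[derive(Serialize, Deserialize)]
struct ChatMessage {
    role: String,
    content: String,
}

// Only the slice of the response we need.
#[derive(Deserialize)]
struct ChatResponse {
    choices: Vec<Choice>,
}

#[derive(Deserialize)]
struct Choice {
    message: ChatMessage,
}

// Build a structured request, send it, and return the first completion.
fn call_openai(prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
    let api_key = std::env::var("OPENAI_API_KEY")?;
    let body = ChatRequest {
        model: "gpt-4o-mini".to_string(), // assumed model name
        messages: vec![ChatMessage {
            role: "user".to_string(),
            content: prompt.to_string(),
        }],
    };

    let response: ChatResponse = reqwest::blocking::Client::new()
        .post("https://api.openai.com/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .error_for_status()?
        .json()?;

    Ok(response
        .choices
        .first()
        .map(|c| c.message.content.clone())
        .unwrap_or_default())
}

#[cfg(test)]
mod tests_call_to_openai {
    use super::*;

    #[test]
    fn returns_a_completion() {
        // Requires OPENAI_API_KEY; run with:
        // cargo test tests_call_to_openai -- --nocapture
        let answer = call_openai("Reply with the single word: pong").expect("request failed");
        println!("model answered: {answer}");
        assert!(!answer.is_empty());
    }
}
```

Equivalent clients for Python and PHP would follow the same request/response shape; only the HTTP and JSON tooling differs.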