This Python script, achainflow.py, consults a number of LLMs about a single query. It interacts with the Groq and Ollama APIs to gather responses from several models and then generates a final answer based on them.
- The script defines functions to interact with the Groq and Ollama APIs for message completion and for prompting the same question to different models.
- It asynchronously consults advisors using different models and aggregates their advice to form a step-by-step plan.
- The final model takes the advice from the other advisors and generates a final answer.
- Users can input a problem statement, which is then processed by a series of advisors to generate a final answer.
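The advisor chain described above can be sketched as follows. This is a minimal, hypothetical illustration: the function names (`consult_advisor`, `chain`) and model names are assumptions, and the real Groq/Ollama API calls are replaced with stubs so the flow of "consult concurrently, then aggregate" stands out.

```python
import asyncio

# Hypothetical sketch of the advisor chain. In achainflow.py each advisor
# would call a Groq- or Ollama-hosted model; stubs stand in for those calls.
async def consult_advisor(model: str, question: str) -> str:
    await asyncio.sleep(0)  # placeholder for the real async API request
    return f"[{model}] advice on: {question}"

async def chain(question: str, models: list[str]) -> str:
    # Query every advisor concurrently and collect their advice.
    advice = await asyncio.gather(
        *(consult_advisor(m, question) for m in models)
    )
    # A final model would turn the combined advice into a step-by-step plan;
    # here we simply join the responses.
    combined = "\n".join(advice)
    return f"Final plan based on:\n{combined}"

plan = asyncio.run(chain("How do I deploy my app?", ["llama3", "mixtral"]))
print(plan)
```

Running the advisors with `asyncio.gather` lets all models respond in parallel rather than one after another, which is what makes the chain practical with several slow LLM calls.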
- Make sure to have Streamlit installed in your Python environment, then launch the app with:
  streamlit run achainflow.py
- Click the "Solve my problems" button to initiate the consultation process.
- The script will display the final answer generated by the chain of advisors.
To ensure the script runs smoothly, create a requirements.txt file with the necessary dependencies based on the code:
streamlit
groq
ollama
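One way to set this up, assuming you are in the project directory, is to write the three dependencies into requirements.txt and then install them with pip:

```shell
# Create requirements.txt with the three dependencies listed above;
# afterwards install them with: pip install -r requirements.txt
printf 'streamlit\ngroq\nollama\n' > requirements.txt
cat requirements.txt
```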
Get started today:
- Click the "Solve my problem" button and input your specific concern.
- Witness the collaborative power of AI as it generates a tailored solution plan.