Code and documentation to follow along with the workshop.
- Run the notebooks that need a GPU (high compute) on Google Colab or Kaggle Notebooks.
- Create a new issue if you need help with anything (even before or after the workshop). A sample issue has been created for reference.
- We have created several assignments for you to try and submit; we will introduce them over the course of the workshop.
- Your feedback will help us improve the material presented during the workshop. Fill in this form (takes ~4 mins).
- Once you are done with the assignments, create a GitHub repo with all the necessary files, make the repo public, and fill in this form (takes ~4 mins).
- Commercial APIs
- Open-Source LLMs for Inference on GPUs (run on Google Colab or similar services)
- Fine-tuning Open-Source LLMs on GPUs (run on Google Colab or similar services)
- Open-Source LLMs on Local Machines (Laptops/PCs) using Ollama
- LLM Apps using LangChain and Ollama
- LLM Apps with Frontend using LangChain + Ollama + Streamlit