markat1's Repositories
markat1/MechAInic
markat1/ai-makers-midterm-project
markat1/ai-ml-for-data-professionals
markat1/applied-ml-demand-forecasting
markat1/applied-ml-ecommerce-analytics
markat1/applied-ml-speech-to-text
markat1/applied-ml-uber-eta-prediction
markat1/chainlit
Build Conversational AI in minutes ⚡️
markat1/chunking_evaluation
This package, developed as part of our research detailed in the Chroma Technical Report, provides tools for text chunking and evaluation. It allows users to compare different chunking methods and includes implementations of several novel chunking strategies.
markat1/creating-rag-with-llama-index
markat1/evaluation-of-rag-using-ragas
markat1/fine-tuning-workshop-scott-maven
markat1/data-science-handbook
A reference repo covering the core topics of data science.
markat1/Dtdl-CustomerCare
Customer care assistant built with LLM agents to help customers with general queries or questions about their past orders.
markat1/green-screen-creator
Track an object in a video and add a green screen to the background.
markat1/Interactive-Dev-Environment-for-LLM-Development
Set up your local dev environment the way professional LLMOps practitioners do.
markat1/jan
Jan is an open-source alternative to ChatGPT that runs 100% offline on your computer, with support for multiple inference engines (llama.cpp, TensorRT-LLM).
markat1/kernel-memory
RAG architecture: index and query any data using LLMs and natural language, track sources, show citations, and use asynchronous memory patterns.
markat1/langchain-powered-rag
markat1/langChain_with_os_llm_and_LangSmith
markat1/langgraph-langsmith
markat1/llama.cpp
LLM inference in C/C++
markat1/ml-projects-maven-course
markat1/mlops
markat1/Practical-MLOps-assignment-3-model-deployment
markat1/PrepVector_ML_Demand_Forecasting
markat1/PrepVector_ML_group1
markat1/Raygun.Aspire.Hosting.Ollama
An Aspire component leveraging the Ollama container with support for downloading a model on startup
markat1/Streaming-LLM-Chat
Interactive chat application leveraging OpenAI's GPT-4 for real-time conversation simulations. Built with Flask, this project showcases streaming LLM responses in a user-friendly web interface.
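The entry above names a concrete pattern (streaming GPT-4 responses from a Flask backend), so a minimal sketch of that pattern follows. This is an illustration rather than the repository's actual code; it assumes the official openai Python SDK (v1+), and the /chat route and request payload shape are made up for the example.

```python
# Minimal sketch: stream OpenAI chat completions through a Flask route.
# Assumes OPENAI_API_KEY is set in the environment.
from flask import Flask, Response, request
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()

@app.route("/chat", methods=["POST"])
def chat():
    user_message = request.json["message"]  # illustrative payload shape

    def generate():
        # stream=True yields partial tokens as they arrive
        stream = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": user_message}],
            stream=True,
        )
        for chunk in stream:
            if chunk.choices and chunk.choices[0].delta.content:
                yield chunk.choices[0].delta.content

    # Flask sends the generator output to the browser chunk by chunk
    return Response(generate(), mimetype="text/plain")

if __name__ == "__main__":
    app.run(debug=True)
```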
markat1/systematically-improving-rag