1. Efficiently Train Large Language Models with LoRA and Hugging Face
Details and code for efficient training of large language models using LoRA and Hugging Face.
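
A minimal, hedged sketch of what LoRA training with Hugging Face PEFT looks like; the model name, toy dataset, and hyperparameters below are placeholder assumptions rather than the notebook's exact configuration.

```python
# Minimal LoRA sketch with Hugging Face PEFT. Model, toy dataset, and
# hyperparameters are illustrative assumptions, not the notebook's exact setup.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from peft import LoraConfig, TaskType, get_peft_model

model_name = "bigscience/bloomz-560m"  # assumption: any causal LM works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Wrap the frozen base model with low-rank adapters; only these small matrices train.
lora_config = LoraConfig(task_type=TaskType.CAUSAL_LM, r=16, lora_alpha=32, lora_dropout=0.05)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all parameters

# Toy dataset standing in for a real corpus.
ds = Dataset.from_dict({"text": ["LoRA trains small adapter matrices on top of frozen weights."]})
tokenized = ds.map(lambda b: tokenizer(b["text"], truncation=True, max_length=128),
                   batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-out", per_device_train_batch_size=1,
                           num_train_epochs=1, learning_rate=2e-4),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),  # builds LM labels
)
trainer.train()
model.save_pretrained("lora-out")  # saves only the adapter weights
```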

2. Fine-Tune Your Own Llama 2 Model in a Colab Notebook
Guide to fine-tuning your Llama 2 model using Colab.

3. Guanaco Chatbot Demo with LLaMA-7B Model
Showcase of a chatbot demo powered by the LLaMA-7B model.

4. PEFT Finetune-Bloom-560m-tagger
Project details for fine-tuning BLOOM-560m as a tagger using PEFT.

5. Finetune_Meta_OPT-6-1b_Model_bnb_peft
Details and guide for finetuning the Meta OPT-6-1b model using bitsandbytes and PEFT.

6. Finetune Falcon-7b with BNB Self-Supervised Training
Guide for finetuning Falcon-7b using bitsandbytes (BNB) self-supervised training.

7. Fine-Tune Llama 2 with QLoRA
Guide to fine-tuning the Llama 2 7B pre-trained model using the PEFT library and the QLoRA method.
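
A hedged sketch of the QLoRA recipe (4-bit NF4 base weights plus trainable LoRA adapters), assuming `transformers`, `bitsandbytes`, and `peft`; the gated Llama 2 model ID and the LoRA settings are illustrative assumptions.

```python
# QLoRA sketch: load the base model in 4-bit (NF4) and train LoRA adapters on top.
# Model ID and hyperparameters are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "meta-llama/Llama-2-7b-hf"  # assumption: gated model, requires Hub access

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",              # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16
    bnb_4bit_use_double_quant=True,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config,
                                             device_map="auto")

# Prepare the quantized model for training (gradient checkpointing, norm casting, etc.).
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=64, lora_alpha=16, lora_dropout=0.1,
    target_modules=["q_proj", "v_proj"],  # assumption: attention projections only
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
# From here, train with transformers.Trainer or TRL's SFTTrainer as usual.
```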

8. Stable_Vicuna13B_8bit_in_Colab
Guide to fine-tuning Stable Vicuna 13B in 8-bit.

9. GPT-Neo-X-20B-bnb2bit_training
Guide on how to train the GPT-NeoX-20B model using bfloat16 precision.

10. MPT-Instruct-30B Model Training
MPT-Instruct-30B is a large language model from MosaicML that is trained on a dataset of short-form instructions. It can be used to follow instructions, answer questions, and generate text.

11. RLHF_Training_for_CustomDataset_for_AnyModel
How to train any LLM with RLHF on a custom dataset.
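
A rough sketch of one RLHF optimization step with TRL's `PPOTrainer`, assuming an older TRL API (around 0.7; newer releases restructure it); the base model, prompts, and constant placeholder rewards are assumptions, and a real run would score responses with a trained reward model.

```python
# RLHF sketch with TRL's PPOTrainer (TRL ~0.7 API; newer versions differ).
# The base model, prompts, and constant rewards are placeholder assumptions.
import torch
from transformers import AutoTokenizer
from trl import AutoModelForCausalLMWithValueHead, PPOConfig, PPOTrainer

model_name = "gpt2"  # assumption: small model for demonstration
config = PPOConfig(model_name=model_name, learning_rate=1.41e-5,
                   batch_size=2, mini_batch_size=1)

model = AutoModelForCausalLMWithValueHead.from_pretrained(model_name)
ref_model = AutoModelForCausalLMWithValueHead.from_pretrained(model_name)  # frozen reference
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token

ppo_trainer = PPOTrainer(config, model, ref_model, tokenizer)

prompts = ["Explain RLHF in one sentence.", "Write a friendly greeting."]
query_tensors = [tokenizer.encode(p, return_tensors="pt").squeeze(0) for p in prompts]

# Generate responses, score them (dummy constant rewards stand in for a reward model),
# and take one PPO optimization step.
gen_kwargs = {"max_new_tokens": 20, "do_sample": True, "pad_token_id": tokenizer.eos_token_id}
response_tensors = ppo_trainer.generate(query_tensors, return_prompt=False, **gen_kwargs)
rewards = [torch.tensor(1.0) for _ in response_tensors]  # placeholder scores
stats = ppo_trainer.step(query_tensors, response_tensors, rewards)  # dict of PPO metrics
```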

12. Fine_tuning_Microsoft_Phi_1_5b_on_custom_dataset(dialogstudio)
How to train Microsoft Phi-1.5 with TRL's SFT trainer on a custom dataset (DialogStudio).
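
A hedged sketch of supervised fine-tuning with TRL's `SFTTrainer` (older API where `dataset_text_field` and `max_seq_length` are passed directly; newer releases move them into `SFTConfig`); the one-example toy dataset stands in for DialogStudio data, and the notebook likely adds a PEFT/LoRA config, which is omitted here.

```python
# SFT sketch with TRL's SFTTrainer (TRL ~0.7 API; newer releases differ).
# The toy dataset stands in for a formatted DialogStudio corpus.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

model_id = "microsoft/phi-1_5"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# One formatted prompt/response string per example, in a "text" column.
train_ds = Dataset.from_dict({"text": [
    "### Instruction: Summarize the dialog.\n### Response: The users agreed to meet at noon."
]})

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=train_ds,
    dataset_text_field="text",  # column holding the formatted training text
    max_seq_length=512,
    args=TrainingArguments(output_dir="phi-sft-out", per_device_train_batch_size=1,
                           num_train_epochs=1, learning_rate=2e-4),
)
trainer.train()
```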

13. Finetuning OpenAI GPT-3.5 Turbo
How to finetune GPT-3.5 Turbo on your own data.
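
A short sketch of the fine-tuning flow with the OpenAI Python client (v1.x API); the JSONL path is a placeholder, and each line of the file must contain a chat-formatted `{"messages": [...]}` example.

```python
# Sketch of fine-tuning gpt-3.5-turbo via the OpenAI API (openai-python >= 1.0).
# The JSONL path is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Upload the training file.
training_file = client.files.create(
    file=open("train_data.jsonl", "rb"),
    purpose="fine-tune",
)

# 2. Launch the fine-tuning job.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)

# 3. Poll status; once finished, use the returned fine-tuned model name for inference.
print(client.fine_tuning.jobs.retrieve(job.id).status)
```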

14. Finetuning Mistral-7b using AutoTrain Advanced
How to finetune Mistral-7b using AutoTrain Advanced (autotrain-advanced).

15. RAG LangChain Tutorial
How to use retrieval-augmented generation (RAG) with LangChain.
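
A minimal RAG sketch with LangChain, assuming the classic pre-0.1 import paths (newer releases split these across `langchain-community` and `langchain-openai`); the toy corpus, FAISS store, and OpenAI models are illustrative assumptions.

```python
# Minimal RAG sketch with LangChain (classic pre-0.1 import paths).
# The corpus, vector store, and models are placeholder assumptions.
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA

docs_text = "LoRA freezes the base weights and trains small low-rank adapter matrices."
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_text(docs_text)

# Index the chunks, then let the chain retrieve relevant ones and feed them to the LLM.
vectorstore = FAISS.from_texts(chunks, OpenAIEmbeddings())
qa_chain = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo"),
    retriever=vectorstore.as_retriever(search_kwargs={"k": 2}),
)
print(qa_chain.run("What does LoRA train?"))
```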

16. Knowledge Graph LLM with LangChain PDF Question Answering
How to build a knowledge graph with PDF question answering.

17. Text to Knowledge Graph with OpenAI Functions, Neo4j, and LangChain Agent Question Answering
How to build a knowledge graph from text or PDF documents with PDF question answering.

18. Convert a Document to a Knowledge Graph using LangChain and OpenAI
This notebook helps you understand the easiest way to convert any document into a knowledge graph for your next RAG-based application.
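
A hedged sketch of document-to-graph conversion, assuming LangChain's experimental `LLMGraphTransformer` and a local Neo4j instance; the connection details and sample text are placeholders.

```python
# Sketch of turning a document into a knowledge graph with LangChain's
# LLMGraphTransformer and Neo4j. Connection details and text are placeholders.
from langchain_openai import ChatOpenAI
from langchain_core.documents import Document
from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_community.graphs import Neo4jGraph

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
transformer = LLMGraphTransformer(llm=llm)

docs = [Document(page_content="Marie Curie discovered radium and won two Nobel Prizes.")]
graph_documents = transformer.convert_to_graph_documents(docs)  # extracted nodes + relations

# Persist the extracted nodes and relationships into Neo4j for later querying.
graph = Neo4jGraph(url="bolt://localhost:7687", username="neo4j", password="password")
graph.add_graph_documents(graph_documents)
```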

19. How to train a 1-bit Model with LLMs?
This notebook helps you train a model with 1-bit and 2-bit quantization methods using the HQQ framework.
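
A brief sketch of loading a model with low-bit HQQ quantization; it goes through the `transformers` HQQ integration (`HqqConfig`) as an assumption, whereas the notebook may call the `hqq` library directly, and the model ID and settings are placeholders.

```python
# Sketch of low-bit quantization via the HQQ integration in transformers
# (requires the `hqq` package; model ID and settings are illustrative assumptions).
from transformers import AutoModelForCausalLM, AutoTokenizer, HqqConfig

model_id = "facebook/opt-125m"  # assumption: small model for demonstration
quant_config = HqqConfig(nbits=2, group_size=16)  # 2-bit weights; HQQ also supports 1-bit

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)
# The quantized model can now be used for inference or adapter-based fine-tuning.
```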