3-billion-parameters

There is 1 repository under the 3-billion-parameters topic.

  • ryan-air/Alpaca-3B-Fine-Tuned

    This project provides code and a Colaboratory notebook for fine-tuning a 3B-parameter Alpaca model, an approach originally developed at Stanford University. The model was adapted with LoRA, using HuggingFace's PEFT library, so that fine-tuning runs with fewer computational resources and far fewer trainable parameters (a minimal LoRA/PEFT sketch is given after this listing).

    Language: Jupyter Notebook
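
Below is a minimal sketch of LoRA adaptation with HuggingFace's PEFT library, assuming a generic 3B-parameter causal language model. The checkpoint name, target modules, and hyperparameters are illustrative placeholders and are not taken from the repository above.

```python
# Minimal LoRA setup with HuggingFace PEFT (sketch, not the repository's exact code).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

# Placeholder base checkpoint; the repository may use a different 3B model.
base_model_name = "openlm-research/open_llama_3b"

tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(base_model_name)

# LoRA injects small trainable low-rank matrices into selected projection
# layers, so only a small fraction of the parameters is updated.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                               # rank of the low-rank update
    lora_alpha=16,                     # scaling factor for the update
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumed attention projections
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports how few parameters are trainable
```

The wrapped model can then be passed to a standard `transformers` `Trainer` on an instruction-tuning dataset; only the LoRA adapter weights are saved at the end, which is what keeps the memory and storage footprint small.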