gemma2_2b_finetune_jp_tutorial

This repository demonstrates how to fine-tune Google's Gemma 2 2B model to improve its performance on Japanese instruction-following tasks. It serves as a practical guide for developers and researchers who want to adapt large language models to a specific language or domain using current (2024) fine-tuning techniques.
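The description does not spell out the training recipe, but a common 2024 approach for this kind of task is parameter-efficient fine-tuning with LoRA using the Hugging Face `transformers`, `peft`, and `trl` libraries. The sketch below is a minimal, hypothetical example of that approach rather than the notebook's actual code: the checkpoint id `google/gemma-2-2b-it` is the public instruction-tuned Gemma 2 2B model, the dataset path `your-username/japanese-instructions` and its `instruction`/`output` columns are placeholders, the hyperparameters are illustrative, and exact `SFTConfig`/`SFTTrainer` argument names vary somewhat between `trl` versions.

```python
# Minimal LoRA fine-tuning sketch for Gemma 2 2B on Japanese instruction data.
# Assumes: transformers, peft, trl, and datasets installed; a GPU with bf16 support;
# a placeholder dataset with "instruction" and "output" columns.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

model_id = "google/gemma-2-2b-it"  # public instruction-tuned Gemma 2 2B checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="eager",  # eager attention is the recommended setting for Gemma 2
    device_map="auto",
)

# Hypothetical Japanese instruction dataset; replace with the dataset you actually use.
dataset = load_dataset("your-username/japanese-instructions", split="train")

def to_chat_text(example):
    # Render each (instruction, output) pair with Gemma's chat template so the
    # fine-tuning data matches the turn format the model was instruction-tuned on.
    messages = [
        {"role": "user", "content": example["instruction"]},
        {"role": "assistant", "content": example["output"]},
    ]
    return {"text": tokenizer.apply_chat_template(messages, tokenize=False)}

dataset = dataset.map(to_chat_text)

# LoRA freezes the 2B base weights and trains small low-rank adapters on the
# attention projections, which keeps memory use within a single consumer GPU.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

training_args = SFTConfig(
    output_dir="gemma2-2b-ja-lora",
    num_train_epochs=1,
    per_device_train_batch_size=2,
    gradient_accumulation_steps=8,
    learning_rate=2e-4,
    logging_steps=10,
    bf16=True,
    dataset_text_field="text",  # column produced by to_chat_text above
)

trainer = SFTTrainer(
    model=model,
    args=training_args,
    train_dataset=dataset,
    peft_config=peft_config,
)
trainer.train()
trainer.save_model("gemma2-2b-ja-lora")
```

After training, the saved LoRA adapter can either be loaded alongside the base model with `peft` at inference time or merged into the base weights to produce a standalone checkpoint.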

Primary language: Jupyter Notebook. License: MIT.
