This repository contains implementations from the book "Natural Language Processing with Transformers" by Lewis Tunstall, Leandro von Werra, and Thomas Wolf of Hugging Face.
Each chapter from the book has been meticulously implemented and organized into separate folders for easy navigation and understanding.
Transformers have revolutionized the field of Natural Language Processing (NLP) with their state-of-the-art performance on a wide range of tasks. This repository aims to provide hands-on implementation of the concepts, techniques, and models discussed in the book, serving as a practical guide for both beginners and experienced practitioners in the field.
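For a quick taste of the kind of task the chapters cover, here is a minimal sketch using the high-level `pipeline` API from the `transformers` library (listed in `requirements.txt`). Note that the default sentiment-analysis model is downloaded on first run:

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; a default pretrained model
# is downloaded the first time this runs.
classifier = pipeline("sentiment-analysis")

# Classify a sentence and inspect the predicted label and score.
result = classifier("Transformers have revolutionized NLP!")[0]
print(result["label"], round(result["score"], 3))
```

The same `pipeline` interface covers many of the tasks explored in later chapters, such as summarization, question answering, and text generation.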
- Clone the Repository

```bash
git clone https://github.com/ishandutta0098/huggingface-transformers-book.git
cd huggingface-transformers-book
```
- Install Dependencies
Make sure you have Python 3.x installed, then install the required packages:

```bash
pip install -r requirements.txt
```
- Navigate to a Chapter
Each chapter's implementation is contained within its respective folder. Dive into a chapter:

```bash
cd <chapter_name>
```
```
huggingface-transformers-book
├─ 01-hello-transformers
├─ 02-text-classification [WIP]
├─ 03-transformer-anatomy [TBD]
├─ 04-multilingual-ner [TBD]
├─ 05-text-generation [TBD]
├─ 06-summarization [TBD]
├─ 07-question-answering [TBD]
├─ 08-efficient-transformers [TBD]
├─ 09-dealing-with-few-to-no-labels [TBD]
├─ 10-training-transformers-from-scratch [TBD]
└─ 11-future-directions [TBD]
```