Fine-Tuning BERT for E-commerce Text Classification: A Multi-category Approach

In this Kaggle project, we fine-tune BERT (Bidirectional Encoder Representations from Transformers) for multi-category text classification in the context of e-commerce. Our dataset comprises product descriptions from four distinct categories: "Electronics," "Household," "Books," and "Clothing & Accessories."
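The sketch below illustrates one common way to set up this kind of fine-tuning with the Hugging Face Transformers `Trainer` API; it is not the exact code from the notebook. The file name `ecommerce.csv`, the column names `description` and `category`, and the hyperparameters are assumptions for illustration.

```python
# Minimal sketch of fine-tuning BERT for 4-way product-description
# classification. File name, column names, and hyperparameters are
# assumptions, not taken from the repository notebook.
import pandas as pd
import torch
from sklearn.model_selection import train_test_split
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

LABELS = ["Electronics", "Household", "Books", "Clothing & Accessories"]
label2id = {name: i for i, name in enumerate(LABELS)}

# Load product descriptions and map category names to integer labels.
df = pd.read_csv("ecommerce.csv")  # assumed file and column names
texts = df["description"].astype(str).tolist()
labels = [label2id[c] for c in df["category"]]

train_texts, val_texts, train_labels, val_labels = train_test_split(
    texts, labels, test_size=0.2, stratify=labels, random_state=42)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

class ProductDataset(torch.utils.data.Dataset):
    """Tokenizes descriptions on the fly and returns tensors for the Trainer."""
    def __init__(self, texts, labels):
        self.texts, self.labels = texts, labels
    def __len__(self):
        return len(self.texts)
    def __getitem__(self, idx):
        enc = tokenizer(self.texts[idx], truncation=True, max_length=128,
                        padding="max_length", return_tensors="pt")
        item = {k: v.squeeze(0) for k, v in enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

# BERT with a fresh 4-way classification head on top of the [CLS] token.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS),
    id2label={i: name for name, i in label2id.items()},
    label2id=label2id)

training_args = TrainingArguments(
    output_dir="bert-ecommerce",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    learning_rate=2e-5,
    weight_decay=0.01,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=ProductDataset(train_texts, train_labels),
    eval_dataset=ProductDataset(val_texts, val_labels),
)

trainer.train()
print(trainer.evaluate())
```

After training, the fine-tuned model can classify a new product description by tokenizing it and taking the argmax over the four output logits.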
