Long-Text-Classification-DistilBERT

This project uses a pre-trained transformer network (DistilBERT) to predict missing labels for long text documents. A classifier head is trained on top of DistilBERT's bidirectional word representations; after training, the model predicts a document's missing label given its textual features as input.
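As a rough illustration of the "classifier head" idea, the sketch below shows a minimal PyTorch head that maps DistilBERT's 768-dimensional pooled representation (the first token's hidden state) to label logits. This is an assumed, generic design, not necessarily the exact head used in this repository; the class name, dropout rate, and the two-label setup are placeholders.

```python
import torch
import torch.nn as nn

class ClassifierHead(nn.Module):
    """Maps a DistilBERT pooled representation (hidden size 768) to label logits.

    Assumption: the encoder's first-token hidden state is used as the
    document representation, as is common for BERT-style classifiers.
    """
    def __init__(self, hidden_size: int = 768, num_labels: int = 2, dropout: float = 0.2):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(hidden_size, num_labels)

    def forward(self, pooled: torch.Tensor) -> torch.Tensor:
        # pooled: (batch_size, hidden_size) — one vector per document
        return self.fc(self.dropout(pooled))

head = ClassifierHead()
logits = head(torch.randn(4, 768))  # a batch of 4 document representations
print(logits.shape)  # torch.Size([4, 2])
```

In practice the head would be attached to a pre-trained DistilBERT encoder (e.g. via the `transformers` library) and the two would be fine-tuned jointly on the labeled portion of the data.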

Primary language: Python