
awesome-nlp-polish

A curated list of resources dedicated to Natural Language Processing (NLP) in Polish: models, tools, and datasets.


Table of Contents:

  • Polish text datasets
    • Task oriented datasets
    • Raw texts
  • Models and Embeddings
    • Polish Transformer models
    • Other models
  • Language processing tools and libraries
  • Papers, articles, blog posts
  • Contribution

Polish Transformer models

  • Polish RoBERTa - model trained on a corpus consisting of a Polish Wikipedia dump, Polish books and articles, and the Polish Parliamentary Corpus.
  • PoLitBert - Polish RoBERTa model trained on Polish Wikipedia, Polish literature and the OSCAR corpus. The main assumption is that high-quality training text yields a better model.
  • PolBert - Polish BERT model trained with the code provided in Google's BERT GitHub repository; merged into huggingface/transformers (see the loading sketch after this list).
  • Allegro HerBERT - Polish BERT model trained on Polish corpora using only the MLM objective with dynamic whole-word masking.
  • SlavicBERT - multilingual BERT model (BERT, Slavic Cased): 4 languages (Bulgarian, Czech, Polish, Russian), 12 layers, 768 hidden units, 12 heads, 110M parameters, 600 MB. There is also another SlavicBERT model (http://docs.deeppavlov.ai/en/master/features/models/bert.html), but I had problems converting it to PyTorch.
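
Most of the models above are published on the Hugging Face model hub, so they can be loaded with a few lines of the transformers library. Below is a minimal sketch using a fill-mask pipeline; the model id allegro/herbert-base-cased is an assumption here, so substitute the exact identifier from the model card of whichever model you pick.

```python
# Minimal sketch: load a Polish transformer model from the Hugging Face hub
# and run masked-word prediction. The model id below is an assumption;
# check the model card for the exact identifier.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="allegro/herbert-base-cased")

# Build the input with the tokenizer's own mask token, since different
# Polish models may use different mask symbols ([MASK], <mask>, ...).
mask = fill_mask.tokenizer.mask_token
text = f"Warszawa to stolica {mask}."  # "Warsaw is the capital of ..."

for prediction in fill_mask(text, top_k=3):
    print(prediction["token_str"], round(prediction["score"], 3))
```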

Other models

Language processing tools and libraries

Papers, articles, blog posts

Contribution

If you have or know of valuable materials (datasets, models, posts, articles) that are missing here, please feel free to edit this list and submit a pull request. You can also send me a note on LinkedIn or via email: krzysztofsopyla@gmail.com.