deeppavlov/ru_sentence_tokenizer
A simple and fast rule-based sentence segmenter for Russian. Tested on the OpenCorpora and SynTagRus datasets.
Python · Apache-2.0
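The page itself does not show the library's API, but the issues below (abbreviation handling, initials such as "Э. Сноуден", dates like "2019 г.") hint at what rule-based segmentation has to deal with. As a hedged illustration only, here is a minimal regex-based sketch of the general technique; the function name `naive_sent_tokenize` and the abbreviation shortlist are hypothetical and are not this project's actual implementation:

```python
import re

# Hypothetical shortlist of Russian abbreviations after which a period
# should NOT end a sentence (e.g. "г." for "год"). Illustrative only.
ABBREVIATIONS = {"т", "д", "п", "г", "см", "др"}

def naive_sent_tokenize(text: str) -> list[str]:
    """Naive rule-based sentence splitter (sketch, not the library's API)."""
    # Candidate boundaries: ., !, or ? followed by whitespace and an
    # uppercase (Cyrillic or Latin) letter.
    parts = re.split(r'(?<=[.!?])\s+(?=[А-ЯA-Z])', text)
    sentences: list[str] = []
    for part in parts:
        prev = sentences[-1] if sentences else None
        if prev is not None and prev.split():
            last_token = prev.rstrip('.').split()[-1]
            # Merge back if the previous chunk ends with a known
            # abbreviation or a one-letter initial (e.g. "Э." in "Э. Сноуден").
            if last_token.lower() in ABBREVIATIONS or (
                len(last_token) == 1 and last_token.isalpha()
            ):
                sentences[-1] = prev + ' ' + part
                continue
        sentences.append(part)
    return sentences

print(naive_sent_tokenize(
    'Э. Сноуден написал книгу. Она вышла в 2019 г. Многие её прочли.'
))
```

Note how the abbreviation rule cuts both ways: protecting "г." also glues a genuine sentence boundary after a date, which is precisely the kind of corner case the issues below report.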
Issues

- #7 UnitTests: using `__init__.py` in `tests` causes a `tests` package to appear once the project is installed (opened by nicolay-r, 0 comments)
- #6 Example from E. Snowden's book (opened by AlexeySlvv, 0 comments)
- #5 Segmenting issue after language switching (opened by artemovae, 0 comments)
- #4 Splitting after dates (opened by kategerasimenko, 1 comment)
- #3 Splitting issue (opened by StanislavGafarov)