Tokenizer

A tokenizer that takes a document as input and splits it into words, sentences, and paragraphs.
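The repository's actual API is not shown here, so the following is only a minimal sketch of how such a tokenizer could work using the Python standard library; the function name `tokenize`, the regular expressions, and the returned dictionary layout are all assumptions, not the project's real interface.

```python
import re

def tokenize(document):
    """Split a document into paragraphs, sentences, and words.

    Hypothetical sketch: paragraphs are separated by blank lines,
    sentences end at '.', '!' or '?', and words are runs of word
    characters (with optional apostrophe contractions).
    """
    # Paragraphs: split on one or more blank lines.
    paragraphs = [p.strip() for p in re.split(r"\n\s*\n", document) if p.strip()]
    # Sentences: split each paragraph after terminal punctuation.
    sentences = [s.strip() for p in paragraphs
                 for s in re.split(r"(?<=[.!?])\s+", p) if s.strip()]
    # Words: word-character runs, keeping simple contractions like "don't".
    words = [w for s in sentences for w in re.findall(r"\w+(?:'\w+)?", s)]
    return {"paragraphs": paragraphs, "sentences": sentences, "words": words}
```

For example, `tokenize("Hello world. How are you?\n\nNew paragraph here.")` yields two paragraphs, three sentences, and eight words.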
