Akirato/Lesk-Algorithm

ImportError: cannot import name 'PunktWordTokenizer'

Opened this issue · 1 comment

I am using Python 3.4.0 and nltk 3.2.5, but when I run this code I get the error "ImportError: cannot import name 'PunktWordTokenizer'". I have also downgraded my nltk version to 3.0.2, but it shows the same error.

Try 'sent_tokenize' instead. 'PunktWordTokenizer' is no longer exposed in recent NLTK releases; 'sent_tokenize' is the Punkt-based sentence splitter.