Tokenization-in-NLP

Tokenization is the process of splitting text into smaller units called tokens, typically words or subwords. Once two sentences are represented as tokens, we can compare their token sets to measure how similar the sentences are.
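
As a minimal sketch of both ideas, the snippet below tokenizes text with a simple regex-based word tokenizer (a stand-in for a real tokenizer such as NLTK's or spaCy's) and then scores two sentences with Jaccard similarity over their token sets. The names `tokenize` and `jaccard_similarity` are illustrative, not from any particular library.

```python
import re

def tokenize(text):
    # Minimal word-level tokenizer: lowercase, then extract alphanumeric runs.
    return re.findall(r"[a-z0-9]+", text.lower())

def jaccard_similarity(a, b):
    # Jaccard similarity: |intersection| / |union| of the two token sets.
    set_a, set_b = set(tokenize(a)), set(tokenize(b))
    if not set_a and not set_b:
        return 0.0
    return len(set_a & set_b) / len(set_a | set_b)

s1 = "Tokenization splits a sentence into tokens."
s2 = "A tokenizer splits sentences into tokens."

print(tokenize(s1))
# ['tokenization', 'splits', 'a', 'sentence', 'into', 'tokens']
print(jaccard_similarity(s1, s2))
# 0.5  (4 shared tokens out of 8 distinct tokens overall)
```

Set-based Jaccard similarity ignores word order and frequency; it is only one simple way to compare tokenized sentences, but it shows how tokenization turns raw text into units we can compare.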