Humans treat language as a sign of intelligence: babies who talk early are considered smarter, and since 1950 the same standard has applied to computers too.
Natural languages (the ones humans speak) can be studied as compositions of abstract entities through orthographic, morphological, syntactic, and discursive analysis, which offers an interesting way to analyze them programmatically.
This talk introduces theoretical concepts behind natural language processing, starting with words as basic units and extending to sentences, building up the ideas needed to understand language as relations between vectors.
Examples built with open-source libraries such as NLTK, spaCy, and Rasa will illustrate these concepts, and finally we'll show a simple personal chat assistant.
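As a taste of what the abstract refers to, here is a minimal sketch (not part of the talk materials) using two of the libraries named above: NLTK to split a sentence into word units, and spaCy to compare two words through their vectors. The example sentence and the `en_core_web_md` model are assumptions for illustration; it presumes `pip install nltk spacy` and `python -m spacy download en_core_web_md`.

```python
from nltk.tokenize import TreebankWordTokenizer
import spacy

sentence = "Babies who talk early are considered smarter."

# Words as basic units: tokenize the sentence (no extra NLTK data needed
# for the Treebank tokenizer).
tokens = TreebankWordTokenizer().tokenize(sentence)
print(tokens)

# Language as relations between vectors: the medium English model ships
# with word vectors, so two tokens can be compared by cosine similarity.
nlp = spacy.load("en_core_web_md")
doc = nlp("baby computer")
print(doc[0].similarity(doc[1]))
```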