Word-Representation-and-Text-Classification-with-Neural-Networks
This project is divided into four parts (A to D). The tasks for each part are broken down below.
Part A: Word Embeddings with Word2Vec
- Pre-processing the training corpus
- Creating the corpus vocabulary and preparing the dataset
- Building a skip-gram neural network architecture
- Training the models
- Getting the Word Embeddings
- Exploring and visualizing your word embeddings using t-SNE
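The Part A pipeline above can be sketched end to end with plain NumPy. This is a minimal, illustrative version using a tiny toy corpus and full-softmax training (the toy corpus, window size, and embedding dimension are assumptions for illustration; real runs use the pre-processed training corpus and typically negative sampling for speed):

```python
import numpy as np

# Toy corpus standing in for the real pre-processed training corpus.
corpus = "the quick brown fox jumps over the lazy dog".split()

# Create the corpus vocabulary and index mappings.
vocab = sorted(set(corpus))
word2idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

# Prepare the dataset: (center, context) skip-gram pairs, window size 2.
window = 2
pairs = []
for i in range(len(corpus)):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            pairs.append((word2idx[corpus[i]], word2idx[corpus[j]]))

# Skip-gram architecture: input and output weight matrices.
# W_in holds the word embeddings we keep after training.
dim, lr = 10, 0.05
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, dim))
W_out = rng.normal(scale=0.1, size=(dim, V))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Train with full softmax and cross-entropy loss.
for epoch in range(50):
    for center, context in pairs:
        h = W_in[center]            # hidden layer = center-word embedding
        p = softmax(h @ W_out)      # predicted context distribution
        grad = p.copy()
        grad[context] -= 1.0        # d(loss)/d(logits)
        W_out -= lr * np.outer(h, grad)
        W_in[center] -= lr * (W_out @ grad)

# Get the word embeddings: one row of W_in per vocabulary word.
embeddings = W_in
```

After training, `embeddings` can be reduced to 2-D with t-SNE (e.g. `sklearn.manifold.TSNE`) for visualization.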
Part B: Basic Text Classification
- Developing a neural network classifier that uses one-hot word vectors, then training and evaluating it
- Modifying the model to use a word embedding layer instead of one-hot vectors, learning the embedding values along with the rest of the model
- Adapting the model to load and use pre-trained word embeddings instead, then training and evaluating it
- Improving performance by adding another fully-connected layer to the network
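The key idea behind the Part B progression can be sketched with a NumPy forward pass (sizes and the review indices are assumed toy values): an embedding layer is just a trainable lookup table, equivalent to multiplying a one-hot vector by the embedding matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
V, dim, hidden = 50, 8, 16   # assumed toy vocabulary/layer sizes

# One-hot times the embedding matrix selects a single row of it,
# which is why an embedding lookup replaces one-hot input vectors.
E = rng.normal(size=(V, dim))
idx = 7
one_hot = np.zeros(V)
one_hot[idx] = 1.0
assert np.allclose(one_hot @ E, E[idx])

def relu(x): return np.maximum(0.0, x)
def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))

# Classifier forward pass: average the review's word embeddings,
# then two fully-connected layers (the second dense layer is the
# performance-improvement step) and a sigmoid output.
W1 = rng.normal(scale=0.1, size=(dim, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=(hidden, 1));   b2 = np.zeros(1)

review = np.array([3, 7, 12, 41])        # word indices of one document
x = E[review].mean(axis=0)               # embedding lookup + average
prob = sigmoid(relu(x @ W1 + b1) @ W2 + b2)[0]

# Using pre-trained embeddings amounts to initializing E from saved
# vectors (e.g. the Part A output) instead of random values.
```

Whether `E` is then updated during training (fine-tuned) or kept frozen is a design choice when loading pre-trained vectors.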
Part C: Using LSTMs for Text Classification
- Readying the inputs for the LSTM
- Building the model
- Plotting the training and validation accuracy
- Evaluating the model on the test data
- Extracting the word embeddings
- Visualizing the reviews
- Visualizing the word embeddings
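The input preparation and LSTM forward pass from Part C can be sketched as follows in NumPy (the sizes, the padding scheme, and the sample review are assumptions for illustration; the project uses a framework LSTM layer rather than a hand-rolled cell):

```python
import numpy as np

rng = np.random.default_rng(0)
V, dim, hid, maxlen = 100, 8, 16, 10   # assumed toy sizes

def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))

# Readying the inputs: pad or truncate each review to a fixed length.
def pad(seq, maxlen, pad_idx=0):
    return (seq + [pad_idx] * maxlen)[:maxlen]

review = pad([5, 9, 22, 3], maxlen)

# Embedding table and LSTM parameters (gates stacked as i, f, o, g).
E = rng.normal(scale=0.1, size=(V, dim))
Wx = rng.normal(scale=0.1, size=(dim, 4 * hid))
Wh = rng.normal(scale=0.1, size=(hid, 4 * hid))
b = np.zeros(4 * hid)

# Forward pass of one LSTM layer over the padded review.
h = np.zeros(hid)
c = np.zeros(hid)
for idx in review:
    z = E[idx] @ Wx + h @ Wh + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c = f * c + i * g          # cell state update
    h = o * np.tanh(c)         # hidden state

# The final hidden state feeds a sigmoid classification head.
Wc = rng.normal(scale=0.1, size=(hid, 1))
prob = sigmoid(h @ Wc)[0]
```

The final hidden states `h` (one per review) are also what gets projected with t-SNE when visualizing the reviews, while the rows of `E` are visualized for the word embeddings.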
Part D: A Real Text Classification Task