This project is based on the "Disaster Response Pipeline" project from Udacity. The goal is to analyze disaster data from Figure Eight and build a model for an API that classifies disaster messages.
The project has three main parts:
- Build the ETL pipeline (in the `data` folder)
- Build the machine learning pipeline (in the `models` folder)
- Build the Flask web application (in the `app` folder)
- Run the following commands in the project's root directory to set up the database and model:
    - To run the ETL pipeline, which cleans the data and stores it in a database (a sketch of the cleaning steps follows this list):

        `python data/process_data.py data/disaster_messages.csv data/disaster_categories.csv data/DisasterResponse.db`

    - To run the ML pipeline, which trains the classifier and saves it (also sketched below):

        `python models/train_classifier.py data/DisasterResponse.db models/classifier.pkl`
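A minimal sketch of the ETL steps in `data/process_data.py`, assuming the standard Figure Eight column layout (`id`, `message`, and a semicolon-separated `categories` string) and a `DisasterResponse` table name; the actual cleaning rules live in the script itself:

```python
# Sketch of the ETL steps in data/process_data.py (illustrative, not the exact script).
import sys

import pandas as pd
from sqlalchemy import create_engine


def main(messages_csv, categories_csv, db_path):
    # Load and merge the two CSV files on the shared `id` column.
    messages = pd.read_csv(messages_csv)
    categories = pd.read_csv(categories_csv)
    df = messages.merge(categories, on="id")

    # Split the single `categories` string ("related-1;request-0;...")
    # into one binary column per category.
    categories = df["categories"].str.split(";", expand=True)
    categories.columns = [c.split("-")[0] for c in categories.iloc[0]]
    for col in categories.columns:
        categories[col] = categories[col].str[-1].astype(int)

    # Replace the raw column, drop duplicates, and persist to SQLite.
    df = pd.concat([df.drop(columns="categories"), categories], axis=1)
    df = df.drop_duplicates()
    engine = create_engine(f"sqlite:///{db_path}")
    df.to_sql("DisasterResponse", engine, index=False, if_exists="replace")


if __name__ == "__main__":
    main(*sys.argv[1:4])
```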
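And a minimal sketch of `models/train_classifier.py`; the pipeline shown here (TF-IDF features feeding a multi-output random forest) is an assumption about the model, and the `DisasterResponse` table name matches the ETL sketch above:

```python
# Sketch of models/train_classifier.py (illustrative; the real script also
# runs a train/test split and prints evaluation metrics).
import pickle
import sys

import pandas as pd
from sqlalchemy import create_engine
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multioutput import MultiOutputClassifier
from sklearn.pipeline import Pipeline


def main(db_path, model_path):
    # Load the cleaned table written by the ETL step.
    engine = create_engine(f"sqlite:///{db_path}")
    df = pd.read_sql_table("DisasterResponse", engine)
    X = df["message"]
    # Assumes the category columns start after id, message, original, genre.
    Y = df.iloc[:, 4:]

    # Text features feed a multi-output classifier: one binary
    # prediction per disaster category.
    pipeline = Pipeline([
        ("tfidf", TfidfVectorizer()),
        ("clf", MultiOutputClassifier(RandomForestClassifier())),
    ])
    pipeline.fit(X, Y)

    with open(model_path, "wb") as f:
        pickle.dump(pipeline, f)


if __name__ == "__main__":
    main(*sys.argv[1:3])
```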
- Run the following command in the `app` directory to start the web app (a sketch of `run.py` follows):

    `python run.py`
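A minimal sketch of the Flask app in `app/run.py`, assuming the relative paths and `template` folder shown in the file tree below, the `DisasterResponse` table name, and that category columns start at index 4; the real file also builds the visuals for `master.html`:

```python
# Sketch of app/run.py (illustrative; the real file also builds the overview charts).
import pickle

import pandas as pd
from flask import Flask, render_template, request
from sqlalchemy import create_engine

# Match the `template` folder name from the file tree below.
app = Flask(__name__, template_folder="template")

# Load the cleaned data and the trained model once at startup.
engine = create_engine("sqlite:///../data/DisasterResponse.db")
df = pd.read_sql_table("DisasterResponse", engine)
with open("../models/classifier.pkl", "rb") as f:
    model = pickle.load(f)


@app.route("/")
@app.route("/index")
def index():
    # master.html renders the search form and overview visuals.
    return render_template("master.html")


@app.route("/go")
def go():
    # Classify the user's message and show one label per category.
    query = request.args.get("query", "")
    labels = model.predict([query])[0]
    results = dict(zip(df.columns[4:], labels))
    return render_template("go.html", query=query, classification_result=results)


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=3001, debug=True)
```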
- Go to http://0.0.0.0:3001/
Project structure:

```
- app
| - template
| |- master.html  # main page of web app
| |- go.html  # classification result page of web app
|- run.py  # Flask file that runs the app

- data
|- disaster_categories.csv  # data to process
|- disaster_messages.csv  # data to process
|- process_data.py  # the main ETL process is here
|- DisasterResponse.db  # database to save clean data to

- models
|- train_classifier.py  # the main ML process is here
|- classifier.pkl  # saved model

- README.md
```