Kaggle_Titanic: Ranked Top 20%

My first Kaggle experience: predicting passenger survival on the Titanic.

This repo covers a classic Kaggle competition, Titanic: Machine Learning from Disaster.

The models used here include Logistic Regression, Decision Tree, SVM, Random Forest, AdaBoost, and Gradient Boosting. Using cross-validation, I found that Gradient Boosting gives the highest precision; after tuning its hyperparameters, the best configuration reaches a precision of 85.30%. A minimal sketch of this workflow is shown below.
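The following is a minimal sketch of that workflow, assuming the standard Kaggle `train.csv` and a deliberately simple preprocessing step. The feature list, fold count, and parameter grid are illustrative assumptions, not the repo's exact setup, and the notebook's actual feature engineering is richer than this:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.ensemble import (RandomForestClassifier, AdaBoostClassifier,
                              GradientBoostingClassifier)
from sklearn.model_selection import cross_val_score, GridSearchCV

# Load the Kaggle training set and do a minimal cleanup
# (placeholder for the repo's real feature engineering).
df = pd.read_csv("train.csv")
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})
df["Age"] = df["Age"].fillna(df["Age"].median())
features = ["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare"]
X, y = df[features], df["Survived"]

# Compare the six models with 5-fold cross-validation.
models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Decision Tree": DecisionTreeClassifier(),
    "SVM": SVC(),
    "Random Forest": RandomForestClassifier(),
    "AdaBoost": AdaBoostClassifier(),
    "Gradient Boosting": GradientBoostingClassifier(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="precision")
    print(f"{name}: {scores.mean():.4f} (+/- {scores.std():.4f})")

# Tune the best performer (Gradient Boosting) with a grid search;
# this grid is an assumed example, not the repo's tuned values.
param_grid = {
    "n_estimators": [100, 300],
    "learning_rate": [0.05, 0.1],
    "max_depth": [3, 5],
}
search = GridSearchCV(GradientBoostingClassifier(), param_grid,
                      cv=5, scoring="precision")
search.fit(X, y)
print("Best params:", search.best_params_)
print("Best CV precision:", search.best_score_)
```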