# grid-search-cv

Optimising hyperparameters for multiple machine learning algorithms with scikit-learn's GridSearchCV.


| Algorithm | Best Parameters | Avg Precision | Avg Recall | Avg F1 | Precision Score |
|---|---|---|---|---|---|
| SVC | `{'C': 100, 'gamma': 0.001, 'kernel': 'rbf'}` | 0.96 | 0.96 | 0.96 | 0.9701492537313433 |
| DecisionTreeClassifier | `{'max_depth': 100, 'max_features': 'log2', 'min_samples_leaf': 5, 'min_samples_split': 10}` | 0.94 | 0.94 | 0.94 | 0.9411764705882353 |
| MLPClassifier | `{'activation': 'tanh', 'alpha': 0.0001, 'hidden_layer_sizes': (10,), 'max_iter': 200}` | 0.95 | 0.95 | 0.95 | 0.9552238805970149 |
| GaussianNB | `{}` | 0.90 | 0.90 | 0.90 | 0.9242424242424242 |
| LogisticRegression | `{'fit_intercept': True, 'max_iter': 10, 'penalty': 'l1', 'tol': 0.0001}` | 0.96 | 0.96 | 0.96 | 0.9558823529411765 |
| KNeighborsClassifier | `{'algorithm': 'ball_tree', 'n_neighbors': 10, 'p': 1, 'weights': 'uniform'}` | 0.96 | 0.96 | 0.96 | 0.9558823529411765 |
| BaggingClassifier | `{'max_features': 0.5, 'max_samples': 1.0, 'n_estimators': 20, 'random_state': None}` | 0.97 | 0.97 | 0.97 | 0.9705882352941176 |
| RandomForestClassifier | `{'criterion': 'entropy', 'max_depth': 200, 'max_features': 0.5, 'n_estimators': 20}` | 0.96 | 0.96 | 0.96 | 0.9701492537313433 |
| AdaBoostClassifier | `{'algorithm': 'SAMME', 'learning_rate': 0.8, 'n_estimators': 200, 'random_state': None}` | 0.98 | 0.98 | 0.98 | 0.9850746268656716 |
| GradientBoostingClassifier | `{'loss': 'deviance', 'max_depth': 3, 'max_features': 'log2', 'n_estimators': 200}` | 0.97 | 0.97 | 0.97 | 0.9705882352941176 |
| XGBClassifier | `{'booster': 'gbtree', 'learning_rate': 0.1, 'max_delta_step': 0, 'min_child_weight': 1}` | 0.97 | 0.97 | 0.97 | 0.9705882352941176 |

The best-performing algorithm was AdaBoostClassifier, with an average precision, recall, and F1 of 0.98.
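
Results like these can be produced with a loop over (estimator, parameter grid) pairs, each passed through `GridSearchCV`. The sketch below shows the pattern; the dataset (`load_breast_cancer`) and the two abbreviated parameter grids are assumptions for illustration, not the project's exact configuration:

```python
# Minimal sketch of the grid-search loop. The dataset and the parameter grids
# here are assumptions; the project searches larger grids over more estimators.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import classification_report, precision_score
from sklearn.svm import SVC
from sklearn.ensemble import AdaBoostClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# One (estimator, parameter grid) pair per algorithm; only two shown here.
searches = {
    "SVC": (SVC(),
            {"C": [1, 10, 100], "gamma": [0.001, 0.01], "kernel": ["rbf"]}),
    "AdaBoostClassifier": (AdaBoostClassifier(),
                           {"n_estimators": [50, 200],
                            "learning_rate": [0.8, 1.0]}),
}

for name, (estimator, param_grid) in searches.items():
    # Exhaustive search over the grid with 5-fold cross-validation.
    grid = GridSearchCV(estimator, param_grid, cv=5, n_jobs=-1)
    grid.fit(X_train, y_train)
    y_pred = grid.predict(X_test)
    print(name, grid.best_params_)
    # Per-class precision/recall/F1 plus averages, as in the table above.
    print(classification_report(y_test, y_pred))
    print("Precision:", precision_score(y_test, y_pred))
```

`GridSearchCV` refits the estimator on the full training split with the best parameters found, so `grid.predict` can be called directly to score the held-out test set.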