EPFL CS-433 Machine Learning - Project 1
The EPFL Machine Learning course held an InClass prediction competition on Kaggle, a re-run of the earlier official Kaggle competition hosted by CERN. This repository contains our solution to this competition. Our team, RED, ranked 11th out of 201 teams.
A detailed description of the course and the project can be found on the course website and in the course GitHub repository.
Team: RED
Team Members: Tao Sun, Xiao Zhou, Jimin Wang
To reproduce the result we submitted to Kaggle, please follow the instructions below:

- Make sure `Python 3.6` and `NumPy >= 1.15` are installed.
- Download the dataset from the Kaggle competition page and put `train.csv` and `test.csv` into the `data/` folder.
- Go to the `script/` folder and run `run.py`. You will get `submission.csv` for Kaggle in the `submission/` folder.
```
cd script
python run.py
```
Helper functions to load the raw CSV data into NumPy arrays, generate class predictions, and create an output file in CSV format for submission to Kaggle.
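For orientation, here is a minimal sketch of what such CSV helpers typically look like for this dataset. The function names and the column layout (`Id`, `Prediction`, then the features) are assumptions for illustration and may differ from the repository's actual code:

```python
# Illustrative sketch only: names and column layout are assumptions,
# not necessarily those used in this repository.
import csv
import numpy as np

def load_csv_data(path):
    """Load a Higgs CSV file: returns labels y (+1/-1), feature matrix x, and event ids."""
    data = np.genfromtxt(path, delimiter=",", skip_header=1, dtype=str)
    ids = data[:, 0].astype(int)
    labels = np.where(data[:, 1] == "s", 1, -1)   # 's' = signal, 'b' = background
    x = data[:, 2:].astype(float)
    return labels, x, ids

def predict_labels(w, x):
    """Generate +1/-1 class predictions from a linear model."""
    scores = x.dot(w)
    return np.where(scores >= 0, 1, -1)

def create_csv_submission(ids, y_pred, path):
    """Write an 'Id,Prediction' CSV file in the format Kaggle expects."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["Id", "Prediction"])
        writer.writeheader()
        for i, p in zip(ids, y_pred):
            writer.writerow({"Id": int(i), "Prediction": int(p)})
```

Presumably `run.py` chains helpers like these: load the data, train, predict, and write `submission.csv`.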
Helper functions for data preprocessing, feature engineering, and regression model training; a rough sketch of some of these steps is given after the function list below.
- `preprocessing`: Preprocess train/test data with the methods below.
  - `standardize`: Standardize the data set, ignoring NaN values.
  - `delta_angle_norm`, `add_phi`: Add new phi features following the organizers' suggestions.
  - `apply_log1p`: Apply log normalization to long-tailed features.
  - `drop_useless`: Drop useless columns, including raw phi angles, columns with constant values, and columns full of NaN.
  - `fill_missing`, `fill_nan`: Mark missing values with NaN and then fill them with zero.
- `train_predict`: Train and predict on each group using polynomial ridge regression.
- `get_jet_index`: Get the indices of the three jet groups.
- `build_poly_feature`: Build polynomial features for the input data.
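As a rough illustration of how these pieces fit together, the sketch below covers a few of the steps above. The function bodies and the jet-number column index are assumptions, not the repository's exact implementation:

```python
# Minimal sketch of some preprocessing / feature-engineering steps.
# Bodies are illustrative assumptions; the jet-number column index is a placeholder.
import numpy as np

JET_NUM_COL = 22  # assumed position of the PRI_jet_num feature

def standardize(x):
    """Standardize each column to zero mean and unit variance, ignoring NaN entries."""
    mean = np.nanmean(x, axis=0)
    std = np.nanstd(x, axis=0)
    std[std == 0] = 1.0                      # guard against constant columns
    return (x - mean) / std

def apply_log1p(x, long_tail_cols):
    """Compress long-tailed, non-negative features with log(1 + x)."""
    x = x.copy()
    x[:, long_tail_cols] = np.log1p(x[:, long_tail_cols])
    return x

def fill_missing(x, placeholder=-999.0):
    """Mark the dataset's -999 placeholders as NaN (to be filled with zero later)."""
    return np.where(x == placeholder, np.nan, x)

def get_jet_index(x):
    """Return boolean masks for the three jet groups: 0, 1, and >= 2 jets."""
    jet = x[:, JET_NUM_COL]
    return [jet == 0, jet == 1, jet >= 2]

def build_poly_feature(x, degree):
    """Expand each feature into powers 1..degree and prepend a bias column."""
    n = x.shape[0]
    phi = [np.ones((n, 1))]
    for d in range(1, degree + 1):
        phi.append(x ** d)
    return np.hstack(phi)
```

Fitting one polynomial ridge model per jet group reflects the fact that several columns of the dataset are undefined (marked `-999`) depending on the jet number.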
Six basic machine learning methods with some supporting functions.
- `least_squares_GD`: Linear regression using gradient descent.
- `least_squares_SGD`: Linear regression using stochastic gradient descent.
- `least_squares`: Least squares regression using the normal equations.
- `ridge_regression`: Ridge regression using the normal equations.
- `logistic_regression`: Logistic regression using SGD.
- `reg_logistic_regression`: Regularized logistic regression using SGD.
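As a reference point, below is a minimal sketch of two of these methods using the `(w, loss)` return convention common in this course. The exact signatures and the `2 * n * lambda_` regularization scaling are assumptions about this repository's code:

```python
import numpy as np

def compute_mse(y, tx, w):
    """Mean squared error loss for a linear model."""
    e = y - tx.dot(w)
    return e.dot(e) / (2 * len(y))

def least_squares_GD(y, tx, initial_w, max_iters, gamma):
    """Linear regression via full-batch gradient descent."""
    w = initial_w
    for _ in range(max_iters):
        grad = -tx.T.dot(y - tx.dot(w)) / len(y)   # gradient of the MSE
        w = w - gamma * grad
    return w, compute_mse(y, tx, w)

def ridge_regression(y, tx, lambda_):
    """Ridge regression solved in closed form via the regularized normal equations."""
    n, d = tx.shape
    a = tx.T.dot(tx) + 2 * n * lambda_ * np.eye(d)
    b = tx.T.dot(y)
    w = np.linalg.solve(a, b)
    return w, compute_mse(y, tx, w)
```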
Notebook with the cross-validation results and related functions.
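The notebook itself holds the actual results; purely for illustration, a generic k-fold cross-validation loop of the kind it relies on could look like the following (all names here are hypothetical):

```python
import numpy as np

def build_k_indices(y, k_fold, seed=1):
    """Randomly partition sample indices into k equally sized folds."""
    num_rows = len(y)
    interval = num_rows // k_fold
    rng = np.random.RandomState(seed)
    indices = rng.permutation(num_rows)
    return [indices[k * interval:(k + 1) * interval] for k in range(k_fold)]

def cross_validation_accuracy(y, tx, k_fold, train_fn, predict_fn, seed=1):
    """Average validation accuracy of a model over k folds."""
    k_indices = build_k_indices(y, k_fold, seed)
    accuracies = []
    for k in range(k_fold):
        val_idx = k_indices[k]
        train_idx = np.hstack([k_indices[i] for i in range(k_fold) if i != k])
        w = train_fn(y[train_idx], tx[train_idx])
        pred = predict_fn(w, tx[val_idx])
        accuracies.append(np.mean(pred == y[val_idx]))
    return np.mean(accuracies)
```

With `train_fn` bound to, say, ridge regression at a given polynomial degree and lambda, such a loop would produce the kind of validation scores summarized in the notebook.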
Script to generate the same submission file as the one we submitted to Kaggle.