This is an example of how a genetic algorithm can be used to search for optimal extreme gradient boosting (XGBoost) parameters for classification tasks. This is still a work in progress; check out the feature/xgb-instead-network branch! The code in the master and development branches has nothing to do with XGBoost yet (25th of April 2018).
This code is based on a repository that used a genetic algorithm to evolve a neural network. That repository can be found here.
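The idea can be sketched as a simple genetic loop: keep a population of candidate parameter sets, score each one, keep the fittest, and breed new candidates via crossover and mutation. The sketch below is illustrative only; the parameter space, the `fitness` stand-in, and all function names are assumptions, not the actual code in the feature/xgb-instead-network branch (where `fitness` would be replaced by cross-validated XGBoost accuracy).

```python
import random

# Hypothetical search space; the real branch may use different
# parameters and ranges.
PARAM_SPACE = {
    "max_depth": [3, 4, 5, 6, 8, 10],
    "learning_rate": [0.01, 0.05, 0.1, 0.2, 0.3],
    "n_estimators": [50, 100, 200, 400],
    "subsample": [0.6, 0.8, 1.0],
}

def random_individual():
    """Pick one value per parameter to form a candidate."""
    return {k: random.choice(v) for k, v in PARAM_SPACE.items()}

def fitness(params):
    """Stand-in score; a real version would train an XGBoost
    classifier with `params` and return cross-validated accuracy."""
    return (params["max_depth"] / 10
            + params["learning_rate"]
            + params["n_estimators"] / 400
            + params["subsample"]) / 4

def crossover(mum, dad):
    """Child inherits each parameter from one parent at random."""
    return {k: random.choice([mum[k], dad[k]]) for k in PARAM_SPACE}

def mutate(individual, rate=0.2):
    """Occasionally re-randomize a parameter to keep diversity."""
    for k in PARAM_SPACE:
        if random.random() < rate:
            individual[k] = random.choice(PARAM_SPACE[k])
    return individual

def evolve(pop_size=10, generations=5, retain=0.4):
    """Run the genetic loop and return the fittest candidate."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        graded = sorted(population, key=fitness, reverse=True)
        parents = graded[:int(len(graded) * retain)]
        children = []
        while len(parents) + len(children) < pop_size:
            mum, dad = random.sample(parents, 2)
            children.append(mutate(crossover(mum, dad)))
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    random.seed(42)
    print(evolve())
```

This mirrors the classic evolve-a-network loop from the referenced repository, with a parameter dictionary standing in for network genomes.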