Example code fails using xgboost
Closed this issue · 1 comment
kambeitzlab commented
I tried to use mlrOverfit in an analysis using XGBoost.
library(mlr)        # task/learner/tuning helpers used below
library(ggplot2)    # for facet_grid() on the returned plot
library(mlrOverfit)

task = iris.task
learner = makeLearner("classif.xgboost", predict.type = "prob")
par.set = makeParamSet(makeDiscreteParam("max_depth", values = c(4, 5)))
tune.control = makeTuneControlRandom(maxit = 20)
learner.tuned = makeTuneWrapper(learner = learner, resampling = hout, par.set = par.set, control = tune.control)
ro = resampleOverfit(learner = learner.tuned, task = task, resampling = cv5)
outer.errors = calcOuterPerformances(ro)
outer.errors = simulateOuterPerformance(outer.errors)
gg = plot(outer.errors, ro)
gg + facet_grid(iter ~ .)
However, calcOuterPerformances fails with the error message:
Error in calcOuterPerformances(ro) :
Assertion on 'df' failed: Must have class 'data.frame', but has class 'factor'.
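The error message points at a single-column subset being dropped down to a factor somewhere inside calcOuterPerformances. As a guess at the mechanism (not confirmed against the package source), base R's `[` simplifies a one-column data.frame to a bare vector unless `drop = FALSE` is given, which would explain a failure that only shows up with a single tuning parameter:

```r
# Minimal illustration of the suspected cause (base R only):
# subsetting a one-column data.frame with "[" drops it to a bare vector/factor.
df <- data.frame(max_depth = factor(c(4, 5)))

class(df[, "max_depth"])                # "factor"     -- the data.frame class is lost
class(df[, "max_depth", drop = FALSE])  # "data.frame" -- drop = FALSE preserves it
```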
jakob-r commented
Thanks for reporting that bug. I adapted your MWE and can confirm that it does not work correctly. I had not tested the case of just one tuning parameter. Will fix this now.
Side note: Is it just because this is a MWE, or do you really only want to compare two different parameter settings?
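In case the two-value par.set above was only for the sake of the MWE, a broader search space for classif.xgboost could look like the following sketch (the parameter names are standard xgboost arguments exposed by mlr; the bounds are illustrative, not tuned recommendations):

```r
# Illustrative only: a wider tuning space for classif.xgboost.
par.set = makeParamSet(
  makeIntegerParam("max_depth", lower = 2, upper = 8),      # tree depth
  makeNumericParam("eta", lower = 0.01, upper = 0.3),       # learning rate
  makeIntegerParam("nrounds", lower = 50, upper = 300)      # boosting iterations
)
```

With a space like this, makeTuneControlRandom(maxit = 20) samples 20 random configurations rather than repeatedly drawing from only two discrete values.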