JuliaAI/DecisionTree.jl

Add support for specifying the `loss` used in random forest and AdaBoost models

ablaom opened this issue · 4 comments

As far as I can tell, the `loss` parameter is only exposed for single trees. I think this would be pretty easy to add to the ensemble models.

Issue raised at #211.

Also, it seems that `loss` is only available for classification trees, not regression trees.
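For reference, the workaround that seems possible today is to build the ensemble by hand: call the classification `build_tree` (which does take a `loss` keyword) on each bootstrap sample and vote over the resulting trees. This is only a minimal sketch under that assumption; `forest_with_loss` and `predict_forest` are hypothetical helpers, not part of the package, and I'm assuming the built-in criteria live at `DecisionTree.util.entropy` / `DecisionTree.util.gini`.

```julia
using DecisionTree, Random, StatsBase

# Hand-rolled bagged ensemble so that each tree can use a custom split criterion.
# `loss` is forwarded to the classification `build_tree`; the regression
# `build_tree` currently has no such keyword.
function forest_with_loss(labels, features; n_trees = 10,
                          loss = DecisionTree.util.entropy,
                          n_subfeatures = 0, rng = Random.GLOBAL_RNG)
    n = length(labels)
    map(1:n_trees) do _
        idx = rand(rng, 1:n, n)                      # bootstrap sample with replacement
        build_tree(labels[idx], features[idx, :], n_subfeatures;
                   loss = loss, rng = rng)
    end
end

# Majority vote over the hand-built ensemble.
predict_forest(trees, X) =
    [mode([apply_tree(t, X[i, :]) for t in trees]) for i in axes(X, 1)]
```

This just duplicates what `build_forest` already does internally (minus its conveniences, e.g. multi-threaded fitting), which is why exposing `loss` there directly seems preferable.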

Is it possible to repurpose the existing code for classification trees to run regression tasks (see the dispatch sketch after the list below)? It would be convenient both for

  • regression tasks with one target and a custom loss, and

  • multi-target problems (the current implementation for regression trees does not allow for features that are not Float64, i.e., single targets).
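To make the second point concrete, here is a minimal sketch of the dispatch behaviour as I read the current method signatures (treat the exact signatures as an assumption): the regression `build_tree` only matches `Float64` label vectors, while non-`Float64` labels fall through to the classification method, which is what makes the repurposing idea tempting.

```julia
using DecisionTree

X = rand(100, 3)
y = X[:, 1] .+ 0.1 .* randn(100)   # a single Float64 target

build_tree(y, X)              # regression method: labels are a Vector{Float64}
build_tree(string.(y), X)     # non-Float64 labels fall through to the *classification* method
# build_tree(hcat(y, y), X)   # multi-target label matrix: MethodError, no matching method
```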

> multi-target problems (the current implementation for regression trees does not allow for features that are not Float64, i.e., single targets).

Do you mean features here or, rather, labels (aka target)?

Labels, as in this example.

Right. Your interesting question is a little orthogonal to the initial post, so I'm addressing it here.