ENH: add enet support (L1 + L2 regularization for least squares)
mathurinm opened this issue · 4 comments
mathurinm commented
- expose same parametrization as sklearn with `alpha` and `l1_ratio`
- add new dual objective for enet, with soft thresholding
- do not rescale theta if enet, or more simply use the "enet as rescaled Lasso" equivalence
- coefficient updates: if enet, multiply the output of ST by 1 / (1 + alpha * (1 - l1_ratio) / lc[j])
- add support for l1_ratio in solver
- handle screening and prioritization correctly
- add ElasticNet and ElasticNetCV classes
- add example in the doc
- add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm
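The soft-thresholding step above (soft-threshold, then multiply by 1 / (1 + alpha * (1 - l1_ratio) / lc[j])) is the proximal operator of the elastic net penalty. A minimal sketch, assuming sklearn's `alpha`/`l1_ratio` parametrization; `prox_enet` is a hypothetical helper name, not celer's actual API, and `tau * (1 - rho)` here plays the role of `alpha * (1 - l1_ratio) / lc[j]` in the coordinate update:

```python
import numpy as np

def prox_enet(x, tau, rho):
    """Prox of tau * (rho * |.| + (1 - rho) / 2 * (.)**2):
    soft-threshold, then shrink by 1 / (1 + tau * (1 - rho))."""
    st = np.sign(x) * np.maximum(np.abs(x) - tau * rho, 0.)
    return st / (1. + tau * (1. - rho))

# sanity check against brute-force minimization on a grid
x, tau, rho = 1.3, 0.5, 0.6
grid = np.linspace(-3, 3, 200001)
vals = (0.5 * (grid - x) ** 2
        + tau * (rho * np.abs(grid) + 0.5 * (1 - rho) * grid ** 2))
print(abs(grid[np.argmin(vals)] - prox_enet(x, tau, rho)) < 1e-3)  # True
```

The closed form follows from first-order optimality: on the active set the l2 term only rescales the soft-thresholded value.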
Badr-MOUFAD commented
- add support for l1_ratio in solver
- handle screening and prioritization correctly
- add ElasticNet and ElasticNetCV classes (partially, need to add ElasticNetCV)
- add example in the doc
- add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm
Badr-MOUFAD commented
- add support for l1_ratio in solver
- handle screening and prioritization correctly
- add ElasticNet and ElasticNetCV classes
- add example in the doc
- add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm (benchopt not working)
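As a side note on the "enet as rescaled Lasso" equivalence from the original checklist: the l2 part of the penalty can be absorbed into the design matrix by appending scaled identity rows, so a Lasso solver can be reused unchanged. A quick numerical check of the identity, assuming sklearn's objective 1/(2n) ||y - Xw||^2 + alpha * l1_ratio * ||w||_1 + alpha * (1 - l1_ratio) / 2 * ||w||^2:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 8
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)
alpha, l1_ratio = 0.3, 0.7

# Augmented design: the l2 penalty becomes p extra "samples".
X_aug = np.vstack([X, np.sqrt(n * alpha * (1 - l1_ratio)) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])

def enet_obj(w):
    return (np.sum((y - X @ w) ** 2) / (2 * n)
            + alpha * l1_ratio * np.abs(w).sum()
            + 0.5 * alpha * (1 - l1_ratio) * w @ w)

def lasso_obj_aug(w):
    # NB: keep the 1/(2n) scaling with the *original* n
    return (np.sum((y_aug - X_aug @ w) ** 2) / (2 * n)
            + alpha * l1_ratio * np.abs(w).sum())

w = rng.standard_normal(p)
print(np.isclose(enet_obj(w), lasso_obj_aug(w)))  # True
```

Since the two objectives agree for every w, the enet solution is exactly the Lasso solution on the augmented data.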
Badr-MOUFAD commented
- add support for l1_ratio in solver
- handle screening and prioritization correctly
- add ElasticNet and ElasticNetCV classes
- add example in the doc
- add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm
With this being achieved, I think we are done with #230
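For reference, the coordinate-update recipe from the checklist (soft-threshold, then multiply by 1 / (1 + alpha * (1 - l1_ratio) / lc[j])) can be sketched end to end. This is a toy pure-numpy coordinate descent for the sklearn-parametrized objective, not celer's actual implementation:

```python
import numpy as np

def ST(x, tau):
    """Soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.)

def cd_enet(X, y, alpha, l1_ratio, n_iter=500):
    """Toy coordinate descent for
    1/(2n) ||y - Xw||^2 + alpha*l1_ratio*||w||_1
    + alpha*(1 - l1_ratio)/2 * ||w||^2."""
    n, p = X.shape
    lc = (X ** 2).sum(axis=0) / n  # per-coordinate Lipschitz constants
    w = np.zeros(p)
    r = y - X @ w  # residuals, kept up to date
    for _ in range(n_iter):
        for j in range(p):
            old = w[j]
            z = old + X[:, j] @ r / (n * lc[j])
            # soft-threshold, then shrink by the extra l2 factor:
            w[j] = ST(z, alpha * l1_ratio / lc[j]) / (
                1. + alpha * (1. - l1_ratio) / lc[j])
            if w[j] != old:
                r += X[:, j] * (old - w[j])
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(50)
w = cd_enet(X, y, alpha=0.1, l1_ratio=0.5)
```

A converged `w` should satisfy the elastic net optimality conditions, which is also a convenient way to unit-test the real solver against sklearn's `ElasticNet`.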