mathurinm/celer

ENH: add enet support (L1 + L2 regularization for least squares)

mathurinm opened this issue · 4 comments

  • expose same parametrization as sklearn with alpha and l1_ratio

  • add a new dual objective for enet, with soft-thresholding

  • for enet, do not rescale theta; or, more simply, use the "enet as rescaled Lasso" equivalence (first sketch after this list)

  • coefficient updates: if enet, multiply the output of ST by 1 / (1 + alpha * (1 - l1_ratio) / lc[j]) (second sketch after this list)

  • add support for l1_ratio in solver

  • handle screening and prioritization correctly

  • add ElasticNet and ElasticNetCV classes

  • add example in the doc

  • add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm
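A minimal numerical check of the "enet as rescaled Lasso" equivalence mentioned above, assuming sklearn's objective 1/(2n) ||y - Xw||^2 + alpha * l1_ratio * ||w||_1 + alpha * (1 - l1_ratio) / 2 * ||w||^2; the data, shapes and tolerances below are only illustrative:

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

rng = np.random.default_rng(0)
n, p = 50, 30
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

alpha, l1_ratio = 0.1, 0.7
lam1 = alpha * l1_ratio        # l1 strength
lam2 = alpha * (1 - l1_ratio)  # l2 strength

# stacking sqrt(n * lam2) * I_p under X turns the l2 penalty into extra
# least-squares rows, so the enet problem becomes a pure Lasso
X_aug = np.vstack([X, np.sqrt(n * lam2) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])

# the Lasso data fit is scaled by the augmented sample count (n + p),
# so the l1 level must be rescaled by n / (n + p)
alpha_aug = lam1 * n / (n + p)

w_enet = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, fit_intercept=False,
                    tol=1e-10, max_iter=10000).fit(X, y).coef_
w_lasso = Lasso(alpha=alpha_aug, fit_intercept=False,
                tol=1e-10, max_iter=10000).fit(X_aug, y_aug).coef_

print(np.max(np.abs(w_enet - w_lasso)))  # ~0, up to solver tolerance
```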

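And a minimal numpy sketch of the coefficient update from the checklist, using sklearn's (alpha, l1_ratio) parametrization; this is plain cyclic coordinate descent for illustration only, not celer's Cython solver, and the function names are made up:

```python
import numpy as np

def soft_threshold(x, tau):
    """ST(x, tau) = sign(x) * max(|x| - tau, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def enet_cd(X, y, alpha, l1_ratio, n_iter=100):
    """Cyclic CD for 1/(2n) ||y - Xw||^2 + alpha * l1_ratio * ||w||_1
    + alpha * (1 - l1_ratio) / 2 * ||w||^2 (sklearn's objective)."""
    n, p = X.shape
    lc = np.sum(X ** 2, axis=0) / n   # per-coordinate Lipschitz constants
    w = np.zeros(p)
    R = y - X @ w                     # residuals, kept up to date
    for _ in range(n_iter):
        for j in range(p):
            if lc[j] == 0.0:
                continue
            old = w[j]
            # Lasso-style soft-thresholded update ...
            w[j] = soft_threshold(old + X[:, j] @ R / (n * lc[j]),
                                  alpha * l1_ratio / lc[j])
            # ... then the enet-specific rescaling from the bullet above
            w[j] /= 1 + alpha * (1 - l1_ratio) / lc[j]
            if w[j] != old:
                R += X[:, j] * (old - w[j])
    return w
```
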
  • add support for l1_ratio in solver
  • handle screening and prioritization correctly
  • add ElasticNet and ElasticNetCV classes (partially; ElasticNetCV still to be added, see the sketch below)
  • add example in the doc
  • add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm

  • add support for l1_ratio in solver
  • handle screening and prioritization correctly
  • add ElasticNet and ElasticNetCV classes
  • add example in the doc
  • add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm (benchopt not working)
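For the remaining ElasticNetCV part, a hypothetical usage sketch, assuming the class mirrors sklearn's ElasticNetCV constructor; celer.ElasticNetCV does not exist yet, so all names below are assumptions:

```python
import numpy as np
from sklearn.datasets import make_regression

# hypothetical import: the class this issue proposes to add
from celer import ElasticNetCV

X, y = make_regression(n_samples=100, n_features=300, noise=1.0, random_state=0)

# assumed to follow sklearn's API: cross-validate over a grid of alphas
# and a few l1_ratio values
model = ElasticNetCV(l1_ratio=[0.5, 0.8, 0.95], n_alphas=50, cv=5)
model.fit(X, y)
print(model.alpha_, model.l1_ratio_)
```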

@mathurinm, regarding the

  • add example in the doc

should I take inspiration from the existing scripts in celer/examples?
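For reference, a rough sketch of what such a gallery example could look like, in the spirit of the existing celer/examples scripts, assuming the celer.ElasticNet class proposed in this issue mirrors sklearn's constructor (everything below is illustrative, not existing code):

```python
"""Illustrative sketch: sparsity along the path of the proposed celer ElasticNet."""
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression

# hypothetical import until this issue is implemented
from celer import ElasticNet

X, y = make_regression(n_samples=200, n_features=500, noise=1.0, random_state=0)

alpha_max = np.max(np.abs(X.T @ y)) / len(y)
alphas = alpha_max * np.geomspace(1, 1e-3, 20)

n_nonzero = []
for alpha in alphas:
    clf = ElasticNet(alpha=alpha, l1_ratio=0.8, fit_intercept=False)
    n_nonzero.append(np.sum(clf.fit(X, y).coef_ != 0))

plt.semilogx(alphas, n_nonzero)
plt.xlabel("alpha")
plt.ylabel("number of nonzero coefficients")
plt.title("ElasticNet path (l1_ratio=0.8)")
plt.show()
```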

  • add support for l1_ratio in solver
  • handle screening and prioritization correctly
  • add ElasticNet and ElasticNetCV classes
  • add example in the doc
  • add it to benchopt/benchmark_elasticnet and compare to sklearn and skglm

With this achieved, I think we are done with #230.