mathurinm/celer

Warnings in LogisticRegression with prox Newton


Running pytest:

celer/tests/test_logreg.py::test_LogisticRegression[True]
  /home/mathurin/workspace/celer/celer/homotopy.py:311: ConvergenceWarning: Objective did not converge: duality gap: 0.841892524980679, tolerance: 0.005545177444479563. Increasing `tol` may make the solver faster without affecting the results much. 
  Fitting data with very small alpha causes precision issues.
    sol = newton_celer(

celer/tests/test_logreg.py::test_LogisticRegression[True]
  /home/mathurin/workspace/celer/celer/homotopy.py:311: ConvergenceWarning: Objective did not converge: duality gap: 23.09072008747797, tolerance: 0.006931471805599453. Increasing `tol` may make the solver faster without affecting the results much. 
  Fitting data with very small alpha causes precision issues.
    sol = newton_celer(

celer/tests/test_logreg.py::test_LogisticRegression[True]
  /home/mathurin/workspace/celer/celer/homotopy.py:311: ConvergenceWarning: Objective did not converge: duality gap: 2.1031969926275593, tolerance: 0.006931471805599453. Increasing `tol` may make the solver faster without affecting the results much. 
  Fitting data with very small alpha causes precision issues.
    sol = newton_celer(
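
Side note: to get a full traceback for one of these, the warning can be escalated to an error (assuming the category is sklearn's ConvergenceWarning), e.g.:

  pytest celer/tests/test_logreg.py -W error::sklearn.exceptions.ConvergenceWarning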

@Badr-MOUFAD can you have a look? You can start a PR with a script isolating and reproducing the issue (hopefully for a single alpha) on the data from the test.

@mathurinm, we get the warning only when the duality gap stays above tol, i.e. when the solver exits without having converged.
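
For context, the check is roughly the following (a paraphrase of what happens around homotopy.py:311, not the exact source; the helper name is made up):

  import warnings

  from sklearn.exceptions import ConvergenceWarning

  def warn_if_not_converged(gap, tol):
      # Paraphrase of the check in celer/homotopy.py: the warning fires only
      # when the final duality gap is still above the requested tolerance.
      if gap > tol:
          warnings.warn(
              "Objective did not converge: duality gap: {}, tolerance: {}. "
              "Increasing `tol` may make the solver faster without affecting "
              "the results much.\n"
              "Fitting data with very small alpha causes precision issues."
              .format(gap, tol), ConvergenceWarning)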
By playing with alpha and tol, I was able to reproduce the warning:

celer/tests/test_logreg.py::test_reproduce_error[False]
  c:\users\hp\desktop\celer-repo\celer\celer\homotopy.py:313: ConvergenceWarning:

  Objective did not converge: duality gap: 1.1778627197155299e-08, tolerance: 2.0794415416798358e-15. Increasing `tol` may make the solver faster without affecting the results much.
  Fitting data with very small alpha causes precision issues.

However, I could not find a case where the duality gap is greater than 1, as in your example.

NB: I reproduced the warning using unrealistic values of alpha and tol (alpha = 1e-10 and tol = 1e-16).
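
For reference, here is roughly the kind of script I used (a sketch only: random make_classification data instead of the test data, and the tiny alpha is emulated through a huge C, assuming C and tol behave as in sklearn's LogisticRegression):

  import warnings

  from sklearn.datasets import make_classification
  from sklearn.exceptions import ConvergenceWarning

  from celer import LogisticRegression

  # Placeholder data: the actual test uses its own dataset, not this one.
  X, y = make_classification(n_samples=50, n_features=100, random_state=0)

  # A very small alpha corresponds to a very large C, and tol is set far
  # below what the solver can reach in finite precision.
  clf = LogisticRegression(C=1e10, tol=1e-16)

  with warnings.catch_warnings(record=True) as records:
      warnings.simplefilter("always")
      clf.fit(X, y)

  for w in records:
      if issubclass(w.category, ConvergenceWarning):
          print(w.message)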

Did you take the same data and the same alpha as in the test?

@mathurinm, I literally reused the code of test_LogisticRegression in celer/tests/test_logreg.py.

I do get the warning.
I was just testing things locally, so I didn't push any code.

@mathurinm,
we get the warning when running check_estimator (the estimator-checking utility from sklearn).
I know that it checks whether the estimator, in our case LogisticRegression, complies with sklearn's conventions.
However, I don't know exactly which checks it runs, so I don't know which data triggers the warning.
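
A way to narrow it down could be to run the checks one by one and record which ones emit the warning (sketch; this assumes a sklearn version where check_estimator(..., generate_only=True) yields the individual (estimator, check) pairs):

  import warnings

  from sklearn.exceptions import ConvergenceWarning
  from sklearn.utils.estimator_checks import check_estimator

  from celer import LogisticRegression

  # Iterate over the individual sklearn checks and report which ones make the
  # estimator emit a ConvergenceWarning.
  for estimator, check in check_estimator(LogisticRegression(), generate_only=True):
      with warnings.catch_warnings(record=True) as records:
          warnings.simplefilter("always")
          try:
              check(estimator)
          except Exception:
              # we only care about the warning here, not about failing checks
              pass
      if any(issubclass(w.category, ConvergenceWarning) for w in records):
          print(check)  # the partial's repr contains the check's name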

I pushed the code to this branch: https://github.com/Badr-MOUFAD/celer/tree/conv-warning-issue
I only commented out the check_estimator lines in celer/tests/test_logreg.py.
To reproduce, just run pytest .\celer\tests\test_logreg.py