airspeed-velocity/asv

[DOCS] Dealing with old numpy dependencies

matteobachetti opened this issue · 3 comments

Hi, I'm trying to add tests of very old versions of our code at https://stingray.science/stingray-benchmarks/

Old versions depend on old Astropy and numpy versions, which have evolved considerably and, in turn, have requirements of their own. Now I'm getting an error that ends with

    ModuleNotFoundError: No module named 'setuptools'

This does not change if, in my configuration, I include setuptools among the dependencies of this environment:

    {"python": "3.8", "req": {"pip": "20.3.3", "numpy": "<1.20", "setuptools": "50", "astropy": "<5.0"}},

Are there any guidelines specifically for dealing with very old packages? Thanks in advance!
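One pattern that sometimes helps with old sdists is installing the build tooling as a separate step before the project itself, so that a `setuptools` import at build time can succeed. A hypothetical `asv.conf.json` excerpt (the second line is the default install command shown in the config file's comments; the setuptools pin is purely illustrative):

```json
"install_command": [
    "python -mpip install setuptools==50 wheel",
    "in-dir={env_dir} python -mpip install {wheel_file}"
],
```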

Sorry for the late reply. Is this a problem with all the backends (mamba / conda / virtualenv)?

How old is old? Using 4a28822 in hashestobenchmark.txt (corresponding to v0.3) with the following asv.conf.json:

modified   asv.conf.json
@@ -37,7 +37,7 @@
     // "install_command": ["in-dir={env_dir} python -mpip install {wheel_file}"],
     // "uninstall_command": ["return-code=any python -mpip uninstall -y {project}"],
     "install_command": [
-      "pip install . numba astropy<6.0 numpy<1.24"
+      "pip install ."
     ],
     // List of branches to benchmark. If not provided, defaults to "main"
     // (for git) or "default" (for mercurial).
@@ -57,7 +57,7 @@
     // If missing or the empty string, the tool will be automatically
     // determined by looking for tools on the PATH environment
     // variable.
-    "environment_type": "virtualenv",
+    "environment_type": "mamba",
 
     // timeout in seconds for installing any dependencies in environment
     // defaults to 10 min
@@ -68,7 +68,7 @@
 
     // The Pythons you'd like to test against.  If not provided, defaults
     // to the current version of Python used to run `asv`.
-    "pythons": ["3.11"],
+    "pythons": ["3.8"],
 
     // The list of conda channel names to be searched for benchmark
     // dependency packages in the specified order

seems to work alright:

asv run HASHFILE:hashestobenchmark.txt --skip-existing-successful
· Fetching recent changes.
· Creating environments
· Discovering benchmarks
·· Uninstalling from mamba-py3.8
·· Building 4a288222 <v0.3> for mamba-py3.8..........
·· Installing 4a288222 <v0.3> into mamba-py3.8..
· Running 14 total benchmarks (1 commits * 1 environments * 14 benchmarks)
[ 0.00%] · For stingray commit 4a288222 <v0.3>:
[ 0.00%] ·· Benchmarking mamba-py3.8
[ 3.57%] ··· Setting up benchmarks:19                                                      ok
[ 3.57%] ··· Running (benchmarks.NonUniformSuite.time_eventlist_creation_no_checks--)....
[17.86%] ··· Setting up benchmarks:71                                                      ok
[17.86%] ··· Running (benchmarks.PowerspectrumSuite.time_crossspectrum_from_events--)....
[32.14%] ··· Setting up benchmarks:48                                                      ok
[32.14%] ··· Running (benchmarks.UniformSuite.time_lightcurve_creation_no_checks--)..
[39.29%] ··· Setting up benchmarks_highmem:19                                              ok
[39.29%] ··· Running (benchmarks_highmem.PowerspectrumSuite.time_crossspectrum_from_events--)....
[53.57%] ··· benchmarks.NonUniformSuite.time_eventlist_creation_no_checks         2.16±0.01μs
[57.14%] ··· ...hmarks.NonUniformSuite.time_eventlist_creation_with_checks        1.27±0.01μs
[60.71%] ··· ...hmarks.NonUniformSuite.time_lightcurve_creation_from_times            247±1ms
[64.29%] ··· ...NonUniformSuite.time_lightcurve_creation_from_times_no_gti            249±3ms
[67.86%] ··· benchmarks.PowerspectrumSuite.time_crossspectrum_from_events             579±1ms
[71.43%] ··· benchmarks.PowerspectrumSuite.time_crossspectrum_from_lc              40.4±0.2ms
[75.00%] ··· benchmarks.PowerspectrumSuite.time_powerspectrum_from_events             547±2ms
[78.57%] ··· benchmarks.PowerspectrumSuite.time_powerspectrum_from_lc             4.30±0.04ms
[82.14%] ··· benchmarks.UniformSuite.time_lightcurve_creation_no_checks           2.23±0.07μs
[85.71%] ··· benchmarks.UniformSuite.time_lightcurve_creation_with_checks             383±7ms
[89.29%] ··· ..._highmem.PowerspectrumSuite.time_crossspectrum_from_events         5.18±0.03s
[92.86%] ··· ...arks_highmem.PowerspectrumSuite.time_crossspectrum_from_lc         5.20±0.02s
[96.43%] ··· ..._highmem.PowerspectrumSuite.time_powerspectrum_from_events            630±4ms
[100.00%] ··· ...arks_highmem.PowerspectrumSuite.time_powerspectrum_from_lc           558±20ms
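If the setuptools error still appears with some backend, a quick sanity check run inside the created environment (a generic Python snippet, not part of asv) can show whether the build requirements actually resolve there:

```python
import importlib.util

def report(modules):
    # Map each module name to whether it is importable in the current
    # interpreter; a missing setuptools here reproduces the
    # ModuleNotFoundError seen during the old-numpy build.
    return {name: importlib.util.find_spec(name) is not None for name in modules}

print(report(["setuptools", "numpy", "astropy"]))
```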

Please reopen if this is unclear.

Thanks @HaoZeke, I will try as suggested!