Why are the versions set to be so old?
hmaarrfk opened this issue · 5 comments
oldest-supported-numpy/setup.cfg, line 22 at commit 27a29b8
I understand there is some discussion about whether or not to drop Python 3.6 support, but the pinnings here seem quite outdated.
$ nep29 numpy
[('1.19.0', '2020-06-20 20:37:45.624482'),
('1.18.0', '2019-12-22 15:51:31.822488'),
('1.17.0', '2019-07-26 18:35:56.431887'),
('1.16.0', '2019-01-14 03:02:28.527716')]
This output seems to indicate that we should support versions only as old as 1.16, yet the pinnings here go back to 1.13 for Python 3.6 and 1.14 for Python 3.7.
@hmaarrfk - the intent here is to specify, for each Python version, the first version of Numpy for which wheels were available - in other words, the first version of Numpy that ever supported that Python version. This ensures that things will always work regardless of whether or not other packages adopt NEP 29 (note that this package is intended to be used only at the point of building extensions, not as a runtime dependency, and the Numpy ABI should be forward-compatible).
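For context on what those pins look like, they are just install_requires entries in this repo's setup.cfg gated by environment markers, one per Python version - roughly this shape (a sketch only; the exact patch versions and the extra platform markers are whatever the setup.cfg linked above actually contains, the versions below just echo the 1.13/1.14 figures quoted in this thread):

[options]
install_requires =
    numpy==1.13.3; python_version=='3.6'
    numpy==1.14.5; python_version=='3.7'
    ...

Pip resolves exactly one of those lines inside the isolated build environment, so the extension gets compiled against the oldest Numpy that ever shipped wheels for that interpreter.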
Once a Numpy version is set for a given Python version, I think we shouldn't change it, because doing so would change the build environment of downstream packages, and packages benefit from a stable build environment. It's also worth noting that this meta-package can be used even if you require a recent version of Numpy at runtime: https://github.com/scipy/oldest-supported-numpy#can-i-use-this-if-my-package-requires-a-recent-version-of-numpy
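Concretely, a downstream project wires this up roughly as follows (a minimal sketch, not copied from any real project; the numpy>=1.17 runtime floor is just an illustrative number):

# pyproject.toml
[build-system]
requires = ["setuptools", "wheel", "oldest-supported-numpy"]
build-backend = "setuptools.build_meta"

# setup.cfg of the same package
[options]
install_requires =
    numpy>=1.17

The build-system entry only controls which Numpy headers the extension is compiled against; the runtime requirement in install_requires can be as new as the package actually needs, because the resulting binary stays compatible with newer Numpy.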
Do you have a good use case for updating say the Numpy version used for Python 3.6? Have you run into issues with the pinnings being old?
Having said all this, I agree that the name of the package may be confusing, since it could be read as meaning the oldest currently supported Numpy - maybe oldest-working-numpy would be a better name, or oldest-numpy-wheels.
In addition, maybe there is scope for an oldest-nep29-numpy meta-package that does stay in sync with the NEP 29 recommendations?
To give a concrete example of why updating the pinnings once set could be problematic, let's say I release mypackage v1.2 in October 2020, and it is set to build against Numpy 1.14 on Python 3.7 and has a minimum runtime Numpy of, say, 1.16. If in six years someone tried to install v1.2 of the package on Python 3.7 (for whatever reason), we don't want it to suddenly start building with Numpy 1.20, since that would produce a package that doesn't work with Numpy 1.16 to 1.19, even though those were valid runtime versions. In other words, we can't just change which versions of Numpy existing released packages are built with. So once set (for example, we will set the pin to 1.19.3 for Python 3.9), the pin can never be changed. Does this make sense?
I know there was one issue with packages built with 1.14 and run with 1.16 or something odd like that.
When you first pointed me to this package, I thought it was an "oldest currently supported numpy" package that abided by the NEP 29 guidelines.
There are often problems with building against such old versions: dormant bugs, untested configurations, and sometimes even builds that don't exist on PyPI.
I guess I understand the point of this package, but maybe it's not for me.
Thanks for the detailed explanation!