Suggestions about release cycle
This is a follow up of #67.
We think that the release cycle as currently implemented between the Python bindings and the back-end of WhiteboxTools is somewhat misleading to the user, and could lead to tricky situations.
Indeed, as a consuming developer, we expect the version of the Python bindings available on PyPI to be in sync with the WhiteboxTools back-end. When using v2.3.0 of the Python bindings, we should be using v2.3.0 of the WhiteboxTools back-end. This is a strong expectation made by any developer using virtually any third-party library. It notably allows rolling back to an earlier version when a newly released version breaks (which was the case between v2.3.0 and v2.4.0). However, we discovered while dealing with #67 that this is actually not the case with the Python bindings of Whitebox, because the bindings dynamically download the latest version of the WhiteboxTools back-end binaries from the Whitebox server. This made it basically impossible for us to roll back to v2.3.0, because v2.3.0 of the Python bindings was downloading v2.4.0 of the WhiteboxTools back-end anyway.
We think that a more robust design could be implemented easily. Here are a couple of options that we can think of:
- Package the WhiteboxTools back-end binaries directly in the package uploaded to PyPI. This allows the maintainer of the Python bindings to freeze/package the version of the back-end that's going to be used by the Python bindings, ensuring that they remain in sync.
- Compile the WhiteboxTools back-end binaries as part of the build process of the Python bindings, before uploading to PyPI. This is a slight variation of option 1 that starts from the sources of the back-end instead of the pre-built binaries.
- Ask the maintainer of the WhiteboxTools back-end (@jblindsay) to host versioned binaries on the Whitebox server, instead of only the latest version. This would allow the Python bindings to still download the binaries from the server, but at least use the version of the back-end that is in sync with that of the Python bindings.
We think that options 1 and 2 would be best; option 3 is a bit hackier and more error-prone, since it still depends on the Whitebox server being available.
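Whichever option is chosen, the key change is the same: the requested back-end version must be pinned instead of implicitly resolving to "latest". As a minimal sketch of what a version-pinned download could look like, assuming a hypothetical URL layout on the Whitebox server (the base URL, directory scheme, and file names below are all assumptions, not the server's real layout):

```python
import platform

# Hypothetical base URL and file-name scheme -- the real server layout
# would be up to the WhiteboxTools maintainer.
BASE_URL = "https://www.whiteboxgeo.com/WBT"

PLATFORM_SUFFIX = {
    "Windows": "win_amd64.zip",
    "Darwin": "darwin_amd64.zip",
    "Linux": "linux_amd64.zip",
}

def versioned_binary_url(version, system=None):
    """Build the URL of a *pinned* WBT release for the given platform,
    so that v2.3.0 of the bindings always fetches the v2.3.0 back-end."""
    system = system or platform.system()
    return f"{BASE_URL}/v{version}/WhiteboxTools_{PLATFORM_SUFFIX[system]}"
```

The same pinning idea applies to options 1 and 2 as well; there the version is frozen at package-build time rather than at download time.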
Let us know if we can help in any way. We think that this would be a major improvement of this (already) great toolbox, and a step that would take it closer to being production-ready in a lot of cases.
Yes, ideally, we want the PyPI package version to be the same as the WBT backend. I usually make a few patch releases (e.g., v2.3.1, v2.3.2, v2.3.3) before making the next minor release (e.g., v2.4.0). If we sync the PyPI package version with the WBT backend right away, then the PyPI version will be newer than the WBT backend whenever we fix bugs and release a new version.
And yes, it would be great to host versioned binaries on the Whitebox server so that users can roll back to an earlier version of the WBT binary. That decision is up to @jblindsay. We discussed this issue previously at giswqs/whitebox-bin#1
Thanks a lot for the rapid answer!
If we sync the PyPI package version with the WBT backend right away, then the PyPI version will be newer than the WBT backend whenever we fix bugs and release a new version.
I get that. Would it be feasible to adopt a versioning scheme along the lines of v2.4.0-1, v2.4.0-2, etc., to make it extra clear that these are the bindings to be used with v2.4.0 of the WBT back-end, with the rightmost digit being used for patches made internally to the Python bindings?
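As a side note, PEP 440 would normalize a hyphenated version like `2.4.0-1` to the post-release form `2.4.0.post1` on PyPI, but the mapping back to the back-end version stays trivial either way. A small sketch of how such a compound version string could be split (the function name is illustrative, not an existing API):

```python
def split_bindings_version(version):
    """Split a compound version like '2.4.0-1' into the WBT back-end
    version and the bindings-internal patch number (0 if absent)."""
    backend, _, patch = version.partition("-")
    return backend, int(patch) if patch else 0
```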
And yes, it would be great to host versioned binaries on the Whitebox server so that users can roll back to an earlier version of the WBT binary. That decision is up to @jblindsay. We discussed this issue previously at giswqs/whitebox-bin#1
Thanks for linking to that past discussion. I think it would definitely be useful to host these versioned binaries somewhere. I don't know what the current limitations are (legal, ownership, technical, other?), but it's a real bummer not to have access to them. We are working around this by compiling from the sources ourselves, which are tagged with proper version numbers on GitHub. However, if that's something the Python bindings do not plan to do themselves, it would be a big downside for users in our opinion.
Let's wait for @jblindsay's opinion on this matter. Again, we'd like to offer our help if you think it would be relevant.
Good suggestion. I can generate new versions following {major}.{minor}.{patch}-{pre_l}{pre_n}. I use bump-my-version to bump the version.
I do keep a copy of previous WBT releases on my own, but I did not include them in the Python bindings. I would prefer the versioned binaries to be hosted centrally on whiteboxgeo.com or the WBT GitHub repo, if that is something @jblindsay is willing to do.
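For reference, a scheme like this could be expressed in a bump-my-version configuration roughly as follows. This is a hypothetical `pyproject.toml` fragment, assuming bump-my-version's `parse`/`serialize` settings; the exact regex and part names would need to be adapted to the project:

```toml
# Hypothetical [tool.bumpversion] fragment for the
# {major}.{minor}.{patch}-{pre_l}{pre_n} scheme mentioned above.
[tool.bumpversion]
current_version = "2.4.0-post1"
parse = "(?P<major>\\d+)\\.(?P<minor>\\d+)\\.(?P<patch>\\d+)(-(?P<pre_l>[a-z]+)(?P<pre_n>\\d+))?"
serialize = [
    "{major}.{minor}.{patch}-{pre_l}{pre_n}",
    "{major}.{minor}.{patch}",
]
```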
Good suggestion. I can generate new versions following {major}.{minor}.{patch}-{pre_l}{pre_n}.
That'd be perfect 👌.
I do keep a copy of previous WBT releases on my own, but I did not include them in the Python bindings. I would prefer the versioned binaries to be hosted centrally on whiteboxgeo.com or the WBT GitHub repo, if that is something @jblindsay is willing to do.
Yes, @jblindsay's opinion will definitely be needed here.
Any reason why you're ruling out the possibility of compiling the WBT back-end as part of the Python bindings' build, and shipping the resulting binaries in the package that gets uploaded to PyPI? I think this is what would give the most control/safety over what gets packaged in the bindings (although, arguably, it is not the easiest option to implement).
The Python bindings are noarch, meaning the package itself is not OS-specific. Users download only the OS-specific WBT binary, during initialization rather than at installation. It would be overwhelming to include all five binaries (>100 MB) in the PyPI package. whitebox is used by many other packages, and we want the installation to be smooth and fast. I would prefer to pull the WBT binaries from a central location rather than including them in the PyPI package.
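The lazy-download pattern described above (noarch wheel, binary fetched on first use) can be sketched as follows. All names here are illustrative, not the package's real API, and the actual network fetch is abstracted into a caller-supplied `fetch` callable:

```python
import os
from pathlib import Path

def ensure_backend(install_dir, version, fetch):
    """Return the path to a pinned back-end binary, calling `fetch`
    (a network download supplied by the caller) only when the binary
    is missing -- so installation of the wheel itself stays fast."""
    exe = "whitebox_tools.exe" if os.name == "nt" else "whitebox_tools"
    target = Path(install_dir) / f"WBT-{version}" / exe
    if not target.exists():
        target.parent.mkdir(parents=True, exist_ok=True)
        fetch(version, target)  # downloads one OS-specific binary, not all five
    return target
```

On the second call for the same version, the cached binary is found on disk and no download happens.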
Just noting that previous binaries are available on PyPI for Whitebox Workflows.