availability on pypi
nschloe opened this issue · 13 comments
Ever thought about making z5 available on pypi?
It's been thought of, but is tricky due to the underlying compiled libraries. I think there might be some dynamic linking which makes building wheels hard.
Pyn5 is a very similar Python binding to an alternative N5 implementation (in Rust), with PyPI deployment as a design goal, in case you're constrained to pip-installable packages.
but is tricky due to the underlying compiled libraries.
Do you mean the dependencies or the libraries built by z5 itself?
Do you mean the dependencies or the libraries built by z5 itself?
Yes, the problem is in dynamic linking to some of the dependencies.
For a long time, we needed to link dynamically to boost. Fortunately, this is no longer the case for modern compilers that support C++17, which comes with std::filesystem and so replaces boost::filesystem.
This still leaves dynamic linking to the compression libraries at runtime, but this is probably not such a big issue, because these tend to be more stable.
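The runtime side of that dynamic linking can be sanity-checked from Python. A minimal sketch (the library names below are illustrative examples of compression backends, not z5's actual dependency list):

```python
from ctypes.util import find_library

# Hypothetical compression backends a wheel might link against at runtime;
# an illustrative list, not z5's actual dependencies.
CANDIDATES = ["z", "bz2", "blosc"]

def missing_runtime_libs(names=CANDIDATES):
    """Return the sublist of shared libraries the dynamic loader cannot find."""
    return [name for name in names if find_library(name) is None]

# On most systems at least zlib ('z') resolves; anything returned here
# would have to be installed via the system package manager.
print(missing_runtime_libs())
```

A wheel could run a check like this at import time and print an actionable error instead of an opaque loader failure.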
Also, one would need to download some build dependencies that are not available on pip (xtensor / xtensor-python and pybind11) at build time.
It would be great to have it on pip, and maybe one could use Travis to auto-generate the wheels like here.
I will not have time to look into this in the foreseeable future though, but if someone wants to give this a shot I would be happy to try to help.
For a long time, we needed to link dynamically to boost.
Why was that ever a problem?
Also, one would need to download some build dependencies that are not available on pip (xtensor / xtensor-python and pybind11) during build time.
pybind11 is available from pypi. https://pypi.org/project/pybind11/
xtensor is a problem. I've submitted a report for xtensor-stack/xtensor-python#219.
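Since pybind11 ships on PyPI, a build script can pick up its headers straight from the pip-installed package. A minimal sketch (degrading gracefully when pybind11 isn't installed; a real build would instead declare it in build requirements so pip installs it first):

```python
# Sketch of how a setup script could locate pybind11 headers from the pip package.
try:
    import pybind11
    include_dir = pybind11.get_include()  # path to pybind11's C++ headers
except ImportError:
    include_dir = None  # pybind11 not installed in this environment

print(include_dir)
```

The returned path would be appended to the extension module's `include_dirs`.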
For a long time, we needed to link dynamically to boost.
Why was that ever a problem?
As far as I am aware, boost is not available on PyPI, so one would need to rely on Boost being available in the LD_LIBRARY_PATH (or however these things work on Windows...) and being the same version the library was built with, because for dynamic linking to boost the version needs to be pinned. Hope that makes sense.
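The version pinning can be made concrete with ctypes: a dynamically linked extension effectively asks the loader for an exact versioned soname, and any other installed Boost version fails to resolve. A sketch (the soname and version number are illustrative):

```python
import ctypes

def can_load(soname):
    """Return True if the dynamic loader can resolve this exact soname."""
    try:
        ctypes.CDLL(soname)
        return True
    except OSError:
        return False

# An extension built against Boost 1.66 effectively requires this exact
# soname at import time; a system with Boost 1.67 would not satisfy it.
print(can_load("libboost_filesystem.so.1.66.0"))
```

This is why shipping a wheel linked against one Boost version and relying on the user's system Boost is fragile.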
pybind11 is available from pypi. https://pypi.org/project/pybind11/
Good to know.
xtensor is a problem. I've submitted a report for xtensor-stack/xtensor-python#219.
Ok, thanks for raising this. Let's see what the response is.
As far as I am aware, boost is not available on pypi,
That's right, but also it doesn't matter. Sometimes Python libraries have dependencies outside of the pypi realm. In those cases, the user just has to be told that this software needs to be installed. In many cases (like boost), this can be easily done via the system package manager.
(It's always wise to only depend on packages which can be installed easily. xtensor is not one of them.)
That's right, but also it doesn't matter. Sometimes Python libraries have dependencies outside of the pypi realm. In those cases, the user just has to be told that this software needs to be installed. In many cases (like boost), this can be easily done via the system package manager.
For dynamic linking to boost this is not a good option though, because one needs exactly the Boost version the library was built with: say the PyPI version was built with 1.66; then the system version needs to be exactly 1.66 as well, otherwise the symbols won't match up.
In any case, as long as one can use C++17 this issue is gone.
It's always wise to only depend on packages which can be installed easily. xtensor is not one of them.
I agree, but in conda world it's very easy to install ;).
Also, it's only a build time dependency, so one could even install it via conda on the system that builds the wheels, it's not necessary to be available on the user's system.
say the PyPI version was built with 1.66.
You wouldn't upload the wheel binaries but only the source code, and let pip build and link the package.
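The sdist-only release flow suggested here can be sketched as follows (assuming the standard `build` and `twine` tools; `dry_run` just returns the commands instead of executing them):

```python
import subprocess
import sys

def release_sdist_only(dry_run=True):
    """Build only a source tarball and upload it, never wheels.

    pip then compiles and links on the user's machine, so no Boost
    (or compression-library) version gets baked into a binary.
    """
    commands = [
        [sys.executable, "-m", "build", "--sdist"],           # -> dist/*.tar.gz
        [sys.executable, "-m", "twine", "upload", "dist/*"],  # upload sources only
    ]
    if dry_run:
        return [" ".join(cmd) for cmd in commands]
    for cmd in commands:
        subprocess.run(cmd, check=True)

print(release_sdist_only())
```

The trade-off is that every user needs a working compiler toolchain and the C++ build dependencies at install time.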
In any case, as long as one can use C++17 this issue is gone.
👍
I agree, but in conda world it's very easy to install ;).
In the conda world, yes.
Also, it's only a build time dependency, so one could even install it via conda on the system that builds the wheels, it's not necessary to be available on the user's system.
If you want to provide the wheels, no problem! (For non-native Python packages, I usually only upload the sources.tar.gz and let it build on the user system. This way, I avoid version incompatibilities like the boost scenario above.)
If you want to provide the wheels, no problem! (For non-native Python packages, I usually only upload the sources.tar.gz and let it build on the user system. This way, I avoid version incompatibilities like the boost scenario above.)
Ok, I see. This would indeed solve the linking issues. Probably a good idea to do it this way for z5 in any case in order to avoid problems with missing compression libraries.
But yes, in that case one would need to have xtensor / xtensor-python on pip. One would also need nlohmann_json, which is not available on pip, but available by some other package managers, see here.
(Unfortunately the C++ standard library is still missing JSON support and multi-dimensional arrays; that's why the need for these dependencies arises.)
But yes, in that case one would need to have xtensor / xtensor-python on pip.
Well, I guess you could put it on PyPI right now and say in the README that the user needs such-and-such libraries installed. (Most Python-C++ interfacing libraries do that; one of mine is pygalmesh, which needs the huge C++ library CGAL -- fortunately installable with Debian/Ubuntu.)
While not necessary, it'd certainly be good for z5 if all required dependencies were easily installable.
I get that Boost isn't required anymore, but here's an example of building zmesh with statically linked Boost: https://github.com/seung-lab/zmesh/blob/master/.github/workflows/build_wheels.yml
Hey @constantinpape, I was wondering whether you are still interested in adding support for building wheels for PyPI. I am currently building a project that depends on this and would be interested in having it on PyPI. I have experience with scikit-build-core and cibuildwheel, but I would only give it a shot if it had a chance of being merged.
Dear @dokempf,
absolutely. I would be very happy to have this available via PyPI. I don't have much time to help out, but I can review PRs and merge this.