Allow not copying `.so` files provided by other PyPI packages (e.g. `libtorch*.so` from `torch`)
XuehaiPan opened this issue · 15 comments
`auditwheel` copies all external `.so` files into a `<packagename>.libs` folder, then packs them into the wheel file. Sometimes the package being audited (the one the developer wants to repair) only has a very small C++ extension, but it references a large `.so` file. For example, when the user builds a C++ extension for PyTorch, it references `libtorch*.so` (some files are 0.5 GiB+). This makes the final wheel file extremely large (~2 GiB).
We could add an option so that if the `.so` files are located inside `site-packages`, they are not copied; instead, the `rpath` of the target is updated.
For example, if the user builds a C++ extension for PyTorch that references `libtorch_python.so`, `auditwheel` could update the `rpath` to:

$ORIGIN/../torch/lib  # the relative path should be calculated carefully

where `$ORIGIN/..` is `site-packages`, and then add `torch` to the install dependencies.
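For illustration, a minimal sketch of how the "calculated carefully" part could be done, computing an `$ORIGIN`-relative rpath entry from the extension's location and the directory that ships the library (all paths below are made up for the example):

```python
import os

# Hypothetical locations: the extension being repaired and the lib directory
# of the package providing the shared objects, both under site-packages.
ext_path = "site-packages/my_pkg/_ext.cpython-310-x86_64-linux-gnu.so"
lib_dir = "site-packages/torch/lib"

# $ORIGIN expands at load time to the directory containing the extension,
# so the rpath entry is lib_dir expressed relative to that directory.
relative = os.path.relpath(lib_dir, start=os.path.dirname(ext_path))
rpath = os.path.join("$ORIGIN", relative)
print(rpath)  # -> $ORIGIN/../torch/lib
```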
This would be helpful for NumPy/SciPy as well, in order to consume a common OpenBLAS shared object. xref MacPython/openblas-libs#86. The delocate facility for macOS will need a similar fix; it is used in macOS wheel building by multibuild and cibuildwheel.
While this would work in simple environments, there's nothing that enforces the two packages to live in the same `site-packages` folder.
E.g. in the following extract, all three packages live in different folders, so the `rpath` trick wouldn't work.
(cp310) root@bc22915b72a6:~# pip show scipy numpy psutil
Name: scipy
Version: 1.9.1
Summary: SciPy: Scientific Library for Python
Home-page: https://www.scipy.org
Author:
Author-email:
License: BSD
Location: /root/cp310/lib/python3.10/site-packages
Requires: numpy
Required-by:
---
Name: numpy
Version: 1.23.3
Summary: NumPy is the fundamental package for array computing with Python.
Home-page: https://www.numpy.org
Author: Travis E. Oliphant et al.
Author-email:
License: BSD
Location: /root/.local/lib/python3.10/site-packages
Requires:
Required-by: scipy
---
Name: psutil
Version: 5.9.0
Summary: Cross-platform lib for process and system monitoring in Python.
Home-page: https://github.com/giampaolo/psutil
Author: Giampaolo Rodola
Author-email: g.rodola@gmail.com
License: BSD
Location: /usr/lib/python3/dist-packages
Requires:
Required-by:
While this would work in simple environments, there's nothing that enforces the two packages to live in the same `site-packages` folder.
Good point. One way out of that is shipping the library with an importable Python component, so you can query the location (just like pybind11 and numpy do for their headers). And then preload the shared library. In fact, this is necessary anyway on Windows, because there is no rpath. NumPy and SciPy do it like that for the vendored OpenBLAS: https://github.com/scipy/scipy/blob/main/tools/openblas_support.py#L206
Or maybe even easier, ship it as a Python package that one can import, and the `import pkg_with_sharedlib` itself will load the shared library. No need to mess with explicit preloading then.
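For concreteness, a minimal sketch of what such a package's `__init__.py` could look like if it preloads the library with `ctypes` (the package and file names are made up; later comments prefer doing this from a small C extension instead):

```python
# pkg_with_sharedlib/__init__.py  (all names here are hypothetical)
import ctypes
import os

# Importing this package loads the bundled library by absolute path as a
# side effect. On glibc, a DT_NEEDED soname that matches an already loaded
# library is satisfied by that copy, so consuming extensions that link
# against sharedlib.so.1 can then be imported without rpath tricks.
_lib = ctypes.CDLL(os.path.join(os.path.dirname(__file__), "sharedlib.so.1"))
```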
If we leave the rpath mangling out of this PR, then all we require from auditwheel is to be able to extend the allowlist from a command line option like in #368
On second thought, maybe in @rgommers' scheme no support is required from auditwheel, since the loading done by `import pkg_with_sharedlib` will not be noticed by auditwheel.
On second thought, maybe in @rgommers' scheme no support is required from auditwheel, since the loading done by `import pkg_with_sharedlib` will not be noticed by auditwheel.
Even with @rgommers' suggestion (I was about to post the same when I saw this had been updated), it needs an update from auditwheel (likely #368), because the C extension will still reference the soname of `sharedlib` from `pkg_with_sharedlib`, which will either prevent repairing the wheel if the library is not found by the dynamic linker, or go on with `sharedlib` being grafted.
No need to mess with explicit preloading then.
Almost sure you meant that, but you still have to preload; it's just that you've moved the complexity of the actual job to a single place (`pkg_with_sharedlib`) rather than to the consuming packages, which only have to add `import pkg_with_sharedlib` to preload the libraries.
I'm sure this works on glibc Linux & Windows. I've not tested this on musl Linux or macOS.
Almost sure you meant that, but you still have to preload,
Kinda. I meant that the Python package could have a private function that uses, for example, the CPython C API and does a single dummy call in its `__init__.py` file. That way you're sure the shared library is loaded, without having to mess with platform-specific things like `WinDLL`.
OK, I guess something like:
pkg_with_sharedlib
|- __init__.py
|- _init.abi3.so
|- sharedlib.so.1
with `__init__.py` just being `from . import _init`, and `_init.c` having a call into `sharedlib.so.1` in its `PyInit__init` function, should work without having to mess with `ctypes`.
Yes, that seems reasonable. Then the package being examined by auditwheel does not explicitly link to `sharedlib.so.1`, and no changes to auditwheel are needed.
no changes to auditwheel are needed.
No changes are needed for `pkg_with_sharedlib`, but changes are required for the packages consuming `sharedlib.so.1` from that package. They will link against `sharedlib.so.1`, and you can't just omit that entry in the DT_NEEDED section, because otherwise you'll get an undefined symbol error (e.g. `Original error was: /usr/local/lib/python3.10/dist-packages/numpy/core/_multiarray_umath.cpython-310-aarch64-linux-gnu.so: undefined symbol: cblas_caxpy64_`), since importing a C extension is done (by default) with `dlopen(RTLD_LOCAL)`.
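For anyone checking this on their own build, the DT_NEEDED entries of the consuming extension can be inspected with `readelf` from binutils; a small sketch using the path from the error message above:

```python
import subprocess

# Path taken from the undefined-symbol error quoted above; adjust as needed.
ext = ("/usr/local/lib/python3.10/dist-packages/numpy/core/"
       "_multiarray_umath.cpython-310-aarch64-linux-gnu.so")

# readelf -d prints the dynamic section; the (NEEDED) lines are the sonames
# the dynamic linker must resolve when this extension is dlopen()ed.
dyn = subprocess.run(["readelf", "-d", ext],
                     capture_output=True, text=True, check=True).stdout
print("\n".join(line for line in dyn.splitlines() if "NEEDED" in line))
```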
You might be tempted to go back to `ctypes` preloading with `RTLD_GLOBAL`, which would not require auditwheel changes, but I wouldn't; this would probably cause too many issues down the road.
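For reference, the `RTLD_GLOBAL` variant being advised against here would look roughly like this (hypothetical names again):

```python
# pkg_with_sharedlib/__init__.py  (hypothetical; the variant advised against)
import ctypes
import os

# RTLD_GLOBAL makes the library's symbols visible to every object loaded
# afterwards, so a consuming extension built without a DT_NEEDED entry for
# sharedlib.so.1 could still resolve its symbols at import time.
_lib = ctypes.CDLL(os.path.join(os.path.dirname(__file__), "sharedlib.so.1"),
                   mode=ctypes.RTLD_GLOBAL)
```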
Something like #368 is definitely needed for this to work.
They will link against `sharedlib.so.1`, and you can't just omit that entry in the DT_NEEDED section
That's probably what @mattip meant. `sharedlib.so.1` is already included as `DT_NEEDED` in the extension module(s) of the package we're running auditwheel on (the build system has done that), so auditwheel should just not touch that entry. Which is indeed what gh-368 seems to do.
@mattip maybe it's worth building an `openblas` wheel with the structure as in this comment above, and verifying that a numpy wheel built with gh-368, plus a runtime dependency on `openblas` and the needed `import openblas` in `numpy/_distributor_init.py`, works as it should?
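A rough sketch of what that hook could look like, assuming an `openblas` package whose import preloads the bundled library (the package name and the fallback are illustrative, not an existing API):

```python
# numpy/_distributor_init.py  (sketch)
try:
    import openblas  # importing the wheel loads its bundled libopenblas
except ImportError:
    # If the openblas wheel isn't installed, let the normal library search
    # (or a system BLAS) take over instead of failing the numpy import.
    pass
```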
Good idea. I will work on this in the coming days.
Following #391 (comment), I tried changing:
package
├── nested
│   ├── __init__.py
│   └── _C.so
└── __init__.py
to (adding a new nesting level without changing the package name or the soname):
package
├── nested
│   ├── __init__.py
│   └── _C
│       ├── _C.so
│       ├── _init.abi3.so
│       └── __init__.py
└── __init__.py
where:
# package/nested/_C/__init__.py
from . import _init
and `_init.abi3.so` is compiled by:
// _init.c
#include <Python.h>
PyMODINIT_FUNC PyInit__init(void) {
PyObject *m = PyImport_ImportModule("package.nested._C._C");
if (m == NULL) {
return NULL;
}
PyObject *sys_modules = PyImport_GetModuleDict();
PyDict_SetItemString(sys_modules, "package.nested._C", m); // change `sys.modules` entry back
return m;
}
command-line:
gcc --shared -Wall -O3 -fPIC -I$CONDA_PREFIX/include/python3.8 -L$CONDA_PREFIX/lib \
_init.c -o package/nested/_C/_init.abi3.so
EDIT: Everything works fine.
EDIT: For the members of `_C.so`, their `member.__module__` changes. For example, classes or functions defined in the C extension will have:
Before:
>>> import package.nested._C
>>> package.nested._C
<module 'package.nested._C' from '.../package/nested/_C.so'>
>>> package.nested._C.submodule
<module 'package.nested._C.submodule'>
>>> package.nested._C.MyClass
<class 'package.nested._C.MyClass'>
>>> package.nested._C.MyClass.__module__
'package.nested._C'
>>> package.nested._C.my_function.__module__
'package.nested._C'
After:
>>> import package.nested._C
>>> package.nested._C
<module 'package.nested._C._init' from '.../package/nested/_C/_init.abi3.so'>
>>> package.nested._C.submodule
<module 'package.nested._C._C.submodule'>
>>> package.nested._C.MyClass
<class 'package.nested._C._C.MyClass'>
>>> package.nested._C.MyClass.__module__
'package.nested._C._C'
>>> package.nested._C.my_function.__module__
'package.nested._C._C'
An extra level of the soname path is added to `member.__module__`.