arm64/Apple Silicon compatibility
paxtonfitzpatrick opened this issue · 1 comment
Installing hypertools currently fails on Apple Silicon Macs (and arm64 architectures in general). There are a few reasons for this I've figured out so far:
- requirement: `scikit-learn>=0.19.1,!=0.22,<0.24`
  - scikit-learn started releasing arm64-compatible wheels with v1.0.2. We have `<0.24` pinned because the layout of the `sklearn.decomposition` subpackage changed in v0.24, which was causing the example data (pre-built `hypertools.DataGeometry` objects saved with `deepdish`) to fail to load because they were created with an older scikit-learn version. We should really either update those objects to a different format or recreate them altogether (see the DataGeometry sketch after this list).
  - the temporary workaround for this is to build scikit-learn from source:
    - make sure the programs needed to compile the package are installed:
      - MacOS: `xcode-select --install`
      - Debian/Ubuntu: `apt-get update && apt-get install --no-install-recommends libc6-dev gcc g++`
    - install the Python packages required to build scikit-learn from source, with a separate pip-install command:

      ```
      pip install "cython>=0.28.5"        # required to build scikit-learn from source
      pip install "numpy>=1.10.4,<1.22"   # must be installed and built BEFORE building scikit-learn
      ```
    - install and build scikit-learn from the `sdist`. Using `--no-binary :all:` in place of `--no-use-pep517` should theoretically also work, but was hanging for me on MacOS.

      ```
      pip install --no-use-pep517 "scikit-learn>=0.19.1,!=0.22,<0.24"
      ```
    - Then you should be able to `pip install hypertools`. Note: if your numpy version changes from the one used to build scikit-learn, the package may fail to load (see the quick check after this list).
- requirement: `umap-learn>=0.4.6`
  - umap-learn requires numba, which in turn requires llvmlite. Both numba and llvmlite release binary wheels on PyPI for x86 architectures only, so pip-installing either one on an ARM machine will try to use the `sdist`, which will fail because llvmlite doesn't include a source distribution for LLVM.
  - the only workaround for this is to install umap-learn (or just numba, or just llvmlite) via conda/mamba:

    ```
    conda install -yc conda-forge "umap-learn>=0.4.6"
    ```
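
On the example-data point above: here's a rough sketch of how the bundled `DataGeometry` objects could be recreated with current library versions instead of shipping `deepdish`/HDF5 files tied to an old scikit-learn layout. The dataset names and output paths are placeholders, and it assumes `hyp.load` can still read the existing files in the environment doing the conversion and that `DataGeometry.get_data()` returns the raw example data.

```python
# Rough sketch (placeholders, not hypertools' actual packaging layout):
# rebuild each example DataGeometry under the currently installed libraries.
import hypertools as hyp

EXAMPLE_DATASETS = ["weights_sample", "spiral", "mushrooms"]  # assumed example names

for name in EXAMPLE_DATASETS:
    old_geo = hyp.load(name)                   # load the existing example DataGeometry
    raw_data = old_geo.get_data()              # assumed accessor for the underlying data
    new_geo = hyp.plot(raw_data, show=False)   # refit reduction/alignment with current scikit-learn
    new_geo.save(name)                         # re-save (hypertools writes a .geo file)
    print(f"recreated example dataset: {name}")
```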
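
And for the numpy caveat in the last build step, a quick sanity check I'd suggest (not something hypertools provides) after building scikit-learn from source, using scikit-learn's built-in `show_versions()` helper (available in scikit-learn >=0.20):

```python
# Confirm the freshly built scikit-learn imports cleanly and report the numpy
# it sees at runtime, so version mismatches are easy to spot.
import numpy
import sklearn

print("runtime numpy:", numpy.__version__)
sklearn.show_versions()  # prints Python, build, and dependency version info
```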
Ultimately, we really should package and release hypertools on conda-forge. That would solve both of these issues, as well as the MacOS issue of needing to install `hdf5` separately via Homebrew.
If I come across any more issues related to this, I'll track those here as well.
As of llvmlite v0.38.1 and numba v0.55.2, both packages now support Apple Silicon! 🎉
MacOS arm64 binary wheels are now available on PyPI for Python 3.8+ and MacOS 10.14+ (llvmlite, numba).
Additionally, numba v0.55.0 added support for Python 3.10, which means hypertools v0.8.0 now supports Python 3.10 as well! 🎉
I've confirmed `pip install hypertools` succeeds on my M1 Mac (MacOS 12.1) for Python 3.8, 3.9, and 3.10. Closing this issue as resolved.