OPeNDAP access crashes only with pip installation on ubuntu-latest OS
aragong opened this issue · 27 comments
Hi, I am suffering crashes in my GitHub Actions tests, but only when I use netCDF4 OPeNDAP access (I tried several datasets on different URLs/servers) with netCDF4 installed via pip install netCDF4 on ubuntu-latest machines.
I supposed this could be an xarray-related error, but it seems to be an installation problem related to netCDF4. See the issue in the xarray repository: pydata/xarray#7773
I also get this error on the Google Colab platform, which I suppose uses the same environment as the ubuntu-latest OS.
I think this is related to this other open issue... not sure! #1179
Thank you in advance!
Does ncdump work on that URL? If so, it's a Python interface issue; if not, it's a netcdf-c library issue.
I tried this in a Google Colab script:
1 - I installed netCDF4 using pip --> OK
2 - ncdump is not installed on Ubuntu by default --> apt install netcdf-bin
3 - ncdump on the URL works fine in the terminal
4 - The Dataset class in Python raises an error
5 - xarray also raises an error
I am totally lost, thank you in advance! Is something wrong with the ubuntu-latest installation? Or is some requirement lost when pip installs netCDF4?
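The steps above boil down to something like this minimal sketch (the URL is the public OPeNDAP dataset used later in this thread; actually opening it requires network access, and on an affected pip install it fails with errno -68):

```python
def try_opendap(url):
    """Attempt to open an OPeNDAP URL with netCDF4; return the Dataset on
    success, or the OSError on failure (pip wheels on ubuntu-latest fail
    here with [Errno -68] NetCDF: I/O failure)."""
    from netCDF4 import Dataset  # the pip-installed wheel under test
    try:
        return Dataset(url)
    except OSError as err:
        return err

URL = "https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide/m2.hamtide11a.nc"
# result = try_opendap(URL)  # run in a Colab / ubuntu-latest environment
```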
> I think this is related to this other open issue... not sure! #1179
I think it is the same issue. The problem seems to be with the OpenSSL used in the Linux wheel. (It does not happen with the macOS and Windows wheels.)
I rebuilt the linux wheels with a newer version of openssl (1.1.1l instead of 1.0.2). The wheels are at https://github.com/MacPython/netcdf4-python-wheels/releases/tag/v1.6.3rel - @aragong could you give one of these a try?
I tested them locally and no luck. I guess that Unidata/netcdf-c#2459 was closed prematurely. Something in the SSL CA cert is not OK.
> I rebuilt the linux wheels with a newer version of openssl (1.1.1l instead of 1.0.2). The wheels are at https://github.com/MacPython/netcdf4-python-wheels/releases/tag/v1.6.3rel - @aragong could you give one of these a try?
I gave it a try in Google Colab and it is still crashing with the same error (-68); the script is linked here.
I also have one extra doubt:
Why do I now have to install netcdf-bin with apt? I don't remember having to do this in Google Colab or GitHub deployments...
Has anything changed in the Linux OS deployments? Or was the Python interface in charge of automatically installing this library in the past? Sorry about my ignorance...
Edit: With your wheel installed I do NOT have to install netcdf-bin with apt... The behavior changed again! Now I am even more confused!
I added a test using the URL https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide//m2.hamtide11a.nc to the wheel-building workflow, and I get failures on Ubuntu. However, adding the same test to the GitHub Actions workflow in the netcdf4-python repo, I don't get failures. It must either be something in the Ubuntu environment, or the versions of the lib dependencies (most likely ssl or curl).
Updating openssl to 3.0.1 and curl to 8.0.1 doesn't help. As @ocefpaf suggested, it's most likely something to do with the CA cert process happening in libcurl (discussion at Unidata/netcdf-c#2459).
Would really like to solve this, but I'm out of ideas right now. I'm tempted to blame a netcdf-c bug, save for the fact that it does work with curl installed via conda or apt-get on Ubuntu. This makes me think it still might be an issue with the way we are building ssl and/or curl.
Maybe we could try to reduce the scope of ocefpaf/netcdf4-win-wheels#6 (no macOS, ppc, and aarch) and build the Linux wheels that way too.
Try the following to get more info:
- export CURLOPT_VERBOSE=1
- ncdump -h https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide//m2.hamtide11a.nc
Thanks @DennisHeimbigner that suggestion helped. I was able to confirm that running ncdump with that URL right after the library is built works. Somehow after the libs are copied into the wheel and installed on another system the CA cert stuff no longer works. Maybe an environment variable needs to be set to point to the location of the certificates?
This is the error:
* Trying 134.100.239.238:443...
* Connected to icdc.cen.uni-hamburg.de (134.100.239.238) port 443 (#0)
* ALPN: offers http/1.1
* error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none
* error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none
* Closing connection 0
* error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none
Error:curl error: Problem with the SSL CA cert (path? access rights?)
curl error details:
/etc/pki/tls/certs is the location of the certificates on the manylinux distro, I think on Ubuntu they live in /etc/ssl/certs
I do not know how curl finds that certs location; you might do some googling to see if you can find out.
If that value is built-in to libcurl, then moving things would certainly cause a problem.
You might try the following:
- create a file ~/.ncrc
- Insert the following line in that file:
HTTP.SSL.CAPATH=<absolute path to the correct certs directory>
(see https://curl.se/libcurl/c/CURLOPT_CAPATH.html)
I notice the following options for building libcurl using Automake; I presume that CMake has similar options.
--with-ca-bundle=FILE Path to a file containing CA certificates
--with-ca-path=DIRECTORY Path to a directory containing CA certificates
--without-ca-path Don't use a default CA path
This seems relevant: OSGeo/PROJ#2320
Setting HTTP.SSL.CAPATH in .ncrc does not work - it's still looking for the certs in /etc/pki/tls/certs/ca-bundle.crt
Oops, there are two possibilities:
- the certs are in a directory, in which case use HTTP.SSL.CAPATH=<cert directory>
- all the certs are in a single file, in which case use HTTP.SSL.CAINFO=<cert bundle file path>
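That decision can be sketched in a few lines of Python; this is just an illustration, assuming the usual Ubuntu cert locations and writing to a scratch file instead of the real ~/.ncrc:

```python
import os

def ncrc_ssl_line(path):
    """Return the .ncrc setting appropriate for `path`: HTTP.SSL.CAPATH for
    a directory of individual certs, HTTP.SSL.CAINFO for a single bundle file."""
    if os.path.isdir(path):
        return f"HTTP.SSL.CAPATH={path}"
    return f"HTTP.SSL.CAINFO={path}"

# Typical Ubuntu locations (assumptions; check your own system):
#   directory of individual certs: /etc/ssl/certs
#   single bundle file:            /etc/ssl/certs/ca-certificates.crt
line = ncrc_ssl_line("/etc/ssl/certs/ca-certificates.crt")

# Written to a scratch file here; in practice this line goes into ~/.ncrc.
with open("ncrc_scratch", "w") as fh:
    fh.write(line + "\n")
```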
netcdf-c doesn't seem to be using ~/.ncrc.
>>> cat ~/.ncrc
HTTP.SSL.CAINFO=/etc/ssl/certs/ca-certificates.crt
>>> URL='https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide/m2.hamtide11a.nc'
>>> export CURLOPT_VERBOSE=1
>>> ls -l /etc/ssl/certs/ca-certificates.crt
-rw-r--r-- 1 root root 190243 Jan 17 11:39 ca-certificates.crt
>>> python -c "from netCDF4 import Dataset; nc=Dataset(\"${URL}\"); print(nc)"
* Trying 134.100.239.238:443...
* Connected to icdc.cen.uni-hamburg.de (134.100.239.238) port 443 (#0)
* ALPN: offers http/1.1
* error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none
* error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none
* Closing connection 0
* error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none
Error:curl error: Problem with the SSL CA cert (path? access rights?)
curl error details:
Warning:oc_open: Could not read url
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "src/netCDF4/_netCDF4.pyx", line 2449, in netCDF4._netCDF4.Dataset.__init__
File "src/netCDF4/_netCDF4.pyx", line 2012, in netCDF4._netCDF4._ensure_nc_success
OSError: [Errno -68] NetCDF: I/O failure: 'https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide/m2.hamtide11a.nc'
Error: Process completed with exit code 1.
I had trouble setting all the env vars and rc files with the previous wheels. I think this should be patched in netcdf-c, like PROJ did.
@aragong so far the only workaround that I can suggest is to copy or link the certs on your system to /etc/pki/tls/certs. On Ubuntu, I think this should work:
sudo mkdir -p /etc/pki/tls/certs
sudo ln /etc/ssl/certs/ca-certificates.crt /etc/pki/tls/certs/ca-bundle.crt
What version of netcdf-c are you using?
4.9.1
Put up a partial fix (PR Unidata/netcdf-c#2690) to at least allow a workaround.
Using the fix in PR Unidata/netcdf-c#2690, plus the nc_rc_set function added in version 4.9.1 to set the cert path provided by certifi (see PR #1247), the tests now pass for the wheels. Once the PRs are merged, I can rebuild the wheels with netcdf-c master, and https OPeNDAP URLs should work with the Linux wheels.
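Until rebuilt wheels land, the certifi idea can also be sketched from user code: point HTTP.SSL.CAINFO at the CA bundle that the certifi package ships. The fallback path and scratch filename here are illustrative assumptions, not part of the actual fix:

```python
# Prefer the CA bundle shipped with the certifi package; fall back to the
# standard Debian/Ubuntu bundle if certifi is not installed.
try:
    import certifi
    ca_bundle = certifi.where()
except ImportError:
    ca_bundle = "/etc/ssl/certs/ca-certificates.crt"

# netcdf-c reads ~/.ncrc at startup; a scratch file is used here so the
# sketch has no side effects outside the working directory.
with open("ncrc_demo", "w") as fh:
    fh.write(f"HTTP.SSL.CAINFO={ca_bundle}\n")
```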
Hi again,
I managed to run the example by downgrading netCDF4 to v1.5.8 on a Google Colaboratory machine. I think this is the easiest workaround so far for inexperienced people like myself.
As of today, with any version higher than netCDF4 v1.5.8 I get the same error. google colab example
The drawback of this downgrade is that pip can't manage my requirements installation for Python 3.11 (see installation log):
`Run pip install "."
Processing /home/runner/work/pyteseo/pyteseo
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Collecting geopandas (from pyteseo==0.0.6)
Downloading geopandas-0.13.0-py3-none-any.whl (1.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 13.8 MB/s eta 0:00:00
Collecting owslib (from pyteseo==0.0.6)
Downloading OWSLib-0.29.1-py2.py3-none-any.whl (221 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 221.1/221.1 kB 66.3 MB/s eta 0:00:00
Collecting xarray (from pyteseo==0.0.6)
Downloading xarray-2023.4.2-py3-none-any.whl (979 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 979.5/979.5 kB 92.9 MB/s eta 0:00:00
Collecting dask (from pyteseo==0.0.6)
Downloading dask-2023.4.1-py3-none-any.whl (1.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 117.3 MB/s eta 0:00:00
Collecting netcdf4<=1.5.8 (from pyteseo==0.0.6)
Downloading netCDF4-1.5.8.tar.gz (767 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 767.0/767.0 kB 114.0 MB/s eta 0:00:00
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'error'
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [41 lines of output]
Package hdf5 was not found in the pkg-config search path.
Perhaps you should add the directory containing `hdf5.pc'
to the PKG_CONFIG_PATH environment variable
No package 'hdf5' found
reading from setup.cfg...
HDF5_DIR environment variable not set, checking some standard locations ..
checking /home/runner/include ...
hdf5 headers not found in /home/runner/include
checking /usr/local/include ...
hdf5 headers not found in /usr/local/include
checking /sw/include ...
hdf5 headers not found in /sw/include
checking /opt/include ...
hdf5 headers not found in /opt/include
checking /opt/local/include ...
hdf5 headers not found in /opt/local/include
checking /usr/include ...
hdf5 headers not found in /usr/include
Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.11.3/x64/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/opt/hostedtoolcache/Python/3.11.3/x64/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.3/x64/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-_zm4a6be/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=['wheel'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-_zm4a6be/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-_zm4a6be/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 488, in run_setup
self).run_setup(setup_script=setup_script)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-_zm4a6be/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 338, in run_setup
exec(code, locals())
File "<string>", line 419, in <module>
File "<string>", line 360, in _populate_hdf5_info
ValueError: did not find HDF5 headers
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
Notice: A new release of pip is available: 22.3.1 -> 23.1.2
Notice: To update, run: pip install --upgrade pip
Error: Process completed with exit code 1.`
@aragong new 1.6.4 linux wheels are now on pypi that should fix this problem. Can you try and verify that it indeed does work now?
> @aragong new 1.6.4 linux wheels are now on pypi that should fix this problem. Can you try and verify that it indeed does work now?
Now it runs properly, but there is still an error with one xarray dependency. I needed to install h5pyd before installing xarray[complete].
!pip install h5pyd xarray[complete]
I suppose there is some change in this dependency that is not reflected in xarray; is that possible? (xarray[complete] should install all the dependencies, including the ones needed for OPeNDAP access.)
I mentioned this on the original xarray issue so it stays informed: pydata/xarray#7773