Inconsistent `CoCiP` output with respect to cached `ERA5` data
Description
The output of `CoCiP` varies (predicting different contrails) depending on the state of my cached `ERA5` data.
Details
- Version: 0.51.2
- OS: Windows 11
Steps to Reproduce
To reproduce and evaluate this issue I ran the same code, without any changes, 13 times. Before the 1st, 6th, and 11th run, I cleared my pycontrails cache at `C:\Users\Name\AppData\Local\pycontrails\pycontrails\Cache`. These are the results (a sketch of the repeated run follows the list):
- Clear pycontrails cache
- Re-run CoCiP → Output: 3926 contrail points
- Re-run CoCiP → Output: 295 contrail points
- Re-run CoCiP → Output: 295 contrail points
- Re-run CoCiP → Output: 295 contrail points
- Re-run CoCiP → Output: 295 contrail points
- Clear pycontrails cache
- Re-run CoCiP → Output: 1925 contrail points
- Re-run CoCiP → Output: 1359 contrail points
- Re-run CoCiP → Output: 1359 contrail points
- Re-run CoCiP → Output: 1359 contrail points
- Re-run CoCiP → Output: 1359 contrail points
- Clear pycontrails cache
- Re-run CoCiP → Output: 0 contrail points
- Re-run CoCiP → Output: 4701 contrail points
- Re-run CoCiP → Output: 4701 contrail points
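For context, a minimal sketch of the kind of script that was re-run, following the standard pycontrails CoCiP workflow (the time window, pressure levels, and `flight` object here are illustrative; the actual MWE was shared by email):

```python
# Hypothetical sketch of the repeated run; not the actual MWE.
from pycontrails.datalib.ecmwf import ERA5
from pycontrails.models.cocip import Cocip

time = ("2023-08-01 00:00", "2023-08-01 12:00")  # assumed time window
pressure_levels = [300, 250, 200]                # assumed levels

era5pl = ERA5(
    time=time,
    variables=Cocip.met_variables + Cocip.optional_met_variables,
    pressure_levels=pressure_levels,
)
era5sl = ERA5(time=time, variables=Cocip.rad_variables)

met = era5pl.open_metdataset()  # reads the local disk cache when present
rad = era5sl.open_metdataset()

cocip = Cocip(met=met, rad=rad)
cocip.eval(source=flight)       # `flight` is a pycontrails Flight (not shown)
contrails = cocip.contrail      # contrail waypoints
print(len(contrails), "contrail points")
```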
Additional Notes
The only files that can be found in the pycontrails cache are ERA5 files such as `20230801-00-era5pl0.25reanalysis.nc`, likely indicating that this is a problem with the `ERA5` cache rather than `CoCiP`.
I will look into it and update if I find anything. I am happy to share any files/code needed to reproduce this.
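For reference, a quick way to inspect what is sitting in that cache directory (plain `pathlib`, nothing pycontrails-specific; path as above, so Windows-only):

```python
from pathlib import Path

# Default pycontrails cache location on this Windows machine (from the path above)
cache_dir = Path.home() / "AppData" / "Local" / "pycontrails" / "pycontrails" / "Cache"
for f in sorted(cache_dir.glob("*.nc")):
    print(f.name, f.stat().st_size, "bytes")
```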
Initialising `ERA5` with `cachestore=None` leads to more consistent results, in that the CoCiP output changes on every run (rather than only between the first two runs).

I am not entirely familiar with ERA5/CoCiP, so perhaps this is expected behaviour? I am happy for the issue to be closed; using `cachestore=None` works for my purposes.
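For anyone else hitting this, the change is just passing `cachestore=None` when constructing `ERA5`; a minimal sketch (time, variables, and pressure levels are illustrative):

```python
from pycontrails.datalib.ecmwf import ERA5
from pycontrails.models.cocip import Cocip

# cachestore=None disables the local disk cache, so nothing is read from or
# written to the Cache directory mentioned above.
era5pl = ERA5(
    time=("2023-08-01 00:00", "2023-08-01 12:00"),
    variables=Cocip.met_variables + Cocip.optional_met_variables,
    pressure_levels=[300, 250, 200],
    cachestore=None,
)
met = era5pl.open_metdataset()
```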
This seems like very strange behavior to me... CoCiP is a deterministic model, so you should get the same results as long as you're running it on the same meteorology data.

Could you check which version of the netCDF4 package you're using? We had issues with the most recent release (see #204) and just updated pycontrails dependencies to require netCDF4<1.7.0. That change will be released in v0.52.0, most likely tomorrow (6/17).

If you're using netCDF4 1.6.*, then sharing an MWE would be great!
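One quick way to check the installed version (a standard one-liner, nothing pycontrails-specific):

```python
import netCDF4
print(netCDF4.__version__)  # should print 1.6.x, not 1.7.0
```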
I am currently using netCDF4 1.6.5, so that should be okay.

I have also tried a fresh install of Python to see if there were any issues with my current environment, but this did not seem to change anything. For this fresh install I did the following:
```bash
conda create -n debug_cocip python=3.9
conda activate debug_cocip
pip install git+https://github.com/contrailcirrus/pycontrails.git
pip install netcdf4==1.6.5
pip install "pycontrails[ecmwf]"
pip install "pycontrails[zarr]"
pip install "pycontrails[sat]"
```
I have shared an MWE via email but don't worry about it too much! It does seem likely to be an issue with netCDF4/xarray rather than pycontrails.
Thanks for the MWE! Unfortunately I'm not able to reproduce your issue: I get the same value from `print(len(contrails))` each time I run the notebook.

FWIW, I'm getting 36218 from `print(len(contrails))`. This is much larger than the values you listed, but I'm not sure whether the numbers you gave came from the same set of flights.
I'm going to close this issue, but I'll follow up by email and can always re-open it if we come across anything that needs fixing.
Thanks for testing out the MWE! I have been able to solve this now (also getting an output of 36218, consistently) by commenting out the following line, `xr_kwargs.setdefault("lock", OPEN_WITH_LOCK)`, found on line 744 in `open_dataset` (link).

As per the xarray documentation, this falls back to xarray's default behaviour: "By default, appropriate locks are chosen to safely read and write files with the currently active dask scheduler."

I am not sure why this was only an issue on my machine (perhaps a Windows 11 issue?), but I hope that can be of some help.
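For illustration, the effective difference is roughly the following (a sketch only; `OPEN_WITH_LOCK` is the module-level lock defined in pycontrails, and the filename is one of the cached ERA5 files above):

```python
import xarray as xr

xr_kwargs = {}
# Original pycontrails behaviour: force one shared lock for all file reads.
# xr_kwargs.setdefault("lock", OPEN_WITH_LOCK)   # <- the commented-out line
# With the kwarg omitted, xarray chooses an appropriate lock for the
# currently active dask scheduler.
ds = xr.open_dataset("20230801-00-era5pl0.25reanalysis.nc", **xr_kwargs)
```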
Weird, but glad you got it working and thanks for noting the solution!