Import Error
Hi Slerch,
I am trying to run your code with Anaconda Python 3.8. I am experiencing the following issues when running run_experiment.py and run_all.py:
(1) run_experiment.py
Traceback (most recent call last):
File "run_experiment.py", line 6, in
from nn_src.imports import *
File "G:\ppnn-master\nn_postprocessing\nn_src\imports.py", line 4, in
from .losses import *
File "G:\ppnn-master\nn_postprocessing\nn_src\losses.py", line 9, in
from tensorflow import erf
ImportError: cannot import name 'erf' from 'tensorflow' (C:\Users\user\anaconda3\lib\site-packages\tensorflow_init_.py)
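For reference, erf is no longer exported from the top-level tensorflow namespace in TF 2.x; it lives under tf.math. A small compatibility shim for the import in losses.py, assuming a TF 2.x install, could look like this:

```python
import tensorflow as tf

try:
    from tensorflow import erf   # TF 1.x location used by the original code
except ImportError:
    erf = tf.math.erf            # TF 2.x location
```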
(2) run_experiment.py
Traceback (most recent call last):
File "run_experiment.py", line 6, in
from nn_src.imports import *
File "G:\ppnn-master\nn_postprocessing\nn_src\imports.py", line 25, in
limit_mem()
File "G:\ppnn-master\nn_postprocessing\nn_src\imports.py", line 22, in limit_mem
config = tf.ConfigProto()
AttributeError: module 'tensorflow' has no attribute 'ConfigProto'
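Similarly, ConfigProto was removed from the top-level namespace in TF 2.x but is still reachable through the v1 compatibility module, so something like the following (an assumption about a TF 2.x setup, not the repository's code) should work:

```python
import tensorflow as tf

config = tf.compat.v1.ConfigProto()
config.gpu_options.allow_growth = True  # let GPU memory grow on demand
```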
(3) run_experiment.py
Traceback (most recent call last):
File "run_experiment.py", line 6, in
from nn_src.imports import *
File "G:\ppnn-master\nn_postprocessing\nn_src\imports.py", line 25, in
limit_mem()
File "G:\ppnn-master\nn_postprocessing\nn_src\imports.py", line 24, in limit_mem
keras.backend.tensorflow_backend.set_session(tf.Session(config=config))
AttributeError: module 'keras.backend' has no attribute 'tensorflow_backend'
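The set_session call in the same limit_mem() function has the analogous TF 2.x replacement; a patched version of the whole function might look like this sketch (not the repository's exact code):

```python
import tensorflow as tf

def limit_mem():
    # Allow GPU memory to grow on demand instead of pre-allocating everything
    config = tf.compat.v1.ConfigProto()
    config.gpu_options.allow_growth = True
    # keras.backend.tensorflow_backend no longer exists with TF 2.x;
    # tf.compat.v1.keras.backend.set_session is the compatible replacement
    tf.compat.v1.keras.backend.set_session(tf.compat.v1.Session(config=config))
```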
(4) run_experiment.py
WARNING:tensorflow:From G:\ppnn-master\nn_postprocessing\nn_src\imports.py:26: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
2021-11-12 16:24:21.778641: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
Traceback (most recent call last):
File "run_experiment.py", line 292, in
main(args)
File "run_experiment.py", line 29, in main
train_set, test_set = get_train_test_sets(
File "G:\ppnn-master\nn_postprocessing\nn_src\utils.py", line 87, in get_train_test_sets
raw_data = load_raw_data(data_dir, aux_dict, full_ensemble_t)
File "G:\ppnn-master\nn_postprocessing\nn_src\utils.py", line 157, in load_raw_data
rg = Dataset(data_dir + 'data_interpolated_00UTC.nc')
File "netCDF4_netCDF4.pyx", line 2358, in netCDF4._netCDF4.Dataset.init
File "netCDF4_netCDF4.pyx", line 1926, in netCDF4._netCDF4._ensure_nc_success
FileNotFoundError: [Errno 2] No such file or directory: b'/project/meteo/w2w/C7/ppnn_data/data_interpolated_00UTC.nc'
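The path in (4) is the authors' cluster directory, which is hardcoded as the default data_dir. Pointing data_dir at a local folder that actually contains data_interpolated_00UTC.nc avoids the FileNotFoundError; the path below is only a placeholder:

```python
# Placeholder: set this to the local folder holding data_interpolated_00UTC.nc
data_dir = 'G:/ppnn_data/'
```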
(5) run_all.py
Traceback (most recent call last):
File "run_all.py", line 14, in
raise Exception('Working directory not recognized.')
Exception: Working directory not recognized.
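Judging from the error, run_all.py maps a few known working directories to data paths and refuses anything else. A hedged guess at the kind of check involved, and one way to extend it for a local machine (variable names are illustrative, not the repository's exact code):

```python
import os

cwd = os.getcwd()
if cwd.startswith('G:\\ppnn-master'):   # add your own checkout location here
    data_dir = 'G:/ppnn_data/'          # local folder with the NetCDF files
else:
    raise Exception('Working directory not recognized.')
```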
Moreover, what type of data should be present in data_dir = '/Volumes/STICK/data/ppnn_data/'?
Could you please address these issues?
Please see the replication notebook (RL18_replication.ipynb), which I updated a few moments ago to ensure compatibility with newer Python/TensorFlow versions. The original code partly uses functionality from older TensorFlow versions and might not be compatible.
Regarding (4) and (5), the specific calls refer to datasets that were generated using the code in the data_retrieval and data_processing folders. Downloading the original dataset from TIGGE will likely be time-consuming; you may instead consider using the final dataset we provided for replication purposes at https://doi.org/10.6084/m9.figshare.13516301.v1.
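After downloading the figshare dataset, a quick way to confirm that data_dir is set correctly is to open the NetCDF file directly (the path below is a placeholder for wherever the files were saved):

```python
from netCDF4 import Dataset

data_dir = 'G:/ppnn_data/'   # placeholder local directory

# Open the interpolated 00 UTC forecast file and list its variables
with Dataset(data_dir + 'data_interpolated_00UTC.nc') as rg:
    print(list(rg.variables))
```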