Problem loading data when calling lfads_train.sh
Hello,
I have a problem when running LFADS and would really appreciate any insight. When I execute lfads_train.sh, I get the following error messages:
Reading data from /home/common/jamengual/LFADS/lfads-run-manager/src/+LorenzExperiment/MUA_example/runs_target_pre_during_post_individual_sm_20_preT/exampleSingleSession/param_83EFvp/single_LFP_Doo_s1_pre/lfadsInput
loading data from /home/common/jamengual/LFADS/lfads-run-manager/src/+LorenzExperiment/MUA_example/runs_target_pre_during_post_individual_sm_20_preT/exampleSingleSession/param_83EFvp/single_LFP_Doo_s1_pre/lfadsInput with stem lfads
Cannot open /home/common/jamengual/LFADS/lfads-run-manager/src/+LorenzExperiment/MUA_example/runs_target_pre_during_post_individual_sm_20_preT/exampleSingleSession/param_83EFvp/single_LFP_Doo_s1_pre/lfadsInput/lfads_LFP_Doo_s1_pre.h5 for reading. (NOTE: THIS FILE EXISTS IN THIS PATH)
Traceback (most recent call last):
  File "/home/common/jamengual/LFADS2/models/research/lfads//run_lfads.py", line 814, in <module>
    tf.app.run()
  File "/home/amengual/anaconda3/envs/tensorflow/lib/python2.7/site-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "/home/common/jamengual/LFADS2/models/research/lfads//run_lfads.py", line 769, in main
    datasets = load_datasets(hps.data_dir, hps.data_filename_stem)
  File "/home/common/jamengual/LFADS2/models/research/lfads//run_lfads.py", line 740, in load_datasets
    datasets = utils.read_datasets(data_dir, data_filename_stem)
  File "/home/common/jamengual/LFADS2/models/research/lfads/utils.py", line 264, in read_datasets
    data_dict = read_data(os.path.join(data_path,fname))
  File "/home/common/jamengual/LFADS2/models/research/lfads/utils.py", line 217, in read_data
    with h5py.File(data_fname, 'r') as hf:
  File "/home/amengual/anaconda3/envs/tensorflow/lib/python2.7/site-packages/h5py/_hl/files.py", line 394, in __init__
    swmr=swmr)
  File "/home/amengual/anaconda3/envs/tensorflow/lib/python2.7/site-packages/h5py/_hl/files.py", line 170, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 85, in h5py.h5f.open
IOError: Unable to open file (unable to lock file, errno = 37, error message = 'No locks available')
I am not a Python expert (at all), so I would appreciate any advice on what I can do to solve this issue.
Thanks a lot
Sorry for the rather delayed reply. Have you figured out what the issue was? If not, can you describe how the HDF5 file was created? Was it with the MATLAB run manager? This seems like an issue with the file not being readable by h5py, which suggests that something unusual happened when creating it.
Let me know if this is still an issue for you and we can dig into it!
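In the meantime, one quick way to narrow this down is to try opening the file with h5py directly, outside of run_lfads.py. The errno = 37 ('No locks available') part of the message usually points to HDF5's file locking failing on the underlying filesystem (for example an NFS mount) rather than a corrupt file, but that is only a guess until we know more about your setup. Here is a minimal sketch, assuming the path from your log and an HDF5 build new enough (1.10+) to honour the HDF5_USE_FILE_LOCKING variable:

import os

# errno 37 ("No locks available") usually means HDF5 file locking is failing on
# the underlying filesystem (common on NFS mounts), not that the file is corrupt.
# Disabling locking is one thing to try; this must be set before h5py is imported.
os.environ.setdefault("HDF5_USE_FILE_LOCKING", "FALSE")

import h5py

# Path copied from the log above; adjust if your lfadsInput directory differs.
fname = ("/home/common/jamengual/LFADS/lfads-run-manager/src/+LorenzExperiment/"
         "MUA_example/runs_target_pre_during_post_individual_sm_20_preT/"
         "exampleSingleSession/param_83EFvp/single_LFP_Doo_s1_pre/lfadsInput/"
         "lfads_LFP_Doo_s1_pre.h5")

# If this prints the dataset keys, the file itself is readable and the problem
# is the locking environment rather than how the run manager wrote the file.
with h5py.File(fname, "r") as hf:
    print(list(hf.keys()))

If this snippet fails with the same lock error, the file written by the run manager is probably fine and the issue is the filesystem; if it fails differently, that would tell us more about how the file was created.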