lisa-lab/pylearn2

tables.exceptions.NoSuchNodeError: group ``/`` does not have a child named ``/Data``

roserustowicz opened this issue · 6 comments

I am trying to run /pylearn2/scripts/datasets/make_svhn_pytables.py and am running into an issue in the code it uses from /pylearn2/datasets/svhn.py.

When it goes to create the new data that will be placed in the h5 folder, it looks for a child node that is hard coded to be in group '/' with child name '/Data', and the error shows that the group doesn't have a child with that name in the tree.

Why is this value hard coded? Does anyone know what will fix the error?

Thank you!

Here's the error and traceback ...


Please ignore the warning produced during this MAT -> Pytables
conversion for the SVHN dataset. If you are creating the
pytables for the first time then no files are modified/over-written,
they are simply written for the first time.


/usr/local/lib/python2.7/dist-packages/pylearn2/datasets/svhn.py:61: UserWarning: Because path is not same as PYLEARN2_DATA_PATH be aware that data might have been modified or pre-processed.
warnings.warn("Because path is not same as PYLEARN2_DATA_PATH "
Traceback (most recent call last):
File "make_svhn_pytables.py", line 33, in
test = SVHN('test', path=local_path)
File "/usr/local/lib/python2.7/dist-packages/pylearn2/datasets/svhn.py", line 89, in init
data = self.h5file.get_node('/', "Data")
File "/usr/local/lib/python2.7/dist-packages/tables/file.py", line 1594, in get_node
node = self._get_node(nodepath)
File "/usr/local/lib/python2.7/dist-packages/tables/file.py", line 1542, in _get_node
node = self._node_manager.get_node(nodepath)
File "/usr/local/lib/python2.7/dist-packages/tables/file.py", line 437, in get_node
node = self.node_factory(key)
File "/usr/local/lib/python2.7/dist-packages/tables/group.py", line 1160, in _g_load_child
node_type = self._g_check_has_child(childname)
File "/usr/local/lib/python2.7/dist-packages/tables/group.py", line 395, in _g_check_has_child
% (self._v_pathname, name))
tables.exceptions.NoSuchNodeError: group ``/`` does not have a child named ``/Data``
Closing remaining open files:/data/pylearn2/datasets/SVHN/format2/h5/test_32x32.h5...done
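
For anyone hitting the same traceback: here is a minimal diagnostic sketch (assuming PyTables 3.x and the file path printed above; adjust it to your own SVHN_LOCAL_PATH) that lists what actually ended up under the root group of the generated .h5 file, so you can see whether a Data node was ever written:

# Hypothetical diagnostic, not part of pylearn2: inspect the generated .h5 file.
import tables

h5_path = '/data/pylearn2/datasets/SVHN/format2/h5/test_32x32.h5'  # path from the traceback; adjust
with tables.open_file(h5_path, mode='r') as h5file:
    print(h5file)                   # prints the whole node tree
    for node in h5file.root:        # direct children of group '/'
        print(node._v_pathname)     # e.g. '/Data' if the node exists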

Hi rmr8019,

I wonder if you have managed to solve this issue yet? If so, what did you do? I have encountered the same issue. Thanks!

@vtseng I did solve the issue, but unfortunately I don't remember exactly how I fixed it. I recall deleting the h5 directory that it attempted to make and then re-running the make_svhn_pytables.py script at some point, and that seemed to work ... I wish I could be more helpful, but I can't seem to remember the specifics.

I just re-ran it again from the beginning and it didn't give me any issues. Did you download the data yourself or use the download_svhn.sh script to do so? If you downloaded it yourself, I would suggest using the provided script instead, by doing the following ...

If you navigate to ../pylearn2/scripts/datasets and run ...
./download_svhn.sh
and then
python make_svhn_pytables.py

I also had to make some changes to my ~/.bashrc file ... I added the following lines (edited with vim ~/.bashrc):
export PYLEARN2_VIEWER_COMMAND="eog --new-instance"
export PYLEARN2_DATA_PATH=/data/pylearn2/datasets
export SVHN_LOCAL_PATH=/data/pylearn2/datasets/SVHN/format2/

where the '/data/' part of those paths should be replaced with the path on your machine to whatever pylearn2 directory you use to store the data produced by running the two SVHN scripts.
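
If you want to make sure those variables are actually picked up (open a new shell or source ~/.bashrc first), here is a quick sanity check using only the standard library, assuming the variable names above:

# Hypothetical check, not part of pylearn2: print the SVHN-related env vars.
import os

for var in ('PYLEARN2_DATA_PATH', 'SVHN_LOCAL_PATH', 'PYLEARN2_VIEWER_COMMAND'):
    print('%s = %r' % (var, os.environ.get(var)))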

@rmr8019 I really appreciate your prompt comment. Previously, I downloaded the data on my own and then ran make_svhn_pytables.py (with the PATHs you mentioned).

Following your comment, I also tried running download_svhn.sh, but I still got a similar error: "/Data does not have a child named X".

I wonder if you use Python 2.x or Python 3.x? I suspect this issue results from a difference in Python version.
Thanks!

@vtseng I'm using Python 2.7. Did you also try deleting the h5 directory that it creates before you re-ran it? If I manage to remember more specifically what I changed, I'll be sure to get back to you.

@rmr8019 The issue was solved. I think the key is to delete the h5 directory after fixing other compatibility issues. Thank you so much for your help!
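
For future readers, a minimal recap sketch of the fix described in this thread (assuming SVHN_LOCAL_PATH is set as above and that you run this from pylearn2/scripts/datasets; the exact cause of the stale files may differ on your machine):

# Hypothetical recap, not part of pylearn2: remove the half-written h5
# directory and regenerate the pytables files from scratch.
import os
import shutil
import subprocess

h5_dir = os.path.join(os.environ['SVHN_LOCAL_PATH'], 'h5')
if os.path.isdir(h5_dir):
    shutil.rmtree(h5_dir)  # delete the stale/partial .h5 files
subprocess.check_call(['python', 'make_svhn_pytables.py'])  # run from pylearn2/scripts/datasets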

@vtseng Great! I'll close the issue.