jupyterlab/jupyterlab-hdf5

"File not found" error

mohammadsamani opened this issue · 8 comments

When I double-click an .h5 file in the standard file browser, I get this error message:

[screenshot]

And the "browse HDF" tab does not show me any files or folders.

[screenshot]

I am running JupyterHub version 2.2.4 and Python 3.6.9 (default, Jul 17 2020, 12:50:27).

Some extra info would be helpful for fixing this. What happens if you:

  • change the file ending to .hdf5?

  • open the file by right-clicking it and selecting "Open" instead?

Hi,
I have the same problem here. It works for neither .h5 nor .hdf5 files. I also tried the "Data Explorer" from the dataregistry-extension, and right-clicking and selecting "Open" doesn't help either.
I'm running:

python                    3.7.8           h6f2ec95_1_cpython    conda-forge
jupyter_client            6.1.7                      py_0    conda-forge
jupyter_core              4.6.3            py37hc8dfbb8_2    conda-forge
jupyterlab-hdf            0.4.1                    pypi_0    pypi
jupyter-lab               1.1.3

I think this is a bug rather than an enhancement.
Any help would be much appreciated.
Thanks a lot

@BenMoon @bitagoras @mohammadsamani Sorry for getting back to this so late.

The error can appear if the server extension is not enabled or not installed. Can you run jupyter serverextension list to check whether jupyterlab_hdf is present and loaded?

Note: The extension should work with files ending in .h5 since version 0.4.0.
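
A quick sanity check (just a sketch; the package name is taken from the pip listing below): the import has to succeed in the same Python environment that launches the Jupyter server, otherwise only the front-end extension is installed and the server-side handlers are missing.

# run in the Python environment that starts the Jupyter server
import jupyterlab_hdf  # an ImportError here means the server-side package is missing
print(jupyterlab_hdf.__file__)  # shows which environment it was loaded from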

Thanks for the reply.

$ /var/opt/jupyterhub/bin/jupyter serverextension list
config dir: /var/opt/jupyterhub/etc/jupyter
    jupyterlab  enabled
    - Validating...
      jupyterlab 2.2.4 OK
    jupyterlab_hdf  enabled
    - Validating...
      jupyterlab_hdf 0.4.1 OK

Now I can view the names of the datasets in the files, but when I double-click on them, it shows me this error message instead. I tried 10 different files; of course, they are all created in the same way.

File Load Error for GCu
{"message": "Unhandled error", "reason": null, "traceback": "Traceback (most recent call last):\n File \"/var/opt/jupyterhub/lib/python3.6/site-packages/tornado/web.py\", line 1703, in _execute\n result = await result\n File \"/var/opt/jupyterhub/lib/python3.6/site-packages/tornado/gen.py\", line 191, in wrapper\n result = func(*args, **kwargs)\n File \"/var/opt/jupyterhub/lib/python3.6/site-packages/jupyterlab_hdf/baseHandler.py\", line 105, in get\n self.finish(json.dumps(self.manager.get(path, uri, row, col)))\n File \"/usr/lib/python3.6/json/__init__.py\", line 231, in dumps\n return _default_encoder.encode(obj)\n File \"/usr/lib/python3.6/json/encoder.py\", line 199, in encode\n chunks = self.iterencode(o, _one_shot=True)\n File \"/usr/lib/python3.6/json/encoder.py\", line 257, in iterencode\n return _iterencode(o, 0)\n File \"/usr/lib/python3.6/json/encoder.py\", line 180, in default\n o.__class__.__name__)\nTypeError: Object of type 'int32' is not JSON serializable\n"}
These datasets can easily be opened in Python like this:

import h5py
import numpy as np
with h5py.File(f"{datafolder}/{filename}", 'r') as f:
    gCu = np.array(f['GCu'])  # load the 'GCu' dataset into a NumPy array
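
A quick way to confirm the dtype that trips the encoder (same placeholder datafolder and filename variables as above):

import h5py

with h5py.File(f"{datafolder}/{filename}", "r") as f:
    print(f["GCu"].dtype)  # e.g. int32, the type the JSON encoder rejects
    print(f["GCu"].shape)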

I am giving up on the extension for now.

All right, that is an easy fix, but I am curious how you could get such an error.
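
For reference, one common way to handle this (the actual change in the extension may look different) is to convert NumPy scalars and arrays to plain Python types before calling json.dumps:

import json
import numpy as np

def np_default(obj):
    """Fallback for json.dumps: turn NumPy scalars and arrays into plain Python types."""
    if isinstance(obj, np.generic):
        return obj.item()
    if isinstance(obj, np.ndarray):
        return obj.tolist()
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

print(json.dumps({"value": np.int32(7)}, default=np_default))  # {"value": 7}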

Do you have an example file available somewhere that I could use for tests?

You can download one such file from here.
It was created with IgorPro.

Thanks! I managed to reproduce the error with version 0.4.1 of the extension, but it works fine on version 0.5.0:
[screenshot]

Update the extension to the latest version and you should be good to go 👍

Solved!
Unfortunately, the server extension was missing (even though the extension manager allowed me to enable or disable it):

(base) F:\>jupyter serverextension list
config dir: C:\prog\anaconda3\etc\jupyter
    jupyterlab enabled
    - Validating...
      jupyterlab 2.2.6 ok

(base) F:\>

But now it is working. I assume I had only installed it via the extension manager and did not install the Python module. After installing it with pip install jupyterlab_hdf, everything worked.
Great tool! I am very excited that you can even view variable-length arrays (although a transposed view would be nice here).
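
For anyone curious, here is a small h5py sketch of a variable-length (ragged) dataset; the file and dataset names are made up:

import h5py
import numpy as np

# h5py.vlen_dtype needs a reasonably recent h5py; older versions use
# h5py.special_dtype(vlen=...) instead
with h5py.File("vlen_example.h5", "w") as f:
    dt = h5py.vlen_dtype(np.dtype("int32"))
    dset = f.create_dataset("ragged", (3,), dtype=dt)
    dset[0] = [1, 2, 3]
    dset[1] = [4]
    dset[2] = [5, 6]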