Accessing localtileserver from a remote Jupyter environment
giswqs opened this issue · 12 comments
localtileserver works nicely locally. However, I have not been able to make it work in a cloud environment. I have tested it in multiple cloud environments without success, such as https://binder.pangeo.io, https://mybinder.org, https://streamlit.io/cloud, and Google Colab. Below is the environment.yml I used to create the environment. It would be nice to have a working environment that lets users launch a notebook and test localtileserver with a single click.
```yaml
name: tileserver
channels:
  - conda-forge
dependencies:
  - gdal=3.2.2
  - pip
  - pip:
      - geopandas
      - leafmap
      - localtileserver
```
I'm not sure if this is something that can be addressed by localtileserver... I've been meaning to outline this issue and see if there are ways around it.

In brief, localtileserver works by launching a webserver on a local port (local to wherever Python is running). If you are running Jupyter(lab) in a remote environment, then it's not going to be possible to access the port on which localtileserver is running from the Jupyter front end without some hackery or magic. The ipywidgets comm models are what are supposed to be used in place of serving data over an arbitrary port like this.
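To make the failure mode concrete, here is a minimal sketch of the usual local workflow (the raster path is just a placeholder); the key point is that the generated tile URL points at a port on the machine running the kernel:

```python
from localtileserver import TileClient, get_leaflet_tile_layer

# TileClient starts the tile-serving web server on a port local to the kernel
client = TileClient("path/to/raster.tif")  # placeholder path

# The resulting ipyleaflet layer URL looks like http://127.0.0.1:<port>/...,
# which the browser of a user on a *remote* Jupyter deployment cannot reach.
layer = get_leaflet_tile_layer(client)
print(layer.url)
```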
The TileLayer in ipyleaflet and folium both expect a slippy-map tile URL of the form https://.../z/x/y.png. To make this all work in a remote Jupyter environment, we would have to create a whole new widget model on top of the ipywidgets.TileLayer model, plus an underlying TileLayer in LeafletJS itself, to pull tiles not from a URL but from a memory object or something.
This would get very complicated really quickly... localtileserver is a little hobby project of mine and I'm not sure if I will have the availability to add support for this.
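Purely as a thought experiment, the Python side of such a widget might look something like the sketch below. Every name here (the widget class, the front-end module, the tile_source object and its get_tile method) is hypothetical and does not exist in ipyleaflet or localtileserver today:

```python
import ipywidgets as widgets
from traitlets import Unicode

class InMemoryTileLayer(widgets.DOMWidget):
    """Hypothetical widget that serves tiles over the ipywidgets comm."""
    _model_name = Unicode("InMemoryTileLayerModel").tag(sync=True)
    _model_module = Unicode("hypothetical-leaflet-extension").tag(sync=True)

    def __init__(self, tile_source, **kwargs):
        super().__init__(**kwargs)
        self._tile_source = tile_source          # anything with get_tile(x, y, z) -> PNG bytes
        self.on_msg(self._handle_tile_request)   # front end requests tiles via custom messages

    def _handle_tile_request(self, _widget, content, _buffers):
        x, y, z = content["x"], content["y"], content["z"]
        png = self._tile_source.get_tile(x, y, z)
        # A custom LeafletJS GridLayer on the front end would draw these bytes
        self.send({"x": x, "y": y, "z": z}, buffers=[png])
```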
I'm going to ping @martinRenou here to ask:
- Would there be any interest from the ipyleaflet team to implement something like this? At least adding a TileLayer that can load tiles from an object in memory rather than a URL.
- Do you have any further insight on how to approach this problem?
Otherwise, what you can do is have your tile server running on another remote server with a publicly visible URL. This is exactly what my team and I have built in ResonantGeoData and is demonstrated in this PR: ResonantGeoData/ResonantGeoData#603
Thanks for the insight. It makes a lot of sense. This feature is just on my wish list. No worries if it can't be implemented.
Just curious, what kind of memory object could potentially make this work? Would the ipyleaflet LocalTileLayer be useful in this case? I would be happy to look into it if there is a path forward.
Ahhh, LocalTileLayer is interesting... I was not aware of that, and I'll look into whether I might actually be able to get this to work with LocalTileLayer.
I don't think LocalTileLayer would even work in a remote Jupyter(lab) environment as-is. It simply sets the file path as the URL of the tile layer; the client web browser would not be able to access that.
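For context, this is roughly how LocalTileLayer is used; the path template is handed straight to LeafletJS, so it is resolved by the browser rather than by the kernel's filesystem:

```python
from ipyleaflet import Map, LocalTileLayer

# The path becomes the tile URL template in the browser; tiles that only
# exist on the remote kernel's filesystem are therefore unreachable.
m = Map(center=(0, 0), zoom=2)
m.add_layer(LocalTileLayer(path="tiles/{z}/{x}/{y}.png"))
```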
> Just curious, what kind of memory object could potentially make this work?
We would have to make an ipywidget/comm that could pass the bytes of a PNG image. Then I'd have to refactor localtileserver a bit to have a version that doesn't run in a webserver but just has a get_tile(x, y, z) method that is called by the ipywidget model.
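As an illustration only, a server-less tile source with that shape could look like the following; rio-tiler is used here purely as an example renderer and is not necessarily what localtileserver uses internally:

```python
from rio_tiler.io import COGReader  # rio_tiler.io.Reader in newer rio-tiler versions

class MemoryTileSource:
    """Hypothetical tile source with no web server: just get_tile(x, y, z)."""

    def __init__(self, path):
        self.path = path

    def get_tile(self, x, y, z) -> bytes:
        # Read the requested slippy-map tile and return it as PNG bytes,
        # which an ipywidget model could then ship to the front end.
        with COGReader(self.path) as src:
            tile = src.tile(x, y, z)
        return tile.render(img_format="PNG")
```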
Would the following code be relevant?
https://github.com/giswqs/leafmap/blob/master/leafmap/leafmap.py#L1400
```python
from base64 import b64encode
from io import BytesIO
from PIL import Image

image = Image.open(url)
f = BytesIO()
image.save(f, ext)
data = b64encode(f.getvalue())  # base64-encode the in-memory image bytes
data = data.decode("ascii")
url = "data:image/{};base64,".format(ext) + data
```
Yep, that's what we'd do, but we'd need ipyleaflet to be able to take each one of those data URLs as tiles in the TileLayer.
This looks promising as a way to proxy the local web server over to Jupyter on MyBinder: https://jupyter-server-proxy.readthedocs.io/en/latest/arbitrary-ports-hosts.html
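An untested sketch of that idea: rewrite the localhost tile URL so the browser fetches tiles through the /proxy/<port>/ route that jupyter-server-proxy exposes (the exact prefix depends on the deployment's base URL):

```python
from urllib.parse import urlsplit
from localtileserver import TileClient, get_leaflet_tile_layer

client = TileClient("path/to/raster.tif")  # placeholder path
layer = get_leaflet_tile_layer(client)

# layer.url is something like http://127.0.0.1:<port>/api/tiles/{z}/{x}/{y}.png?...
# Point it at jupyter-server-proxy instead so requests go through the Jupyter server.
parts = urlsplit(layer.url)
layer.url = layer.url.replace(
    f"{parts.scheme}://{parts.netloc}", f"/proxy/{parts.port}"
)
```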
Jupyter Server Proxy seems interesting. I will look into it.
FYI, #32 does provide a workaround by using a remotely hosted instance.
Thank you very much for implementing this.