Is there a hard limit on FlatGeobuf size?
mtravis opened this issue · 6 comments
I seem to be hitting a hard limit on the file size of FlatGeobufs.
Anything up to around 20GB is fine, but with files bigger than that I get a segmentation fault. I don't see the same problem when using GeoJSON.
Is your dataset open source somewhere so we can try to reproduce this, and what are your computer specs, OS, etc.? Does it fail every time at the same place?
@bdon the data isn't open source, but I have an fgb based on open data that I can share (though I need to work out a way of doing that), or you could try one you have that is fairly large, e.g. >15GB.
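A comparably large test file could also be built from open data with GDAL; a rough sketch (the extract, layer name, and output path are just placeholders):

# Convert a large open dataset (e.g. an OSM extract) to FlatGeobuf with ogr2ogr.
# Any source big enough to produce a >15GB .fgb will do.
ogr2ogr -f FlatGeobuf big-test.fgb planet-latest.osm.pbf multipolygons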
The command I'm running is tippecanoe -o ovm.mbtiles ovm.fgb
I'm seeing this output straight away with large files, but smaller ones work fine:
For layer 0, using name "ovmfgb"
detected indexed FlatGeobuf: assigning feature IDs by sequence
Segmentation fault (core dumped)
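If a backtrace would help pin down where it dies, one way to get it from the core dump on a systemd-based distro like Pop!_OS or Ubuntu (paths are placeholders):

# The crash already produced a core dump; open it in gdb via coredumpctl,
# then run "bt" at the (gdb) prompt to print the backtrace.
coredumpctl gdb tippecanoe
# or, if a plain ./core file was written instead:
# gdb $(which tippecanoe) ./core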
These are the specs of my machine.
OS: Pop!_OS 22.04 LTS x86_64
CPU: 11th Gen Intel i7-1165G7 (8) @ 4.700GHz
GPU: Intel TigerLake-LP GT2 [Iris Xe Graphics]
Memory: 32GB
I'm seeing the same issue on different remote servers (all Ubuntu 20.04) too, so it isn't just a local problem.
If you have reproduced the segfault with that 15GB file, can you upload it to R2 or a similar (cheap) service? If you want to keep the link private, you can email it to me at brandon at protomaps.com and I will take a look.
@bdon thanks for the fix. I was able to update locally and recompile - first time for me :)
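In case it helps anyone else, the local rebuild was roughly the following; this assumes the felt/tippecanoe repository and Debian/Ubuntu package names:

# Install build dependencies, then build and install tippecanoe from source.
sudo apt-get install -y build-essential libsqlite3-dev zlib1g-dev
git clone https://github.com/felt/tippecanoe.git
cd tippecanoe
make -j
sudo make install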
I'll close this one now.
Oh nice! I was about to say that I had hit this too - I was just breaking up my FlatGeobufs and reading in multiple ones, which worked fine. I'm not up for compiling locally, but will try it out when it's released.
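For reference, that split-and-merge workaround looks roughly like this; the attribute, values, and file names are placeholders, and -l keeps every part in a single layer:

# Split the source into smaller FlatGeobufs by some attribute (placeholders below).
ogr2ogr -f FlatGeobuf -where "region = 'north'" ovm_part1.fgb source.gpkg
ogr2ogr -f FlatGeobuf -where "region = 'south'" ovm_part2.fgb source.gpkg
# Tile all parts together; -l merges the inputs into one layer instead of one layer per file.
tippecanoe -o ovm.mbtiles -l ovmfgb ovm_part1.fgb ovm_part2.fgb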