Failing with large datasets
trasher opened this issue · 3 comments
Hello,
I'm trying to set up a local Nominatim instance with the full France data. When I use your Dockerfile verbatim, all goes well: the build succeeds, I'm able to run the container, and so on.
But when I replace the Monaco data with the France data, I get the following output:
CREATE INDEX
CREATE INDEX
Setup finished.
ApplyLayer exit status 1 stdout: stderr: write /var/lib/postgresql/9.3/main/base/24576/27371.7: read-only file system
The only thing that changes between the two tests is the URL passed to wget (http://download.geofabrik.de/europe/france-latest.osm.pbf). Do you have any idea what could be wrong?
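For reference, the change amounts to something like this in the Dockerfile (the exact RUN instruction is my guess at how it is written there):

# works: small Monaco extract
RUN wget http://download.geofabrik.de/europe/monaco-latest.osm.pbf
# fails: much larger France extract
RUN wget http://download.geofabrik.de/europe/france-latest.osm.pbf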
Thank you!
One more detail: the build runs for 36 hours before failing...
How much disk space do you have available? Are you using a virtual machine with limited disk space? I'm using the Germany data without issues, but the import takes a few days to finish.
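One quick way to check on the host, assuming Docker's default data root:

df -h /var/lib/docker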
After the failure, free disk space is about 80 GiB. But I think I've found the issue...
% sudo docker info
[...]
Storage Driver: devicemapper
[...]
Data Space Used: 107.4 GB
Data Space Total: 107.4 GB
Data Space Available: 0 B
After some searching on the web, it appears that the devicemapper driver (which is the default on our hosts) creates data files of about 100 GiB in total, and there is no "docker" way to override this.
So, indeed, we do not have enough free space.
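For what it's worth, the 107.4 GB figure seems to match the loopback data file that devicemapper creates; it can be inspected directly, assuming the default location:

ls -lh /var/lib/docker/devicemapper/devicemapper/data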
What storage driver are you using for Germany? How much space is required for this import?
Thank you :)
I changed it internally to move the PostgreSQL DB folder to a volume mounted on the host. This was originally done to make backups easier in our backup infrastructure, but it coincidentally seems to fix the devicemapper issue as well, since data written to a host-mounted volume does not count against devicemapper's data space.
Gonna need to find some time to update the Dockerfile in this repo.
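A rough sketch of what that change could look like (the host path and image name below are placeholders, not the repo's actual values; the container path is the one from the error above):

# Dockerfile: declare the PostgreSQL data directory as a volume
VOLUME /var/lib/postgresql/9.3/main

# at run time, bind-mount a host directory over it so the database
# lives outside the devicemapper-backed container filesystem
docker run -d -v /srv/nominatim/postgresql:/var/lib/postgresql/9.3/main nominatim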