appelmar/gdalcubes

Stack overflow: Error: C stack usage * is too close to the limit


Occasionally, creating a data cube crashes with a stack overflow and/or produces a corrupt result file. Unfortunately, this mostly happens after performing several operations, and there is no reproducible example yet. Tests performed so far to narrow down the source include the following:

  1. The behavior was never observed outside of R (i.e. when using the simple gdalcubes command line client to create cubes).
  2. Disabling the progress bar does not solve the issue.
  3. Using RcppThread to create C++ threads does not solve the issue.

It seems that increasing the maximum stack size with ulimit -s works around the issue.

This might be caused by the GDAL error handler defined by the sf package. That handler (set with CPLSetErrorHandler) may be called from threads created by gdalcubes, but it calls Rf_warning() (see https://github.com/r-spatial/sf/blob/5dde39c8261dc6b202e6cde9d41ad3bf0f46aa3a/src/gdal.cpp#L29), which is not thread-safe. A single-threaded test, in which all computations run in the main thread, should clarify this further.

Commit e79f4ed fixes the issue by calling CPLPushErrorHandler() before threads are started (or as the first call within a thread). However, the issue will remain open until the changes are merged into master.
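A minimal sketch of this pattern, with an assumed per-chunk worker function and setup (not the actual gdalcubes code), looks like this:

```cpp
// Sketch of the fix described above: push a handler that does not call back
// into R as the first call inside each worker thread, so GDAL messages raised
// there never reach the handler installed by the host (e.g. sf's).
// The worker function and the chunk loop are illustrative only.
#include <thread>
#include <vector>
#include "cpl_error.h"   // CPLPushErrorHandler, CPLPopErrorHandler, CPLQuietErrorHandler
#include "gdal_priv.h"   // GDALAllRegister

void process_chunk(int chunk_id) {
    // Pushed handlers are kept on a per-thread stack, so this only affects
    // the calling worker thread.
    CPLPushErrorHandler(CPLQuietErrorHandler);

    // ... GDAL reads / warps for this chunk ...

    CPLPopErrorHandler();   // restore the previous handler before returning
}

int main() {
    GDALAllRegister();
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i)
        workers.emplace_back(process_chunk, i);
    for (auto &t : workers) t.join();
    return 0;
}
```

Because the pushed handler applies only to the calling thread, the main thread keeps its usual error reporting while GDAL warnings raised in workers are silenced instead of being routed into Rf_warning().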

edzer commented

Fantastic news that you resolved this!!

edzer commented

Time for a new CRAN release? @rhijmans this might also be of interest to terra development.

Yes, planning to submit to CRAN next week

v0.6.0 is now on CRAN (binary package builds may take some more days)