segfault on jobs with many moderately sized tasks
This example, reported by users, segfaults in version 2.0.0 of the package:
library(doRedis)
# 200 tasks, each a 256^3-byte (~16 MB) raw vector -- roughly 3.3 GB in total
tasks = lapply(1:200, function(i) raw(length = 256^3))
registerDoRedis(queue = "RJOBS")
ans = foreach(task = tasks, .verbose = TRUE) %dopar% { 1 }
I'm investigating the error now, but I have traced it to a bug in Redis pipelining, either in the redux package or, more likely, in the libhiredis C library that redux depends on (version 0.13.3-2, included in many Debian and Ubuntu packages). The dependency on redux is new in version 2 of doRedis.
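For reference, the failure mode can be sketched at the redux level without doRedis at all. The snippet below only illustrates the kind of traffic involved -- it is not doRedis's actual submission code -- and it assumes a Redis server on localhost; the key name "RJOBS:illustration" is made up.

## Illustration of the traffic pattern only -- not doRedis internals.
## Assumes a local Redis server; the key name is made up.
library(redux)
con <- redux::hiredis()
## Build one RPUSH command per task; each payload is a ~16 MB raw vector.
tasks <- lapply(1:200, function(i) raw(length = 256^3))
cmds  <- lapply(tasks, function(task)
  redux::redis$RPUSH("RJOBS:illustration", task))
## Sending all 200 commands through hiredis as a single pipelined batch
## is the pattern that appears to trigger the crash in libhiredis 0.13.x.
do.call(con$pipeline, cmds)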
As a work-around, a 2.0.1 version of the package will be released that temporarily avoids pipelining, sidestepping this bug. That version will be somewhat slower at submitting jobs with many tasks, however.
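A rough sketch of the work-around's idea, again at the redux level rather than the actual 2.0.1 code: issue one command per round trip instead of one big pipelined batch. It is slower to submit but does not exercise the hiredis bug. Same assumptions as above (local Redis server, made-up key name).

## Sketch of the work-around's idea only -- not the actual 2.0.1 code.
library(redux)
con   <- redux::hiredis()
tasks <- lapply(1:200, function(i) raw(length = 256^3))
## One RPUSH (and one network round trip) per task: slower to submit,
## but it avoids the large pipelined batches that crash hiredis 0.13.x.
for (task in tasks) {
  con$RPUSH("RJOBS:illustration", task)
}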
More soon...
The underlying issue is in the hiredis library and will be fixed in the upcoming 1.0 release of hiredis. See this issue and its associated pull requests: redis/hiredis#827
An improved version of the doRedis work-around is in progress and will be submitted to CRAN.
fixed now in newer hiredis versions