IMSMWU/RClickhouse

streaming through a table

Opened this issue · 1 comment

Imagine the following:

res <- dbSendQuery(con, queryText)
while (nrow(x <- dbFetch(res, n = 50)) > 0) {
  # ... process chunk x ...
}
dbClearResult(res)

Right now, this can fail with a memory error. However, having asked Clickhouse, it's not clear why the full result set needs to be held in memory. Is there a reason we can't stream through the results sequentially in R?
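
For reference, the chunked-fetch idiom I had in mind follows the generic DBI interface (dbHasCompleted and dbClearResult are standard DBI generics); whether rows are actually streamed from the server depends on the backend. A minimal sketch:

library(DBI)

res <- dbSendQuery(con, queryText)
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 50)   # request at most 50 rows per call
  # ... process `chunk` here ...
}
dbClearResult(res)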

This package does not support server-side cursors or "sequential streaming", as you call it. dbSendQuery returns a result that is fetched as one large chunk; the second dbFetch parameter does not define the batch size to load, it defines the maximum number of rows to load. Thus it does not work the way you expect. However, such a feature would be nice to have, and we will consider it for future enhancements. And of course: feel free to provide a pull request.
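
Until something like that is implemented, one client-side workaround is to page through the table yourself. The sketch below is only an illustration: it assumes queryText contains no LIMIT of its own, can be safely wrapped with LIMIT/OFFSET, and that the result order is stable (e.g. enforced with ORDER BY); batch and offset are placeholder names, not package features.

library(DBI)

offset <- 0
batch  <- 50000          # rows per page; tune to available memory
repeat {
  chunk <- dbGetQuery(con, sprintf("%s LIMIT %d OFFSET %d",
                                   queryText, batch, offset))
  if (nrow(chunk) == 0) break
  # ... process `chunk` here ...
  offset <- offset + batch
}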