Segfault when transferring objects >2GB
Strange issue: rkdb seems to segfault with a "memory not mapped" error (bringing down the entire R process) when transferring objects larger than about 2 GB. This is on a 64-bit Linux machine with the professional version of kdb+/q.
Can you provide a reproducible example?
Is anyone else able to reproduce this, or is it specific to my setup?
Currently, none of the client libraries support message sizes >2GB. This includes c.o, which is used by the R interface.
You would need to split your data and fetch it in chunks.
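For example, here is a minimal sketch of pulling a large vector in several smaller IPC messages instead of one oversized one. The host, port, total size, and chunk size are all assumptions; adjust them for your data.

```r
# Minimal sketch: split one large pull into several smaller IPC messages.
# Host, port, total size and chunk size are assumptions for illustration.
library(rkdb)

h <- open_connection("localhost", 5000)

n     <- 312500000L   # ~2.5 GB of longs if fetched in a single message
chunk <-  50000000L   # ~400 MB per message, well under the 2GB limit

pieces <- vector("list", ceiling(n / chunk))
for (i in seq_along(pieces)) {
  start <- (i - 1L) * chunk
  len   <- min(chunk, n - start)
  # q expression "start + til len" returns one slice of the full range
  pieces[[i]] <- execute(h, sprintf("%d + til %d", start, len))
}
out <- unlist(pieces)  # reassemble the full vector on the R side

close_connection(h)
```

The same idea works for tables: partition the query on a column such as date and bind the chunks together on the R side.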
It doesn't crash for me:
> qcon <- rkdb::open_connection()
> out <- rkdb::execute(qcon, "til 312500000")
Error in rkdb::execute(qcon, "til 312500000") : Error from kdb+: `limit
What version of kdb+ are you using and what version of rkdb?
I cannot reproduce this. Are there any specific command-line params you pass to the kdb+ process? Any errors on the kdb+ side before R crashes?
Could you install the latest version of rkdb from GitHub and see if it still happens?
Did it work for you with 3.5?
Still segfaults, unfortunately. It must be something in my particular environment. Appreciate the help.