dscape/clarinet

Error: Max buffer length exceeded: textNode — what is the limit, can I raise it, and if not, what alternatives do I have?


I do have very large string values in my documents, so this is probably expected. Do I have any way of avoiding this? (I think some of the nodes could be > 64 MB. I know it sounds strange, but basically my JSON file is a database dump that can contain large text objects, or large binary objects that are HEXBINARY encoded.)

Error: Max buffer length exceeded: textNode
Line: 1
Column: 523
Char:
at error (C:\Development\YADAMU\Oracle\node\node_modules\clarinet\clarinet.js:324:10)
at checkBufferLength (C:\Development\YADAMU\Oracle\node\node_modules\clarinet\clarinet.js:108:13)
at CParser.write (C:\Development\YADAMU\Oracle\node\node_modules\clarinet\clarinet.js:650:7)
at CStream.write (C:\Development\YADAMU\Oracle\node\node_modules\clarinet\clarinet.js:253:20)
at RowParser._transform (C:\Development\YADAMU\Oracle\node\cImport.js:358:21)
at RowParser.Transform._read (_stream_transform.js:190:10)
at RowParser.Transform._write (_stream_transform.js:178:12)
at doWrite (_stream_writable.js:410:12)
at writeOrBuffer (_stream_writable.js:394:5)
at RowParser.Writable.write (_stream_writable.js:294:11)
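For context, the error comes from clarinet's internal buffer check: string values are accumulated in a `textNode` buffer, and once that buffer grows past a fixed maximum the parser throws. The sketch below illustrates the mechanism; the names and the 64 KiB default are assumptions based on clarinet's sax-js lineage, not verified against a specific release.

```javascript
// Illustrative sketch of the check behind this error, loosely modeled
// on checkBufferLength in clarinet.js. The 64 KiB default and the
// property names are assumptions for demonstration purposes.
let MAX_BUFFER_LENGTH = 64 * 1024;

function checkBufferLength(parser) {
  // clarinet tracks several buffers (textNode, numberNode, ...);
  // only textNode is shown here.
  if (parser.textNode.length > MAX_BUFFER_LENGTH) {
    throw new Error("Max buffer length exceeded: textNode");
  }
}

// A 100 KiB string value trips the default limit.
const parser = { textNode: "x".repeat(100 * 1024) };

let message = null;
try {
  checkBufferLength(parser);
} catch (e) {
  message = e.message;
}
console.log(message); // Max buffer length exceeded: textNode

// Raising the limit lets the same value through, at the cost of
// buffering the entire value in memory before it is delivered.
MAX_BUFFER_LENGTH = 128 * 1024 * 1024;
checkBufferLength(parser); // no longer throws
```

Simply raising the limit keeps memory usage proportional to the largest value in the document, which is exactly the problem for multi-megabyte LOBs.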

Same issue as #38?

Yes, missed that.

I actually have a proposed fix here...

https://github.com/markddrake/clarinet

It introduces a new 'event' that hands back a chunk of the value whenever the buffer would be exceeded. This ensures that clarinet's memory usage remains flat while parsing, and lets the consumer decide how to marshal the chunks.

I need to modify my code so that this behaviour is controlled by an option.