kriszyp/cbor-x

TypeError: Cannot create property 'slowReads' on number '0'


I randomly get this error while trying to deserialize a single big tree object using DecoderStream (v1.5.8):

TypeError: Cannot create property 'slowReads' on number '0'
    at createStructureReader (**/node_modules/cbor-x/decode.js:500:21)
    at recordDefinition (**/node_modules/cbor-x/decode.js:916:19)
    at read (**/node_modules/cbor-x/decode.js:380:7)
    at read (**/node_modules/cbor-x/decode.js:388:31)
    at checkedRead (**/node_modules/cbor-x/decode.js:202:16)
    at Decoder.decode (**/node_modules/cbor-x/decode.js:150:12)
    at Decoder.decodeMultiple (**/node_modules/cbor-x/decode.js:167:28)
    at DecoderStream._transform (**/node_modules/cbor-x/stream.js:41:26)
    at DecoderStream.Transform._write (node:internal/streams/transform:175:8)
    at writeOrBuffer (node:internal/streams/writable:392:12)
    at _write (node:internal/streams/writable:333:10)
    at DecoderStream.Writable.write (node:internal/streams/writable:337:10)
    at IncomingMessage.ondata (node:internal/streams/readable:809:22)
    at IncomingMessage.emit (node:events:517:28)
    at addChunk (node:internal/streams/readable:368:12)
    at readableAddChunk (node:internal/streams/readable:341:9)
    at IncomingMessage.Readable.push (node:internal/streams/readable:278:10)
    at HTTPParser.parserOnBody (node:_http_common:131:24)
    at TLSSocket.socketOnData (node:_http_client:541:22)
    at TLSSocket.emit (node:events:517:28)
    at addChunk (node:internal/streams/readable:368:12)
    at readableAddChunk (node:internal/streams/readable:341:9)
    at TLSSocket.Readable.push (node:internal/streams/readable:278:10)

The error seems to disappear when the DecoderStream is created with { useRecords: false }.
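
For context, this is roughly how the stream is wired up; a simplified sketch only, with the real endpoint and response data replaced by placeholders:

```js
// Simplified sketch of the setup (placeholder URL; real endpoint/data omitted).
const https = require('node:https');
const { DecoderStream } = require('cbor-x');

https.get('https://example.com/big-tree.cbor', (res) => {
  const decoderStream = new DecoderStream(); // default options -> intermittent TypeError
  // const decoderStream = new DecoderStream({ useRecords: false }); // error goes away
  res.pipe(decoderStream);
  decoderStream.on('data', (value) => {
    // decoded top-level CBOR value(s) arrive here
  });
  decoderStream.on('error', (err) => console.error(err));
});
```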

Related to that, I have some questions:

  1. Are record extensions embedded in the final encoded output? E.g. if I have 2 Encoder instances (with useRecords: true) in 2 separate processes, can one decode what the other just encoded without issues? I'm thinking of the use case of using CBOR to store general mixed values in a cache.
  2. What is the main difference between using { useRecords: false } and { mapsAsObjects: true }?
  3. Is there any advantage in using streams to encode/decode a single big object tree?

I randomly get this error while trying to deserialize a single big tree object using

Do you have any more information on how to reproduce this?

Are record extensions embedded in the final encoded output?

Yes, they are, unless you have enabled shared structures (by providing a getStructures option).
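
For reference, shared structures are the mode where the record definitions live outside the encoded buffers, so every encoder/decoder has to load and save the same structure table. A rough, untested sketch of that configuration (the file-based persistence is only an illustration, and saveStructures is assumed as the counterpart callback to getStructures):

```js
// Sketch of shared record structures: the structure table is kept outside the
// encoded buffers, so all processes must read/write the same table.
const fs = require('node:fs');
const { Encoder } = require('cbor-x');

const STRUCTURES_FILE = 'cbor-structures.json'; // illustrative storage location

const encoder = new Encoder({
  getStructures() {
    // load previously saved record structures (shared across processes)
    try {
      return JSON.parse(fs.readFileSync(STRUCTURES_FILE, 'utf8'));
    } catch {
      return [];
    }
  },
  saveStructures(structures) {
    // persist newly created record structures for the other processes
    fs.writeFileSync(STRUCTURES_FILE, JSON.stringify(structures));
  },
});

// Without getStructures/saveStructures, record definitions are embedded in
// each encoded buffer and no shared state is needed.
```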

E.g. if I have 2 Encoder instances (with useRecords: true) in 2 separate processes, can one decode what the other just encoded without issues?

Yes, they should be able to.
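
In the default (non-shared) mode the record definitions travel inside each encoded buffer, so an untested sketch like this should round-trip even when the two sides are separate processes:

```js
const { Encoder, Decoder } = require('cbor-x');

// "Process A": encode with records enabled (the default)
const buffer = new Encoder({ useRecords: true }).encode({
  users: [{ id: 1, name: 'a' }, { id: 2, name: 'b' }],
});
// ...ship `buffer` over the network or drop it into a cache...

// "Process B": a completely independent Decoder can read it back, because
// the record definitions are embedded in the buffer itself.
const value = new Decoder().decode(buffer);
console.log(value.users[1].name); // 'b'
```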

What is the main difference between using { useRecords: false } and { mapsAsObjects: true }?

useRecords: false will disable the record extension. mapsAsObjects: true will make sure that standard CBOR maps are decoded as plain JS objects instead of Maps.
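
A small illustrative sketch of the two options (untested, and assuming the decoder option is spelled mapsAsObjects as in the README):

```js
const { Encoder, Decoder } = require('cbor-x');

// useRecords: false -> objects are encoded as standard CBOR maps rather than
// with the record extension.
const buffer = new Encoder({ useRecords: false }).encode({ id: 1, name: 'a' });

// mapsAsObjects: true -> standard CBOR maps are decoded back into plain JS
// objects instead of Map instances.
const asObject = new Decoder({ mapsAsObjects: true }).decode(buffer);
console.log(asObject); // { id: 1, name: 'a' }
```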

Is there any advantage in using streams to encode/decode a single big object tree?

No, not really; every object that is encoded is encoded as a single buffer synchronously (unless you are encoding an async iterator with encodeAsIterable).
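
So for a single tree, plain encode/decode is the simplest path; the stream classes mostly pay off when a sequence of values flows over a connection. A rough, untested sketch of the contrast:

```js
const { encode, decode, EncoderStream, DecoderStream } = require('cbor-x');

// A single big object tree is encoded synchronously into one buffer anyway,
// so plain encode/decode is the straightforward choice for that case.
const tree = { level: 0, children: [{ level: 1, children: [] }] }; // placeholder data
const roundTripped = decode(encode(tree));
console.log(roundTripped.level); // 0

// EncoderStream/DecoderStream mainly help when a *sequence* of values flows
// over a connection; each value written is encoded and pushed as it arrives.
const encoderStream = new EncoderStream();
const decoderStream = new DecoderStream();
encoderStream.pipe(decoderStream); // stand-in for a real socket or HTTP pipe
decoderStream.on('data', (value) => console.log('received', value));
encoderStream.write({ seq: 1 });
encoderStream.write({ seq: 2 });
encoderStream.end();
```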