joncrlsn/dque

Implement lazy decoding

neilisaac opened this issue · 1 comment

Dequeue has variable latency: when it advances firstSegment, every element of the next segment file is decoded synchronously. This holds the mutex for an extended period, blocking Enqueue operations, and may delay the consumer unnecessarily.
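For context, here is a minimal sketch of the current usage, assuming dque's documented NewOrOpen/Enqueue/Dequeue signatures (the Item type is a hypothetical payload used only for illustration):

```go
package main

import (
	"fmt"
	"log"

	"github.com/joncrlsn/dque"
)

// Item is a hypothetical payload type used only for illustration.
type Item struct {
	Name string
	Data []byte
}

func main() {
	// The builder callback tells dque how to allocate items when it
	// eagerly decodes a whole segment file; the decode-into API proposed
	// below would make it unnecessary.
	builder := func() interface{} { return &Item{} }

	q, err := dque.NewOrOpen("example", "/tmp", 50, builder)
	if err != nil {
		log.Fatal(err)
	}
	if err := q.Enqueue(&Item{Name: "a"}); err != nil {
		log.Fatal(err)
	}

	// This call may advance firstSegment and decode an entire segment
	// file while holding the lock.
	v, err := q.Dequeue()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(v.(*Item).Name)
}
```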

Instead of Peek() (interface{}, error) and Dequeue() (interface{}, error) we could have

Peek(interface{}) error
Dequeue(interface{}) error

This emulates the API of json.Decoder.Decode and removes the need to provide an object builder.
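A rough sketch of what the decode-into variant could look like (not dque's actual internals; the raw slice, mutex, and ErrEmpty here are simplified placeholders, and each element is assumed to be an independently gob-encoded blob):

```go
package dque

import (
	"bytes"
	"encoding/gob"
	"errors"
	"sync"
)

// ErrEmpty is returned when the queue holds no elements.
var ErrEmpty = errors.New("dque: queue is empty")

// DQue is a simplified stand-in: elements stay as raw gob bytes until
// the caller decodes one.
type DQue struct {
	mutex sync.Mutex
	raw   [][]byte // independently gob-encoded elements
}

// Peek decodes the first element into obj without removing it.
func (q *DQue) Peek(obj interface{}) error {
	q.mutex.Lock()
	defer q.mutex.Unlock()
	if len(q.raw) == 0 {
		return ErrEmpty
	}
	return gob.NewDecoder(bytes.NewReader(q.raw[0])).Decode(obj)
}

// Dequeue decodes the first element into obj and removes it. Only the
// element being returned is decoded, so latency no longer scales with
// segment size.
func (q *DQue) Dequeue(obj interface{}) error {
	q.mutex.Lock()
	defer q.mutex.Unlock()
	if len(q.raw) == 0 {
		return ErrEmpty
	}
	if err := gob.NewDecoder(bytes.NewReader(q.raw[0])).Decode(obj); err != nil {
		return err
	}
	q.raw = q.raw[1:]
	return nil
}
```

The caller would pass a pointer, e.g. `var item Item; err := q.Dequeue(&item)`, just as with json.Decoder.Decode.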

This would allow storing a raw []byte for each element rather than a decoded object. Depending on the application, it may also reduce memory use due to gob's encoding format.
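Loading a segment could then amount to slicing the file into raw records without decoding any of them. A sketch, assuming a layout where each element is stored as a 4-byte big-endian length followed by its gob-encoded bytes (a plausible format for illustration, not necessarily dque's exact on-disk layout):

```go
package dque

import (
	"bufio"
	"encoding/binary"
	"io"
	"os"
)

// loadRawRecords reads a segment file into one []byte per element
// without gob-decoding anything.
func loadRawRecords(path string) ([][]byte, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var records [][]byte
	r := bufio.NewReader(f)
	for {
		var n uint32
		if err := binary.Read(r, binary.BigEndian, &n); err == io.EOF {
			return records, nil
		} else if err != nil {
			return nil, err
		}
		buf := make([]byte, n)
		if _, err := io.ReadFull(r, buf); err != nil {
			return nil, err
		}
		records = append(records, buf)
	}
}
```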

A further optimization to consider is seeking within the file, rather than loading the whole file into memory.
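A sketch of that variant, keeping only per-element offsets in memory and reading one record on demand (same assumed length-prefixed format as above; lazySegment and its fields are hypothetical):

```go
package dque

import (
	"encoding/binary"
	"io"
	"os"
)

// lazySegment keeps only element offsets in memory and reads one
// record at a time, so a segment full of large payloads never has to
// be resident all at once.
type lazySegment struct {
	file    *os.File
	offsets []int64 // byte offset of each element's length header
}

// readRecord seeks to element i and reads just that record.
func (s *lazySegment) readRecord(i int) ([]byte, error) {
	if _, err := s.file.Seek(s.offsets[i], io.SeekStart); err != nil {
		return nil, err
	}
	var n uint32
	if err := binary.Read(s.file, binary.BigEndian, &n); err != nil {
		return nil, err
	}
	buf := make([]byte, n)
	if _, err := io.ReadFull(s.file, buf); err != nil {
		return nil, err
	}
	return buf, nil
}
```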

This is not critical for my current application, but is worth discussing/considering.

On second thought, reducing memory use is important for one of the applications I have in mind (storing media payloads in the queue), where keeping a whole segment in memory is undesirable. Using a small max segment size may work, but is not ideal.

The interface suggestion is not strictly related to lazy decoding.