eshaz/codec-parser

Implementation as a transform stream

albertojorge1983 opened this issue · 1 comment

Would it be safe to assume that this would work?
I'm getting weird results on my end.

import { Transform } from 'stream'
import CodecParser from 'codec-parser'

var codecParserTransform = new Transform({
  transform: function (chunk, encoding, cb) {
    // push the raw data of every frame parsed from this chunk downstream
    for (const frame of this.codecParser.parseChunk(chunk)) {
      this.push(frame.data)
    }

    cb()
  }
})

// the parser is attached to the transform so `this.codecParser` resolves above
codecParserTransform.codecParser = new CodecParser('audio/mpeg')

someReadableAudio.pipe(codecParserTransform).on('data', (frame) => { 
 someWritableStream.write(frame)
})

Or is there a better way of doing this?
Thanks.

codec-parser should work fine wrapped in a transform stream, and it handles partial chunks of data correctly no matter what size they are.

Unless you're seeing a specific codec-parser error, or the data is actually being parsed incorrectly, my guess is that the issue is in how your transform stream is implemented. This may help with implementing the transform stream: https://nodejs.org/api/stream.html#implementing-a-duplex-stream
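
For reference, here's a minimal sketch of that pattern, subclassing Transform as the Node.js docs describe. The "audio/mpeg" mime type and the stream names are just placeholders for whatever your source actually is:

import { Transform } from 'stream'
import CodecParser from 'codec-parser'

class CodecParserTransform extends Transform {
  constructor (mimeType) {
    super()
    // one parser per stream; it buffers any trailing partial frame internally
    this.codecParser = new CodecParser(mimeType)
  }

  _transform (chunk, encoding, callback) {
    // parseChunk() only yields frames it could fully parse from the data so far
    for (const frame of this.codecParser.parseChunk(chunk)) {
      this.push(frame.data)
    }

    callback()
  }
}

// usage: pipe straight through to the writable instead of writing in a 'data' handler
someReadableAudio.pipe(new CodecParserTransform('audio/mpeg')).pipe(someWritableStream)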