Question regarding creating transport pipeline
rluvaton opened this issue · 3 comments
rluvaton commented
In the docs, the following code example is shown for creating a transport pipeline:
import build from 'pino-abstract-transport'
import { pipeline, Transform } from 'stream'

export default async function (options) {
  return build(function (source) {
    const myTransportStream = new Transform({
      // Make sure autoDestroy is set,
      // this is needed in Node v12 or when using the
      // readable-stream module.
      autoDestroy: true,
      objectMode: true,
      transform (chunk, enc, cb) {
        // modifies the payload somehow
        chunk.service = 'pino'
        // stringify the payload again
        this.push(`${JSON.stringify(chunk)}\n`)
        cb()
      }
    })
    pipeline(source, myTransportStream, () => {})
    return myTransportStream
  }, {
    // This is needed to be able to pipeline transports.
    enablePipelining: true
  })
}
- Why objectMode: true and not writableObjectMode: true? We read objects from the source but write strings.
- Why do we need to call pipeline if worker-pipeline.js already does that:
Lines 28 to 35 in 1aacfd2
mcollina commented
Why objectMode: true and not writableObjectMode: true? We read objects from the source but write strings.

Backward compatibility and old habits: those side-specific options did not exist some time ago.
Why do we need to call pipeline if worker-pipeline.js already does that:

Because a Duplex is created: https://github.com/pinojs/pino-abstract-transport/blob/b6d7973a80b115b9bad3098efb669279ff9667c9/index.js#L70
rluvaton commented
Thanks!