nobuoka/GifWriter.js

Implement streaming

jimmywarting opened this issue · 0 comments

I don't know what structure GIF binaries have, but unless the GIF format requires a header at the beginning stating how many frames are included, I would like to request streaming functionality, if possible.

In the browser you would have to do it with ReadableStream.
I would like this to be able to run in the browser, so here is an example of what I would like to have:

// Since the browser stream API doesn't include Node's Readable stream, we would
// have to use the WHATWG ReadableStream instead. It works roughly the same way.

var frameReader = new ReadableStream({
  // gifWriter is going to consume this ReadableStream:
  // each time gifWriter calls `read()` on the stream's reader,
  // this pull function is called and asked for a new frame
  pull(controller) {
    if (last_frame()) return controller.close() // end the gif

    var url = 'http://example.com/frame_02' // url of the next frame
    return fetch(url)
      .then(res => res.arrayBuffer())
      .then(frame => controller.enqueue(frame)) // hand the frame data to gifWriter
  }
})

var gifReadableStream = omggif.gifWriter(frameReader)
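
// For what it's worth, here is a rough sketch (not a working implementation) of
// how such a gifWriter() could be built on top of ReadableStream. encodeHeader,
// encodeFrame and encodeTrailer are hypothetical placeholders for whatever the
// library does internally when it starts a gif, adds a frame and finishes the file.
function gifWriter(frameReader) {
  var reader = frameReader.getReader()
  return new ReadableStream({
    start(controller) {
      controller.enqueue(encodeHeader()) // header + logical screen descriptor
    },
    pull(controller) {
      // every pull from the consumer asks the frame source for one more frame
      return reader.read().then(result => {
        if (result.done) {
          controller.enqueue(encodeTrailer()) // finish the gif
          controller.close()
          return
        }
        controller.enqueue(encodeFrame(result.value))
      })
    }
  })
}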

// How you would consume the stream

// higher level of abstraction to concatenate all chunks
new Response(gifReadableStream).arrayBuffer().then(buffer => {
  var blob = new Blob([buffer], {type: "image/gif"})
  // do something with the blob
})
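// (note: this convenience approach still buffers the entire gif in memory
// before the blob is created, so by itself it doesn't save any memory;
// the lower level pump below, or piping to a WritableStream, is what does)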

// lower level way to consume all chunks
var reader = gifReadableStream.getReader()
var pump = () => {
  reader.read().then(result => {
    if (result.done) return // the gif is complete

    // one chunk is the same as what you would get back if you added a frame
    // and received the bytes that have been compiled for that gif frame
    var chunk = result.value

    // store or upload the chunk to the filesystem or a server

    // calling pump here is what actually causes the pull function above to be executed,
    // which means omggif doesn't hold on to any memory as everything gets
    // piped from an input stream to an output stream
    pump()
  })
}
pump()
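
// Alternatively (instead of the pump above, and assuming WritableStream and
// ReadableStream.pipeTo are available in the target browsers) the same thing
// can be expressed as a pipe, which also keeps memory usage flat:
gifReadableStream.pipeTo(new WritableStream({
  write(chunk) {
    // store or upload each encoded chunk as soon as it is produced
  }
})).then(() => {
  // the gif is complete
})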

This would save memory and make it possible to build much larger GIFs.