orangewise/s3-zip

Error when sending large files


I am using s3-files as a dependency of s3-zip.

I am attempting to pipe a large collection of files that can include one large file ranging in size from 500 MB to 5 GB. During the data transfer, whenever I try to pipe a file larger than ~2 GB, I get the following error:

buffer.js:269
    throw err;
    ^

RangeError [ERR_INVALID_OPT_VALUE]: The value "2246554069" is invalid for option "size"
    at Function.allocUnsafe (buffer.js:291:3)
    at Function.concat (buffer.js:473:23)
    at bufferConcat (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\concat-stream\index.js:117:17)
    at ConcatStream.getBody (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\concat-stream\index.js:64:42)
    at ConcatStream.<anonymous> (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\concat-stream\index.js:37:51)
    at ConcatStream.emit (events.js:194:15)
    at ConcatStream.EventEmitter.emit (domain.js:441:20)
    at finishMaybe (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\readable-stream\lib\_stream_writable.js:624:14)
    at endWritable (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\readable-stream\lib\_stream_writable.js:643:3)
    at ConcatStream.Writable.end (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\readable-stream\lib\_stream_writable.js:571:22)
    at PassThrough.onend (_stream_readable.js:629:10)
    at Object.onceWrapper (events.js:277:13)
    at PassThrough.emit (events.js:194:15)
    at PassThrough.EventEmitter.emit (domain.js:441:20)
    at endReadableNT (_stream_readable.js:1103:12)
    at process._tickCallback (internal/process/next_tick.js:63:19)
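
For context, the size in the error (2246554069 bytes, roughly 2.1 GB) exceeds Node's maximum Buffer length, so Buffer.allocUnsafe throws when concat-stream tries to join the downloaded chunks into one buffer. A minimal sketch of that limit, assuming a 64-bit build of Node 10/11 (the trace still references internal/process/next_tick.js), where buffer.constants.MAX_LENGTH is 2^31 - 1:

const { constants } = require('buffer')

console.log(constants.MAX_LENGTH)  // 2147483647 (2^31 - 1) on this assumed Node build
Buffer.allocUnsafe(2246554069)     // throws: the value is invalid for option "size"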

The code:

// Express route: streams a zip of the requested dataset version back to the client.
router.get('/:mongoId/version/:versionId/download', (req, res) => {

  const { mongoId, versionId } = req.params

  // Allow up to 25 minutes for the download to complete.
  res.setTimeout(1000 * 60 * 25)

  getVersionFilesForZipFolder(mongoId, versionId)
    .then(({ ids, fileLocations }) => {
      downloadZipDatasetVersion(ids, fileLocations)
        .pipe(res)
    })
})

// Builds the zip archive stream from the S3 keys and their target paths inside the zip.
function downloadZipDatasetVersion(ids, filePaths) {

  const params = { s3: MY_S3_INSTANCE, bucket: MY_BUCKET, debug: true }
  const folder = 'MY_FOLDER/'

  return s3Zip.archive(params, folder, ids, filePaths)
}

ids is an array of S3 file keys:

[
  'file-name-in-s3-1',
  'file-name-in-s3-2',
  'file-name-in-s3-3',
  'file-name-in-s3-4',
]

and fileLocations is an array of objects of the form:

[
  { name: '/dir1/originalfilename1.txt' },
  { name: '/dir2/originalfilename2.txt' },
  { name: '/dir3/originalfilename3.txt' },
  { name: '/dir4/originalfilename4.txt' },
]

I have the same issue with large files.

Has anyone had any luck finding a workaround for this issue?
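
In case it helps, here is one possible workaround sketch, not a confirmed fix: skip the buffering entirely by appending each S3 object's read stream straight to an archive, using the archiver package and the AWS SDK v2 getObject(...).createReadStream() call. The function name streamZipDatasetVersion is just a placeholder for illustration:

const archiver = require('archiver')

// Hypothetical replacement for downloadZipDatasetVersion: streams each S3 object
// directly into the zip so no single file is ever held in one Buffer.
function streamZipDatasetVersion(ids, filePaths) {
  const archive = archiver('zip')
  const folder = 'MY_FOLDER/'

  ids.forEach((key, i) => {
    const body = MY_S3_INSTANCE
      .getObject({ Bucket: MY_BUCKET, Key: folder + key })
      .createReadStream()
    archive.append(body, { name: filePaths[i].name })
  })

  archive.finalize()
  return archive
}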