`writeFileListToStream` might hang when the writable stream becomes full (asar does not work between Node 15.2.0 and 15.4.0)
weedz opened this issue · 9 comments
This problem occurs when using node@15.2+ under Linux (Pop!_OS 20.10). Also tested on macOS 11.1 (node 15.1, 15.2 and 15.4) but no problem there. EDIT: Was able to reproduce this on the Mac setup after all. Might just be that the MacBook is too slow to trigger this problem reliably...
This might be the relevant change in Node: nodejs/node#35348.
This is reproducible (at least on my machine) in my repository https://github.com/weedz/git-good. Running the command
$ npx electron-forge package
finishes when using node@15.1, but it hangs when using node@15.2 (also tested with version 15.4).
I then changed writeFileListToStream to wait for out.writableNeedDrain to become false. Now electron-forge package finishes when using node@15.2, but this could probably be solved in a better way...
+ function sleep(ms) {
+   return new Promise((resolve) => setTimeout(resolve, ms));
+ }
const writeFileListToStream = async function (dest, filesystem, out, list, metadata) {
  for (const file of list) {
+   while (out.writableNeedDrain) {
+     await sleep(10);
+   }
    if (file.unpack) { // the file should not be packed into archive
      const filename = path.relative(filesystem.src, file.filename)
      await copyFile(`${dest}.unpacked`, filesystem.src, filename)
    } else {
      await streamTransformedFile(file.filename, out, metadata[file.filename].transformed)
    }
  }
  return out.end()
}
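In case it's useful, the polling loop above could also be replaced with an event-based wait. A minimal sketch (my own, not part of the patch above):

const { once } = require('events');

// Instead of polling every 10 ms, wait for the writable's 'drain' event,
// which fires once the internal buffer has emptied.
if (out.writableNeedDrain) {
  await once(out, 'drain');
}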
Might be a bug in Node?
Documentation for readable.pipe() reads:
... The flow of data will be automatically managed so that the destination Writable stream is not overwhelmed by a faster Readable stream.
Is this still the case when piping multiple Readable streams to the same Writable stream?
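For reference, this is roughly the pattern in question: each file's readable is piped into the same writable, which is kept open between files. A simplified sketch with hypothetical names (the real streamTransformedFile does more than this):

const fs = require('fs');

// Pipe several readables, one at a time, into a single writable that stays
// open between files; out.end() is only called after the last one.
function streamFile(filename, out) {
  return new Promise((resolve, reject) => {
    const src = fs.createReadStream(filename);
    src.pipe(out, { end: false }); // keep out open for the next file
    src.on('end', resolve);
    src.on('error', reject);
  });
}

async function writeAll(filenames, out) {
  for (const filename of filenames) {
    await streamFile(filename, out); // next pipe starts after the previous ends
  }
  out.end();
}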
This seems to be happening for me as well when attempting to deploy https://github.com/yakyak/yakyak against node v15.4.0. It's not exactly the same, I think: at a random point during the asar generation process it just stops and fails rather than hanging; however, it also won't continue processing files in the for loop.
The workaround fixes it for me, but as you've said this really should be fixed properly in Node.js. If I run it under strace it works, most likely because strace slows the process down quite a lot, so it definitely seems like the output buffer needs some time to settle.
I've added some log output to asar, and here are the relevant last lines when attempting to deploy yakyak:
writeFileListToStream {
filename: '/tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/dist/esprima.js',
unpack: false
}
streamTransformedFile /tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/dist/esprima.js
streamTransformedFile done
stream.close
stream.data
stream.data
stream.data
stream.data
stream.data
stream.end
writeFileListToStream done
writeFileListToStream {
filename: '/tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/LICENSE.BSD',
unpack: false
}
streamTransformedFile /tmp/electron-packager/linux-x64/yakyak-linux-x64/resources/app/node_modules/esprima/LICENSE.BSD
streamTransformedFile done
stream.close
[02:04:08] The following tasks did not complete: deploy:linux-x64:nodep
[02:04:08] Did you forget to signal async completion?
@HomerSp Looked into why it seems like electron-forge package hangs rather than just exiting: the CLI runs a spinner in the terminal window with packagerSpinner = ora('Packaging Application').start(). If you comment out this line the program exits (with code 0 and without any errors) instead of waiting for packagerSpinner.succeed().
Looked at your deploy script and it seems like you call packager programmatically, which of course does not run the spinner and thus just exits when the problem occurs.
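(For anyone wondering why a spinner keeps the process alive: ora renders its frames on a repeating timer, and any active timer keeps Node's event loop, and therefore the process, running. A minimal stand-in without ora:)

// An active interval timer keeps the event loop alive, so the process
// cannot exit until the timer is cleared -- the same effect as a running spinner.
const timer = setInterval(() => process.stdout.write('.'), 100);

// Clearing the timer (what spinner.succeed() effectively does) lets the
// process exit once nothing else is pending.
setTimeout(() => clearInterval(timer), 1000);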
Also ran strace on electron-forge package: https://gist.github.com/weedz/c856712a21881d00edfdd33043d89b8d, I think this is the relevant output. I can't see any difference, but I don't really know what to look for.
EDIT: Ran strace on a script which runs packager programmatically, but this worked properly just as you observed; it was a lot slower, so the stream probably had time to properly drain.
@weedz I've done some more testing, and I've found that ultimately the problem happens because of this change from that PR above: https://github.com/nodejs/node/pull/35348/files/3dd02b3c3c675d86c6be17b6fdb8e8994a1d65ed#diff-040c1f5a53844e600d40b33c4624f1fe39fcf2f8d62c76ca3fc5ea5442231469
Basically, when the output buffer is full the pipe opens up in a paused state but is never resumed, which causes the problem we're seeing. If I revert just that block of code it works as expected again.
Another way to get it to work, apart from your while-loop solution above, is to call stream.resume() after the pipe call, along the lines of the sketch below.
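(Sketch only; stream stands for the readable created for each file, and the exact pipe options asar uses may differ.)

stream.pipe(out, { end: false }); // may start paused when out is full (the regression)
stream.resume(); // explicitly switch the readable back into flowing mode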
Hopefully they can revert that part of the PR for the next release so it'll work as it should again.
This issue has now been fixed in nodejs/node#36563
Awesome!
Closing this issue since the bug has been resolved in Node.
For the record, the affected versions of Node.js were 15.2.0 through 15.4.0 inclusive. The fix mentioned was released in Node.js 15.5.0.