On v1.4.1, when the slow delay is triggered, the HTTP request ends up hanging indefinitely and is never resolved
ouraios opened this issue · 19 comments
I'm on Windows with NodeJS v16.15.0
From what I saw, the issue is that on this line we are passing the `req` object, and because of that the `on-finished` library triggers the onFinished callback immediately, since `req.complete` is already set (`complete` on the request means that parsing of the request is done, but not that the HTTP exchange has been resolved; source).
There are two possible solutions:
- Use the `res` object instead of the `req` one on the line linked above
- Drop the `on-finished` library entirely and replace the line linked above with `req.on('close', () => {`

I personally prefer the second solution because it means one less dependency, which is much cleaner to me.
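A minimal sketch of the idea behind the second option (this is not the library's actual code; the middleware and timer names are made up for illustration):

```js
// Sketch only: start a timer for the delay and clear it if the client
// goes away. Option 1 would keep on-finished but pass `res`; option 2
// (shown here) drops the dependency and listens for 'close' directly,
// so an already-complete request body doesn't cancel the delay early.
function delayMiddleware(delayMs) {
  return function (req, res, next) {
    const timer = setTimeout(next, delayMs);
    req.on("close", () => clearTimeout(timer));
  };
}
```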
I know @ntedgi tried listening for `close` & `end` initially - c787b59
@ntedgi can you share why that didn't work?
It looks like `on-finished` listens for `end` and `finish` on the req, as well as `error` and `close` on the underlying socket: https://github.com/jshttp/on-finished/blob/master/index.js#L105-L115
Having this same issue. My config is rather simple:
```js
const slowdownConfig = {
  windowMs: 60000,
  delayAfter: 15,
  delayMs: 1000,
  maxDelayMs: 1000,
};

app.use(slowdown(slowdownConfig));
```
Downgrading to 1.4.0 immediately fixes the problem.
Using the Node 16 Bullseye Docker image (so 16.17.0)
And after how many requests does the issue happen?
@ntedgi The first request that triggers the delay is enough to make the issue happen. I've found the explanation and the solution; we just have to choose which of the two solutions (both listed in my first message) is best for the package.
I just set up this test server: https://gist.github.com/nfriedly/c3cf7a992a9543cd2e71df72843ae014
And then I connected to it with curl, and I couldn't get it to misbehave.
Either the request would complete at the appropriate time, or else if I killed the curl command (either with command-c, or via force-quit from the macos activity monitor), the server would correctly drop the request.
If I switched to v1.4.0 it would incorrectly try to send a response after the request was closed.
This was tested on macOS 12.6.
Can you provide example code & steps to reproduce the timeout issue you're seeing?
OK, I spent some time this weekend stabilizing the existing tests so that they now appear to pass reliably on Windows, macOS, and Linux.
The one thing I want to add, that I haven't gotten to yet, is a "real world" integration test that spins up a server and makes actual HTTP requests. I have a feeling that that will be necessary to narrow this down.
Anything you can do to get me a reproducible test case would be appreciated.
I'm on express@4.18.1. I will try to get a minimal demonstration of it. I know it's hard to fix things without an example. Thanks for following up right away.
I am unable to replicate on a Windows 10 machine with Node 16.13.0, Express 4.18.1, and express-slow-down 1.4.1. I tried adding in an async route handler, wondering if that was related, but I can't reproduce the hang that we experienced in production after upgrading. When I can, I will try to run it against a Linux machine to see if the OS is a factor.
@nfriedly about the "real world" integration test, I can work on it.
I'm thinking of the following steps:
- Build a Node Docker image
- Run an Express server
- Curl a simple request and assert the response

Please supply the desired versions for Node and Express. Also, if needed, Travis CI can run the tests across all kinds of OSes (an array of Node versions × OSes).
Maybe "real world" isn't the best description; I was thinking of something that still runs inside of jest (so it would run on node 14,16, and 18 on windows, mac, and linux in CI) - but it would use an actual HTTP server and actual HTTP requests, both in node.js - possibly in the same process, possibly in separate processes.
@nfriedly before proceeding with writing the test, I would like to get your feedback on the following structure:
#33
https://github.com/nfriedly/express-slow-down/pull/33/files#diff-6bb01a9664f0d2819bc4e2290b685dddc6dfb10fd786e7fa30933ae5ec57212b
@ntedgi @nfriedly @danursin I was finally able to reproduce the bug: apparently it's body-parser causing the "issue", because it reads the request body, which triggers the request's on-finished event. That is perfectly normal (it's the whole point of on-finished on a request).
Steps to reproduce
```sh
npm i express express-slow-down body-parser
```
```js
const express = require("express");
const slowDown = require('express-slow-down');
const bodyParser = require('body-parser');

const app = express();

// parse application/x-www-form-urlencoded
app.use(bodyParser.urlencoded({ extended: false, limit: '50mb' }));

// parse application/json
app.use(bodyParser.json({ limit: '50mb' }));

app.use(slowDown({
  windowMs: 60000,
  delayAfter: 5,
  delayMs: 1000,
  maxDelayMs: 10000,
}));

// All controllers should live here
app.post("/", function rootHandler(req, res) {
  res.end("Hello world!");
});

app.listen(3000);
```
```sh
curl --request POST \
  --url http://localhost:3000/ \
  --header 'Content-Type: application/json' \
  --data '{
  "test": "thisistestjson"
}'
```
How to fix:
Refer to the two solutions in the first post of this issue for possible fixes.
Aah, file uploads! That makes sense, and wasn't something I thought to test. Thank you!
It isn't file uploads actually; just doing a POST request with some content in the body triggers the bug.
I've just copy/pasted the body-parser config I had in my project, but if you remove the `limit` attribute, which is what suggests file uploads, the bug will still happen 👍
I just realized part of the issue here is that express-slow-down should really come before body-parser, because otherwise it's not really providing any protection against a malicious user uploading tons of data.
But, nonetheless, it shouldn't hang just from being run late.
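For reference, an ordering like this (just a sketch reusing the options from the repro above, not a prescribed config) puts the slow-down middleware ahead of the body parser:

```js
// Register express-slow-down before body-parser so the delay applies
// before any request body is parsed.
const express = require("express");
const slowDown = require("express-slow-down");
const bodyParser = require("body-parser");

const app = express();

app.use(slowDown({
  windowMs: 60000,
  delayAfter: 5,
  delayMs: 1000,
  maxDelayMs: 10000,
}));

app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

app.post("/", (req, res) => res.end("Hello world!"));

app.listen(3000);
```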
I'm adding a test, dropping `on-finished`, and switching the library to `res.on('close', ...)`. I'll get a point release out soon.
v1.4.2 is now published to npm with the fix, please test and confirm that it's working for you.
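For anyone who wants to verify, something like this should pull in the patched version:

```sh
npm install express-slow-down@1.4.2
```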
Working perfectly!! Nice work!!