request/request

How to avoid - (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.

framlin opened this issue · 59 comments

I get a trace that tells me I should increase the listener limit.
How can I call emitter.setMaxListeners() for request, or how can I avoid the problem?

are you sure this is an issue with request?

It happens when max-redirects gets exceeded. Another trace here:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.EventEmitter.once (events.js:189:8)
at Request.init (/app/node_modules/request/main.js:334:8)
at ClientRequest.<anonymous> (/app/node_modules/request/main.js:589:12)
at ClientRequest.g (events.js:185:14)
at ClientRequest.EventEmitter.emit (events.js:88:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:402:27)

somewhere you've got a bug. this is a guard in core.

I can force this to happen by using jar:false when redirects exceed the limit.

var request = require('request');

// This URL causes max redirect to be exceeded - check it out on http://www.rexswain.com/httpview.html
var testUrl = 'http://www.nytimes.com/2012/09/19/us/politics/in-leaked-video-romney-says-middle-east-peace-process-likely-to-remain-unsolved-problem.html?hp';

request({uri: testUrl, jar: false}, function(error, response, body) {
    if (!error && response.statusCode == 200) {
        console.log(body);    
    }
})

If I run the code above I get this trace:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.EventEmitter.once (events.js:189:8)
at Request.init (C:\Users\Test\Documents\node_modules\request\main.js:334:8)
at ClientRequest.<anonymous> (C:\Users\Test\Documents\node_modules\request\main.js:589:12)
at ClientRequest.g (events.js:185:14)
at ClientRequest.EventEmitter.emit (events.js:88:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:402:27)
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.start (C:\Users\Test\Documents\node_modules\request\main.js:687:8)
at Request.end (C:\Users\Test\Documents\node_modules\request\main.js:927:28)
at Request.init (C:\Users\Test\Documents\node_modules\request\main.js:381:12)
at process.startup.processNextTick.process._tickCallback (node.js:244:9)

If I take out jar:false or use a custom jar it works fine!
I'm using node v0.8.6 and request v2.11.4

j0ni commented

I'm seeing this on Heroku too, but not locally. The trace is slightly different.

2012-10-10T04:54:42+00:00 app[runner.1]: (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
2012-10-10T04:54:42+00:00 app[runner.1]: Trace
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.EventEmitter.addListener (events.js:176:15)
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.EventEmitter.once (events.js:197:8)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.<anonymous> (/app/node_modules/request/main.js:518:27)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.g (events.js:193:14)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.EventEmitter.emit (events.js:93:17)
2012-10-10T04:54:42+00:00 app[runner.1]:     at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1461:7)
2012-10-10T04:54:42+00:00 app[runner.1]:     at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.socketOnData [as ondata] (http.js:1366:20)
2012-10-10T04:54:42+00:00 app[runner.1]:     at TCP.onread (net.js:403:27)

This error is pretty unhelpful :(

This is using node v0.8.11 and request v2.11.4.

this is strange.

that line adds an error listener to the connection object if there isn't one already added, which means we aren't adding a listener multiple times.

on this line:

https://github.com/mikeal/request/blob/master/main.js#L688

we remove that listener, but we check for it on the req instead of the response, which should be the same object but it's a strange inconsistency.

we don't add the listener until the response callback, which means when there are many request instances we shouldn't be stacking up error listeners on pending connection objects.

my guess is that the leak is actually somewhere else: many error listeners are getting attached to the connection object, which is pooled and only gets destroyed when there are no pending requests, and this just happens to be the next line to add a listener. basically, we need to find out what the other 10 listeners are.

@donkeir your issue is different. yours has to do with many, many redirects and the fact that we don't cleanup the pipe listener.

Yep, thanks Mikeal. I'd had a quick look but couldn't see any obvious cause. The multiple redirects / jar issue isn't a big problem for me as I'm using a custom jar which stops it happening.

j0ni commented

Thanks @mikeal - so I have managed to make this go away, though I don't understand why.

A URL which was causing this to happen on Heroku is:

http://www.weather.gov/forecasts/xml/sample_products/browser_interface/ndfdBrowserClientByDay.php?lat=33.752886&lon=-116.055617&format=24+hourly&numDays=4&startDate=2012-10-10T14:12:36.205Z

This actually returns a permanent redirect, and when I changed the URL to match the new form the errors went away. No redirects, no issue I guess.

j0ni commented

BTW @mikeal if you're still interested in getting to the bottom of this, can you tell me how I inspect the collection of listeners attached to the connection object from outside the module? I can't instrument the module itself on Heroku I don't think (which is where this is happening).

i would definitely like to know. you can get an array of listeners by doing connection.listeners('error')
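A minimal sketch of that kind of inspection from calling code, assuming the response object and its connection are available in the callback (the URL is just a placeholder):

var request = require('request');

// Log how many 'error' listeners are attached to the pooled connection when
// the response arrives, and dump a snippet of each so you can see what they are.
request('http://example.com/', function (error, response, body) {
  if (response && response.connection) {
    var errorListeners = response.connection.listeners('error');
    console.log('error listeners on connection:', errorListeners.length);
    errorListeners.forEach(function (fn) {
      console.log(fn.toString().slice(0, 80));
    });
  }
});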

@mikeal and the rest, I think I got this solved.
First, you can reproduce this easily against http://news.ycombinator.com

Using Wireshark I could see the actual Host header is sent as lowercase host, which causes the web server on the other end to answer with a 301.

Once I set it to 'Host', everything works OK.

I'll offer a pull request later on.

i'm not changing this default because one server is broken, and it is broken.

the node convention has been lowercase headers; when people check headers they assume they are lowercase (not the best practice, but it has to be acknowledged)

the real fix for this is to do the header check caseless, and not to add a new Host header if one is already in uppercase, so if your server is broken you can set this header yourself and we'll use it.

also, is this redirect loop really never-ending? why isn't the redirect threshold getting hit?

Looking at the RFC syntax, I assumed that in the specific case of Host it overrides the general notion that "header field names are case-insensitive", but if you're saying it's not so, then I understand.

At least this sheds some more light on the problem.

The loop is a never-ending one, yes (host: ... => redirect 301 => host: ... => redirect 301, and so on). The threshold does get hit and stops it. Also, because this is a never-ending loop, it makes no sense to increase the threshold.

Thanks

I came to the same conclusion as @jondot in #346.

I'd say it's worthwhile being "nice" and either:

  1. always send a capitalized Host header, or
  2. allow developers to set a capitalized Host header like so: request({ headers: { Host: 'www.example.com' } }) - right now this results in 2 Host header entries like so: headers: { host: 'www.example.com', Host: 'www.example.com' }

The problem with request is that it always sends a lowercase host header even if you explicitly set a capitalized Host header. That makes it unusable in cases of "bad" servers like news.ycombinator.com.

@chuyeow yes, i'd like a patch that checks if the host header is already sent as an option and maintains the casing. i don't want to change the default to "Host" though because when other people check for that header in node they tend to do it in lowercase.

rodw commented

@mikeal, I can put together a pull request for this, but before I do I wanted to confirm what you're looking for.

It looks like there is already logic in place to add a host header if and only if there isn't one already. That is, line 184 of main.js (version 2.12.0) reads:

if (!self.headers.host) { 

(and is followed by a block that computes and adds the appropriate host header.)

Is it sufficient to add a check for Host as well as host here, i.e.,

if (!self.headers.host && !self.headers.Host) { 

or are you looking for something more robust/flexible/complicated?

Looking at the source a little more closely, I note that you're already doing something similar (checking for both down-case and Title-Case headers) in a few other places (e.g., with Content-Length at lines 293 and 513, Content-Type at lines 340 and 350).

Frankly, this is really working around a bug (or at least an uncommonly inflexible interpretation of the HTTP spec, since section 4.2 of RFC 2616 seems to say that header field names are case-insensitive) on the server side, and while there may be other such servers, news.ycombinator.com is the only example I know of in the wild. But this is a relatively unobtrusive work-around (and news.yc is a relatively popular site: Alexa puts it among the top 2000 in the US and the top 3000 world-wide).

(A consistent but substantially more complicated approach might be for node-request to treat all headers as case-insensitive and preserve the case of any headers set by the caller. But given that the header names are used as keys in a map, that would be a substantial change. You'd want something semantically equivalent to headers[/host/i] (rather than headers['host']). It's certainly doable but would require a lot more change to the existing code.)
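As a rough sketch of that caseless lookup (the getHeader helper here is purely illustrative, not part of request's API):

// Illustrative helper: look up a header by name regardless of how the
// caller capitalized it.
function getHeader(headers, name) {
  var lower = name.toLowerCase();
  for (var key in headers) {
    if (headers.hasOwnProperty(key) && key.toLowerCase() === lower) {
      return headers[key];
    }
  }
  return undefined;
}

// With something like this, the host check could become
//   if (getHeader(self.headers, 'host') === undefined) { /* add it */ }
// so a caller-supplied Host: 'news.ycombinator.com' would be reused instead
// of a second lowercase host header being added.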

rodw commented

Also, independent of the Host vs. host issue, has this actually uncovered a minor listener leak? The node.js error reports that we've added more than 10 listeners to the same EventEmitter instance, in this case (I think) the HTTP response. I suspect one listener is being added each time we follow a redirect. Should that be happening? Should some existing listener be removed when the request is being re-initialized around lines 554-596?

Keep getting the error.

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (/Users/boris/projects/clutchretail/bta/pricetracker/node_modules/request/index.js:376:8)
at Request.onResponse (/Users/boris/projects/clutchretail/bta/pricetracker/node_modules/request/index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)

This is the latest npm version.

Also getting this error. Stacktrace:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (/Users/deus/code/twitter-contentify/node_modules/request/index.js:376:8)
at Request.onResponse (/Users/deus/code/twitter-contentify/node_modules/request/index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1527:7)
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
at Socket.socketOnData [as ondata] (http.js:1430:20)
at TCP.onread (net.js:404:27)

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:376:8)
at Request.onResponse (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at EventEmitter.addListener (events.js:160:15)
    at EventEmitter.Parser.setup (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/html5/lib/html5/parser.js:2459:17)
    at EventEmitter.Parser.parse_fragment (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/html5/lib/html5/parser.js:2399:8)
    at HtmlToDom.appendHtmlToElement (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/browser/htmltodom.js:95:13)
    at Object.innerHTML (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/browser/index.js:460:17)
    at /javascripts/jquery.min.js:4:6719
    at at (/javascripts/jquery.min.js:4:5136)
    at st.setDocument (/javascripts/jquery.min.js:4:6686)
    at at (/javascripts/jquery.min.js:4:20427)
    at /javascripts/jquery.min.js:4:20587
    at /javascripts/jquery.min.js:5:28399
    at Contextify.sandbox.run (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/node_modules/contextify/lib/contextify.js:12:24)
    at window._evaluate (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/windows.js:271:25)
    at Object.HTML.languageProcessors.javascript (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/scripts.js:20:21)
    at Object.define.proto._eval (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:1295:47)
    at Object.loaded (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/jsdom_patches.js:134:27)
    at /Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:51:20
    at Object.item.check (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:280:11)
    at /Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:298:12
    at /Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/resources.js:147:16
    at Request._callback (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/resources.js:335:16)
    at Request.self.callback (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/request/main.js:120:22)
    at Request.EventEmitter.emit (events.js:98:17)

BUMP

Does anyone know how to fix this problem?
I've been dealing with this for a few days, and the only solution I found was to search for and replace every
request."whatever" or request(url)
with
request({ uri: url, headers: { "User-Agent": "Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410."}}, func...

I really hope it helps. Do a find-in-folder with Sublime or any other editor and it's quite easy; you can modify the options as you want. It works like a charm.

rodw commented

@lbertenasco: The "too many listeners" error message is a bit of a red herring. The underlying issue is that you are stuck in a redirect loop.

That is, the server doesn't like something about your request, and is responding with a redirect message (generally an HTTP 301 or 302 status). Your client resubmits the request, but didn't change the thing that the server didn't like, and so receives another redirect response. After 10 cycles through that loop, you hit the "too many listeners" error.

One could bump up the number of listeners that are allowed (as suggested in the error message), but that will only delay the inevitable. After N cycles through the loop, you will hit the "too many listeners" error, for any value of N.

One could (and I'll assert, should) change the code of the request module so that it no longer adds a new listener every time a redirect is followed, but you'll just be replacing one error condition with another. You'll either hit some other natural limit (e.g., the depth of the calling stack) or your code will "hang" in an endless loop (just like 10 GOTO 10 would in Basic).

Ultimately what you need to do is address the redirect loop.

In your case, it looks like the server you are hitting is sending a 301 or 302 response because it doesn't recognize the web browser (user-agent) you're using. Explicitly setting the user-agent header to look like Google Chrome fixes this problem.

Without seeing the actual HTTP response, it is hard to tell if this is an error on the server-side or the client-side.

For instance, if the server is responding with a 302 message that redirects the client to the same URL, but expects to receive a user-agent string the second time, this is a mistake on the server's part.

Alternatively, if the server is responding with a 302 message that redirects the client to some sort of "your browser is not supported" page, but the client (the request module) is resubmitting the request to the original URL, then the bug is on the client's side.

Similarly, in the HackerNews example cited above, the server is looking for a Host header but request is only sending a host header. This causes a redirect loop unless steps are taken to manually add a Host, rather than host, header. (In this case, I believe the news.ycombinator.com server is in error--it should treat header names as case-insensitive--but I presume it is easier to add a little work-around on the client side than to convince Paul Graham to update the news.yc server.)

TL/DR: The request library could handle this a bit better, but at the end of the day you'll need to figure out why you've encountered a redirect loop, and find a way to work around it.
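A small diagnostic sketch along those lines (the URL is a placeholder): disable automatic redirect following so you can see exactly where the server is trying to send you, then decide whether a header tweak or a different URL breaks the loop.

var request = require('request');

// followRedirect: false makes request hand back the 3xx response itself
// instead of chasing it, so the Location header can be inspected.
request({ uri: 'http://example.com/some-page', followRedirect: false },
  function (error, response, body) {
    if (error) return console.error(error);
    if (response.statusCode >= 300 && response.statusCode < 400) {
      console.log('Server redirects to:', response.headers.location);
    } else {
      console.log('No redirect, status', response.statusCode);
    }
  });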

+1

We are seeing this same error - but strangely only in 1 of our 4 environments. And not all the time.
We are seeing some as-yet-unexplained performance problems in this environment - not sure yet which is cause and symptom...

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:175:15)
    at Socket.EventEmitter.once (events.js:196:8)
    at ClientRequest.<anonymous> (/pathtoourapp/2.2.58.2/server/node_modules/request/main.js:561:27)
    at ClientRequest.g (events.js:192:14)
    at ClientRequest.EventEmitter.emit (events.js:96:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1582:7)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
    at Socket.socketOnData [as ondata] (http.js:1485:20)
    at TCP.onread (net.js:404:27)

Looking a bit closer, we don't appear to have any problem with retry-loops.
What I do see is that the number of listeners for a given connection can get into the hundreds - all handling different (get) requests. Our system does get a bit busy making http requests - the queued requests can get into the 70s - but behaviour is not what I was expecting...

I don't know how request works internally, but I'm queuing up ~180 requests for a screen scraper and I'm getting this error. It looks like a lot of error event listeners are building up in request.js on line 625. I added console.log(response.connection.listeners('error').length); before this line and the number built up quickly, fluctuated a bit and peaked at 40.

Here's how I'm calling request (inside a loop):

request({
    url: url,
    timeout: 60000,
    jar: jar,
    headers: {
        'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_3) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410.65 Safari/537.31'
    }
}, function (error, response, body) {
});
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:689:33)
    at Socket.EventEmitter.once (events.js:179:8)
    at Request.onResponse (/Users/Roland.Warmerdam/projects/scraper/node_modules/request/request.js:626:25)
    at ClientRequest.g (events.js:175:14)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:525:27)

πŸ‘ to @mikeal 's earlier comment.

  • assume lowercase use of host by default
  • if user code uses capitalization Host:
    • maintain capitalization when making requests
    • when capitalized, agnostic code should still be able to ask for host and get the value for Host

Do I have that right?


Edit: Rephrased for clarity.

@Zearin that should be how things work now. a few months back a lot of caseless functions went in to handle that.

This memory leak crap should be fixed in node 0.12 with the http agent changes.

Hi, I simply loop through 20-30 URLs to fetch data and get this error. I tried to set process.setMaxListeners(0). I also tried creating the request object inside my loop. How do I get the right event emitter to increase the maximum (in my case maybe 30 requests; since node is async, listeners are added before the requests are actually made, whereas if it were sync it would request each URL one after the other)? I double-checked: as long as my loop has only 10 URLs, the "request" module is happy and no errors are shown. But with my use case I can't use it like that.

As far as I read the documentation, the only thing you need to do is call "setMaxListeners" on your EventEmitter. Alternatively, request could let users specify it, like
request(url, callback, 100) // if we loop through 100 urls ... OR provide such a function:
request.getEventEmitter().setMaxListeners(100)
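For what it's worth, the object returned by request() is itself an EventEmitter, so its listener limit can be raised directly. A sketch (placeholder URL), with the caveat that this only hides the warning for listeners added on that Request instance and does nothing about the underlying redirect or pooling problem:

var request = require('request');

// Raise the limit on the returned Request instance. Warnings that originate
// on the pooled Socket are unaffected, and the real cause remains.
var req = request('http://example.com/', function (error, response, body) {
  // handle the response here
});
req.setMaxListeners(30);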

This actually seems to cause a memory leak. My script ran out of memory after 5 hours of runtime.

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:689:33)
    at Socket.EventEmitter.once (events.js:185:8)
    at Request.onResponse (D:\Dropbox\projects\domains2\node_modules\request\request.js:713:25)
    at ClientRequest.g (events.js:180:16)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:527:27)

Also found this thread in the net module: nodejs/node-v0.x-archive#5108

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:689:33)
    at Socket.EventEmitter.once (events.js:185:8)
    at Request.onResponse (../request/request.js:713:25)
    at ClientRequest.g (events.js:180:16)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:527:27)

I solved this issue by changing maxRedirects to 3 instead of 10.

Maybe if the default was 8 this error would not be thrown?

I am also getting this error on nytimes.com and it appears to be only 3 or 4 redirects.

This bug only shows up today, as I understand it, when too many HTTP clients are open in the pool; the pool is managed by node core, and this is a core bug (fixed in 0.11).

same here

(node) warning: possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at WriteStream.addListener (events.js:179:15)
    at WriteStream.Readable.on (_stream_readable.js:671:33)
    at Request.Stream.pipe (stream.js:65:8)
    at Request.pipe (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\request\request.js:1329:34)
    at Fio.post (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\api.js:9:12661)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\folders\folders-stub.js:9:759)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:51:10)
    at Object.<anonymous> (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:41:5)
    at Object.Conduit._targetStep.fn (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\postal\node_modules\conduitjs\lib\conduit.js:40:43)
    at Object.next (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\postal\node_modules\conduitjs\lib\conduit.js:70:33)
(node) warning: possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at WriteStream.addListener (events.js:179:15)
    at WriteStream.Readable.on (_stream_readable.js:671:33)
    at Request.Stream.pipe (stream.js:99:8)
    at Request.pipe (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\request\request.js:1329:34)
    at Fio.post (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\api.js:9:12661)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\folders\folders-stub.js:9:759)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:51:10)
    at Object.<anonymous> (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:41:5)
    at Object.Conduit._targetStep.fn (C:\Users\abhilash\Desktop\CHARLES\gerrit_c

+1
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace:
at Request.EventEmitter.addListener (events.js:160:15)
at Request.init (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/request.js:667:8)
at Redirect.onResponse (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/lib/redirect.js:149:11)
at Request.onRequestResponse (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/request.js:1108:22)
at ClientRequest.EventEmitter.emit (events.js:95:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:527:27)

Same issue:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.addListener (events.js:160:15)
at Request.init (/var/www/it/src/node_modules/request/request.js:667:8)
at Redirect.onResponse (/var/www/it/src/node_modules/request/lib/redirect.js:149:11)
at Request.onRequestResponse (/var/www/it/src/node_modules/request/request.js:1108:22)
at ClientRequest.emit (events.js:95:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:527:27)

πŸ‘ and I'm using v0.12.2

Having the same issue.

I'm getting this issue also when using request with .pipe()

happy to increase the number of listeners, but can I do that if I'm using pipes?

@mikeal, 2 yrs ago you mentioned "we don't cleanup the pipe listener"; do you have an example of how I can clean up the pipe listener manually?

I'm piping the return to a number of data cleansers and tasks but as soon as I hit 11 request.get() calls my script falls over.

[someArray of urls]

for (var i = 0; i < someArray.length; i++) {
  request.get(someArray[i])
    .pipe(someCleansingFunction)
    .pipe(someSavingFunction)
    // how do I clean up the pipe listener now that I've finished cleansing & saving?
}

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at PoolConnection.EventEmitter.addListener (events.js:160:15)
at /home/jeferson/workspacejs/syncserver/src/prod/dbconnection.js:32:20
at Ping.onOperationComplete [as _callback]
at Ping.Sequence.end (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/sequences/Sequence.js:96:24)
at Ping.Sequence.OkPacket (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/sequences/Sequence.js:105:8)
at Protocol._parsePacket (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/Protocol.js:274:23)
at Parser.write (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/Parser.js:77:12)
at Protocol.write (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/Protocol.js:39:16)
at Socket.<anonymous> (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/Connection.js:96:28)
at Socket.EventEmitter.emit (events.js:95:17)

+1, I have the same problem with pipe...

+1, happens often with nytimes.com links for me. Is there a workaround for this bug?

Aug 31 11:14:41 x app/web.1: (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Aug 31 11:14:41 x app/web.1: Trace
Aug 31 11:14:41 x app/web.1: at Request.addListener (events.js:160:15)
Aug 31 11:14:41 x app/web.1: at Request.start (/app/node_modules/request/request.js:947:8)
Aug 31 11:14:41 x app/web.1: at Request.end (/app/node_modules/request/request.js:1732:10)
Aug 31 11:14:41 x app/web.1: at end (/app/node_modules/request/request.js:704:14)
Aug 31 11:14:41 x app/web.1: at Object._onImmediate (/app/node_modules/request/request.js:718:7)
Aug 31 11:14:41 x app/web.1: at processImmediate [as _immediateCallback]
Aug 31 11:14:41 x app/web.1: [Error: Exceeded maxRedirects. Probably stuck in a redirect loop http://www.nytimes.com/2015/06/28/magazine/confessions-of-a-seduction-addict.html?_r=4]

Hi @mikeal,

This still happens with just basic GET requests, no pipes, no magic fairy dust, etc.

Regardless of what server we talk to, and whatever issues that server has, it should not cause a problem in our client code. If we connect to a server and it streams back random binary with nothing of the HTTP protocol at all, I would expect the err argument to be populated. We should not be kicking off error messages and warnings. Also, ignoring the leak with maxListeners(0) is not a solution; it's pretending the problem doesn't exist while our servers slowly crash.

What about this... what if we just set maxListeners(maxRedirects + 1)? It seems as though everyone reporting this issue thinks it's a redirect-related problem. And if I set maxRedirects to < 10, it never happens; if it's left at the default of 10, it always happens. Just a thought.

+1 @travischoma same here. @ahildoer agree. We really should be able to find a way to talk to almost every server out there.

Not sure if it's the same issue that everyone else is facing here, but this state is reproducible (at least on Node 0.12.6) by:

  1. Setting http.globalAgent.maxSockets = 1 (or a similarly low #)
  2. Making lots of requests to the same server

What I see is 1 socket connection (or however many you set in #1), and then as requests get queued, this exception eventually gets triggered.

My case is somewhat artificial since I'm setting maxSockets to be low. But I believe that pre node 0.12 the default pool size was 5, so in theory this error could occur whenever there are significantly more requests than available sockets?
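A sketch of that repro, with a placeholder URL:

var http = require('http');
var request = require('request');

// Step 1: shrink the pool to a single socket.
http.globalAgent.maxSockets = 1;

// Step 2: fire many requests at the same server; they queue up behind the
// one available socket, and (per the comment above) the warning eventually
// gets triggered.
for (var i = 0; i < 20; i++) {
  request('http://example.com/', function (error, response, body) {
    // responses trickle in one at a time over the single socket
  });
}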

I've overcome this by setting maxRedirects: 5, which raises an error correctly and doesn't crash and burn.

I have this same issue when I am trying to implement SSE with nodejs.

(node) warning: possible EventEmitter memory leak detected. 11 notify listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at EventEmitter.addListener (events.js:179:15)

removeAllListeners('data') looks like it fixed the issue for me.
Be careful, I am not sure it has no side effects.

// Note: `request` is the request module and `iconv` is assumed to be a
// character-encoding transform stream created elsewhere in the file.
function getPage(pageURL) {
  return new Promise(function(resolve, reject) {
    let chunks = []
    let reqPipe = request({
      uri: pageURL,
      method: 'GET'
    }).pipe(iconv)

    reqPipe.on('data', function(chunk) {
      chunks.push(chunk)
    }).once('end', function() {
      var str = Buffer.concat(chunks).toString()
      reqPipe.removeAllListeners('data')
      resolve(str)
    })
  })
}

I am doing an npm install with a local repository and getting a lot of these warnings. What is the solution?

npm install -g git+http://stash/scm/ss/generator-lambda.git

(node) warning: possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at TLSSocket.addListener (events.js:239:17)
    at TLSSocket.Readable.on (_stream_readable.js:680:33)
    at Request.<anonymous> (C:\Users\rmclaughlin\AppData\Roaming\npm\node_modules\npm\node_modules\npm-registry-client\lib\request.js:153:7)
    at emitOne (events.js:77:13)
    at Request.emit (events.js:169:7)
    at ClientRequest.<anonymous> (C:\Users\rmclaughlin\AppData\Roaming\npm\node_modules\npm\node_modules\request\request.js:791:10)
    at emitOne (events.js:82:20)
    at ClientRequest.emit (events.js:169:7)
    at tickOnSocket (_http_client.js:523:7)
    at onSocketNT (_http_client.js:535:5)

I am using npm ver 3.10.7

I'm having this same problem with nytimes.com. Was anyone able to fix this?

@fishcharlie Same issue here with NYT. Lowering the number of redirects (like @Psopho and @chmac suggest) seems like a viable workaround.

Adding the {maxRedirects: 5} option to the function call prevents the memory leak issue and instead populates the err argument in the callback function.
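A minimal sketch of that workaround (placeholder URL): capping maxRedirects turns the endless loop into an ordinary error in the callback instead of a listener warning.

var request = require('request');

request({ uri: 'http://example.com/looping-page', maxRedirects: 5 },
  function (error, response, body) {
    if (error) {
      // e.g. "Exceeded maxRedirects. Probably stuck in a redirect loop ..."
      console.error(error.message);
      return;
    }
    console.log(response.statusCode);
  });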

Didn't know this was still a relevant issue in 2017. I made a library 4 years ago to try to get this working with request. You can try it and see if it helps.

Didn't know this was still a relevant issue in 2017

It's 2018 😋