Issue using with HapiJS 16.1.0
rjbez17 opened this issue · 13 comments
When using HapiJS 16.1.0 there is an issue parsing the form. All documentation says to pass the request into the parser. However, I've found that Multiparty really expects the headers to be at request.payload.headers instead of request.headers.
Doing the following:
```js
form.parse(request, function (err, fields, files) {
  console.log(err);
  console.log(fields);
  console.log(files);
  reply();
});
```
Throws an error here: https://github.com/pillarjs/multiparty/blob/master/index.js#L145
While doing this:
```js
form.parse(request.payload, function (err, fields, files) {
  console.log(err);
  console.log(fields);
  console.log(files);
  reply();
});
```
Errors here: https://github.com/pillarjs/multiparty/blob/master/index.js#L160
I am able to get this working only by copying the headers onto the payload and passing that in:

```js
var obj = request.payload;
obj.headers = request.headers;
var form = new multiparty.Form();
form.parse(obj, function (err, fields, files) {
  console.log(err);
  console.log(fields);
  console.log(files);
  reply();
});
```
The code should either be updated to support passing in the request, or the documentation should be updated to reflect what needs to be passed into the parser. I'm happy to submit a PR for either, but need to know which is preferred.
I'm not sure what Hapi gives you, but the example and the docs both say you need to pass in the Node.js request, as in the Node.js core IncomingMessage object.
So we document that you need to pass in the "Node.js request", and the HapiJS API docs say you can get the "Node.js request" as request.raw.req.
I've tried this as well and it doesn't work:
```
{ Error: stream ended unexpectedly
    at Form.<anonymous> (/web/node_modules/multiparty/index.js:754:24)
    at emitNone (events.js:91:20)
    at Form.emit (events.js:185:7)
    at finishMaybe (_stream_writable.js:515:14)
    at endWritable (_stream_writable.js:525:3)
    at Form.Writable.end (_stream_writable.js:490:5)
    at onend (_stream_readable.js:511:10)
    at _combinedTickCallback (internal/process/next_tick.js:67:7)
    at process._tickDomainCallback (internal/process/next_tick.js:122:9) status: 400, statusCode: 400 }
```
Hm, weird. I've never used HapiJS, so if you have any pointers or information, that would be awesome
An off-hand guess would be that something else already read all the data off the stream before you handed the request stream to multiparty; checking for that may help diagnose the issue.
From reading https://hapijs.com/api, HapiJS has multipart form parsing built in; perhaps that would be a better path to go down than trying to figure out how to tie in this module?
It's configured not to touch it (https://hapijs.com/api#route-options, parse = false). I am able to get it working if I add the headers to the payload and pass that in. I suppose I'll just use that workaround for now.
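For reference, the workaround described above looks roughly like this as a full route. This is a sketch, not a tested integration: it assumes Hapi 16 and multiparty, and the route path is made up.

```js
// Hypothetical Hapi 16 route: payload parsing disabled, then the headers are
// copied onto the payload object before handing it to multiparty (the
// workaround from this thread). Sketch only -- adjust to your route setup.
const multiparty = require('multiparty');

const uploadRoute = {
  method: 'POST',
  path: '/upload', // made-up path
  config: {
    payload: { parse: false, output: 'stream' },
    handler: function (request, reply) {
      var obj = request.payload;
      obj.headers = request.headers; // multiparty reads headers off the object it parses
      new multiparty.Form().parse(obj, function (err, fields, files) {
        if (err) return reply(err);
        reply({ fields: fields, files: files });
      });
    }
  }
};
```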
From my understanding of the docs, even with parse: false, Hapi will still read the data from the raw request stream, which is what this module needs. Hapi streams it into the payload stream object, so by the time this module tries to read from the raw stream, part or all of the data has already been moved into the request.payload object and off the request.raw.req object, which seems to be what is causing the issue here.
For reference to the OP: I ran into the exact same problem. It can be circumvented, but it's not nice at all.
As part of Hapi's request lifecycle it takes the raw payload and puts it into an object. Even setting
parse: false, output: stream, multipart: false
on the route configuration isn't enough to stop Hapi interfering with the payload. The way I did it was to add a request extension using the extension point onPostAuth.
If you have any prerequisite route handlers assigned (as I did), you have to hack these and move them up the chain so they execute before onPostAuth (well, before my request handler, which now executes at that point).
I should also add that I was using the busboy parser but will be moving to multiparty, because busboy would crash the entire process tree, with no error, simply on being sent a form (and not a file).
Set the onPostAuth extension handler for the required route and use the raw request object, request.raw.req. Hapi won't have touched it by this point; that is the native Node object.
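A sketch of that onPostAuth approach, assuming Hapi 16 and multiparty; the route path and the request.app key are made up, and this is not a tested integration.

```js
// At the onPostAuth extension point, Hapi has not yet consumed the raw
// stream, so request.raw.req is still a readable IncomingMessage and can be
// handed to multiparty directly. Sketch only.
const multiparty = require('multiparty');

server.ext('onPostAuth', function (request, reply) {
  // Only intercept the upload route (hypothetical path).
  if (request.path !== '/upload' || request.method !== 'post') {
    return reply.continue();
  }
  new multiparty.Form().parse(request.raw.req, function (err, fields, files) {
    if (err) return reply(err);
    // Stash the parsed result for the route handler further down the lifecycle.
    request.app.upload = { fields: fields, files: files };
    reply.continue();
  });
});
```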
If you don't do this and upload a massive file, Hapi streams the whole thing into memory before passing it to the configured route handler, despite the docs EXPLICITLY stating that doesn't occur. I will create an issue/suggestion on the Hapi issue log tonight after I've finished!
So, in summary, are you confirming that the bug preventing the integration is in this module, or in Hapi?
Not exactly. The expected result of the following configuration for a particular route:
parse: false, output: stream, multipart: false
Would be to disable parsing of the payload so that the handler receives a raw stream of that data as it is being uploaded. This does not occur: the entire payload is read into memory, and only then is it passed to the route handler as a raw stream.
In order to do streaming uploads, one needs to place a handler at the onPostAuth extension point in order for the raw stream to be processed as it is being uploaded.
My point is that the documentation is not at all clear on this, as it explicitly states it will store everything in memory if you DON'T set output to stream and parse to false. This clearly isn't true.
Additionally, integration of this module is fine. It shouldn't be needed, though; I use it in the onPostAuth extension point handler in order to get streaming uploads working properly without consuming all the available memory on the machine!
Hope this clarifies it for you.
Thanks for the clarification, @spuddleziz. Any thoughts on what next steps the multiparty module should take?
Closing since I never heard back.