elixir-waffle/waffle_ecto

Handling large file uploads - stream from file instead of binary

Opened this issue · 3 comments

I need to allow users to upload roughly 150 MB files to S3. It doesn't happen often, but since waffle_ecto uses binary upload, it loads the entire file into memory. I'd rather not pay for more memory when Waffle already supports streaming a file to S3 when the file comes from disk rather than a binary.

Is there a way I can configure waffle_ecto to stream to local disk first, then do a stream upload to S3? That way I don't have 150 MB loaded into memory for every concurrent upload. On Gigalixir I'd essentially have to pay $10 for each concurrent upload I want to support without crashing from running out of memory.
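For reference, this is roughly the kind of stream upload I mean — a minimal sketch assuming the ex_aws / ex_aws_s3 packages, with a hypothetical bucket name, key, and temp-file path:

```elixir
# Stream an on-disk file to S3 in chunks instead of loading it as one binary.
# Assumes {:ex_aws, "~> 2.0"} and {:ex_aws_s3, "~> 2.0"} are installed and
# configured with credentials; "my-bucket" and the paths are hypothetical.
"/tmp/upload-abc123"
|> ExAws.S3.Upload.stream_file()                           # lazily read the file in chunks
|> ExAws.S3.upload("my-bucket", "uploads/large-file.bin")  # S3 multipart upload
|> ExAws.request()
```

With this approach only one chunk at a time sits in memory per upload, which is what I'd like waffle_ecto to do instead of reading the whole file.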

Am I correct in thinking there are two upload portions? The user first uploads the file when they submit the multipart form to my Phoenix app. Then Waffle uploads it to S3 and, once it has a path, saves it into Ecto. My app is running out of memory near the end of the first (user upload) portion. I guess I just have to increase the server's memory; there's no way to avoid receiving all the bytes at once.
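For context, this is what I believe the controller sees at the end of the first portion — a sketch with a hypothetical "avatar" field name. As I understand it, Plug's multipart parser spools the body to a temp file, so the params carry a path rather than the file's bytes:

```elixir
# Sketch: Phoenix controller action receiving a multipart file field.
# The field name "avatar" is hypothetical; %Plug.Upload{} is what
# Plug.Parsers produces for multipart uploads.
def create(conn, %{"avatar" => %Plug.Upload{path: tmp_path, filename: name}}) do
  # tmp_path points at a temp file on disk, which is what I'd like to
  # hand off to a streaming S3 upload instead of a binary.
  send_resp(conn, 200, "received #{name} at #{tmp_path}")
end
```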