MorrisLLC/grunt-war

Out of memory error - FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory Abort trap: 6

Closed this issue · 25 comments

I get the following error when I invoke 'grunt war':

FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory
Abort trap: 6

It seems to occur just after these lines:

adding node_modules/grunt-contrib-imagemin/node_modules/imagemin/node_modules/imagemin-optipng/node_modules/optipng-bin/node_modules/bin-wrapper/node_modules/download/node_modules/decompress/node_modules/adm-zip/test/assets/attributes_test/New folder/hidden.txt
adding node_modules/grunt-contrib-imagemin/node_modules/imagemin/node_modules/imagemin-optipng/node_modules/optipng-bin/node_modules/bin-wrapper/node_modules/download/node_modules/decompress/node_modules/adm-zip/test/assets/attributes_test/New folder/hidden_readonly.txt
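The paths in that log show node_modules being pulled into the WAR, which on its own can be enormous. As a hedged sketch (folder names here are placeholders, not this reporter's actual config), grunt's negative glob patterns can keep it out of the archive entirely:

```javascript
// Illustrative only -- folder names are placeholders for your project.
war: {
    target: {
        options: {
            war_dist_folder: 'dist',
            war_verbose: true,
            war_name: 'ROOT'
        },
        files: [
            {
                expand: true,
                cwd: 'dist',
                // The '!' glob excludes node_modules (and its deeply
                // nested test fixtures) from the files to be warred up.
                src: ['**', '!node_modules/**'],
                dest: '.'
            }
        ]
    }
}
```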

I'm running on Mavericks with
node: v0.10.29
grunt-cli v0.1.13
grunt v0.4.5

regards,
Mo

I'll investigate this issue over the weekend. I'm also testing with Mavericks.

Hi Robert,
Thanks for the update
Regards
Mo


Any success with this? I'm getting a similar issue and not sure what to do about it.

It always seems to hang at the point of:

adding node_modules/grunt-war/node_modules/node-zip/node_modules/jszip/node_modules/zlibjs/test/util.js

And I get an error at the end:

FATAL ERROR: JS Allocation failed - process out of memory
Abort trap: 6

I'll try to put in an option to build the WAR on disk instead of in memory tonight. My guess is that the WAR being built must be very large. It might be 9pm CST or a little bit later before you see an update.

I just pushed version 0.3.3 to NPM. I got rid of a dependency so that I have more fine-grained control over the generation of the WAR. This allowed me to bump the version of another project I needed for compression. I'm hopeful but skeptical that this will resolve your issue. Let me know. I'm going to push another version that I'm sure will resolve this problem, but it requires more work than I could wrap up last night.

You should also consider using node streams to avoid reading the files into memory. wearefractal/vinyl-fs would probably be very helpful in a stream implementation.

The issue still exists in version 0.3.3. Grunt memory usage goes up from around 150 MiB to 1.5 GiB in a few seconds after the war task has started.

After exclusion of a large library (MathJax) the task now finishes but still needs a lot of memory (~1 GiB).

I finally have some options on how to fix this. Stay tuned..

Hi,
I would also really appreciate a solution. I also have this problem.

Working on it now.. hope to release later this afternoon.

@rllola: Thanks, nice post!

I'm having the same issue as well.
I integrated this plugin 2 days ago, so I see there is no solution for it yet?

Did you configure your Gruntfile correctly? Because it is not really a bug: if you get this error, it is because you didn't point the task at the folder with your build.

war: {
      target: {
          options: {
              war_dist_folder: '<%= yeoman.jenkins %>',
              war_verbose: true,
              war_name: 'ROOT'
          },
          files: [
              {
                  expand: true,
                  cwd: '<%= yeoman.jenkins %>/dist',
                  src: ['**'],
                  dest: ''
              }
          ]
      }
}
I have to say it did work well with this configuration until I decided I needed another folder in the WAR archive, which apparently made it bigger in memory.

Try this, and be sure you have built your project correctly first:

war: {
      target: {
          options: {
              war_dist_folder: '<%= yeoman.jenkins %>/dist',
              war_verbose: true,
              war_name: 'ROOT'
          },
          files: [
              {
                  expand: true,
                  cwd: '<%= yeoman.jenkins %>/dist',
                  src: ['**'],
                  dest: '.'
              }
          ]
      }
}

Nope, same error:
FATAL ERROR: JS Allocation failed - process out of memory
Aborted (core dumped)
Weird, it's only 10 MB altogether, not something big enough to run out of memory.

Wait, I missed the '.' thing

Still same error :(

FATAL ERROR: JS Allocation failed - process out of memory
Aborted (core dumped)

is different from:

Out of memory error - FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory Abort trap

That is not the error we are talking about in this issue; for this one I don't know. You should open a new issue with your error if one doesn't already exist.

OK thanks anyway

What is the total file size of all the files you want to war up?

I'm working on a version that will allow multi-gigabyte WAR files, but it's trickier than I originally imagined.


It's OK, I managed to understand what went wrong.
I accidentally copied all the files twice; each file was copied along with its revisioned version, and apparently that built up too much memory.
Working well now, thanks.


Finally.. I just pushed version 0.4.0, which can handle a huge number of source files in a project. This should fix out-of-memory errors for very large projects. This version streams writes to disk incrementally instead of trying to build the entire WAR in memory.

I've been able to war up massive projects with the changes I put in to address this problem.