java.lang.OutOfMemoryError when using site plugin
dbarvitsky opened this issue · 3 comments
Hi! We are using the site-maven-plugin to share binaries with contractors. So far it has been working great. The configuration is as follows:
<plugin>
  <groupId>com.github.github</groupId>
  <artifactId>site-maven-plugin</artifactId>
  <version>0.12</version>
  <configuration>
    <message>Maven artifacts for ${project.groupId}:${project.artifactId} ${project.version}</message>
    <noJekyll>true</noJekyll>
    <outputDirectory>${build.temp.artifact.location}</outputDirectory>
    <branch>refs/heads/zoom</branch>
    <includes>
      <include>**/*</include>
    </includes>
    <excludes>
      <exclude>**/*-test.*</exclude>
    </excludes>
    <repositoryName>maven-mirror</repositoryName>
    <repositoryOwner>zoominfo</repositoryOwner>
    <dryRun>false</dryRun>
    <merge>true</merge>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>site</goal>
      </goals>
      <phase>deploy</phase>
    </execution>
  </executions>
</plugin>
This is a mid-size multi-module project. Recently, however, we started getting out-of-memory errors on certain projects:
[INFO] Creating 78 blobs
...
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOfRange(Arrays.java:2694)
at java.lang.String.<init>(String.java:203)
at java.lang.String.substring(String.java:1913)
at java.io.StringWriter.write(StringWriter.java:112)
at com.google.gson.stream.JsonWriter.string(JsonWriter.java:538)
at com.google.gson.stream.JsonWriter.value(JsonWriter.java:404)
at com.google.gson.internal.bind.TypeAdapters$13.write(TypeAdapters.java:353)
at com.google.gson.internal.bind.TypeAdapters$13.write(TypeAdapters.java:337)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.write(TypeAdapterRuntimeTypeWrapper.java:68)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.write(ReflectiveTypeAdapterFactory.java:89)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.write(ReflectiveTypeAdapterFactory.java:195)
at com.google.gson.Gson.toJson(Gson.java:586)
at com.google.gson.Gson.toJson(Gson.java:565)
at com.google.gson.Gson.toJson(Gson.java:520)
at com.google.gson.Gson.toJson(Gson.java:500)
at org.eclipse.egit.github.core.client.GitHubClient.toJson(GitHubClient.java:384)
at org.eclipse.egit.github.core.client.GitHubClient.sendParams(GitHubClient.java:611)
at org.eclipse.egit.github.core.client.GitHubClient.sendJson(GitHubClient.java:633)
at org.eclipse.egit.github.core.client.GitHubClient.post(GitHubClient.java:757)
at org.eclipse.egit.github.core.service.DataService.createBlob(DataService.java:115)
at com.github.maven.plugins.site.SiteMojo.createBlob(SiteMojo.java:289)
at com.github.maven.plugins.site.SiteMojo.execute(SiteMojo.java:356)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
The artifact it is trying to push consists of several consolidated JARs, about 200 MB each. We tried running it with the following Maven options, but apparently the blob is still too big:
export MAVEN_OPTS="-XX:MaxPermSize=512m -Xmx2048m"
After doing some debugging, it looks like the plugin is trying to load the whole commit into memory. I understand the content is being transcoded into something else and probably gets bigger in the process. I also understand that this is an egit problem: it pulls the entire commit into memory instead of streaming it.
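For what it's worth, the "gets bigger" part is easy to quantify: the GitHub Git Data API accepts binary blob content as a Base64-encoded JSON string, so each blob inflates by roughly 4/3 before Gson builds yet another in-memory copy of the request body. A quick sanity check of the inflation factor on a 1 MiB sample file (file path is a throwaway temp file, not anything from the build):

```shell
# Measure Base64 inflation: a 200 MB jar becomes a ~267 MB string
# before JSON serialization copies it again.
set -e
sample=$(mktemp)
head -c 1048576 /dev/urandom > "$sample"                      # 1 MiB of sample data
raw=$(wc -c < "$sample" | tr -d ' ')
enc=$(base64 < "$sample" | tr -d '\n' | wc -c | tr -d ' ')
echo "raw=$raw encoded=$enc"                                  # 1048576 -> 1398104, ratio ~1.33
rm -f "$sample"
```

So the 2 GB heap has to hold the raw bytes, the Base64 string, and the serialized JSON all at once, which explains why even generous `-Xmx` values aren't enough for 200 MB artifacts.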
Increasing it further does not help much; the OS itself starts running out of memory:
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000007be1e9000, 568422400, 0) failed; error='Cannot allocate memory' (errno=12)
After a quick look into the code I couldn't come up with a reasonable patch for this. I think the ideal solution would be to stream the contents rather than loading the whole thing into memory. I am not familiar with egit and not sure whether it has this capability... Alternatively, one could split large commits into separate files to make it work; however, that is a kludge, and it can potentially create inconsistent commits.
So for now we are going to switch off the plugin, cache the artifacts somewhere else, and push them to GitHub manually via regular git commands. If you need my help testing this, I'd be more than happy to assist.
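The manual workaround can be sketched with plain git. Everything below is a placeholder: a local bare repository stands in for the GitHub remote, and the artifact name is made up; in practice the remote would be the zoominfo/maven-mirror repository and the branch would be `zoom`, as in the plugin config above. Since git streams blobs from disk, this avoids the heap problem entirely:

```shell
# Commit artifacts to a mirror branch with plain git instead of the plugin.
set -e
work=$(mktemp -d)
git init --bare "$work/remote.git" >/dev/null       # stand-in for the GitHub remote
git clone -q "$work/remote.git" "$work/mirror"
cd "$work/mirror"
git checkout -q -b zoom                             # the branch the plugin targeted
head -c 1024 /dev/urandom > artifact-1.0.jar        # stand-in for a consolidated jar
git add artifact-1.0.jar
git -c user.email=ci@example.com -c user.name=ci \
    commit -q -m "Maven artifacts for com.example:app 1.0"
git push -q origin zoom
git ls-tree --name-only zoom                        # verify the artifact landed
```

Note that GitHub's own file-size limits still apply to pushes made this way, as mentioned in the comment below about working with large files.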
The versions are:
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T11:41:47-05:00)
Maven home: /usr/local/maven/default
Java version: 1.7.0_75, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_75/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "4.1.13-boot2docker", arch: "amd64", family: "unix"
Thank you very much for your work on supporting this project. Wishing you the best of luck in your endeavors.
Even if you get the plugin to work, it's likely to fail anyway because of GitHub's file-size limits; see: https://help.github.com/articles/working-with-large-files/
You should set the JVM heap size via MAVEN_OPTS.
I am facing the same problem. How can it be solved?