OutOfMemory Errors on the Bamboo Server or Local Agent


I am getting 'Out of Memory' errors. How can I allocate more memory to Bamboo?

Heap memory

The default memory settings for Bamboo may be too low for your situation; you may need to increase them to run a larger Bamboo instance with sufficient memory.

To increase the heap space allocated to Bamboo, see Configuring your system properties.

Permanent Generation size

(info) This is only applicable to Bamboo version 5.9 and older. Bamboo version 5.10 requires Java 1.8, which removed the Permanent Generation and with it the -XX:MaxPermSize parameter.

If you get the error message java.lang.OutOfMemoryError: PermGen space, you have exceeded the JVM's Permanent Generation space (64 MB by default on many JVMs), which is used for loading class files. You will need to add the -XX:MaxPermSize argument and increase the memory.

  • JDK 1.4 does not report why the OutOfMemoryError occurred.
  • JDK 1.5 and above are recommended, as they include a description of the error, as in the example above.
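As a sketch of how the flag might be applied (applicable to Bamboo 5.9 and older only): Bamboo's setenv.sh script reads JVM_SUPPORT_RECOMMENDED_ARGS, and the 256m value below is illustrative, not a recommendation.

```shell
# Hypothetical example: increasing PermGen via Bamboo's setenv.sh.
# The 256m value is illustrative only — size it for your instance.
JVM_SUPPORT_RECOMMENDED_ARGS="-XX:MaxPermSize=256m"
echo "Bamboo JVM args: ${JVM_SUPPORT_RECOMMENDED_ARGS}"
```

Restart Bamboo after changing the script for the new setting to take effect.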

Bamboo builds fail due to java.lang.OutOfMemoryError

If you notice Bamboo builds failing with an OutOfMemoryError, you have not allocated sufficient memory to your build.

This section is only relevant if your builds are failing due to memory errors. If your Bamboo server itself is running out of memory, refer to the section above.

Firstly, builds in Bamboo are executed exactly as they would be from the command line. If you haven't already done so, run your build outside Bamboo from the command line and confirm that it succeeds.

If you are using Ant, use ANT_OPTS to increase the allocated memory:

  • For example: ANT_OPTS="-XmsMIN_HEAPm -XmxMAX_HEAPm"

If you are using Maven, use MAVEN_OPTS to increase the allocated memory:

  • For example: MAVEN_OPTS="-XmsMIN_HEAPm -XmxMAX_HEAPm"
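These variables can be exported in the shell environment that launches the build; a minimal sketch, where the 256m/1024m values are illustrative placeholders rather than recommendations:

```shell
# Illustrative heap values only — tune them to your build's actual needs.
export ANT_OPTS="-Xms256m -Xmx1024m"
export MAVEN_OPTS="-Xms256m -Xmx1024m"
echo "ANT_OPTS=${ANT_OPTS}"
echo "MAVEN_OPTS=${MAVEN_OPTS}"
```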

If you get the error java.lang.OutOfMemoryError: GC overhead limit exceeded while running an Ant or Maven build, add the parameter -XX:-UseGCOverheadLimit to ANT_OPTS or MAVEN_OPTS.
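As a sketch, the flag is simply appended to the build tool's existing options (the heap value shown is an assumption for illustration):

```shell
# Hypothetical example: disable the GC overhead limit check for a Maven build.
export MAVEN_OPTS="-Xmx1024m -XX:-UseGCOverheadLimit"
echo "MAVEN_OPTS=${MAVEN_OPTS}"
```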

In rare cases the Bamboo agent itself can throw an OutOfMemoryError while processing the build results. See Configuring Bamboo agent on start-up to increase the agent's heap setting.

OutOfMemoryError due to SVN Memory Leak

If you notice Bamboo becoming extremely slow or hanging, with OutOfMemoryError thrown in the logs, and you are running Bamboo 4.2.1 with a Subversion repository, you are most likely affected by bug BAM-12362, which was fixed in version 4.2.2. We nevertheless recommend upgrading to the latest release for optimal performance.

java.lang.OutOfMemoryError: unable to create new native thread

See the knowledge-base article below for an in-depth explanation of this OutOfMemory error:

Allocating large maximum heap and low minimum heap can slow down Bamboo

If Xmx (the maximum heap size) is much larger than Xms (the initial heap size) Java will spend long periods of time in garbage collection to reach the Xms goal. This will slow down Bamboo performance.

Therefore if you increase the maximum heap size you will also need to increase the initial heap size to avoid unnecessary garbage collection. For heap sizes of 4 GB and larger we recommend setting both Xms and Xmx to the same value.
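A minimal sketch of this guidance, assuming Bamboo's setenv.sh, which reads JVM_MINIMUM_MEMORY and JVM_MAXIMUM_MEMORY; the 4096m value follows the 4 GB guidance above but should be sized for your instance:

```shell
# Sketch: for large heaps, set the initial (-Xms) and maximum (-Xmx)
# heap sizes to the same value to avoid unnecessary garbage collection.
JVM_MINIMUM_MEMORY="4096m"
JVM_MAXIMUM_MEMORY="4096m"
echo "-Xms${JVM_MINIMUM_MEMORY} -Xmx${JVM_MAXIMUM_MEMORY}"
```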

Getting a memory dump on OOM errors

Passing in -XX:+HeapDumpOnOutOfMemoryError will make the JVM create a memory dump when it runs out of memory. To configure Bamboo with this option, see this page.
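A sketch of the relevant JVM arguments: -XX:HeapDumpPath is an optional companion flag that controls where the dump file is written, and the path shown is an example only, not a Bamboo default.

```shell
# Hypothetical example: dump the heap on OutOfMemoryError, writing the
# .hprof file to /var/tmp (path is illustrative — pick a disk with space).
JVM_SUPPORT_RECOMMENDED_ARGS="-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp"
echo "Bamboo JVM args: ${JVM_SUPPORT_RECOMMENDED_ARGS}"
```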

(warning) You need to restart the Bamboo server for changes to take effect. If you have any elastic agents running, ensure that they are shut down before you restart the Bamboo server. If you do not shut down your elastic instances before restarting, they will continue to run and become orphaned from your Bamboo server.

OutOfMemoryError when running a build with Clover

Please see the following articles:

Last modified on Sep 20, 2019
