Bitbucket Server crashes or stops responding due to java.lang.OutOfMemoryError: Java heap space


Platform notice: Server and Data Center only. This article only applies to Atlassian products on the Server and Data Center platforms.

Support for Server* products ended on February 15th 2024. If you are running a Server product, you can visit the Atlassian Server end of support announcement to review your migration options.

*Except Fisheye and Crucible


Bitbucket Server crashes or stops responding. The following appears in the atlassian-bitbucket.log:

java.lang.OutOfMemoryError: Java heap space


What does this generally mean?

The most common reason for the java.lang.OutOfMemoryError: Java heap space error is simple: the application requires more Java heap space than is available to it to operate normally. In other words, you are trying to fit an XXL application into an S-sized Java heap space.

Other causes of this OutOfMemoryError are more complex and stem from usage patterns or programming errors:

  • Spikes in usage/data volume. The application was designed to handle a certain number of users or a certain volume of data. When the number of users or the volume of data suddenly spikes and crosses that expected threshold, an operation that functioned normally before the spike ceases to operate and triggers the java.lang.OutOfMemoryError: Java heap space error.
  • Memory leaks. A particular type of programming error causes an application to consume more and more memory. Every time the leaking functionality is used, it leaves some objects behind in the Java heap space. Over time the leaked objects consume all of the available heap and trigger the same java.lang.OutOfMemoryError: Java heap space error.
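The memory-leak pattern above can be sketched in a few lines of Java. This is a hypothetical illustration (the class and method names are invented for this example, not Bitbucket code): a static collection that grows on every request but is never cleared, so the retained objects can never be garbage-collected.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical leak: a static collection that is appended to on every
// request but never cleared. Each call leaves objects behind on the heap;
// over enough requests they exhaust it and the JVM throws
// java.lang.OutOfMemoryError: Java heap space.
public class LeakyAuditLog {
    // Objects reachable from a static field are never garbage-collected
    // while the class remains loaded.
    private static final List<byte[]> ENTRIES = new ArrayList<>();

    public static void onRequest() {
        ENTRIES.add(new byte[1024]); // 1 KB retained per request, forever
    }

    public static int retainedEntries() {
        return ENTRIES.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            onRequest();
        }
        // ~1 MB is now retained and can never be reclaimed by the GC.
        System.out.println("Retained entries: " + retainedEntries());
    }
}
```

With a small heap and enough iterations, a loop like this eventually fails with the exact error shown in atlassian-bitbucket.log.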

Common reasons observed in Bitbucket Server

  1. Third-party add-ons that require more resources
    • Third-party add-ons such as Awesome Graphs may require more memory. If you run many third-party add-ons and are facing OutOfMemory issues, you may need to increase the heap size.
  2. Versions of Yet Another Commit Checker prior to 1.13
    • The following stack trace, generated by JGit, appears in the log:

      2017-02-20 13:51:11,121 ERROR [threadpool:thread-2]  c.a.s.i.c.StateTransferringExecutor Error while processing asynchronous task
      java.lang.OutOfMemoryError: Java heap space
      at<init>( ~[na:na]

Because JGit is not part of the Bitbucket application itself, the cause of this failure has to be looked for in one of the installed add-ons. In this case the stack trace points to a specific add-on, Yet Another Commit Checker:

      at com.isroot.stash.plugin.YaccHook.onReceive( ~[na:na]
  3. OutOfMemory errors caused by the behavior explained in BSERV-10100


Check the atlassian-bitbucket.log for the specific log entries mentioned above; for a detailed analysis, have a look at How to debug Out of Memory Heap Space.


Scenario #1

If you are using third-party add-ons that consume more resources, allocate a larger heap. We don't recommend setting the heap larger than 2 GB unless specifically instructed to by Atlassian Support for troubleshooting a specific problem.
Please read Scaling Bitbucket Server - Memory to know more about allocating memory.
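After increasing the heap (via the -Xmx JVM option in Bitbucket's start scripts; the exact file varies by version, so follow the Scaling Bitbucket Server - Memory page for your release), a quick way to confirm what the JVM actually received is the standard Runtime API. This is a minimal sketch you can run standalone, not part of Bitbucket:

```java
// Minimal sketch: print the heap limits granted to the running JVM.
// Run with e.g. `java -Xmx2g HeapCheck` to verify that the -Xmx value
// you configured is actually in effect.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb = rt.maxMemory() / (1024 * 1024);     // upper bound (-Xmx)
        long totalMb = rt.totalMemory() / (1024 * 1024); // currently committed
        long freeMb = rt.freeMemory() / (1024 * 1024);   // free within committed
        System.out.println("max=" + maxMb + "MB total=" + totalMb
                + "MB free=" + freeMb + "MB");
    }
}
```

If the printed maximum does not match the value you configured, the JVM is not picking up your setting and the OutOfMemoryError will recur regardless of the configuration change.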

Scenario #2

The issue is caused by a known bug in the add-on, documented at YACC uses a lot of JVM heap for org.eclipse.jgit objects -- can cause OOME under heavy load. The issue is fixed in the add-on from version 1.13 onwards.

Scenario #3

Upgrade Bitbucket Server to the latest version, or to a version in which BSERV-10100 is fixed.

Last modified on Jun 13, 2023
