To our knowledge, JIRA does not have any memory leaks. We know of various public high-usage JIRA instances (eg. 40k issues, 100+ new issues/day, 22 pages/min in 750Mb of memory) that run for months without problems. When memory problems do occur, the following checklist can help you identify the cause.
Too little memory allocated?
Check the System Info page (see Increasing JIRA memory) after a period of sustained JIRA usage to determine how much memory is allocated.
Too much memory allocated?
When increasing Java's memory allocation with -Xmx, please ensure that your system actually has that amount of memory free. For example, if you have a server with 1Gb of RAM, most of it is probably already taken by the operating system, the database and so on. Setting -Xmx1g for a Java process on such a server would be a very bad idea. Java would claim most of this memory from swap (disk), which would dramatically slow down everything on the server. If the system ran out of swap, you would get OutOfMemoryErrors.
If the server does not have much memory free, it is better to set -Xmx conservatively (eg. -Xmx256m), and only increase -Xmx when you actually see OutOfMemoryErrors. Java's memory management will work to keep within the limit, which is better than going into swap.
Bugs in older JIRA versions
Please make sure you are using the latest version of JIRA; memory leaks are regularly fixed in newer releases.
Too many webapps (out of PermGen space)
People running multiple JSP-based web applications (eg. JIRA and Confluence) in one Java server are likely to see this error:

    java.lang.OutOfMemoryError: PermGen space
Java reserves a fixed 64Mb block for loading class files, and with more than one webapp this is often exceeded. You can fix this by setting the -XX:MaxPermSize=128m property. See the Increasing JIRA memory page for details.
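For example, on JIRA Standalone the flag can be added to JAVA_OPTS in bin/setenv.sh (other app servers have their own way of passing JVM options); a minimal sketch:

    # bin/setenv.sh -- raise the PermGen limit for multiple webapps
    JAVA_OPTS="${JAVA_OPTS} -XX:MaxPermSize=128m"
    export JAVA_OPTS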
Tomcat memory leak
Tomcat caches JSP content. If JIRA is generating huge responses (eg. multi-megabyte Excel or RSS views), then these cached responses will quickly fill up memory and result in OutOfMemoryErrors.
In Tomcat 5.5.15+ there is a workaround: set the org.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true system property (see how). For earlier Tomcat versions, including that used in JIRA Standalone 3.6.x and earlier, there is no workaround; please upgrade Tomcat, or switch to another app server.
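Because this is a Java system property, it is passed to the JVM with -D. A minimal sketch, again assuming JIRA Standalone's bin/setenv.sh:

    # bin/setenv.sh -- stop Jasper caching huge JSP body content
    JAVA_OPTS="${JAVA_OPTS} -Dorg.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true"
    export JAVA_OPTS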
We strongly recommend running JIRA in its own JVM (app server instance), so that web applications cannot affect each other, and each can be restarted/upgraded separately. Usually this is achieved by running app servers behind Apache or IIS.
If you are getting OutOfMemoryErrors, separating the webapps should be your first action. It is virtually impossible to work out retroactively which webapp is consuming all the memory.
Huge notificationinstance table
In order to correctly 'thread' email notifications in mail clients, JIRA tracks the Message-Id header of the mails it sends. In heavily used systems, the notificationinstance table can become huge, with millions of records. This can cause OutOfMemoryErrors in the JDBC driver when it is asked to generate an XML export of the data (see JRA-11725).
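If you suspect this, a quick check is to count the table's rows with your usual database client; a trivial sketch:

    -- how many Message-Id records has JIRA accumulated?
    SELECT COUNT(*) FROM notificationinstance;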
Services (custom, CVS, etc)
Occasionally people write their own services, which can cause memory problems if (as is often the case) they iterate over large numbers of issues. If you have any custom services, please try disabling them for a while to eliminate them as a cause of problems.
The CVS service sometimes causes memory problems, if used with a huge CVS repository (in this case, simply increase the allocated memory).
A symptom of a CVS (or general services-related) problem is that JIRA will run out of memory just minutes after startup.
JIRA backup service with large numbers of issues
Do you have hundreds of thousands of issues? Is JIRA's built-in backup service running frequently? If so, please switch to a native backup tool and disable the JIRA backup service, which will be taking a lot of CPU and memory to generate backups that are unreliable anyway (due to lack of locking). See the JIRA backups documentation for details.
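What a native backup tool looks like depends on your database. As an illustrative sketch for MySQL (database name, user and file name are hypothetical):

    # consistent InnoDB dump of the JIRA database without stopping JIRA
    mysqldump --single-transaction -u jirauser -p jiradb > jira-backup-$(date +%Y%m%d).sql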
JIRA mail misconfiguration causing comment loops
Does a user have an email address that is the same as one of the mail accounts used by your mail handler services? This can cause a comment loop: a notification is sent out, picked up by the mail handler and appended to the issue as a comment, which triggers another notification, and so on. If a user then views such an issue, it can consume a lot of memory. You can query your database for issues with more than 50 comments (see the sketch below). Fifty comments can be perfectly normal for some issues, so look for irregular patterns in the comments themselves, such as repeated notification text.
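The exact query depends on your database, but as a sketch against the standard JIRA schema (this assumes comments live in the jiraaction table with actiontype = 'comment'):

    -- issues with an unusually large number of comments
    SELECT issueid, COUNT(*) AS comment_count
    FROM jiraaction
    WHERE actiontype = 'comment'
    GROUP BY issueid
    HAVING COUNT(*) > 50
    ORDER BY COUNT(*) DESC;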
Eclipse Mylyn plugin
If your developers use the Eclipse Mylyn plugin, make sure they are using the latest version. The Mylyn bundled with Eclipse 3.3 (2.0.0.v20070627-1400) uses the getProjects method, causing problems as described above.
Huge XML/RSS or SOAP requests
This applies particularly to publicly visible JIRA instances. Sometimes a crawler can slow down JIRA by making multiple huge requests. Every now and then someone misconfigures their RSS reader to request XML for every issue in the system, and sets it running once a minute. Similarly, people sometimes write SOAP clients without considering the performance impact, and set them running automatically. JIRA might survive these (although it will seem oddly slow), but then run out of memory when a legitimate user's large Excel view pushes it over the limit.
The best way to diagnose unusual requests is to enable Tomcat access logging (on by default in JIRA Standalone), and look for requests that take a long time.
In JIRA 3.10 there is a jira.search.views.max.limit property you can set in WEB-INF/classes/jira-application.properties, which places a hard limit on the number of search results returned. It is a good idea to set this for sites subject to crawler traffic.
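For example (the value of 1000 is only illustrative; choose a limit appropriate for your site):

    # WEB-INF/classes/jira-application.properties
    # cap the number of issues returned by XML/RSS/Excel search views
    jira.search.views.max.limit = 1000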
Unusual JIRA usage
Every now and then someone reports memory problems, and after much investigation we discover they have 3,000 custom fields, or are parsing 100Mb emails, or have in some other way used JIRA in unexpected ways. Please be aware of where your JIRA installation deviates from typical usage.
If you have been through the list above, there are a few further diagnostics which may provide clues.
Getting memory dumps
By far the most powerful and effective way of identifying memory problems is to have JIRA dump the contents of its memory when it exits due to an OutOfMemoryError. These mechanisms run with no noticeable performance impact. This can be done in one of two ways:
- On Sun's JDK 1.5.0_07 and above, or 1.4.2_12 and above, set the -XX:+HeapDumpOnOutOfMemoryError option. If JIRA runs out of memory, it will create a java_pid*.hprof file containing the memory dump in the directory you started JIRA from.
- On other platforms, you can use the YourKit profiler agent. YourKit can take memory snapshots when the JVM exits, when an OutOfMemoryError is imminent (eg. 95% memory used), or when manually triggered. The agent part of YourKit is freely redistributable. For more information, see Profiling Memory and CPU usage with YourKit.
Please reduce your maximum heap size (-Xmx) to 750m or so, so that the generated heap dump is of manageable size. You can turn -Xmx up once a heap dump has been taken.
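For JIRA Standalone, both settings can go in JAVA_OPTS in bin/setenv.sh; a minimal sketch:

    # bin/setenv.sh -- modest heap plus a heap dump on OutOfMemoryError
    JAVA_OPTS="${JAVA_OPTS} -Xmx750m -XX:+HeapDumpOnOutOfMemoryError"
    export JAVA_OPTS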
Enable gc logging
Garbage collection logging records each collection, along with the heap size before and after it. The resulting log can be parsed with tools like GCViewer to get an overall picture of memory use over time.
To enable gc logging, start JIRA with the options -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -verbose:gc -Xloggc:gc.log. Replace gc.log with an absolute path to a gc.log file, so there is no ambiguity about where the file is written.
For example, with a Windows service, add these options to the service's Java options. If you run JIRA from bin/setenv.sh, add them to JAVA_OPTS there; you will need to restart JIRA for the changes to take effect.
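A sketch of the corresponding bin/setenv.sh entry (the log path is illustrative):

    # bin/setenv.sh -- GC logging to an absolute path
    JAVA_OPTS="${JAVA_OPTS} -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -verbose:gc -Xloggc:/opt/jira/logs/gc.log"
    export JAVA_OPTS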
Enable access logging
It is important to know what requests are being made, so unusual usage can be identified. For instance, perhaps someone has configured their RSS reader to request a 10Mb RSS file once a minute, and this is killing JIRA.
If you are using Tomcat, access logging can be enabled by adding an AccessLogValve to conf/server.xml, inside the <Host> element.
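A sketch of such a Valve; the pattern is the common log format with %S and %D appended, and the directory and prefix are chosen to produce the logs/access_log.<date> files mentioned below:

    <Valve className="org.apache.catalina.valves.AccessLogValve"
           directory="logs" prefix="access_log." suffix=""
           pattern="%h %l %u %t &quot;%r&quot; %s %b %S %D"
           resolveHosts="false"/>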
%S logs the session ID, allowing requests from distinct users to be grouped; %D logs the request time in milliseconds. Logs will appear in logs/access_log.<date>.
Alternatively, if you are not using Tomcat or cannot modify the app server config, JIRA has built-in user access logging which can be enabled from the admin section and produces terser logs.