JIRA Software 7.6 Enterprise Release performance report
This page compares the performance of JIRA 7.2 and JIRA 7.6 Enterprise Release.
About Enterprise Releases
We recommend upgrading JIRA regularly. However, if your organization's processes mean you only upgrade about once a year, upgrading to an Enterprise Release may be a good option, as it provides continued access to critical security, stability, data integrity, and performance fixes until the version reaches end of life.
JIRA 7.6 was not focused solely on performance; however, we aim to provide the same, if not better, performance with each release. In this section, we compare JIRA 7.2 with JIRA 7.6 Enterprise Release, for both Server and Data Center. We ran the same extensive test scenario against both JIRA versions.
The following chart presents mean response times of individual actions performed in JIRA. To check the details of these actions and the JIRA instance they were performed in, see Testing methodology.
Response times for JIRA actions
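The chart reports the mean response time per action. As a minimal sketch of how such per-action means are derived, the snippet below averages raw timing samples per action; the sample values here are hypothetical placeholders, not measurements from the report.

```python
from statistics import mean

# Hypothetical raw response-time samples in milliseconds, keyed by action
# name. In the real tests these come from the scripted browser runs
# described under Testing methodology.
samples = {
    "View Issue": [180, 210, 195],
    "Search with JQL": [850, 1200, 990],
}

# Mean response time per action, as plotted in the chart.
mean_response = {action: mean(times) for action, times in samples.items()}
print(mean_response["View Issue"])  # 195
```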
The following sections detail the testing environment, including hardware specification, and methodology we used in our performance tests.
Before we started the test, we needed to determine what size and shape of data set represents a typical large JIRA instance. To achieve that, we used our Analytics data to form a picture of our customers' environments and what difficulties they face when scaling JIRA in a large organization.
Baseline JIRA data set
| JIRA data dimension | Value |
| --- | --- |
We chose a mix of actions that represents a sample of the most common user operations. An action in this context is a complete user operation, such as opening an issue in the browser window. The following table details the actions included in the script for our testing persona, indicating how many times each action is repeated during a single test run.
| Action name | Description | Number of times performed per test run |
| --- | --- | --- |
| View Dashboard | Opening the Dashboard page. | 10 |
| Create Issue | Submitting the Create Issue dialog. | 5 |
| View Issue | Opening an individual issue in a separate browser window. | 55 |
| Edit Issue | Editing the Summary, Description, and other fields of an existing issue. | 5 |
| Add Comment | Adding a comment to an issue. | 2 |
| Search with JQL | Performing a search query using JQL in the Issue Navigator. The following JQL queries were used... Half of these queries are very heavyweight, which explains the high average response time. | |
| View Board | Opening an Agile board. | 10 |
| Browse Projects | Opening the list of projects (available under Projects > View All Projects). | 5 |
| Browse Boards | Opening the list of Agile boards (available under Agile > Manage Boards). | 2 |
| All Actions | The mean across all actions performed during a single test run. | - |
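The action mix above can be expanded into a flat, shuffled schedule for one test run. This is a minimal sketch, not the actual test harness; the Search with JQL repetition count is not given in the table, so that action is omitted here.

```python
import random

# Action mix from the table above: (action name, repetitions per test run).
# "Search with JQL" has no count listed in the table, so it is left out.
ACTION_MIX = [
    ("View Dashboard", 10),
    ("Create Issue", 5),
    ("View Issue", 55),
    ("Edit Issue", 5),
    ("Add Comment", 2),
    ("View Board", 10),
    ("Browse Projects", 5),
    ("Browse Boards", 2),
]

def build_run_schedule(seed=None):
    """Expand the mix into a flat, shuffled list of actions for one run."""
    schedule = [name for name, count in ACTION_MIX for _ in range(count)]
    random.Random(seed).shuffle(schedule)
    return schedule

schedule = build_run_schedule(seed=42)
print(len(schedule))  # 94
```

With the listed counts, one run performs 94 scheduled actions, dominated by View Issue (55 repetitions), which mirrors how users spend most of their time reading issues.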
The performance tests were all run on a set of AWS EC2 instances. For each test, the entire environment was reset and rebuilt, and then each test started with some idle cycles to warm up instance caches. Below, you can check the details of the environments used for JIRA Server and JIRA Data Center, as well as the specifications of the EC2 instances.
To run the tests, we used 10 scripted browsers and measured the time taken to perform the actions. Each browser was scripted to perform a random action from a predefined list and immediately move on to the next one (i.e., zero think time). Note that this results in each browser performing substantially more tasks than a real user could, so the number of browsers should not be equated with the number of real-world concurrent users.
Each test was run for 20 minutes, after which statistics were collected.
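The load-generation loop described above can be sketched as follows. This is a simplified stand-in for the real harness: `perform_action` is a hypothetical placeholder for driving an actual browser, and the run duration is shortened for illustration.

```python
import random
import threading
import time
from collections import defaultdict

def perform_action(action):
    """Hypothetical stand-in for one JIRA action in a real browser;
    it would block until the page finishes loading."""
    time.sleep(0.001)  # placeholder for the real page-load time

def scripted_browser(actions, duration_s, results, lock):
    """One scripted browser: pick a random action, run it, record the
    elapsed time, and immediately continue (zero think time)."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        action = random.choice(actions)
        start = time.monotonic()
        perform_action(action)
        elapsed = time.monotonic() - start
        with lock:
            results[action].append(elapsed)

actions = ["View Dashboard", "View Issue", "Search with JQL"]
results, lock = defaultdict(list), threading.Lock()
# 10 scripted browsers, as in the test setup; the real run lasts
# 20 minutes, shortened here to 0.05 s.
threads = [
    threading.Thread(target=scripted_browser, args=(actions, 0.05, results, lock))
    for _ in range(10)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
# After the run, per-action timing statistics are collected from `results`.
```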
Here are the details of our test environment:
The environment with JIRA Server consisted of 1 node. The environment with JIRA Data Center consisted of 2 nodes. For each environment, the specifications cover the EC2 instance type, CPU, number of CPU cores, operating system, Java options, and the automation script used.