Jira Software 7.13 Long Term Support release performance report


This page compares the performance of Jira 7.6 and Jira 7.13 Long Term Support release. 

About Long Term Support releases
We recommend upgrading Jira regularly. However, if your organisation's process means you only upgrade about once a year, upgrading to a Long Term Support release may be a good option: it provides continued access to fixes for critical security, stability, data integrity, and performance issues until the version reaches end of life. 

Performance

This is an excerpt from the Jira 7.13 performance and scaling report, focusing on performance results for Jira 7.13. You can see the full report here.

Jira 7.13 was not focused solely on performance; however, we aim to provide the same, if not better, performance with each release. In this section, we compare Jira 7.6 with the Jira 7.13 Long Term Support release, for both Server and Data Center. We ran the same extensive test scenario against both Jira versions. 

The following chart presents mean response times of individual actions performed in Jira. To check the details of these actions and the Jira instance they were performed in, see Testing methodology.

Response times for Jira actions

Results

  • Jira 7.13.2 shows significant improvements in response times when viewing boards (-30% Server, -30% Data Center), viewing backlogs (-10% Server, -12% Data Center), and viewing the project summary (-40% Server, -37% Data Center).

  • Most actions perform similarly between the two versions. We observed small performance degradations when browsing boards (+3% Server, -9% Data Center), viewing dashboards (+5% Server, +1% Data Center), browsing projects (+6% Server, +1% Data Center), and adding comments (0% Server, +1% Data Center).

  • The mean of all actions has improved in Jira 7.13.2 (-5% Server, -5% Data Center).
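The percentage deltas above follow the usual relative-change convention, where negative values mean the newer version is faster. A minimal sketch (the sample numbers below are illustrative only, not measurements from this report):

```python
def percent_change(baseline_ms: float, current_ms: float) -> float:
    """Relative change of a mean response time, in percent.

    Negative values mean the newer version is faster than the baseline.
    """
    return (current_ms - baseline_ms) / baseline_ms * 100

# Illustrative values only -- not the actual report measurements.
print(round(percent_change(500.0, 350.0)))  # a 30% improvement prints -30
```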

Testing methodology

The following sections detail the testing environment, including hardware specification, and methodology we used in our performance tests.

How we tested

Before we started the test, we needed to determine what size and shape of data set represents a typical large Jira instance. To achieve that, we used our Analytics data to form a picture of our customers' environments and what difficulties they face when scaling Jira in a large organization.

The following table presents rounded values of the 999th permille (that is, the 99.9th percentile) of each data dimension. We used these values to generate a sample dataset with random test data in Jira Data Generator.
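The 999th permille is the value below which 99.9% of observations fall. A minimal sketch of how such a cutoff can be computed with the nearest-rank method (the input data here is hypothetical; the real inputs came from Atlassian Analytics):

```python
import math

def permille(values, k=999):
    """Return the k-th permille (the k/1000 quantile) of a dataset
    using the nearest-rank method; k=999 is the 99.9th percentile."""
    ordered = sorted(values)
    rank = math.ceil(k / 1000 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Hypothetical per-instance issue counts, 1..2000.
counts = list(range(1, 2001))
print(permille(counts))  # prints 1998
```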

Baseline Jira data set

Jira data dimension    Value
Issues                 1,000,000
Projects               1,500
Custom Fields          1,400
Workflows              450
Attachments            660,000
Comments               2,900,000
Agile Boards           1,450
Users                  100,000
Groups                 22,500
Security Levels        170
Permissions            200
Actions performed

We chose a mix of actions that represents a sample of the most common user actions. An action in this context is a complete user operation, like opening an issue in the browser window. The following table details the actions that we included in the script for our testing persona, indicating how many times each action is repeated during a single test run.

Action name      Description                                                               Times per test run
View Dashboard   Opening the Dashboard page.                                               10
Create Issue     Submitting the Create Issue dialog.                                       5
View Issue       Opening an individual issue in a separate browser window.                 55
Edit Issue       Editing the Summary, Description, and other fields of an existing issue.  5
Add Comment      Adding a comment to an issue.                                             2
Search with JQL  Performing a search query using JQL in the Issue Navigator.               10
View Board       Opening an Agile board.                                                   10
Browse Projects  Opening the list of projects (Projects > View All Projects).              5
Browse Boards    Opening the list of Agile boards (Agile > Manage Boards).                 2
All Actions      The mean of all actions performed during a single test run.               -

The following JQL queries were used for the Search with JQL action. Half of these queries are very heavyweight, which explains the high average response time of this action.

  • description ~ 'totally' or summary ~ 'hobos' and comment !~ 'propel' ORDER BY key
  • reporter was in (admin) and status = Closed order by createdDate
  • comment ~ 'contest* saw' and reporter was admin order by assignee desc
  • text ~ 'a* ba*' ORDER BY Status, summary
  • priority was in (Low, Lowest) or (status = 'In Progress' and assignee changed) or createdDate > '2016/07/02 00:00'
  • resolution = Unresolved and priority = Low
  • text ~ 'witch* doctrine' and status was not Closed order by priority
  • project = novemcinctus and assignee = admin ORDER BY assignee
  • assignee in membersOf('users_1') order by project
  • not (status = closed and resolution = Done and priority = High)
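The action mix above can be expressed as a set of per-run repetition counts. The sketch below expands those counts into one shuffled test-run schedule; the action names come from the table, but the scheduling logic itself is our assumption, not the published harness:

```python
import random

# Repetition counts per test run, taken from the table above.
ACTION_MIX = {
    "View Dashboard": 10,
    "Create Issue": 5,
    "View Issue": 55,
    "Edit Issue": 5,
    "Add Comment": 2,
    "Search with JQL": 10,
    "View Board": 10,
    "Browse Projects": 5,
    "Browse Boards": 2,
}

def build_run_schedule(rng: random.Random) -> list[str]:
    """Expand the mix into one shuffled schedule for a single test run."""
    schedule = [name for name, n in ACTION_MIX.items() for _ in range(n)]
    rng.shuffle(schedule)
    return schedule

schedule = build_run_schedule(random.Random(0))
print(len(schedule))  # prints 104, the total number of actions per run
```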
Test environment

The performance tests were all run on a set of AWS EC2 instances. For each test, the entire environment was reset and rebuilt, and then each test started with some idle cycles to warm up instance caches. Below, you can check the details of the environments used for Jira Server and Jira Data Center, as well as the specifications of the EC2 instances.

To run the tests, we used 10 scripted browsers and measured the time taken to perform the actions. Each browser was scripted to perform a random action from a predefined list and immediately move on to the next action (i.e. zero think time). Note that this results in each browser performing substantially more tasks than would be possible for a real user, so the number of browsers should not be equated with the number of real-world concurrent users.

Each test was run for 20 minutes, after which statistics were collected. 
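The measurement loop described above can be sketched as follows. This is a simplified single-threaded stand-in, not the actual Java/WebDriver harness: `perform` is a hypothetical callable representing one browser action, and the loop records elapsed times per action until a deadline, then reports the mean:

```python
import time
from collections import defaultdict
from statistics import mean

def run_browser(perform, schedule, deadline):
    """Drive one scripted 'browser': perform actions back to back
    (zero think time) until the deadline, recording elapsed times.

    `perform` is a stand-in for a real WebDriver action -- an assumption,
    not part of the published test harness.
    """
    timings = defaultdict(list)
    i = 0
    while time.monotonic() < deadline and i < len(schedule):
        action = schedule[i]
        start = time.monotonic()
        perform(action)
        timings[action].append(time.monotonic() - start)
        i += 1
    # Mean response time per action, as collected after the run.
    return {action: mean(samples) for action, samples in timings.items()}

# Usage with a no-op action and a short run:
stats = run_browser(lambda a: None, ["View Issue", "View Issue"], time.monotonic() + 1)
print(sorted(stats))  # prints ['View Issue']
```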

Here are the details of our test environment:

The environment with Jira Server consisted of:

  • 1 Jira node
  • Database on a separate node
  • Load generator on a separate node

The environment with Jira Data Center consisted of:

  • 2 Jira nodes
  • Database on a separate node
  • Load generator on a separate node
  • Shared home directory on a separate node
  • Load balancer (AWS ELB HTTP load balancer)
Jira

Hardware
  EC2 type:   c4.8xlarge (see EC2 types)
              Jira Server: 1 node
              Jira Data Center: 2 nodes
  CPU:        Intel Xeon E5-2666 v3 (Haswell)
  CPU cores:  36
  Memory:     60 GB
  Disk:       AWS EBS 100 GB gp2

Software
  Operating system:  Ubuntu 16.04 LTS
  Java platform:     Java 1.8.0
  Java options:      8 GB heap
Database

Hardware
  EC2 type:   c4.8xlarge (see EC2 types)
  CPU:        Intel Xeon E5-2666 v3 (Haswell)
  CPU cores:  36
  Memory:     60 GB
  Disk:       Jira Server: AWS EBS 100 GB gp2
              Jira Data Center: AWS EBS 60 GB gp2

Software
  Database:          MySQL 5.5
  Operating system:  Ubuntu 16.04 LTS
Load generator

Hardware
  EC2 type:   c4.8xlarge (see EC2 types)
  CPU:        Intel Xeon E5-2666 v3 (Haswell)
  CPU cores:  36
  Memory:     60 GB
  Disk:       AWS EBS 30 GB gp2

Software
  Operating system:   Ubuntu 16.04 LTS
  Browser:            Google Chrome 62
  Automation script:  Chromedriver 2.33, WebDriver 3.4.0, Java JDK 8u131
Last modified on Jun 26, 2020
