Performance Testing Scripts

Please be aware that the content on this page is not actively maintained and Atlassian cannot guarantee support for it. Furthermore, the performance testing scripts that you can download from Atlassian's public Maven repository (via the link on this page) are no longer supported.

This page is provided for your information only and you use it at your own risk. Instead of using these scripts, we recommend the approach described on our JIRA Performance Testing with Grinder page.

This page contains scripts and hints for testing usage load on your JIRA installation.

When setting up a new JIRA installation, it is useful to understand how it will perform under your anticipated load before users begin accessing it. Scripts that generate 'request' (or usage) load are provided in our public Maven repository (link below). Using these scripts, you can find out where you may need to consider improving your configuration to remove bottlenecks.

While this kind of testing is not an exact science, the tools and processes described here are intended to be straightforward and configurable, and to provide an extensible starting point for load testing.

The performance tests described on this page utilise JMeter. While it is not necessary to know JMeter, briefly reading through the JMeter documentation is recommended as it may help you resolve any JMeter-specific issues.

It is rarely the case that these scripts will perform representative testing for you 'out of the box'. However, it should be possible to build an appropriate load test by configuring or extending these scripts.

Load testing scripts should not be used on a production JIRA installation!


While we recommend using a copy of your production data for testing usage load, the load testing scripts below will modify data within the targeted JIRA installation! Hence, these scripts should not be used on a production JIRA installation. Instead, use a copy of your production JIRA data on a test JIRA installation.

If you do run these test scripts against your production JIRA installation, you will be responsible for any data loss and backup recovery!

Likewise, when making changes to your JIRA installation to remove performance bottlenecks, it is useful to assess the impact of these changes in a test JIRA installation before implementing them in production.

Prerequisites

You will need the following:

  • A JIRA installation, set up and running with an administrator user. The scripts assume that the username/password combination of this user is 'admin'/'admin'.
  • It is recommended that you test with a production quality database, such as one listed on the Supported Platforms page. Do not use HSQLDB.
  • Apache JMeter (currently version 2.3.4). If you intend to do high load testing, please use our modified version of JMeter instead (which requires Java 1.6).
  • The load testing scripts and resources which are available in our public Maven repository — Please choose the version that most closely matches your JIRA version and download the ZIP or Gzip file in that directory. If in doubt, download the ZIP file archive.

Users have reported problems using the Windows built-in unzip utility to extract these archives. If you encounter such a problem, please use a third party file archiving and extraction program (for example, 7-Zip) to extract these performance tests.

Quick, just tell me how to run these tests!

If you do not want to read the rest of this document, here are the main points:

  1. Create the setup testdata:

    <jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx -Jadmin.user=<username> -Jadmin.pass=<password>
  2. Run the fixed load test:

    <jmeter location>/bin/jmeter -n -t jmeter-test-fixedload.jmx

The remainder of this document is just an elaboration of those two steps.

For information on how to use JMeter please refer to the JMeter documentation.

Performance Tests

JIRA performance tests are made up of two parts:

  • Setup test — runs first and prepares the JIRA installation for a subsequent fixed load test
  • Fixed load test — simulates a number of users accessing the JIRA installation.

Setup Test

The setup test is responsible for:

  • Creating projects
  • Creating users
  • Creating and commenting on (and optionally resolving) issues.

Running the setup test:

After extracting the performance test zip file, change into the performanceTest directory. From this directory, run the performance setup test:

<jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx -Jadmin.user=<username> -Jadmin.pass=<password>

where <jmeter location> is the base directory of your JMeter installation.

(info) If you omit the -n switch, JMeter will run as a GUI. You may then start the test from within the GUI.

As seen above with the admin.user and admin.pass parameters, JMeter supports -Jparameter=value command-line arguments to control execution. The following parameters control the setup test execution:
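For example, a setup run against a JIRA installation on a different host and port might look like the following. The host name, port, and context here are illustrative placeholders, not values taken from this page; substitute your own.

```shell
# Illustrative parameter overrides; all values below are placeholders.
<jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx \
  -Jjira.host=jira-test.example.com \
  -Jjira.port=8080 \
  -Jjira.context=/jira \
  -Jadmin.user=admin \
  -Jadmin.pass=admin
```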

Configuration Control

  • jira.host (default: localhost): The hostname or address of the JIRA installation.
  • jira.port (default: 8000): The network port that the JIRA installation is running on.
  • jira.context (default: /): JIRA webapp context.
  • admin.user (default: admin): Administrator username.
  • admin.pass (default: admin): Administrator password.
  • script.base (default: .): The location of the performance tests. This should only be set if you run the tests from outside the scripts directory.
  • remove.data (default: false): Running the script with this enabled will remove the users and projects created by the test.
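To tear down generated data after a run, the remove.data flag can be passed on the command line. This is a sketch of the expected usage; as with all of these scripts, run it only against a test installation.

```shell
# Re-run the setup script with remove.data enabled to delete the users
# and projects that a previous setup run created.
<jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx \
  -Jadmin.user=<username> -Jadmin.pass=<password> \
  -Jremove.data=true
```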

User Control

  • create.users.enable (default: true): Create users in the target JIRA installation. Use false if you have already created the users elsewhere.
  • browseissue.max (default: 250): The number of users to be created for browsing the JIRA installation ("browseissue" users).
  • createissue.max (default: 250): The number of users to be created for creating issues ("createissue" users).
  • editissue.max (default: 250): The number of users to be created for editing issues ("editissue" users).
  • search.max (default: 250): The number of users to be created for searching issues ("search" users).
  • useraction.max (default: 250): The number of users to be created for browsing user information ("useraction" users).
  • browseissue.groupname (default: none): The group in which "browseissue" users will be placed. Use none for no group.
  • createissue.groupname (default: jira-developers): The group in which "createissue" users will be placed. Use none for no group.
  • editissue.groupname (default: jira-developers): The group in which "editissue" users will be placed. Use none for no group.
  • search.groupname (default: none): The group in which "search" users will be placed. Use none for no group.
  • useraction.groupname (default: jira-developers): The group in which "useraction" users will be placed. Use none for no group.

Project Control

  • create.projects.enable (default: true): Create projects. Use false if you want to use existing projects (in existing data).
  • project.max (default: 20): The number of projects to create in the system.

Issue Control

  • create.issues.enable (default: true): Create issues in the target JIRA installation. Use false if you do not want the test to create sample issues.
  • issue.max (default: 3000): The number of issues to be created.
  • issue.comment.enable (default: true): Controls whether or not comments are added to issues.
  • issue.comment.max (default: 10): If issue.comment.enable is true, the number of comments created on an issue is chosen randomly between 0 and this value.
  • issue.close (default: true): Controls whether or not issues will be closed automatically after being created.
  • issue.close.percentage (default: 60): If issue.close is enabled, this value defines the percentage of issues closed.
  • issue.setupload.threads (default: 10): The number of threads used for creating the issues.
  • issue.setupload.pause (default: 50): The amount of time (in milliseconds) for which a simulated user will 'sleep' between each request during issue creation.
  • resource.dir (default: resources): The directory which contains the CSV data resources.
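For a quick smoke test before committing to a full-sized run, the data-volume parameters above can be scaled down together. The figures below are arbitrary small values chosen purely for illustration.

```shell
# Illustrative small data set: 2 projects, 100 issues, 10 users per group.
<jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx \
  -Jproject.max=2 \
  -Jissue.max=100 \
  -Jbrowseissue.max=10 -Jcreateissue.max=10 -Jeditissue.max=10 \
  -Jsearch.max=10 -Juseraction.max=10
```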

Test Output

Once you have chosen your target settings, run JMeter and you should get output similar to the following:

jmeter -n -t jmeter-test-setup.jmx
Created the tree successfully using jmeter-test-setup.jmx
Starting the test @ Mon Oct 26 23:53:28 CDT 2009 (1256619208435)
Generate Summary Results +   931 in  31.3s =   29.7/s Avg:    26 Min:    13 Max:  3256 Err:     0 (0.00%)
Generate Summary Results +  2948 in 180.0s =   16.4/s Avg:    31 Min:     8 Max:  1162 Err:     0 (0.00%)
Generate Summary Results =  3879 in 211.4s =   18.3/s Avg:    29 Min:     8 Max:  3256 Err:     0 (0.00%)
Generate Summary Results +  5048 in 179.9s =   28.1/s Avg:    44 Min:     7 Max:   936 Err:     0 (0.00%)
Generate Summary Results =  8927 in 391.4s =   22.8/s Avg:    37 Min:     7 Max:  3256 Err:     0 (0.00%)
Generate Summary Results +  3114 in 180.1s =   17.3/s Avg:    41 Min:     7 Max:   805 Err:     0 (0.00%)
Generate Summary Results = 12041 in 571.3s =   21.1/s Avg:    38 Min:     7 Max:  3256 Err:     0 (0.00%)
Generate Summary Results +  4956 in 179.8s =   27.6/s Avg:    45 Min:     7 Max:  1844 Err:     0 (0.00%)
Generate Summary Results = 16997 in 751.4s =   22.6/s Avg:    40 Min:     7 Max:  3256 Err:     0 (0.00%)
Generate Summary Results +   313 in  17.1s =   18.3/s Avg:    37 Min:     7 Max:   165 Err:     0 (0.00%)
Generate Summary Results = 17310 in 768.5s =   22.5/s Avg:    40 Min:     7 Max:  3256 Err:     0 (0.00%)
Tidying up ...    @ Tue Oct 27 00:06:17 CDT 2009 (1256619977181)
... end of run

This output will be updated every 3 minutes, showing the most recent activity as well as a summary for the whole test.

Result Logs

In addition to this summary data, which is output on the command line, log files are created for both the successful (jmeter-results-setup.jtl) and unsuccessful (jmeter-assertions-setup.jtl) results. These log files are saved in JTL format (which is based on XML). There are a number of parsers which will generate graphs from these log files. For more information, see the JMeter wiki page on Log Analysis.
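Besides the graphing parsers, the JTL files can be post-processed with standard command-line tools. As a minimal sketch, the snippet below counts total and failed samples in an XML-format JTL file; the inline log is a fabricated two-sample example, so in practice you would point the grep commands at jmeter-results-setup.jtl instead.

```shell
# Fabricated two-sample JTL log for illustration; XML-format JTL records
# each request as an <httpSample> element with an s="true|false" attribute.
cat > sample.jtl <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<testResults version="1.2">
<httpSample t="26" s="true" lb="Create Issue" rc="200"/>
<httpSample t="310" s="false" lb="Create Issue" rc="500"/>
</testResults>
EOF

# Count total samples and failures by grepping the sample elements.
total=$(grep -c '<httpSample' sample.jtl)
failed=$(grep -c 's="false"' sample.jtl)
echo "samples: $total, failed: $failed"
```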

Fixed Load Test

Once the setup test has successfully run, the fixed load test can be run. This test will simulate a number of users accessing the JIRA installation.

This test has a number of parameters for tweaking its behavior. By default, the test has the following behavior and strategy:

  • Several groups of users, all running concurrently for a fixed amount of time, each with a small delay between requests.
    • 'Edit Issue' (editissue) users browse a project and then attempt to find an issue. They will then comment, edit or change the workflow of that issue.
    • 'User Action' (useraction) users create filters, view watches and votes.
    • 'Browse Issue' (browseissue) users browse projects and issues.
    • 'RSS' users browse a project and then periodically fetch the RSS feed for that project.
    • 'Create Issues' (createissue) users add new issues to the instance.
    • 'Search Issues' (search) users search for issues using the quick search textbox.

(info) There is no execution of JavaScript by the JMeter client. JavaScript performance will depend on several factors such as your browser and workstation speed. JMeter does not measure this.

Running the fixed load test:

<jmeter location>/bin/jmeter -n -t jmeter-test-fixedload.jmx

As with the setup test (above), this command will run the fixed load test with the default values. Similarly, it is possible to control the execution of JMeter with -J parameters. The fixed load test has the following available parameters:

Configuration Control

  • jira.host (default: localhost): The hostname or address of the JIRA installation.
  • jira.port (default: 8000): The network port that the JIRA installation is running on.
  • jira.context (default: /): JIRA webapp context.
  • admin.user (default: admin): Administrator username.
  • admin.pass (default: admin): Administrator password.
  • script.base (default: .): The location of the performance tests. This should only be set if you run the tests from outside the scripts directory.
  • script.runtime (default: 1800): The amount of time to run for (in seconds).
  • resource.dir (default: resources): The subdirectory which contains the resource CSV files. Replace this if you wish to customize the backend data.

Edit Issue

  • editissue.threads (default: 5): The number of simultaneous 'Edit Issue' users to simulate.
  • editissue.pause (default: 15000): The pause between each 'Edit Issue' user request (in milliseconds).
  • workflow.matchname (default: (Close | Resolve)): A regular expression that matches the name of the workflow action to perform.
  • editworkflow.percentage (default: 20): The percentage of 'Edit Issue' user requests that will attempt to change the issue workflow.
  • addcomment.percentage (default: 60): The percentage of 'Edit Issue' user requests that will attempt to add a comment to an issue.
  • editissue.percentage (default: 20): The percentage of 'Edit Issue' user requests that will attempt to edit an issue.
  • editissue.issuestoown (default: 5): The number of issues the test attempts to assign to an 'Edit Issue' user.
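The three action percentages above default to 20 + 60 + 20. If you change the mix, a quick arithmetic check helps avoid a skewed or incomplete distribution. The snippet below assumes (this is our reading, not something the scripts are documented to enforce) that the three values are intended to partition all 'Edit Issue' requests.

```shell
# Assumption: the editworkflow, addcomment and editissue percentages
# should sum to 100 so every 'Edit Issue' request maps to one action.
EDITWORKFLOW_PCT=20
ADDCOMMENT_PCT=60
EDITISSUE_PCT=20
total=$((EDITWORKFLOW_PCT + ADDCOMMENT_PCT + EDITISSUE_PCT))
echo "Edit Issue mix totals ${total}%"
```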

User Actions

  • useraction.threads (default: 1): The number of simultaneous 'User Action' users to simulate.
  • useraction.pause (default: 40000): The pause between each 'User Action' user request (in milliseconds).
  • createfilter.percentage (default: 10): The percentage of 'User Action' user requests that will attempt to create a filter.
  • viewwatches.percentage (default: 10): The percentage of 'User Action' user requests that will attempt to view watches.
  • viewvotes.percentage (default: 10): The percentage of 'User Action' user requests that will attempt to view votes.

Browse Issues and Projects

  • browseissue.threads (default: 40): The number of simultaneous 'Browse Issue' users to simulate.
  • browseissue.pause (default: 3000): The pause between each 'Browse Issue' user request (in milliseconds).
  • userprofile.percentage (default: 10): The percentage of 'Browse Issue' user requests that will attempt to browse a user profile.
  • browsedashboard.percentage (default: 20): The percentage of 'Browse Issue' user requests that will attempt to browse the dashboard.
  • dashboard.category (default: allprojects): The project category for project status gadget requests.

RSS

  • browserss.threads (default: 10): The number of simultaneous 'RSS' users to simulate.
  • browserss.pause (default: 60000): The pause between each 'RSS' user request (in milliseconds).

Create Issues

  • issue.create.threads (default: 3): The number of simultaneous 'Create Issue' users to simulate.
  • issue.create.pause (default: 15000): The pause between each 'Create Issue' user request (in milliseconds).
  • issue.comment.max (default: 2): The maximum number of comments on an issue. The actual number is chosen randomly between 0 and this value.

Search For Issues

  • search.threads (default: 2): The number of simultaneous 'Search' users to simulate.
  • search.pause (default: 15000): The pause between each 'Search' user request (in milliseconds).
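Before scaling these thread counts up, it can help to estimate the request rate each user group will generate. A rough upper bound is threads * 1000 / pause (requests per second); it ignores server response time, so real rates will be lower. The snippet below applies that estimate to the default fixed load values from the tables above.

```shell
# Rough per-group request-rate estimate: rate ~ threads * 1000 / pause_ms.
# This ignores response time, so it is an upper bound on generated load.
rate() { awk -v t="$1" -v p="$2" 'BEGIN { printf "%.2f", t * 1000 / p }'; }

echo "browseissue: $(rate 40 3000)/s"   # 40 threads, 3000 ms pause
echo "browserss:   $(rate 10 60000)/s"  # 10 threads, 60000 ms pause
echo "editissue:   $(rate 5 15000)/s"   # 5 threads, 15000 ms pause
echo "createissue: $(rate 3 15000)/s"   # 3 threads, 15000 ms pause
echo "search:      $(rate 2 15000)/s"   # 2 threads, 15000 ms pause
echo "useraction:  $(rate 1 40000)/s"   # 1 thread, 40000 ms pause
```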

Test Output

Once you have chosen your target settings, run JMeter and you should get output similar to the following:

jmeter -n -t jmeter-test-fixedload.jmx
Created the tree successfully using jmeter-test-fixedload.jmx
Starting the test @ Wed Oct 28 01:13:22 CDT 2009 (1256710402435)
Waiting for possible shutdown message on port 4445
Generate Summary Results +   568 in  97.9s =     5.8/s Avg:    62 Min:     1 Max:  1534 Err:     0 (0.00%)
Generate Summary Results +  3861 in 179.4s =    21.5/s Avg:    39 Min:     0 Max:   494 Err:     0 (0.00%)
Generate Summary Results =  4429 in 277.4s =    16.0/s Avg:    42 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results +  7356 in 180.0s =    40.9/s Avg:    37 Min:     0 Max:   481 Err:     0 (0.00%)
Generate Summary Results = 11785 in 457.3s =    25.8/s Avg:    39 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results + 10841 in 180.1s =    60.2/s Avg:    38 Min:     0 Max:   995 Err:     0 (0.00%)
Generate Summary Results = 22626 in 637.4s =    35.5/s Avg:    39 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results + 11821 in 180.3s =    65.6/s Avg:    37 Min:     0 Max:   507 Err:     0 (0.00%)
Generate Summary Results = 34447 in 817.3s =    42.1/s Avg:    38 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results + 11904 in 180.9s =    65.8/s Avg:    38 Min:     0 Max:   658 Err:     0 (0.00%)
Generate Summary Results = 46351 in 997.4s =    46.5/s Avg:    38 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results + 11697 in 180.3s =    64.9/s Avg:    38 Min:     0 Max:   488 Err:     0 (0.00%)
Generate Summary Results = 58048 in 1177.4s=    49.3/s Avg:    38 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results + 11731 in 180.0s =    65.2/s Avg:    39 Min:     0 Max:   810 Err:     0 (0.00%)
Generate Summary Results = 69779 in 1357.4s=    51.4/s Avg:    38 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results + 11646 in 180.0s =    64.7/s Avg:    39 Min:     0 Max:   776 Err:     0 (0.00%)
Generate Summary Results = 81425 in 1537.4s=    53.0/s Avg:    38 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results + 11810 in 180.0s =    65.6/s Avg:    39 Min:     0 Max:   798 Err:     0 (0.00%)
Generate Summary Results = 93235 in 1717.3s=    54.3/s Avg:    38 Min:     0 Max:  1534 Err:     0 (0.00%)
Generate Summary Results +  5453 in 109.1s =    50.0/s Avg:    42 Min:     0 Max:   858 Err:     0 (0.00%)
Generate Summary Results = 98688 in 1826.4s=    54.0/s Avg:    39 Min:     0 Max:  1534 Err:     0 (0.00%)
Tidying up ...    @ Wed Oct 28 01:43:49 CDT 2009 (1256712229128)
... end of run

This output will be updated every 3 minutes, showing the most recent activity as well as a summary for the whole test.

Result Logs

As above, summary data is output on the command line and log files are created for both the successful (jmeter-results-setup.jtl) and unsuccessful (jmeter-assertions-setup.jtl) results. These log files are saved in the JTL format (based on XML). There are a number of parsers which will generate graphs from these log files. For more information, see the JMeter wiki page on Log Analysis.
