We no longer support these performance testing scripts.

They are no longer used internally for testing and may not work with Confluence 5.2.

Load Testing Confluence

This page contains scripts and hints on load-testing your Confluence installations.

Introduction

Before making a new Confluence instance available to your users, it is useful to get a feel for how it will perform under your anticipated load and to identify where you may need to improve your configuration to remove bottlenecks. Likewise, before making changes to an existing Confluence instance, it is useful to assess the impact of those changes before they go live in a production context.

This kind of testing is not an exact science but the tools and process described here are intended to be a straightforward, configurable and extensible way of allowing you to begin this kind of load testing.

These scripts will rarely perform representative testing for you 'out of the box', but through configuration, or by extending the scripts, it should be possible to build an appropriate load test.

Load testing scripts are not designed for a production environment

The load testing scripts will update the data within the targeted Confluence instance and are not designed to be run against a production server. If you want to load test your production environment you will need to perform these tests on a backup of your data and restore your real data after the tests.

Note: The information on this page does not apply to Confluence OnDemand.

Setup

You will need the following:

  • A Confluence server, set up and running with an admin user. The scripts assume a default username and password for this user: 'admin'/'admin'.
  • Ensure the Confluence Remote API is enabled in the administration options. See Enabling the Remote API for details on how to configure this.
  • Apache JMeter
  • The load testing scripts and resources, which are available in our public Maven repository. Please choose the version that most closely matches your Confluence version and download the ZIP or GZip file in that directory. If in doubt, download the ZIP archive.
    • For example:

      Confluence Version | Performance Test Script
      -------------------|------------------------
      4.0.3 - 4.1.x      | 4.0.3
      4.2 - 5.0.x        | 4.2.3
      5.1 - 5.4.x        | 4.3

Users have reported problems when using the Windows built-in unzip utility. Please use a third party file archiving and extraction program (for example, 7-Zip) to extract these performance tests.
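
For example, with the 7-Zip command line tool installed, you can extract the archive (the archive name below is a placeholder for whatever you downloaded) with:

    7z x <downloaded archive>.zip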

The test scripts were updated in version 3.4 to work with Confluence 3.4. Using an older version of the tests will result in errors when running the test.

Quick, Just Tell Me How To Run It.

If you don't want to read the rest of this document, here are the main points:

  1. Download and Unzip the performance tests
  2. Open a command prompt and change directory to the performanceTest directory that has just been unzipped.
  3. Create the test data:

    <jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx -Jspace.zip=<path to a demo space ZIP file> -Jadmin.user=<username> -Jadmin.pass=<password>
    
  4. Run the test:

    <jmeter location>/bin/jmeter -n -t jmeter-test-fixedload.jmx
    

The remainder of this document is an elaboration of the last two steps.
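
For example, on a Unix-like system with JMeter unpacked under /opt/apache-jmeter (an illustrative path; substitute your own) and the demo space export that ships with Confluence used as test data, the whole sequence looks like this:

    cd ~/Downloads/performanceTest
    /opt/apache-jmeter/bin/jmeter -n -t jmeter-test-setup.jmx -Jspace.zip=<confluence install>/confluence/WEB-INF/classes/com/atlassian/confluence/setup/demo-site.zip -Jadmin.user=admin -Jadmin.pass=admin
    /opt/apache-jmeter/bin/jmeter -n -t jmeter-test-fixedload.jmx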

For information on how to use JMeter, please refer to the JMeter manual.

Creating the Test Data

A known data set is required to run the testing against. By default this is the Confluence demo space (space key = DS) although this can be changed (more on this later). If you decide to use the Confluence demo space, ensure that the group "confluence-users" is able to update content in this space.

The script jmeter-test-setup.jmx is used to:

  • create a set of users to be used in the test
  • import the Confluence demo space for running tests against.

You should first ensure that you don't already have the demo space (key = DS) on your test instance. Delete it if you do.

Run the script from the performanceTest directory as follows:

<jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx -Jspace.zip=<path to a space export.zip> -Jadmin.user=<username> -Jadmin.pass=<password>

Where:

  • <path to a space export.zip> is the absolute path to the space export zip you want to be used in your testing. For example, the path to demo-site.zip as found in your Confluence distribution or source: <confluence install>/confluence/WEB-INF/classes/com/atlassian/confluence/setup/demo-site.zip
  • <username> and <password> are the username and password for an admin user that is able to create Confluence users and to import spaces.

By default the setup process will create 250 users — 50 each of the following formats: tstreader<n>, tstcommentor<n>, tsteditor<n>, tstcreator<n> and tstsearcher<n>. The password for each matches the username.

A typical run of the setup script will only take a few seconds.

Removing the Test Data

You can reverse the effects of the setup script by setting the remove.data parameter to true, e.g.

<jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx -Jremove.data=true -Jadmin.user=<username> -Jadmin.pass=<password>

Setup Script Parameters

You can modify the behaviour of the setup script via JMeter parameters. These are supplied on the command line in the form -J<parameter name>=<parameter value>.

Parameter          | Default    | Explanation
-------------------|------------|------------
script.base        | .          | The absolute path to the script. Defaults to the current working directory.
space.zip          | N/A        | The absolute path to the space export zip file to be imported as test data.
remove.data        | false      | Run the script in reverse: remove all test data.
admin.user         | admin      | The admin user name used to import data and create users.
admin.pass         | admin      | The password for the admin user.
confluence.context | confluence | The Confluence webapp context.
confluence.host    | localhost  | The address or host name of the test instance.
confluence.port    | 8080       | The port of the test instance.
space.key          | ds         | The space key for the space import that will be tested against.
space.setup        | true       | Controls whether the test space will be created (or removed).
commentor.max      | 250        | The number of users to be created for making comments.
creator.max        | 250        | The number of users to be created for adding pages.
editor.max         | 250        | The number of users to be created for editing existing pages.
reader.max         | 250        | The number of users to be created for viewing existing pages.
searcher.max       | 250        | The number of users to be created for performing searches.
resource.max       | 250        | The number of users to be created for downloading site resources.
attachments.max    | 250        | The number of users to be created for downloading attachments.
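
For example, to create only 50 users of each type against a Confluence instance on another host (the host name below is illustrative):

    <jmeter location>/bin/jmeter -n -t jmeter-test-setup.jmx -Jspace.zip=<path to a space export.zip> -Jadmin.user=<username> -Jadmin.pass=<password> -Jconfluence.host=confluence-test.example.com -Jreader.max=50 -Jcommentor.max=50 -Jeditor.max=50 -Jcreator.max=50 -Jsearcher.max=50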

Setup Script Output

On the console you will see no obvious indication of success or failure. JMeter will output something similar to this:

Created the tree successfully
Starting the test @ Mon Apr 14 17:35:08 EST 2008 (1208158508222)
Tidying up ... @ Mon Apr 14 17:35:08 EST 2008 (1208158508928)
... end of run

The results directory under the scripts location will contain the file jmeter-result-setuptest.jtl. The run contained failures or errors if any assertion results in this file have the value true for their failure or error element, e.g.

<assertionResult>
<name>Manage Users</name>
<failure>true</failure>
<error>false</error>
<failureMessage>Test failed: URL expected to contain /browseusers.action/</failureMessage>
</assertionResult>
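
A quick way to scan the file for failed assertions, assuming a Unix-like shell and that the .jtl keeps one element per line as in the sample above:

    grep -c '<failure>true</failure>' results/jmeter-result-setuptest.jtl
    grep -c '<error>true</error>' results/jmeter-result-setuptest.jtl

Both commands should print 0 for a clean run.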

Running the Test

The test script itself will put Confluence under a fixed load. Each thread group will attempt to do a certain amount of work for a prescribed period of time (30 minutes by default). This is by design, so that the load applied in different test runs can be compared accurately.

Execute the test as follows:

<jmeter location>/bin/jmeter -n -t jmeter-test-fixedload.jmx

Where:
<scripts location> is the absolute path to where you extracted the scripts, e.g. /Users/YourName/Download/performanceTest. The script needs this location to find its external resources, so either run JMeter from that directory, as above, or pass it explicitly with -Jscript.base=<scripts location>.
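
For example, to run the fixed-load test from outside the performanceTest directory (all paths below are illustrative):

    /opt/apache-jmeter/bin/jmeter -n -t /Users/YourName/Download/performanceTest/jmeter-test-fixedload.jmx -Jscript.base=/Users/YourName/Download/performanceTest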

Test Behaviour

The test has a number of parameters for tweaking its behaviour, but generally speaking its format is roughly as follows:

  • 5 groups of users - readers, commentors, searchers, editors and creators.
    • readers simply view a set of individual pages or browse space functionality.
    • commentors add comments to a set of pages.
    • searchers perform searches on a fixed set of keywords.
    • editors make small additions to the end of a set of pages.
    • creators add new pages to a particular space.
  • Each individual user in each group will repeat for a fixed amount of time with a small pause between each request.

Note that there is no execution of JavaScript by the client. Keep this in mind if you use this test to gauge Confluence performance in a production environment.

There is also very little use of permissions in these tests. All data involved is accessible to all of the test users.

Test Script Parameters

You can modify the behaviour of the test script via JMeter parameters. These are supplied on the command line in the form -J<parameter name>=<parameter value>.

Parameter          | Default    | Explanation
-------------------|------------|------------
script.base        | .          | The absolute path to the script. Defaults to the current working directory.
confluence.context | confluence | The Confluence webapp context.
confluence.host    | localhost  | The address or host name of the test instance.
confluence.port    | 8080       | The port of the test instance.
create.page.prefix | Nihilist   | The title prefix for any created page, e.g. Nihilist00001.
script.runtime     | 1800       | The amount of time the script will run for, in seconds.
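
For example, a 10-minute (600 second) run against a remote test instance with a custom page title prefix (the host name is illustrative):

    <jmeter location>/bin/jmeter -n -t jmeter-test-fixedload.jmx -Jconfluence.host=confluence-test.example.com -Jconfluence.port=8090 -Jscript.runtime=600 -Jcreate.page.prefix=LoadTest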

Test Thread Parameters

Parameter         | Default | Explanation
------------------|---------|------------
threads.reader    | 15      | Number of readers.
pause.reader      | 2000    | The approximate (within 500ms) millisecond pause between reader repeats.
threads.searcher  | 8       | Number of searchers.
pause.searcher    | 2000    | The approximate (within 500ms) millisecond pause between searcher repeats.
threads.creator   | 3       | Number of page creators.
pause.creator     | 2000    | The approximate (within 500ms) millisecond pause between creator repeats.
threads.editor    | 3       | Number of page editors.
pause.editor      | 2000    | The approximate (within 500ms) millisecond pause between editor repeats.
threads.commentor | 4       | Number of page commentors.
pause.commentor   | 2000    | The approximate (within 500ms) millisecond pause between commentor repeats.
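
For example, to emulate a heavier, read-biased load you could double the readers and halve their pause (the values here are illustrative):

    <jmeter location>/bin/jmeter -n -t jmeter-test-fixedload.jmx -Jthreads.reader=30 -Jpause.reader=1000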

In version 3.0 of the tests, it became possible to control the percentage of executions for certain actions. These percentages are defined in the "Thread Details" configuration screen.

So with the default parameters, you are emulating a load on Confluence of 33 concurrent users (15 readers + 8 searchers + 3 creators + 3 editors + 4 commentors), each hitting the server approximately every 2 seconds, i.e. roughly 16 requests per second in total.

23 of these users are read-only (readers or searchers) and 10 are read/write, which works out to roughly 11 read-only and 5 read/write requests per second.

Test Script Output

During the run of the test script, JMeter will output progress to the console in the following form (lines marked + summarise the interval since the previous summary line; lines marked = give cumulative totals for the run):

Created the tree successfully
Starting the test @ Fri Apr 18 00:07:39 EST 2008 (1208441259523)
Display Summary Results During Run + 462 in 77.6s = 5.9/s Avg: 1564 Min: 18 Max: 33738 Err: 1 (0.22%)
Display Summary Results During Run + 1338 in 189.9s = 7.0/s Avg: 3596 Min: 24 Max: 34545 Err: 0 (0.00%)
Display Summary Results During Run = 1800 in 257.6s = 7.0/s Avg: 3074 Min: 18 Max: 34545 Err: 1 (0.06%)
Display Summary Results During Run + 1046 in 200.9s = 5.2/s Avg: 4529 Min: 40 Max: 50461 Err: 0 (0.00%)
Display Summary Results During Run = 2846 in 438.2s = 6.5/s Avg: 3609 Min: 18 Max: 50461 Err: 1 (0.04%)
Display Summary Results During Run + 677 in 201.2s = 3.4/s Avg: 6638 Min: 46 Max: 27636 Err: 0 (0.00%)
Display Summary Results During Run = 3523 in 618.1s = 5.7/s Avg: 4191 Min: 18 Max: 50461 Err: 1 (0.03%)
Display Summary Results During Run + 561 in 197.5s = 2.8/s Avg: 8326 Min: 171 Max: 39494 Err: 0 (0.00%)
Display Summary Results During Run = 4084 in 798.3s = 5.1/s Avg: 4759 Min: 18 Max: 50461 Err: 1 (0.02%)
Display Summary Results During Run + 555 in 199.2s = 2.8/s Avg: 8247 Min: 160 Max: 45270 Err: 0 (0.00%)
Display Summary Results During Run = 4639 in 978.0s = 4.7/s Avg: 5177 Min: 18 Max: 504
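
If you want to keep this console summary so that runs can be compared later, one option on a Unix-like shell is to capture it with tee (the log file name is illustrative):

    <jmeter location>/bin/jmeter -n -t jmeter-test-fixedload.jmx | tee results/fixedload-run.log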