Crowd Data Center Performance

This document describes the performance tests conducted on Crowd Data Center within Atlassian and the results we found.

We aimed to test concurrent authentication requests, expecting that as we increased the number of users, overall authentication throughput would improve compared to the same number of users on a Server instance.

You can compare these data points to your own implementation to predict the type of results you might expect from implementing Crowd Data Center in your own organization.

We started our load tests by determining the baseline for Crowd Server. We did this by putting Crowd Server under increasing levels of authentication request load until we saw a sizeable increase in the error rate. We then installed Crowd Data Center onto the same hardware and ran the same set of tests.

Further below you will find more detailed information on hardware specs, load statistics, response times, and our testing methodology.

Testing results summary

  • Under a high authentication request load, Crowd Data Center shows improved overall performance.
  • Throughput and capacity show a near linear increase as nodes are added to the cluster.
  • Specific actions show varying improvement in response times. For details, see Testing results for specific actions.
  • You might observe a different trend/behavior based on your configuration and usage. For details, see What we tested.


Testing methodology and specifications

The following sections detail the testing environment and methodology we used in our performance tests.

How we tested

Our load tests were all run on the same set of isolated Amazon EC2 instances. For each test, the entire environment was reset and rebuilt. The testing environment included installing the following components:

  • ELB load balancer
  • PostgreSQL database with the required data
  • Crowd Data Center on one, two, or four machines as required for a specific test
  • Open-source Load & Performance Testing Tool for simulating user authentication against Crowd, deployed on one machine and used to generate the load for specific tests

To run the tests, we used the open-source Load & Performance Testing Tool to generate user authentication request load and to measure the number of successful requests handled by Crowd. Each test was run for 3 minutes, after which statistics were collected.
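As a rough illustration of this methodology, a load generator of this kind can be sketched in Python. Everything below is hypothetical: `authenticate` is a stub standing in for a real HTTP call to Crowd's authentication API, and the function and parameter names are our own, not those of the tool actually used in these tests.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def authenticate(user_id):
    """Stub for a single authentication request.

    In a real load test this would POST the user's credentials to
    Crowd's REST authentication resource; here it simply simulates
    a successful call.
    """
    return True  # pretend the request succeeded

def run_load_test(num_users, duration_seconds, workers=8):
    """Fire authentication requests until the deadline, then report throughput."""
    successes = 0
    deadline = time.monotonic() + duration_seconds
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while time.monotonic() < deadline:
            # One batch of concurrent authentication attempts
            futures = [pool.submit(authenticate, i) for i in range(num_users)]
            successes += sum(1 for f in futures if f.result())
    return successes / duration_seconds  # successful requests per second

# Example: a short run against the stub
print(f"{run_load_test(num_users=100, duration_seconds=1):.0f} requests/sec")
```

In the real tests the measurement window was 3 minutes rather than 1 second, and the requests went over the network through the load balancer to the Crowd nodes.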

What we tested

  • All tests used the same PostgreSQL database with 10,000 users.
  • All users were stored in Crowd's internal directory.
  • Authentication requests were selected for this test as they represent the most common user actions in Crowd.


All performance tests were run on the same set of isolated Amazon EC2 instances with the following hardware.


  • Processor: 2 x 2.3 GHz Intel Xeon® E5-2686 v4 (Broadwell), or 2.4 GHz Intel Xeon® E5-2676 v3 (Haswell)
  • Memory: 8 GB
  • No. of EC2 instances: up to 4, depending on the test

Hardware testing notes:

  • The tested Crowd instances used default settings and JVM options.
  • During the tests, we didn't observe high CPU or IO load on either the database or load balancer servers.
  • The testing tool and the servers were hosted in different locations to resemble a real-world setup.

Testing results for user authentication

The following table shows the average throughput of authentication requests for each Crowd configuration:

  • Crowd Server (one node)
  • Crowd Data Center (two nodes)
  • Crowd Data Center (three nodes)
  • Crowd Data Center (four nodes)

Average for all actions

Test                                     Crowd Server   2 nodes   3 nodes   4 nodes
Number of requests handled per second    38             86        115       140
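The near-linear scaling claim can be checked directly from these figures. The throughput numbers below are taken from the table; the speedup and per-node efficiency metrics are our own illustration, not part of the original report:

```python
# Throughput figures from the table above (requests handled per second)
throughput = {1: 38, 2: 86, 3: 115, 4: 140}

baseline = throughput[1]  # single-node Crowd Server
for nodes, rps in throughput.items():
    speedup = rps / baseline
    efficiency = speedup / nodes  # 1.0 would be perfectly linear scaling
    print(f"{nodes} node(s): {rps} req/s, "
          f"{speedup:.2f}x speedup, {efficiency:.0%} per-node efficiency")
```

For these figures this gives roughly 2.26x at two nodes, 3.03x at three, and 3.68x at four (about 113%, 101%, and 92% per-node efficiency), consistent with the near-linear trend noted in the summary.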


Last modified on Sep 12, 2017
