Prevent Bamboo from being indexed by search engines
Platform notice: Server and Data Center only. This article only applies to Atlassian products on the Server and Data Center platforms.
Support for Server* products ended on February 15th 2024. If you are running a Server product, you can visit the Atlassian Server end of support announcement to review your migration options.
*Except Fisheye and Crucible
Problem
When Bamboo is exposed to the internet, you may want to prevent search engines like Google from indexing the server, so that its pages are not exposed to the public through search results. You can do this by adding a robots.txt file to Bamboo.
Solution
- Stop Bamboo.
- Create a file named robots.txt in the <bamboo-install>/atlassian-bamboo directory with the following content. You can customize these rules to suit your needs, as explained in this article.
  User-agent: *
  Disallow: /
- Start Bamboo.
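The steps above can be sketched from the command line; BAMBOO_INSTALL below defaults to a demo path for illustration, so point it at your real installation directory before running it on a server.

```shell
# Create robots.txt in Bamboo's web application directory.
# BAMBOO_INSTALL defaults to a throwaway demo path here - set it to your
# actual <bamboo-install> directory in a real environment.
BAMBOO_INSTALL="${BAMBOO_INSTALL:-/tmp/bamboo-demo}"
mkdir -p "$BAMBOO_INSTALL/atlassian-bamboo"
cat > "$BAMBOO_INSTALL/atlassian-bamboo/robots.txt" <<'EOF'
User-agent: *
Disallow: /
EOF
```

The `Disallow: /` rule asks compliant crawlers to skip every path on the site; narrower rules (for example `Disallow: /browse/`) block only specific path prefixes.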
Notes
Bamboo >= 9.2.5, 9.3.4, 9.4.0
- Add the robots.txt file to the location described in the Solution section
- Add the following property to Bamboo:
  -Dbamboo.enable.robots.txt=true
- Restart Bamboo
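One common place to set such a property is Bamboo's setenv.sh. A minimal sketch, assuming the standard Tomcat-based layout and the JVM_SUPPORT_RECOMMENDED_ARGS variable used by Atlassian's startup scripts (verify both against your own <bamboo-install>/bin/setenv.sh):

```shell
# In <bamboo-install>/bin/setenv.sh - append the property to the JVM args.
# The variable name is taken from Atlassian's Tomcat wrapper scripts;
# confirm it matches your Bamboo version before relying on it.
JVM_SUPPORT_RECOMMENDED_ARGS="${JVM_SUPPORT_RECOMMENDED_ARGS} -Dbamboo.enable.robots.txt=true"
```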
Bamboo >= 8.1.12, 8.2.8, 9.0.3, 9.1.2, 9.2.3, 9.3.0
Bamboo's default Secure Servlet prevents unknown files, including robots.txt, from being served. This is documented in BAM-22474.
To allow external files, disable Bamboo's default Secure Servlet so it can serve any file inside the context path.
Comment out the following block in <bamboo-install>/atlassian-bamboo/WEB-INF/web.xml:
<!-- servlet-mapping>
    <servlet-name>bamboo-default-servlet</servlet-name>
    <url-pattern>/</url-pattern>
</servlet-mapping -->
- Add the robots.txt file to the location described in the Solution section
- Restart Bamboo