Prevent Bamboo from being indexed by search engines


Platform notice: Server and Data Center only. This article only applies to Atlassian products on the Server and Data Center platforms.

Support for Server* products ended on February 15th 2024. If you are running a Server product, you can visit the Atlassian Server end of support announcement to review your migration options.

*Except Fisheye and Crucible

Problem

When Bamboo is exposed to the internet, you may want to prevent search engines such as Google from indexing the server, so that Bamboo pages are not exposed to the public through search results. This can be done by adding a robots.txt file to Bamboo.

Solution

  • Stop Bamboo.

  • Create a file named robots.txt in the <bamboo-install>/atlassian-bamboo directory and add the content below. You can customize these entries to suit your needs, as explained in this article.

    User-agent: *
    Disallow: /
  • Start Bamboo.
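Before deploying, you can sanity-check the rules above with Python's standard-library robots.txt parser. This is an optional sketch; the hostname below is a placeholder for your Bamboo base URL.

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the Solution section above.
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /", no well-behaved crawler may fetch any page.
print(parser.can_fetch("Googlebot", "https://bamboo.example.com/browse/PROJ"))  # False
print(parser.can_fetch("*", "https://bamboo.example.com/"))  # False
```

Note that robots.txt is advisory: compliant crawlers honor it, but it is not an access-control mechanism.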

Notes

Applicable versions: Bamboo >= 9.2.5, 9.3.4, 9.4.0 and Bamboo >= 8.1.12, 8.2.8, 9.0.3, 9.1.2, 9.2.3, 9.3.0

Bamboo's default Secure Servlet prevents unknown files, including robots.txt, from being served. This behavior is documented in BAM-22474.

To allow external files, disable Bamboo's Default Secure Servlet so it can serve any files inside the context path.

  • Comment out the following block in <bamboo-install>/atlassian-bamboo/WEB-INF/web.xml:

      <!-- servlet-mapping>
        <servlet-name>bamboo-default-servlet</servlet-name>
        <url-pattern>/</url-pattern>
      </servlet-mapping -->
  • Add the robots.txt file to the location described in the Solution section.
  • Restart Bamboo.
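As a quick sanity check, you can confirm that the servlet mapping was actually commented out before restarting. The helper below is a sketch (the function name and the idea of checking via a script are our own, not part of Bamboo); it strips XML comments and then looks for a live bamboo-default-servlet mapping.

```python
import re

def default_servlet_mapping_active(web_xml: str) -> bool:
    """Return True if an *uncommented* bamboo-default-servlet mapping exists."""
    # Remove XML comments first, then search for the live mapping element.
    without_comments = re.sub(r"<!--.*?-->", "", web_xml, flags=re.DOTALL)
    return bool(re.search(
        r"<servlet-mapping>\s*<servlet-name>bamboo-default-servlet</servlet-name>",
        without_comments,
    ))

# The commented-out block from the step above:
commented = """\
<!-- servlet-mapping>
  <servlet-name>bamboo-default-servlet</servlet-name>
  <url-pattern>/</url-pattern>
</servlet-mapping -->
"""
print(default_servlet_mapping_active(commented))  # False: mapping is disabled
```

To run it against your installation, read <bamboo-install>/atlassian-bamboo/WEB-INF/web.xml into a string and pass it to the function; a result of False means the mapping is commented out and Bamboo can serve robots.txt after a restart.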


Description: Stop Bamboo from being indexed by Google
Product: Bamboo

Last modified on Sep 4, 2023
