Publish and link your build artifacts

While Bitbucket Pipelines doesn't offer built-in artifact storage, it is straightforward to publish artifacts to Amazon's S3 storage service and link them to your commits using the Bitbucket build status API.

Once published and linked via the build status API, your artifact links will appear on your Bitbucket commit.

Step 1: Create an App password for the repository owner

Log in to Bitbucket as the repository owner (also the user who will upload the files) and go to Bitbucket Settings > App Passwords.

Create a new app password with write permissions to your repositories, and take note of the generated password that pops up. The name of the password is only for your reference, so use "Pipelines" or any other name you like.

You should now have two values that you will need for the next step.

Parameter    Value
<username>   Bitbucket username of the repository owner (also the user who will upload the artifacts)
<password>   The app password generated by Bitbucket

Step 2: Create a Pipelines environment variable with the authentication token

Define a new secure environment variable in your Pipelines settings:

  • Parameter name: BB_AUTH_STRING
  • Parameter value: <username>:<password> (using the values from step 1)

You can define this environment variable at either the repository or account level.

Step 3: Publish your artifacts to AWS

If you are new to AWS or S3, follow the instructions on our example S3 integration to create an S3 bucket and configure the relevant authentication environment variables in Bitbucket Pipelines.

python s3_upload.py <bucket-id> <artifact-file> <artifact-key>

Otherwise, you can use your existing AWS tooling to upload the artifact to an appropriate location.
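If you prefer to script the upload yourself, a minimal boto3 sketch might look like the following. This is an illustration, not part of the example integration: the function names are our own, and the bucket, file, and key values are placeholders you would replace with your own.

```python
# Minimal sketch of an S3 artifact upload, assuming boto3 is installed
# (e.g. via `pip install boto3`) and AWS credentials are available in
# the environment. All names here are illustrative placeholders.

def artifact_key(key_prefix, commit):
    """Build the S3 key for an artifact, namespaced by commit hash."""
    return "{0}_{1}".format(key_prefix, commit)

def upload_artifact(bucket, local_path, key):
    """Upload a local file to the given S3 bucket under `key`."""
    import boto3  # imported lazily so the key helper works without boto3
    boto3.client("s3").upload_file(local_path, bucket, key)

# Example (placeholder values):
# upload_artifact("my-artifact-bucket", "documentation.html",
#                 artifact_key("docs", "a1b2c3d"))
```

Keying the artifact by commit hash, as above, keeps one artifact per build and makes the S3 URL predictable from the Bitbucket commit.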

Step 4: Link your artifacts to your build using the Bitbucket REST API

With the environment variable and app password in place and your artifact published to S3, you can now use curl in your build script to link your artifact's S3 URL to your Bitbucket commit via the build status REST API:

export S3_URL="https://${S3_BUCKET}/${S3_KEY_PREFIX}_${BITBUCKET_COMMIT}"
export BUILD_STATUS="{\"key\": \"doc\", \"state\": \"SUCCESSFUL\", \"name\": \"Documentation\", \"url\": \"${S3_URL}\"}"
curl -H "Content-Type: application/json" -X POST --user "${BB_AUTH_STRING}" -d "${BUILD_STATUS}" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/commit/${BITBUCKET_COMMIT}/statuses/build"
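The same POST can be sketched in Python using only the standard library. The payload keys (key, state, name, url) come straight from the build status API; the helper function and the argument values in the usage comment are hypothetical placeholders.

```python
import base64
import json
import urllib.request

def build_status_request(auth_string, owner, slug, commit, artifact_url):
    """Construct (but do not send) the build status POST request.

    `auth_string` is the same <username>:<password> pair stored in
    BB_AUTH_STRING; it is sent as HTTP Basic authentication.
    """
    payload = {
        "key": "doc",
        "state": "SUCCESSFUL",
        "name": "Documentation",
        "url": artifact_url,
    }
    endpoint = (
        "https://api.bitbucket.org/2.0/repositories/"
        "{0}/{1}/commit/{2}/statuses/build".format(owner, slug, commit)
    )
    token = base64.b64encode(auth_string.encode()).decode()
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + token,
        },
        method="POST",
    )

# Sending it (placeholder values) would be:
# urllib.request.urlopen(build_status_request(
#     "user:app-password", "myteam", "myrepo", "a1b2c3d",
#     "https://my-bucket/docs_a1b2c3d"))
```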


Example bitbucket-pipelines.yml

Below is an example combining all the pieces in a sample Python project. You should adjust all the parameters in the examples to match your repository, and make sure you have all the necessary environment variables (including AWS authentication tokens) defined.

image: python:3.5.1

pipelines:
  default:
    - step:
        script:
          - pip install boto3==1.3.0   # required for s3_upload.py
          - python s3_upload.py "${S3_BUCKET}" documentation.html "${S3_KEY_PREFIX}_${BITBUCKET_COMMIT}"   # upload docs to S3
          - export S3_URL="https://${S3_BUCKET}/${S3_KEY_PREFIX}_${BITBUCKET_COMMIT}"
          - export BUILD_STATUS="{\"key\":\"doc\", \"state\":\"SUCCESSFUL\", \"name\":\"Documentation\", \"url\":\"${S3_URL}\"}"
          - curl -H "Content-Type:application/json" -X POST --user "${BB_AUTH_STRING}" -d "${BUILD_STATUS}" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/commit/${BITBUCKET_COMMIT}/statuses/build"

You can check your bitbucket-pipelines.yml file with our online validator.

Last modified on Nov 6, 2017
