Pipes

Pipes are designed to simplify configuring your pipeline. They are especially useful for actions that would otherwise take several lines of script, particularly when you want to work with third-party tools. You just paste the pipe, supply a few key pieces of information, and the rest is done for you. To start, we've come up with a few commonly used actions for CI/CD, and in the future you'll be able to add your own.

Rather than lines and lines of script, your step simply contains a reference to the pipe and some parameters for you to fill in:

script:
  - pipe: atlassian/aws-s3-deploy:0.2.1
    variables:
      AWS_ACCESS_KEY_ID: "$AWS_ACCESS_KEY_ID" # using one of my repository variables
      AWS_SECRET_ACCESS_KEY: "$AWS_SECRET_ACCESS_KEY"
      AWS_DEFAULT_REGION: "us-east-1" 
      S3_BUCKET: "my-bucket-name"
      LOCAL_PATH: "build"

For some deployment pipes, like AWS Elastic Beanstalk Deploy and NPM Publish, we also provide a convenient link in the logs to view the deployed application.

[Animation: clicking the deployed application link in the logs]

How to use pipes

There are two ways to add pipes to your pipeline.

Use the online editor

Tip: We are gradually rolling out this feature, so if you don't see pipes in your editor yet, you can edit the configuration directly, or join our alpha group, which has full access.


  1. Open up your bitbucket-pipelines.yml file in the editor.
  2. If pipes don't automatically show on the right, click the pipes icon to open the panel.
  3. Select the pipe you need.
  4. Copy the pipe, and paste it into the script section of your step.
  5. Add your specific values in double quotes (and uncomment any optional parameters you want to use).
  6. Run your build.

You can fill in the parameter values in-line, or use predefined variables.
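For example, with the AWS S3 Deploy pipe you can mix both approaches. This is just a sketch, and $S3_BUCKET is assumed to be a repository variable you have already defined:

script:
  - pipe: atlassian/aws-s3-deploy:0.2.1
    variables:
      AWS_ACCESS_KEY_ID: "$AWS_ACCESS_KEY_ID" # predefined/repository variable
      AWS_SECRET_ACCESS_KEY: "$AWS_SECRET_ACCESS_KEY" # repository variable
      AWS_DEFAULT_REGION: "us-east-1" # value filled in in-line
      S3_BUCKET: "$S3_BUCKET" # assumed repository variable
      LOCAL_PATH: "build" # value filled in in-line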

Edit the configuration directly

You can add the details of the task to your bitbucket-pipelines.yml file using an editor of your choice.

The README.md file in the available pipes listed below contains instructions on how to use the pipe and the lines you can copy and paste into your bitbucket-pipelines.yml file. While you are there you can have a peek at the scripts to see all the good stuff the pipe is doing behind the scenes.
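To give you an idea of where the copied lines go, here's a minimal sketch of a complete bitbucket-pipelines.yml with a pipe pasted into a step's script section (the image and step name are just illustrative):

image: atlassian/default-image:2

pipelines:
  default:
    - step:
        name: Deploy to S3 # illustrative step name
        script:
          - pipe: atlassian/aws-s3-deploy:0.2.1
            variables:
              AWS_ACCESS_KEY_ID: "$AWS_ACCESS_KEY_ID"
              AWS_SECRET_ACCESS_KEY: "$AWS_SECRET_ACCESS_KEY"
              AWS_DEFAULT_REGION: "us-east-1"
              S3_BUCKET: "my-bucket-name"
              LOCAL_PATH: "build"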

How it works

We've put all the commands you previously had to put in your yml, with a few useful extras, into a script inside a repository. A pipe uses this script to perform powerful actions, and you only need to provide a few values. The pipes we've provided are public, so you can check the source code to see how it all works.
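As a rough illustration only, the S3 pipe shown above stands in for a hand-written command along these lines (assuming your build image has the AWS CLI installed and the same AWS variables set):

script:
  # simplified sketch of what you would otherwise script yourself
  - aws s3 sync build "s3://my-bucket-name"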

Memory considerations

Our pipes need to use the docker service, which we start transparently. This counts towards the number of services you can run in a step. By default this docker service uses 1GB of memory, but you can change that in your configuration.
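For example, to give the Docker service more memory than the 1 GB default, you can override it in the definitions section of your bitbucket-pipelines.yml (2048 MB here is just an illustrative value):

definitions:
  services:
    docker:
      memory: 2048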

Example: add a pipe to upload to an Amazon S3 bucket

If we want our pipeline to upload the contents of the build directory to our my-bucket-name S3 bucket, we can use the AWS S3 Deploy pipe.

  1. Reveal the AWS S3 Deploy pipe by clicking on it
  2. Press the copy button
  3. Move the cursor to the script section of the deployment step (placement is important!) 
  4. Paste the copied code
  5. Remove the optional parameters if you don't need them, or uncomment the ones you want to use (in this example we removed them)
  6. Add the parameter values for our deployment (always in double quotes):
    1. using our predefined variables $AWS_ACCESS_KEY_ID and $AWS_SECRET_ACCESS_KEY
    2. specifying our region as us-east-1
    3. supplying the name of our bucket: my-bucket-name
    4. defining that we want to deploy the contents of our build folder

The finished section looks like this:
script:
  - pipe: atlassian/aws-s3-deploy:0.2.1
    variables:
      AWS_ACCESS_KEY_ID: "$AWS_ACCESS_KEY_ID" 
      AWS_SECRET_ACCESS_KEY: "$AWS_SECRET_ACCESS_KEY" 
      AWS_DEFAULT_REGION: "us-east-1" 
      S3_BUCKET: "my-bucket-name"
      LOCAL_PATH: "build"

Pipe versions

Pipes use semantic versioning, so as we make changes to the pipe the version number will change. We'll be updating the repository changelog as we go. If you want to use a different version, you can do so by changing the version number in your bitbucket-pipelines.yml file.

For example:

changing pipe: atlassian/aws-s3-deploy:0.1.0

to pipe: atlassian/aws-s3-deploy:0.2.0

Note: 0.x.y versioned pipes are an exception to classic semantic versioning: they are still under development and can introduce breaking changes even in minor and patch releases.

Available pipes

All pipes are kept in the Bitbucket Pipelines pipes project.

Follow the links below to look at the pipe repository:

AWS CodeDeploy

Push a deployment to S3, then trigger it via CodeDeploy.

AWS Elastic Beanstalk Deploy

Deploy your code to Elastic Beanstalk.

AWS Lambda Deploy

Update Lambda function code, and create or update aliases pointing to functions.

AWS S3 Deploy

Upload files to your S3 bucket.

Azure storage deploy

Deploy to Microsoft Azure Storage.

Azure web apps deploy

Deploy to Azure Web Apps.

Datadog Send Event

Send events to Datadog.

GCP App Engine deploy

Deploy your application to Google App Engine.

GCP Cloud Storage deploy

Upload files to your GCP storage bucket.

NPM Publish

Log in to an npm registry and publish the package from package.json in the current build directory.

Opsgenie Send Alert

Send alerts to Opsgenie.

SCP Deploy

Copy files to a remote server using SCP.

SFTP Deploy

Copy files to a remote server using SFTP.

Slack Notify

Send a notification to Slack.


Missing pipe?

If there is a pipe you'd like to see, use the Suggest a pipe box at the bottom of the list of pipes to let us know.

