Configure bitbucket-pipelines.yml

At the center of Pipelines is the bitbucket-pipelines.yml file. It defines all your build configurations (pipelines) and needs to be created in the root of your repository. With 'configuration as code', your bitbucket-pipelines.yml is versioned along with all the other files in your repository, and can be edited in your IDE. If you've not yet created this file, you might like to read Get started with Bitbucket Pipelines first.

YAML is a file format that is easy to read, but writing it requires care. Indenting must use spaces, as tab characters are not allowed.

There is a lot you can configure your pipelines to do, but at its most basic the required keywords in your YAML file are:

pipelines: marks the beginning of all your pipeline definitions.

default: contains the steps that will run on every push.

step: each step starts a new Docker container that includes a clone of your repository, and then runs the contents of your script section inside it.

script: a list of commands that are executed in sequence.


Pipelines can build projects in any language that runs on Linux. We have some examples, but at its most basic a bitbucket-pipelines.yml file could look like this:

pipelines:
  default:
    - step:
        script:
          - echo "I made a pipeline!"

Key concepts

A pipeline is made up of a set of steps.

  • Each step in your pipeline runs a separate Docker container. If you want, you can use different types of container for each step, by selecting different images.
  • The step runs the commands you provide in the environment defined by the image.
  • A single pipeline can have up to 10 steps.

A commit signals that a pipeline should run. Which pipeline runs depends on which section it's in:

  • default - All commits trigger this pipeline, unless they match one of the other sections.
  • branches - Specify the name of a branch, or use a glob pattern.
  • tags (Git only) or bookmarks (Mercurial only) - Specify the name of a tag or bookmark, or use a glob pattern.
  • custom - Will only run when manually triggered.


Illustration of how indenting creates logical 'sections'
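
As a rough sketch of that nesting (the branch name master below is just a placeholder), each level of indentation opens a new logical section:

pipelines:                  # top-level section
  default:                  # runs on every push that doesn't match another section
    - step:                 # a single build execution unit
        script:             # commands run inside the step's container
          - echo "default pipeline"
  branches:                 # branch-specific pipelines
    master:                 # placeholder branch name
      - step:
          script:
            - echo "master branch pipeline"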

Keywords

You can define your build pipelines using a selection of the following keywords. They are listed below in the order in which you might use them.

Keyword - Description

pipelines - Contains all your pipeline definitions.
default - Contains the pipeline definition for all branches that don't match a pipeline definition in other sections.
branches - Contains pipeline definitions for specific branches.
tags - Contains pipeline definitions for specific Git tags and annotated tags.
bookmarks - Contains pipeline definitions for specific Mercurial bookmarks.
custom - Contains pipelines that can be triggered manually from the Bitbucket Cloud GUI.
parallel - Contains steps to run concurrently.
step - Defines a build execution unit: the commands executed and the settings of a unique container.
name - Defines a name for a step to make it easier to see what each step is doing in the display.
image - The Docker image to use for a step. If you don't specify an image, your pipelines run in the default Bitbucket image. This can also be defined globally to use the same image for every step.
trigger - Specifies whether the step is manual or automatic. If you don't specify a trigger type, it defaults to automatic.
deployment - Sets the type of environment for your deployment step. Valid values are: test, staging, or production.
size - Used to provision extra resources for pipelines and steps. Valid values are: 1x or 2x.
script - Contains the list of commands that are executed to perform the build.
artifacts - Defines files that are produced by a step, such as reports and JAR files, that you want to share with a following step.
options - Contains global settings that apply to all your pipelines.
max-time - The maximum time (in minutes) a step can execute for. Use a whole number greater than 0 and less than 120. If you don't specify a max-time, it defaults to 120.
clone - Contains settings for when we clone your repository into a container.
lfs - Enables the download of LFS files in your clone. Defaults to false if not specified.
depth - Defines the depth of Git clones for all pipelines. Use a whole number greater than zero to specify the depth, or use full for a full clone. If you don't specify the Git clone depth, it defaults to 50. Note: this keyword is supported only for Git repositories.
definitions - Defines resources, such as services and custom caches, that you want to use elsewhere in your pipeline configurations.
services - Defines services you would like to use with your build, which are run in separate but linked containers.
caches - Defines dependencies to cache on our servers to reduce load time.

pipelines

The start of your pipeline definitions. Under this keyword you must define your build pipelines using at least one of the following:

  • default (for all branches that don't match any of the following)
  • branches (Git and Mercurial)
  • tags (Git)
  • bookmarks (Mercurial)
image: node:4.6.0
  
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
  tags:                         # add the 'tags' section
    release-*:                  # specify the tag
      - step:                   # define the build pipeline for the tag
          name: Build and release
          script:
            - npm install
            - npm test
            - npm run release
  branches:
    staging:
      - step:
          name: Clone
          script:
            - echo "Clone all the things!" 

default

The default pipeline runs on every push to the repository, unless a branch-specific pipeline is defined. You can define a branch pipeline in the branches section.

Note: The default pipeline doesn't run on tags or bookmarks.


branches

Defines a section for all branch-specific build pipelines. The names or expressions in this section are matched against:

  • branches in your Git repository
  • named branches in your Mercurial repository

You can use glob patterns for handling the branch names.

See Branch workflows for more information about configuring pipelines to build specific branches in your repository.
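
A minimal sketch of branch-specific pipelines (the branch names master and feature/* below are only examples):

pipelines:
  branches:
    master:                   # runs on commits to master
      - step:
          script:
            - echo "Commit to master"
    feature/*:                # runs on commits to any feature/<name> branch
      - step:
          script:
            - echo "Commit to a feature branch"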


tags

Defines a container for all tag-specific build pipelines. The names or expressions in this section are matched against tags and annotated tags in your Git repository. You can use glob patterns for handling the tag names.



bookmarks

Serves as a container for all bookmark-specific build pipelines. The names or expressions in this section are matched against bookmarks in your Mercurial repository. You can use glob patterns for handling the bookmark names.


image: node:4.6.0
  
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
  bookmarks:                      # add the 'bookmarks' section
    release-*:                    # specify the bookmark
      - step:                     # define the build pipeline for the bookmark
          name: Build and release
          script:
            - npm install
            - npm test
            - npm run release
  branches:
    staging:
      - step:
          name: Clone
          script:
            - echo "Clone all the things!"

custom

Defines a container for pipelines that can only be triggered manually or scheduled from the Bitbucket Cloud interface.


image: node:4.6.0
   
pipelines:
  custom: # Pipelines that are triggered manually
    sonar: # The name that is displayed in the list in the Bitbucket Cloud GUI
      - step:
          script:
            - echo "Manual triggers for Sonar are awesome!"
    deployment-to-prod: # Another display name
      - step:
          script:
            - echo "Manual triggers for deployments are awesome!"
  branches:  # Pipelines that run automatically on a commit to a branch
    staging:
      - step:
          script:
            - echo "Automated pipelines are cool too."


With a configuration like the one above, the Run pipeline dialog in Bitbucket Cloud lists the sonar and deployment-to-prod pipelines.



For more information, see Run pipelines manually.


parallel

Parallel steps enable you to build and test faster, by running a set of steps at the same time.

The total number of build minutes used by a pipeline will not change if you make the steps parallel, but you'll be able to see the results sooner.

There is a limit of 10 for the total number of steps you can run in a pipeline, regardless of whether they are running in parallel or serial.

Indent the steps to define which steps run concurrently:

pipelines:
  default:
    - step:          # non-parallel step
        name: Build
        script:
          - ./build.sh
    - parallel:      # these 2 steps will run in parallel
        - step:
            name: Integration 1
            script:
              - ./integration-tests.sh --batch 1
        - step:
            name: Integration 2
            script:
              - ./integration-tests.sh --batch 2
    - step:          # non-parallel step
        script:
          - ./deploy.sh

Learn more about parallel steps.

step

Defines a build execution unit. Steps are executed in the order that they appear in the bitbucket-pipelines.yml file. You can use up to 10 steps in a pipeline.

Each step in your pipeline starts a separate Docker container to run the commands configured in its script.

Steps can be configured to wait for a manual trigger before running. To define a step as manual, add trigger: manual to the step in your bitbucket-pipelines.yml file. Manual steps:

  • Can only be executed in the order that they are configured. You cannot skip a manual step.
  • Can only be executed if the previous step has successfully completed.
  • Can only be triggered by users with write access to the repository.
  • Are triggered through the Pipelines web interface.

If your build uses both manual steps and artifacts, the artifacts are stored for 7 days following the execution of the step that produced them. After this time, the artifacts expire and any manual steps in the pipeline can no longer be executed. For more information, see Manual steps and artifact expiry.

Note: You can't configure the first step of the pipeline as a manual step.

name

You can add a name to a step to make displays and reports easier to read and understand.

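A minimal sketch of a named step (the name shown is just an example):

pipelines:
  default:
    - step:
        name: Build and test    # this name appears in the Pipelines display for this step
        script:
          - npm install
          - npm test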

image

Bitbucket Pipelines uses Docker containers to run your builds.

  • You can use the default image (atlassian/default-image:latest) provided by Bitbucket or define a custom image. You can specify any public or private Docker image that isn't hosted on a private network.
  • You can define images at the global or step level. You can't define an image at the branch level.

To specify an image, use

image: <your_account/repository_details>:<tag>

For more information about using and creating images, see Use Docker images as build environments.

Examples

  • image: openjdk - uses the image with the latest openjdk version.
  • image: openjdk:8 - uses the image with openjdk version 8.
  • image: nodesource/node:iojs-2.0.2 - uses the non-official node image at version iojs-2.0.2.

You can also override the global image for a particular step:
image: openjdk                              # this image will be used by all steps unless overridden

pipelines:
  default:
    - step:
        image: nodesource/node:iojs-2.0.2   # override the global image for this step
        script:
          - npm install
          - npm test
    - step:                                 # this step will use the global image
        script:
          - npm install
          - npm test


trigger

Specifies whether a step will run automatically or only after a user manually triggers it. You can define the trigger type as manual or automatic. If the trigger type is not defined, the step defaults to running automatically.

pipelines:
  default:
    - step:
        name: Build and test
        image: node:8.6
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy
        image: python:3.5.1
        trigger: manual
        script:
          - python deploy.py


deployment

Sets the type of environment for your deployment step, used in the Deployments dashboard.

Valid values are test, staging, or production.

The following step will display in the test environment in the Deployments view:

    - step:
        name: Deploy to test
        image: aws-cli:1.0
        deployment: test
        script:
          - python deploy.py test

size

You can allocate additional resources to a step, or to the whole pipeline. By specifying a size of 2x, you'll have double the resources available (e.g. 4 GB memory → 8 GB memory).

At this time, valid sizes are 1x and 2x.

2x pipelines will use twice the number of build minutes.

Overriding the size of a single step

pipelines:
  default:
    - step:
        script:
          - echo "All good things..."
    - step:
        size: 2x # Double resources available for this step.
        script:
          - echo "Come to those who wait."

Increasing the resources for an entire pipeline

Using the global size, all steps will inherit the '2x' size.

options:
  size: 2x

pipelines:
  default:
    - step:
        name: Clone with more memory
        script:
          - echo "Clone all the things!"

script

Contains a list of commands that are executed in sequence. Scripts are executed in the order in which they appear in a step. We recommend that you move large scripts to a separate script file and call it from bitbucket-pipelines.yml.
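
For example, a step might call a shell script kept in the repository (the scripts/build.sh path below is just a hypothetical example):

pipelines:
  default:
    - step:
        name: Run build script
        script:
          - chmod +x scripts/build.sh   # hypothetical path; ensure the file is executable
          - ./scripts/build.sh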


artifacts

Defines files to be shared from one step to a later step in your pipeline. Artifacts can be defined using glob patterns.

An example showing how to define artifacts:

pipelines:
  default:
    - step:
        name: Build and test
        image: node:8.5.0
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy to production
        image: python:3.5.1
        script:
          - python deploy-to-production.py

For more information, see Using artifacts in steps.

options

Contains global settings that apply to all your pipelines. Options you can define here include max-time and size.

max-time

You can define the maximum time a step can execute for (in minutes) at the global level or step level. Use a whole number greater than 0 and less than 120.

If you don't specify a max-time, it defaults to 120.


options:
  max-time: 60
pipelines:
  default:
    - step:
        name: Sleeping step
        script:
          - sleep 120m # This step will timeout after 60 minutes
    - step:
        name: quick step
        max-time: 5
        script:
          - sleep 120m #this step will timeout after 5 minutes

clone

Contains settings for when we clone your repository into a container. Settings here include:

  • lfs - support for Git LFS.
  • depth - the depth of the Git clone.

lfs

A global setting that specifies that Git LFS files should be downloaded with the clone.

Note: This keyword is supported only for Git repositories.

clone:
  lfs: true
  
pipelines:
  default:
    - step:
        name: Clone and download
        script:
          - echo "Clone and download my LFS files!"


depth

You can define the depth of clones at the global level. Use a whole number greater than zero or use full to specify a full clone.

If you don't specify the Git clone depth, it defaults to 50.

Note: This keyword is supported only for Git repositories.

clone:
  depth: 5       # include the last five commits
 
pipelines:
  default:
    - step:
        name: Cloning
        script:
          - echo "Clone all the things!"


definitions

Define resources used elsewhere in your pipeline configuration. Resources defined here include services and custom caches.

services

Rather than building every resource you might need into one large image, you can spin up separate Docker containers for services. This tends to make builds quicker, and makes it easy to change a single service without rebuilding your whole image.

For example, to define a Redis service container you could add:

definitions:
  services:
    redis:
      image: redis
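
A step then attaches the defined service by listing it under the step's services keyword. A minimal sketch (the step name and script are placeholders):

definitions:
  services:
    redis:
      image: redis

pipelines:
  default:
    - step:
        name: Test against Redis
        script:
          - echo "Redis is available on localhost:6379"   # placeholder for your real tests
        services:
          - redis        # attach the redis service container defined above to this step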


caches

Re-downloading dependencies from the internet for each step of a build can take a lot of time. With a cache, the dependencies are downloaded once to our servers and then loaded locally into the build on each subsequent run.

An example showing how to define a custom bundler cache:

definitions:
  caches:
    bundler: vendor/bundle
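
A step then opts into the cache by name under its caches keyword. A minimal sketch, assuming a Ruby project built with Bundler:

definitions:
  caches:
    bundler: vendor/bundle

pipelines:
  default:
    - step:
        caches:
          - bundler                              # restore and save the vendor/bundle directory
        script:
          - bundle install --path vendor/bundle  # assumes a build image with Bundler installed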


Glob patterns cheat sheet

Glob patterns don't allow any expression to start with an unquoted star: every expression that starts with a star must be put in quotes (see the sketch at the end of this cheat sheet).

feature/*
  • Matches with feature/<any_branch>.
  • The glob pattern doesn't match the slash (/), so Git branches like feature/<any_branch>/<my_branch> are not matched for feature/*.
feature/bb-123-fix-links
  • If you specify the exact name of a branch, a tag, or a bookmark, the pipeline defined for the specific branch overrides any more generic expressions that would match that branch.
  • For example, let's say that you specified a pipeline for feature/* and feature/bb-123-fix-links. On a commit to the feature/bb-123-fix-links branch Pipelines will execute steps defined for feature/bb-123-fix-links and will not execute the steps defined in feature/*.
'*'
  • Matches all branches, tags, or bookmarks. The star symbol (*) must be in single quotes.
  • This glob pattern doesn't match the slash (/), so Git branches like feature/bb-123-fix-links are not matched for '*'. If you need the slash to match, use '**' instead of '*'.


'**'
  • Matches all branches, tags, or bookmarks. For example, it includes branches with the slash (/) like feature/bb-123-fix-links. The ** expression must be in quotes.
'*/feature'
  • This expression requires quotes.

'master' and duplicate branch names

  • Names in quotes are treated the same way as names without quotes. For example, Pipelines sees master and 'master' as the same branch names.
  • In the situation described above, Pipelines will match only against one name (master or 'master', never both).
  • We recommend that you avoid duplicate names in your bitbucket-pipelines.yml file.
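
As a minimal sketch of the quoting rule, a catch-all branch pipeline might be written like this:

pipelines:
  branches:
    '**':                 # quoted because the expression starts with a star; matches every branch
      - step:
          script:
            - echo "Runs for any branch, including names containing a slash"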
