You can define your build pipelines by using a selection of the following keywords. They are arranged in this table in the order in which you might use them, with highlighted rows to show keywords that define a logical section.
|pipelines||Contains all your pipeline definitions.|
|default||Contains the pipeline definition for all branches that don't match a pipeline definition in other sections.|
|branches||Contains pipeline definitions for specific branches.|
|tags||Contains pipeline definitions for specific Git tags and annotated tags.|
|bookmarks||Contains pipeline definitions for specific Mercurial bookmarks.|
|custom||Contains pipelines that can be triggered manually from the Bitbucket Cloud GUI.|
|pull-requests||Contains pipeline definitions that only run on pull requests.|
|parallel||Contains steps to run concurrently.|
|step||Defines a build execution unit. This defines the commands executed and the settings of a unique container.|
|name||Defines a name for a step to make it easier to see what each step is doing in the display.|
|image||The Docker image to use for a step. If you don't specify the image, your pipelines run in the default Bitbucket image. This can also be defined globally to use the same image type for every step.|
|trigger||Specifies whether the step is manual or automatic. If you don't specify a trigger type, it defaults to automatic.|
|deployment||Sets the type of environment for your deployment step. Valid values are: test, staging, or production.|
|size||Used to provision extra resources for pipelines and steps. Valid values are: 1x, 2x.|
|script||Contains the list of commands that are executed to perform the build.|
|pipe||Specifies that you'd like to use a particular pipe.|
|after-script||A set of commands that will run when your step succeeds or fails.|
|artifacts||Defines files that are produced by a step, such as reports and JAR files, that you want to share with a following step.|
|options||Contains global settings that apply to all your pipelines.|
|max-time||The maximum time (in minutes) a step can execute for. Use a whole number greater than 0 and less than 120. If you don't specify a max-time, it defaults to 120.|
|clone||Contains settings for when we clone your repository into a container.|
|lfs||Enables the download of LFS files in your clone. This defaults to false. Note: This keyword is supported only for Git repositories.|
|depth||Defines the depth of Git clones for all pipelines. Use a whole number greater than zero to specify the depth. Use full to clone everything. Note: This keyword is supported only for Git repositories.|
|definitions||Defines resources, such as services and custom caches, that you want to use elsewhere in your pipeline configurations.|
|services||Define services that you would like to use with your build, which are run in separate but linked containers.|
|caches||Define dependencies to cache on our servers to reduce load time.|
The start of your pipeline definitions. Under this keyword, you must define your build pipelines using at least one of the following:
- default (for all branches that don't match any of the following)
- branches (Git and Mercurial)
- tags (Git)
- bookmarks (Mercurial)
```yaml
image: node:10.15.0

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
  tags: # add the 'tags' section
    release-*: # specify the tag
      - step: # define the build pipeline for the tag
          name: Build and release
          script:
            - npm install
            - npm test
            - npm run release
  branches:
    staging:
      - step:
          name: Clone
          script:
            - echo "Clone all the things!"
```
The default pipeline runs on every push to the repository, unless a branch-specific pipeline is defined. You can define a branch pipeline in the branches section.
Note: The default pipeline doesn't run on tags or bookmarks.
Defines a section for all branch-specific build pipelines. The names or expressions in this section are matched against:
- branches in your Git repository
- named branches in your Mercurial repository
You can use glob patterns for handling the branch names.
See Branch workflows for more information about configuring pipelines to build specific branches in your repository.
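For instance, a minimal sketch of glob matching in the branches section (the branch naming scheme here is hypothetical):

```yaml
pipelines:
  branches:
    feature/*: # any branch starting with "feature/"
      - step:
          script:
            - echo "Building a feature branch"
    master: # exact branch name match
      - step:
          script:
            - echo "Building master"
```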
Defines all tag-specific build pipelines. The names or expressions in this section are matched against tags and annotated tags in your Git repository. You can use glob patterns for handling the tag names.
Defines all bookmark-specific build pipelines. The names or expressions in this section are matched against bookmarks in your Mercurial repository. You can use glob patterns for handling the bookmark names.
```yaml
image: node:10.15.0

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
  bookmarks: # add the 'bookmarks' section
    release-*: # specify the bookmark
      - step: # define the build pipeline for the bookmark
          name: Build and release
          script:
            - npm install
            - npm test
            - npm run release
  branches:
    staging:
      - step:
          name: Clone
          script:
            - echo "Clone all the things!"
```
Defines pipelines that can only be triggered manually or scheduled from the Bitbucket Cloud interface.
```yaml
image: node:10.15.0

pipelines:
  custom: # Pipelines that are triggered manually
    sonar: # The name that is displayed in the list in the Bitbucket Cloud GUI
      - step:
          script:
            - echo "Manual triggers for Sonar are awesome!"
    deployment-to-prod: # Another display name
      - step:
          script:
            - echo "Manual triggers for deployments are awesome!"
  branches: # Pipelines that run automatically on a commit to a branch
    staging:
      - step:
          script:
            - echo "Automated pipelines are cool too."
```
With a configuration like the one above, you should see the following pipelines in the Run pipeline dialog in Bitbucket Cloud:
For more information, see Run pipelines manually.
A special pipeline which only runs on pull requests. Pull-requests has the same level of indentation as branches.
This type of pipeline runs a little differently from other pipelines. When it's triggered, we'll merge the destination branch into your working branch before it runs. If the merge fails, we will stop the pipeline.
This only applies to pull requests initiated from within your repository; pull requests from a forked repository will not trigger the pipeline.
```yaml
pipelines:
  pull-requests:
    '**': # this runs as default for any branch not elsewhere defined
      - step:
          script:
            - ...
    feature/*: # any branch with a feature prefix
      - step:
          script:
            - ...
  branches: # these will run on every push of the branch
    staging:
      - step:
          script:
            - ...
```
If you already have pipelines defined under branches in your configuration, and you want them all to only run on pull requests, you can simply replace the keyword branches with pull-requests. (If you already have a pipeline for default, you will need to move it under pull-requests and change the keyword from default to '**' for it to run.)
Pull request pipelines run in addition to any branch and default pipelines that are defined, so if the definitions overlap you may get 2 pipelines running at the same time!
Parallel steps enable you to build and test faster, by running a set of steps at the same time.
The total number of build minutes used by a pipeline will not change if you make the steps parallel, but you'll be able to see the results sooner.
There is a limit of 10 for the total number of steps you can run in a pipeline, regardless of whether they are running in parallel or serial.
Indent the steps to define which steps run concurrently:
```yaml
pipelines:
  default:
    - step: # non-parallel step
        name: Build
        script:
          - ./build.sh
    - parallel: # these 2 steps will run in parallel
        - step:
            name: Integration 1
            script:
              - ./integration-tests.sh --batch 1
        - step:
            name: Integration 2
            script:
              - ./integration-tests.sh --batch 2
    - step: # non-parallel step
        script:
          - ./deploy.sh
```
Learn more about parallel steps.
Defines a build execution unit. Steps are executed in the order that they appear in the
bitbucket-pipelines.yml file. You can use up to 10 steps in a pipeline.
Each step in your pipeline will start a separate Docker container to run the commands configured in the
script. Each step can be configured to:
- Use a different Docker image.
- Configure a custom max-time.
- Use specific caches and services.
- Produce artifacts that subsequent steps can consume.
Steps can be configured to wait for a manual trigger before running. To define a step as manual, add
trigger: manual to the step in your
bitbucket-pipelines.yml file. Manual steps:
- Can only be executed in the order that they are configured. You cannot skip a manual step.
- Can only be executed if the previous step has successfully completed.
- Can only be triggered by users with write access to the repository.
- Are triggered through the Pipelines web interface.
If your build uses both manual steps and artifacts, the artifacts are stored for 7 days following the execution of the step that produced them. After this time, the artifacts expire and any manual steps in the pipeline can no longer be executed. For more information, see Manual steps and artifact expiry.
Note: You can't configure the first step of the pipeline as a manual step.
You can add a name to a step to make displays and reports easier to read and understand.
Bitbucket Pipelines uses Docker containers to run your builds.
- You can use the default image (
atlassian/default-image:latest) provided by Bitbucket or define a custom image. You can specify any public or private Docker image that isn't hosted on a private network.
- You can define images at the global or step level. You can't define an image at the branch level.
To specify an image, use the image keyword at the global or step level.
For more information about using and creating images, see Use Docker images as build environments.
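If your image is private, one possible configuration is to supply credentials alongside the image name. This is a sketch: the image name is hypothetical, and the variables are assumed to be defined as secured repository variables.

```yaml
image:
  name: acme/my-base-image:1.0   # hypothetical private image
  username: $DOCKER_HUB_USERNAME # secured repository variable (assumed)
  password: $DOCKER_HUB_PASSWORD # secured repository variable (assumed)
```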
|openjdk||Uses the image with the latest openjdk version|
|openjdk:8||Uses the image with openjdk version 8|
|nodesource/node:iojs-2.0.2||Uses the non-official node image with version iojs-2.0.2|
```yaml
image: openjdk # this image will be used by all steps unless overridden

pipelines:
  default:
    - step:
        image: nodesource/node:iojs-2.0.2 # override the global image for this step
        script:
          - npm install
          - npm test
    - step: # this step will use the global image
        script:
          - npm install
          - npm test
```
Specifies whether a step will run automatically or only after someone manually triggers it. You can define the trigger type as manual or automatic. If the trigger type is not defined, the step defaults to running automatically. The first step cannot be manual. If you want a whole pipeline to run only from a manual trigger, use a custom pipeline.
```yaml
pipelines:
  default:
    - step:
        name: Build and test
        image: node:10.15.0
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy
        image: python:3.7.2
        trigger: manual
        script:
          - python deploy.py
```
Sets the type of environment for your deployment step, used in the Deployments dashboard.
Valid values are: test, staging, or production. The following step will display in the test environment in the Deployments view:
```yaml
- step:
    name: Deploy to test
    image: aws-cli:1.0
    deployment: test
    script:
      - python deploy.py test
```
You can allocate additional resources to a step, or to the whole pipeline. By specifying a size of 2x, you'll have double the resources available (e.g. 4 GB memory → 8 GB memory).
At this time, valid sizes are 1x and 2x.
2x pipelines will use twice the number of build minutes.
Overriding the size of a single step
```yaml
pipelines:
  default:
    - step:
        script:
          - echo "All good things..."
    - step:
        size: 2x # Double resources available for this step.
        script:
          - echo "Come to those who wait."
```
Increasing the resources for an entire pipeline
Using the global size, all steps will inherit the '2x' size.
```yaml
options:
  size: 2x

pipelines:
  default:
    - step:
        name: Step with more memory
        script:
          - echo "I've got double the memory to play with!"
```
Contains a list of commands that are executed in sequence. Scripts are executed in the order in which they appear in a step. We recommend that you move large scripts to a separate script file and call it from the bitbucket-pipelines.yml file.
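For example, a minimal sketch of calling a separate script file (ci/build.sh is a hypothetical path in your repository):

```yaml
pipelines:
  default:
    - step:
        name: Build
        script:
          - chmod +x ci/build.sh # ensure the file is executable
          - ./ci/build.sh        # run the build logic kept in the repo
```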
Pipes make complex tasks easier, by doing a lot of the work behind the scenes. This means you can just select which pipe you want to use, and supply the necessary variables. You can look at the repository for the pipe to see what commands it is running. Learn more about pipes.
A pipe to send a message to Opsgenie might look like:
```yaml
pipelines:
  default:
    - step:
        name: Alert Opsgenie
        script:
          - pipe: atlassian/opsgenie-send-alert:0.2.0
            variables:
              GENIE_KEY: $GENIE_KEY
              MESSAGE: "Danger, Will Robinson!"
              DESCRIPTION: "An Opsgenie alert sent from Bitbucket Pipelines"
              SOURCE: "Bitbucket Pipelines"
              PRIORITY: "P1"
```
You can also create your own pipes. If you do, you can specify a Docker-based pipe with the syntax: pipe: docker://<DockerAccountName>/<ImageName>:<version>
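As an illustration, a hypothetical custom pipe hosted on Docker Hub might be referenced like this (the account, image, and variable names are placeholders):

```yaml
pipelines:
  default:
    - step:
        script:
          - pipe: docker://acme/my-notify-pipe:1.0 # hypothetical pipe image
            variables:
              MESSAGE: "Build finished" # variable defined by the pipe (assumed)
```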
Commands inside an after-script section will run when the step succeeds or fails. This could be useful for clean-up commands, test coverage, notifications, or rollbacks you might want to run, especially if your after-script uses the value of BITBUCKET_EXIT_CODE.
Note: If any commands in the after-script section fail:
- we won't run any more commands in that section
- it will not affect the reported status of the step.
```yaml
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
        after-script:
          - echo "after script has run!"
```
Defines files to be shared from one step to a later step in your pipeline. Artifacts can be defined using glob patterns.
An example showing how to define artifacts:
```yaml
pipelines:
  default:
    - step:
        name: Build and test
        image: node:10.15.0
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy to production
        image: python:3.7.2
        script:
          - python deploy-to-production.py
```
For more information, see Using artifacts in steps.
Contains global settings that apply to all your pipelines. The main keyword you'd use here is max-time.
You can define the maximum time a step can execute for (in minutes) at the global level or step level. Use a whole number greater than 0 and less than 120.
If you don't specify a max-time, it defaults to 120.
```yaml
options:
  max-time: 60

pipelines:
  default:
    - step:
        name: Sleeping step
        script:
          - sleep 120m # This step will timeout after 60 minutes
    - step:
        name: quick step
        max-time: 5
        script:
          - sleep 120m # this step will timeout after 5 minutes
```
Contains settings for when we clone your repository into a container. Settings here include:
- lfs – support for Git LFS files
- depth – the depth of the Git clone
A global setting that specifies that Git LFS files should be downloaded with the clone.
Note: This keyword is supported only for Git repositories.
```yaml
clone:
  lfs: true

pipelines:
  default:
    - step:
        name: Clone and download
        script:
          - echo "Clone and download my LFS files!"
```
This global setting defines how many commits we clone into the pipeline container. Use a whole number greater than zero, or, if you want to clone everything (which will have a speed impact), use full.
If you don't specify the Git clone depth, it defaults to the last 50, to try and balance the time it takes to clone and how many commits you might need.
Note: This keyword is supported only for Git repositories.
```yaml
clone:
  depth: 5 # include the last five commits

pipelines:
  default:
    - step:
        name: Cloning
        script:
          - echo "Clone all the things!"
```
Define resources used elsewhere in your pipeline configuration. Resources can include:
- services – separate containers that run alongside your steps.
- caches – see Caching dependencies.
- YAML anchors – a way to define a chunk of your YAML for easy re-use. See YAML anchors.
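As a sketch of YAML anchors, the step definition below is written once under definitions and reused in two pipelines (the step contents are illustrative):

```yaml
definitions:
  steps:
    - step: &build-test # the anchor names a reusable step definition
        name: Build and test
        script:
          - npm install
          - npm test

pipelines:
  default:
    - step: *build-test # the alias reuses the anchored step
  branches:
    staging:
      - step: *build-test
```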
Rather than trying to build all the resources you might need into one large image, we can spin up separate Docker containers for services. This tends to speed up the build, and makes it very easy to change a single service without having to rebuild your whole image.
So if we want a redis service container we could add:
```yaml
definitions:
  services:
    redis:
      image: redis
```
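A step then opts in to a defined service by name; for instance, a sketch where a step checks that the redis service is reachable (the redis-cli command is illustrative):

```yaml
definitions:
  services:
    redis:
      image: redis

pipelines:
  default:
    - step:
        services:
          - redis # starts the redis container alongside this step
        script:
          - redis-cli -h localhost ping # the service is reachable on localhost
```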
Re-downloading dependencies from the internet for each step of a build can take a lot of time. With a cache, they are downloaded once to our servers and then loaded locally into the build each time.
An example showing how to define a custom bundler cache:
```yaml
definitions:
  caches:
    bundler: vendor/bundle
```
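A step then references the cache by name; for example, a sketch that assumes a Ruby project using Bundler:

```yaml
definitions:
  caches:
    bundler: vendor/bundle

pipelines:
  default:
    - step:
        caches:
          - bundler # restores and saves vendor/bundle between builds
        script:
          - bundle install --path vendor/bundle
```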
Glob patterns cheat sheet
In the YAML file, an expression can't start with a bare star: every expression that starts with a star needs to be put in quotes.
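For example, a pattern with a leading star must be quoted to be valid YAML (the branch naming scheme here is hypothetical):

```yaml
pipelines:
  branches:
    '*-release': # quoted because the pattern starts with a star
      - step:
          script:
            - echo "Release branch build"
```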