Deploy to Amazon AWS

This guide will help you configure your Bitbucket Pipelines deployment to AWS with Elastic Beanstalk, CodeDeploy and S3.

You may also be interested in this guide: Deploy to Amazon ECS.

Add your AWS credentials to Bitbucket Pipelines

Before we start with the deployment instructions, you'll need to add two variables to Bitbucket Pipelines so it can interact with your AWS account:

  • AWS_ACCESS_KEY_ID: Your AWS access key.

  • AWS_SECRET_ACCESS_KEY: Your AWS secret access key. Make sure that you save it as a secured variable.
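
Bitbucket Pipelines exposes these repository variables to your build as environment variables, and the AWS CLI and the deployment pipes below pick them up automatically. If you want a quick sanity check that the credentials work before wiring up a real deployment, you can run a throwaway step against the AWS CLI. This is only a sketch, not part of the deployment itself; it assumes the atlassian/pipelines-awscli image (which bundles the AWS CLI) and uses us-east-1 as an example region:

pipelines:
  default:
    - step:
        name: "Verify AWS credentials"
        image: atlassian/pipelines-awscli
        script:
          # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are already available
          # as environment variables from the repository variables above
          - export AWS_DEFAULT_REGION=us-east-1
          - aws sts get-caller-identity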

Deploy to AWS with Elastic Beanstalk

In our example below, we first run the build and test commands for our application (in this case, using a Python test script).

Then we zip up the files we want to deploy, which we pass to the next step using artifacts.

In the next step we deploy using the AWS Elastic Beanstalk Deploy pipe with the appropriate variables, tracking our production environment builds using deployments.

example bitbucket-pipelines.yml
image: atlassian/default-image:2

pipelines:
  default:
    - step:
        name: "Build and Test"
        script:
          - pytest test/test.py
          - zip application.zip Dockerfile application.py cron.yaml Dockerrun.aws.json
        artifacts: 
          - application.zip
    - step:
        name: "Deploy to Production"
        deployment: production
        script:
          - pipe: atlassian/aws-elasticbeanstalk-deploy:0.2.1
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: "us-east-1"
              APPLICATION_NAME: "application-test"
              ZIP_FILE: "application.zip"
              S3_BUCKET: "application-test-bucket"
              VERSION_LABEL: "deploy-$BITBUCKET_BUILD_NUMBER-multiple"

Learn more about the AWS Elastic Beanstalk Deploy pipe: https://bitbucket.org/atlassian/aws-elasticbeanstalk-deploy/src/master/
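
If you want to confirm that the new application version actually landed in Elastic Beanstalk, one option is a follow-up step that queries the service with the AWS CLI. This is a minimal sketch rather than part of the pipe itself; it assumes the atlassian/pipelines-awscli image and reuses the application name, region, and version label from the example above:

    - step:
        name: "Verify application version"
        image: atlassian/pipelines-awscli
        script:
          - export AWS_DEFAULT_REGION=us-east-1
          # lists the version the previous step uploaded; an empty result means it did not register
          - aws elasticbeanstalk describe-application-versions --application-name "application-test" --version-labels "deploy-$BITBUCKET_BUILD_NUMBER-multiple"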

Deploy to AWS with S3

If you're hosting a static website on S3, you can use the AWS S3 Deploy pipe to update your website from Bitbucket Pipelines.

The example below is based on a Node project.

To update your website, sync your build output with your S3 bucket. You can set DELETE_FLAG to "true" to make sure that files that are no longer needed get removed from your S3 bucket:

example bitbucket-pipelines.yml
image: node:10.15.1

pipelines:
  default:
    - step:
        name: Build and Test
        script:
          - npm install
          - npm test
          - npm run dist
        artifacts:
          - dist/**
    - step:
        name: Deploy
        deployment: production
        script:
          - pipe: atlassian/aws-s3-deploy:0.2.1
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: "us-east-1"
              S3_BUCKET: "my-bucket-name"
              LOCAL_PATH: "dist"
              ACL: "public-read"
              CACHE_CONTROL: "max-age=3600"
              EXPIRES: "2018-10-01T00:00:00+00:00"
              DELETE_FLAG: "true"
              EXTRA_ARGS: "--follow-symlinks"

Learn more about the AWS S3 Deploy pipe: https://bitbucket.org/atlassian/aws-s3-deploy/
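
The pipe's variables map closely onto aws s3 sync options, so if you ever need to reproduce the upload without the pipe (for example, to debug a sync), an equivalent step looks roughly like the following. This is a sketch under the same assumptions as the example above (the my-bucket-name bucket, us-east-1, and the dist artifact), using the atlassian/pipelines-awscli image:

    - step:
        name: "Sync to S3 with the AWS CLI"
        image: atlassian/pipelines-awscli
        script:
          - export AWS_DEFAULT_REGION=us-east-1
          # mirrors the pipe variables: DELETE_FLAG -> --delete, ACL -> --acl,
          # CACHE_CONTROL -> --cache-control, EXPIRES -> --expires, EXTRA_ARGS -> --follow-symlinks
          - aws s3 sync dist "s3://my-bucket-name" --delete --acl public-read --cache-control "max-age=3600" --expires "2018-10-01T00:00:00+00:00" --follow-symlinks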


You can check your bitbucket-pipelines.yml file with our online validator.
