Deploy to Amazon ECS

This guide will help you configure Bitbucket Pipelines to deploy automatically to an existing service in Amazon Elastic Container Service (ECS).

We're going to package a new version of our example Node.js app in a new Docker image and push that image to Docker Hub. Then we'll register a new task definition in ECS that uses the updated image. Finally, we'll update our ECS service to use the new task definition. 
You can follow this tutorial step-by-step or jump to the summary at the end. 

You may also be interested in this guide: Deploy to Amazon AWS.


Prerequisites

You'll need to have:

  • an existing Docker Hub account and a Docker repository created there so we can push an image to it. Alternatively, you can use your own Docker registry. 
  • an Amazon Elastic Container Service (ECS) cluster set up and running a service.
  • an AWS access key with permissions to execute the ecs:RegisterTaskDefinition and ecs:UpdateService actions.
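As a sketch, the access key's permissions could come from a minimal IAM policy like the one below. The file name and the broad `"Resource": "*"` are illustrative assumptions; in practice you would scope the resource down to your task family and service ARNs.

```shell
# Write a minimal IAM policy covering just the two actions this guide needs.
# (Hypothetical file name; "Resource": "*" is for illustration only.)
cat > ecs-deploy-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ecs:RegisterTaskDefinition",
        "ecs:UpdateService"
      ],
      "Resource": "*"
    }
  ]
}
EOF
```

You can then attach this policy to the IAM user whose access key you configure in Pipelines.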


Build your artifact

In this example we package and deploy a Node.js app. To easily build the Node.js project you can configure your bitbucket-pipelines.yml to look like this: 

bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        script: # Modify the commands below to build your repository.
          # build and test the Node app
          - npm install
          - npm test


You can check your bitbucket-pipelines.yml file with our online validator.

Build and push a Docker Image

Next we'll package our Node.js app as a Docker image. To do this, we have to enable Docker for our repository, build the Docker image, and push it to a registry. In this case we'll be pushing the Docker image to Docker Hub.

First we need a Dockerfile in the root of our repository. We'll use the following: 

Dockerfile
FROM node:boron

# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app

# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install

# Bundle app source
COPY . /usr/src/app

EXPOSE 8080
CMD [ "npm", "start" ]

Next, we want to build and push the Docker image to a private repository in the Docker Hub registry. This means we also have to configure our Docker Hub credentials as environment variables in Bitbucket Pipelines. 

Go to your repository settings in Bitbucket and navigate to Pipelines > Environment variables, then add your Docker Hub username and password as secured variables:


Now, add those credentials to the bitbucket-pipelines.yml file, as shown below. 

bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        name: build and publish docker image.
        services:
          - docker # enable Docker for your repository
        script: # Modify the commands below to build your repository.
          # set DOCKER_HUB_USERNAME and DOCKER_HUB_PASSWORD as environment variables
          # Docker variables
          - export IMAGE_NAME="${DOCKER_HUB_USERNAME}/${BITBUCKET_REPO_SLUG}:${BITBUCKET_BUILD_NUMBER}"
          # build and test the Node app
          - npm install
          - npm test
          # build the Docker image (this will use the Dockerfile in the root of the repo)
          - docker build -t "$IMAGE_NAME" .
          # authenticate with the Docker Hub registry
          - docker login --username "$DOCKER_HUB_USERNAME" --password "$DOCKER_HUB_PASSWORD"
          # push the new Docker image to the Docker registry
          - docker push "$IMAGE_NAME"
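To see what the IMAGE_NAME export above produces, here is the same expansion run locally, with hypothetical stand-ins for the variables Pipelines provides at build time:

```shell
# Hypothetical values for Pipelines' built-in and secured variables
DOCKER_HUB_USERNAME="alice"
BITBUCKET_REPO_SLUG="my-node-app"
BITBUCKET_BUILD_NUMBER="42"

# Same expression as in the pipeline: one unique image tag per build
IMAGE_NAME="${DOCKER_HUB_USERNAME}/${BITBUCKET_REPO_SLUG}:${BITBUCKET_BUILD_NUMBER}"
echo "$IMAGE_NAME"   # → alice/my-node-app:42
```

Because BITBUCKET_BUILD_NUMBER increments on every run, each build pushes a distinctly tagged image rather than overwriting a shared tag.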

Set up your AWS credentials

In the same way, we need to add AWS credentials as environment variables in Bitbucket Pipelines. Go to Pipelines > Environment variables and add your AWS Access Key, Secret Access Key and default region as secured variables:

Install deploy tools

Our deployment script requires the AWS CLI and the jq library to be installed as part of your build. In this example we use an image that already has them installed, but if you use a different Docker image, or create your own, you will need to ensure these packages are installed.

bitbucket-pipelines.yml snippet
          - sudo apt-get update && sudo apt-get install -y jq
          - curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
          - unzip awscli-bundle.zip
          - ./awscli-bundle/install -b ~/bin/aws
          - export PATH=~/bin:$PATH

Register an ECS task definition

Next we can register an ECS task definition that references our newly pushed Docker image. When we register the task definition with our ECS cluster, we get back the version. We'll store this in an environment variable so we can reference it later when we update the ECS service. 

bitbucket-pipelines.yml snippet
          - export ECS_TASK_NAME="${BITBUCKET_REPO_SLUG}"
          # register the ECS task definition and capture the version
          - export TASK_VERSION=$(aws ecs register-task-definition
            --family "${ECS_TASK_NAME}"
            --container-definitions
            '[{"name":"app","image":"'"${IMAGE_NAME}"'","memory":1024}]'
            | jq --raw-output '.taskDefinition.revision')
          - echo "Registered ECS Task Definition:" "${TASK_VERSION}"
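The jq filter in that command pulls the revision number out of the JSON that register-task-definition returns. A quick local illustration of the same filter, against an abridged response with made-up values:

```shell
# Abridged shape of the register-task-definition response (hypothetical values)
RESPONSE='{"taskDefinition":{"family":"my-node-app","revision":7,"status":"ACTIVE"}}'

# Same filter as in the pipeline step above
TASK_VERSION=$(echo "$RESPONSE" | jq --raw-output '.taskDefinition.revision')
echo "$TASK_VERSION"   # → 7
```

The --raw-output flag strips the JSON quoting so the value can be used directly in the family:revision reference later.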

Update the ECS Service

Now we can update our existing ECS service to use the Task Definition that we just registered. 

bitbucket-pipelines.yml snippet
          # ECS variables
          - export ECS_CLUSTER_NAME="${BITBUCKET_REPO_OWNER}"
          - export ECS_SERVICE_NAME="${BITBUCKET_REPO_SLUG}"
          # Update the ECS service to use the updated Task version
          - aws ecs update-service --cluster "${ECS_CLUSTER_NAME}" --service "${ECS_SERVICE_NAME}" --task-definition "${ECS_TASK_NAME}:${TASK_VERSION}"
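update-service identifies the exact task definition with a `family:revision` string. With the variables above it expands like this (hypothetical values):

```shell
# Hypothetical task family and the revision captured in the previous step
ECS_TASK_NAME="my-node-app"
TASK_VERSION="7"

# The --task-definition argument passed to `aws ecs update-service`
echo "${ECS_TASK_NAME}:${TASK_VERSION}"   # → my-node-app:7
```

Pinning the revision explicitly ensures the service rolls over to the task definition registered in this build, not simply the latest one in the family.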

Summary

To get started quickly with deploying to ECS, make sure you have your AWS and Docker Hub credentials configured as repository variables: 

Then create your bitbucket-pipelines.yml file in the root of your repository, adjusting the following variables to your own values:

  • IMAGE_NAME (the name of the Docker image)
  • ECS_TASK_NAME (the name of the ECS task)
  • ECS_CLUSTER_NAME (the name of the ECS cluster you are deploying to)
  • ECS_SERVICE_NAME (the name of your ECS service)
  • AWS_DEFAULT_REGION (the region your ECS cluster is running in)
bitbucket-pipelines.yml
# enable Docker for your repository
options:
  docker: true

pipelines:
  default:
    - step:
        name: build and publish docker image.
        services:
          - docker # enable Docker for your repository
        script: # Modify the commands below to build your repository.
          # set DOCKER_HUB_USERNAME and DOCKER_HUB_PASSWORD as environment variables
          # Docker variables
          - export IMAGE_NAME="${DOCKER_HUB_USERNAME}/${BITBUCKET_REPO_SLUG}:${BITBUCKET_BUILD_NUMBER}"
          # build and test the Node app
          - npm install
          - npm test
          # build the Docker image (this will use the Dockerfile in the root of the repo)
          - docker build -t "$IMAGE_NAME" .
          # authenticate with the Docker Hub registry
          - docker login --username "$DOCKER_HUB_USERNAME" --password "$DOCKER_HUB_PASSWORD"
          # push the new Docker image to the Docker registry
          - docker push "$IMAGE_NAME"
    - step:
        name: deploy-to-ecs
        image: atlassian/pipelines-awscli:latest
        deployment: test
        script:
          # AWS authentication variables
          # set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as environment variables
          # set AWS_SECURITY_TOKEN and AWS_SESSION_TOKEN as environment variables if using temporary credentials via AWS STS
          - export AWS_DEFAULT_REGION=${AWS_DEFAULT_REGION:-"us-east-1"} #default is us-east-1
          # Dockerhub Authentication variables
          # set DOCKER_HUB_USERNAME and DOCKER_HUB_PASSWORD as environment variables
          # Docker variables
          - export IMAGE_NAME="${DOCKER_HUB_USERNAME}/${BITBUCKET_REPO_SLUG}:${BITBUCKET_BUILD_NUMBER}"
          # ECS variables
          - export ECS_CLUSTER_NAME="${BITBUCKET_REPO_OWNER}"
          - export ECS_SERVICE_NAME="${BITBUCKET_REPO_SLUG}"
          - export ECS_TASK_NAME="${BITBUCKET_REPO_SLUG}"
          # Create ECS cluster, task, service
          - aws ecs list-clusters | grep "${ECS_CLUSTER_NAME}" || aws ecs create-cluster --cluster-name "${ECS_CLUSTER_NAME}"
          # Updating the existing cluster, task, service
          - export TASK_VERSION=$(aws ecs register-task-definition
            --family "${ECS_TASK_NAME}"
            --container-definitions
            '[{"name":"app","image":"'"${IMAGE_NAME}"'","memory":1024}]'
            | jq --raw-output '.taskDefinition.revision')
          - echo "Registered ECS Task Definition:" "${TASK_VERSION}"
          # Create ECS Service
          - aws ecs list-services --cluster "${ECS_CLUSTER_NAME}"  | grep "${ECS_SERVICE_NAME}" || aws ecs create-service --service-name "${ECS_SERVICE_NAME}" --cluster "${ECS_CLUSTER_NAME}" --task-definition "${ECS_TASK_NAME}" --desired-count 1
          - aws ecs update-service --cluster "${ECS_CLUSTER_NAME}" --service "${ECS_SERVICE_NAME}" --task-definition "${ECS_TASK_NAME}:${TASK_VERSION}"
          # Follow https://docs.aws.amazon.com/AmazonECS/latest/developerguide/launch_container_instance.html for instructions
          # needed to provide the compute resources (EC2 instances) that the tasks will run on.
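The `list … | grep … || create …` lines in the step above make it idempotent: the create command runs only when grep finds no existing cluster or service. The control flow can be sketched with the AWS calls stubbed out:

```shell
# Stub standing in for `aws ecs list-clusters` (hypothetical output)
list_clusters() { echo '"clusterArns": ["arn:aws:ecs:us-east-1:123456789012:cluster/alpha"]'; }

ECS_CLUSTER_NAME="beta"
# grep succeeds -> the || branch is skipped; grep fails -> the command after || runs
list_clusters | grep -q "${ECS_CLUSTER_NAME}" || echo "would create cluster ${ECS_CLUSTER_NAME}"
# → would create cluster beta
```

Note that grep matches on a substring of the ARN list, so a cluster name that is a substring of another cluster's name could suppress creation unintentionally.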


Remember, you can check your bitbucket-pipelines.yml file with our online validator.

Last modified on May 18, 2018
