Deploying a Lambda function update to AWS

This guide walks you through updating an existing Lambda function in AWS using Bitbucket Pipelines pipes. It also shows you how to configure aliases for your Lambda functions and use them to promote a newly published version through conceptual test, staging, and production environments.

There is also a repo containing all the examples used in this guide.

Prerequisites

You need to have:

  • Access to the AWS console.

  • An existing Lambda function.

  • An IAM user with sufficient permissions to update the Lambda function.

AWS has tutorials for setting up an IAM user and for creating a Lambda function using the AWS console, if you haven't done these already.

We recommend attaching the AWSLambdaFullAccess policy to the IAM user, which grants full permissions to work with Lambda functions.

Add your AWS credentials to Bitbucket Pipelines

You'll need to add three variables to Bitbucket Pipelines containing the credentials of the IAM user that will be used to update the Lambda function:

  • AWS_ACCESS_KEY_ID: the IAM user's AWS access key ID.

  • AWS_SECRET_ACCESS_KEY: the IAM user's AWS secret access key. Make sure that you save it as a secured variable.

  • AWS_DEFAULT_REGION: the AWS region your Lambda function is in.

Basic example: Deploy a JavaScript function to Lambda

To update a Lambda function, two steps are required:

  • building the zipped-up function code

  • deploying the zipped code to AWS.


In this example, we will build a simple Node.js-based Lambda function.

  1. Create a file, index.js, with the following content:

    index.js
    exports.handler = async (event) => {
        const response = {
            statusCode: 200,
            body: JSON.stringify('Hello world')
        };
        return response;
    };
  2. The bitbucket-pipelines.yml file has two steps:

    1. build and zip up the Lambda code

    2. push the updated code to AWS

    In the example below, replace the FUNCTION_NAME value with the name of your function.

bitbucket-pipelines.yml
- step:
    name: Build and package
    script:
      - apt-get update && apt-get install -y zip
      - zip code.zip index.js
    artifacts:
      - code.zip
- step:
    name: Update Lambda code
    script:
      - pipe: atlassian/aws-lambda-deploy:0.2.1
        variables:
          AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
          AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
          AWS_DEFAULT_REGION: ${AWS_DEFAULT_REGION}
          FUNCTION_NAME: 'my-function'
          COMMAND: 'update'
          ZIP_FILE: 'code.zip'

Advanced example: Using Aliases for Multiple Environments

AWS lets you associate an alias with a particular version of a Lambda function. When each alias is named after a Bitbucket Pipelines deployment environment, you can promote versions of your functions through test, staging, and production environments.
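
Conceptually, an alias is just a named pointer to a published version number, and promotion re-points the alias rather than redeploying any code. The plain-JavaScript model below illustrates the idea; the alias names and version numbers are made up for illustration:

```javascript
// Hypothetical in-memory model of Lambda aliases: alias name -> version.
const aliases = { test: '3', staging: '2', production: '1' };

// Promoting a build means re-pointing an alias at a newer published
// version; the function code itself is not touched.
function promote(aliases, alias, version) {
    return { ...aliases, [alias]: version };
}

// After publishing version 4, move it through the environments one by one.
let state = promote(aliases, 'test', '4');
state = promote(state, 'staging', '4');
console.log(state); // { test: '4', staging: '4', production: '1' }
```

This is why a rollback with aliases is cheap: pointing 'production' back at an earlier version number undoes the promotion without rebuilding or re-uploading anything.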


Following on from the previous example, we've combined the first two steps, and added steps that promote the function through test, staging, and production environments.

pipelines:
  default:
    - step:
        # Build and package the Lambda function.
        name: Build and package
        script:
          - apt-get update && apt-get install -y zip
          - zip code.zip index.js

          # Upload the Lambda - make the version number
          # available to subsequent steps via artifacts.
          - pipe: atlassian/aws-lambda-deploy:0.2.1
            variables:
              AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
              AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
              AWS_DEFAULT_REGION: ${AWS_DEFAULT_REGION}
              FUNCTION_NAME: 'my-function'
              COMMAND: 'update'
              ZIP_FILE: 'code.zip'

        # The pipe exports the newly published 
        # Lambda version to a file.
        artifacts:
          - pipe.meta.env

    # You can optionally use AWS Lambda aliases 
    # to map the newly published Lambda
    # function version to conceptual environments.
    - step:
        name: Deploy to Test
        deployment: test
        script:
        # Read the 'function_version' from 
        # the update pipe into environment variables.
        - source pipe.meta.env
        # Point the test alias to the function.
        - pipe: atlassian/aws-lambda-deploy:0.2.1
          variables:
            AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
            AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
            AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
            FUNCTION_NAME: 'my-function'
            COMMAND: 'alias'
            ALIAS: 'test'
            VERSION: '${function_version}'

    - step:
        name: Deploy to Staging
        deployment: staging
        script:
        # Read the 'function_version' from 
        # the update pipe into environment variables.
        - source pipe.meta.env
        # Point the 'staging' alias to the function.
        - pipe: atlassian/aws-lambda-deploy:0.2.1
          variables:
            AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
            AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
            AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
            FUNCTION_NAME: 'my-function'
            COMMAND: 'alias'
            ALIAS: 'staging'
            VERSION: '${function_version}'

    - step:
        name: Deploy to Production
        deployment: production
        script:
        # Read the 'function_version' from
        # the update pipe into environment variables.
        - source pipe.meta.env
        # Point the 'production' alias to the function.
        - pipe: atlassian/aws-lambda-deploy:0.2.1
          variables:
            AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
            AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
            AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
            FUNCTION_NAME: 'my-function'
            COMMAND: 'alias'
            ALIAS: 'production'
            VERSION: '${function_version}'
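
Each deployment step above runs `source pipe.meta.env` to load the `function_version` that the update pipe exported. In shell, `source` simply evaluates the file's KEY=value lines into the environment; the sketch below is a rough JavaScript equivalent of that parsing (the file name and `function_version` key come from the pipeline above, while the parser itself and the sample version number are illustrative):

```javascript
// Parse KEY=value lines, like those in pipe.meta.env, into an object -
// roughly what `source pipe.meta.env` does for the shell environment.
function parseEnv(text) {
    const env = {};
    for (const line of text.split('\n')) {
        const match = line.match(/^\s*([A-Za-z_][A-Za-z0-9_]*)=(.*)$/);
        if (match) {
            env[match[1]] = match[2];
        }
    }
    return env;
}

// Example content the update pipe might write (version number is made up).
const env = parseEnv('function_version=4');
console.log(env.function_version); // 4
```

Because `pipe.meta.env` is declared as an artifact in the build step, the later deployment steps can read the same version number and point each alias at exactly the build that was uploaded.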
Last modified on Feb 28, 2019
