Recently, AWS announced that they’ve added support for invoking AWS Lambda functions from AWS CodePipeline – AWS’ Continuous Delivery service. They also provided some great step-by-step documentation describing how to configure a new stage in CodePipeline to run a Lambda function. In this article, I’ll describe how I codified the provisioning of all of the AWS resources in that documentation using CloudFormation.
[Image: aws_code_pipeline_lambda]
This announcement is really big news, as it opens up a whole realm of possibilities for what can be run from CodePipeline. Now you can run event-driven functions any time you want from your pipelines. With this addition, CodePipeline gains a new Invoke action category alongside the existing Source, Build, Test and Deploy actions.
NOTE: All of the CloudFormation examples in this article are defined in the codepipeline-lambda.json file.

tl;dr

If you’d rather not read the detailed explanation of the resources and code snippets of this solution, just click on the CloudFormation Launch Stack button below to automatically provision the AWS resources described herein. You will be charged for your AWS usage. 
Launch Stack

CloudFormation

I went through the 20+ pages of instructions, which were easy to follow, but as I often do when going through this kind of documentation, I thought about how I’d make it easier for me and others to run it again without copying, pasting and clicking through multiple buttons. In other words, I’m lazy and don’t enjoy repeating the same manual steps over and over, and I figured this is something I (and others) would like to use often in the future. Of course, this led me to writing a CloudFormation template, since I can define everything in code and then type a single command or click a button to reliably and repeatedly provision all the resources necessary to run Invoke actions within a Lambda stage in CodePipeline.
There are six core services that compose this infrastructure architecture. They are CloudFormation, CodePipeline, Lambda, IAM, EC2 and CodeDeploy.
To launch the infrastructure stacks that make up this solution, type the following from the command line. The command will only work if you’ve installed the AWS CLI.
Command for launching CodePipeline Lambda stacks

aws cloudformation create-stack \
--stack-name CodePipelineLambdaStack \
--template-body https://raw.githubusercontent.com/stelligent/cloudformation_templates/master/labs/codepipeline/codepipeline-lambda.json \
--region us-east-1 \
--disable-rollback --capabilities="CAPABILITY_IAM" \
--parameters ParameterKey=KeyName,ParameterValue=YOUREC2KEYPAIRNAME
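
Provisioning takes several minutes. If you want to watch progress from the command line rather than the console, you can poll the stack status with a command like the following (the stack name matches the one used above):

aws cloudformation describe-stacks --stack-name CodePipelineLambdaStack --region us-east-1 --query "Stacks[0].StackStatus" --output text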

EC2

From my CloudFormation template, I launch a single EC2 instance and install the CodeDeploy agent onto it. I used the sample provided by AWS at http://s3.amazonaws.com/aws-codedeploy-us-east-1/templates/latest/CodeDeploy_SampleCF_Template.json and added one small modification: returning the PublicIp of the EC2 instance as a CloudFormation Output after it’s launched. Because of this modification, I created a new template based on AWS’ sample.
CloudFormation JSON to define the nested stack that launches the EC2 instance used by CodeDeploy

    "CodeDeployEC2InstancesStack":{
      "Type":"AWS::CloudFormation::Stack",
      "Properties":{
        "TemplateURL":"https://s3.amazonaws.com/stelligent-public/cloudformation-templates/github/labs/codepipeline/codedeploy-ec2.json",
        "TimeoutInMinutes":"60",
        "Parameters":{
          "TagValue":{
            "Ref":"AWS::StackName"
          },
          "KeyPairName":{
            "Ref":"KeyName"
          }
        }
      }
    },
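
Inside that modified nested template, the change amounts to an additional entry in its Outputs section. A minimal sketch looks something like the following; the logical resource name LinuxEC2Instance is illustrative and depends on how the instance is actually named in the template:

    "Outputs":{
      "PublicIp":{
        "Description":"Public IP of the CodeDeploy EC2 instance",
        "Value":{
          "Fn::GetAtt":[
            "LinuxEC2Instance",
            "PublicIp"
          ]
        }
      }
    }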

When the stack is complete, you’ll see that one EC2 instance has been launched and automatically tagged with the name you entered when naming your CloudFormation stack. This name is used to run CodeDeploy operations on instance(s) with this tag.
[Image: codepipeline_lambda_ec2]

CodeDeploy

AWS CodeDeploy automates code deployments to any instance. Previously, I had automated the steps of the Simple Pipeline Walkthrough, which included provisioning AWS CodeDeploy resources, so I used that CloudFormation template as a starting point. I uploaded the sample Linux app provided by CodePipeline in the walkthrough to Amazon S3 and used S3 as the Source action in the Source stage of my pipeline in CodePipeline. Below is a snippet from codepipeline-lambda.json that defines the CodeDeploy stack. The nested stack defined in the TemplateURL property defines the CodeDeploy application and the deployment group.
CloudFormation JSON to define the nested stack for the CodeDeploy application and deployment group

    "CodeDeploySimpleStack":{
      "Type":"AWS::CloudFormation::Stack",
      "DependsOn":"CodeDeployEC2InstancesStack",
      "Properties":{
        "TemplateURL":"https://s3.amazonaws.com/stelligent-public/cloudformation-templates/github/labs/codepipeline/codedeploy-deployment.json",
        "TimeoutInMinutes":"60",
        "Parameters":{
          "TagValue":{
            "Ref":"AWS::StackName"
          },
          "RoleArn":{
            "Fn::GetAtt":[
              "CodeDeployEC2InstancesStack",
              "Outputs.CodeDeployTrustRoleARN"
            ]
          },
          "Bucket":{
            "Ref":"S3Bucket"
          },
          "Key":{
            "Ref":"S3Key"
          }
        }
      }
    },

The screenshot below shows a CodeDeploy deployment that was generated from the CloudFormation stack launch.
[Image: codepipeline_lambda_codedeploy]
This CodeDeploy provisioning is described in more detail in my article on the topic: Automating AWS CodeDeploy Provisioning in CloudFormation.
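The nested codedeploy-deployment.json template isn’t reproduced here, but the essential point is that its deployment group targets instances by the Name tag applied when the EC2 stack launched. On an AWS::CodeDeploy::DeploymentGroup resource, that kind of filter looks roughly like the sketch below; the Ec2TagFilters property is real CloudFormation, while the surrounding template details are assumed:

    "Ec2TagFilters":[
      {
        "Key":"Name",
        "Value":{
          "Ref":"TagValue"
        },
        "Type":"KEY_AND_VALUE"
      }
    ],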

CodePipeline

I took a CodePipeline example that I’d written in CloudFormation, which defines a simple three-stage pipeline (based on the Simple Pipeline Walkthrough), and added a new stage in the CloudFormation Resource block to invoke the Lambda function. If I were manually adding this stage, I’d go to my specific pipeline in AWS CodePipeline, click Add stage and then add an action to the stage. Below, you see a screenshot of what you’d do if you were manually defining this configuration within an AWS CodePipeline action. This is also what was generated from the CloudFormation stack.
[Image: codepipeline_lambda_stage]

AWS::CodePipeline::Pipeline

At the beginning of the snippet below, you see the use of the AWS::CodePipeline::Pipeline CloudFormation resource type. It has dependencies on the CodeDeploySimpleStack and CodePipelineLambdaTest resources. One of the reasons for this is that an EC2 instance needs to be defined already so that I can get access to the PublicIp that the Lambda function later uses to verify the application is up and running. The other is that I need to set the FunctionName property in the Configuration of the Lambda action in CodePipeline. This function name is generated by the AWS::Lambda::Function resource that I’ll describe later. By using this approach, you don’t need to know the name of the Lambda function when defining the CloudFormation template.
CloudFormation JSON to define the CodePipeline pipeline with a Lambda Invoke action

    "GenericPipeline":{
      "Type":"AWS::CodePipeline::Pipeline",
      "DependsOn":[
        "CodeDeploySimpleStack",
        "CodePipelineLambdaTest"
      ],
      "Properties":{
        "DisableInboundStageTransitions":[
          {
            "Reason":"Demonstration",
            "StageName":"Production"
          }
        ],
        "RoleArn":{
          "Fn::Join":[
            "",
            [
              "arn:aws:iam::",
              {
                "Ref":"AWS::AccountId"
              },
              ":role/AWS-CodePipeline-Service"
            ]
          ]
        },
        "Stages":[
...
          {
            "Name":"LambdaStage",
            "Actions":[
              {
                "InputArtifacts":[
                ],
                "Name":"MyLambdaAction",
                "ActionTypeId":{
                  "Category":"Invoke",
                  "Owner":"AWS",
                  "Version":"1",
                  "Provider":"Lambda"
                },
                "OutputArtifacts":[
                ],
                "Configuration":{
                  "FunctionName":{
                    "Ref":"CodePipelineLambdaTest"
                  },
                  "UserParameters":{
                    "Fn::Join":[
                      "",
                      [
                        "http://",
                        {
                          "Fn::GetAtt":[
                            "CodeDeployEC2InstancesStack",
                            "Outputs.PublicIp"
                          ]
                        }
                      ]
                    ]
                  }
                },
                "RunOrder":1
              }
            ]
          },...

Lambda

AWS Lambda lets you run event-based functions without provisioning or managing servers. That said, there’s still a decent amount of configuration you’ll need to define to run your Lambda functions. In the example provided by AWS, the Lambda function tests whether it can access a website without receiving an error. If it succeeds, the CodePipeline action and stage succeed, turn green, and the pipeline automatically transitions to the next stage or completes. If it fails, that pipeline instance fails, turns red, and ceases any further actions from occurring. It’s a very typical test you’d run to be sure your application was successfully deployed. In AWS’ example, you manually enter the URL for the application. Since this requires manual intervention, I needed to figure out a way to get this URL dynamically. I did this by setting the PublicIp of the EC2 instance that was launched earlier in the stack as an Output of the nested stack, and then passing that PublicIp into the UserParameters property of the Invoke action within the Lambda stage that I defined in CloudFormation for my CodePipeline pipeline.
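To see how that UserParameters value reaches the function, it helps to look at the shape of the event CodePipeline passes to Lambda. Below is a trimmed sketch of that payload; the job ID, function name and URL are placeholders, and the real event carries additional fields, but these are the parts the test function shown later reads:

{
    "CodePipeline.job": {
        "id": "11111111-abcd-1111-abcd-111111abcdef",
        "data": {
            "actionConfiguration": {
                "configuration": {
                    "FunctionName": "CodePipelineLambdaStack-CodePipelineLambdaTest-EXAMPLE",
                    "UserParameters": "http://54.0.0.1"
                }
            }
        }
    }
}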
Once the stack has created the function, you’ll be able to go to the list of Lambda functions in your AWS Console and see the function that it generated.
[Image: codepipeline_lambda_function]
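
If you’d rather not hunt through the console, you can ask CloudFormation for the physical name it generated for the CodePipelineLambdaTest resource with a command like this:

aws cloudformation describe-stack-resources --stack-name CodePipelineLambdaStack --logical-resource-id CodePipelineLambdaTest --region us-east-1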

AWS::IAM::Role

In the CloudFormation code snippet you see below, I’m defining the IAM role that the Lambda function assumes when it executes; note the lambda.amazonaws.com principal in the trust policy.
CloudFormation JSON to define IAM Role for Lambda function execution

    "CodePipelineLambdaRole":{
      "Type":"AWS::IAM::Role",
      "Properties":{
        "AssumeRolePolicyDocument":{
          "Version":"2012-10-17",
          "Statement":[
            {
              "Effect":"Allow",
              "Principal":{
                "Service":[
                  "lambda.amazonaws.com"
                ]
              },
              "Action":[
                "sts:AssumeRole"
              ]
            }
          ]
        },
        "Path":"/"
      }
    },

AWS::IAM::Policy

The code snippet below depends on the creation of the IAM role shown in the example above. The IAM policy attached to that role provides access to CloudWatch Logs and to the CodePipeline job results so that the function can signal success or failure back to the CodePipeline action I defined earlier.
CloudFormation JSON to define IAM Policy for IAM Role for Lambda function execution

    "LambdaCodePipelineExecutionPolicy":{
      "DependsOn":[
        "CodePipelineLambdaRole"
      ],
      "Type":"AWS::IAM::Policy",
      "Properties":{
        "PolicyName":"LambdaRolePolicy",
        "Roles":[
          {
            "Ref":"CodePipelineLambdaRole"
          }
        ],
        "PolicyDocument":{
          "Version":"2012-10-17",
          "Statement":[
            {
              "Effect":"Allow",
              "Action":[
                "logs:*"
              ],
              "Resource":[
                "arn:aws:logs:*:*:*"
              ]
            },
            {
              "Effect":"Allow",
              "Action":[
                "codepipeline:PutJobSuccessResult",
                "codepipeline:PutJobFailureResult"
              ],
              "Resource":[
                "*"
              ]
            }
          ]
        }
      }
    },

AWS::Lambda::Function

In the code snippet below, you see how I’m defining the Lambda function in CloudFormation. There are several things to point out here. I uploaded some JavaScript (Node.js) code to S3 as Archive.zip, into the bucket specified by the S3Bucket parameter that I set when I launched the CloudFormation stack. This bucket needs to have S3 Versioning enabled, and the .js file that Lambda runs must sit at the root of Archive.zip. Keep in mind that I can call the .zip file whatever I want, but once I name and upload the file, my CloudFormation template needs to refer to that exact name.
Also, you see that I’ve defined a Handler named validateurl.handler. This means that the JavaScript file inside Archive.zip must be named validateurl.js and must export a function named handler. If I want to use a different name, I must change both the JavaScript filename and the CloudFormation template that references it.
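If you’re packaging and uploading the function code yourself, the manual steps look something like the following; the bucket name is a placeholder for whatever you pass as the S3Bucket parameter:

# Enable versioning on the bucket that will hold the Lambda code
aws s3api put-bucket-versioning --bucket YOUR-LAMBDA-CODE-BUCKET --versioning-configuration Status=Enabled
# Put validateurl.js at the root of Archive.zip (-j strips directory paths)
zip -j Archive.zip validateurl.js
# Upload the archive referenced by the S3Key in the template
aws s3 cp Archive.zip s3://YOUR-LAMBDA-CODE-BUCKET/Archive.zip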
CloudFormation JSON to define Lambda function execution

    "CodePipelineLambdaTest":{
      "Type":"AWS::Lambda::Function",
      "DependsOn":[
        "CodePipelineLambdaRole",
        "LambdaCodePipelineExecutionPolicy"
      ],
      "Properties":{
        "Code":{
          "S3Bucket":{
            "Ref":"S3Bucket"
          },
          "S3Key":"Archive.zip"
        },
        "Role":{
          "Fn::GetAtt":[
            "CodePipelineLambdaRole",
            "Arn"
          ]
        },
        "Description":"Validate a website URL",
        "Timeout":20,
        "Handler":"validateurl.handler",
        "Runtime":"nodejs",
        "MemorySize":128
      }
    },

Lambda Test Function

With all of this configuration just to get something to run, it’s easy to overlook that we’re actually executing something useful and not just configuring the supporting infrastructure. The snippet below is the actual test that runs as part of the Lambda action in the Lambda stage I defined in the CloudFormation template for CodePipeline. This code is taken directly from the Integrate AWS Lambda Functions into Pipelines in AWS CodePipeline instructions from AWS. The JavaScript verifies that it can access the supplied website URL of the deployed application. If it fails, it signals CodePipeline to cease any further actions in the pipeline.
JavaScript to test access to a website

var assert = require('assert');
var AWS = require('aws-sdk');
var http = require('http');
exports.handler = function(event, context) {
    var codepipeline = new AWS.CodePipeline();
    // Retrieve the Job ID from the Lambda action
    var jobId = event["CodePipeline.job"].id;
    // Retrieve the value of UserParameters from the Lambda action configuration in AWS CodePipeline, in this case a URL which will be
    // health checked by this function.
    var url = event["CodePipeline.job"].data.actionConfiguration.configuration.UserParameters;
    // Notify AWS CodePipeline of a successful job
    var putJobSuccess = function(message) {
        var params = {
            jobId: jobId
        };
        codepipeline.putJobSuccessResult(params, function(err, data) {
            if(err) {
                context.fail(err);
            } else {
                context.succeed(message);
            }
        });
    };
    // Notify AWS CodePipeline of a failed job
    var putJobFailure = function(message) {
        var params = {
            jobId: jobId,
            failureDetails: {
                message: JSON.stringify(message),
                type: 'JobFailed',
                externalExecutionId: context.invokeid
            }
        };
        codepipeline.putJobFailureResult(params, function(err, data) {
            context.fail(message);
        });
    };
    // Validate the URL passed in UserParameters
    if(!url || url.indexOf('http://') === -1) {
        putJobFailure('The UserParameters field must contain a valid URL address to test, including http:// or https://');
        return;
    }
    // Helper function to make a HTTP GET request to the page.
    // The helper will test the response and succeed or fail the job accordingly
    var getPage = function(url, callback) {
        var pageObject = {
            body: '',
            statusCode: 0,
            contains: function(search) {
                return this.body.indexOf(search) > -1;
            }
        };
        http.get(url, function(response) {
            pageObject.body = '';
            pageObject.statusCode = response.statusCode;
            response.on('data', function (chunk) {
                pageObject.body += chunk;
            });
            response.on('end', function () {
                callback(pageObject);
            });
            response.resume();
        }).on('error', function(error) {
            // Fail the job if our request failed
            putJobFailure(error);
        });
    };
    getPage(url, function(returnedPage) {
        try {
            // Check if the HTTP response has a 200 status
            assert(returnedPage.statusCode === 200);
            // Check if the page contains the text "Congratulations"
            // You can change this to check for different text, or add other tests as required
            assert(returnedPage.contains('Congratulations'));
            // Succeed the job
            putJobSuccess("Tests passed.");
        } catch (ex) {
            // If any of the assertions failed then fail the job
            putJobFailure(ex);
        }
    });
};

Post-Commit Git Hook for Archiving and Uploading to S3

I’m in the process of figuring out how to add a post-commit hook that takes files committed to a specific directory in a Git repository, zips up the necessary artifacts and uploads them to a pre-defined location in S3, so that I can remove this manual activity as well. A rough sketch of what that might look like is below.
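One possible starting point, assuming the Lambda code lives in a lambda/ directory of the repository and using a placeholder bucket name, is a post-commit hook along these lines:

#!/bin/bash
# .git/hooks/post-commit (sketch): package and upload the Lambda code after each commit
set -e
# Only act when the last commit touched files under lambda/ (an assumed directory layout)
if git diff-tree --no-commit-id --name-only -r HEAD | grep -q '^lambda/'; then
    zip -j Archive.zip lambda/validateurl.js
    aws s3 cp Archive.zip s3://YOUR-LAMBDA-CODE-BUCKET/Archive.zip
fi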

Summary

By adding the ability to invoke Lambda functions directly from CodePipeline, AWS has opened up a whole new world of what can be orchestrated into our software delivery processes in AWS. You learned how to automate the provisioning of not just the Lambda configuration, but also the dependent AWS resources, including your pipelines in AWS CodePipeline. If you have any questions, reach out to us on Twitter @stelligent or @paulduvall.

Resources