AWS can be a big tease. Last November at re:Invent, they announced AWS CodePipeline, an orchestration engine for builds. As they described it, I got more and more excited, up until the point they said it wouldn’t be released until mid-2015. Well, today might as well be Christmas, because CodePipeline has just been released and I am already figuring out how I can use it all over the place.
AWS CodePipeline is part of the Amazon Web Services suite of DevOps tools, and provides orchestration of continuous delivery pipelines in the cloud. It fills the void between where your traditional continuous integration solution ends and your deployments begin (a gap which, if you’ve caught me at a conference or meetup in the past few years, you’ve heard me complaining about). It handles orchestrating every step of your build, test, and deploy process, from source code to server deployments.
CodePipeline uses four core building blocks, called providers, to put together your pipeline. The first is your Source Provider, which allows you to define the location of the source code for your application. At this time, CodePipeline supports two source providers: artifacts zipped up and uploaded to an S3 bucket, and repositories hosted on GitHub.
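To give you a feel for what that looks like, here’s a minimal sketch of a GitHub source stage as it would appear in a pipeline definition. The org, repo, branch, and artifact names are all placeholders you’d swap for your own:

```json
{
  "name": "Source",
  "actions": [
    {
      "name": "GitHubSource",
      "actionTypeId": {
        "category": "Source",
        "owner": "ThirdParty",
        "provider": "GitHub",
        "version": "1"
      },
      "configuration": {
        "Owner": "my-github-org",
        "Repo": "my-app",
        "Branch": "master",
        "OAuthToken": "****"
      },
      "outputArtifacts": [
        { "name": "MyAppSource" }
      ],
      "runOrder": 1
    }
  ]
}
```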
Another type of provider is the Deployment Provider. This provider will deploy your application to your servers. CodePipeline supports two AWS deployment providers: AWS CodeDeploy and AWS Elastic Beanstalk. Either will need to be configured independently, and then hooked in as a CodePipeline stage.
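As a rough sketch, here’s what a CodeDeploy stage looks like in the same pipeline definition. The application and deployment group names are hypothetical and would need to already exist in CodeDeploy:

```json
{
  "name": "Deploy",
  "actions": [
    {
      "name": "DeployToProd",
      "actionTypeId": {
        "category": "Deploy",
        "owner": "AWS",
        "provider": "CodeDeploy",
        "version": "1"
      },
      "configuration": {
        "ApplicationName": "MyApp",
        "DeploymentGroupName": "MyApp-Prod"
      },
      "inputArtifacts": [
        { "name": "MyAppBuild" }
      ],
      "runOrder": 1
    }
  ]
}
```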
The final types of provider are the most interesting: the Build and Test Providers. With these providers you can execute the logic for building and testing your application. As of right now, CodePipeline supports these providers in a couple of different ways. For build providers, there are Jenkins and CloudBees plugins. Once the plugin is installed on your Jenkins server, you can create jobs that will poll CodePipeline and execute actions that are part of the pipeline.
If you don’t currently have a Jenkins server, and don’t particularly feel like setting one up, CodePipeline also supports custom actions so you can plug into whatever service you are working with.
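To show how a Jenkins job wires into the pipeline, here’s a sketch of a build action. The provider name (“MyJenkins” here) and project name are whatever you entered when configuring the plugin on your Jenkins server, so treat every value below as a placeholder:

```json
{
  "name": "Build",
  "actions": [
    {
      "name": "JenkinsBuild",
      "actionTypeId": {
        "category": "Build",
        "owner": "Custom",
        "provider": "MyJenkins",
        "version": "1"
      },
      "configuration": {
        "ProjectName": "my-app-build"
      },
      "inputArtifacts": [
        { "name": "MyAppSource" }
      ],
      "outputArtifacts": [
        { "name": "MyAppBuild" }
      ],
      "runOrder": 1
    }
  ]
}
```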
For testing, you can still delegate to Jenkins or a custom provider, but CodePipeline comes with built-in support for the following test frameworks: Apica and BlazeMeter for load testing, Ghost Inspector for UI testing, and Runscope for API monitoring.
To get CodePipeline configured and running, you’ll need to set up a few things. First, you’ll need to install the aforementioned CodePipeline Jenkins Plugin on your Jenkins server if you’re going to be using that. This can be done from your Jenkins Manage Plugins screen, or you can upload the plugin manually following these steps.
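If you prefer the command line to clicking around Manage Plugins, one way to do it is via the Jenkins CLI. This assumes Jenkins is listening on localhost:8080 and that the plugin ID is aws-codepipeline; double-check both against your own setup:

```sh
# Install the CodePipeline plugin from the update center
java -jar jenkins-cli.jar -s http://localhost:8080/ install-plugin aws-codepipeline

# Restart Jenkins once no jobs are running, so the plugin loads
java -jar jenkins-cli.jar -s http://localhost:8080/ safe-restart
```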
You’ll also need an AWS IAM Role configured for CodePipeline to execute AWS actions on your behalf (mostly messing around with S3 buckets). An example of the role you’ll need is available here. When you create your pipeline, you’ll need to reference that role in its configuration.
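If you’re building that role by hand, the key piece is the trust relationship that lets CodePipeline assume it. A minimal trust policy looks like this (you’d still attach a permissions policy granting access to your artifact bucket and the other services in your pipeline):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "codepipeline.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```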
In addition to those, if you want CodePipeline to orchestrate deployments for you, you’ll need to configure your application to be deployed via CodeDeploy or Elastic Beanstalk. Which one you pick depends on the complexity of your app: simpler applications will do very well with Elastic Beanstalk, but applications that require a bit of configuration to get going will be better served by CodeDeploy.
With all those parts in place, setting up your first CodePipeline is as simple as using the wizard to glue all those parts together. Alternatively, CodePipeline is supported by the AWS CLI and the AWS API, and the SDKs have already started rolling out support for it. Now all I have left to do is sit patiently until it’s supported by CloudFormation. 🙂
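For example, if you save a pipeline definition built from stage snippets like the ones above (wrapped in a top-level pipeline object with your role ARN and artifact bucket) to a JSON file, the CLI calls look roughly like this; the file and pipeline names are placeholders:

```sh
# Create the pipeline from a JSON definition file
aws codepipeline create-pipeline --cli-input-json file://pipeline.json

# Check on the state of each stage and action
aws codepipeline get-pipeline-state --name MyAppPipeline
```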
Stelligent is an expert in deploying continuous delivery pipelines in AWS. If you are looking for help moving your application into CodePipeline, or another type of continuous delivery pipeline, we can definitely help you with that. Reach out to us here.