Amazon Pipeline Starter Kit for Continuous Delivery

  • August 18, 2016

As DevOps consultants at Flux7, we believe that Continuous Delivery (CD) is a key tenet of successful DevOps. And as heavy users of Amazon Web Services (AWS), we have a keen interest in any tools or features that streamline CD for our clients on AWS. For this reason, we were pretty excited to dive into the Amazon Pipeline Starter Kit. You may already be familiar with two services that Amazon has traditionally offered to help facilitate CD: AWS CodePipeline and AWS CodeDeploy.

The Pipeline Starter Kit takes advantage of both of these services for people who don’t want to set up the resources themselves. The starter kit includes an AWS CloudFormation template that creates the pipeline and all of its resources. (The template uses the US East region.) For those of you unfamiliar with these two services, or who could use a refresher, Amazon defines them as follows:
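Launching the kit is a single CloudFormation call. A minimal sketch, assuming the kit's template has been saved locally as `pipeline-starter-kit.template` and that `KeyName` is one of its input parameters (the stack name, file name, and parameter shown here are illustrative, not the kit's exact values):

```shell
# Create the starter-kit stack in us-east-1, the region the template targets.
# --capabilities CAPABILITY_IAM is required because the template creates IAM roles.
aws cloudformation create-stack \
  --stack-name pipeline-starter-kit \
  --region us-east-1 \
  --template-body file://pipeline-starter-kit.template \
  --capabilities CAPABILITY_IAM \
  --parameters ParameterKey=KeyName,ParameterValue=my-ec2-keypair

# Poll until the status reaches CREATE_COMPLETE before exploring the pipeline.
aws cloudformation describe-stacks \
  --stack-name pipeline-starter-kit \
  --region us-east-1 \
  --query 'Stacks[0].StackStatus'
```

Deleting the stack when you are done (`aws cloudformation delete-stack --stack-name pipeline-starter-kit --region us-east-1`) removes the billable resources the kit creates.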

  • AWS CodePipeline is a continuous delivery service for fast and reliable application updates. CodePipeline builds, tests, and deploys your code every time there is a code change, based on the release process models you define. This enables you to rapidly and reliably deliver features and updates. With AWS CodePipeline, you only pay for what you use. There are no upfront fees or long-term commitments.

  • AWS CodeDeploy is a service that automates code deployments to any instance, including Amazon EC2 instances and instances running on-premises. AWS CodeDeploy makes it easier for you to rapidly release new features, helps you avoid downtime during application deployment, and handles the complexity of updating your applications. You can use AWS CodeDeploy to automate software deployments, eliminating the need for error-prone manual operations, and the service scales with your infrastructure so you can easily deploy to one instance or thousands.

To get started, we followed the kit's instructions and found the process fairly straightforward for anyone familiar with CodeDeploy and CloudFormation who wants to learn about CodePipeline. We hit a slight hiccup with our Jenkins server, but we recovered quickly, and once the stack was created it was easy to see the pipeline working.

You will find that the Pipeline Kit creates the following components:

  1. A development web server
  2. A production web server
  3. A Jenkins server
  4. An S3 bucket
  5. A CodePipeline
  6. A CodeDeploy application and deployment groups for the Dev and Prod servers

Based on the value of an input parameter, Jenkins pulls the code either from GitHub or from an S3 bucket path, then builds it using Maven build steps. Jenkins then uses the AWS CodePipeline plugin to hand the build artifact back to the pipeline. CodePipeline orchestrates the deployment, first to the development server and, once that succeeds, to the production server. The kit performed exactly as expected, based on Amazon’s tutorial and instructions. If you’re trying this yourself for the first time, note that the Pipeline Starter Kit uses services beyond the AWS Free Tier, so running it will incur charges.
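You can see this source → build → dev → prod flow directly in the pipeline's definition. Here is a heavily trimmed sketch of the kind of structure `aws codepipeline get-pipeline --name <pipeline-name>` returns for a pipeline like this one (the pipeline, stage, and provider names are illustrative, not the kit's exact values):

```json
{
  "pipeline": {
    "name": "DemoPipeline",
    "stages": [
      { "name": "Source",
        "actions": [{ "actionTypeId": { "category": "Source", "provider": "S3" } }] },
      { "name": "Build",
        "actions": [{ "actionTypeId": { "category": "Build", "provider": "Jenkins" } }] },
      { "name": "DeployDev",
        "actions": [{ "actionTypeId": { "category": "Deploy", "provider": "CodeDeploy" } }] },
      { "name": "DeployProd",
        "actions": [{ "actionTypeId": { "category": "Deploy", "provider": "CodeDeploy" } }] }
    ]
  }
}
```

Because stages run in order and a failed stage halts the pipeline, the production deployment only ever runs against an artifact that already deployed cleanly to the development server.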

Given its performance and its use of solid CD services, I believe that the Pipeline Kit could have significant value for organizations building, or looking to improve, their cloud-based DevOps.

Additionally, organizations looking to excel at CD and DevOps should consider the following elements that can bring additional value by helping further streamline and/or automate DevOps and CD:

  • A template integrating CodeCommit with CloudFormation, along the lines of the template the starter kit is built on. This would certainly help streamline the creation of future pipelines, since the template provided by AWS requires a GitHub username and authentication token as input parameters.

    However, if CodeCommit is used and the Jenkins IAM role has the proper permissions, the user need not pass any credentials as parameters; authentication is handled securely through IAM.
  • Cross-account pipelines, in which resources from one AWS account are used by another. When development and production live in separate AWS accounts but a single CodePipeline needs to serve both, a cross-account pipeline lets the pipeline use resources created (and likely managed) by the other account. To achieve this, users need to create an AWS KMS key, add it to the pipeline, and set up account policies and roles that enable cross-account access. Note that source actions cannot use Amazon S3 buckets from other AWS accounts.

    For a deeper look at cross-account access, please see our article that walks through this topic.
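The KMS step in the cross-account setup amounts to adding a statement like the following to the key policy of the customer-managed key that encrypts the pipeline's artifact store. This is a sketch under the assumption that 111111111111 holds the pipeline and 222222222222 is the account that deploys; both IDs are placeholders:

```json
{
  "Sid": "AllowCrossAccountUseOfArtifactKey",
  "Effect": "Allow",
  "Principal": { "AWS": "arn:aws:iam::222222222222:root" },
  "Action": [
    "kms:Decrypt",
    "kms:DescribeKey",
    "kms:Encrypt",
    "kms:GenerateDataKey*",
    "kms:ReEncrypt*"
  ],
  "Resource": "*"
}
```

The second account still needs an IAM role that the pipeline can assume and an S3 bucket policy granting it access to the artifact bucket; the key policy alone only covers encryption and decryption of the artifacts.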

Crawl, walk, run is a typical IT metaphor that applies well here. As AWS consultants who see the benefits of cloud-based DevOps, and more specifically, Continuous Delivery, we appreciate those features that enable organizations to move more quickly through the crawl phase and shorten the time to benefit. Is your organization looking to reap the benefits of CD?

To read more about optimized code delivery, click here to download our paper.

To learn more about cloud-based DevOps and how it can help with your organizational maturity, or to schedule an assessment, please contact us today.
