In 2013, Gene Kim, Kevin Behr, and George Spafford published The Phoenix Project, a book that applies the manufacturing-agility concepts of Eliyahu Goldratt’s The Goal to IT. As the story elucidates, a new approach to IT is clearly needed, and many organizations are embracing that change through the DevOps methodology. However, DevOps can be a very broad term, making it difficult to know where to begin. As a result, we have narrowed the DevOps model down to something actionable, defining it as a methodology that streamlines delivery across four pillars: infrastructure, code, server configurations, and security rules.
We learn from both The Phoenix Project and The Goal that working in smaller batches is better: it reduces risk and saves the time of going back and redoing work when an error occurs. Code, for example, should be pushed in smaller batches, as should changes to infrastructure. The downside to this approach is the overhead that the extra movement of smaller batches creates, and the added room for error as more and more things move around the environment. To reduce the overhead of frequent pushes, the Flux7 DevOps team believes that automation plays a meaningful role. Indeed, automation makes working in smaller batches cheaper, faster, and more reliable, and it is what enables us to streamline delivery.
Using our working definition, DevOps adoption will streamline the delivery of four primary components:
- Infrastructure – Everything below and up to the virtual machine.
- Code – The actual code being developed by developers in-house.
- Server Configurations – Software prerequisite(s) that go on the server before code can run.
- Security Rules – Organization-specific security constraints such as access control.
Each of these four pillars has a streamlined delivery mechanism so that we can release smaller batches with agility while delivering technology faster and more efficiently.
Tooling to Streamline: 4Cs
At Flux7, we call the tooling that we use to streamline these deliveries the four Cs:
- Cloud – What we really like about the cloud is its support for summoning different components through automation, which streamlining delivery requires. Creating new virtual machines, routers, load balancers, and so on demands a very streamlined interface, and the cloud provides one.
AWS is our favorite cloud provider because it is the most automated, allowing nearly all of its components to be provisioned and configured entirely through code, with no manual steps. When it comes to streamlining these components, we rely on AWS first for a mechanism to provision infrastructure without manual intervention, and second on AWS and HashiCorp tools (such as AWS CloudFormation, Amazon EC2 Systems Manager, and Terraform) to automate that provisioning.
- Configuration Management – The concept here is to provision the software that an application requires. Configuration management helps with the delivery of server configuration components. We most often use Ansible and Chef to streamline delivery of configuration management; Kubernetes and Amazon ECS are also tools of choice.
- Containers – With containers, we also provision software, but we encapsulate the application and all of its software dependencies within a single container. This approach helps with the delivery of both code and server configuration components. Docker and Nomad are our tools of choice for this pillar.
- CI/CD – Continuous integration and delivery is a concept that can be applied across all components, though it is most typically applied to code, where a streamlined, automated process moves a piece of code from a developer’s laptop to production. When infrastructure is defined as code (thanks to tools like AWS CloudFormation and Terraform) and server configuration is defined as code (via tools like Ansible), we can build CI/CD pipelines for code, server configuration, and infrastructure alike.
When security is defined as code, we can even have continuous integration and delivery of security rules. Within the CI/CD arena, we use AWS Code* services, Jenkins, Atlassian tools, and Packer, as they all help build workflows and orchestration layers; as a result, they are central to streamlining DevOps.
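To make the cloud pillar concrete, here is a minimal sketch of infrastructure defined as code: a hypothetical AWS CloudFormation template that provisions a single EC2 instance with no manual steps. The AMI ID, instance type, and tag values are placeholders for illustration.

```yaml
# Hypothetical CloudFormation template: one EC2 instance, fully
# described in code. ImageId is a placeholder and must be replaced
# with a real AMI for your region.
AWSTemplateFormatVersion: '2010-09-09'
Description: Example of infrastructure delivered as code
Resources:
  WebServer:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-0123456789abcdef0   # placeholder AMI
      InstanceType: t2.micro
      Tags:
        - Key: Name
          Value: example-web-server
Outputs:
  InstanceId:
    Value: !Ref WebServer
```

A template like this can be rolled out repeatedly and in small batches with a single command, for example `aws cloudformation deploy --template-file template.yml --stack-name example-stack`.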
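For the configuration-management pillar, a minimal Ansible playbook sketch shows how server prerequisites can be delivered as code before application code runs. The host group and package choice here are illustrative assumptions, not a prescribed setup.

```yaml
# Hypothetical Ansible playbook: installs and starts the software a
# web application depends on. "webservers" and "nginx" are
# illustrative; substitute your own inventory group and packages.
- name: Configure application servers
  hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      apt:
        name: nginx
        state: present
        update_cache: true
    - name: Ensure nginx is running and enabled
      service:
        name: nginx
        state: started
        enabled: true
```

Because the playbook is idempotent, it can be re-run on every small batch of changes without redoing work by hand.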
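For the containers pillar, a minimal Dockerfile sketch shows how code and its server configuration can be encapsulated together in one image. The base image, file names, and entry point are assumptions for illustration.

```dockerfile
# Hypothetical Dockerfile: bundles a Python application and its
# dependencies into a single container image. File names such as
# requirements.txt and app.py are illustrative.
FROM python:3.6-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```

Building with `docker build -t example-app .` produces an artifact that carries both pillars, code and server configuration, through the pipeline as one unit.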
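Finally, for the CI/CD pillar, a minimal declarative Jenkinsfile sketch shows a pipeline that moves code from commit to production in automated stages. The stage commands, file names, and stack name are hypothetical placeholders.

```groovy
// Hypothetical declarative Jenkins pipeline: build, test, then
// deliver infrastructure and server configuration as code alongside
// the application. All shell commands are illustrative.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make build' }
        }
        stage('Test') {
            steps { sh 'make test' }
        }
        stage('Deploy') {
            steps {
                // Infrastructure as code (placeholder template/stack names)
                sh 'aws cloudformation deploy --template-file infra.yml --stack-name example-stack'
                // Server configuration as code (placeholder playbook)
                sh 'ansible-playbook site.yml'
            }
        }
    }
}
```

Each push triggers the same small, repeatable path to production, which is what makes small batches cheap and reliable.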
With this overview in hand, now you are ready to take the DevOps discussion even deeper. We recommend the following reading to delve further into how you can effectively adopt DevOps in your organization:
- The Flux7 Enterprise DevOps Framework, a model for marrying DevOps process improvement with digital transformation.
- Seven Steps to Successful and Sustainable DevOps Transformation
- The Socratic Approach: Why We Start DevOps Projects With “Why?”
- Why We Teach Our Customers How to Fish
- DevOps Case Studies
Post Date: 07/20/2017