The Impact of Containerization on DevOps



Ever since its formal introduction in 2008, DevOps has helped organizations shorten the distance from development to production. Software teams are delivering changes faster without sacrificing quality, stability, or availability. To do this, developers and IT operations staff have started working closely together to create a faster, more efficient pipeline to production.

Around the same time, containers started transforming the way we develop and deploy applications. 20% of global organizations today run containerized applications in production, and more than 50% will do so by 2020. But how has the growth of containers impacted DevOps, and where will that lead us in the future?

Thinking Differently About Applications

Not too long ago, applications were deployed as large, bulky, monolithic packages. A single build contained everything needed to run the entire application, which meant that changing even a single function required a completely new build. This had a huge impact on operations, since upgrading an application meant having to stop the application, replace it with the new version, and start it up again. Virtual machines and load balancing services removed some of the pain, but the inefficiency of managing multiple virtual machines and dedicated servers made it difficult to push new features quickly.

Containers allowed us to split up these monolithic applications into discrete and independent units. Like VMs, containers provide a complete environment for software to run independently of the host environment. But unlike VMs, containers are short-lived and immutable. Rather than thinking of containers as lightweight VMs, it's more accurate to think of them as isolated, self-contained processes. For developers, this means breaking down monolithic applications into modular units. Functions in the original monolith are now treated as services to be called upon by other services. Instead of a single massive codebase, you now have multiple smaller codebases specific to each service. This allows for much greater flexibility in building, testing, and deploying code changes.

This idea of breaking up applications into discrete services has its own name: microservices. In a microservice architecture, multiple specialized services work together to deliver a larger application. The kicker is that any one service can be stopped, swapped out, restarted, and upgraded without affecting any of the other services. One developer can fix a bug in one service while another developer adds a feature to another service, and both can occur simultaneously with no perceivable downtime. Containers are the perfect vessel for microservices, as they provide the framework to deploy, manage, and network these individual services.
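As a sketch of this idea, a Docker Compose file can define two independently deployable services (the service names and image tags here are hypothetical), each running in its own container:

```yaml
# docker-compose.yml: two independent services on a shared network
version: "3.8"
services:
  orders:                          # hypothetical order-processing service
    image: example/orders:1.4.2
    ports:
      - "8080:8080"
  payments:                        # hypothetical payment service
    image: example/payments:2.0.1
    ports:
      - "8081:8080"
```

Running `docker compose up -d --no-deps payments` after bumping the payments image tag would swap in a new version of that one service without touching the orders service at all.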

Giving Power to Developers

Containers also empower developers to choose their own tools. In the old days, decisions about an application’s language, dependencies, and libraries had far-reaching effects on both development and operations. Test and production environments would need to have the required software installed, configured, and fully tested before the application could be deployed. With containers, the required software travels with the application itself, giving developers more power to choose how they run their applications.
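To illustrate how the required software travels with the application, here is a minimal Dockerfile (the file names, port, and base image choice are assumptions for this example) that bundles a Python service with its interpreter and libraries:

```dockerfile
# Base image pins the language runtime the developers chose
FROM python:3.11-slim

WORKDIR /app

# Dependencies are installed into the image, not onto the host
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Application code ships alongside its dependencies
COPY . .

EXPOSE 8080
CMD ["python", "app.py"]
```

`docker build -t my-service .` produces an image that runs the same way on a developer's laptop, a test server, or production, with no software to pre-install on the target machine.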

Of course, this doesn’t mean developers can just pick any language or platform for their containers. They still need to consider what the container is being used for, what libraries it requires, and how long it will take to onboard other developers. But with the flexibility and ease of replacing containers, the impact of this decision is much less significant than it would be for a full-scale application.

Streamlining Deployments

Containers don’t just make developers’ lives easier. With containers providing the software needed to run applications, operators can focus on providing a platform for the containers to run on. Orchestration tools like Docker Swarm and Kubernetes have made this process even easier by helping operators manage their container infrastructure as code. Operators can simply declare what the final deployment should look like, and the orchestrator automatically handles deploying, networking, scaling, and replicating the containers.
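For example, a Kubernetes Deployment manifest (the names and image below are hypothetical) declares only the desired end state, and the orchestrator continuously converges the cluster toward it:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3              # desired state: three running copies at all times
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: example/web:1.0.0
          ports:
            - containerPort: 8080
```

After `kubectl apply -f deployment.yaml`, if a container crashes or a node disappears, Kubernetes notices the gap between declared and actual state and starts a replacement automatically.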

Orchestration tools have also become a significant part of continuous integration and continuous deployment (CI/CD). Once a new image has been built, a CI/CD tool like Jenkins can easily call an orchestration tool to deploy the container or replace existing containers with new versions generated from the image. The lightweight and reproducible nature of containers makes them much faster to build, test, and deploy in an automated way than even virtual machines.
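As a sketch of that hand-off, a declarative Jenkinsfile (the stage names, image repository, and deployment name are assumptions) might build and push an image, then ask Kubernetes to roll it out:

```groovy
// Jenkinsfile: build the image, then let the orchestrator replace old containers
pipeline {
    agent any
    stages {
        stage('Build image') {
            steps {
                sh 'docker build -t example/web:${BUILD_NUMBER} .'
                sh 'docker push example/web:${BUILD_NUMBER}'
            }
        }
        stage('Deploy') {
            steps {
                // Updating the image triggers a rolling replacement of containers
                sh 'kubectl set image deployment/web web=example/web:${BUILD_NUMBER}'
            }
        }
    }
}
```

Because `kubectl set image` performs a rolling update, old containers are replaced gradually and the service stays available throughout the deployment.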

When combined, CI/CD and container orchestration mean fast, automated deployments to distributed environments with almost no need for manual input. Developers check in code, CI/CD tools compile the code into a new image and perform automated tests, and orchestration tools seamlessly deploy containers based on the new image. Except for environment-specific details such as database connection strings and service credentials, the code running in production is identical to the code running on the developer’s machine. This is how companies like Etsy manage to deploy to production up to 50 times per day.

Security From the Start

Despite the recent publicity, volume, and scale of data breaches in IT systems, security is often treated as an afterthought. Only 18% of IT professionals treat security as a top priority for development, and 36% of IT organizations don’t budget enough for security. Surprisingly though, most feel security would be beneficial to development: 43% believe fixing flaws during development is easier than patching in changes later, and only 17% believe that adding security to the development process will slow down DevOps.

Combining security and DevOps is what led to DevSecOps. The goal of DevSecOps is to shift security left in the software development lifecycle from the deployment phase to the development phase, making it a priority from the start. Of course, this means getting developers on board with developing for security in addition to functionality and speed. In 2017, only 10 to 15% of organizations using DevOps managed to accomplish this.

“Security is seen as the traditional firewall to innovation and often has a negative connotation. With shifting security left, it’s about helping build stuff that’s innovative and also secure.”
– Daniel Cuthbert, Global Head of Cyber Security Research at Banco Santander

As DevSecOps becomes more ingrained in development culture, developers will have no choice but to embrace security. The immutable nature of containers makes them impractical to patch after deployment, and while operations should continue monitoring for vulnerabilities, the responsibility for fixing these vulnerabilities will still fall to developers. The good news is that with containers, vulnerabilities can be fixed, built, tested, and re-deployed up to 50% faster than with traditional application development methods.

Moving Forward

Although DevOps and containerization are still fairly new concepts, they’ve already sparked a revolution in software development. As the tools and technologies continue to mature, we’ll start to see more companies using containers to build, change, test, and push new software faster.

