InfoWorld | Sep 18, 2014
Docker is like a forest fire. This new Linux container technology is igniting everything in its path, and many of us are having trouble keeping up with how quickly it is scorching the earth. Not only is Docker one of the most popular open source projects in history, but it is fundamentally shifting the way people think about building applications.
Many of the ideas behind Docker-based applications are not original, strictly speaking, but Docker brings a fresh slant to these old concepts. As with many cloud development practices, Docker encourages best practices like 12-Factor Applications, which were developed for building PaaS-based apps and are now catching on for Docker-based apps.
What can we learn from the Docker inferno? Let’s look at four areas.
Monolithic cloud application development is dead. It is being replaced by microservices architectures, which decompose large applications, with all the functionality built in, into smaller, purpose-driven services that communicate with each other through common REST APIs.
In the ’90s, a similar concept was called interface/component-based architecture. More recently, SOA (service-oriented architecture) seemed to gather some momentum. Now microservices concepts have become a standard meme in the Docker community, where the trend is to decompose applications into decoupled, minimalist, and specialized containers that are designed to do one thing really well.
The fully encapsulated Docker container enables microservices by creating a highly efficient distribution model for microservices applications. This changes cloud development practices by putting larger-scale architectures like those used at Facebook and Twitter within the reach of smaller development teams.
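As a rough sketch, that decomposition can be expressed with nothing more than plain docker commands. The image names here (myorg/users-api, myorg/orders-api) are hypothetical stand-ins for your own services:

    # Each service runs in its own single-purpose container.
    # A small data store that belongs to the user service alone:
    docker run -d --name users-db mongo

    # The user service, linked to its database and exposing a REST API:
    docker run -d --name users-api --link users-db:db -p 8081:8080 myorg/users-api

    # A second service that calls the first over plain HTTP:
    docker run -d --name orders-api --link users-api:users \
      -e USERS_API_URL=http://users:8080 -p 8082:8080 myorg/orders-api

Each container does one job, and replacing or scaling one service does not require rebuilding the others.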
Even though Puppet, Chef, Salt, and others pioneered the devops movement, those tools are still more popular with the ops teams than they are with developers.
Docker is the first devops tool that's as popular with developers as it is with ops engineers. Why? Developers can work inside the containers, and ops engineers can work outside the containers in parallel.
When development teams adopt Docker, they add a new layer of agility to the software development lifecycle. The big difference is consistency. Docker-based applications run exactly the same on a laptop as they do in production. Because Docker encapsulates the entire state around an application, you do not have to worry about missing dependencies or bugs due to architectural differences in the underlying operating system.
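A minimal sketch of that workflow, assuming a placeholder image name of myorg/webapp:

    # On the developer laptop: build the image and smoke-test it locally
    docker build -t myorg/webapp:1.2.0 .
    docker run --rm -p 8080:8080 myorg/webapp:1.2.0

    # Push the exact image that was just tested
    docker push myorg/webapp:1.2.0

    # On a production host: pull and run the identical image
    docker pull myorg/webapp:1.2.0
    docker run -d -p 80:8080 myorg/webapp:1.2.0

The artifact that ships is the artifact that was tested, dependencies and all.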
Continuous integration has been a great way to reduce the number of bugs in your final product by automatically testing your code. But there are two big drawbacks of continuous integration.
First, it's hard to encapsulate all the dependencies. Traditional CI (continuous integration) and CD (continuous delivery) technologies like Jenkins or Travis build application slugs by pulling source code repositories. Although this works relatively well for many applications, binary dependencies or OS-level variations can make code run slightly differently in production than in dev/test/QA. Because Docker encapsulates the entire state of the application, there is more certainty that the code running in dev/test/QA will run exactly the same in production.
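A sketch of a Docker-based CI flow; the $GIT_COMMIT variable, the image name, and the run-tests.sh script are placeholders:

    # CI builds an image that bakes in OS packages and binary dependencies,
    # tagged with the commit being tested
    docker build -t myorg/webapp:$GIT_COMMIT .

    # The test suite runs inside the very container that will ship
    docker run --rm myorg/webapp:$GIT_COMMIT ./run-tests.sh

    # On success, push the tested image; dev, QA, and production all pull
    # this same tag rather than rebuilding from source
    docker push myorg/webapp:$GIT_COMMIT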
Second, continuous integration was not built for microservices architectures. CI was built with the assumption that an app was located in one code repository. However, Docker best practices encourage microservices architectures with various Docker containers all loosely coupled. This is creating a new breed of CI/CD tools like Drone and Shippable that were built from the ground up with Docker containers in mind. These tools allow you to start testing multi-container applications that draw from multiple code repositories.
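Even without those tools, the shape of a multi-container test run can be sketched with plain docker commands; the image names and the make test target are assumptions:

    # Start the backing services the application under test depends on
    docker run -d --name test-db postgres
    docker run -d --name test-cache redis

    # Build the application image from its own repository and run the suite
    # against the linked containers
    docker build -t myorg/api:test .
    docker run --rm --link test-db:db --link test-cache:cache myorg/api:test make test

    # Tear everything down afterward
    docker rm -f test-db test-cache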
Instead of tuning your own service containers like Hadoop, Nginx, or MongoDB, Docker encourages the open source community to collaborate and fine-tune containers in the Docker Hub, a public repository that makes best-of-breed containers available to everyone. Because Docker containers can encapsulate state, they allow you to be more flexible in configuring software to run its best.
Thus, Docker is changing cloud development practices by allowing anyone to leverage encapsulated community best practices, stitching together other people's containers. It is like having a Lego set for cloud components, one that finally comes with a standard way of sticking the pieces together.
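For instance, an application can be assembled largely from community-maintained images pulled from the Docker Hub; only the app image (myorg/app) and the mounted nginx.conf file are assumptions here:

    # Best-of-breed images pulled straight from the Docker Hub
    docker pull mongo
    docker pull nginx

    # Stitch them together, with your own service in the middle
    docker run -d --name db mongo
    docker run -d --name app --link db:mongo myorg/app
    docker run -d --name web --link app:upstream -p 80:80 \
      -v $(pwd)/nginx.conf:/etc/nginx/nginx.conf:ro nginx

Only the application container is yours to tune; the database and web server arrive already configured by the community.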
Every so often a new technology appears and disrupts the status quo. Up until now, the cloud has been dominated by on-demand, API-driven virtual machines and services built around virtual machines. This has created a set of tools designed around the limitations of virtual machines.
Docker is rapidly changing the rules of the cloud and upending the cloud technology landscape. By smoothing the way for CI/CD, microservices, open source collaboration, and devops, Docker is changing both the application development lifecycle and cloud engineering practices. Every day, thousands more developers are happily rearchitecting old apps or building new Docker-based ones. Understanding where the Docker fire is spreading is the key to staying competitive in an ever-changing world.
Lucas Carlson is the chief innovation officer of CenturyLink, a global cloud, hosting, network, and telecommunications provider. He publishes weekly Docker advice, tutorials, and open source software like Panamax (Docker Management for Humans) on the CenturyLink Labs blog.