Over the past few years, containers have emerged as possibly the most important trend in enterprise technology since the advent of hardware virtualization—and Docker is the most popular. But what do containers really do, and is adopting Docker the right move for your organization? Read on to find out.
A Crash Course on Containers
The first thing to realize is that containers are actually not a new technology; they have been built into Unix-like operating systems for decades. Essentially, they are a way of bundling applications and their dependencies into a unit that can be easily shipped, deployed, and run in isolation from other processes. Each container encapsulates a running application and its user space and runs on top of the underlying operating system's kernel. As a result, a container can be distributed and deployed independently of the host machine, as long as the OS kernel is the same.
In other words, containers do away with complicated dependency management and the need to set up complex environments by hand, and instead provide a sandboxed environment for applications to run in. The result is a faster and more reliable development and deployment process, as well as increased security.
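To make the bundling idea concrete, here is a minimal, hypothetical Dockerfile for a small Python application — the base image, file names, and entry point are illustrative assumptions, not a prescription:

```dockerfile
# Start from a slim base image that supplies the user space
FROM python:3.12-slim

# Bundle the application and its dependency manifest into the image
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .

# The command that runs when a container is started from this image
CMD ["python", "app.py"]
```

Building this file (for example, with `docker build -t myapp .`) produces a self-contained image: every dependency is baked in, so the image runs the same way on any host with a compatible kernel, with no manual environment setup.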
If you're thinking that this kind of application and dependency bundling sounds a lot like virtual machines, you're right. The difference is that virtual machines bundle the entire operating system, while containers work with individual applications.
This might not seem like a big deal, but the consequences are huge. Containers are drastically smaller and faster than virtual machines. A hardware setup that can support only a few dozen virtual machines can often run hundreds of containers, and a typical container will load in milliseconds, compared to seconds or even minutes for a virtual machine. This makes containers an ideal mechanism for emerging development practices such as microservices, and they can facilitate highly scalable architectures aligned with DevOps best practices, including continuous delivery.
This is one of the reasons containers are so hot right now, and it explains why a company like Google runs all its internal and external services within containers. But Google has an army of top engineers to make sure its container setup works well. What should you do if you want to try containers but don't have those kinds of resources?
Even though containers have been around for decades, chances are you've only heard of them in the last few years. In fact, the growing popularity of containers is directly tied to the growing popularity of Docker, a platform for managing containers on Linux systems.
Docker is not a container system itself; it relies on the container capabilities of the underlying operating system. Instead, Docker simplifies the creation and distribution of container images. Just as importantly, Docker also enables runtime constraints on the way containers use hardware resources, providing the kind of resource sharing that's typical of virtual machines.
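As a sketch of those runtime constraints, Docker's standard `docker run` flags can cap a container's share of host resources — the image name and the specific limits below are illustrative:

```shell
# Cap the container at half a CPU core and 256 MB of RAM
# (image name and limit values are illustrative)
docker run --cpus="0.5" --memory="256m" myapp
```

This gives each container a bounded slice of the host's hardware, much as a hypervisor allocates resources to a virtual machine, but without the overhead of a full guest OS.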
Docker came along at the right time to popularize the decades-old container technology. While several alternatives exist, Docker is undeniably the king of the container mountain right now.
Is Docker Right for You?
So far, we've talked about the big benefits of containers relative to virtual machines and to traditional, bare-metal environments. But before you jump on the Docker train, there are a couple of issues you should be aware of.
The first is the pace of development. Docker is being developed at a rapid clip, and new releases occasionally introduce backward-compatibility issues. A new release of Docker typically comes out once or twice a month, and updating the Docker binary currently requires shutting down all containers running on the machine. These hurdles might be blockers for many deployment scenarios.
Second, security within containers is still an open topic. While containers provide additional isolation compared to nonvirtualized environments, they provide less isolation (and therefore protection) than virtual machines.
One solution to this problem is to run containers inside a virtual machine. Another option is to use open source analysis tools that can detect the open-source components in the container itself (such as a CentOS or Debian base image) and in the software running inside it, alerting the team to any components with known security vulnerabilities.
Even with these issues, containers hold the promise of easier development and deployment processes. New technologies are also being developed to help manage containers as a service, enabling operations to manage swarms of containers and programmatically address much-needed scalability and reliability. As the increasing adoption of and support for Docker throughout the industry indicate, it’s definitely worth investing some resources to figure out whether your organization could also benefit from this emerging technology.