Thursday, June 17, 2021 12:00 PM

Docker: The Next Level of Efficiency in Software Delivery

Written by Eric Carter, Team Leader, Specialist Internet, Web Developer - iatricSystems


Back in the old days, you know, like ten years ago, welcoming a new application into your data center meant purchasing, installing, and configuring hardware, all long before you could even begin to contemplate having the vendor proceed with implementation.

While we all miss that fresh new-server smell, the drumbeat of technical evolution marches on, and it wasn’t long before virtualization hit breakaway speed and took off among the masses. Today, data centers that did not migrate to the cloud have effectively become “private clouds” themselves, and the efficiency gained through full virtualization has been substantial.

IT departments transitioned into managers of IaaS, standing up new virtual machines and decommissioning old ones with a few clicks of the mouse. Additionally, with fewer computing resources sitting idle at any given time, it became far easier to determine how much hardware was really needed. In many cases, this was less than what was being used before virtualization took hold. When you combine the savings in hardware costs with the time savings of automated virtual machine management, it’s clear why nearly everyone has gone virtual.

So, where do we go from here? What’s the next phase in computing that can provide a boost in operational efficiency? While there are many possible answers to these questions, the remainder of this post will discuss Docker and how it can increase efficiency in IT management.

Finding Efficiency with Docker
Docker has been around as an open-source project since 2013 and has been a supported component of Microsoft Windows Server since Server 2016. And yes, it runs happily inside a virtual machine, so it slots neatly into the virtualized data centers most of us already run.

At its core, Docker provides a means to bundle an application, its dependencies, and the required environment configuration into an image. During application deployment, this image is downloaded to the server, and one or more containers are started using the image as a blueprint. While each container runs on the hardware of the host server, the processes inside it are isolated from the “outside world.” As a result, containers can be started and stopped at will, and the number of containers can be increased or decreased as needed to handle additional processing volume, using Docker’s built-in orchestration feature, swarm mode.
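
To make that lifecycle concrete, here is a minimal sketch using the Docker CLI. The registry address, image name, and port numbers are hypothetical placeholders, not anything specific to our products:

    # Download the application image to the server (registry and image name are hypothetical)
    docker pull registry.example.com/myapp:1.0

    # Start an isolated container from the image, mapping host port 8080 to the app's port 80
    docker run -d --name myapp -p 8080:80 registry.example.com/myapp:1.0

    # Or, with swarm mode enabled ("docker swarm init"), run the app as a service
    # whose container count can be raised or lowered on demand
    docker service create --name myapp-svc --replicas 3 -p 8081:80 registry.example.com/myapp:1.0
    docker service scale myapp-svc=5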

Essentially, a container can be seen almost like a mini-VM – much of the isolation, but only a small fraction of the overhead. There is no hypervisor to manage and no guest operating system consuming resources. So, the flexibility is there to increase and decrease application processing as necessary, but without the need to spin up a new virtual machine from scratch.

In addition to all of this, Docker:

  • Allows for fast deployment of updates. With no MSIs to run, you simply download the new image and go (see the sketch after this list).
  • Facilitates portability from one machine to the next. Moving the application is as easy as moving the image.
  • Provides more granular control over the resources allocated to an application in a container.
  • Is free as a component of Windows Server 2016 and beyond.
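
As a rough illustration of the update and resource-control points above, the commands below sketch how a new version might be rolled out and how limits can be applied; the image tags and limit values are hypothetical:

    # Deploy an update: pull the new image, then replace the running container
    docker pull registry.example.com/myapp:1.1
    docker stop myapp && docker rm myapp
    docker run -d --name myapp -p 8080:80 registry.example.com/myapp:1.1

    # Constrain a container to at most 2 CPUs and 1 GB of memory
    docker run -d --name myapp-limited --cpus 2 --memory 1g registry.example.com/myapp:1.1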

At iatricSystems™, we have adopted Docker as a core deployment architecture for our applications. We strongly believe that the rapid deployment and simplified maintenance have made Docker an obvious choice to provide the best service to our customers. And as we all march forward in search of increased operational efficiency, Docker is an important option to keep in your arsenal. Happy computing!