
This post is from the XebiaLabs blog and has not been updated since the original publish date.

Last Updated Dec 30, 2014 — DevOps Expert

Microservices, Containers, & Docker Lead to More Efficient Software Services

DevOps

As organizations strive to become more agile, they have embraced technologies that help with software development. Docker is one such technology, and it has had a profound effect on how software is built and delivered. This open-source platform debuted in 2013 and has grown in popularity ever since. Docker uses containers to package code and its dependencies together, and it pairs naturally with microservices, which allow developers to build single-function modules. Containers virtualize the operating system and take up less space than virtual machines, making them a fast and portable development option.

Docker and other container technologies have been around for several years and have demonstrated significant benefits for application development and delivery. Most of the initial questions have been addressed, and many solutions are now available. Agile organizations have embraced microservices and IoT as a central part of designing and delivering IT services.

Storage

One of the major benefits of containers is that they are temporary in nature and don't take up much space. However, that also means any data written inside a container is temporary. How do developers handle persistent, writable storage for containers? Some data storage solutions include (a rough docker CLI sketch of the first three options follows the list):

  1. Docker data volume - allows persistent data storage and retrieval within a container. But an existing volume can’t be attached to a running or new container.
  2. Data volume container - a dedicated container used to host a volume and mount that volume space to other containers. However, Docker provides no locking or security measures to maintain data integrity. 
  3. Directory mount - a local host directory mounted in a container. The source directory can be any directory on the host running the container, and can be used by one or more containers at the same time. Because directory mounts offer unrestricted access to the host file system, take care not to overwrite host files other systems depend on.
  4. Storage plugins - third-party interfaces that map container storage to external storage devices. However, if you move the container to another host, you'll lose the storage association.
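
As a rough illustration (not from the original post), here is how the first three options might look with the docker CLI; "myapp" and the paths are placeholders:

    # 1. Data volume: created when the container starts (-v with only a
    #    container path); the data outlives the container's own filesystem
    docker run -d --name app -v /var/lib/app/data myapp

    # 2. Data volume container: a container that exists only to host a volume,
    #    which other containers mount via --volumes-from
    docker create -v /var/lib/app/data --name app-data busybox
    docker run -d --volumes-from app-data myapp

    # 3. Directory mount: a host directory mounted into the container
    #    (host path:container path)
    docker run -d -v /srv/app-data:/var/lib/app/data myapp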

Failover

How do you address the issue of failover with containers? In the event of a failure, organizations want an automatic and seamless backup system. The containers themselves don’t need backup. If something happens, a container can be restarted and will be up and running again in a short amount of time.

However, you do need to back up the data itself. Depending on how that's done, it can become a bottleneck as it's restored, causing delays. Today, this is addressed with a number of solutions and open-source applications available to back up and restore data. The main point is that organizations must define their failover requirements and have a solution in place that best meets their needs.
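
For example, one common pattern (shown here as a hedged sketch; the container and path names are hypothetical) is to mount the volumes of a data container into a throwaway container and archive them to a host directory:

    # Back up the data volume from the "app-data" container to a host tarball
    docker run --rm --volumes-from app-data -v /srv/backups:/backup busybox \
        tar czf /backup/app-data-backup.tar.gz /var/lib/app/data

    # Restore the archive back into the data volume
    docker run --rm --volumes-from app-data -v /srv/backups:/backup busybox \
        tar xzf /backup/app-data-backup.tar.gz -C /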

Delivery model

Containers allow developers to isolate code function by function and control the environment. This makes it easier to debug, modify, and update. This flexibility helps with agile development and lends itself to organizations that want concurrent software releases by multiple teams. 

Even with these benefits, there's still the question of what developers will deliver. Organizations must decide whether that means handing over the container descriptor (Dockerfile), a "compiled" image, or another delivery format. Whatever the decision, the consistent environment makes it easier to test, deploy, and roll back if necessary.
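
As an illustrative sketch (the image and file names are placeholders), the two most common delivery formats look like this:

    # Option A: deliver the Dockerfile; the consumer builds the image themselves
    docker build -t myorg/myapp:1.0 .

    # Option B: deliver a "compiled" image, either as a tarball...
    docker save -o myapp-1.0.tar myorg/myapp:1.0   # producer exports the image
    docker load -i myapp-1.0.tar                   # consumer imports it
    # ...or through a registry
    docker push myorg/myapp:1.0
    docker pull myorg/myapp:1.0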

Release strategy

Containers can speed delivery and portability, and can be leveraged for a continuous delivery pipeline. They are cost-effective and allow fast deployment in multiple environments. With the added flexibility, organizations need to address the logistics of each release. What strategies (or combinations of strategies) will it use? 

You may choose to deploy each service on every commit, follow a set schedule, or bundle services in a release train-type setup. Whichever you choose depends on your budget, schedule, priorities, and how you meet your users' needs. Containers allow for flexibility, but planning is still key.
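
As a hedged sketch (the service, registry, and tag names are hypothetical), the difference between deploy-on-commit and a release train often comes down to how images are tagged and promoted:

    # Deploy on every commit: tag each build with the commit ID and push it
    docker build -t registry.example.com/orders-service:$GIT_COMMIT .
    docker push registry.example.com/orders-service:$GIT_COMMIT

    # Release train: promote an already-tested image by re-tagging it
    docker tag registry.example.com/orders-service:$GIT_COMMIT \
               registry.example.com/orders-service:train-2015.01
    docker push registry.example.com/orders-service:train-2015.01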

Ownership

Part of the decision about the delivery model for containers is identifying who owns, or is responsible for, each piece of the deliverable. Does Operations provide the container runtime? Are the developers responsible for any problems that happen inside the container? What about security: can you sufficiently isolate your containers to prevent them from affecting other systems?

Remember that just because someone can deploy an application doesn't mean they should. Have a policy and an approval process so you know who is deploying what.

Patching

Applying patches is fairly simple with containers: update the base image, then rebuild the application image. Your main decision is whether you need to be able to patch systems that are currently running, and who is responsible for doing so.
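
A minimal sketch of that workflow, assuming an Ubuntu-based application image (the names are placeholders):

    # Pull the patched base image and rebuild the application image on top of it
    docker pull ubuntu:14.04
    docker build --pull -t myorg/myapp:1.0.1 .   # --pull refreshes the base layer

    # Roll out the patch by replacing running containers with the new image
    docker stop app && docker rm app
    docker run -d --name app myorg/myapp:1.0.1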

Support and licensing

In the early days when Docker was new, support was more of a gray area. Even if your host OS was covered by your support contract, the containers running in Docker might be running an unsupported OS. This caught some organizations by surprise when they ran into problems and found out they had no support available.

Now that Docker has been around for several years, your support contracts should be clearer. Know what support your contract covers, and make sure it’s enough for your needs.

Licensing should also be clear. In theory, containers cut down on the number of licenses needed, because unlike virtual machines, containers are usually not considered operating systems. However, check your other software vendors' licenses for their policy on containers. Containers may be treated as separate systems, or simply as processes on the system that is actually hosting the container framework. This can make a difference as to whether or not containers make sense.

Containers: an Agile tool

Containers and microservices give developers the ability to make bigger changes to their architecture on an as-needed basis. The organizational agility gained helps companies save time and money while deploying the updates their users need. It lets organizations focus on delivering value and stay current with technology in an ever-changing environment, without being chained to architecture decisions made years ago.
