Microservices, Containers, & Docker Lead to More Efficient Software Services
As organizations strive to become more agile, they’ve embraced technologies that help with software development. Docker is one such technology, and it has had a profound effect on the field. This open-source platform debuted in 2013 and has grown in popularity ever since. Docker uses containers to package code together with its dependencies, and it pairs naturally with microservices, which let developers build single-function modules. Containers virtualize the operating system and take up less space than virtual machines, making them a fast and portable development option.
Docker and other container services have been around for several years and have demonstrated significant benefits for application development and delivery. Most of the initial questions have been answered, and many solutions are available. Agile organizations have embraced microservices and IoT as a central part of designing and delivering IT services.
One of the major benefits of containers is that they are ephemeral and take up little space. The flip side is that any data written inside a container is ephemeral too. So how do developers handle persistent, writable storage for containers? Some data storage solutions include:
- Docker data volume - allows persistent data storage and retrieval for a container; a volume outlives the container that created it. However, a volume can’t be attached to a container that is already running.
- Data volume container - a dedicated container used to host a volume and mount that volume space to other containers. However, Docker provides no locking or security measures to maintain data integrity.
- Directory mount - a local host directory mounted into a container. The source can be any directory on the host running the container, and it can be shared by multiple containers at the same time. Because directory mounts give containers unrestricted access to the host file system, take care that a container doesn’t overwrite data the host depends on.
- Storage plugins - third-party interfaces that map container storage to external storage devices. However, if you move the container to another host, you’ll lose the storage association.
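Each of the options above corresponds to a docker CLI pattern. A minimal sketch, assuming Docker is installed; the image, volume, and plugin names (`myimage`, `app-data`, `some-plugin`) are illustrative placeholders:

```shell
# Named volume: data persists independently of any one container
docker volume create app-data
docker run -d --name web -v app-data:/var/lib/app myimage

# Data volume container: a container whose volumes others mount
docker create --name storage -v /shared busybox
docker run -d --volumes-from storage myimage

# Directory (bind) mount: expose a host path inside the container
docker run -d -v /srv/app/config:/etc/app:ro myimage

# Storage plugin: delegate the volume to a third-party driver
docker volume create --driver some-plugin remote-data
```

These commands require a running Docker daemon, so they are shown as a sketch rather than a runnable script.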
How do you address the issue of failover with containers? In the event of a failure, organizations want backup and recovery to be automatic and seamless. The containers themselves don’t need backup: if one fails, it can be restarted and be up and running again quickly.
The data itself, however, does need to be backed up. Depending on how that’s done, it can become a bottleneck as it’s restored, causing delays. Today a number of solutions and open-source applications are available to back up and restore data. The key point is that organizations must define their failover requirements and put a solution in place that best meets their needs.
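One common low-tech pattern is to archive a named volume through a throwaway container. A sketch, assuming a Docker daemon; the volume name `app-data` is a placeholder:

```shell
# Back up the volume's contents to a tar archive on the host
docker run --rm \
  -v app-data:/data:ro \
  -v "$(pwd)":/backup \
  busybox tar czf /backup/app-data.tar.gz -C /data .

# Restore the archive into a (new or existing) volume
docker run --rm \
  -v app-data:/data \
  -v "$(pwd)":/backup \
  busybox tar xzf /backup/app-data.tar.gz -C /data
```

Because the volume, not the container, holds the state, the same restore works whether the original container is restarted or replaced.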
Containers allow developers to isolate code function by function and control the environment. This makes it easier to debug, modify, and update. This flexibility helps with agile development and lends itself to organizations that want concurrent software releases by multiple teams.
Even with these benefits, there’s still the question of what developers will deliver. Organizations must decide whether that will be the container descriptor (Dockerfile), a “compiled” image, or another delivery format. Whatever the decision, the consistent environment makes it easier to test, deploy, and roll back if necessary.
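The two delivery formats differ in what the consumer receives: a descriptor they build themselves, or a prebuilt image as an artifact. A minimal sketch, assuming Docker; the image tag `myorg/app` and file names are placeholders:

```shell
# Option 1: hand over the container descriptor (Dockerfile)
cat > Dockerfile <<'EOF'
FROM alpine:3.19
COPY app /usr/local/bin/app
ENTRYPOINT ["/usr/local/bin/app"]
EOF

# Option 2: hand over a "compiled" image, built once and shipped as a file
docker build -t myorg/app:1.0 .
docker save myorg/app:1.0 -o app-1.0.tar
# The consumer then runs: docker load -i app-1.0.tar
```

Shipping the Dockerfile gives the consumer control over the build; shipping the image guarantees the exact bits that were tested are the ones deployed.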
Containers can speed delivery and portability, and can be leveraged for a continuous delivery pipeline. They are cost-effective and allow fast deployment in multiple environments. With the added flexibility, organizations need to address the logistics of each release. What strategies (or combinations of strategies) will it use?
You may choose to deploy each service on every commit, deploy on a set schedule, or bundle services in a release-train setup. The right choice depends on your budget, schedule, priorities, and how you meet your users’ needs. Containers allow for flexibility, but planning is still key.
Part of the decision for the delivery model for containers is to identify who owns or is responsible for which piece of the deliverable. Does Operations provide the container runtime? Are the developers responsible for any problems that happen inside the container? What about security: can you sufficiently isolate your containers to prevent them from affecting other systems?
Remember that just because someone can deploy an application, doesn’t mean they should. Have a policy and an approval process so you know who is deploying what.
Applying patches is fairly simple with containers: update the base image, then rebuild the application image. Your main decisions are whether you need to be able to change systems that are currently running, and who is responsible for doing so.
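In practice, patching usually means rebuilding against a freshly pulled base and replacing running containers rather than modifying them in place. A sketch, assuming Docker; tags and the container name `app` are placeholders:

```shell
# --pull forces Docker to fetch the latest (patched) base image
docker build --pull -t myorg/app:1.0.1 .

# Roll out by replacing the container, not patching it in place
docker stop app && docker rm app
docker run -d --name app myorg/app:1.0.1
```

Keeping containers immutable this way makes rollback as simple as starting the previous image tag again.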
Support and licensing
In the early days when Docker was new, support was more of a gray area. Even if your host OS was covered by your support contract, the containers running in Docker might be running an unsupported OS. This caught some organizations by surprise: when problems arose, they discovered they had no support available.
Now that Docker has been around for several years, your support contracts should be clearer. Know what support your contract covers, and make sure it’s enough for your needs.
Licensing should also be clear. In theory, containers cut down on the number of licenses needed, since unlike virtual machines they are usually not considered separate operating systems. However, check your software vendors’ licenses for their policies on containers. Containers may be treated as separate systems, or simply as processes on the host that runs the container framework. This can make a difference in whether containers make sense for a given workload.
Containers: an Agile tool
Containers and microservices give developers the ability to make bigger architectural changes on an as-needed basis. The organizational agility gained helps companies save time and money while deploying the updates their users need. It lets organizations focus on delivering value and stay current with technology in an ever-changing environment, without being chained to an architectural commitment made years ago.