This post is from the XebiaLabs blog and has not been updated since the original publish date.
Three ways of using Digital.ai Release & Jenkins for continuous delivery
When discussing Digital.ai Release, formerly XL Release, with users, we are often asked how it relates to Jenkins or other continuous integration (CI) tools, how they can complement each other, and how best to integrate them.
At Digital.ai, formerly XebiaLabs, we use Digital.ai Release in combination with Jenkins in a number of different ways. In the second part of this blog, I'll outline the three main integration scenarios we use. First, however, I'd like to briefly discuss the first question: how we see Jenkins (and other CI tools) and Digital.ai Release complementing each other.
Certainly, with pretty much any CI tool, we can chain together a number of jobs, plans, or whatever they're called. Some tools provide additional plugins or features to also explicitly visualize this chain as a "delivery pipeline", and can show the state of the overall pipeline by displaying the state of the individual jobs.
There are four characteristics of what we might call "real world" continuous delivery (CD) pipelines, though, that we find are addressed much more effectively by the combination of Digital.ai Release and a CI tool:
All stakeholders can see which features and releases are currently where, not just for one pipeline but for the whole set of pipelines relevant to a user, all in one go. CI tools can typically show you the state of a single pipeline, just about, but we found it hard or impossible to quickly view a set of pipelines relevant to the current context.
With Digital.ai Release, every user has their own "pipeline dashboard filter", and we can easily search on context by tagging releases, see pipelines planned to finish at a certain time, etc.
Supporting fully automated pipeline stages (usually the initial ones) as well as time-planned, team-driven stages (usually the final ones) that mix automated and manual steps. For instance, our Digital.ai Release and Digital.ai Deploy, formerly XL Deploy, releases are accompanied by forum posts, which are still published manually at present and need to go out on a certain date.
With only a CI tool, we found minimal support, at best, for these kinds of "end game" activities, which meant that our pipelines effectively had to stop at the integration testing phase. With Digital.ai Release, we can still use Jenkins for what it's good at and extend our pipeline all the way to go-live, which, after all, is what you need to do in order to really start benefiting from the customer feedback loop that CD provides.
Handling dependencies in a microservice environment, or with multi-component applications, or even in the increasingly common "mobile app & backend" scenario, was something we found very hard to do satisfactorily with CI tools.
Yes, jobs/plans can often be configured to wait until another job/plan has completed, but can we ensure that we're waiting on the correct invocation of that other pipeline? For example, if our "release mobile app" job is waiting on the release of the backend to complete, how can we ensure it's really the "deploy to QA" job for version 1.3.1 of the backend we're waiting for, and not 1.3.2 or 1.3.0?
We also found it hard or impossible to effectively visualize the resulting dependency graph, or to see, for example, what the impact on the timing of our mobile release would be if the backend testing needed to be extended or re-run. With Digital.ai Release and a CI tool, we can easily track dependencies between our releases and see how delays to one component affect our overall go-live date.
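To make the "correct invocation" problem concrete, here is a minimal sketch of version-aware dependency gating. Everything in it is hypothetical: `get_pipeline_status` and the in-memory status store stand in for whatever lookup your release tool actually provides. The point is simply that the dependent release waits on a specific (component, version) pair, not on "the latest run" of the other pipeline.

```python
import time

# Hypothetical in-memory stand-in for a pipeline status store.
PIPELINE_STATUS = {
    ("backend", "1.3.1"): "in_progress",
    ("backend", "1.3.2"): "completed",  # a newer run can finish first
}

def get_pipeline_status(component: str, version: str) -> str:
    """Look up the status of one specific pipeline run (illustrative only)."""
    return PIPELINE_STATUS.get((component, version), "not_started")

def wait_for_dependency(component, version, poll_seconds=30, max_polls=10):
    """Block a dependent step (e.g. 'release mobile app') until the exact
    backend version it depends on has completed, not whichever run is newest."""
    for _ in range(max_polls):
        if get_pipeline_status(component, version) == "completed":
            return True
        time.sleep(poll_seconds)
    return False
```

With this shape, a mobile release pinned to backend 1.3.1 keeps waiting even if 1.3.2 happens to complete first, which is exactly the guarantee that's hard to express with plain job-chaining.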
One of the things we've learned is that you're never "done" with improving your pipelines because they always change: new applications are added, pipelines are split as architectures are modified, release procedures change etc. etc. So for us, identifying which phases in which pipelines are the best candidates for improvement is pretty important.
Getting usable data out of a CI tool was painful: yes, we were able to extract the durations of the various jobs to come up with total throughput times for the pipeline. But identifying idle time and, especially, spotting trends across multiple runs of the same pipeline, or even across multiple pipelines, ended up involving far too much Excel magic.
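The kind of arithmetic we ended up doing by hand looks roughly like this. The job names and timestamps below are purely illustrative, not output from any real Jenkins API: given per-job start and end times, you can derive total elapsed time, active time, and the idle time hiding between jobs.

```python
from datetime import datetime

# Hypothetical job timings extracted from a CI tool, in pipeline order.
jobs = [
    ("build",             "2015-06-01 09:00", "2015-06-01 09:10"),
    ("unit tests",        "2015-06-01 09:15", "2015-06-01 09:30"),
    ("integration tests", "2015-06-01 10:00", "2015-06-01 10:45"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Minutes actually spent running jobs.
active = sum((parse(end) - parse(start)).total_seconds() / 60
             for _, start, end in jobs)
# Minutes from the first job starting to the last job finishing.
total = (parse(jobs[-1][2]) - parse(jobs[0][1])).total_seconds() / 60
idle = total - active

print(f"total: {total:.0f} min, active: {active:.0f} min, idle: {idle:.0f} min")
# prints "total: 105 min, active: 70 min, idle: 35 min"
```

Doing this once is easy; doing it across many runs and many pipelines, and spotting trends, is where the spreadsheet approach broke down for us.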
Using Jenkins in combination with Digital.ai Release allows us to avoid most of the "homemade" reporting. We use Digital.ai Release's planner to get an immediate visualization of where most of a pipeline's time is being spent, and the automated value stream mapping to discover which phases are most in need of attention.
In the following post, I'll discuss the three main combinations of Digital.ai Release and Jenkins we use: "Digital.ai Release First," "Jenkins First" and "Hybrid Pipeline".