Last Updated Feb 28, 2011 — Enterprise Agile Planning expert

I was recently asked whether the TeamForge CLI could assist with build automation; my answer is: of course it can. In fact, there are a number of different ways the CLI can be used here, from simple to more complex. For example, if all you wanted to do was upload your nightly build to a release, you could add this shell command to the end of your build process:

ctf go rel1234 upload

Going a bit further, you may want to gather some information from TeamForge to help drive the build process. Below is an example of how to use the templating engine in the CLI for a simple build.

In this example I’m going to use a traditional Makefile to create a zip file of Velocity templates. You could, of course, create templates for pretty much any format appropriate to your build process; I’m using the Makefile format here to keep the example simple.

First, a brief review of the format of a Makefile:

    target: dependency
    	build commands

In a Makefile, a target is the name of the file or thing you want to build. A target can list dependencies that must be satisfied before make runs its build commands, and note that make requires each build command to be indented with a tab (see GNU Make for more).

In this example, I create a zip file from content checked out of a Subversion repository. Here is what the final Makefile needs to look like:

    templates:
    	svn checkout --username dspeers

    zip: templates
    	zip templates.zip templates
    	ctf go rel3425 upload templates.zip

When invoked as “make zip”, the Subversion repository is checked out, a new zip file is created from the checked-out contents, and the zip file is uploaded to the TeamForge FRS. The next step is to convert this Makefile into a template.

First, let’s say that we are going to use an artifact in a “Build” tracker to collect the parameters we need for the build. We will keep the ID of the repository in a field called “RepositoryID”, the name of the target we want to build in a field called “Target”, and the build number in a field called “Build”. We will also want to know which release to upload the build to, and for that we will use the built-in “resolvedReleaseId” field. Given all the above, here is what our Makefile template (which we will name Makefile-template) looks like:

    zip: [:field:Target:]
    	zip [:field:Target:]-[:field:Build:].zip [:field:Target:]
    	ctf go [:field:resolvedReleaseId:] upload [:field:Target:]-[:field:Build:].zip
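To preview what the merged output will look like before wiring this into the CLI, you can simulate the substitution in plain shell with sed. This is only an illustration: the real merge is done by the CLI’s template engine, and the field values used here (Target=templates, Build=42, release rel3425) are made up.

```shell
# Simulate the CLI template merge with sed (illustration only; the real
# merge is done by the CLI's "template header" command). All field values
# below are made-up examples.
merge_template() {
    sed -e "s/\[:field:Target:\]/$TARGET/g" \
        -e "s/\[:field:Build:\]/$BUILD/g" \
        -e "s/\[:field:resolvedReleaseId:\]/$RELEASE/g" "$1"
}

cat > /tmp/Makefile-template <<'EOF'
zip: [:field:Target:]
	zip [:field:Target:]-[:field:Build:].zip [:field:Target:]
	ctf go [:field:resolvedReleaseId:] upload [:field:Target:]-[:field:Build:].zip
EOF

TARGET=templates
BUILD=42
RELEASE=rel3425
merge_template /tmp/Makefile-template
```

Running this prints a Makefile with every `[:field:...:]` placeholder replaced by the sample values, which is exactly the shape of file the CLI script below will generate from a real artifact.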

Now we need a CLI script that will merge the template with the values from an artifact. In this example the CLI script takes the artifact ID as an argument. It could easily be extended to use a filter that finds all artifacts in a “Ready To Build” status and processes them, leaving them in a “Build Complete” status. For now, I’ll keep things simple, handle just the one artifact, and call this CLI script “build.ctf”:

    # Load the template and open an output file to save the results in
    template load Makefile-template
    output Makefile

    # Go to the artifact that has the data we need.
    go $1

    # Go to the repository and save the checkout command in SVNCO
    go `print RepositoryID` ctf set -e SVNCO scmco

    # Merge the artifact with the template
    template header

    # Increment the Build number by 1
    set -e Build expr `print Build` + 1

    # Return the CLI to its main level, close the output and run the build.
    shell make zip
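The Build increment in the script leans on the standard expr utility; outside the CLI, the same arithmetic in plain shell looks like this (the starting value 41 is just an example):

```shell
# expr performs the same integer arithmetic the CLI script uses to bump
# the Build field; 41 is an arbitrary example starting value.
BUILD=41
BUILD=$(expr "$BUILD" + 1)
echo "$BUILD"   # prints 42
```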

Finally, to invoke this CLI script, run:

    ctf --script build.ctf artf1234

Each command is described in detail in the CLI Command Reference, but I will focus on the go and template commands here.

The first go (“go $1”) loads the artifact into the CLI so that each subsequent command has access to the data in the artifact. The second go captures the Subversion checkout command in the SVNCO variable, and because it is part of the same go statement, the CLI’s context does not leave that of the previous go.

With the template command, the load argument tells the template engine to import and prepare the template. The example template is a simple one with no sections defined, so the entire thing is treated as a header. Mid-way through the script, “template header” tells the CLI to merge in the values of the current object and send the results to the open output file.

A template can have many sections, and each section (including the header and footer) can be merged multiple times. This can be useful when generating several rows of output, such as the data points on a line graph.

As I noted before, this CLI script could be extended to iterate over a list of artifacts and run as a cron job, creating what amounts to a build-on-demand service. You could even capture the build output, add it as a comment to the controlling artifact, and set the status to note the build’s return code.
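A minimal sketch of such a wrapper, suitable for a cron job, might look like the following. The ID-list file and the CTF_CMD override are my own inventions for illustration; the ctf invocation itself is the one shown above:

```shell
#!/bin/sh
# Hypothetical cron wrapper: run build.ctf for every artifact ID listed,
# one per line, in a file. CTF_CMD defaults to the real ctf binary but can
# be overridden (e.g. with "echo") to dry-run the loop without a server.
CTF_CMD="${CTF_CMD:-ctf}"

build_all() {
    while IFS= read -r artifact_id; do
        # Skip blank lines and comments in the ID list
        case "$artifact_id" in ''|'#'*) continue ;; esac
        "$CTF_CMD" --script build.ctf "$artifact_id"
    done < "$1"
}

# Dry-run against a made-up list of artifact IDs
printf 'artf1234\nartf5678\n' > /tmp/ready_to_build.txt
CTF_CMD=echo
build_all /tmp/ready_to_build.txt
```

The dry run simply echoes one ctf invocation per artifact ID; dropping the CTF_CMD override would run the real CLI script against each artifact in turn.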

So there you have it: a simple build script that takes data from a TeamForge artifact to drive the build and then uploads the final file back to TeamForge. There are many possibilities here, but what I hoped to demonstrate with this intentionally simple example is just one of the many things you can do with the TeamForge CLI.

If you have any questions about this or other things you can do with the CLI, post a comment to the CLI forum; I’d like to hear what you think.
