
This post is from the CollabNet VersionOne blog and has not been updated since the original publish date.

Last Updated Apr 09, 2014 — Enterprise Agile Planning expert

Get familiar with PowerShell

Enterprise Agile Planning
In the process of building psake-tools I had to get in touch with PowerShell. Before that, my exposure to it was limited. In this first post I want to share some of the interesting things I learned about PowerShell. Without further ado, let's get started.

Strings

We could say that there are two kinds of strings in PowerShell: one that supports variable expansion (double-quoted) and one that doesn't (single-quoted). What do I mean by this? Let's say we have the following code:

[code lang="powershell"]
$name = "Logan"
Write-Host 'Hello $name!'
Write-Host "Hello $name!"
[/code]

The first Write-Host uses a single-quoted string; in that case PowerShell won't do anything to the string, and it will be interpreted literally, producing this output:

Hello $name!

The second Write-Host uses a double-quoted string, which makes the variable expand, replacing it with its value:

Hello Logan!

It gets trickier once you want to expand things like object properties or array elements. In those cases you need to wrap the expression in $():

[code lang="powershell"]
$names = 'Charles','Eric','Scott'
Write-Host "Hello $($names[0])!"
[/code]

Output: Hello Charles!

Pipelines

PowerShell passes real objects through the pipeline instead of text, as other shells do. This is one of its main features and what makes it really powerful: at every stage you can work with the objects directly, without doing any crazy text parsing. If you are writing scripts without making use of the pipeline, you are not using PowerShell to its fullest.
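Before writing pipeline functions, here is a minimal sketch of what "real objects" means in practice. The objects and their property names (Name, Memory) are made up for illustration, not part of any real cmdlet's output:

```powershell
# Each [pscustomobject] flows through the pipeline whole, so a later stage
# can read its properties directly instead of parsing columns of text.
$procs = @(
  [pscustomobject]@{ Name = 'alpha'; Memory = 120 },
  [pscustomobject]@{ Name = 'beta';  Memory = 30  },
  [pscustomobject]@{ Name = 'gamma'; Memory = 75  }
)

# Filter on the Memory property itself -- no text parsing needed.
$big = $procs | Where-Object { $_.Memory -gt 50 } | ForEach-Object { $_.Name }
$big    # alpha, gamma
```

In a text-based shell the same filter would mean cutting and parsing fields out of formatted output; here Where-Object simply evaluates each object's Memory property.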
This is a simple example of a function that supports the pipeline:

[code lang="powershell"]
function ICanUsePipeLine {
  param([Parameter(ValueFromPipeline=$true)]$parm)
  begin {
    Write-Host "This happens only once and at the beginning"
  }
  process {
    # This happens for every element we pass to the function
    echo $parm
  }
  end {
    Write-Host "This happens only once and at the end"
  }
}

1,2,3 | ICanUsePipeLine
[/code]

Output:

This happens only once and at the beginning
1
2
3
This happens only once and at the end

As you can see, the function has three main blocks: begin, process and end. They are called in that order, with process being called once per element passed through the pipeline. Notice that we are passing an array with the values 1, 2 and 3; PowerShell has a neat way of creating them. Another thing worth mentioning is that we can also call this function as a regular one:

[code lang="powershell"]
ICanUsePipeLine 1,2,3
[/code]

Output:

This happens only once and at the beginning
1
2
3
This happens only once and at the end

Pipelines and processing order

In the next example I will show you how piping the output of one function into another works, and in what order things execute. Since we are not defining a parameter for these functions, we can only use them through the pipeline. We are also using $_, which represents the current object in the pipeline.
[code lang="powershell"]
function a {
  begin { Write-Host 'begin a' }
  process { Write-Host "process a: $_"; $_ }
  end { Write-Host 'end a' }
}

function b {
  begin { Write-Host 'begin b' }
  process { Write-Host "process b: $_"; $_ }
  end { Write-Host 'end b' }
}

function c {
  begin { Write-Host 'begin c' }
  process { Write-Host "process c: $_" }
  end { Write-Host 'end c' }
}

1..3 | a | b | c
[/code]

Output:

begin a
begin b
begin c
process a: 1
process b: 1
process c: 1
process a: 2
process b: 2
process c: 2
process a: 3
process b: 3
process c: 3
end a
end b
end c

After seeing this output, one could think that the next function in the pipeline always kicks in as soon as the previous process block finishes. As the next example shows, that is not always the case:

[code lang="powershell"]
function a {
  begin { Write-Host 'begin a'; $counter = 0 }
  process { Write-Host "process a: $_"; $counter += $_ }
  end { Write-Host 'end a'; $counter }
}

function b {
  begin { Write-Host 'begin b'; $counter = 0 }
  process { Write-Host "process b: $_"; $counter += $_ }
  end { Write-Host 'end b'; $counter }
}

function c {
  begin { Write-Host 'begin c' }
  process { Write-Host "process c: $_" }
  end { Write-Host 'end c' }
}

1..3 | a | b | c
[/code]

Output:

begin a
begin b
begin c
process a: 1
process a: 2
process a: 3
end a
process b: 6
end b
process c: 6
end c

This last output shows that the trigger is not the process block finishing; it is the moment a function writes something to the output stream. Here each function delays its output until its end block, so b's process block won't start until a has fully finished.

Pester

At some point we started to feel the need for some sort of testing for psake-tools. Trying to fill that need is how we met Pester. Pester is a PowerShell BDD testing framework; it has nice features and makes setting up tests quite easy.
A Pester test will usually consist of three main parts:
  • Describe, in which we put the name of what we are testing.
  • Context, in which we define the context of what we are currently testing.
  • It, in which we validate the results.
Trying to avoid going into too much detail, let's jump into an example. Let's say we have a function called Get-Assemblies, which should recursively look for all the AssemblyInfo files (.cs and .fs) under a given directory and return what it found. This is how the test could look:

[code lang="powershell"]
Describe "Get-Assemblies" {
  Context "when calling it with a path that contains four AssemblyInfo files in different directories" {
    "a\AssemblyInfo.cs",
    "a\b\c\AssemblyInfo.cs",
    "a\b\c\d\AssemblyInfo.cs",
    "a\b\c\d\efg\AssemblyInfo.fs",
    "a\b\c\d\efg\AssemblyInfo.zs",
    "AssemblyInfo" | % { Setup -File $_ 'some random file content' }

    $assemblies = Get-Assemblies $TestDrive

    It "should get all those files" {
      $assemblies.Length | Should Be 4
    }
  }
}
[/code]

A special mention goes to the Setup helper, which creates a file in an isolated environment that can then be accessed through the $TestDrive automatic variable. As you can see in the code, we set up the test by creating six files with some random content, we exercise the function under test by passing it the path where the files were created, and at the end we verify that only the four expected files were found.

Links:
psake-tools
Pester
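Not covered above, but worth knowing: Pester discovers test scripts by the *.Tests.ps1 naming convention and runs them with its Invoke-Pester command. A sketch, assuming Pester is installed and the Describe block is saved under a hypothetical file name:

```powershell
# Run every *.Tests.ps1 file found under the current directory...
Invoke-Pester

# ...or point Pester at one specific test script (hypothetical path).
Invoke-Pester .\Get-Assemblies.Tests.ps1
```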
