VSoft Technologies Blogs



Testing code is something we all do. Whether it be manual usability testing, unit testing, or integration testing, knowing how much of the application is covered by the tests is important. Without knowing what parts of the application are covered, there is no way to know if key features are tested.

When performing unit testing there is an analytical way to determine which parts of the source code are covered by the tests. This is typically called source code coverage. Working with Delphi, one of the tools that performs this task is DelphiCodeCoverage (open source). It can be found on GitHub (a more recent fork) and SourceForge. Under the hood this tool simply marks each line of source code as "hit" when the application executes it at least once. From there it can generate a detailed report giving the overall coverage statistics for the project, as well as the individual lines not hit during testing.
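As a toy illustration of the idea (not how DelphiCodeCoverage itself is implemented - it instruments a compiled binary via its map file), here is a minimal Python sketch that marks the lines of a function as "hit" while a test runs:

```python
import sys

def make_tracer(covered):
    # Trace callback: record the line number of every line executed in flip().
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code.co_name == "flip":
            covered.add(frame.f_lineno)
        return tracer
    return tracer

def flip(face_up):
    if face_up:
        return False  # face up -> face down
    return True       # face down -> face up

covered = set()
sys.settrace(make_tracer(covered))
flip(True)  # the "test": only the face-up path is exercised
sys.settrace(None)

# flip() has three executable lines but only two were hit,
# so a report would show 2/3 lines covered for this unit.
print(len(covered))  # -> 2
```

Real coverage tools do the same bookkeeping per source line, then aggregate the hit sets into the percentage report.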

Code Coverage

What I will go through below is how to set up code coverage on a unit test project, and hook that into a continuous integration process using Continua CI. If you need help setting up a project in Continua CI, please refer to the Create your First Project wiki page. 

The code that I would like to get a code coverage report on is the Core.Card.pas unit. The unit tests for this class are located in the tests folder and have a corresponding name of Core.CardTests.pas. You may have noticed that some of the code paths are not fully covered in my unit tests. This is intentional, and something that we will come to a little later on. 

An extra consideration for a unit testing project is making sure it can run under continuous integration. This means that it should run to completion and produce an output file that can be imported into the build summary. With this in mind I have created a "CI" configuration for my unit testing project. This conditionally compiles the unit testing project so that it does not wait for user input (something my debug configuration does) and generates an XML output file. 

All this code and other related scripts are located in the VSoftTechnologies/DelphiCodeCoverageExample GitHub repository. Feel free to clone it to get a better sense of code coverage and the project structure I am using. 


To generate a code coverage report I decided to use DelphiCodeCoverage. The tool has a number of command line options, all of which are spelt out on the GitHub page. Some of the options are a little overwhelming in the effort they require. An example of this is passing a file that contains all the source directories to search for classes to include in the code coverage report. Thankfully there is a wizard supplied with DelphiCodeCoverage that will help generate a batch file containing the correct parameters to pass to DelphiCodeCoverage. 

In my project I have placed DelphiCodeCoverage into a sub-folder called "CodeCoverage" and included it in source control. There are two reasons for doing this:
1. The code coverage executable is now available everywhere the source is pulled to. 
2. It simplifies the script I will need for the CI server. 
If you're uncomfortable with placing binaries in your source control, this can be altered without affecting the produced report. 

Running the code coverage wizard, you're presented with a page to enter the executable, map file, source, and output directory locations. Below are the settings I have used:

The last option in the wizard allows for making all paths relative. This is exactly what we require to have our generated batch file run on any system; however, at the time of writing it does not work correctly. This meant that I had to manually change all paths to versions relative to the folder in which DelphiCodeCoverage was located.

dcov_execute.bat before

-e "I:\Examples\DUnitX_CodeCoverage\Win32\Debug\DUnitX_And_CodeCoverage.exe" 
-m "I:\Examples\DUnitX_CodeCoverage\Win32\Debug\DUnitX_And_CodeCoverage.map" 
-uf dcov_units.lst -spf dcov_paths.lst 
-od "I:\Examples\DUnitX_CodeCoverage\CodeCoverage\Output\" -lt -html

dcov_execute.bat after

-e "..\Win32\CI\DUnitX_And_CodeCoverage.exe" 
-m "..\Win32\CI\DUnitX_And_CodeCoverage.map" 
-ife -uf dcov_units.lst -spf dcov_paths.lst -od ".\Output\" -lt -html
Note: Line breaks are only included above for readability. There are none in the resulting batch files.
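Since the wizard's relative-path option did not work at the time of writing, the rewrite above can also be scripted rather than done by hand. A sketch using Python's ntpath (the paths are the ones from the example; ntpath keeps Windows path semantics on any OS):

```python
import ntpath

# The folder the batch file runs from (where DelphiCodeCoverage lives).
BASE = r"I:\Examples\DUnitX_CodeCoverage\CodeCoverage"

def relativize(path):
    # Rewrite an absolute Windows path relative to the CodeCoverage folder.
    return ntpath.relpath(path, BASE)

exe = relativize(r"I:\Examples\DUnitX_CodeCoverage\Win32\CI\DUnitX_And_CodeCoverage.exe")
print(exe)  # -> ..\Win32\CI\DUnitX_And_CodeCoverage.exe
```

Running each `-e`, `-m` and `-od` value through such a helper produces exactly the "after" form of the batch file.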

Another alteration that I have made to the batch file is to include the "-ife" option. This option includes file extensions, which stops a unit like "Common.Encoding" being 'converted' to "Common". As my project has a unit called "Core.Card.pas", this option is required to have it included in the generated code coverage report. 

Next, the relative path change should be applied to the two generated list files, dcov_paths.lst and dcov_units.lst. The paths file should be the only one that has paths in need of altering to be relative. Both, however, need to be checked to make sure they contain everything to be covered in the report. If source folders are missing they need to be added to the dcov_paths.lst file. If unit names are missing they need to be added to the dcov_units.lst file. 
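Checking the list files by hand is easy to get wrong, so a small script can regenerate dcov_units.lst from the source folders listed in dcov_paths.lst. This sketch assumes unit names match their .pas file names, and keeps the extension to suit the -ife option:

```python
from pathlib import Path
import tempfile

def units_in(source_dirs):
    # Collect .pas file names (extension kept, for -ife) from the source folders.
    units = set()
    for folder in source_dirs:
        for pas in Path(folder).glob("*.pas"):
            units.add(pas.name)
    return sorted(units)

# Demo with a throwaway folder standing in for the project source.
tmp = tempfile.mkdtemp()
for name in ("Core.Card.pas", "Core.CardTests.pas"):
    Path(tmp, name).write_text("unit Stub;")
print(units_in([tmp]))  # -> ['Core.Card.pas', 'Core.CardTests.pas']
```

Writing the result to dcov_units.lst (one name per line) keeps the list in sync with the source tree.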

Now that the batch file and list files have been corrected, running dcov_execute.bat should produce summary output similar to that below. Note that the unit test project needs to be compiled first, as DelphiCodeCoverage runs the unit test executable.
*              DUnitX - (c) 2013 Vincent Parrett                     *
*                    vincent@finalbuilder.com                        *
*                                                                    *
*        License - http://www.apache.org/licenses/LICENSE-2.0        *
  Fixture : Core
     Fixture : Core.CardTests
        Fixture : Core.CardTests.TCardTest
          Test : Core.CardTests.TCardTest.A_Card_FacingUp_Once_Flipped_Is_Facing_Down
          Executing Test : A_Card_FacingUp_Once_Flipped_Is_Facing_Down
         Running Fixture Teardown Method : Destroy

         Done testing.
         Tests Found   : 1
         Tests Ignored : 0
         Tests Passed  : 1
         Tests Leaked  : 0
         Tests Failed  : 0
         Tests Errored : 0

|   Lines   |  Covered  | Covered % |
|        15 |        11 |      73 % |

Continuous Integration

With the code coverage batch file we are now able to run code coverage on any system, including a continuous integration system. Our goal with continuous integration is to have the unit tests built and run each time code is checked into source control. This allows us to track unit test failures and changes in code coverage. 

To achieve this I have created a Continua CI configuration that builds my unit test project, runs the unit tests under code coverage, and then imports the unit test results into the build summary. 

The FinalBuilder action calls the FinalBuilder project responsible for compiling the DUnitX unit test project. It uses the CI configuration so that the unit test executable will run to completion and produce an NUnit XML results file in the same directory as the executable. It is important to build the unit tests each time, as the project source code will have changed between continuous integration runs. Note that you do not have to use FinalBuilder; you can also use MSBuild to build your DUnitX project - see Integrating DUnitX Unit Testing with Continua CI.

The execute program action simply runs the code coverage batch file generated above. This batch file runs the unit test project we compiled and logs code coverage information as it does so. The result is a summary written to our build log, as well as HTML files written to the report folder we specified in the batch file. It is these HTML files that we will attach to the continuous build report a little later. 

Lastly we want to import the actual unit test results. These are written out by DUnitX as an NUnit-compatible XML file which we can import with the "Import NUnit Tests" action. The results from the XML file will be attached to the build report presented by Continua CI.
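If you ever need to inspect those results outside Continua CI, the file is plain XML. A sketch parsing a minimal NUnit 2-style document (the sample below is hand-made; the element and attribute names follow the NUnit 2 schema as far as I know, and DUnitX's real output carries more attributes):

```python
import xml.etree.ElementTree as ET

sample = """<test-results total="1" failures="0" errors="0">
  <test-suite name="Core.CardTests">
    <results>
      <test-case name="A_Card_FacingUp_Once_Flipped_Is_Facing_Down"
                 executed="True" success="True"/>
    </results>
  </test-suite>
</test-results>"""

root = ET.fromstring(sample)
passed = sum(1 for tc in root.iter("test-case") if tc.get("success") == "True")
print(f"passed {passed} of {root.get('total')}")  # -> passed 1 of 1
```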

As all builds for Continua CI are run on agents, and all build reports come from the server, we need to transfer the code coverage report back to the server. This is done through workspace rules on the build stage. In this example DelphiCodeCoverage writes all HTML report files to the relative directory ".\Output\". This means that if we run the DelphiCodeCoverage batch file from "Source\CodeCoverage\", the report should appear in "Source\CodeCoverage\Output" (note that $Source.DelphiCodeCoverage.Path$ was mapped to the \Source\ folder on the agent). Workspace rules use the greater-than symbol to signal that files should be copied from the server to the agent, and the less-than symbol to copy from the agent to the server. This therefore leaves us with the workspace rule "/Output/CodeCoverage/ < \Source\CodeCoverage\Output\*.html" to get all code coverage report files back to the server. 
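The direction symbols are easy to misread, so here is the rule from above pulled apart by a toy parser (the three-part shape is assumed from this example only, not from Continua CI's full rule syntax):

```python
def parse_rule(rule):
    # Split a workspace rule into (server_path, direction, agent_path).
    # '>' copies server -> agent before the stage runs;
    # '<' copies agent -> server after the stage completes.
    for symbol, direction in (("<", "agent-to-server"), (">", "server-to-agent")):
        if symbol in rule:
            server, agent = (part.strip() for part in rule.split(symbol, 1))
            return server, direction, agent
    raise ValueError("no direction symbol in rule: " + rule)

server, direction, agent = parse_rule(r"/Output/CodeCoverage/ < \Source\CodeCoverage\Output\*.html")
print(direction)  # -> agent-to-server
```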

Now that the HTML reports are on the server, we need to show them against the Continua CI build. To achieve this we use the reports section of our Continua CI configuration. The reports section allows us to specify a file to attach to the build as a report to be displayed or offered as a download. In this case we want to display the report summary HTML file. All reports work from the server's point of view, and each build has its own workspace on the server. To this end, the report we want to display will have been copied to "$Workspace$\Output\CodeCoverage\CodeCoverage_summary.html". 

The Code Coverage Report

The final report appears in the reports section of the Continua CI build summary. 

As shown in the report, the example project has some code that is not covered during unit testing. This reduces the overall coverage to 73%. If I had more than one unit, each would have its own code coverage summary. In addition, I could click on each file and get a line-by-line report to see which sections of the unit are not covered. 

Final Notes

It is worth mentioning that code coverage is only one arrow in the software testing quiver. In my example I purposely chose to include code that was not covered. This shows the power of code coverage in picking up where unit testing should potentially be directed next. I also included code where the unit tests cover the code, but not fully. The code testing Core.Card.Flip only tests one path through the code, not all possible paths. Currently the test checks that the code works when going from face up to face down, but not from face down to face up. Although in this example it might be benign, it shows that other tools are needed to help cover this gap.
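The Flip situation generalises: line coverage can read as fully covered even when not every path through the code has been exercised. A Python stand-in for Core.Card makes this concrete:

```python
class Card:
    # Minimal stand-in for the Core.Card class in the example project.
    def __init__(self, face_up=True):
        self.face_up = face_up

    def flip(self):
        # One line handles both directions, so a single test flipping
        # face-up -> face-down marks this line as hit, even though the
        # face-down -> face-up transition was never checked.
        self.face_up = not self.face_up

card = Card(face_up=True)
card.flip()
print(card.face_up)  # -> False; the reverse transition remains untested
```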

Delphi-Mocks Parameter Matchers

We recently updated Delphi Mocks to allow for better parameter matching on Expectations registered with the Mock. This allows the developer to place tighter controls on verifying that a mocked interface/object method is called. Below is a simple example of when the parameter matchers can be used.

procedure TExample_InterfaceImplementTests.Implement_Multiple_Interfaces;
var
  sutProjectSaver : IProjectSaveCheck;
  mockProject : TMock<IProject>;
begin
  //NOTE: the IProject/IProjectSaveCheck member names below are inferred
  //      from the surrounding comments.
  //Test that when we check and save a project, and it's dirty, we save.
  //CREATE - The project saver under test.
  sutProjectSaver := TProjectSaveCheck.Create;
  //CREATE - Mock project to control our testing. 
  mockProject := TMock<IProject>.Create;
  //SETUP - Mock project will show as dirty and will expect to be saved.  
  mockProject.Setup.WillReturn(true).When.IsDirty;
  //NEW! - Add expectation that the save will be called as dirty is returning true. 
  //       As we don't care about the filename value passed to us we 
  //       allow any string to be passed to report this expectation as met. 
  mockProject.Setup.Expect.Once.When.Save(It(0).IsAny<string>());
  //TEST - Visit the mock element to see if our test works.
  sutProjectSaver.Execute(mockProject);
  //VERIFY - Make sure that save was indeed called.
  mockProject.Verify;
end;

Previously the developer writing this test would have to provide the exact filename to be passed to the mocked Save method. As we don't know what the project's filename is going to be (in our example case), we would either have to: 1. forgo this test, or 2. implement a project object to test with. Neither option is ideal.

Parameter matchers resolve this situation. It is now simple to either restrict or broaden the parameters passed to mocked methods that will satisfy the defined expectation. To achieve this, Delphi-Mocks offers eleven new functions:

function It(const AParamIndx : Integer) : ItRec;
function It0 : ItRec;
function It1 : ItRec;
function It2 : ItRec;
function It3 : ItRec;
function It4 : ItRec;
function It5 : ItRec;
function It6 : ItRec;
function It7 : ItRec;
function It8 : ItRec;
function It9 : ItRec;

The first "function It(const AParamIndx : Integer) : ItRec;" allows the developer to specify the index of the parameter they wish to set for the next expectation setup of a mock method. It(0) will refer to the first parameter, It(1) the second and so forth. Note that the reason for specifying the parameter index is that Delphi's parameter evaluation order is not defined, so we could not rely on the parameters being evaluated in order (which is what we did when we initially wrote this feature). Interestingly, with the 64 bit Delphi compiler, parameter evaluation does appear to happen in order, but we could not be certain this will always be the case. 

The other ten functions, It0 through It9, are simply wrappers around the indexed call, passing the index in their name. All of these functions return an ItRec, which has the following structure:

ItRec = record
  ParamIndex : cardinal;
  constructor Create(const AParamIndex : Integer);
  function IsAny<T>() : T;
  function Matches<T>(const predicate: TPredicate<T>) : T;
  function IsNotNil<T> : T;
  function IsEqualTo<T>(const value : T) : T;
  function IsInRange<T>(const fromValue : T; const toValue : T) : T;
  function IsIn<T>(const values : TArray<T>) : T; overload;
  function IsIn<T>(const values : IEnumerable<T>) : T; overload;
  function IsNotIn<T>(const values : TArray<T>) : T; overload;
  function IsNotIn<T>(const values : IEnumerable<T>) : T; overload;
  {$IFDEF SUPPORTS_REGEX} //XE2 or later
  function IsRegex(const regex : string; const options : TRegExOptions = []) : string;
  {$ENDIF}
end;

Each of the functions creates a different matcher. For example the IsAny<T> will cause the expectation to be met when the parameter passed to the mock is of any value that has the type T. In the example above this type would be a string. You will also notice that each function returns the type T. This is so that each call can be placed within the mock methods call directly. Doing so helps with making sure parameter types match the testing value.

IsEqualTo<T> requires that the parameter matches exactly to the value passed into the IsEqualTo<T>. This could be used to restrict the expectation to a tighter test of the functionality under test.

//Match on the filename being "temp.txt" only.
mockProject.Setup.Expect.Once.When.Save(It(0).IsEqualTo<string>('temp.txt'));
//VERIFY - Make sure that save was indeed called.
mockProject.Verify;
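For readers who know other mocking libraries, the two matchers used so far have direct analogues in Python's unittest.mock: ANY plays the role of IsAny<string>, while passing a concrete value behaves like IsEqualTo<string>:

```python
from unittest.mock import MagicMock, ANY

project = MagicMock()

# The code under test saves to a filename we do not control.
project.save("2015-09-01_project.dproj")

# IsAny analogue: any filename satisfies the expectation.
project.save.assert_called_with(ANY)

# IsEqualTo analogue: only the exact filename satisfies it.
try:
    project.save.assert_called_with("temp.txt")
    matched = True
except AssertionError:
    matched = False
print(matched)  # -> False, because the filename differs
```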

In the future we are looking to provide "And"/"Or" operators. These operators might also live on the ItRec and allow combining any number of matchers of the same type.

//Match on the filename being "temp.txt" or "temp.doc" only.
//VERIFY - Make sure that save was indeed called.

This would make the resulting code a bit cleaner and the tests easier to read, compared with using a regex (which is also possible in this case). As a result, we believe this would be a good addition to the library.

Feel free to clone the repository from GitHub. If you have some time to spare, submit a pull request or two with your ideas/improvements. We believe this is a great little project worthy of some attention. Let us know what you think of the changes so far.

Today we are announcing the new build step for Team Foundation Build 2015. This task will allow users of TFS on-prem and VSO to run FinalBuilder projects on Team Foundation Build agents. The task itself is open source and can be found on GitHub.

Those who use TFS on-prem will already be very familiar with our XAML build activity. This activity took a great deal of confusion out of the XAML build process, changing a build process from a complex workflow into a simple-to-maintain FinalBuilder project. The time and effort saved is huge, especially considering the "default" XAML workflow looks like this:

Thankfully Microsoft have improved on their build system with the release of Team Foundation Build 2015. You can read more about this at Team Foundation Build 2015. In summary, the new build system greatly simplifies the build process into a list of tasks to perform. 

The FinalBuilder task is our custom task for TFS Build 2015. It offers TFS build script creators a simplified overview of their build process while gaining the power of FinalBuilder and all its supported actions. With FinalBuilder, TFS build script creators are able to perform a wide range of tasks that would otherwise require breaking out PowerShell and diving into the TFS agent environment variables. 

When installed, the FinalBuilder Task gives users the following UI. All of the properties present in the UI are easily accessible from within any FinalBuilder script run by the task. FinalBuilder also gives simple access to a list of files that triggered the build.

Installation and Usage

The steps for adding custom build activities to your TFS and VSO instances are quick and easy. We have created a GitHub repository explaining how to install and use our FinalBuilder VSO task.

Repository Clone

To clone this repository use the following command line. You will require git to be installed and available on your path.
> mkdir VSoft
> cd VSoft
> git clone https://github.com/VSoftTechnologies/FinalBuilder-VSO.git
Cloning into 'FinalBuilder-VSO'...
remote: Counting objects: X, done.
remote: Compressing objects: 100% (X/X), done.
remote: Total X (delta 0), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (X/X), done.
Checking connectivity... done.
With the repository cloned, we require the TFS Extensions Command Line Utility (tfx-cli). It comes as a Node Package Manager (npm) package. Npm comes with both the node.js and io.js installers. Download the installer for your Windows platform and run it.

To check that NPM is working correctly you can use the npm version command
> npm -v
Now you're able to install the tfx-cli package using npm. Install it globally so that it's accessible on the command line. The command for this is as follows:
> npm install -g tfx-cli
tfx-cli@0.1.11 C:\Users\<username>\AppData\Roaming\npm\node_modules\tfx-cli
├── os-homedir@1.0.1
├── async@1.4.2
├── colors@1.1.2
├── minimist@1.2.0
├── node-uuid@1.4.3
├── q@1.4.1
├── read@1.0.7 (mute-stream@0.0.5)
├── validator@3.43.0
├── shelljs@0.5.3
├── vso-node-api@0.3.4
└── archiver@0.14.4 (buffer-crc32@0.2.5, lazystream@0.1.0, async@0.9.2, readable-stream@1.0.33, tar-stream@1.1.5, glob@4.3.5, lodash@3.2.0, zip-stream@0.5.2)
To test that tfx-cli is working correctly and is on the path use the tfx command.
> tfx
Copyright Microsoft Corporation
tfx <command> [<subcommand(s)> ...] [<args>] [--version] [--help] [--json]
        manage task extensions and builds
        command help
        login and cache credentials. types: pat (default), basic
        login <collection url> [--authtype <authtype>] [options]
        parse json by piping json result from another tfx command
        parse <jsonfilter> [options]
        output the version
        version [options]
   --help    : get help on a command
   --json    : output in json format.  useful for scripting
For tfx-cli to upload a task to TFS it needs to be logged in. We can do this once so that all subsequent commands use the same credentials. The method used depends on whether you're using VSO or an on-premises installation.

On Premises Login

For on-premises TFS, basic authentication will need to be enabled. The tfx-cli project has a great guide on how to achieve this: Using tfx against Team Foundation Server (TFS) 2015 using Basic Authentication.

Once TFS has been configured to use basic authentication use the tfx-cli login command to connect to TFS. You will be prompted for the TFS collection URL to connect to, and the username and password for accessing that collection.
> tfx login --authType basic
Copyright Microsoft Corporation
Enter collection url > http://<server>:<port>/tfs/<collection>
Enter username > <user>@<domain>
Enter password > <password>
logged in successfully
With a successful login subsequent commands will not require us to provide the credentials again.

Visual Studio Online (VSO) Login

For VSO login you need a personal access token set up under your account. There is a great article on configuring an access token: Using Personal Access Tokens to access Visual Studio Online.

With the personal access token configured use the tfx-cli login command to connect to VSO. You will be prompted for the TFS collection URL to connect to, and access token for accessing that collection.
> tfx login
Copyright Microsoft Corporation
Enter collection url > https://<vsoname>.visualstudio.com/<collection>
Enter personal access token > <access token>
logged in successfully
With a successful login subsequent commands will not require us to provide the credentials again.

Uploading Task

Once logged into TFS we are able to upload the FinalBuilder task to the server. Tasks are uploaded to the server, which then passes them on to the agents required to run them.

To upload the task use the tfx build tasks upload command. Each command shown below is a sub-command of the previous, so order matters here. The overwrite option is included so that any previously installed version is overwritten. Note, however, that the highest version number of the task wins when running builds.

Note: This command is run from the directory into which the repository was cloned (i.e. FinalBuilder-VSO).
> tfx build tasks upload ./FinalBuilder --overwrite
Copyright Microsoft Corporation
task at: ./FinalBuilder uploaded successfully!
To test that the FinalBuilder task is now installed, open the builds page for the collection the task was uploaded to and create a new empty Team Foundation Build definition. After clicking "Add build step", a FinalBuilder task should appear in the "Build" category.

Further Steps

For more information on the following subjects, please follow the links:

How to configure the build task refer to Task UI.
How to install FinalBuilder on an agent refer to Installing FinalBuilder.
How to create a FinalBuilder VSO agent refer to Creating a VSO FinalBuilder Agent.

Team Foundation Server XAML Builds


Today we have released an update for Team Foundation Server XAML activities for FinalBuilder 7 and 8. These updates are to deal with conflicts caused by GAC installing these activities.

Since 2010, Team Foundation Server has allowed custom activities to provide extra functionality in XAML build workflows. For those who have not had the (dis)pleasure, XAML workflows can be difficult to work with. To alleviate this pain point we implemented XAML build templates and custom activities to allow people to run any FinalBuilder script within the build workflow without the need to edit XAML. Customers we have talked to say this simplifies their TFS XAML build workflows a great deal.

In previous releases, FinalBuilder’s installer automatically installed the TFS XAML activity into the GAC. The version of the activity installed was based on the TFS Agent detected on the machine in question. Installing into the GAC was done to simplify the process. This way the developer would simply use the activities in their build workflow and it would be picked up through the GAC.

With the introduction of the new TFS build in TFS 2015 and TFS-Git in TFS 2013, this is no longer advisable. In the case of TFS-Git workflows, assembly conflicts can cause assembly load issues. If the activity requires a different assembly version to those already loaded by the TFS agent, you would see assembly load errors with libgit2sharp. With TFS 2013 having had five updates, this is increasingly possible.

The way to avoid any assembly loading issue for custom activities is to use “version control paths for custom XAML activities”. To be clear, we have left the GAC installation option in both FinalBuilder 7 and 8. We however do recommend switching to using custom XAML activity paths, especially if you’re using TFS-Git.

To this end, the FinalBuilder installers now give the option of whether or not to install the XAML build activities into the GAC. Only FinalBuilder 7 will automatically install TFS activities into the GAC for TFS 2012 and earlier agents; with FinalBuilder 8 you must choose whether to GAC install the assemblies.

Note that if activities were previously installed in the GAC restarting the TFS Build Controller is required. This refreshes the build controller and releases any assemblies that it may have previously loaded.

Creating a build definition

The creation of the build definition is exactly the same as before. If you’re interested in how to setup a build definition from scratch using the FinalBuilder XAML templates please review the “FinalBuilder and Team Foundation Server” article.

Custom XAML activities

Version control paths for custom XAML activities is a feature in TFS XAML build controllers. This feature allows the build controller to source all assemblies required for an activity from a known location. If a required assembly is missing from this location the standard .Net assembly lookup methodology is used.

Using version control paths for custom XAML activities requires the activity assemblies to be added to a repository on the TFS system. This repository can be shared with other code, or can be a repository just for the assemblies. As the custom folder does not change based on the build being performed, we suggest a separate TFS repository for the custom activities. The repository can also be either a TFS or Git-TFS repository. Both will work the same.

To add the custom activity assemblies to a repository, connect to the repository through Team Explorer in Visual Studio.

In the source control view create a new folder that will hold the custom activity assemblies.

In the GAC sub-folder of your FinalBuilder installation (typically "%ProgramFiles(x86)%\FinalBuilder 8\GAC\") there are folders for each version of TFS that custom activities are provided for. From the folder relating to your TFS version, copy all the assemblies it contains into the newly created repository folder.

From the source control explorer add these files to source control.

Updating the XAML Build Controller

Next, the build controller needs to be configured to use the repository location. In the Builds tab of Team Explorer, click on the "Actions" link. A drop-down will appear with the option to "Manage Build Controllers…". Click the "Manage Build Controllers…" menu item.

From the build controllers window that opens select the build controller responsible for the FinalBuilder builds. If there is more than one controller simply follow these steps for each controller. Next click the “Properties” button.

This will present the build properties dialogue in which the “Version control path to custom assemblies” can be set. Select the folder that was created in the repository that is responsible for the custom activity assemblies. Now confirm this change by clicking OK on the build controller properties dialogue.

This sets up the build controller to source custom assemblies from the configured repository folder. The most recently checked-in assemblies will always be sourced, so keeping this repository folder in sync with the custom activities used in build workflows is very important.

Once the custom path is set the build controller can run builds using the custom activities. To test simply queue a build using the FinalBuilder custom activities. 

Version 1.7 of Continua is now released. A big thank you to all those who downloaded the beta and especially those of you who reported issues and bugs.

This version introduces several new features, many of which have been requested by users over the past few months. These features are built upon the various improvements and bug fixes applied in revisions to version 1.6. Please don't be dismayed if your requested feature is not included yet; it is still high on our to-do list. Indeed, we have several other features specced out, and some partially developed in the background.

Version 1.7 Features

New Builds View dashboard

This view is useful for project administrators and shows a list of active builds across all viewable configurations. This includes running builds, queued builds and builds awaiting promotion.

New panel of indicators

Some important numbers including the total count of queued and running builds, as well as available agents and concurrent build licenses.

New Repositories tab

This is accessed via the Configurations view and shows the status of each repository. We've also included "Check Now" buttons for immediately polling each repository. You can also initiate repository checking from all existing repository pages.

Project-wide and configuration versioning options.

We've added some new options in the details section of the project and configuration wizards:

  • Project-wide versioning: The build version number can now be incremented across many configurations within a project.

  • Build number re-use: A new option at the project or configuration level to decrement the version counter when a build is discarded while initialising, e.g. due to configuration conditions. Please note that the build number will be decremented only if no other build has started in the meantime using a later build number.  
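The guard in the build number re-use option can be sketched as follows (illustrative pseudologic only, not Continua CI's actual implementation):

```python
class VersionCounter:
    # Toy model of the "build number re-use" option.
    def __init__(self):
        self.next_number = 1

    def start_build(self):
        number = self.next_number
        self.next_number += 1
        return number

    def discard(self, number):
        # Re-use the number only if no later build has started since.
        if self.next_number == number + 1:
            self.next_number = number

counter = VersionCounter()
a = counter.start_build()  # build 1
counter.discard(a)         # nothing newer started: number 1 is re-used
b = counter.start_build()  # build 1 again
c = counter.start_build()  # build 2
counter.discard(b)         # build 2 already started: 1 is NOT re-used
print(b, c, counter.next_number)  # -> 1 2 3
```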

Improvements to Build Completed triggers.

  • Variable expressions: You can now use expressions when defining variables allowing you to pass information from triggering to triggered build.

  • New conditions tab: This allows you to use expressions to control whether a build is triggered

Improvements to Repository triggers.

  • Trigger on specific file change types: Triggers can now be set to start only when the changeset contains certain types of file changes e.g. additions, modifications and deletions.

  • Trigger file pattern: You can now specify a file pattern for repository triggers to restrict triggering only to changesets containing matching files.

  • Trigger comment pattern: You can also limit triggering to changesets with specific text in the comment.
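The three repository-trigger filters above compose as a simple conjunction. A sketch of how a changeset might be tested against them (the changeset fields and matching rules here are illustrative, not Continua CI internals):

```python
from fnmatch import fnmatch
import re

def should_trigger(changeset, change_types, file_pattern, comment_pattern):
    # A changeset triggers a build only if it passes all three filters.
    type_ok = any(kind in change_types for kind, _ in changeset["changes"])
    file_ok = any(fnmatch(path, file_pattern) for _, path in changeset["changes"])
    comment_ok = re.search(comment_pattern, changeset["comment"]) is not None
    return type_ok and file_ok and comment_ok

changeset = {
    "comment": "Fix card flip. [build]",
    "changes": [("modified", "Source/Core.Card.pas")],
}
print(should_trigger(changeset, {"added", "modified"}, "*.pas", r"\[build\]"))  # -> True
```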

Other build features

  • New force repository check option in the queue build dialog, allowing control over whether to recheck the repository when building. There is also a default setting for each configuration.

  • Improvements to Stop Build buttons on dashboard view to ensure that the build stopped is always the latest build at the time when the button was clicked. Stop build dialogs also now display the build number of the build being stopped.

Actions and event handlers

  • New node.js actions
    • Package management with Npm and Bower
    • Grunt and Gulp build runners
    • Unit testing with Mocha

  • New build event handler for posting status updates to a Stash server

  • Log Entry action now allows you to add the message as a build comment. This can be useful for showing additional build details on the build view page.

  • New comments field on all actions – displayed as a tooltip in Stages editor.

  • New ContinuaCI.* system environment variables are now available to all executable actions.


  • Execute Program, DOS Command and PowerShell actions now include an option to generate a context XML file. This file contains details of the build including repositories, changesets and files for you to parse with your own script or program.

Git repositories

  • Case-only renames are now recorded in the repository cache.

  • New option to list author instead of committer as changeset username

Version 1.7 is ready for you to download and install. All feedback is welcome!