VSoft Technologies Blogs



Automise 5 Beta

Today we are delighted to release the Automise 5 BETA, which contains our new Stepping Engine and Action List Dependencies as the headline features.

For five years we have updated and improved Automise 4 through our continuous delivery cycle. This has worked well, allowing improvements to actions (like the FTP/SFTP/FTPS suite) to come out gradually and consistently, and letting everyone pick and choose at which point to take feature updates.

Automise 5 signals a major "tick" in this regular flow of updates. The majority of these updates are at the core of what Automise does to solve your automation challenges.

What's new in Automise 5

Stepping Engine

We have undertaken a major rewrite of the internal stepping engine for Automise 5. This has reduced the moving parts while also enabling extra features to be implemented. In the end this means your projects will run faster and consume fewer resources, while also providing some extra tools for debugging projects.

Action List Dependencies

Action Lists can now list other Action Lists they depend on. Dependencies are always run before the action lists which depend on them. For example, this allows specifying an UploadAndClean Action List that depends on the Clean and Upload Action Lists. When UploadAndClean is run, the Clean and Upload Action Lists will be run first if they have not already been run.
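The dependency rule above amounts to a small resolution algorithm. Here is an illustrative Python sketch of the semantics (the names are hypothetical; this is not Automise's implementation):

```python
# Illustrative sketch: run an action list's dependencies first,
# and never run any action list twice.
def run_order(name, depends_on, ran=None):
    """depends_on maps an action list name to the lists it depends on."""
    if ran is None:
        ran = []
    for dep in depends_on.get(name, []):
        run_order(dep, depends_on, ran)
    if name not in ran:  # an already-run action list is skipped
        ran.append(name)
    return ran

depends_on = {"UploadAndClean": ["Clean", "Upload"]}
print(run_order("UploadAndClean", depends_on))  # ['Clean', 'Upload', 'UploadAndClean']
```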

Action List Dependencies

Step into included projects

With the previous version of the stepping engine, stepping into included projects was not possible; instead, the user had to wait for the included project to complete before continuing with debugging. In Automise 5, stepping into an included project will open that project and continue stepping from inside it.

Breakpoint Conditions

Another addition to the debugging experience is breakpoint conditions. These allow stopping the execution of a script at a certain point. A condition can be a number of passes over the breakpoint, or a variable equalling a certain value.
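As a rough illustration (hypothetical names, not Automise's implementation), a breakpoint condition boils down to a check like this:

```python
# Illustrative sketch: a conditional breakpoint fires when its pass count
# is reached, or when a watched variable holds an expected value.
def breakpoint_fires(passes, required_passes=None,
                     variable_value=None, expected_value=None):
    if required_passes is not None and passes >= required_passes:
        return True
    if expected_value is not None and variable_value == expected_value:
        return True
    return False

print(breakpoint_fires(passes=3, required_passes=3))  # True
print(breakpoint_fires(passes=1, required_passes=3))  # False
```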


IDE Themes (Light and Dark)

After five years we thought it was time Automise got a new coat of paint. We have implemented two new themes, a light and dark theme (defaulting to the dark on first run up).


Action List Out Parameters

Action Lists now allow any number of values to be retrieved from them. A variable assigned to an out parameter on the Action List will be given the value of that parameter when the Action List has completed. This allows for more Action Lists that generate values for use elsewhere in the Automise project.

Project Formats

Since the start of Automise, the project files have used XML for their structure. As Automise has grown, so too have the elements in the project's XML file. This has placed more strain on anyone who needs to diff versions of Automise projects.

To alleviate this challenge, Automise 5 introduces two major updates to the Automise project file structure.

   1. A new DSL project file format (the new default format)

   2. A new XML project file format

The new Automise DSL structure is concise, and very simple to diff.

    projectid = {04710B72-066E-46E7-84C7-C04A0D8BFE18}
    actionlist
        name = Default
        actionlistid = {E6DE94D6-5484-45E9-965A-DB69885AA5E2}
        id = {D860420B-DE46-4806-959F-8A92A0C86429}
The new Automise XML structure is a great deal less verbose than the older format.
    <?xml version="1.0" encoding="UTF-8"?>
    <actionlist>
        <actionlistid>{E6DE94D6-5484-45E9-965A-DB69885AA5E2}</actionlistid>
    </actionlist>


New Actions

Not much to report here; most of the focus has been on the stepping engine and the IDE. We do have some updates to AWS EC2 and Azure in progress; they will most likely be added in an update when they are ready.

How do I get the Beta?

Links to the beta downloads will be published to the Automise Downloads page. 

What if I find a bug?

Email support (please add Beta to the subject). When reporting an issue, be sure to include the beta build number and details about your environment. Please test with the latest beta build before reporting bugs.

We are particularly keen for people to load up their existing projects from older versions of Automise (i.e. 4 or earlier), save them in the AT5 format, load them again, and confirm that everything loaded ok.

When will it be released?

When it's ready ;) Seriously, though, we expect the release to happen in the next few weeks. Automise 5 is based on FinalBuilder 8, which has been out for several months now and is quite stable. 

Code Signing Changes for 2016

Microsoft announced some time ago that Windows 7 and higher would no longer trust anything that is code signed with an SHA1 (https://en.wikipedia.org/wiki/SHA-1) certificate as of 1st Jan 2016. The reason for this is well documented: SHA1 has become increasingly vulnerable and is no longer secure enough to be trusted.

What do I need to do?

First things first, check your code signing certificate. If it's current and uses SHA1, contact your certificate issuer for a replacement SHA256 certificate. Most issuers have a formal process for this, since it is something they have known about for a while, and should not charge extra (KSoftware/Comodo do not charge). In our case, our certificate was renewed in Nov 2014 and was already an SHA256 certificate.

What if I need to support Windows XP/Server 2003?

Windows XP and Server 2003 do not support SHA256, so this deprecation of SHA1 does not apply to those versions of Windows. If you sign with your SHA256 certificate using the SHA256 digest algorithm, you will find your code is not trusted on those versions of Windows. The trick is to use the SHA1 digest algorithm.

So do I need separate installers for XP and Windows 7+ ?

Well, that's one way to do it, but you can support XP and Windows 7+ with a single installer or exe by signing twice, with SHA1 and SHA256.

NOTE: If you are a long-time FinalBuilder user and still using the Authenticode action, then don't: it has been deprecated for some time, as it uses the deprecated capicom.dll API. The only reason we haven't removed it is to avoid errors when you load your old projects. The correct actions to use for code signing are the SignTool actions.

Recent versions of signtool.exe include a switch (/as) to append a signature (the default operation is to replace the primary signature). I believe the Windows 8.1 SDK was the first version to include this option (and other related options).

In my experiments, I found that you need to sign with SHA1 first, then SHA256. The reason for this is that WinXP only looks at the first signature and would not recognise the timestamps from any RFC3161 timestamp servers that I tried. The signtool options that allow adding additional signatures (/as for signing, /tp for timestamping) only work with RFC3161-compliant timestamp servers, so the SHA1 signature and timestamp must be done first, since we can't use /as or /tp with a non-RFC3161 timestamp server.

Sign, then TimeStamp

Whilst signtool can sign and timestamp in a single operation (and the SignTool Sign action in FinalBuilder can too), I prefer to do the timestamp step separately. The reason for this is that signing rarely fails (typically only when the certificate has expired or you get the password wrong!), but timestamping fails often, because the timestamp server may be unreachable or it just has some issue and doesn't respond correctly.

By doing the timestamp operation separately, we can retry if timestamping fails. Often, just a few seconds between retries is enough (unless your internet connection is down), and there is always the option of using a different timestamp server.  
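That retry idea can be sketched as a small helper (illustrative Python, not a FinalBuilder script; in a real project each attempt would invoke signtool's timestamp command and check its exit code):

```python
# Illustrative sketch: retry a flaky step, such as timestamping,
# pausing a few seconds between attempts.
import time

def retry(step, attempts=3, delay_seconds=5):
    """Run step() until it reports success or attempts are exhausted."""
    for attempt in range(1, attempts + 1):
        if step():
            return True
        if attempt < attempts:
            time.sleep(delay_seconds)
    return False
```

If every attempt fails, the same helper could simply be pointed at a different timestamp server.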

So the order of events is:

   1. Sign SHA1.
   2. Sign SHA256 (appending the signature with /as).
   3. Timestamp the SHA1 signature - using an older-style Authenticode timestamp server.
   4. Timestamp the SHA256 signature (/tp with index 1) - using an RFC3161-compliant timestamp server.
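For reference, those four steps map to signtool invocations roughly like the following. This is a sketch only: the certificate file, password, and timestamp server URLs are placeholders you must substitute with your own.

```
signtool sign /f mycert.pfx /p <password> /fd sha1 MyApp.exe
signtool sign /f mycert.pfx /p <password> /fd sha256 /as MyApp.exe
signtool timestamp /t http://<authenticode-timestamp-server> MyApp.exe
signtool timestamp /tr http://<rfc3161-timestamp-server> /td sha256 /tp 1 MyApp.exe
```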

Show me how!

I have created an examples repository on GitHub: FinalBuilder Examples - you can find a nice example there showing how to (optionally) double sign and then timestamp (with retries). This example requires FinalBuilder or later (which added support for the /tp option on the timestamp action).



A while back I published the VSoft.CommandLineParser library on GitHub, which makes it simple to handle command line options in Delphi applications. The first version only did enough to satisfy the needs I had in DUnitX.

In another project I'm working on, I needed a command mode, where each command has a unique set of options, while keeping the ability to have global options. I have tried to implement the command mode in a backwards-compatible manner, and so far the only change I had to make to an existing project was adding a const to a parameter.

Adding Commands

Adding commands is quite simple, using TOptionsRegistry.RegisterCommand.

cmd := TOptionsRegistry.RegisterCommand('help', 'h', 'get some help', '', 'commandsample help [command]');
option := cmd.RegisterUnNamedOption<string>('The command you need help for',
  procedure(const value : string)
  begin
    THelpOptions.HelpCommand := value;
  end);

Note: this method returns a TCommandDefinition record that you can add options to. The reason for using a record rather than an interface here is that Delphi interfaces do not support generic methods. Records do, so we use the record type as a wrapper around the ICommandDefinition interface.

The helpstring parameter allows you to specify a longer help message that can be displayed when showing command usage.

Handling Commands

The ICommandLineParseResult interface has a new Command property (string) which is used to determine the selected command. It’s up to you how to actually run the commands.

Showing Usage

The PrintUsage method now has some overloads and some formatting improvements, and TOptionsRegistry also has new EnumerateCommands and EnumerateCommandOptions methods which make it relatively simple to handle showing usage yourself if you want to.

Where is it?

The source with samples is available on GitHub - https://github.com/VSoftTechnologies/VSoft.CommandLineParser

This topic is something that I pulled from our support system; it's something we get asked about more than once a year: how do I modify an XML manifest file using FinalBuilder? Typically it's the assembly version attribute that users want to modify, so that's what I'll show here, but you can use the same technique to edit other parts of the manifest file.

So let's define our XML Document by adding an XML Document Define action, and point it at our manifest file.

Define XML Document

If you open your manifest file in Notepad, you will notice the assembly element looks something like this:

<assembly xmlns="urn:schemas-microsoft-com:asm.v1"  manifestVersion="1.0" >

Note the xmlns attribute; this is what causes users problems with the XML actions in FinalBuilder: XML namespaces. The MSXML parser is very strict when it comes to namespaces, and it requires that we make use of them when using XPath to select nodes. On the XML Document action, switch to the MSXML Parser tab and, in the Extra Namespaces grid, add the following.

Namespace Prefix

What we are doing here is assigning a prefix (x in this case) to the namespace. This prefix will be used in our XPath statements.

Add an Edit XML File action, set the XML File option to use an XML Document, and select the document we defined with the previous action. Now we need to define the XPath statement to the version attribute that we are going to modify.

Edit XML

And finally, add a Save XML Document action to save our changes to the file. Note that if you are editing other parts of the manifest file, make sure you add the namespaces with different prefixes, and use those prefixes appropriately in your XPath statements.

Note: all of this could be done in a single Edit XML File action; however, if you want to make more than one modification to the manifest file, it's more efficient to use the XML Document Define action to avoid loading/parsing/saving the document for each edit.
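The namespace-prefix idea is not specific to FinalBuilder's actions. Here is a hypothetical Python sketch of the same edit, using only the standard library and an invented manifest, to show why the prefix is needed at all:

```python
import xml.etree.ElementTree as ET

# A hypothetical stand-in for a real manifest file.
MANIFEST = (
    '<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">'
    '  <assemblyIdentity name="MyApp" version="1.0.0.0" type="win32"/>'
    '</assembly>'
)

# Map a prefix ("x") to the manifest's default namespace, just as the
# Extra Namespaces grid does for the MSXML parser.
NS = {"x": "urn:schemas-microsoft-com:asm.v1"}

def set_assembly_version(manifest_xml, new_version):
    root = ET.fromstring(manifest_xml)
    # Without the "x:" prefix, find() would return None, because every
    # element in the document lives in the default namespace.
    identity = root.find("x:assemblyIdentity", NS)
    identity.set("version", new_version)
    return ET.tostring(root, encoding="unicode")

print(set_assembly_version(MANIFEST, "2.0.0.0"))
```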

Testing code is something we all do. Whether it be manual usability testing, unit testing, or integration testing, knowing how much of the application is covered by the tests is important. Without knowing what parts of the application are covered, there is no way to know if key features are tested.

When performing unit testing there is an analytical way to determine what parts of the source code are covered by the tests. This is typically called source code coverage. Working with Delphi, one of the tools that performs this task is DelphiCodeCoverage (open source). It can be found on GitHub (a more recent fork) and SourceForge. Under the hood, this tool simply marks each line of source code as "hit" when the application calls it at least once. From there it can generate a detailed report giving the overall coverage statistics for the project, as well as the individual lines not hit during testing.
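The arithmetic behind such a report is simple; a toy sketch:

```python
# Toy sketch of the statistic a coverage tool reports: the lines hit
# during a run, as a percentage of all executable lines.
def coverage_percent(hit_lines, executable_lines):
    return round(100 * len(hit_lines & executable_lines) / len(executable_lines))

# e.g. 11 of 15 executable lines hit
print(coverage_percent(set(range(11)), set(range(15))))  # 73
```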

Code Coverage

What I will go through below is how to set up code coverage on a unit test project, and hook that into a continuous integration process using Continua CI. I will assume that if you require knowledge of how to set up a project on Continua CI you will refer to the Create your First Project wiki page.

The code that I would like to get a code coverage report on is the Core.Card.pas unit. The unit tests for this class are located in the tests folder and have the corresponding name Core.CardTests.pas. You may have noticed that some of the code paths are not fully covered by my unit tests. This is intentional, and something that we will come to a little later on.

An extra consideration with a unit testing project is to make sure it can run under continuous integration. This means that it should run to completion and produce an output file that can be imported into the build summary. With this in mind I have created a "CI" configuration for my unit testing project. This conditionally compiles the unit testing project so that it does not wait for user input (something my debug configuration does) and generates an XML output file.

All this code and other related scripts are located in the VSoftTechnologies/DelphiCodeCoverageExample GitHub repository. Feel free to clone it to get a better sense of code coverage and the project structure I am using. 


To generate a code coverage report I decided to use DelphiCodeCoverage. The tool has a number of command line options, all of which are spelt out on the GitHub page. Some of the options require a little effort; an example is passing a file that contains all the source directories to search for classes to include in the code coverage report. Thankfully a wizard is supplied with DelphiCodeCoverage that will help generate a batch file containing the correct parameters to pass to DelphiCodeCoverage.

In my project I have placed DelphiCodeCoverage into a sub-folder called "CodeCoverage" and included it in source control. There are two reasons for doing this:
1. The code coverage executable is now available everywhere the source is pulled to.
2. It simplifies the script I will need for the CI server.
If you're uncomfortable with placing binaries in your source control, this can be altered without affecting the produced report.

Running the code coverage wizard, you're presented with a page to enter the executable, map file, source, and output directory locations. Below are the settings I have used:

The last option in the wizard allows for making all paths relative. This is exactly what we require to have our generated batch file run on any system; however, at the time of writing it does not work correctly. This meant that I had to manually change all paths to be relative to the folder in which DelphiCodeCoverage was located.

dcov_execute.bat before

-e "I:\Examples\DUnitX_CodeCoverage\Win32\Debug\DUnitX_And_CodeCoverage.exe" 
-m "I:\Examples\DUnitX_CodeCoverage\Win32\Debug\DUnitX_And_CodeCoverage.map" 
-uf dcov_units.lst -spf dcov_paths.lst 
-od "I:\Examples\DUnitX_CodeCoverage\CodeCoverage\Output\" -lt -html

dcov_execute.bat after

-e "..\Win32\CI\DUnitX_And_CodeCoverage.exe" 
-m "..\Win32\CI\DUnitX_And_CodeCoverage.map" 
-ife -uf dcov_units.lst -spf dcov_paths.lst -od ".\Output\" -lt -html
Note: Line breaks are only included above for readability. There are none in the resulting batch files.

Another alteration that I have made to the batch file is to include the "-ife" option. This option includes file extensions, which stops a unit like "Common.Encoding" being 'converted' to "Common". As my project has a unit called "Core.Card.pas", this option is required to have it included in the generated code coverage report.

Next, the relative path change should be applied to the two generated list files, dcov_paths.lst and dcov_units.lst. The paths file should be the only one with paths in need of altering to be relative. Both, however, need to be checked to make sure they contain everything to be covered in the report. If source folders are missing, they need to be added to the dcov_paths.lst file; if unit names are missing, they need to be added to the dcov_units.lst file.

Now that the batch file and list files have been corrected, running dcov_execute.bat should produce summary output similar to that below. Note that the unit test project needs to be compiled first, as DelphiCodeCoverage runs the unit test executable.
*              DUnitX - (c) 2013 Vincent Parrett                     *
*                    vincent@finalbuilder.com                        *
*                                                                    *
*        License - http://www.apache.org/licenses/LICENSE-2.0        *
  Fixture : Core
     Fixture : Core.CardTests
        Fixture : Core.CardTests.TCardTest
          Test : Core.CardTests.TCardTest.A_Card_FacingUp_Once_Flipped_Is_Facing_Down
          Executing Test : A_Card_FacingUp_Once_Flipped_Is_Facing_Down
         Running Fixture Teardown Method : Destroy

         Done testing.
         Tests Found   : 1
         Tests Ignored : 0
         Tests Passed  : 1
         Tests Leaked  : 0
         Tests Failed  : 0
         Tests Errored : 0

|   Lines   |  Covered  | Covered % |
|        15 |        11 |      73 % |

Continuous Integration

With the code coverage batch file we are now able to run code coverage on any system, including on a continuous integration system. Our goal with continuous integration is to have the unit tests built and run each time a set of code is checked into source control. This will allow us to track if any unit tests fail, along with changes in the code coverage.

To achieve this I have created a Continua CI configuration that builds my unit test project, runs the unit tests under code coverage, and then imports the unit test results into the build summary.

The FinalBuilder action calls the FinalBuilder project responsible for compiling the DUnitX unit test project. It uses the CI configuration so that the unit test executable will run to completion and produce an NUnit XML results file in the same directory as the executable. It is important to build the unit tests each time, as the source code for our project will have changed between runs of the continuous integration. Note that you do not have to use FinalBuilder; you can also use MSBuild to build your DUnitX project - see Integrating DUnitX Unit Testing with Continua CI.

The Execute Program action simply runs the code coverage batch file generated above. This batch file will run the unit test project we compiled and log code coverage information as it does so. The result is a summary written to our build log, as well as HTML files written to the report folder we specified in the batch file. It is these HTML files which we will attach to the continuous build report a little later.

Lastly we want to import the actual unit test results. These are written out by DUnitX as a NUnit compatible XML file which we can import with the "Import NUnit Tests" action. The results from the XML file will be attached to the build report presented by Continua CI.

As all builds for Continua CI are run on agents, and all build reports come from the server, we need to transfer the code coverage report back to the server. This is done through workspace rules on the build stage. In this example DelphiCodeCoverage writes all HTML report files to the relative directory ".\Output\". This means that if we run the DelphiCodeCoverage batch file from "Source\CodeCoverage\" the report should appear in "Source\CodeCoverage\Output" (note that $Source.DelphiCodeCoverage.Path$ was mapped to the \Source\ folder on the agent). Workspace rules use the greater-than symbol to signal that files should be copied from the server to the agent, and the less-than symbol to copy from the agent to the server. This leaves us with the workspace rule of "/Output/CodeCoverage/ < \Source\CodeCoverage\Output\*.html" to get all code coverage report files back to the server.

Now that the HTML reports are on the server, we need to show them against the Continua CI build. To achieve this we use the Reports section of our Continua CI configuration. The Reports section allows us to specify a file to attach to the build as a report to be displayed, or offered as a download. In this case we want to display the report summary HTML file. All reports work from the server's point of view, and each build has its own workspace on the server. To this end, the report we want to display will have been copied to "$Workspace$\Output\CodeCoverage\CodeCoverage_summary.html".

The Code Coverage Report

The end report appears in the Reports section of the Continua CI build summary.

As shown in the report, the example project has some code that is not covered during unit testing. This reduces the overall coverage to 73%. If I had more than one unit, each would have its own code coverage summary. In addition, I could click on each file and get a line-by-line report to see which sections of the unit are not covered.

Final Notes

It is worth mentioning that code coverage is only one arrow in a software testing quiver. In my example I purposely chose to include code that was not covered. This showed the power of code coverage in picking up where unit testing should potentially be directed next. I also included code where the unit tests cover the code, but not fully. The code testing Core.Card.Flip only tests one path through the code, not all possible paths. Currently the test checks that the code works when going from face up to face down, but not from face down to face up. Although in this example it might be benign, it shows that other tools are needed to help cover this gap.