Duplicate files

(Jim O'Connor) #1

When copying files in Automise, we are attempting to identify duplicate files and, when a duplicate is found, have the job move the duplicate to a unique folder.

What method should we use to identify duplicates during a transfer job?



(Jason Smith) #2

It really depends on how many files / folders you have, and how quickly you need the task to run.

The simple approach is to use the fileset iterator and loop through each file, testing whether it is a duplicate before using a copy command. If it is a duplicate, you can then change the destination location for the copy (through the use of a variable). Be sure to change the variable back to the default location for the next loop iteration.

The action to use for testing whether the file is a duplicate is the file compare action. Note that this is an external comparison based on date/time, size, checksum, or any combination of the three; it does not perform a byte-by-byte inspection of file contents.
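The loop described above can be sketched outside of Automise as well. This is a minimal Python equivalent (not Automise's own implementation): it mimics the fileset iterator and a file compare based on size plus checksum, redirecting duplicates to a separate folder and resetting the destination on each pass. All function and folder names here are illustrative assumptions.

```python
import hashlib
import shutil
from pathlib import Path

def checksum(path: Path) -> str:
    """MD5 of the file contents (one of the comparisons the file compare action can use)."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_duplicate(src: Path, dst: Path) -> bool:
    """Mimic an external file compare: size first (cheap), then checksum."""
    if not dst.exists():
        return False
    if src.stat().st_size != dst.stat().st_size:
        return False
    return checksum(src) == checksum(dst)

def copy_with_duplicate_routing(source_dir: Path, dest_dir: Path, dup_dir: Path) -> None:
    dest_dir.mkdir(parents=True, exist_ok=True)
    dup_dir.mkdir(parents=True, exist_ok=True)
    for src in source_dir.iterdir():           # the "fileset iterator"
        if not src.is_file():
            continue
        target = dest_dir / src.name           # default destination (the variable)
        if is_duplicate(src, target):
            target = dup_dir / src.name        # duplicate found: redirect the copy
        shutil.copy2(src, target)
        # `target` is recomputed each iteration, i.e. the variable is
        # reset to the default location for the next loop.
```

The design point is the same as in the Automise task: the comparison decides only *where* the copy goes, so the copy action itself never changes.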

If you have a large number of files / folders and require the task to execute quickly, I suggest looking at the robocopy or xcopy command-line tools, as these are optimised for high-volume processing.
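As a rough illustration of the robocopy route (the paths here are placeholders): robocopy's default behaviour already skips files it classifies as "Same" (identical size and timestamp at the destination), which covers the common duplicate case without any scripting.

```shell
:: Copy everything under C:\Data to D:\Backup, including empty
:: subdirectories (/E); unchanged files are skipped by default.
robocopy C:\Data D:\Backup /E

:: /XO additionally excludes files older than the destination copy;
:: /L previews what would be copied without writing anything.
robocopy C:\Data D:\Backup /E /XO /L
```

Note that robocopy skips duplicates rather than routing them to a separate folder, so if you specifically need the "move duplicates to a unique folder" behaviour, the iterator approach above is still the one to use.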