Decision:
This post will be rather lengthy, so I will get right to the point. I am trying to decide which of three alternatives to use for our build process. I hope those of you with experience will be able to provide some insight; I am hoping to hear that some of you are using one of these methods (or a combination of them).
1) Use FinalBuilder (FB) project on Configuration Management (CM) server to execute scripts remotely on target servers (using PSEXEC) which will perform pre-copy & post-copy tasks (at object level AND server level), as well as copy the objects. (target pulls from master using scripts)
2) Use FB project on CM server to execute FB projects on target servers which will execute pre-copy & post-copy tasks (at object level AND server level), as well as copy the objects. (target pulls from master using FB)
3) Use FB project on CM server to execute pre-copy & post-copy tasks (at object level AND server level), as well as copy the objects on remote servers. (master pushes to target using FB)
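To make the pull-style options concrete, here is a minimal sketch (in Python, just for illustration) of what option 1's fan-out from the CM server might look like: the CM box builds one psexec-style command line per target, and each target then runs its own BM.bat locally. The server names and paths are invented; only `psexec`, its `-accepteula` switch, and `cmd /c` are real.

```python
# Sketch of option 1: the CM server kicks off the build manifest on each
# target via a psexec-style remote launcher, so the work (and the pull of
# the weeklybuild folder) happens on the target, not on the CM box.
# Server names and paths are hypothetical.

def remote_build_command(server, bm_path="C:\\weeklybuild\\BM.bat"):
    """Build the command line that runs the build manifest on one target."""
    return f"psexec \\\\{server} -accepteula cmd /c {bm_path}"

servers = ["PROD01", "PROD02", "PROD03"]
for cmd in (remote_build_command(s) for s in servers):
    print(cmd)
```

Option 2 would be the same fan-out shape, except the remote command launches an FB project on the target instead of a batch file; option 3 keeps all of the work on the CM server itself.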
Background:
We are an ASP (application service provider) Windows shop, with a mix of VB6, VB.NET, ASP.NET, classic ASP, SQL, and SQL 2005. All of these file types get deployed every week. We have a weekly PROD build cycle and a twice-daily Alpha (QA) build cycle.
When we started out, the developers wrote some rather cool VB apps that let them drag and drop their files/SQL scripts and then generate a Change Request (CR) batch file. Another app allowed the build deployer to drag and drop CR files into it and generated the "build manifest" (BM) batch file. The BM.bat is executed on each target prod server and simply executes all of the CRs with a few parameters.
Last year I was a lowly developer, unsatisfied with the build process (I had some previous experience in CM), so I looked around and found FinalBuilder. I began a rather ambitious plan to make our entire build process data-driven, so that we can keep track of all of the objects in our company (it's a HUGE mess right now). About 3 months ago I actually took the Configuration Manager position, so I am now in a position to transition our current build process to what we want it to be.
Before I took the position, the weekly build involved the CM performing the following steps:
a) Copy the weeklybuild folder (which houses all of the CR.bats and BM.bats, as well as the SQL scripts, in a central location) from our alpha platform to ALL of the production servers
b) Log on to each production server and:
a. Perform any necessary pre-copy tasks on the server that were not already scripted in the CRs.
b. Execute the BM.bat file and make sure it had no errors.
c. Perform any necessary post-copy tasks on the server that were not already scripted in the CRs.
When this process was first developed there was no .NET (no gacutil, no regasm, no services to stop). Essentially the pre/post-copy tasks were shutting down/restarting IIS and registering DLLs. We also rely heavily on MSMQ, so queue creation was a big deal, but it was never scripted: the build deployer just created the queues manually, along with applying registry entries. There were also only 3 production servers and only one (overall) web application. My point is that the build process worked then; it only took about 10-30 minutes.
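Those manual pre/post-copy steps are exactly the kind of thing that can be captured as data instead of memory. As a sketch (Python for illustration; the service, DLL, and switch targets are hypothetical, but `iisreset`, `net stop/start`, `regsvr32`, and `gacutil` are the real command lines involved), the per-server steps could be generated like this:

```python
# Sketch: expressing the manual pre/post-copy steps as generated command
# lists, so nothing has to be remembered by the build deployer.
# DLL and service names are hypothetical.

def pre_copy_steps(services, com_dlls):
    """Commands to run on a target before copying the new objects."""
    steps = ["iisreset /stop"]
    steps += [f"net stop {svc}" for svc in services]
    steps += [f"regsvr32 /u /s {dll}" for dll in com_dlls]
    return steps

def post_copy_steps(services, com_dlls, gac_dlls):
    """Commands to run after the copy: re-register, re-GAC, restart."""
    steps = [f"regsvr32 /s {dll}" for dll in com_dlls]
    steps += [f'gacutil /i "{dll}"' for dll in gac_dlls]
    steps += [f"net start {svc}" for svc in services]
    steps.append("iisreset /start")
    return steps

print(pre_copy_steps(["MyAppService"], ["legacy.dll"]))
```

The same idea extends to MSMQ queue creation and registry entries once those are driven from stored publishing info rather than done by hand.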
Fast-forward 9 years. Much of what we do now is VB.NET or ASP.NET, and there are about 30 production servers, several applications, and a lot of services. The build tools I mentioned before had not been updated significantly in 2-3 years. When I started, it was taking 2 people about 1-2.5 hours to get the build done! I did it once by myself and it took me 3.5 hours!
Thank GOODNESS for FinalBuilder. I made some modifications to the existing scripting tools we wrote to handle some .NET things (gacutil, service stopping and starting) as well as MSMQ creation. I then automated the whole process with FB, adding some pre-build backup of specific files and post-build verification using BeyondCompare2. I can now do the whole weekly build in anywhere from 15-30 minutes (if there are no problems).
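BeyondCompare2 does the real post-build verification here; for anyone scripting something similar without it, a minimal stand-in is a hash walk that flags files that differ (or are missing) between the source tree and a deployed copy. This is just a sketch of the idea, not what FB/BC2 actually do internally:

```python
# Sketch of post-copy verification: compare a source tree against a
# deployed tree by content hash and report mismatched/missing files.
import hashlib
from pathlib import Path

def digest(path):
    """SHA-256 of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def mismatches(source_dir, target_dir):
    """Relative paths whose content differs, or which are missing, on target."""
    src = Path(source_dir)
    bad = []
    for f in src.rglob("*"):
        if f.is_file():
            rel = f.relative_to(src)
            t = Path(target_dir) / rel
            if not t.is_file() or digest(f) != digest(t):
                bad.append(str(rel))
    return sorted(bad)
```

An empty result means the copy landed intact; anything returned is a file the deployer should look at before declaring the build good.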
Decision revisited:
I am now going to start transitioning to my data-driven model and need to choose one of the three alternatives I mentioned earlier. The way I implemented FinalBuilder, we are currently using option 1. Under my new model, however, instead of having each developer create CR scripts, I am going to have them select the objects from the database (using a GUI). When each build is ready, all of the objects (along with all of their publishing info) will come out of the database.
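The data-driven model above could be sketched roughly like this (Python with an in-memory SQLite database purely for illustration; the table and column names are invented, not my actual schema): each object a developer selects becomes a row carrying its publishing info, and the build pulls the rows for a given build ID instead of reading CR files.

```python
# Sketch of the data-driven model: build objects and their publishing info
# live in a database, and the build manifest is a query per build ID.
# Schema and column names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE build_object (
    build_id INTEGER, object_path TEXT, target_server TEXT,
    pre_copy TEXT, post_copy TEXT)""")
con.executemany(
    "INSERT INTO build_object VALUES (?,?,?,?,?)",
    [(42, "bin\\orders.dll", "PROD01", None, "gacutil /i orders.dll"),
     (42, "web\\default.aspx", "PROD02", None, None)])

def manifest(con, build_id):
    """All objects (with publishing info) belonging to one build."""
    return con.execute(
        "SELECT object_path, target_server, pre_copy, post_copy "
        "FROM build_object WHERE build_id=? ORDER BY target_server",
        (build_id,)).fetchall()

for row in manifest(con, 42):
    print(row)
```

FB (or generated CR files, if I go that route) would then consume this result set directly.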
My preferred option is number 2, mostly for time and processing power. Also, I think FB does some tasks better than I could script them (I am having a hard time figuring out how to create virtual directories for IIS 5). Since there will be no CR files, the object info and the publishing instructions will be coming out of the database straight into FB. I thought about having FB recreate the CR files, and that could be a way to go. I don't like option 3 because I think it would be too slow. The build goes quickly now because I can kick off the scripts on the target servers almost immediately after each other (I have them spaced out so that we are never down during the build). If I were to choose option 3, I would lose that and would have to do each server one at a time. I thought about doing some kind of asynchronous task, but I'm not sure how it would work. And there is still the processing power issue.
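For what it's worth, the asynchronous fan-out I'm imagining for option 3 would look something like this (again Python purely as a sketch; `deploy_to` is a stub standing in for whatever actually pushes to one server, and the worker cap is there because the processing-power concern is real):

```python
# Sketch of an asynchronous push (option 3): deploy to several targets at
# once, capped so the CM server isn't swamped. deploy_to is a stub for the
# real per-server push; server names are hypothetical.
from concurrent.futures import ThreadPoolExecutor, as_completed

def deploy_to(server):
    """Stub: in real life, push objects to one target and raise on failure."""
    return f"{server}: ok"

def deploy_all(servers, max_parallel=5):
    """Run deploy_to against every server concurrently; collect results."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_parallel) as pool:
        futures = {pool.submit(deploy_to, s): s for s in servers}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results

print(deploy_all(["PROD01", "PROD02", "PROD03"]))
```

Staggering could be preserved by grouping servers into waves and calling `deploy_all` per wave, so we are still never fully down during the build; whether FB can express this shape as cleanly is part of my question.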
I guess all I am looking for is some feedback from anyone with enough patience to actually make it this far down in my post who has faced this dilemma themselves. PS for the FB team: I had another username, "eoin", but I forgot my password, and your "e-mail me my password" app errored when I tried it.