We deploy our app using TFS Team Build. To deploy to multiple environments (dev, test, acceptance and production) we use solution configurations and config transformations.

Because we want to leverage .NET 4.5 for multiple reasons, we needed to use the Azure 1.8 SDK. I noticed the ServiceDefinition.csdef transformation no longer worked the way it did before, when deploying with the 1.8 Azure SDK.

The .ccproj file contained the following XML:

  <ItemGroup>
    <ServiceDefinition Include="ServiceDefinition.csdef" />
    <None Include="ServiceDefinition.Local.csdef" />
    <None Include="ServiceDefinition.Development.csdef" />
    <None Include="ServiceDefinition.Test.csdef" />
    <None Include="ServiceDefinition.Acceptance.csdef" />
    <None Include="ServiceDefinition.Release.csdef" />
    <ServiceConfiguration Include="ServiceConfiguration.cscfg" />
    <None Include="ServiceConfiguration.Local.cscfg" />
    <None Include="ServiceConfiguration.Development.cscfg" />
    <None Include="ServiceConfiguration.Test.cscfg" />
    <None Include="ServiceConfiguration.Acceptance.cscfg" />
    <None Include="ServiceConfiguration.Production.cscfg" />
    <None Include="ServiceConfiguration.Release.cscfg" />
  </ItemGroup>
  <UsingTask TaskName="TransformXml" AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
  <Target Name="TransformServiceConfiguration" BeforeTargets="ValidateServiceFiles" Condition="exists('$(ServiceConfigurationTransform)')">
    <TransformXml Source="@(ServiceConfiguration)" Destination="%(Filename)%(Extension).tmp" Transform="$(ServiceConfigurationTransform)" />
  </Target>
  <Target Name="TransformServiceDefinition" BeforeTargets="ValidateServiceFiles" Condition="exists('$(ServiceDefinitionTransform)')">
    <TransformXml Source="@(ServiceDefinition)" Destination="%(Filename)%(Extension).tmp" Transform="$(ServiceDefinitionTransform)" />
  </Target>
  <Target Name="CopyTransformedEnvironmentConfigurationXmlBuildServer" AfterTargets="AfterPackageComputeService" Condition="'$(AzureDeployEnvironment)'!='' and '$(IsDesktopBuild)'!='true'">
    <Copy SourceFiles="ServiceConfiguration.cscfg.tmp" DestinationFiles="$(OutDir)ServiceConfiguration.cscfg" />
    <Copy SourceFiles="ServiceDefinition.csdef.tmp" DestinationFiles="ServiceDefinition.build.csdef" />
  </Target>

Listing 1

As you can see in the last Copy element of listing 1, ServiceDefinition.csdef.tmp (which holds the transformed XML) is copied to ServiceDefinition.build.csdef. CSPack knew about the convention that the .build.csdef file should be used to create the package. The package can then be deployed to Azure.

When we started to use the 1.8 SDK this did not work anymore. I found a Stack Overflow post describing a change in this process, but not my issue.
After a lot of reading log files and studying .targets files, I came to the conclusion that the .build.csdef convention no longer works. Instead, I tried changing the DestinationFiles attribute value to "$(OutDir)\ServiceDefinition.csdef". That works!

Mind the \ after $(OutDir): if you do not use it, $(OutDir) is concatenated straight onto the filename and you end up with a longer file name instead of a file in the output directory.
So the last Copy element of listing 1 becomes:

<Copy SourceFiles="ServiceDefinition.csdef.tmp" DestinationFiles="$(OutDir)\ServiceDefinition.csdef" />

Listing 2
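Putting the fix in context, the whole copy target from listing 1 under the 1.8 SDK might look like this (a sketch based on our project; target, condition and property names as in listing 1):

```xml
<Target Name="CopyTransformedEnvironmentConfigurationXmlBuildServer"
        AfterTargets="AfterPackageComputeService"
        Condition="'$(AzureDeployEnvironment)'!='' and '$(IsDesktopBuild)'!='true'">
  <Copy SourceFiles="ServiceConfiguration.cscfg.tmp" DestinationFiles="$(OutDir)ServiceConfiguration.cscfg" />
  <!-- 1.8 SDK: copy the transformed definition straight into the output directory,
       instead of relying on the old ServiceDefinition.build.csdef convention -->
  <Copy SourceFiles="ServiceDefinition.csdef.tmp" DestinationFiles="$(OutDir)\ServiceDefinition.csdef" />
</Target>
```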

Henry Cordes
My thoughts exactly…

We needed to deploy a Windows Service in our nightly build. We deploy to a Development, a Test and an Acceptance environment. That for the moment all these 'environments' are on the same physical server does not really matter; we are going to change this, but for now this is how it works.

This situation creates the requirement to deploy different instances of the same Windows Service on the same server (a Development, a Test and an Acceptance instance).
To install a Windows Service through our build, we created a Windows Service that is capable of installing itself when it is called with some arguments via the command line.
Maybe I will blog about how we did this in the future, but for now I think the Team Build workflow is more interesting.
The main sequence of the build workflow contains some extra elements; we added these because of the special build functionality we need.

The main sequence used in this workflow looks like image 1:

Img 1: Complete Build Sequence

The deploy of the services takes place in the fifth element of the workflow, another nested sequence I call 'Deploy WindowsServices', as shown on image 2:

Img 2: Deploy WindowsServices (Sub)Sequence

Inside the sequence an If Then activity is placed that checks whether compilation succeeded and whether the tests have run successfully (image 3):

Img 3: If compilation and Tests are successful activity

The Condition has the syntax shown in listing 1:

BuildDetail.CompilationStatus = BuildPhaseStatus.Succeeded And (BuildDetail.TestStatus = BuildPhaseStatus.Succeeded Or BuildDetail.TestStatus = BuildPhaseStatus.Unknown)

Listing 1

If you need variables that you want to influence from the build definition, the best option is to use Arguments; these are variables that are accessible in the build definition. Arguments can be added or changed by clicking the 'Arguments' tab in the bottom-left of the workflow editor (image 4):

Img 4: Arguments Build (can be set in builddefinition)

In the Arguments I added the argument WindowsServicesToDeploy; its data type (or argument type) is a string array. The WindowsServicesToDeploy argument holds the project names of the Windows Services that have to be deployed and that are part of the solution being built.

Argument WindowsServicesToDeploy:

The Argument of type string[] that is called WindowsServicesToDeploy has the following value:

New String() {"WebSocketServerService"}

Listing 2

The string array has only one value for now, but when I add more Windows Services to the solution, all I have to do is add the project name to the array value in the build definition and we're done.

In the If Then activity shown on image 3, in the Then section I added a ForEach activity. Image 5 shows the properties of this Activity:

Img 5: For each Service in Argument (List<string>)

Listings 3 and 4 show the syntax and value for the TypeArgument and the Values properties of the System.Activities.ForEach<System.String> activity:

System.String

Listing 3

WindowsServicesToDeploy

Listing 4

As I mentioned, our Development, Test and Acceptance environments are all on one box (for now). To differentiate between these environments, we created Solution Configurations for all of them in our solution. Using preprocessor directives we then change the configuration of connection strings, URLs etc. in our code and our automated tests. So we have to do a separate deployment for every Solution Configuration that is defined in the build definition.

The ForEach activity that loops through the string array defined in the argument WindowsServicesToDeploy (which holds the project names of the Windows Services that have to be deployed) contains yet another ForEach activity, shown on image 6. In this ForEach I loop through all Solution Configurations.

Img 6: For each Solution Configuration in BuildDefinition
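Conceptually, the nested loops correspond to workflow XAML along these lines (an illustrative sketch, not the exact XAML the workflow designer generates; the display names are ours):

```xml
<ForEach x:TypeArguments="x:String" DisplayName="For each Service" Values="[WindowsServicesToDeploy]">
  <ActivityAction x:TypeArguments="x:String">
    <ActivityAction.Argument>
      <DelegateInArgument x:TypeArguments="x:String" Name="windowsService" />
    </ActivityAction.Argument>
    <!-- inner loop: one deployment per Solution Configuration in the build definition -->
    <ForEach x:TypeArguments="mtbwa:PlatformConfiguration"
             DisplayName="For each Solution Configuration"
             Values="[BuildSettings.PlatformConfigurations]">
      <ActivityAction x:TypeArguments="mtbwa:PlatformConfiguration">
        <ActivityAction.Argument>
          <DelegateInArgument x:TypeArguments="mtbwa:PlatformConfiguration" Name="solutionConfiguration" />
        </ActivityAction.Argument>
        <!-- TryCatch with the 'Deploy WindowsService' sequence goes here -->
      </ActivityAction>
    </ForEach>
  </ActivityAction>
</ForEach>
```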

When we double-click to view the contents of this ForEach activity, we see the contents of image 7:

Img 7: Try Catch in For Each

The ForEach that loops through all Solution Configurations contains a TryCatch activity; as shown on image 8, an InvalidOperationException and a general Exception are caught in this TryCatch activity:

Img 8: Deploy WindowsService Sequence inside try Catch

In the Catches, build error messages are written. In the Try block another sequence is added, called 'Deploy WindowsService'.

Variables can be used to hold state; they can be added or changed by clicking the 'Variables' tab in the bottom-left of the workflow editor (image 9):


Img 9: Variables

The variables psExecOutput (Int32) and matchingFileResult (IEnumerable<String>) are added.

The ‘Deploy WindowsService’ activity
As we can see on image 8, the sequence Deploy WindowsService is executed inside the Try block.
In the sequence I start off with a FindMatchingFile activity; this is a standard build workflow activity that is available in Team Build 2010.

Img 10: Deploy WindowsService Sequence

The FindMatchingFile activity has the properties that are shown on image 11, of which the MatchPattern is the most important:

Img 11: FindMatchingFile Activity

The MatchPattern describes the pattern by which the files you need are found.



String.Format("\\<servername>\Services\{0}_{1}\{0}.exe", windowsService, solutionConfiguration.Configuration)

Listing 5

The next activity is another If Then, which checks whether the FindMatchingFile activity found exactly one file, as shown on image 12:

Img 12: If Then containing an Invoke Process activity

If one file is found, an Invoke Process activity is executed; if not, a build message is written. Image 13 shows all properties of the InvokeProcess activity, used to invoke PsExec to run the uninstallation of the Windows Service: if a file is found at the location the MatchPattern points to, an older version of the Windows Service is installed, so we need to uninstall it first.

Img 13: Invoke Process Activity's properties

We renamed psexec.exe to prevent a security breach.
Listings 6, 7 and 8 show the values of the properties for the uninstall action for our Windows Service:


String.Format("\\<servername> -d C:\Services\{1}_{0}\{1}.exe -uninstall -name ""{0}"" /accepteula", solutionConfiguration.Configuration, windowsService)

Listing 6

Listing 6 shows the value for the Arguments property: the arguments that you pass to the process you invoke. In our case we invoke psexec.exe, which calls our Windows Service on another machine, so these are the arguments we pass to psexec. The arguments used are:

  • -d, which instructs psexec not to wait for the application to terminate, so the build can continue in case something takes a very long time;
  • /accepteula, which takes care of the dialog that pops up the first time a user calls psexec and would otherwise fail the build, because the dialog will never be closed;
  • C:\Services\{1}_{0}\{1}.exe -uninstall -name ""{0}"" resolves to something like 'C:\Services\servicename_Development\servicename.exe -uninstall -name "servicename"'; this is the exe that psexec will call, and we pass the parameters for the service right in there (-uninstall -name "servicename").


Invoke PSExec Process Uninstall

Listing 7




Listing 8

Listing 8 shows the local path to psexec.exe on the build server.


psExecOutput (Int32)

Listing 9




Listing 10

Listing 10 shows the working directory for PsExec on the build server.

After the uninstallation, or if the service is not installed on the server, the CopyDirectory activity is executed:

Img 14: CopyDirectory Activity

The CopyDirectory Activity uses the following properties:


String.Format("\\<servername>\Services\{0}_{1}", windowsService, solutionConfiguration.Configuration)

Listing 11


String.Format("{0}\{1}\bin\{2}", BuildDetail.DropLocation, windowsService, solutionConfiguration.Configuration)

Listing 12

When the copy action is done, another Invoke Process activity is executed that calls psexec to install the Windows Service.

Img 15: Invoke Process (Install Service)

The details of this activity look like image 16:

Img 16: Invoke Process details

The properties are shown on image 17:

Img 17: Properties Invoke Install Process

All properties have almost the same values as the uninstall Invoke Process in image 13; only the Arguments are slightly different.


String.Format("\\<servername> -d C:\Services\TX.Communication.Prototype\{1}_{0}\{1}.exe -install -name ""{0}"" /accepteula", solutionConfiguration.Configuration, windowsService)

Listing 13

These steps take care of installing our Windows Services in our build, so we can deploy in an automated fashion. After the deployment, our nightly build also runs automated UI tests. Another topic that is quite interesting…


At my new project I am responsible for configuration management. We are using Visual Studio 2010 Premium and Team Foundation Server 2010 (TFS). The app we are going to build is an ASP.NET MVC web application as the presentation layer, communicating with the data store via a services layer. The data store will be a SQL Server database. For now the services layer is not implemented, because the build is the first thing I set up.

Continuous Integration
Because the app is going to be a product that is very important for the organization (core business), I am going to use Continuous Integration (CI): every check-in results in a build of the product and a run of all our unit tests. CI is very easy to achieve with TFS.

Nightly Build
In addition to building the product and running the unit tests, I want a nightly build that also deploys the application to a test web server, deploys the database to a separate test SQL Server and, when all these tasks have succeeded, runs automated web tests against the web application. That way the state of the application is always known and the testers have a report with the test results first thing in the morning. I am aware of Lab Management, which makes these configurations easier to manage, but as I mentioned we are using VS Premium for now, so I cannot leverage Lab Management at this moment in time.

To deploy a web application using Team Build I want to leverage MSDeploy. I had to install MSDeploy on the test web server the application will be deployed to. The installation and configuration of MSDeploy was not straightforward, to say the least.


Auto deploy Web application using Team Build
After MSDeploy is working on the webserver, the deployment of web applications using Team Build 2010 is a question of setting the right MSBuild arguments in the new Build settings window of the Build definition feature.

Process tab build settings build definition Team Build 2010

The following arguments were needed to make my configuration work, deploying via MSDeploy to another machine running IIS:


/p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish /p:MSDeployPublishMethod=RemoteAgent /p:MsDeployServiceUrl="machinename webserver/msdeployagentservice" /p:DeployIisAppPath="BuildTest" /p:username="domain\Username" /p:password=P@ssword

Listing 1: MSBuild arguments

The most interesting arguments are:
  • MSDeployPublishMethod: InProc or RemoteAgent;
  • MSDeployServiceUrl: because I use RemoteAgent, I could not use https://machinename:8172/msdeploy.axd (MSBuild puts http in front of the URL…); it took me some time to figure out that the service also listens on http://machinename/MSDEPLOYAGENTSERVICE.

Web.config transformation
Visual Studio 2010 comes with the web.config transformation feature: a web.config gets a shadow file per build configuration, so a web.debug.config and a web.release.config.

<connectionStrings>
  <add name="BuildtestConnectionString"
       connectionString="Data Source=SQLMACHINE;Initial Catalog=DATABASENAME;Integrated Security=false;uid=user;password=P@ssw0rd"
       providerName="System.Data.SqlClient"
       xdt:Transform="SetAttributes"
       xdt:Locator="Match(name)" />
</connectionStrings>

Listing 2: web.config transform web.release.config

When the project is built as a release build, the values of the attributes in the connection string are changed to reflect the values in listing 2.
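For reference, the matching entry in the base web.config could look like this (hypothetical development values; the transform in listing 2 only replaces the attributes of the element whose name matches):

```xml
<connectionStrings>
  <!-- development values; replaced by the web.release.config transform on a release build -->
  <add name="BuildtestConnectionString"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=DevDatabase;Integrated Security=true"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```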

This was all I needed to do to make Team Build deploy my web app to another server. I will report on how I configured Team Build to deploy the database (a VSTS database project) in a follow-up post.


On March 7th I attended some sessions at the Dutch Developer Days 2006. One session I attended was Dynamics of Microsoft Solution Framework and Visual Studio Team System by Anko Duizer from A-Class.
He told how he has always worked with and liked MSF. He had the opportunity to see how Microsoft uses MSF in real life and wanted to share this knowledge. The reasons why he wanted to talk about MSF and Team System were:

  • Software development is more than just good programming
  • Microsoft Solutions Framework is the result of many years of experience
  • Visual Studio Team System is a great enabler for the use of MSF 

MSF supports a few methodologies: MSF for Agile Software Development and MSF for CMMI Process Improvement.
Team System, of course, is all about team development. Anko said he learnt a lot from reading a book written by Jim McCarthy called Dynamics of Software Development.
He claimed it opened his eyes and he thinks it still is a book that everybody in software development should read.
A few rules from this book were highlighted, because they are really important; the book contains many more. The rules I saw made sense and I think I will pick up this book one of these days. I must say Anko had a software problem, so a lot of the demonstrations he had planned could not be shown, which in my opinion was a shame. So I will write down some rules that I think make sense (although some are really obvious):

  • Establish a shared vision (make everybody aware of what they are doing and why)
  • Create a multi-release technology plan (make plans for the future; if you cannot get features into this release, you can get them into future ones)
  • Don't flip the bozo bit (every department has got one, an employee of whom nobody really knows what he is doing; we don't want 'bozos' on our team, nor to become one)
  • Use feature teams (use teams for small parts of an application)
  • Use program managers (but it is important to make the distinction that the program manager is a servant, not a master! He supports the team, he does not dictate)
  • Design time at design time
  • Remember the triangle: Features, Resources, Time (you cannot get features if you have not got the time and/or resources)
  • Don't know what you don't know (if you cannot know something, acknowledge this)
  • Don't go dark (don't let people get away with doing things without anybody specifically knowing what they do)
  • If you build it, it will ship (make sure the code builds!)
  • Get to a known state and stay there (it is better to release something than to keep building new features that are late)
  • Don't trade a bad date for an equally bad date (if you miss a deadline, don't just say you will release a week later and also include the new feature that is wanted!)
  • Triage ruthlessly (like in the war movies, only in software development)


TFSC, or Team Foundation Source Control, is built from the ground up; Microsoft did not update Visual SourceSafe and call it TFSC. It has a multi-tiered architecture.
On the other hand, if you're familiar with Visual SourceSafe, the look and feel is basically the same; there are just more features.
TFSC has some new features compared to Visual SourceSafe; they are:
  • Changesets
  • Branching
  • Merging
  • Shelving
TFSC introduces a concept called the changeset. With Visual SourceSafe and other source control products, the files under source control had no linkage; they were all individual files.
Changesets describe a group of associated file modifications, each changeset is given a unique identifier for tracking and reporting.

Branching in TFSC is intelligently copying items from one project to another. The origin, context and history are maintained, and future changes can be merged back into the original path. This allows multiple builds and releases to be maintained efficiently. Another benefit of branching is storage space efficiency: the source control server minimizes the required storage by maintaining one copy of the content.

Merging (a process that has existed in CVS for a long time already) reconciles all the changes from branched code ("the source") with the original code ("the target"). This is more than blending text: it merges additions, deletions, undeletions and renames from the source to the target.

Multiple checkout
Team System projects can be configured for multiple checkout. This feature allows more than one user to edit the same file simultaneously. The same engine that merges changes from branched projects merges changes from two or more checkouts back to the source (people working with CVS know this feature is nothing to be afraid of).

Shelving is another new key concept in TFSC. Shelving allows a developer to store pending changes on the server without checking them in (in the form of a shelveset).
A shelveset is similar to a changeset, except the files are stored on personal space on the server.
Reasons for shelving:
  • Switching to another project with higher priority;
  • Code fails a check-in policy and can't be fixed immediately;
  • Code is not complete enough to be shared;
  • The developer needs to leave and wants to keep his code safe.

Visual SourceSafe
SourceSafe continues to be available, and it will get some new features too:
  • HTTP access through a Web Service interface
  • Copy, modify, merge model
  • A LAN performance booster
  • Asynchronous file opening: start working before loading is complete
  • Better support for projects in multiple time zones and multiple languages, and Unicode
