Monday, 17 November 2014

Shelveset Comparer Updated

Shelveset Comparer is a Visual Studio extension that I first published at the start of this year. The extension provides functionality that is otherwise missing in both Visual Studio and Team Foundation Server: the ability to compare the contents of two shelvesets. I felt the need for it because our team used shelvesets to pass work around, and tracking what had changed since a shelveset was taken was not always obvious.

The extension has proved popular and I have been trying to keep up with the comments and feedback on it. This update has been due for some time. The view of the extension in Team Explorer has changed a little to show options for typing in two user names, which allows shelvesets to be compared between two users. As in the first release of the extension, there is still a single list displaying all the shelvesets; to separate out the shelvesets of the two users, an “Owner” column has been added. The column headers are now clickable as well and will sort the rows by the clicked column.


Another feature added is an Options panel, which lets users choose whether or not the extension appears as a Team Explorer button. There is also an option to hide the second user field.

 


The options are there to allow users to customise the view to their needs.

 

Apart from the new functionality, several fixes and performance improvements have been made.

 

Going forward, there will be another release by the end of this year, in which I will be adding the ability to search by shelveset name. There will also be further optimisation of the performance of comparing the contents of two shelvesets.

Thursday, 15 May 2014

Feature Toggler – a Simple feature toggle library for .Net

So, you have decided to use feature toggling as your branching strategy. You don’t want the hassle of merging and branching, and you are confident that developers and testers can handle the additional complexity that comes with feature toggles. The next step is to decide how to go about using toggles. The simplest and most popular method is to define feature toggles in configuration files.

Ideally, you would want a library that takes care of feature toggling. All you would need to do is define the features and their toggle values in the configuration file and then check whether a feature is available with a simple call. Something which, for a configuration like the one below,

<featureConfiguration>
  <features>
    <add name="PrivateProfiles" toggle="on" />
    <add name="Photosharing" toggle="off" />
    <add name="Videos" toggle="1" />
    <add name="bookmarks" toggle="true" />
  </features>
</featureConfiguration>

would allow code like the following

if (FeatureManager.HasFeature("PrivateProfiles"))
{
    // Code here runs only when the PrivateProfiles feature is toggled on.
}

Looking around, I found three libraries of note already available:

  1. NFeature
  2. FeatureToggle, and
  3. FeatureSwitcher

This blog post gives a good comparison of them and their usability. Having used all three, I felt that, though thorough, they were overly complicated for the very simple scenario I wanted to support. For example, NFeature requires you to create enumerations for all the features added to the configuration file.

I decided to create a new, very simple feature toggling library: https://github.com/hamidshahid/FeatureToggler

The library is available as a NuGet package. Simply type “Install-Package FeatureToggler” in the Package Manager Console of your application. It will add the required references, add a configuration section to your configuration file and add a few sample features to it.

Once the reference is added, simply add features to the features collection and check them in your code using the FeatureManager.HasFeature(“”) method. Happy Coding!!
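
For illustration, below is a minimal sketch of how a configuration section like the one above could be surfaced through a static HasFeature check using System.Configuration. This is not the FeatureToggler source code; the class names and the rule treating "on", "1" and "true" as enabled are assumptions made for the example.

using System;
using System.Configuration;

// Minimal sketch only - not the actual FeatureToggler implementation.
// Assumes the featureConfiguration section has been registered in <configSections>.
public class FeatureElement : ConfigurationElement
{
    [ConfigurationProperty("name", IsKey = true, IsRequired = true)]
    public string Name { get { return (string)this["name"]; } }

    [ConfigurationProperty("toggle", IsRequired = true)]
    public string Toggle { get { return (string)this["toggle"]; } }

    // Treat "on", "1" and "true" as enabled, matching the sample configuration above.
    public bool IsEnabled
    {
        get
        {
            return Toggle.Equals("on", StringComparison.OrdinalIgnoreCase)
                || Toggle.Equals("1", StringComparison.OrdinalIgnoreCase)
                || Toggle.Equals("true", StringComparison.OrdinalIgnoreCase);
        }
    }
}

[ConfigurationCollection(typeof(FeatureElement), AddItemName = "add")]
public class FeatureCollection : ConfigurationElementCollection
{
    protected override ConfigurationElement CreateNewElement() { return new FeatureElement(); }
    protected override object GetElementKey(ConfigurationElement element) { return ((FeatureElement)element).Name; }
}

public class FeatureConfigurationSection : ConfigurationSection
{
    [ConfigurationProperty("features")]
    public FeatureCollection Features { get { return (FeatureCollection)this["features"]; } }
}

public static class FeatureManager
{
    // Returns true only if the named feature exists and its toggle is set to an "on" value.
    public static bool HasFeature(string name)
    {
        var section = ConfigurationManager.GetSection("featureConfiguration") as FeatureConfigurationSection;
        if (section == null)
        {
            return false;
        }

        foreach (FeatureElement feature in section.Features)
        {
            if (feature.Name.Equals(name, StringComparison.OrdinalIgnoreCase))
            {
                return feature.IsEnabled;
            }
        }

        return false;
    }
}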


Thursday, 8 May 2014

Feature Toggles and their limitations

 

This month's MSDN magazine contains an article on feature toggles. The subject has been close to my heart over the last few weeks, as I have been weighing up whether they would work for our projects or not.

For those who are unaware of the term, here is a good post by Martin Fowler describing feature toggles and their merits. He is convinced that feature toggles are the way forward and should be used instead of feature branches. Here is another great blog post that explains the differences and recommends using feature toggles.

I love the idea of having no feature branches … it makes life easier. However, my take is that feature toggles are not for every team or every project. For a company like Plural soft, which does continuous delivery (and they use feature toggles), the process is simple. Each release results in some new "features" being added. The process is generally additive, with the software becoming more "feature rich", and there is control over the release pipeline.

Now, let's turn our attention to a simple "message broker" kind of application that interfaces with multiple systems and has no UI. The application receives a message, say M1, from one application, does something to it and passes on message M2 to another application. Now, let's say there is a change in the message interface because the sending application is changing. We start with a feature to handle the new message interface. Since the change is a few months away, we need to keep supporting the existing interface. If feature toggling is used, we would have to create a parallel code path to handle the new message interface and direct execution to that code path with a feature toggle. If we were using branching instead, the change in code would have been much simpler. So, in essence, we have replaced the complexity of merging with a more complex code change.
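
To make this concrete, the toggled change would look something like the sketch below. The handler and feature names are hypothetical, and the HasFeature call stands in for whatever toggle-checking mechanism is in use; the point is that both paths have to live side by side until the old interface is retired.

// Illustrative sketch only - method and feature names are made up.
public void HandleIncomingMessage(Message m1)
{
    if (FeatureManager.HasFeature("NewSenderInterface"))
    {
        // Parallel code path for the revised message interface.
        ProcessNewInterface(m1);
    }
    else
    {
        // Existing code path, kept until the sending application switches over.
        ProcessExistingInterface(m1);
    }
}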

Take another example: this time we have to delete something from the application, let's say a web service. The feature toggle mechanism would require us to modify the service to return an error on invocation when the feature is on. Compare that with the alternative of removing the service altogether.

Similarly, let's consider a Windows or web UI application where one of the features is a redesign of the screens. The redesign involves rearranging all the form controls and including some new graphics. With the feature toggling approach, we would either have a condition on the display of each of these changes or create a new form altogether, choosing between the two based on the toggle value.

These are only some of the scenarios where, in my opinion, feature toggles wouldn't really simplify things. Others might disagree, and I would love to hear from them, so please post your comments if you have any.

 


Wednesday, 9 April 2014

TF10201 Source control could not start the manual merge tool

A quick post in the “How I got burnt today” category. I was attempting a merge from one TFS branch to another when I started getting the following error:

[Screenshot of the TF10201 error dialog]

The error is pretty unhelpful in that it doesn’t tell you what has gone wrong. However, if you look at the Output window, you will find the real reason for the error, which is that the target merge file doesn’t exist. The error happens when you have a TFS workspace but have deleted the files on your local machine. TFS at this point thinks that you have the latest source and attempts to merge the files. However, since the files are not there, it throws this error. Please note that this error only happens for files that have merge conflicts.

The fix is quick: just do a forced get latest of the files involved and the error will go away.
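
If you prefer the command line, a forced get can also be done with tf.exe; for example (the server path below is just a placeholder):

tf get "$/MyTeamProject/Main" /recursive /force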


Tuesday, 1 April 2014

PowerShell – Log off all remote sessions

I needed a script that would log off all remote sessions from a given machine. The task is simple: the qwinsta command lists all sessions and rwinsta logs off a session. I couldn’t find a script anywhere that used the two together, so I wrote the following. Enjoy!!

param (
    [String]$computer
)

# List all sessions on the target machine and drop the header row.
$sessions = qwinsta /server:$computer
$sessions = $sessions[1..($sessions.Count - 1)]

foreach ($result in $sessions) {
    # qwinsta output is fixed-width; pull out the USERNAME and ID columns.
    $userName = $result.Substring(19, 22).Trim()
    $id = $result.Substring(41, 7).Trim()

    # Only log off rows that actually have a user attached.
    if ($userName -ne "") {
        rwinsta /server:$computer $id
    }
}
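
Assuming the script is saved as, say, Logoff-RemoteSessions.ps1 (any name will do), it can be run like this:

.\Logoff-RemoteSessions.ps1 -computer "SERVER01"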


Tuesday, 25 March 2014

ALM with Microsoft Dynamics CRM – Deployment

This is the fourth and final post of a multi-part series suggesting an ALM process in projects where Microsoft Dynamics CRM is used as a data store.

In my previous blog post, I explained how to include Microsoft Dynamics CRM customisations in your Team Build, how to structure CRM customisations and scripts in TFS, and how to produce a deployable managed and/or unmanaged solution as an output. In this post, I will describe a deployment process that enables you to deploy the package produced by the Team Build to a target environment.

 

Deployment Overview

The importance of having a reliable, repeatable and well-documented deployment process cannot be overstated. Deployment should be planned from the very outset of the project, scaling up from a single-machine environment to test and staging environments and eventually to production. Having a repeatable process prevents surprises in the all-important go-live, and it also allows you to make regular continuous deliveries.

In this scenario, we are deploying a new CRM solution to a new target environment; that is to say, we are not upgrading an existing system or deploying to an existing CRM instance. The deployment involves the following steps:

  1. Create new CRM Organisation
  2. Set CRM organisation settings such as Currency, Time Zone, etc.
  3. Import Data Maps required before importing CRM Solution
  4. Import Data required before importing CRM Solution.
  5. Import CRM Solution
  6. Import Data Maps for initial data population.
  7. Import Data for initial data population.
  8. Publish SSRS reports.
  9. Import Team Associations.
  10. Publish workflows.

All the steps apart from steps (1) and (4) are optional and applicable only if your CRM customisations require them.

In my last post, I suggested structuring the CRM deliverables in the following format, and we will use the same structure when writing our deployment scripts.

[Sample CRM Folder Structure]

For my deployment scripts, I will use MSBuild together with a library called the MSBuild Extension Pack. The library provides a rich set of functionality, and its March release includes tasks for Microsoft Dynamics CRM as well.

Sample Deployment

Below is a sample listing for the deployment process described above. For simplicity, I have only included steps 1, 2, 5, 6 and 7 of the process.

<Target Name="DeployCrmOrganisation64">
  <!-- Create the CRM organisation -->
  <MSBuild.ExtensionPack.Crm.Organization TaskAction="Create" DeploymentUrl="http://CRMServer/XRMDeployment/2011/Deployment.svc" Name="organization1" DisplayName="Organization 1" SqlServerInstance="MySqlServer" SsrsUrl="http://reports1/ReportServer" Timeout="20" />

  <!-- Update the organisation's settings -->
  <ItemGroup>
    <Settings Include="pricingdecimalprecision">
      <Value>2</Value>
    </Settings>

    <Settings Include="localeid">
      <Value>2057</Value>
    </Settings>

    <Settings Include="isauditenabled">
      <Value>false</Value>
    </Settings>
  </ItemGroup>

  <MSBuild.ExtensionPack.Crm.Organization TaskAction="UpdateSetting" OrganizationUrl="http://CRMServer/organization1" Settings="@(Settings)" />

  <!-- Import the solution -->
  <MSBuild.ExtensionPack.Crm.Solution TaskAction="Import" OrganizationUrl="http://CRMServer/organization1" Name="CrmSolution" Path="C:\Solutions" Extension="zip" OverwriteCustomizations="true" EnableSDKProcessingSteps="True" />

  <!-- Import the data map -->
  <MSBuild.ExtensionPack.Crm.DataMap TaskAction="Import" OrganizationUrl="http://CRMServer/organization1" Name="Organization1" FilePath="C:\DataMapFile1" />

  <!-- Import data -->
  <MSBuild.ExtensionPack.Crm.Data TaskAction="Import" OrganizationUrl="http://CRMServer/organization1" DataMapName="Entity1DataMap" SourceEntityName="entity1" TargetEntityName="entity1" FilePath="C:\DataFile1.csv" />
</Target>

The first step in the script is creating a new CRM organisation. The task used is “MSBuild.ExtensionPack.Crm.Organization” with a task action of “Create”. It takes as parameters the CRM instance’s deployment URL, the name and display name of the organisation, and the names of the SQL Server and SSRS instances. The Timeout parameter is optional; I specify it to prevent the deployment script from waiting indefinitely.

Once the organisation is created, the next step is to set certain organisation settings. Again, the “MSBuild.ExtensionPack.Crm.Organization” task, this time with the task action “UpdateSetting”, does this. The task takes an ItemGroup of setting names and values as a parameter.

The next step is to import a managed solution into the newly created organisation. For this, the task used is “MSBuild.ExtensionPack.Crm.Solution” with a task action of “Import”. The task requires the path where the solution file is placed, and the name and extension of the solution file. Also required are parameters specifying whether to overwrite any existing customisations in the target organisation and whether to trigger CRM plug-ins and workflows as the solution is imported.

The final two steps simply import a data map and a data file into the newly created organisation. The parameters are self-explanatory. The MSBuild Extension Pack contains some other useful CRM tasks; for more details, read the project documentation at http://msbuildextensionpack.com/.
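
For completeness, assuming the target above lives in a deployment project file such as Deploy.proj (an illustrative name) that imports the MSBuild Extension Pack .tasks file so the Crm tasks are available, the whole sequence can be run from a Visual Studio command prompt along these lines:

msbuild Deploy.proj /t:DeployCrmOrganisation64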

This concludes our discussion of an ALM process for solutions involving Microsoft Dynamics CRM. I hope you have found this series useful; please do give your feedback.

Sunday, 16 March 2014

ALM with Microsoft Dynamics CRM – Setting up Team Build

This is the third post of a multi-part series suggesting an ALM process for projects where Microsoft Dynamics CRM is used as a data store.

My previous blog post was about setting up a Development Build so that developers can build the system (including all the latest Microsoft Dynamics CRM artefacts) end to end. In this post, I will write about setting up a Team Build.

The purpose of the Team Build is to compile and build all system artefacts to produce a deployable package. The package is then read by the deployment scripts to deploy the system to a target environment. For a very simple project, the deliverable may be an executable or an MSI. For a more complicated system, it may include published websites, assemblies, databases, etc. For Microsoft Dynamics CRM, the deliverables will be a managed/unmanaged solution along with artefacts such as data maps, data import files, de-duplication rules, SSRS reports, etc.

CRM Deployment Overview:

Before describing the Team Build, let’s first take a brief look at what the CRM deployment script would do.

  1. Create a new CRM Organisation.
  2. Set Organisation settings.
  3. Import Solution.
  4. Import Data Maps.
  5. Import Data.
  6. Import Bulk Deletion Operations.
  7. Publish SSRS Reports.
  8. Set Field Level Security.
  9. Publish unpublished workflows.

The above is one of several possibilities and might not meet your exact requirements. For example, your solution might have to be deployed to an existing organisation, in which case step 1 is not needed. I will describe the deployment process in more detail in the next post.

Structuring CRM Package

Having looked at how the deployment of CRM would take place, let’s take a look at the Dynamics CRM deliverables and how to structure them in the deployment package. Some of the deliverables (such as plug-in assemblies) need to be compiled, while others are taken straight from source control. In any case, it is essential that the deliverables come from source control and not from a CRM development instance. The following diagram describes how I would structure the deliverables in the CRM folder.

Sample CRM Folder Structure

All of these folders are contents of the CRM folder that is included in the cabinet file produced by the build. Let’s have a look at each of the folders:

  1. Assemblies: The folder contains Microsoft Dynamics CRM deployment assemblies such as Microsoft.xrm.sdk.deployment.dll.
  2. BulkDeleteOperations: The folder contains the bulk delete operation files exported from the development instance of CRM.
  3. Data: The folder contains initialisation data for the system. The folder contains a csv file for each entity that needs to have initialization data as well as a data map file.
  4. DedupeRules: Contains the de-duplication rules for entities.
  5. FieldLevelSecurity: Contains team associations for field-level security of custom and out-of-the-box entities.
  6. Reports: Contains details of the reports to be published.
  7. Settings: Contains organisation setting details.
  8. Solutions: Contains the managed or unmanaged solutions that contains all the customisations.
  9. Workflows: Imported solutions do not have their workflows enabled automatically. This folder contains information about the workflows that need to be enabled.

Structuring CRM Source

The Microsoft Dynamics CRM source code would be structured just as the CRM SDK creates it. It will be built as part of compiling the CRM solution in the Team Build. The following diagram shows a typical structure of the CRM source code.

[Typical CRM source code folder structure]
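
This unpacked layout is what the SolutionPackager utility produces when a solution exported from the development CRM instance is extracted into source control; for example (the file and folder paths here are illustrative):

SolutionPackager /action:Extract /zipfile:CrmSolution1_1_0_0_0_unmanaged.zip /folder:Source\CRM\Solution1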

Team Build:

The Team Build will take the contents of the above-mentioned folders, apart from the Solutions folder, straight from source control. The data CSV files would be maintained in source control, while other files such as data import files, bulk deletion operations, de-duplication rules, etc., would be exported from the CRM development instance and checked into source control.

The solutions, on the other hand, would be created by the Team Build using CRM’s SolutionPackager utility. However, before the solution is packaged, a mapping file should be created so that plug-in assemblies are mapped correctly; the FileMapping.ps1 file performs this action. The following target in your team build will package the CRM solution for you.

<Target Name="PackageCRMSolution">
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="...Packaging CRM Solution" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>

  <!-- Copying CRM deployment files-->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Copying CRM Deployment files" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>
  <ItemGroup>
    <CrmDeploymentFiles Include="$(SolutionRoot)\Build\Deployment\CRM\**\*.*" Exclude="$(SolutionRoot)\Build\Deployment\CRM\Solutions\SolutionFiles\*.zip"/>
  </ItemGroup>
  <Microsoft.Build.Tasks.Copy SourceFiles="@(CrmDeploymentFiles)" DestinationFiles="@(CrmDeploymentFiles-&gt;'$(BinariesRoot)\Release\Server\CRM\%(RecursiveDir)%(Filename)%(Extension)')" />

  <!-- File Mapping -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Run File Mapping" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>
  <Microsoft.Build.Tasks.Exec command="powershell $(BuildToolsPath)\FileMapping.ps1 -binarySearchLocation &quot;$(BinariesRoot)\Release\Server\CRM&quot; -unpackFolderLocation &quot;$(SolutionRoot)\Source\CRM\Solution1&quot; -outputLocation &quot;$(BinariesRoot)\Release\Server\CRM&quot;" />

  <!-- Solution Packager UnManaged-->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Package CRM solution - Unmanaged" Status="Succeeded" Condition="'$(BuildUri)' != ''" />
  <Microsoft.Build.Tasks.Exec command="$(BuildToolsPath)\SolutionPackager /a:Pack /z:&quot;$(BinariesRoot)\Release\Server\CRM\Solutions\CrmSolution1_1_0_0_0_unmanaged.zip&quot; /f:&quot;$(SolutionRoot)\Source\CRM\Solution1&quot; /p:Unmanaged /m:&quot;$(BinariesRoot)\Release\Server\CRM\mapping.xml&quot;"/>

  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Package CRM solution - Managed" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>
  <Microsoft.Build.Tasks.Exec command="$(BuildToolsPath)\SolutionPackager /a:Pack /z:&quot;$(BinariesRoot)\Release\Server\CRM\Solutions\CrmSolution_1_0_0_0_managed.zip&quot; /f:&quot;$(SolutionRoot)\Source\CRM\Solution1&quot; /p:Managed /m:&quot;$(BinariesRoot)\Release\Server\CRM\mapping.xml&quot;"/>

  <ItemGroup>
    <CRMFilesToCleanUp Include="$(BinariesRoot)\Release\Server\CRM\*.*" Exclude="$(BinariesRoot)\Release\Server\CRM\*.zip" />
  </ItemGroup>
  <Delete Files="@(CRMFilesToCleanUp)" Condition="@(CRMFilesToCleanUp) != ''" />

  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="...Completed Packaging CRM Solution" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>   
</Target>
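
How this target gets invoked depends on your build definition. As one illustrative option, an UpgradeTemplate-style TFSBuild.proj could call it from one of the standard extensibility targets, for example:

<Target Name="AfterCompile">
  <CallTarget Targets="PackageCRMSolution" />
</Target>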

Once the above target is included in your team build, the build will produce the CRM deployment folder as described above. I like to create a cabinet (.cab) file out of all the files in the drop folder, but that is certainly optional. In my next post, I will write about the deployment script that deploys the CRM deliverables produced by the team build.
