Tuesday, 25 March 2014

ALM with Microsoft Dynamics CRM – Deployment

This is the fourth and final post of a multi-part series suggesting an ALM process in projects where Microsoft Dynamics CRM is used as a data store.

In my previous blog post, I explained how to include Microsoft Dynamics CRM customisations in your Team Build, how to structure CRM customisations and scripts in TFS, and how to produce a deployable managed and/or unmanaged solution as an output. In this post, I will write about a deployment process that enables you to deploy the package produced by the Team Build to a target environment.

 

Deployment Overview

The importance of having a reliable, repeatable and well-documented deployment process cannot be overstated. Deployment should be planned from the very outset of the project, scaling it up from a single-machine environment to test and staging environments and eventually to production. Having a repeatable process prevents surprises in the all-important go-live, and it also allows you to make regular continuous deliveries.

In this scenario, we are considering deploying a new CRM solution to a new CRM target environment; that is to say, we are not upgrading an existing system or deploying to an existing CRM instance. The deployment involves the following steps:

  1. Create a new CRM Organisation.
  2. Set CRM organisation settings such as Currency, Time Zone, etc.
  3. Import Data Maps required before importing the CRM Solution.
  4. Import Data required before importing the CRM Solution.
  5. Import the CRM Solution.
  6. Import Data Maps for initial data population.
  7. Import Data for initial data population.
  8. Publish SSRS reports.
  9. Import Team Associations.
  10. Publish workflows.

All steps apart from steps (1) and (5) are optional and apply only if your CRM customisations require them.

In my last post, I suggested structuring the CRM deliverables in the following format, and we will use the same structure when writing our deployment scripts.

[Image: Sample CRM Folder Structure]

For my deployment scripts, I will use MSBuild together with a library called the MSBuild Extension Pack. The library provides a rich set of functionality, and the March release of the library includes tasks for Microsoft Dynamics CRM as well.
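To use these tasks from a build script, the Extension Pack's .tasks file has to be imported into the MSBuild project. A minimal sketch, assuming the pack's default installation path (adjust ExtensionTasksPath to wherever the pack is installed on your build machine):

<PropertyGroup>
  <!-- Assumed install location of the MSBuild Extension Pack; change to match your machine -->
  <ExtensionTasksPath>C:\Program Files\MSBuild\ExtensionPack\4.0\</ExtensionTasksPath>
</PropertyGroup>
<Import Project="$(ExtensionTasksPath)MSBuild.ExtensionPack.tasks"/>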

Sample Deployment

The following is a sample listing of the deployment process described above. For simplicity, I have only included steps 1, 2, 5, 6 and 7.

<Target Name="DeployCrmOrganisation64">
<!-- Creating Crm Organisation-->
<MSBuild.ExtensionPack.Crm.Organization TaskAction="Create" DeploymentUrl=http://CRMServer/XRMDeployment/2011/Deployment.svc Name="organization1" DisplayName="Organization 1" SqlServerInstance="MySqlServer" SsrsUrl="http://reports1/ReportServer" Timeout="20" />

<!-- Update an Organization's Settings -->
<ItemGroup>
        <Settings Include="pricingdecimalprecision">
          <Value>2</Value>
        </Settings>

        <Settings Include="localeid">
          <Value>2057</Value>
        </Settings>  

        <Settings Include="isauditaneabled">
          <Value>false</Value>
        </Settings>
     
</ItemGroup>

<MSBuild.ExtensionPack.Crm.Organization TaskAction="UpdateSetting" OrganizationUrl="http://CRMServer/organization1" Settings="@(Settings)" />

<!-- Import Solutions -->

<MSBuild.ExtensionPack.Crm.Solution TaskAction="Import" OrganizationUrl=”http://CRMServer/organization1 Name="CrmSolution" Path="C:\Solutions" Extension="zip" OverwriteCustomizations="true" EnableSDKProcessingSteps="True" />

<!-- Import Data Map -->
<MSBuild.ExtensionPack.Crm.DataMap TaskAction="Import" OrganizationUrl="http://CRMServer/organization1" Name="Organization1" FilePath="C:\DataMapFile1" />

<!-- Import Data -->
<MSBuild.ExtensionPack.Crm.Data TaskAction="Import" OrganizationUrl="http://CRMServer/organization1" DataMapName="Entity1DataMap" SourceEntityName="entity1" TargetEntityName="entity1" FilePath="C:\DataFile1.csv" />

</Target>

The first step in the script is creating a new CRM organisation. The task used is “MSBuild.ExtensionPack.Crm.Organization” with a task action of “Create”. It takes as parameters the CRM instance’s deployment URL, the name and display name of the organisation, and the names of the SQL Server and SSRS instances. The Timeout parameter is optional; I specify it to prevent the deployment script from waiting indefinitely.

Once the organisation is created, the next step is to set certain organisation settings. Again, the task “MSBuild.ExtensionPack.Crm.Organization”, this time with the task action “UpdateSetting”, allows this. The task takes an ItemGroup of setting names and values as a parameter.

The next step is to import a managed solution into the newly created organisation. For this, the task used is “MSBuild.ExtensionPack.Crm.Solution” with a task action of “Import”. The task requires the path where the solution file is placed, plus the name and extension of the solution file. Also required are parameters specifying whether to overwrite any existing customisations in the target organisation, and whether to trigger CRM plug-ins and workflows as the solution is imported.

The final two steps simply import a data map and a data file into the newly created organisation. The parameters are self-explanatory. The MSBuild Extension Pack contains some other useful CRM tasks; for more details, read the project documentation at http://msbuildextensionpack.com/.

This concludes our discussion of an ALM process for solutions involving Microsoft Dynamics CRM. I hope you find this series useful; please do give your feedback.

Sunday, 16 March 2014

ALM with Microsoft Dynamics CRM – Setting up Team Build

This is the third of a multi-part series suggesting an ALM process in projects where Microsoft Dynamics CRM is used as a data store.

My previous blog post was about setting up a Development Build for developers so that they can build the system (including all the latest Microsoft Dynamics CRM artefacts) end to end. In this post, I will write about setting up a Team Build.

The purpose of the Team Build is to compile and build all system artefacts to produce a deployable package. The package is then read by the deployment scripts to deploy the system to a target environment. For a very simple project, the deliverable may be an executable or an MSI. For a more complicated system, it may include published websites, assemblies, databases, etc. For Microsoft Dynamics CRM, the deliverables will be a managed / unmanaged solution along with artefacts such as data maps, data import files, de-duplication rules, SSRS reports, etc.

CRM Deployment Overview:

Before describing the team build, let’s first take a brief look at what the CRM deployment script would do.

  1. Create a new CRM Organisation.
  2. Set Organisation settings.
  3. Import Solution.
  4. Import Data Maps.
  5. Import Data.
  6. Import Bulk Deletion Operations.
  7. Publish SSRS Reports.
  8. Set Field Level Security.
  9. Publish unpublished workflows.

The above is one of several possibilities and might not meet your exact requirements. For example, your solution might have to be deployed to an existing organisation, in which case step 1 is not needed. I will describe the deployment process in more detail in the next post.

Structuring CRM Package

Having taken a look at how the deployment of CRM would take place, let’s take a look at the Dynamics CRM deliverables and how to structure them in the deployment package. Some of the deliverables (such as plug-in assemblies) need to be compiled; others are taken straight from source control. In any case, it is essential that the deliverables come from source control and not from a CRM development instance. The following diagram describes how I would structure the deliverables in the CRM folder.

[Image: Sample CRM Folder Structure]

All these folders are contents of the CRM folder that is included in the cabinet file produced by the build. Let’s have a look at each of the folders:

  1. Assemblies: Contains the Microsoft Dynamics CRM deployment assemblies, such as Microsoft.Xrm.Sdk.Deployment.dll.
  2. BulkDeleteOperations: Contains the bulk deletion operation files exported from the development instance of CRM.
  3. Data: Contains initialisation data for the system, with a csv file for each entity that needs initialisation data as well as a data map file.
  4. DedupeRules: Contains the de-duplication rules for entities.
  5. FieldLevelSecurity: Contains team associations for field-level security of custom and out-of-the-box entities.
  6. Reports: Contains details of the reports to be published.
  7. Settings: Contains organisation setting details.
  8. Solutions: Contains the managed or unmanaged solutions that contain all the customisations.
  9. Workflows: Imported solutions do not have their workflows enabled automatically. This folder contains information about the workflows that need to be enabled.
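In plain text, the CRM folder in the deployment package therefore looks something like this:

CRM\
    Assemblies\
    BulkDeleteOperations\
    Data\
    DedupeRules\
    FieldLevelSecurity\
    Reports\
    Settings\
    Solutions\
    Workflows\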

Structuring CRM Source

The Microsoft Dynamics CRM source code is structured as the CRM SDK tools create it, and it is built as part of the compilation of the CRM solution in the team build. The following diagram shows a typical structure of the CRM source code.

[Image: Typical CRM source code structure]
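As a rough sketch (the exact folders depend on what your solution contains), the output of SolutionPackager’s Extract action typically looks like this:

Source\CRM\Solution1\
    Entities\
    OptionSets\
    PluginAssemblies\
    WebResources\
    Workflows\
    Other\
        Customizations.xml
        Relationships.xml
        Solution.xml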

Team Build:

The Team Build will take the contents of the above-mentioned folders, apart from the Solutions folder, straight from source control. The data csv files are maintained in source control, while other files such as data import files, bulk deletion operations, de-duplication rules, etc. are exported from the CRM development instance and checked in to source control.

The solutions, on the other hand, are created by the Team Build using CRM’s SolutionPackager utility. However, before the solution is packaged, a mapping file should be created to map the plug-in assemblies correctly; the FileMapping.ps1 file performs this action. A sketch of such a mapping file is shown below, followed by the target that packages the CRM solution for you.
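A SolutionPackager mapping file is a small XML document; a minimal sketch (the assembly name and relative path here are hypothetical):

<?xml version="1.0" encoding="utf-8"?>
<Mapping>
  <!-- Point the packed solution at the freshly built plug-in assembly rather than the extracted copy -->
  <FileToFile map="MyCompany.Crm.Plugins.dll" to="..\..\Binaries\Release\Server\CRM\MyCompany.Crm.Plugins.dll" />
</Mapping>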

<Target Name="PackageCRMSolution">
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="...Packaging CRM Solution" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>

  <!-- Copying CRM deployment files-->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Copying CRM Deployment files" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>
  <ItemGroup>
    <CrmDeploymentFiles Include="$(SolutionRoot)\Build\Deployment\CRM\**\*.*" Exclude="$(SolutionRoot)\Build\Deployment\CRM\Solutions\SolutionFiles\*.zip"/>
  </ItemGroup>
  <Microsoft.Build.Tasks.Copy SourceFiles="@(CrmDeploymentFiles)" DestinationFiles="@(CrmDeploymentFiles-&gt;'$(BinariesRoot)\Release\Server\CRM\%(RecursiveDir)%(Filename)%(Extension)')" />

  <!-- File Mapping -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Run File Mapping" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>
  <Microsoft.Build.Tasks.Exec command="powershell $(BuildToolsPath)\FileMapping.ps1 -binarySearchLocation &quot;$(BinariesRoot)\Release\Server\CRM&quot; -unpackFolderLocation &quot;$(SolutionRoot)\Source\CRM\Solution1&quot; -outputLocation &quot;$(BinariesRoot)\Release\Server\CRM&quot;" />

  <!-- Solution Packager UnManaged-->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Package CRM solution - Unmanaged" Status="Succeeded" Condition="'$(BuildUri)' != ''" />
  <Microsoft.Build.Tasks.Exec command="$(BuildToolsPath)\SolutionPackager /a:Pack /z:&quot;$(BinariesRoot)\Release\Server\CRM\Solutions\CrmSolution1_1_0_0_0_unmanaged.zip&quot; /f:&quot;$(SolutionRoot)\Source\CRM\Solution1&quot; /p:Unmanaged /m:&quot;$(BinariesRoot)\Release\Server\CRM\mapping.xml&quot;"/>

  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="......Package CRM solution - Managed" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>
  <Microsoft.Build.Tasks.Exec command="$(BuildToolsPath)\SolutionPackager /a:Pack /z:&quot;$(BinariesRoot)\Release\Server\CRM\Solutions\CrmSolution_1_0_0_0_managed.zip&quot; /f:&quot;$(SolutionRoot)\Source\CRM\Solution1&quot; /p:Managed /m:&quot;$(BinariesRoot)\Release\Server\CRM\mapping.xml&quot;"/>

  <ItemGroup>
    <CRMFilesToCleanUp Include="$(BinariesRoot)\Release\Server\CRM\*.*" Exclude="$(BinariesRoot)\Release\Server\CRM\*.zip" />
  </ItemGroup>
  <Delete Files="@(CRMFilesToCleanUp)" Condition="@(CRMFilesToCleanUp) != ''" />

  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)" BuildUri="$(BuildUri)" Message="...Completed Packaging CRM Solution" Status="Succeeded" Condition="'$(BuildUri)' != ''"/>   
</Target>

Once the above target is included in your team build, it will produce the CRM deployment folder as described above. I like to create a cabinet (.cab) file out of all the files in the Drop folder, but that is certainly optional. In my next post, I will write about the deployment script that deploys the CRM deliverable files produced by the Team Build.


Wednesday, 12 March 2014

ALM with Microsoft Dynamics CRM – Setting up a Development Build

This is the second of a multi-part series suggesting an ALM process in projects where Microsoft Dynamics CRM is used as a data store.

In my previous blog post, I wrote about establishing an ALM process for projects involving Microsoft Dynamics CRM. The greatest challenge in projects with Microsoft Dynamics CRM is ensuring that the system can be restored to a known baseline state and that the deployment process deploys code, customisations and data in a repeatable and reliable way. In that post, I wrote about the three constituent pieces of the ALM process:
  • Development Build
  • Team Build
  • Deployment

In this post, I will elaborate on the “development build” part of the process.

Development Build

The purpose of the development build is threefold:

1) To ensure that the complete solution can be compiled end-to-end - Usually, a software solution consists of more than one Visual Studio solution, and these solutions have inter-dependencies, i.e. libraries from one solution are used by other solutions. Before checking in changes, a developer needs to ensure that there are no build breaks in any of the dependent solutions. The dev build will build all the Visual Studio solutions in order, placing the output of each at the location from which the dependent solutions reference it, as sketched below.
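As a minimal sketch (the solution names and output path are hypothetical), this part of the dev build might simply chain the MSBuild task:

<Target Name="BuildSolutions">
  <!-- Build the shared libraries first so that dependent solutions pick up fresh binaries -->
  <MSBuild Projects="$(SolutionRoot)\Source\Common\Common.sln" Properties="Configuration=Release;OutDir=$(SolutionRoot)\Libraries\" />
  <!-- Then build the solutions that reference those libraries -->
  <MSBuild Projects="$(SolutionRoot)\Source\Services\Services.sln" Properties="Configuration=Release" />
</Target>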


2) To set up a complete, isolated environment locally for developers - Usually, a software solution has quite a few artefacts such as Active Directory users/groups, databases, web services, Windows services, etc. Typically, a developer works on one part of it at a time. The development build sets up a scaled-down system, allowing him/her to test their work area without having to rely on an integration environment.


3) To run integration tests - Executing integration tests in one form or another is vital in ensuring that developers are not destabilising the system as they check in; this is especially important for bigger teams. Some would argue that this should happen in the Continuous Integration build and in the Continuous Deployment process. In my experience, leaving it ONLY in the continuous deployment process makes finding errors more difficult and results in a large number of BVT (Build Verification Testing) failures. A sketch of the test step follows.
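For the test step, a hedged sketch using vstest.console.exe (the Visual Studio 2013 install path and test assembly name are assumptions):

<Target Name="RunIntegrationTests">
  <!-- vstest.console.exe ships with Visual Studio; adjust the path for your VS version -->
  <Exec Command="&quot;$(VS120COMNTOOLS)..\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe&quot; &quot;$(BinariesRoot)\Release\IntegrationTests.dll&quot;" />
</Target>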


CRM Development Build

If Microsoft Dynamics CRM is part of your end-to-end solution, you can include it in the development build process in one of the following two ways

1) Have a local installation of Microsoft Dynamics CRM on your machine. Each run of the development build will compile the Dynamics CRM code base and deploy a new CRM organisation using the deployment scripts. The advantage of this approach is that you are always working from checked-in code and can be certain that what you have on your development machine is what will be deployed to your test environments.


2) Have your own CRM organisation on a shared “development” CRM server. With this, your CRM team maintains a database backup of a stable organisation that they have deployed to using the CRM deployment scripts. Your development build restores this database and imports it as your organisation.


Given the effort and resources needed to set up Dynamics CRM, and the fact that the CRM SDK is required to build the CRM code base, I prefer option 2. The downside is that you will be relying on your CRM team to provide a stable organisation. However, the advantages are not needing the CRM development tools and a quicker development build.

The following diagram illustrates how CRM organisation is imported during the build process.


[Image: CRM Dev Build]

Like any other ALM process, the development build should be repeatable. This means that it should contain the following sequence of actions:

  1. Tear Down
  2. Build
  3. Deploy
  4. Start
  5. Test
In MSBuild, it would roughly be something like the following (I have deliberately omitted the tear-down and deployment of other artefacts to keep it simple):

<Target Name="Build" DependsOnTargets="TearDownCrm;Build;DeployCrm"/>

The tear-down script involves running PowerShell cmdlets on the remote CRM server. For this, remote PowerShell execution should be enabled on the server. Once it is enabled, you can use the following MSBuild task to execute PowerShell cmdlets on the CRM server remotely:

<UsingTask TaskName="PSExecTask" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v12.0.dll" >

  <ParameterGroup>
    <Server ParameterType="System.String" Required="true" />
    <Command ParameterType="System.String" Required="true" />
    <Args ParameterType="System.String" Required="false" />
    <FailOnError ParameterType="System.Boolean" Required="false" Output="false"/>
    <ExePath ParameterType="System.String" Required="true" Output="false"/>
  </ParameterGroup>
  <Task>
    <Using Namespace="System"/>
    <Using Namespace="System.IO"/>
    <Using Namespace=" System.Diagnostics"/>
    <Code Type="Fragment" Language="cs">
    <![CDATA[
        ProcessStartInfo start = new ProcessStartInfo();
        start.Verb = "runas";
        start.FileName = ExePath; // Specify exe name.
        Log.LogMessage(@"\\" + Server + @" " + Args + " " + Command);
        start.Arguments = @"\\" + Server + @" " + Args + " " + Command;
        start.UseShellExecute = false;
        start.RedirectStandardOutput = true;
        start.RedirectStandardError = true;
        try
        {
          using (Process process = Process.Start(start))
          {
            using (StreamReader reader = process.StandardOutput)
            {
              string result;
              result = reader.ReadToEnd();
              Log.LogMessage(result);
            }
            if ((process.ExitCode != 0) && (FailOnError == true))
            {
              Log.LogError("Exit code = {0}", process.ExitCode);
            }
            else
            {
              Log.LogMessage("Exit code = {0}", process.ExitCode);
            }
          }
        }
        catch (Exception ex)
        {
          Log.LogError("PSExec task failed: " + ex.ToString());
        }]]>

    </Code>
  </Task>
</UsingTask>

The Teardown script is shown below

<Target Name="TearDownCrm” Condition="’$(SkipCrmDeployment)’ != ‘true’”>

<PSExecTask Server="$(CRMWEBComputerName” Condition="powershell Add-PSSnapin Microsoft.Crm.Powershell; Disable-CrmOrganisation $(CrmNewOrganisationName); Remote-CrmOrganisation $(CrmNewOrganisationName)” ExePath="$(PsExec)">

<MSBuild.ExtensionPack.SqlServer.SqlExecute TaskAction="Execute"
                                                CommandTimeout="120"
                                                Retry="true"
                                                Sql="ALTER DATABASE $(CrmNewDatabaseName) SET SINGLE_USER WITH ROLLBACK IMMEDIATE; DROP DATABASE $(CrmNewDatabaseName);"
                                                ConnectionString="$(CrmDatabaseServerConnectionString)"
                                                ContinueOnError="true"/>
</Target>

The variables used in the script are pretty much self-explanatory. Note the ContinueOnError attribute in the tear-down: it is set because, on the first run of the build, there won’t be any database or organisation to tear down.


The Deployment script is shown below


<Target Name="DeployCrm" DependsOnTargets="RestoreCrmOrganisationDatabase;
                                                        ImportCrmOrganisation
                                      Condition="'$(SkipCrmDeployment)' != 'true'" />
<Target Name="RestoreCrmOrganisationDatabase">
  <MSBuild.ExtensionPack.SqlServer.SqlExecute TaskAction="Execute"
                                                CommandTimeout="120"
                                                Retry="true"
                                                Sql="IF EXISTS(Select * from sysdatabases WHERE NAME LIKE '$(CrmNewDatabaseName)') ALTER DATABASE $(CrmNewDatabaseName) SET SINGLE_USER WITH ROLLBACK IMMEDIATE; RESTORE DATABASE $(CrmNewDatabaseName) FROM DISK = N'$(CrmDatabaseBackupFile)' WITH REPLACE, FILE = 1, MOVE N'MSCRM' TO N'$(CrmDatabaseDataFileLocation)\$(CrmNewDatabaseName).mdf', MOVE N'MSCRM_log' TO N'$(CrmDatabaseDataFileLocation)\$(CrmNewDatabaseName)_log.ldf';"
                                               ConnectionString="$(CrmDatabaseServerConnectionString)"/>
</Target>  

<Target Name="ImportCrmOrganisation">
    <Copy SourceFiles="$(MSBuildProjectDirectory)\Resources\$(CrmUserMappingFile)" DestinationFiles="$(CrmFileStore)\$(CrmNewOrganisationName).xml"/>
    <MSBuild.ExtensionPack.FileSystem.Detokenise TaskAction="Detokenise" TargetFiles="$(CrmFileStore)\$(CrmNewOrganisationName).xml" DisplayFiles="true"/>
<PSExecTask Server="$(CRMWEBComputerName)"
            Command="powershell $(CrmImportOrganisationScriptPath) -sqlServerInstance '$(CrmSqlServerInstance)' -databaseName '$(CrmNewDatabaseName)' -reportServerUrl '$(CrmReportServerUrl)' -orgDisplayName $(CrmNewOrganisationName) -orgName $(CrmNewOrganisationName) -userMappingXmlFile '$(CrmFileStore)\$(CrmNewOrganisationName).xml'"
                        ExePath="$(PSExec)"/>
</Target>

The deployment involves restoring the database, which is done using the “SqlExecute” task in the MSBuild Extension Pack. Once the database is restored, the next action is to execute the “Import-CrmOrganization” cmdlet on the remote server. Once imported, the organisation is created from the database backup and available to the developer.
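For completeness, a minimal sketch of what the script at $(CrmImportOrganisationScriptPath) might contain, assuming it wraps the Import-CrmOrganization cmdlet from the Microsoft.Crm.PowerShell snap-in (the parameter names are indicative and should be checked against the deployment PowerShell documentation for your CRM version):

param($sqlServerInstance, $databaseName, $reportServerUrl, $orgDisplayName, $orgName, $userMappingXmlFile)

# Register the CRM deployment cmdlets and import the restored organisation database
Add-PSSnapin Microsoft.Crm.PowerShell
Import-CrmOrganization -SqlServerName $sqlServerInstance -DatabaseName $databaseName `
    -SrsUrl $reportServerUrl -DisplayName $orgDisplayName -Name $orgName `
    -UserMappingXmlFile $userMappingXmlFile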

In the next post, I will discuss setting up a Team Build for solutions containing Microsoft Dynamics CRM.


Friday, 7 March 2014

ALM with Microsoft Dynamics CRM

This is the first of a multi-part series suggesting an ALM process in projects where Microsoft Dynamics CRM is used as a data store.
 

In the last few years, I have worked on several solutions that use Microsoft Dynamics CRM as the primary back-office system. Dynamics CRM’s rich feature set and its ability to act as an XRM (eXtended Relationship Management) system make it an ideal alternative to bespoke database systems. However, in each of these solutions, there were some externally exposed services/systems which needed to access the data in Dynamics CRM, so there was a services layer to expose the required data to those services/systems. In other words, Dynamics CRM acts as a data store for other systems.

With Microsoft Dynamics CRM in the frame, establishing an ALM process has an additional challenge. This challenge comes from the fact that Dynamics CRM is essentially a platform onto which customisations (such as entities, plug-ins, workflows, data, etc.) are deployed. Moreover, the customisations are additive. Because of this, setting up a repeatable process is tricky. It is also vital that the “baseline” of the Dynamics CRM system is properly captured in any build and deployment process.

Development Build

One of the foremost activities at the start of development is to get a development build going. The purpose of the development build is to make sure that all constituent parts of the system are compiled, the unit tests are run and some level of integration testing is done on development workstations. Developers are, of course, required to run the development build before they check in.

The same can be achieved with gated builds in TFS, but from experience, running integration and Coded UI tests on a team build is somewhat “high maintenance”, and developers don’t get the isolation that they get with a development build on their local machines.

 

Using CRM in Development Build

So, with Microsoft Dynamics CRM in the picture, what should the process be? The very first thing for you to decide is how to achieve “isolation of environment” for developers. This is needed because each developer will be running his / her own set of integration tests. There are two options:

 1. Local CRM Instance
  • Each developer has a Microsoft Dynamics CRM server deployed locally.
  • A developer is a deployment admin on her/ his own CRM Instance.
  • The CRM team checks in CRM code and deployable packages to the repository.
  • Each run of the development build sets up a CRM organisation by compiling the checked-in code and using the deployment and data files from the repository.
2. Single Development CRM Instance
  • A single CRM server for all developers or a group of developers.
  • Each developer has his / her own CRM organisation.
  • All developers are deployment administrators on the development CRM instance.
  • The CRM team checks in CRM code and deployable packages to the repository AND also “publishes” a CRM organisation by taking a backup of a stable version.
  • Each run of the development build sets up the CRM organisation by restoring the “published” CRM database backup and running post-restoration tasks such as mapping users.

Each of the two options has advantages and disadvantages.

A local CRM instance on each development machine provides more isolation but requires more local resources and more CRM knowledge from the developers. It also means that the development build is slower, as setting up an organisation is slow, and that the CRM SDK needs to be installed on each developer’s machine.

The single development CRM instance means that all the developers are dependent upon the availability of one server; however, most of the CRM details are hidden from them. As long as they are able to restore an organisation database and import an organisation, they are fine.

PLEASE NOTE: The CRM team should always work on a separate instance either way, because publishing customisations in CRM is a resource-intensive operation, and doing it on a CRM server used by all the other developers will impact the velocity of the team.

Team Build

Like any other project, two types of Team Builds should be set up.

Continuous Integration Build – Triggered with each check-in, the purpose of the CI build is to ensure that all checked-in code (including CRM customisation code) compiles and builds correctly and passes all quality gates.

Product Build – The product build is triggered periodically (overnight in our case) and produces deployable packages from the source repository. In terms of CRM, the deployable package consisted of the managed and unmanaged solution zip files along with scripts for organisation settings, data import, data maps, etc.

PLEASE NOTE: It is important that you generate your CRM package from the source repository and not from the CRM development instance (for example, by taking a backup of the organisation database); otherwise your source repository will be side-lined and people will make code changes and fixes in the development environment without ever checking in source code, population scripts, etc.

The structure of CRM deployment packages warrants a separate discussion, which I will cover in the next post.

Deployment

The last piece of the ALM process is the deployment of CRM itself. The end goal is a deployment process that can deploy the package in a consistent and reliable way. The process needs to be repeatable so that you can run it every time you move from development to test to staging and then to production. This ensures that you don’t get any surprises when you deploy to production.

As mentioned earlier, any deployment to Microsoft Dynamics CRM is additive. This means that you need to make sure that the target system is properly baselined. For example, if your production environment has already got some customisations from, say, another managed CRM solution, make sure they are present in your functional test, pre-production and any other environments as well.

I prefer to use MSBuild-based scripts for deployments. The MSBuild engine ships with the .NET Framework, and libraries such as the MSBuild Extension Pack provide a rich set of functionality. You could equally use PowerShell; Dynamics CRM is well supported there too. However, the latest release of the MSBuild Extension Pack now has support for CRM tasks such as Create Organization, Import Solution, etc.

Updated 12/03/2014: In my next blog post, I have elaborated on the development build process with sample code.


Tuesday, 4 March 2014

The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\WebApplications\Microsoft.WebApplication.targets" was not found.

This is another “how I got burnt” post, shared here just in case anyone else finds the same issue. I moved one of my team projects from TFS 2012 to TFS 2013. After moving the solution, I noticed that one of my builds started producing the following error:

The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk.

This build compiles a solution containing an ASP.NET MVC project, which is what the error was complaining about. The solution had already been migrated to Visual Studio 2013, so why was it complaining about files for Visual Studio 2012 (v11.0) not being found? Looking closely, it transpired that it might be the TFS 2012 team build template, and indeed it was: although we had migrated our project to TFS 2013, we were still using the TFS 2012 build template. The solution was simple: pass the Visual Studio version as an MSBuild argument (as shown below) in the build definition, and voila, it’s all resolved.

[Image: msbuild arguments]
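For reference, in case the screenshot does not render, the MSBuild argument in question takes this form (12.0 being Visual Studio 2013):

/p:VisualStudioVersion=12.0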