Friday, 10 November 2017

Retention Policies for TFS Build System

The TFS build system has had a major overhaul since TFS 2015. For people who have worked on team builds since TFS 2010, there is a significant learning curve. One of the things people often find confusing is the retention policy in the new build system. In earlier versions of TFS, you could specify how many builds you wanted to retain for each status, as shown in the screenshot below

Retention Policies for Xaml Builds

The retention policy is quite obvious, and you have a deterministic number of builds retained for each status. That's not quite the case in the new build system. A sample retention policy in the new system looks like the following

Retention Policies for TFS Builds


So what does it mean? 

Well, to put it simply, it means exactly what it says on the tin! In the example above, the build definition would keep all builds from the last 4 days and a minimum of 20 builds. That is, if there are fewer than 20 builds within the last 4 days for the given build definition, older builds are retained until there are at least 20. Let's ignore the options with the lock sign; we will come back to them later. Note that there is no maximum count, which means you can't cap how many builds are kept for your build definition. This is a major shift from the earlier retention policy, where the number of builds kept for a build definition was deterministic.


When are builds actually cleaned up?

If you are using an on-premises version of TFS (I am using TFS 2017 Update 2), builds are actually not cleaned up until 3:00 AM every day. For VSTS, clean-up happens several times a day, but the time is not deterministic. That actually explains why there is only a "Minimum to Keep" option in the retention policy.

If you have a build definition that is triggered very frequently, you will need to find a way of actually deleting builds yourself. I will explain one in the next post.


What about "Keep for 365 days, 100 good builds"?

This is the option you see below your policy in the screenshot shown above. It is in fact a TFS Project Collection-wide policy and enforces the maximum retention. So, in the example above, you can't set "Days to Keep" to more than 365 or "Minimum to Keep" to more than 100. If you have the appropriate permissions, you can change it for the entire Team Project Collection.


TFS Project Collection Retention Policy


Multiple Policies

If you want, you can add multiple retention policies for your build definition. This is very useful if you have a build definition that builds different code branches (release branches, for instance). You can use the retention policies to keep a different number of builds from each branch.

Multiple Policies


If you have multiple retention policies for the same branch, the effective retention is the most lenient of them all, i.e. whichever retains the most builds.

In the next blog post, I will show how we are keeping a lid on the number of builds for build definitions that build very frequently, every couple of minutes in our case.

Sunday, 22 October 2017

C# Async Programming - Tasks for dummies

There are umpteen articles / blogs / guides about the Task-based Asynchronous Pattern used in C# for asynchronous programming. However, I feel that the explanations are often convoluted and difficult to follow for someone new to the language / programming. This week, I explained the pattern to a graduate following some review comments and couldn't find an easy-to-understand article, so I thought to explain it myself here.

This will be a series of blog posts. I will try to keep it as simple as possible without compromising on completeness, starting with one of the most basic concepts - "The Task".

What is a Task?
A Task is the C# abstraction of an asynchronous operation. In other words, it is a piece of code that executes asynchronously. It may or may not return a result.

How are Tasks created?
Tasks can be created explicitly, by creating an object of type Task or Task&lt;TResult&gt;, or implicitly, by running an async method. For example, both of the following expressions result in a task

var task = Task.Run( () => { … });

Func<Task> taskFunction = async () => { await foo(); };
taskFunction.Invoke();
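To make the result distinction above concrete, here is a minimal, self-contained sketch (using C# top-level statements for brevity) showing a task without a result alongside a Task&lt;TResult&gt; that produces one:

```csharp
using System;
using System.Threading.Tasks;

// A task with no result.
Task plainTask = Task.Run(() => Console.WriteLine("plain task ran"));

// A task that produces a result.
Task<int> resultTask = Task.Run(() => 21 * 2);

await plainTask;
int result = await resultTask;
Console.WriteLine(result); // prints 42
```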

Tasks and Threads
There are some differences between tasks created explicitly and implicitly, but let's not go there.

There is a common misunderstanding that creating a task means running on a new thread. This is not true. Whether or not a task runs on a new thread depends upon how it is created.

For tasks created using the Task Parallel Library, using Task.Run() for instance, the task is scheduled on a different (thread pool) thread. Running the following code

              
Console.WriteLine($"Application Thread ID : {Thread.CurrentThread.ManagedThreadId}");
var task = Task.Run(() =>
{
    Thread.Sleep(30);
    Console.WriteLine("Inside Task");
    Console.WriteLine($"Task Thread ID : {Thread.CurrentThread.ManagedThreadId}");
});
Console.WriteLine($"Back to application Thread ID : {Thread.CurrentThread.ManagedThreadId}");
task.Wait();

Will produce

Application Thread ID : 2
Back to application Thread ID : 2
Inside Task
Task Thread ID : 3

Tasks created by async methods DO NOT create a new thread. Once a task is blocked, control shifts to another task that is in the ready state.

For example, the following code 

Console.WriteLine($"Application Thread ID : {Thread.CurrentThread.ManagedThreadId}");
Func<Task> localTask = async () =>
{
    Console.WriteLine("Inside Task");
    Thread.Sleep(30);
    Console.WriteLine($"Task Thread ID : {Thread.CurrentThread.ManagedThreadId}");
};
localTask.Invoke();
  
Will produce 

Application Thread ID : 2
Inside Task
Task Thread ID : 2

Note that the thread ID is the same. This means that tasks, unless created by the Task Parallel Library, do not run in parallel. They share the same thread and use context switching to pass control as tasks block and become ready again.

The Await Operator
This brings us nicely to the await operator. In the simplest words, the await operator causes a context switch. The operator is used when the executing code needs a result from a task that is running asynchronously. Awaiting suspends the current routine, which resumes only when the task it is waiting on has completed.
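A short sketch of this in action (the DownloadLengthAsync method below is made up for illustration, standing in for some I/O-bound work):

```csharp
using System;
using System.Threading.Tasks;

// A made-up async method simulating I/O-bound work.
async Task<int> DownloadLengthAsync()
{
    await Task.Delay(100);  // control returns to the caller while the delay runs
    return 1024;
}

Console.WriteLine("Before the await");
int length = await DownloadLengthAsync(); // execution resumes here once the task completes
Console.WriteLine($"Length: {length}");
```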

Conclusion
I hope this post will help people in understanding C# tasks. Some of the key takeaways from this post:
  • Tasks can be created explicitly using the Task Parallel Library or implicitly using the async keyword.
  • Tasks are not the same as threads. Some tasks run on a new thread - the ones created by the TPL, for instance - while others run on the same thread.

Sunday, 15 October 2017

Visual Studio 2017 - New npm package won't install...

One of my "how I got burned today" blogs. Spent some time on it so thought to share.

I started writing a simplistic application using NodeJS today in Visual Studio 2017, and tried to install an npm package using the "Install New npm Package" option.



The "Install New npm Package" dialog opens. I typed in the name of the package and clicked "Install Package".


Absolutely nothing happened. No errors or messages were shown, and my package wasn't installed. It turned out that there was a syntax error in my package.json file, where I had missed out a comma. It would have been nice if Visual Studio had caught this error and shown some sort of message. I am using Visual Studio 2017 Update 3.

Friday, 13 October 2017

PowerShell - The curious case of @ in converted Json strings

PowerShell is great when it comes to working with JSON. Being a scripting language, you can pretty much de-serialize your JSON without declaring types for it, work on the de-serialized objects, and then serialize them back for storage or transport.

ConvertFrom-Json and ConvertTo-Json are powerful functions. However, there are a few nuggets that you need to be aware of. I got caught out by one of them, so I thought to blog about it.

When working with ConvertTo-Json, be mindful of the -Depth parameter. The parameter specifies how deep the function should go into your object while converting it to a JSON string. The default value is 2. What this means is that if you have a complex JSON object that goes more than two levels deep and you haven't specified the -Depth parameter, your nested objects will be treated as Hashtables.

As an example, let's assign a json string to a variable 

 $programmersJson = '[{ 
      "Name" : "Hamid",
      "Gender" : "Male",
      "Expertise": [  
        {
          "Skill": "PowerShell",
          "Level": "5"
        },
        {
          "Skill": "C#",
          "Level": "8"
        }
      ]},
      {
      "Name" : "Adnan", 
      "Gender" : "Male" ,
      "Expertise": [
        {
          "Skill": "PowerShell",
          "Level": "7"
        },
        {
          "Skill": "C#",
          "Level": "6"
        }
      ]}]'

The JSON string contains an object that is three levels deep. The top-level object is a collection; each item in the collection is an object with a "Name" and a "Gender" property as well as a collection of objects, each with a "Skill" and a "Level" property.

Now, let's call ConvertFrom-Json

$programmers = ConvertFrom-Json -InputObject $programmersJson

Write-Output $programmers

The result is an array of objects, as expected

Name  Gender Expertise
----  ------ ---------
Hamid Male   {@{Skill=PowerShell; Level=5}, @{Skill=C#; Level=8}}
Adnan Male   {@{Skill=PowerShell; Level=7}, @{Skill=C#; Level=6}}

Now, let's try to convert it back to JSON. So, when you call

 ConvertTo-Json -InputObject $programmers

You would expect the JSON string to be the same as $programmersJson. Wrong!! The string you get back is


[
  {
    "Name": "Hamid",
    "Gender": "Male",
    "Expertise": [
      "@{Skill=PowerShell; Level=5}",
      "@{Skill=C#; Level=8}"
    ]
  },
  {
    "Name": "Adnan",
    "Gender": "Male",
    "Expertise": [
      "@{Skill=PowerShell; Level=7}",
      "@{Skill=C#; Level=6}"
    ]
  }
]

Notice the @ sign for each of the items in the Expertise collection. It means that the function has treated each item as a Hashtable rather than an object.

Now execute the following

 ConvertTo-Json -InputObject $programmers -Depth 3

The -Depth parameter makes it treat each item of the Expertise collection as an object as well, and the resulting JSON is as you would expect.

[
  {
    "Name": "Hamid",
    "Gender": "Male",
    "Expertise": [
      {
        "Skill": "PowerShell",
        "Level": "5"
      },
      {
        "Skill": "C#",
        "Level": "8"
      }
    ]
  },
  {
    "Name": "Adnan",
    "Gender": "Male",
    "Expertise": [
      {
        "Skill": "PowerShell",
        "Level": "7"
      },
      {
        "Skill": "C#",
        "Level": "6"
      }
    ]
  }
]


This is how we expected the output JSON string to look. So, next time you are working with JSON in PowerShell, make sure to be mindful of the -Depth parameter.

Monday, 9 October 2017

Migrating ASP.NET MVC website to ASP .NET Core

I maintain an ASP.NET MVC website that I had been meaning to move to ASP.NET Core, but I found the .NET Core 1.1 library rather limited. With the release of .NET Core 2.0 and ASP.NET Core 2.0, we decided to migrate the website to the new framework. The site has been operational since October 2010 and was built using ASP.NET MVC 2.0. It has gone through various bouts of upgrades and is currently using ASP.NET MVC 5.2.0, which forms the baseline of this conversion. I made several discoveries along the way, so I thought to blog about them.

In this post, I am going to write about the prep work and moving our "Model" to Entity Framework Core 2.0.


Background

The model of our website was built using Entity Framework code first. All database operations were performed using the repository pattern. Our repository interface looks as follows

    public interface IRepository<TEntity> : IDisposable where TEntity : class
    {
        IQueryable<TEntity> GetQuery();
        IEnumerable<TEntity> GetAll();
        IEnumerable<TEntity> Find(Expression<Func<TEntity, bool>> predicate);
        TEntity Single(Expression<Func<TEntity, bool>> predicate);
        TEntity First(Expression<Func<TEntity, bool>> predicate);
        void Add(TEntity entity);
        void Delete(TEntity entity);
        void Attach(TEntity entity);
        void SaveChanges();
        DbContext DataContext { get; }
    }

We use interface inheritance to create a repository interface for each of our model objects, so for an object "Token", the repository looks like the following


    public interface ITokenRepository : IRepository<Token>
    {
    }

With the interface inheritance in place, our single generic repository class can contain the logic for database operations, as shown below

public class Repository<TEntity> : IRepository<TEntity> where TEntity : class
    {
        private DbContext _context;

        private IDbSet<TEntity> _dbSet;

        private static string _connectionString = string.Empty;

        public Repository(IDataContextFactory dbContextFactory)
        {
            if (string.IsNullOrWhiteSpace(dbContextFactory.ConnectionString))
            {
                _context = dbContextFactory.Create(ConnectionString);
            }
            else
            {
                _context = dbContextFactory.Create();
            }
            
            _dbSet = _context.Set<TEntity>();
        }

        public Repository(DbContext context)
        {
            _context = context;
            _dbSet = _context.Set<TEntity>();
        }

        public DbContext DataContext
        {
            get
            {
                return _context;
            }
        }
        public IQueryable<TEntity> GetQuery()
        {
            return _dbSet;
        }

        public IEnumerable<TEntity> GetAll()
        {
            return GetQuery().AsEnumerable();
        }

        public IEnumerable<TEntity> Find(Expression<Func<TEntity, bool>> predicate)
        {
            return _dbSet.Where(predicate);
        }

        public TEntity Single(Expression<Func<TEntity, bool>> predicate)
        {
            return _dbSet.SingleOrDefault(predicate);
        }

        public TEntity First(Expression<Func<TEntity, bool>> predicate)
        {
            return _dbSet.FirstOrDefault(predicate);
        }

        public void Delete(TEntity entity)
        {
            if (entity == null)
            {
                throw new ArgumentNullException("entity");
            }

            _dbSet.Remove(entity);
        }

        public void Add(TEntity entity)
        {
            if (entity == null)
            {
                throw new ArgumentNullException("entity");
            }

            _dbSet.Add(entity);
        }

        public void Attach(TEntity entity)
        {
            _dbSet.Attach(entity);
        }

        public void SaveChanges()
        {
            _context.SaveChanges();
        }

        public void Dispose()
        {
            Dispose(true);
            GC.SuppressFinalize(this);
        }

        protected virtual void Dispose(bool disposing)
        {
            if (disposing)
            {
                if (_context != null)
                {
                    _context.Dispose();
                    _context = null;
                }
            }
        }

        public static string ConnectionString 
        {
            get
            {
                if (string.IsNullOrWhiteSpace(_connectionString))
                {
                    _connectionString = ConfigurationManager.ConnectionStrings["Rewards"].ConnectionString;
                }

                return _connectionString;
            }
        }
    }

The class above does all the heavy lifting for us. We just need to define classes that implement each of our models' repository interfaces. For our model Token, it would be


public class TokenRepository : Repository<Token>, ITokenRepository
    {
        public TokenRepository(IDataContextFactory dbContextFactory)
            : base(dbContextFactory)
        {   
        }

        public TokenRepository(DbContext dataContext) 
            : base(dataContext)
        {
        }
    }


Entity Framework Core 2.0 limitations

1. No Many-To-Many Relationship

The biggest issue we encountered while migrating to .NET Core 2.0 is the lack of resolution for many-to-many relationships. This is an open issue, which hasn't been resolved yet. For us, it means a lot of re-work.

With the POCO way of working, you start by writing your domain model and then write your business logic in terms of the model, without really thinking about relational database details. We have a lot of code where LINQ queries were based on domain model relationships. Now, we need to re-work all of those.

This, in my mind, is a major issue; although there are ways to work around it, it prevents Entity Framework Core from being a true ORM tool.

As an example, consider two entities, Parent and Student, in your model, where a student can have multiple parents and a parent can have multiple students. With Entity Framework 6, the model definition was sufficient to imply the correct type of relationship. If you had to do it explicitly, you could do so at model creation time, like below
modelBuilder.Entity<Student>()
              .HasMany(s => s.Parents)
              .WithMany(p => p.Students)
              .Map(m =>
              {
                  m.ToTable("ParentStudents");
                  m.MapLeftKey("Student_ID");
                  m.MapRightKey("Parent_ID");
              });
You can then go on to work with a collection of Parents in the Student class and a collection of Students in the Parent class. The .HasMany().WithMany() configuration used above is not available in Entity Framework Core.

The lack of the many-to-many feature in EF Core is hard to justify: POCO came about as a good model for domain-driven development, and a domain-driven world needs many-to-many relationships. We didn't want to "dilute" the model with resolving entities, but in the end we decided to "implement" the many-to-many resolution in our code. This series of posts describes a good way of keeping domain relationships in our objects, so that there is no change to business logic in other parts of the application.
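A common way to implement the resolution yourself is an explicit join entity mapped with two one-to-many relationships. The sketch below assumes hypothetical Student and Parent classes, each with a ParentStudents navigation collection; all names are illustrative, not from our actual model:

```csharp
// Hypothetical join entity resolving the many-to-many relationship.
public class ParentStudent
{
    public int StudentId { get; set; }
    public Student Student { get; set; }

    public int ParentId { get; set; }
    public Parent Parent { get; set; }
}

// In the DbContext:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Composite key over both foreign keys.
    modelBuilder.Entity<ParentStudent>()
        .HasKey(ps => new { ps.StudentId, ps.ParentId });

    // Two one-to-many relationships replace the single many-to-many.
    modelBuilder.Entity<ParentStudent>()
        .HasOne(ps => ps.Student)
        .WithMany(s => s.ParentStudents)
        .HasForeignKey(ps => ps.StudentId);

    modelBuilder.Entity<ParentStudent>()
        .HasOne(ps => ps.Parent)
        .WithMany(p => p.ParentStudents)
        .HasForeignKey(ps => ps.ParentId);
}
```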


2. IDbSet Interface 

The IDbSet interface was deprecated in Entity Framework 6.0 because the team wanted to add new operations without defining a new set of interfaces (IDbSet2, etc.); this is pretty well documented in the EF 6.0 design decisions. I do not agree with this decision, as it breaks the whole promise of an interface being immutable. However, the interface was still present in the EntityFramework 6.0 library, so our code still worked. In EF Core it is gone, so we had to replace every use of IDbSet with the DbSet class. This also meant our test code had to be re-written, as we mocked IDbSet to fake results from the database.
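For our repository class, the mechanical part of the change looks roughly like this (a sketch of the relevant fragment, not the full class):

```csharp
// Before (Entity Framework 6):
// private IDbSet<TEntity> _dbSet;

// After (Entity Framework Core): the concrete class replaces the interface.
private DbSet<TEntity> _dbSet;

public Repository(DbContext context)
{
    _context = context;
    _dbSet = _context.Set<TEntity>(); // Set<TEntity>() returns DbSet<TEntity>
}
```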


3. No Lazy Loading

Entity Framework Core does not support lazy loading as of yet. There is an open issue for it on GitHub. The feature request is in the EF team's backlog, but there is no date for it yet. Lazy loading is the default behaviour of Entity Framework 6 and kicks in when a navigation property is declared as virtual. This is another big way in which Entity Framework Core breaks backward compatibility.

The way around it is "eager loading", i.e. making sure you use the .Include() and .ThenInclude() methods in all the places where you were relying on lazy loading. This is not simple, as it's easy to miss a place, and the error only manifests itself at run time. One way to go about it is to find references to all virtual navigation properties and add .Include() wherever the object is "hydrated".
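For example, a query that used to rely on virtual navigation properties being lazily loaded now has to state its loading intent up front (the model names here are illustrative):

```csharp
using Microsoft.EntityFrameworkCore; // Include / ThenInclude extension methods

// Eagerly load the Students collection and, for each student, their Address.
var school = context.Schools
    .Include(s => s.Students)
        .ThenInclude(st => st.Address)
    .First(s => s.Id == schoolId);
```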


4. No GroupBy Translation

Entity Framework Core 2.0 doesn't translate GroupBy to SQL. So, if your application uses the GroupBy() method, you might need to look for alternatives. Fortunately, more support for GroupBy is being added in EF Core 2.1.

The only way to resolve this issue without a punitive performance impact is to move the logic into stored procedures. We were using GroupBy mostly in our reports, which were already candidates for stored procedures. So, although there was some work involved, the result was much better performance.
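If a stored procedure is overkill for a particular query, another (potentially expensive) interim option is to force the grouping to happen in memory; the Programmers set below is hypothetical:

```csharp
// AsEnumerable() switches to LINQ-to-Objects, so the GroupBy runs in memory
// after the rows have been fetched - fine for small sets, punitive for large ones.
var skillCounts = context.Programmers
    .AsEnumerable()
    .GroupBy(p => p.Skill)
    .Select(g => new { Skill = g.Key, Count = g.Count() })
    .ToList();
```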


Final Words...

My experience of migrating code from Entity Framework 6.0 to Entity Framework Core 2.0 will not have uncovered all the issues in the migration process, but this post might help someone who is looking to take the plunge.

In my view, Entity Framework Core 2.0 is still a bit undercooked, but if you are willing to put in the extra effort, it has enough functionality for you to move your model / data libraries to it.

Saturday, 23 September 2017

Extending Team Explorer in Visual Studio 2017

Visual Studio extensibility has always been a great feature of Visual Studio and enhances the entire development experience. With Visual Studio 2017, a bunch of very substantial changes were made with respect to extensibility. Most of these changes come from the fact that Visual Studio now supports a lighter installation, with a bare-minimum feature set as the default. There is also the option to have multiple installations on the same machine. So, what does this mean for extensions?

VS2017 extensions now follow the VSIX v3 file format. If you have an extension for an earlier Visual Studio version and you want to port it to VS2017, that means a whole bunch of changes. Here, I am going to write an extension that demonstrates extending Team Explorer. We will create a very simple extension that adds a button to Team Explorer which opens Notepad.

Project Creation & Dependencies

Let's start by creating a new extensibility VSIX project. You will only see this option if you selected the VS SDK option while installing Visual Studio. Let's call our project TeamExplorerExtSample. Visual Studio 2017 uses .NET Framework 4.6.1, so we select this version.



Once the project is created, you will see a couple of web files and a file called source.extension.vsixmanifest, which contains the extension information. We will come back to this file later.

Now let's add references to the assemblies we need to extend Team Explorer. Note that with Visual Studio 2017, assemblies are not added to the GAC, so we need to make sure that all required assemblies are included in the VSIX. To display a navigation button in Team Explorer, we need to implement the interface ITeamExplorerNavigationItem2, so we add references to the following assemblies
  •     Microsoft.TeamFoundation.Controls
  •     System
  •     System.ComponentModel.Composition
  •     System.Drawing

VSIX Manifest file:

The manifest file contains information about the extension, its dependencies, assets and prerequisites. Double-click on source.extension.vsixmanifest to see the details. To extend Team Explorer, the key thing to remember is to add the assembly containing the classes that implement the Team Explorer interfaces as a MEF component. This ensures that Visual Studio loads it when loading Team Explorer.

Our VSIX manifest file looks like this


Implementing ITeamExplorerNavigationItem2

Our extension will create a button in Team Explorer that opens up the Notepad application. To do this, we need to implement the ITeamExplorerNavigationItem2 interface. The interface is found in the Microsoft.TeamFoundation.Controls assembly that we have already referenced. We also need to add the TeamExplorerNavigationItem attribute. Our very simple class looks as below.


namespace TeamExplorerExtSample
{
    using System;
    using System.ComponentModel;
    using System.Diagnostics;
    using System.Drawing;
    using Microsoft.TeamFoundation.Controls;

    [TeamExplorerNavigationItem("C9B2CF74-0C87-4CEA-ACA9-8CC1C816D7F3", 1800)]
    public class NotepadNavigationItem : ITeamExplorerNavigationItem2
    {
        public bool IsEnabled => true;
        public int ArgbColor => 0;
        public object Icon => null;
        public string Text => "Open Notepad";
        public Image Image => null;
        public bool IsVisible => true;

        public event PropertyChangedEventHandler PropertyChanged;

        public void Dispose()
        {
            this.Dispose(true);
            GC.SuppressFinalize(this);
        }

        protected virtual void Dispose(bool disposing)
        {
        }

        public void Execute()
        {
            // Open a new instance of Notepad when the navigation item is clicked.
            Process.Start("notepad.exe");
        }

        public void Invalidate()
        {
        }
    }
}

As you can see, the only real logic in the class is a call to Process.Start to launch Notepad. The navigation item appears as below


Click on the button and a new instance of notepad opens up.

Conclusion:

Admittedly, this is a very simple extension, but it contains all the steps you need to extend Team Explorer. You can add classes to add pages, sections and links to Team Explorer, and add icons \ images and menu items. The code sample from this post is here

Friday, 22 September 2017

Shelveset Comparer now supports Visual Studio 2017

The popular Shelveset Comparer extension that I created a few years ago now supports Visual Studio 2017 as well.

It took me some time to create a compatible version due to a load of things happening in my personal life. There is also the added reason that I am now using Git for almost all the projects I am working on, so the need for shelveset comparisons wasn't felt as much as it would have been.

While working on the new version, I had to learn about the very substantial changes in Visual Studio extensibility. I will write a blog post about it. Please feel free to download the extension and give me your feedback.


<ciao />