Debug TypeScript Unit Tests with Jest and VsCode

If you are using create-react-app, or its TypeScript equivalent react-scripts-ts, you will see that the default testing framework is Jest. Jest is developed by Facebook, like React and Redux. Jest brings many advantages, one of them being speed: it doesn’t need to load a headless browser, or any browser at all. It is also fast because it can run only the tests affected by a change, or the tests related to the changed code, instead of running every test. In this article, I’ll guide you through setting up Visual Studio Code so you can debug directly in TypeScript. I am taking the time to write this down because information on the Internet is very slim and the configuration is fragile.

As mentioned, configuring Visual Studio Code with Jest requires subtle details that can break the whole experience. For example, using Node 8.1.4 won’t work, while Node 8.4 or 8.6 works. Another sensitive area is the Visual Studio Code configuration itself, which requires some specific settings that can vary. The following code shows two different launch configurations that work with Visual Studio Code.

{
    "type": "node",
    "request": "launch",
    "name": "Jest 1",
    "program": "${workspaceRoot}/node_modules/jest/bin/jest",
    "args": [
        "-i"
    ],
    "preLaunchTask": "tsc: build - tsconfig.json",
    "internalConsoleOptions": "openOnSessionStart",
    "console": "integratedTerminal",
    "outFiles": [
        "${workspaceRoot}/build/dist/**/*"
    ],
    "envFile": "${workspaceRoot}/.env"
}

// OR

{
    "name": "Jest 3",
    "type": "node",
    "request": "launch",
    "program": "${workspaceRoot}/node_modules/jest-cli/bin/jest.js",
    "stopOnEntry": false,
    "args": [
        "--runInBand"
    ],
    "cwd": "${workspaceRoot}",
    "preLaunchTask": null,
    "runtimeExecutable": null,
    "runtimeArgs": [
        "--nolazy"
    ],
    "env": {
        "NODE_ENV": "test"
    },
    "console": "integratedTerminal",
    "sourceMaps": true
}

The second one requires jest-cli; the first one does not. To download jest-cli, use NPM.

npm install --save-dev jest-cli

From there, you can start the debug configuration directly inside Visual Studio Code under the Debug tab, or hit F5.
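To have something to step through, here is a minimal hypothetical TypeScript test; the file name, the math module and its add function are made up purely for illustration. With either launcher above, a breakpoint set inside the test will be hit when you press F5.

// math.test.ts -- hypothetical example used only to demonstrate debugging
import { add } from "./math"; // assumes a small module exporting add(a, b)

describe("add", () => {
    it("sums two numbers", () => {
        const result = add(1, 2); // put a breakpoint here and press F5
        expect(result).toBe(3);
    });
});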

Compiling Individual TypeScript File with VSCode

TypeScript .ts files are compiled into JavaScript .js files. The basic setup is to have a tsconfig.json file that indicates which files to include and which to exclude. Almost everyone has a Gulp/Grunt task that executes the compilation; some go a step further with a watcher that looks for changes and starts compiling. The problem is that when a project grows, it doesn’t make sense, in terms of efficiency, to build every TypeScript file when only one changes.

The first step is to create a new Gulp task. It will look similar to the following one which, for the moment, hard-codes a single file to compile.

const gulp = require('gulp');
const tsc = require('gulp-typescript');
const tsProject = tsc.createProject('tsconfig.json');

gulp.task("buildsinglefile", () => {
    const pathWithFileNameToCompile = "src/folder1/file2.ts";
    const pathWithoutFileNameForOutput = "./output/folder1";
    return gulp.src(pathWithFileNameToCompile)
        .pipe(tsc({
            "target": "es6",
            "module": "amd"
        }))
        .pipe(gulp.dest(pathWithoutFileNameForOutput));
});

As you can see from the first two lines, the task needs two npm packages: gulp and gulp-typescript (which also needs typescript). A third package, gulp-sourcemaps, is used by the final version of the task below.

npm install gulp --save-dev
npm install gulp-typescript typescript --save-dev
npm install gulp-sourcemaps --save-dev

The Gulp script needs to receive some information from Visual Studio Code as parameters. To pass this data, we need VSCode to send its variables through arguments in tasks.json.

{
    "version": "0.1.0",
    "command": "gulp",
    "isShellCommand": true,
    "args": [
        "--workspaceRoot",
        "${workspaceRoot}",
        "--fileDirname",
        "${fileDirname}",
        "--file",
        "${file}"
    ],
    "tasks": [
        {
            "taskName": "buildsinglefile",
            "isBuildCommand": true,
            "args": []
        }
    ]
}

To make it work, we pass a double dash followed by the name of the parameter, then the corresponding VSCode macro variable. The Gulp task then needs to consume the arguments. The arguments passed are indexed and contain more than just what the VSCode task passes down: index 0 is the node.exe path, index 1 is the gulp.js path, and index 2 is where our own arguments start, which in our case contains "--workspaceRoot" and so on. Index 7 is the first argument we actually use: it contains the file on which the task is executed, and we need it to tell TypeScript which file to compile. Next, we use arguments 5 and 3 to create the output path. This part needs to be customized to your file structure. In the illustrated case, the structure looks like ./src/same-structure-after, which is compiled to ./output/same-structure-after.
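Given the tasks.json above, the process.argv array that the Gulp task below consumes should look roughly like this; the exact paths are only illustrative, and the task name follows at the end.

// Rough shape of process.argv for the task above (paths are illustrative):
// [0] C:\Program Files\nodejs\node.exe
// [1] C:\myproject\node_modules\gulp\bin\gulp.js
// [2] --workspaceRoot
// [3] C:\myproject
// [4] --fileDirname
// [5] C:\myproject\src\folder1
// [6] --file
// [7] C:\myproject\src\folder1\file2.ts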

gulp.task("buildsinglefile", () => {
    const arguments = process.argv;
    const pathWithFileNameToCompile = arguments[7];
    const pathWithoutFileNameForOutput = arguments[5].replace(arguments[3], ".").replace("\\src\\", "\\output\\");

    const step1 = gulp.src(pathWithFileNameToCompile)
        .pipe(sourcemaps.init())
        .pipe(tsc({
            "target": "es6",
            "module": "amd"
        }))

    step1.pipe(gulp.dest(pathWithoutFileNameForOutput));

    step1.dts.pipe(gulp.dest(pathWithoutFileNameForOutput));
    return step1.js
        .pipe(sourcemaps.write('.'))
        .pipe(gulp.dest(pathWithoutFileNameForOutput));
});

To use it, we only need to open the single file we want to compile, press “F1”, type “task”, select “Run Task” and select “buildsinglefile”.

This will generate the source map file as well as the JavaScript file.

Having a quick way to build the file you are working on is crucial to move forward fast, and this is one step in that direction. Two possible improvements would be to have a shortcut to execute this task, and to have a file watcher that executes a compilation on the modified file.

Visual Studio Code and Debugging Gulp Script

Microsoft Visual Studio Code (VSCode) has the capability to debug TypeScript, JavaScript and a lot of other languages. One interesting thing is that Visual Studio Code can also debug Gulp scripts (written in JavaScript). This is very useful since writing code that becomes complex is easier with good step-through capabilities.

To do so, we need to add a launch configuration for VSCode. This is done by adding a folder called “.vscode” at the root of your project and creating a file named “launch.json” inside it.

This file is used by Visual Studio Code when you hit “F5” or when you go to the Debug panel on the left and click “Play”.

The configuration contains a few items that are required, and some that you need to adjust.

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "node",
            "request": "launch",
            "name": "Debug Gulp",
            "program": "${workspaceRoot}/node_modules/gulp/bin/gulp.js",
            "stopOnEntry": true,
            "args": [
              "copy"  
            ],
            "cwd": "${workspaceRoot}/",
            "outFiles": [],
            "sourceMaps": true,
            "runtimeExecutable": null,
            "env": {
                
            }
        }
    ]
}

The type and request properties are required to be “node” and “launch”, which tell VSCode that we will debug a Node application. The “name” property is the name shown in the debugger; in the screenshot it reads “Debug Gulp”, which is the name specified here. The very important part is “program”, which must point to Gulp. So far, we have told VSCode to execute Node with Gulp as the program.

“stopOnEntry” is not required, but it stops right when the program loads. I find it handy because it lets me go set my breakpoints in the Gulp task I want to debug. The task to debug is defined under “args”; in my example above, I am debugging the task named “copy”. The “cwd” is where the gulpfile.js is located, which is where the task to debug lives. It is wise to use the workspace root variable so the path starts from a stable, unchangeable root.
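For reference, a minimal gulpfile.js with a “copy” task that can be stepped through with this configuration could look like the following sketch; the source and destination folders are only placeholders.

const gulp = require('gulp');

// Hypothetical "copy" task: copies every file from ./src to ./output.
gulp.task('copy', () => {
    return gulp.src('./src/**/*')      // breakpoints can be set here
        .pipe(gulp.dest('./output'));  // and stepped through with F10/F11
});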

And that’s it! You will have the possibility to step through all the Gulp’s task code.

Visual Studio Code with NPM and TypeScript (Part 1 : NPM)

Creating a new project that uses TypeScript with Visual Studio Code may not be as straightforward as expected. A quick search on the web shows dozens of different ways, and none of them are the same. Most current examples are out of date or use so many npm packages that they are hard to follow for people coming from a full-blown IDE and framework. This article’s goal is to present the simplest possible way to get a TypeScript project up and running. Even if we want to be very simple, a few technologies will be required; we will try to use the minimum.

Before getting started, you need to have NodeJs. This is a requirement because we will use npm. Right there, you may wonder what these two words mean. NodeJs is needed to use npm, which is the biggest JavaScript library repository. For the simplicity of this article, see NodeJs as a system that allows you to run tools, and npm as one of those tools. See npm like NuGet in the .Net ecosystem, but this time for JavaScript. NodeJs can be installed directly from the website: https://nodejs.org/en/download/

Using npm requires PowerShell or a command line. Before that, let’s create a new directory that will host the project and initialize npm.

mkdir tsWithCode1
npm init

The npm init command creates a package.json file that holds information about which packages are needed. See it as a shopping cart of libraries that we will download and install later. For this project, we want to use TypeScript, JQuery and Gulp. The first one is the TypeScript compiler, the second is a popular library that is not written in TypeScript, and the third one is a task runner. We will go into more detail soon; just keep in mind that these are three different libraries that need to be handled differently. One difference is that TypeScript and Gulp are used at development time, while JQuery is needed in development and in production. We install them differently because of this, since we do not want libraries in production that are not needed there.

npm install --save-dev typescript
npm install --save-dev gulp
npm install --save jquery

The resulting package.json should contain the init parameters you provided (name, description, etc.) and two dependency sections.

{
  "name": "tswithcode1",
  "version": "1.0.0",
  "description": "TypeScript with NPM and VSCODE",
  "main": "index.js",
  "scripts": {
    "test": "test"
  },
  "author": "Patrick Desjardins",
  "license": "ISC",
  "devDependencies": {
    "gulp": "^3.9.1",
    "typescript": "^2.1.5"
  },
  "dependencies": {
    "jquery": "^3.1.1"
  }
}

At this point, you can look in your directory and see a newly generated folder named “node_modules”. This folder doesn’t need to be in your source control, since you can simply invoke “npm install” in the directory to get it back with all the libraries. When using the install command, npm looks in the package.json configuration file and gets what is needed. I suggest that you delete “node_modules” and try the command. But before doing so, look at how many folders “node_modules” contains. At the time I am writing this article, the number is 162. The count is way more than just 3 (TypeScript, Gulp and JQuery) because each of these libraries has dependencies on other libraries that also have dependencies, and so on. Since this is an introduction article, I won’t go into too much detail, but it’s possible to install packages globally on your computer (%AppData%\npm\node_modules). If you are developing several projects, you may want to install common utility tools globally, like TypeScript or Gulp. The advantage is that you avoid having the same files in every project, hence saving disk space.
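For example, if you choose the global route, installing the tooling is a single flag away; keep in mind that a globally installed TypeScript or Gulp version is then shared by every project on the machine.

npm install -g typescript
npm install -g gulp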

Before going to the next article, which will set up the HTML file and create the first TypeScript file, let’s add one additional library: requirejs. This library will be used to load the TypeScript modules that we will see soon.

npm install --save requirejs

How to use Visual Studio 2015 with ASP.NET MVC5 and TypeScript 2.1+

TypeScript is a wonderful language for front-end development. It helps by making front-end code feel like C#. At the end, you compile the TypeScript code into JavaScript, just as you would have written without TypeScript. The first step is to set up Visual Studio to use TypeScript. On the official website, the instructions are for Asp.Net MVC4, which doesn’t work very smoothly with MVC5 and the latest version of TypeScript. In fact, following those instructions will lead you to a compilation problem telling you that VSTSC doesn’t exist. In this article, I’ll show you the quickest way to use TypeScript.

The first step is to download the latest version of TypeScript. This installs TypeScript in Program Files (C:\Program Files (x86)\Microsoft SDKs\TypeScript). Be aware that you may already have some older versions of TypeScript (like 1.6 and 1.8), but you want to have 2.1.

Installing TypeScript should take about 3 minutes. Once it’s done, you need to add a tsconfig.json file at the root of your project. This second step adds a file that provides configuration to TypeScript, for example where to take the TypeScript files from and where to output the JavaScript result. Here is an example:

{
  "compilerOptions": {
    "sourceMap": true,
    "target": "es6",
    "outDir": "./Scripts/App"
  },
  "files": [
    "./src/app.ts"
  ]
}

This tells the compiler to take the file app.ts from the src folder and create the corresponding JavaScript in outDir. In this example, we take the file in /src/ and output the result in /Scripts/App/, which is the default JavaScript folder in Asp.Net MVC. It could have been any other folder, as long as your .cshtml refers to it. That said, we need to change the .cshtml where we want to consume the JavaScript by adding the script tag as we normally would.

<script src="~/Scripts/App/app.js"></script>

Before going any further, let’s talk about the tsconfig.json. The options are very basic. The first one indicates that we want source maps, which allow you to debug the TypeScript directly instead of the JavaScript. The second parameter is the target; it indicates which version of JavaScript (EcmaScript) to output. The third one is where to save the compiled JavaScript files. Files is the input.

From here, you just need to have a TypeScript file called app.ts and write some code. Do as you normally would with C#: go in Visual Studio’s menu under Build and do Build Solution. This will output the JavaScript. You may not see the output file if you do not have “Show All Files” selected in the Solution Explorer.
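As a starting point, any small piece of TypeScript will do; for instance, a hypothetical app.ts like the following compiles to /Scripts/App/app.js with the configuration above.

// src/app.ts -- trivial example just to verify the compilation pipeline
class Greeter {
    constructor(private name: string) { }

    public greet(): string {
        return `Hello, ${this.name}`;
    }
}

const greeter = new Greeter("MVC5");
console.log(greeter.greet());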

At that point, you may find it cumbersome to manually list files if the project is big. This is why you can change the tsconfig.json to compile all TypeScript files of specific folders.

{
  "compilerOptions": {
    "sourceMap": true,
    "target": "es6",
    "outDir": "./Scripts/App"
  },
  "include": [
        "src/**/*"
    ]
}

This will go through all .ts, .tsx, and .d.ts files and generate the corresponding JavaScript.

You may run into the problem where, while writing your TypeScript, Visual Studio 2015 tells you that you are using a version different from the one specified in the tsconfig.json.

This comes with another problem: the TypeScript options in the project properties get disabled with a message saying that two tsconfig.json files exist.

The problem is that Visual Studio keeps TypeScript configuration directly inside the .csproj file. You can open the .csproj with a text editor and search for TypeScript.
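For reference, the kind of entries you may find looks something like the following sketch; the exact property names, values and import path depend on your project and the installed TypeScript version, so treat this only as an illustration of what to search for.

<!-- Typical TypeScript entries generated by Visual Studio inside the .csproj -->
<PropertyGroup>
  <TypeScriptToolsVersion>2.1</TypeScriptToolsVersion>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)' == 'Debug'">
  <TypeScriptTarget>ES6</TypeScriptTarget>
  <TypeScriptSourceMap>true</TypeScriptSourceMap>
  <TypeScriptRemoveComments>false</TypeScriptRemoveComments>
</PropertyGroup>
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\TypeScript\Microsoft.TypeScript.targets" Condition="Exists('$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\TypeScript\Microsoft.TypeScript.targets')" />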

There are two options here. The first one is to remove the tsconfig.json file and configure everything from the Visual Studio project properties; however, you will be limited in terms of options. The second is to remove all TypeScript entries inside the .csproj and keep the tsconfig.json. You may have to restart Visual Studio to have IntelliSense work again.

Using Application Insights with Visual Studio for an Azure Website

Working with production code is not always easy when the time comes to fix an issue. Application Insights is a free service on Microsoft Azure that allows you to do a lot, and one of its features is integration with Visual Studio. In this article, we will see how Application Insights can improve the speed at which you fix your issues.

First of all, if you log into your Azure account in the Cloud Explorer panel and open the solution you deployed, you will see Application Insights in CodeLens.
ApplicationInsightCodeLen

That means that while coding, you may see that some exceptions got raised on your production server. From there, you can click Application Insights in CodeLens and see the number of exceptions as well as two links. The first one is Search. Search allows you to search the exceptions over time and get more information. It’s possible to filter and search by exception type, country, IP, operation name (Asp.Net actions), etc. For example, here is a NullReferenceException thrown when users were accessing the Asp.Net MVC controller “UserContest” from the action “Detail”. We can look at the stack trace and see who called the faulty method.

ApplicationInsightDetail

The second link is called Trend. This one lets you see when the exception was raised, as well as the number of times the exception got thrown and the problem id. You can navigate through time and exceptions and see what may cause the issue. It might be a webjob that runs at specific times, or a period of high traffic.

ApplicationInsightTrend

This is a short article, but it should give you the desire to go explore this free feature. It’s clearly a powerful tool for developers who need to react fast to problems in production, and it removes a lot of barriers between finding the right log and fixing the issue. With an easy tool and natural integration, investigations are faster, which leads to faster resolution of problems.

Continuous Integration (CI) with C# .Net and Travis-CI

I am a big fan of Visual Studio Team Services (VSTS), as well as a developer on that platform. However, if you are doing open source projects, you cannot benefit from VSTS, because everything there is private. Microsoft also uses alternative companies like GitHub to host public projects, and so do I! However, GitHub focuses only on the source repository, not on building, running unit tests, doing coverage, or deploying NuGet packages. This doesn’t matter, since GitHub provides what is called “webhooks”, which allow other services to get notified when new code gets pushed. In this article, we will discuss Travis-ci.org. This free service can get notified, through the GitHub webhook, to start compiling your code. It also lets you run other tasks, like unit tests.

I take for granted that you have a GitHub account. If you do not, you can create one for free. The next step is to go to Travis-ci.org and sign up. This is very easy since you can log in with your GitHub account, so there is no burden of handling multiple accounts. The next step is to select which repositories you want Travis to be notified about by GitHub.

TravisCiAddRepository

Once that is done, you need to add something to your repository which will give instructions to the continuous integration system. This is where we tell it what to build and how to run the unit tests. The file must be placed at the root of your repository with the name “.travis.yml”. There are a lot of options. Here is a sample from one of my C# projects.

language: csharp
solution: AspNetMvcEasyRouting.sln
install:
  - nuget restore AspNetMvcEasyRouting.sln
  - nuget install xunit.runners -Version 1.9.2 -OutputDirectory testrunner
script:
  - xbuild /p:Configuration=Release AspNetMvcEasyRouting.sln
  - mono ./testrunner/xunit.runners.1.9.2/tools/xunit.console.clr4.exe ./AspNetMvcEasyRoutingTest/bin/Release/AspNetMvcEasyRoutingTest.dll

As you can see, it specifies the language and which solution to use. It defines what to do during the installation, which is to restore all NuGet packages and to install the xUnit runner. xUnit is one of the supported unit test frameworks; Visual Studio Test cannot be run because Travis runs on Mono, which might be a show stopper if you have invested heavily in Visual Studio Test. The last section of the file is what to do: the first line, with xbuild, compiles the code, and the next one runs the unit tests. If something goes wrong at one of these steps, you get an email. Otherwise, it’s all fine!

TravisCiSuccessfulBuild

Travis-ci lets you see all logs in real time from the website. It is easy to access and easy to debug. It also lets you have a dynamic image of the state of your build that you can embed in the readme file on GitHub. To do so, click the badge next to the repository and select Markdown.

TravisCiBadge
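The Markdown snippet that Travis gives you looks roughly like the following; the account and repository names here are placeholders for your own.

[![Build Status](https://travis-ci.org/your-account/your-repository.svg?branch=master)](https://travis-ci.org/your-account/your-repository)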

I will soon cover how to generate your NuGet package by pushing to GitHub. That would be one more automated step.

VsCode and TypeScript

Visual Studio Code has a good article about how to configure VsCode and TypeScript, but I wanted to give more detail about how to set everything up in a more straightforward way. This article shows how to use VsCode so that every time you save a TypeScript file, it generates the JavaScript and the source map automatically. It also has the advantage of allowing quick changes while running NodeJs, because NodeJs doesn’t need to be closed and re-started.

.vscode folder

The .vscode folder is located at the root of your project. This folder lets you add configuration files in Json format. One of these files is named “tasks.json” and is used for running tasks. This is where we will create the task runner for TypeScript.

vscodefolder

To do so, hit the “F1” key and type “Configure Task Runner“. This creates the tasks.json file under the .vscode folder for you.

taskrunner

You can put the following code:

{
	// See http://go.microsoft.com/fwlink/?LinkId=733558
	// for the documentation about the tasks.json format
	"version": "0.1.0",
	"command": "tsc",
	"isShellCommand": true,
	"args": ["-p", "."],
	"showOutput": "silent",
	"problemMatcher": "$tsc"
}

VSCode Shortcut

From there, you can use the default hotkey to run the task: Ctrl+Shift+B. However, you can also bind that key to save. This way, every time you save your file, TypeScript will be compiled into JavaScript. To do so, you can change your VsCode keybindings file located in your user profile.

    C:\Users\patrick\AppData\Roaming\Code\User\keybindings.json

You can bind the “ctrl+s” key to the “workbench.action.tasks.build” command.

[
  {
    "key": "ctrl+s",          
    "command": "workbench.action.tasks.build" 
  }
]

TypeScript Configuration

The next thing is to add the TypeScript configuration into a tsconfig.json file. This file is located at the root of your project, at the same level where your package.json (npm) resides. My TypeScript configuration contains a “jsx” entry only because I am using React with TypeScript.

{
    "compilerOptions": {
        "target": "ES6",
        "module": "amd",
        "sourceMap": true,
        "jsx": "react",
        "experimentalDecorators": true,
        "emitDecoratorMetadata": true,
        "declaration": false,
        "noImplicitAny": false,
        "removeComments": true,
        "noLib": false,
        "preserveConstEnums": true,
        "suppressImplicitAnyIndexErrors": true
    }
}

You can have problems compiling your TypeScript if you also have it installed in Program Files. VsCode ships with its own TypeScript installation, so you do not need any other installation on your machine.

Improving Visual Studio and MsBuild Speed for Large Solution

I am working on a side project that is a single solution with 51 projects. That number of projects is considered “big” in 2016, while it was still considered “medium” a few years ago. For some reason, Visual Studio doesn’t handle solutions with more than 50 projects very well. I could refactor the solution by consolidating some projects and having a single project for unit testing instead of 12. Nevertheless, that takes time, and before optimizing the design of the solution, let’s start by understanding what is happening.

First, we need some basic metrics. One useful extension to add to Visual Studio is the Build Monitor extension by Daniel Vinntreus. It adds an additional Output pane with the time each project takes to compile. The second tool is also free; it is called Process Monitor and can be downloaded from the Microsoft TechNet website. This tool lets you see what a process writes to the hard drive (and more). Here is the data from both of these tools.

- 00h 00m 01s 253ms	-- MySqlToMsSql --
 - 00h 00m 03s 898ms	-- ModelContracts --
 - 00h 00m 04s 277ms	-- ComplexTypes --
 - 00h 00m 04s 901ms	-- CrossLayer --
 - 00h 00m 00s 871ms	-- TestHelpers --
 - 00h 00m 01s 139ms	-- ComplexTypesUnitTest --
 - 00h 00m 01s 248ms	-- ValueObjects --
 - 00h 00m 03s 053ms	-- CrossLayerUnitTest --
 - 00h 00m 01s 481ms	-- ComplexTypesWithValueObjects --
 - 00h 00m 03s 586ms	-- ComplexTypesWithValueObjectsUnitTest --
 - 00h 00m 05s 555ms	-- ValueObjectsUnitTest --
 - 00h 00m 05s 747ms	-- Model --
 - 00h 00m 01s 884ms	-- ModelBuilders --
 - 00h 00m 10s 744ms	-- ModelUnitTest --
 - 00h 00m 12s 927ms	-- ViewModel --
 - 00h 00m 12s 527ms	-- ModelBuildersTest --
 - 00h 00m 03s 407ms	-- Mapping --
 - 00h 00m 14s 751ms	-- DataAccess --
 - 00h 00m 03s 980ms	-- ViewModelBuilders --
 - 00h 00m 10s 642ms	-- MappingUnitTest --
 - 00h 00m 10s 708ms	-- ViewModelUnitTest --
 - 00h 00m 11s 235ms	-- Services --
 - 00h 00m 12s 411ms	-- WebServicesCore --
 - 00h 00m 25s 328ms	-- DataAccessMigration --
 - 00h 00m 25s 943ms	-- DataAccessUnitTest --
 - 00h 00m 03s 291ms	-- ScriptCleanExpiredOrders --
 - 00h 00m 16s 434ms	-- ServicesIntegrationTest --
 - 00h 00m 18s 438ms	-- ServicesUnitTest --
 - 00h 00m 20s 570ms	-- IoC --
 - 00h 00m 08s 155ms	-- ScriptDailyPerformance --
 - 00h 00m 05s 538ms	-- ScriptBuyOrders --
 - 00h 00m 08s 966ms	-- ScriptInitializeCache --
 - 00h 00m 08s 522ms	-- ScriptEndContests --
 - 00h 00m 09s 063ms	-- ScriptIndiceMarketPoint --
 - 00h 00m 15s 038ms	-- ScriptBotRegisterContest --
 - 00h 00m 15s 665ms	-- ScriptSymbolChange --
 - 00h 00m 12s 100ms	-- ScriptFindUpcomingSplit --
 - 00h 00m 12s 101ms	-- ScriptFindUpcomingSymbolRename --
 - 00h 00m 12s 011ms	-- ScriptBotBuyContest --
 - 00h 00m 12s 070ms	-- ScriptBuyShortOrders --
 - 00h 00m 14s 895ms	-- ScriptEndOfDayBadges --
 - 00h 00m 21s 934ms	-- FrontEndSharingLayer --
 - 00h 00m 12s 243ms	-- ScriptUpdateCompanyInformation --
 - 00h 00m 23s 498ms	-- ScriptSellShortOrders --
 - 00h 00m 23s 583ms	-- ScriptSellOrders --
 - 00h 00m 34s 100ms	-- ScriptSymbolSplit --
 - 00h 00m 34s 153ms	-- ScriptInterdayStatisticPortefolios --
 - 00h 00m 30s 416ms	-- DataAccessIntegrationTest --
 - 00h 00m 29s 488ms	-- WebSite --
 - 00h 00m 08s 063ms	-- Architectures --
 - 00h 00m 08s 267ms	-- WebSiteUnitTest --
Time Elapsed: 00h 02m 51s 635ms 

To get these statistics, I first cleaned the solution so the build would rebuild everything. The total time is 2 minutes 51 seconds. A lot of that time goes to projects starting with “Script”, which are webjobs that run in the background. These sit under a solution folder in Visual Studio and could be unloaded when not needed; not doing so was wasting a lot of build time while working on the main project: the website. Process Monitor is also educational by showing how many bytes are written when building the solution. To do so, open Process Monitor, click Filter (Ctrl+L) and add the Visual Studio process (Devenv.exe) and MsBuild (msbuild.exe).
ProcessMonitorFilter
Once the filter is done, be sure to clear everything (Ctrl+X) and start building. Once the build is done, go back to Process Monitor, go into Tools > File Summary and sort everything by Folder. You can dive in and see what happened.

WriteFileDuringCompile

This gives us 893 megs written. I am on an SSD drive, so it is still not bad, but honestly quite a lot of writing. From here, I noticed a few things. First, I have a lot of bin folders with the same files. Second, we are rebuilding the same files because projects reference each other. Third, the jobs folder that contains all the scripts is heavy on writing. To improve, I decided to edit all projects to output into the same bin folder.

I decided to have every project output into the bin folder of the website. The reason is that IIS takes the bin folder as its source, so every time I build I can just refresh the browser to get the latest version of the website without deploying. After that, I went into every project’s references, clicked each reference to other projects, and changed Copy Local to false.

CopyLocal
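Under the hood, these two changes end up as MSBuild entries in each .csproj. A hedged sketch of what they could look like follows; the relative path to the website and the referenced project are only examples, not the actual project layout.

<!-- Example only: output path redirected to the website's bin folder -->
<PropertyGroup>
  <OutputPath>..\WebSite\bin\</OutputPath>
</PropertyGroup>
<!-- Copy Local = false on a project reference translates to Private = False -->
<ItemGroup>
  <ProjectReference Include="..\Model\Model.csproj">
    <Private>False</Private>
  </ProjectReference>
</ItemGroup>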

From there, I cleaned up everything (all bin folders emptied) and rebuilt everything to see how the performance improved. First, the Build Monitor extension shows some improvements:


 - 00h 00m 00s 505ms	-- ComplexTypes --
 - 00h 00m 00s 601ms	-- ModelContracts --
 - 00h 00m 00s 352ms	-- ComplexTypesUnitTest --
 - 00h 00m 01s 453ms	-- MySqlToMsSql --
 - 00h 00m 01s 383ms	-- ValueObjects --
 - 00h 00m 03s 002ms	-- CrossLayer --
 - 00h 00m 01s 402ms	-- ComplexTypesWithValueObjects --
 - 00h 00m 01s 488ms	-- CrossLayerUnitTest --
 - 00h 00m 01s 940ms	-- TestHelpers --
 - 00h 00m 02s 924ms	-- Model --
 - 00h 00m 01s 637ms	-- ValueObjectsUnitTest --
 - 00h 00m 01s 806ms	-- ComplexTypesWithValueObjectsUnitTest --
 - 00h 00m 01s 184ms	-- ModelBuilders --
 - 00h 00m 03s 282ms	-- ViewModel --
 - 00h 00m 05s 198ms	-- ModelUnitTest --
 - 00h 00m 02s 993ms	-- Mapping --
 - 00h 00m 05s 746ms	-- ModelBuildersTest --
 - 00h 00m 03s 516ms	-- ViewModelBuilders --
 - 00h 00m 06s 564ms	-- DataAccess --
 - 00h 00m 00s 788ms	-- ViewModelUnitTest --
 - 00h 00m 03s 636ms	-- MappingUnitTest --
 - 00h 00m 03s 634ms	-- Services --
 - 00h 00m 04s 583ms	-- DataAccessMigration --
 - 00h 00m 04s 983ms	-- DataAccessUnitTest --
 - 00h 00m 04s 610ms	-- ServicesIntegrationTest --
 - 00h 00m 06s 222ms	-- IoC --
 - 00h 00m 06s 657ms	-- WebServicesCore --
 - 00h 00m 11s 568ms	-- ServicesUnitTest --
 - 00h 00m 04s 573ms	-- ScriptIndiceMarketPoint --
 - 00h 00m 06s 860ms	-- DataAccessIntegrationTest --
 - 00h 00m 04s 658ms	-- ScriptCleanExpiredOrders --
 - 00h 00m 04s 715ms	-- ScriptEndContests --
 - 00h 00m 05s 408ms	-- ScriptInitializeCache --
 - 00h 00m 05s 637ms	-- ScriptDailyPerformance --
 - 00h 00m 01s 181ms	-- ScriptBuyOrders --
 - 00h 00m 01s 760ms	-- ScriptSymbolChange --
 - 00h 00m 01s 831ms	-- ScriptBotRegisterContest --
 - 00h 00m 01s 770ms	-- ScriptFindUpcomingSplit --
 - 00h 00m 01s 766ms	-- ScriptFindUpcomingSymbolRename --
 - 00h 00m 07s 225ms	-- FrontEndSharingLayer --
 - 00h 00m 09s 518ms	-- ScriptEndOfDayBadges --
 - 00h 00m 09s 531ms	-- ScriptBuyShortOrders --
 - 00h 00m 09s 540ms	-- ScriptBotBuyContest --
 - 00h 00m 08s 867ms	-- ScriptUpdateCompanyInformation --
 - 00h 00m 11s 291ms	-- ScriptSymbolSplit --
 - 00h 00m 11s 304ms	-- ScriptSellShortOrders --
 - 00h 00m 10s 675ms	-- ScriptSellOrders --
 - 00h 00m 02s 717ms	-- ScriptInterdayStatisticPortefolios --
 - 00h 00m 10s 860ms	-- WebSite --
 - 00h 00m 03s 206ms	-- WebSiteUnitTest --
 - 00h 00m 03s 952ms	-- Architectures --
Time Elapsed: 00h 01m 32s 275ms 

The time to build is cut in half, which is already better. If we look at Process Monitor, we can see the reason: we write only 51 megs.

CompileAfterCopyReference

Finally, if I unload all the job (script) projects, I get a build time of 1 minute 13 seconds. Not a huge improvement, but still about 20 seconds less! Going from the initial 2 minutes 51 seconds to 1 minute 13 seconds is quite appreciable. With all these changes, some problems arose. First, when pushing the code to the continuous integration (CI) environment, the build server is not able to build the whole solution, because it builds the startup project, which no longer copies any references locally. The second problem is when you deploy: the Visual Studio Publish mechanism builds the main project too, hence the same consequences. So we need to add additional steps to build everything, and we lose back some of the performance gained.

Another direction is to reduce the number of projects as much as possible. This approach is fine but limited by what you can group together. For example, I have 1 web project and about 14 webjobs, which means a minimum of 15 projects. If we want to separate unit tests from the code, we add 1 more project. If we want to share webjob and website logic, we add one more. Still, that is half the number of projects, and while working on the shared tier and website, it is always possible to unload every webjob project from the main solution. The best way to move everything is to create a shared project, which I called “ApplicationTier”. The website project remains the same but refers to this new project. Inside Visual Studio, we need to go one by one into each project and drag-and-drop all its files into a folder with the same name as the project. The final result is easy to read and consolidates a lot of projects into one with a familiar structure. At the end, the result was very impressive: instead of 2 minutes 51 seconds, the build time was 54 seconds.

 - 00h 00m 00s 775ms	-- MySqlToMsSql --
 - 00h 00m 06s 799ms	-- ApplicationTier --
 - 00h 00m 06s 064ms	-- DataAccessMigration --
 - 00h 00m 03s 400ms	-- ApplicationTierUnitTest --
 - 00h 00m 09s 714ms	-- WebServicesCore --
 - 00h 00m 11s 656ms	-- WebSite --
 - 00h 00m 02s 612ms	-- ScriptCleanExpiredOrders --
 - 00h 00m 02s 312ms	-- ScriptFindUpcomingSymbolRename --
 - 00h 00m 05s 217ms	-- ScriptEndContests --
 - 00h 00m 05s 271ms	-- ScriptBuyShortOrders --
 - 00h 00m 05s 345ms	-- ScriptBuyOrders --
 - 00h 00m 03s 508ms	-- ScriptFindUpcomingSplit --
 - 00h 00m 07s 308ms	-- ScriptDailyPerformance --
 - 00h 00m 08s 350ms	-- ScriptSymbolChange --
 - 00h 00m 08s 447ms	-- ScriptBotRegisterContest --
 - 00h 00m 03s 731ms	-- ScriptEndOfDayBadges --
 - 00h 00m 03s 327ms	-- ScriptSymbolSplit --
 - 00h 00m 04s 775ms	-- ScriptBotBuyContest --
 - 00h 00m 04s 728ms	-- ScriptUpdateCompanyInformation --
 - 00h 00m 02s 835ms	-- ScriptSellOrders --
 - 00h 00m 02s 907ms	-- ScriptSellShortOrders --
 - 00h 00m 01s 714ms	-- ScriptInterdayStatisticPortefolios --
 - 00h 00m 01s 608ms	-- ScriptInitializeCache --
 - 00h 00m 01s 579ms	-- ScriptIndiceMarketPoint --
 - 00h 00m 04s 239ms	-- Architectures --
 - 00h 00m 05s 405ms	-- WebSiteUnitTest --
Time Elapsed: 00h 00m 54s 059ms 

By reducing the number of projects, we have a lot fewer references to deal with. The number of megs written to disk is now about 550. The main bottleneck is all the webjob script projects. Since all the jobs are just entry points to the ApplicationTier, having them share the same bin folder greatly reduces the build time: the first script project to build puts the binary files in the bin folder, and subsequent scripts just build their executable project without rebuilding the references. The result is the following 31 seconds, mainly because only 196 megs get written to disk.

 - 00h 00m 01s 633ms	-- MySqlToMsSql --
 - 00h 00m 06s 483ms	-- ApplicationTier --
 - 00h 00m 06s 696ms	-- DataAccessMigration --
 - 00h 00m 01s 458ms	-- ApplicationTierUnitTest --
 - 00h 00m 08s 610ms	-- WebServicesCore --
 - 00h 00m 09s 298ms	-- WebSite --
 - 00h 00m 01s 044ms	-- ScriptCleanExpiredOrders --
 - 00h 00m 00s 719ms	-- ScriptFindUpcomingSymbolRename --
 - 00h 00m 01s 140ms	-- ScriptFindUpcomingSplit --
 - 00h 00m 02s 132ms	-- ScriptDailyPerformance --
 - 00h 00m 02s 282ms	-- ScriptBuyOrders --
 - 00h 00m 02s 216ms	-- ScriptBotRegisterContest --
 - 00h 00m 02s 477ms	-- ScriptBuyShortOrders --
 - 00h 00m 02s 295ms	-- ScriptSymbolChange --
 - 00h 00m 00s 771ms	-- ScriptEndOfDayBadges --
 - 00h 00m 02s 619ms	-- ScriptEndContests --
 - 00h 00m 00s 811ms	-- ScriptUpdateCompanyInformation --
 - 00h 00m 00s 858ms	-- ScriptBotBuyContest --
 - 00h 00m 00s 764ms	-- ScriptSymbolSplit --
 - 00h 00m 03s 963ms	-- ScriptSellShortOrders --
 - 00h 00m 04s 022ms	-- ScriptSellOrders --
 - 00h 00m 03s 957ms	-- ScriptInitializeCache --
 - 00h 00m 04s 030ms	-- ScriptInterdayStatisticPortefolios --
 - 00h 00m 03s 683ms	-- ScriptIndiceMarketPoint --
 - 00h 00m 04s 649ms	-- WebSiteUnitTest --
 - 00h 00m 01s 533ms	-- Architectures --
Time Elapsed: 00h 00m 31s 471ms 

I quickly tried a RamDisk tool to see if putting the scripts’ bin folder on it would help, and I haven’t seen any improvement. Finally, I am pretty happy with the end result. I can always unload all the scripts, and this can be done easily since they are inside a solution folder: inside Visual Studio, right-click the folder that contains these projects and click “unload project”. This unloads all of them in one operation. For further optimization, we could also unload the migration project and the unit tests, and by doing so get under 20 seconds of total build time.

How to diagnose slow code with Visual Studio

During the development of one feature, I noticed that performance was very slow in some scenarios. It was not obvious at first, because the task was simply to update a user profile. The user profile in question is stored in a single table, so it’s a pretty straightforward task. Before persisting the data, some validations are done, but that is it.

This is where Visual Studio can be very useful with the integrated Diagnostic Tools. The diagnostic tools provide information about events, and on any of them you can go back in time and replay the call stacks, which is pretty useful. They also give some timing information, CPU usage and memory usage. To start diagnosing, simply attach Visual Studio to the process you want to diagnose. After that, open Visual Studio’s diagnostic tools located in the top menu under Debug > Profiler > Performance Explorer > Show Performance Explorer.

Here is an example of the output that I got from my performance problem.

DiagnosticTool

Visual Studio Diagnostic Tools events include Entity Framework SQL statements. This is where I realized that the user’s table was updated, but so were hundreds of rows in what looks to be a table linked to it. Here was the performance bottleneck, the culprit! I never expected to update anything related to that table, just the main user’s table.

The Entity Framework code was like this:

public void Update(ApplicationUser applicationModel)
{
	// If a copy of this user is already tracked locally, detach it so we can attach ours.
	var local = UnitOfWork.Set<ApplicationUser>().Local.FirstOrDefault(f => f.Id == applicationModel.Id);
	if (local != null)
	{
		UnitOfWork.Entry(local).State = EntityState.Detached;
	}
	UnitOfWork.Entry(applicationModel).State = EntityState.Modified;

	// Update the password only if a new hash was provided.
	if (string.IsNullOrEmpty(applicationModel.PasswordHash))
	{
		UnitOfWork.Entry(applicationModel).Property(f => f.PasswordHash).IsModified = false;
	}

	// These properties are never updated from this code path.
	UnitOfWork.Entry(applicationModel).Property(f => f.UserName).IsModified = false;
	UnitOfWork.Entry(applicationModel).Property(f => f.CreationDateTime).IsModified = false;
	UnitOfWork.Entry(applicationModel).Property(f => f.ValidationDateTime).IsModified = false;
	UnitOfWork.Entry(applicationModel).Property(f => f.LastLogin).IsModified = false;
	UnitOfWork.Entry(applicationModel).Property(f => f.SecurityStamp).IsModified = false;
	UnitOfWork.Entry(applicationModel).Property(f => f.Language).IsModified = false;
}

As you can notice, nothing is done directly on the property that holds the collection of “reputation”. The problem is that if the user has 250 objects in that collection, Entity Framework, for a reason I hadn’t identified, issues 250 updates. Since we only want to update the first name, last name and a few other basic properties, we need to make sure to remove those unwanted updates. After some modification to the Entity Framework code, like nulling every collection before updating, the SQL produced was a single statement, hence the performance was back to full speed.
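A minimal sketch of that kind of fix, assuming the navigation property is called Reputations (the real property name in the project may differ), is to clear the navigation collections before marking the entity as modified:

public void Update(ApplicationUser applicationModel)
{
	// Hypothetical fix sketch: drop the navigation collection so Entity Framework
	// only generates an UPDATE for the ApplicationUser row itself.
	applicationModel.Reputations = null; // "Reputations" is an assumed property name

	UnitOfWork.Entry(applicationModel).State = EntityState.Modified;
	// ...same per-property IsModified flags as in the original method...
}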