Thursday, 14 November 2013

What I found in 4 minutes in Visual Studio 2013 in an MVC5 project

I installed Visual Studio this morning and here are a few initial things I have found.

I used the following version:

Firstly, after installation it asked me a couple of questions, including what colour scheme I would like to use for Visual Studio. As you can see, I chose DARK.
I thought this was a nice little welcome surprise, instead of me burning my eyes out for weeks until I realised I could change the colour scheme.

Secondly, and I'm not sure if this is a good thing, I am logged into MSDN through Visual Studio:

I suppose it feels more personalised, but let's see what happens down the track!

Another thing I noticed straight away is a little hint above methods that shows you how many times that method is referenced:

When you click on this it shows you the referenced code:

Another thing I think I like is that the old ASP.NET membership provider has been replaced with claims-based identity. I suppose it's still a custom database supplied by Microsoft but, as I remember from having used claims identity before, it should be more loosely coupled and allow you to write cleaner code without being totally locked inside the membership provider. As far as I remember, I was using dependency injection and the OLD membership provider kept on making me write ugly code.
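To illustrate why the claims model feels less coupled, here is a rough sketch (not from the new template, and the names are mine) of consuming a claim. The calling code just reads key/value pairs off the principal instead of going through the Membership provider's static API:

```csharp
using System.Security.Claims;

// Hypothetical sketch: with claims-based identity, consuming code just
// looks claims up on the principal, with no reference to any
// membership provider.
public static class ClaimsSketch
{
    public static string GetEmail(ClaimsPrincipal user)
    {
        Claim email = user.FindFirst(ClaimTypes.Email);
        return email == null ? null : email.Value;
    }
}
```

Because `ClaimsPrincipal` is just an interface-friendly object, it is trivial to construct one in a unit test.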

They have also added some controller example tests by default. This is great, as I remember when I started writing controller tests I did not have any examples.
This should motivate developers to write more tests from the start, which is good.
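For anyone who hasn't created an MVC5 project yet, the generated examples are roughly of this shape (a sketch using MSTest; the controller and test names here are illustrative, not copied from the template):

```csharp
using System.Web.Mvc;
using Microsoft.VisualStudio.TestTools.UnitTesting;

public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}

[TestClass]
public class HomeControllerTest
{
    [TestMethod]
    public void Index_ReturnsAViewResult()
    {
        // Arrange
        var controller = new HomeController();

        // Act
        var result = controller.Index() as ViewResult;

        // Assert
        Assert.IsNotNull(result);
    }
}
```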

I have always hated the Visual Studio test GUI, but now it looks a little bit better and is more like ReSharper:

I am not sure if this was here before, but there seems to be a new notification window to give you updates on what is new. For example, an update to the NuGet package manager.

Code coverage is also included which is something I would have needed to pay for before:

The final thing I found was when I was in a controller, for example, I could right click and select code map:

And then I can drag classes onto the canvas to see their relationships:

That's my 2 cents worth and that's all for now.


Friday, 20 September 2013

Building .Net 4.5 Projects with Jenkins

I won't write much here but just wanted to say thanks to Rik Leigh for posting this.

It helped me out. My build was spitting out 100s of warnings and now it is fixed.


Thursday, 12 September 2013

Stop ReSharper adding a DLL reference from a random project and make it update from NuGet


We have been having issues where a developer (ME) uses ReSharper to add a reference. The issue is, it doesn't look at NuGet and update your packages.config file; it just adds the nearest DLL it can find (from a neighbouring project).

This is not good.

Anyway, I just found and installed this plugin for ReSharper:

Here is a good post on it:

You just download the zip file:
(make sure you check for the latest version on the site above (1))

Then you just run the batch file that matches your version:

Then when you add a reference:

It updates from nuget:

This is awesome!


Tuesday, 2 July 2013

Continuous Integration with Jenkins and MSTest

I have just set up CI with Jenkins for a client. I will now demonstrate how I did this. I will create a demo MVC project that basically does nothing in the browser. It does however have some unit tests in it that test some very simple business logic using Moq.
Here is my sample MVC4 project and then tests:

Here are the unit tests within my project. You can see that they just test some arbitrary business logic and mock out a repository, but it doesn't matter what they really do, as this post is about making Jenkins run MSTest.
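The tests themselves are only shown in the screenshot, but the shape of a Moq-backed test like the ones described would be something like this (all names are placeholders, not the real project's):

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

public class Vehicle { }

public interface IVehicleRepository
{
    IEnumerable<Vehicle> GetAll();
}

// Some very simple business logic over the repository.
public class VehicleService
{
    private readonly IVehicleRepository _repository;

    public VehicleService(IVehicleRepository repository)
    {
        _repository = repository;
    }

    public int GetVehicleCount()
    {
        return _repository.GetAll().Count();
    }
}

[TestClass]
public class VehicleServiceTests
{
    [TestMethod]
    public void GetVehicleCount_ReturnsNumberOfVehiclesInRepository()
    {
        // Mock out the repository so no database is needed.
        var repository = new Mock<IVehicleRepository>();
        repository.Setup(r => r.GetAll())
                  .Returns(new List<Vehicle> { new Vehicle(), new Vehicle() });

        var service = new VehicleService(repository.Object);

        Assert.AreEqual(2, service.GetVehicleCount());
    }
}
```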
Now I want to get Jenkins to:
  • Poll my SVN server to check for new commits
  • Build the code
  • Run my 2 unit tests
  • Create a unit test report
  • Send the report to email recipients
Ok, so first we want to set up Jenkins.
I needed to install the following first as MSTest was not available:
Visual Studio Agents – ISO:
There are some config options with this but I just used the defaults and it all worked just fine.
Following that I began setting up Jenkins.
Here is my Jenkins project name:

Now we want to set up polling (currently 15 minutes). If an SVN commit is detected Jenkins will begin running a build. 

Then we need to point to our solution file for the build to work:
We then need to point to our unit testing DLL so Jenkins can tell MSTest where our tests are:
Then we need to tell a third-party tool (you don't have to use this tool) to convert our trx file to HTML format so it is easier to read:
After that, we want to ensure we actually publish our trx file:
And then set up email notifications to tell us when certain things occur, like the build failing:
We can then run the build:
After the build output we can now see the MSTest output:
If we now look in the file system we can see the original trx file and the newly generated HTML version:
When we look at the HTML we can see a nice summary of our passing tests:
And if we force the tests to fail we can see this also:

You can also look at the build trends in Jenkins and see what's been happening:

And drill down on these:

That’s all there is to it. I have been meaning to set this up for ages and finally got the opportunity to do it.
PS. I'm waiting on another Jenkins plugin to be installed so I can email the HTML version of the trx report to email recipients.

Tuesday, 18 June 2013

Setting up Sql Database mail for a client

I had a requirement to periodically send a data extract from a SQL database. I had a few ideas about how I could accomplish this, but decided to take a look at SQL Server to see what it had to offer, as I hadn't used it to send anything for the last few years.

Anyway, I came across SQL database mail:

Here is what Microsoft says about it:
Database Mail is an enterprise solution for sending e-mail messages from the SQL Server Database Engine. Using Database Mail, your database applications can send e-mail messages to users. The messages can contain query results, and can also include files from any resource on your network. Database Mail is designed for reliability, scalability, security, and supportability.

So back to the task I needed to complete. I needed to periodically schedule a SQL Server Agent job to query my database and send the results to an email recipient.

Here's what I did.

I needed to send the following data:
SELECT P.Id, FirstName, LastName, 
        Email, AddressLine1, Suburb, PostCode, ST.Name, FutureCommunication, [Uid] 
        FROM Parents P -- the FROM clause was missing here; this table name is a placeholder
        INNER JOIN 
        STATES ST
        ON P.State = ST.Id;

To send it I used the stored procedure msdb.dbo.sp_send_dbmail.

You can find the documentation on this here:

I came up with the following script:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DatabaseMailProfile',
    @recipients = ';',
    @query = @parentEmailQuery,
    @subject = @parentEmailSubject,
    @execute_query_database = 'DATABASE_NAME',
    @query_result_header = 1,
    @query_attachment_filename = @parentCsvFileName,
    @query_result_separator = '|',
    @query_result_no_padding = 1,
    @attach_query_result_as_file = 1;

This is basically a list of options that database mail needs.
Some useful ones I will point out are:

  • @profile_name = 'DatabaseMailProfile' - I will show you what this relates to shortly.
  • @query = @parentEmailQuery - This is the query I showed you above, but I included SET NOCOUNT ON; before it and SET NOCOUNT OFF; after it. This was because "n rows were returned" kept appearing in my final output and I did not want that.
  • @query_result_header = 1 - This shows or hides the column names. When it is set to 1 it also adds an ugly row of dashes after the column headings. This was ok for me, but you may want to investigate writing a custom query to write out the header instead.
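Putting the NOCOUNT tip together with the script above, the variables might be declared like this (the table and column names here are placeholders, as I have removed my real ones):

```sql
DECLARE @parentEmailSubject NVARCHAR(255) = 'Parent data extract';
DECLARE @parentCsvFileName  NVARCHAR(255) = 'ParentExtract.csv';
DECLARE @parentEmailQuery   NVARCHAR(MAX) = N'
    SET NOCOUNT ON;  -- stops "n rows were returned" appearing in the attachment
    SELECT P.Id, FirstName, LastName, Email
    FROM Parents P               -- placeholder table name
    INNER JOIN STATES ST ON P.State = ST.Id;
    SET NOCOUNT OFF;';
```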
For this script to work we now need to talk about setting up SQL database mail.

Firstly we need to set up an account to use to send emails:

The Database Mail node can be found in the SQL Server Management Studio tree:

 Right click on that and choose:

 You need to first set up an account so choose from the following screen:

and then this screen:

and enter the details for your account:

After you have created an account you need to set up a profile. As you will notice in the above-mentioned query, there is a reference to the profile name:
EXEC msdb.dbo.sp_send_dbmail
   @profile_name = 'DatabaseMailProfile',

So here we go. Choose the option to add a profile and then add the account you created to it:

Choose what security options you want:

In the following screen, leave as is or change if you need to:

You will also need to run the following:
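The script itself is missing from this post; my assumption, as it is the standard step at this point, is enabling the Database Mail extended stored procedures on the server:

```sql
-- Enable the Database Mail XPs (requires sysadmin).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Database Mail XPs', 1;
RECONFIGURE;
```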

After that you can run a test if you like:

Now you can run the script above:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DatabaseMailProfile',
    @recipients = ';',
    @query = @parentEmailQuery,
    @subject = @parentEmailSubject,
    @execute_query_database = 'DATABASE_NAME',
    @query_result_header = 1,
    @query_attachment_filename = @parentCsvFileName,
    @query_result_separator = '|',
    @query_result_no_padding = 1,
    @attach_query_result_as_file = 1;

Obviously I have removed my real world values.

This will do the following:
  • Look for the profile we created
  • Run the query to get the data we want to send
  • Set up some email params
  • Send our data as a csv file attachment with headers. The data will be separated with a pipe (|). You could use comma separation if you like.
Here is the email that was sent:

Following on from this, you could set up a SQL Server Agent scheduled job to run periodically to send your data.


Thursday, 13 June 2013

AutoMapper ValueResolver - it's not big and it's not clever but it's handy!!

I had a task where I needed to allow the client to enter a money value in a form field. 

They could enter the amount in several formats, for example "$10,000" with a dollar sign and commas.

I could have just written some code to strip out the special characters but I really wanted to use AutoMapper as it already maps my view models to my domain entities.

Anyway, I wrote this:

What it does is grab an array of special characters to strip from my config. (I suppose I could have made it strip out everything that was not a number, but for the sake of this post I'll leave it as is.)
It then cycles through the array and strips each character out of the entered amount ($10,000).
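The resolver itself was only shown as a screenshot. Here is a sketch of what it might look like using AutoMapper's ValueResolver base class (the config key and class names are my assumptions, not the real ones):

```csharp
using System.Configuration;
using AutoMapper;

// Strips configured special characters ("$", "," etc.) from the entered
// amount before it is mapped onto the domain entity.
public class MoneyValueResolver : ValueResolver<string, decimal>
{
    protected override decimal ResolveCore(string source)
    {
        // e.g. appSettings key "MoneySpecialCharacters" with value "$,"
        char[] specialCharacters =
            ConfigurationManager.AppSettings["MoneySpecialCharacters"].ToCharArray();

        // Strip each configured character from the entered amount.
        foreach (char c in specialCharacters)
        {
            source = source.Replace(c.ToString(), string.Empty);
        }

        return decimal.Parse(source); // "$10,000" becomes 10000
    }
}
```

The mapping setup would then wire it in along these lines: Mapper.CreateMap<ChildViewModel, Child>().ForMember(dest => dest.Amount, opt => opt.ResolveUsing<MoneyValueResolver>().FromMember(src => src.Amount));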

It is set up here:

So now every time I map a ChildViewModel to a Child domain entity I get this character stripping for free.
Here is where I use it:

Hope this helps.


Monday, 10 December 2012

My Journey with Sitecore and MVC - View model builders


Here is a small post on my experience with MVC controllers and view models. This post is not about Sitecore.

At times you will find that, due to the complexity of a view model, your controllers become a bit long-winded. Controllers are called controllers for a reason: they control stuff. In my mind I see them as small traffic controllers. Controllers get some data (parameters or view models), decide what to do with it, and then direct some data (a view model) to a view or redirect.

Some view models may be simple. For example, you might have a view model with:

public class LogOnViewModel
{
    [Display(Name = "CustomerId")]
    public string CustomerId { get; set; }

    [Display(Name = "Password")]
    public string Password { get; set; }

    [Display(Name = "Remember me when I log in again?")]
    public bool RememberMe { get; set; }
}

This is not a very complex view model, so in theory you could just get your controller to build it up. I still recommend using a view model builder, as you never know when you will need more complexity. Later, if you don't use a view model builder, you might get lazy and just dirty up your controller and wish you had a builder. I say: use one from the beginning.

If you have a more complex view model (the following is not that complex but you can see what I'm getting at) then you will start to find the need to put more code in your controller to build up collections for drop-down lists (there may be a few of these in the form you are trying to render in your view), custom string formats or collections for other rendering needs.

public class DashboardViewModel
{
    public LogOnViewModel LogOnModel { get; set; }
    public IEnumerable<VehicleViewModel> VehicleModels { get; set; }
}

Here is an Index action on my DashboardController:

public ActionResult Index()
{
    var dashboardViewModelBuilder = new DashboardViewModelBuilder(VehicleManager);
    var dashboardViewModel = dashboardViewModelBuilder.Build();

    return View(dashboardViewModel);
}

As you can see, I am using my own custom DashboardViewModelBuilder, which abstracts all the work that is required to build up my DashboardViewModel.

This means that my ViewModelBuilder can get as complex as I like whilst my controller stays nice and clean.

Here is my DashboardViewModelBuilder.

public DashboardViewModel Build()
{
    List<VehicleViewModel> vehicleModels = new List<VehicleViewModel>();

    DashboardViewModel dashboardViewModel = new DashboardViewModel { LogOnModel = new LogOnViewModel() };
    var vehicles = VehicleManager.GetVehiclesByCustomerId(1);  // Yes I know, hard coding, ignore this

    foreach (var vehicle in vehicles)
    {
        var vehicleViewModel = Mapper.Map<Vehicle, VehicleViewModel>(vehicle);
        vehicleModels.Add(vehicleViewModel);
    }

    dashboardViewModel.VehicleModels = vehicleModels;

    return dashboardViewModel;
}

As you can see, it gets an IVehicleManager injected into it. I need this VehicleManager to get data from a database for my view.
I also use AutoMapper to map the domain model Vehicle to my VehicleViewModel.
You don't really need to follow all this; I'm just trying to make the point that having all the logic to build up your view model inside a builder is much better than cluttering up your controller.
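For completeness, the constructor injection described would look roughly like this (a sketch; only the names mentioned in this post are real):

```csharp
public class DashboardViewModelBuilder
{
    protected IVehicleManager VehicleManager { get; private set; }

    // The manager is injected so the builder can be unit tested
    // with a mocked IVehicleManager instead of a real database.
    public DashboardViewModelBuilder(IVehicleManager vehicleManager)
    {
        VehicleManager = vehicleManager;
    }

    // public DashboardViewModel Build() { ... as shown above ... }
}
```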

What do you reckon?


Tuesday, 13 November 2012

Upgrading an existing instance (Sitecore.NET 6.5.0 (rev. 111230)) to Sitecore 6.6 (120918 => 121015)


So I have done this a couple of times already but it always failed. After talking to Sitecore, we realized that our existing Sitecore instance had file paths that were too long, so the 6.6 installer was failing because of the 260-character MAX_PATH limit in Windows:

Maximum Path Length Limitation

In the Windows API (with some exceptions discussed in the following paragraphs), the maximum length for a path is MAX_PATH, which is defined as 260 characters. A local path is structured in the following order: drive letter, colon, backslash, name components separated by backslashes, and a terminating null character. For example, the maximum path on drive D is "D:\some 256-character path string<NUL>" where "<NUL>" represents the invisible terminating null character for the current system codepage. (The characters < > are used here for visual clarity and cannot be part of a valid path string.)
So after talking to Sitecore they suggested moving our Sitecore instance to something shorter like:
They also suggested giving full access to IIS_IUSER and the account the app pool runs under to the existing Sitecore instance website folder. Done!
Also note: I don't have DMS installed on the existing Sitecore instance. I will install this after I upgrade.
First thing I am going to do is back up my Sitecore databases: Core, Master and Web. I probably don't need to care about Web, but I did it anyway. Done.
Note: I will check that my current Sitecore instance runs correctly before I upgrade. Done.
Right so now I am ready to run the upgrade installer.
I choose the package to upgrade from 6.5 to 6.6:
Sitecore 6.6.0 rev. 120918_fromv650rev110602.update
Then I click "View package information" and then "Analyse and install the package".
I'll analyse it first to see if there are any obvious issues before installation. It says it is running 5448 actions.
The analysis of the results says:
730 potential problems were found, including 709 collisions and 21 warnings.
I am not sure what I can do with this information as, even when I "filter by message types" there is no obvious information on errors etc.
Next I'll click "Install the package" and see what happens. Note that, as I mentioned, I have done this a few times and it has failed, but hopefully with the new shorter path it will work.
Again, it says it is processing 5448 actions and in the "More information" section I can see it installing Sitecore bits and pieces. (go get a coffee as it looks like it is going to take an hour!)

The installer stopped on 119 actions and the SQL Server process was at 1.5GB, so I rebooted my machine.
I ran the installer again and it seems to be running correctly now but again it failed at the end saying it could not find Lucene.NET.

I checked a fresh version of 6.6 for Lucene and all the version numbers are identical to my version, but the file sizes are different, so I'll copy the version from 6.6 into mine and try upgrading again ... ppphhhhht.

Ok, contacting Sitecore support again as it is failing on:

System.IO.FileLoadException: Could not load file or assembly 'Lucene.Net, Version=, Culture=neutral, PublicKeyToken=null' or one of its dependencies

I'll post more when they reply and I try again.

Ok, so I got some feedback from the guys at Sitecore. We were using a shared source crawler for our indexes. For now I have disabled it by renaming /App_Config/Include/Indexing.config to /App_Config/Include/Indexing.config.disabled.

I ran the installer again and it worked, although I had 2500 warnings; it looks like it was just alerting me that it was overwriting files.
Right, next step: run the upgrade script on the Master, Web and Core databases: CMS660_AfterInstall.sql. This worked. It seems to just create and update the SQL tables ArchivedVersions and ArchivedFields.

After this I went through and did all the config changes required for 120918 and 121015 mentioned in the SDN posts above.

Note: Read the instructions for each config change very carefully, one mistake and you will have a world of pain.

So this all went fine. I did notice that I had a couple of config sections missing when comparing to the Sitecore original files on SDN so I amended where required. For example, step 17 here:

In the section, add three new settings related to the Page Preview feature, after the "PageStateStore" setting:
I was missing PageStateStore so I added it in. 

After that I needed to make my site run with MVC. As John West states, the latest Sitecore 6.6 installer (exe) enables MVC by default:

But as I was doing an upgrade and not using the EXE, I needed to do some stuff with config files. I followed the instructions here:
At point 1.1.3 How to install Sitecore MVC.

Right, all done. 

The next step is to make sure it all works. I followed Sitecore John West's post here:
Scroll down to the little heading: "To confirm MVC is working:" and follow his instructions.

Right, now I can get back to actually writing some code.

More to come on Sitecore with MVC in other posts.


My Journey with Sitecore and MVC : part1

I thought I'd start a post about my journey with a green-field project for a large client.

The project is a web application that existing customers of a large vehicle manufacturer can log on to, to view details of the vehicles they own. This includes service history.
The project will be built in MVC3 and will be driven by an existing client database wrapped by a Java web service, which I will consume.

The site will use content from an existing Sitecore instance that the client already uses to populate their current site.

There is a possibility of other custom data stores being needed as the requirements become clearer. These extra data stores will be in SQL Server and I will use Entity Framework 5 as my ORM.

Here is a list of technology I will be using:

  • ASP.NET MVC3 with a possible mixture of web forms where required.
  • WCF
  • SQL Server 2008
  • Ninject as my DI container
  • Moq for unit testing
  • Glimpse.MVC for debugging
  • Automapper - mapping my domain entities to my MVC view models
  • Glass.Sitecore.Mapper - for mapping my Sitecore content items to my domain model. This also gives me Sitecore context in my domain.
  • Sitecore 6.6
My architecture will be somewhat as follows:
  • MVC UI
  • Viewmodel per view 
  • View model builder abstraction to keep my view model code construction out of my controllers
  • A manager / business layer
  • A caching layer
  • I will use the repository pattern with unit of work where required, although from the spec there is no real immediate need for a unit of work, but I will put it in there anyway.
  • A service layer to consume the external service that is hosted at the client.
  • An ORM layer using EF for any custom data stores
  • Sitecore 6.6 that supports MVC3
  • My DI container will take care of IoC so every layer right up to my controllers will be unit testable (I will test where needed, not just do testing for the sake of testing).
This first post is just a summary. I will post more as I continue.


Tuesday, 21 August 2012

Attach Sql Database

This is just a little reminder to me on how to attach an existing database:

EXEC sp_attach_db @dbname = 'MazdaCleanSitecore_WebTarget',
@filename1 = 'D:\Data\Mazda\MazdaCleanSitecore_WebTarget.mdf', @filename2 = 'D:\Data\Mazda\MazdaCleanSitecore_WebTarget_1.ldf'
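Worth noting: sp_attach_db has been deprecated for a while now, and the recommended equivalent is CREATE DATABASE ... FOR ATTACH:

```sql
CREATE DATABASE MazdaCleanSitecore_WebTarget
    ON (FILENAME = 'D:\Data\Mazda\MazdaCleanSitecore_WebTarget.mdf'),
       (FILENAME = 'D:\Data\Mazda\MazdaCleanSitecore_WebTarget_1.ldf')
    FOR ATTACH;
```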
Fixing an orphaned user.
This used to be a pain to fix, but currently (SQL Server 2000, SP3) there is a stored procedure that does the heavy lifting.

All of these instructions should be done as a database admin, with the restored database selected.
First, make sure that this is the problem. This will list the orphaned users:
EXEC sp_change_users_login 'Report'
If you already have a login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user'
If you want to create a new login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user', 'login', 'password'

Thursday, 15 March 2012

ASP.NET Membership System.Web.Providers. Can't log in to my site on Azure but can locally.

Hi there,

I had an interesting issue last night. I deployed my locally emulated Azure MVC 3 web role and worker role to my Windows Azure instance. It all worked perfectly locally; in particular, the logon page I built allows me to log onto the secure area of my MVC 3 site using System.Web.Providers (ASP.NET Membership).

I found an issue when I tried to log onto my remote Azure site using my deployed logon view.
The logon on the server failed and told me my user name or password was wrong.

Ok, so I then pointed my local instance's ASP.NET configuration (the built-in membership/role tool for setting up members and roles) at my Azure SQL database and tried to manage my members from there, and it worked. 

Even when I changed my local connection string to point to my Azure SQL database and tried logging on, it worked. WEIRD!!! (ie.. local browser, remote Azure database).

So then today I found this blog: 
and then this one:

The part of the post in particular that caught my attention was this:
I tracked it down, thanks to some info in this article by David Hoerster. The problem is that the default password hashing algorithm on Azure is different from the .NET 4.0 defaults. It is set to SHA1 on Azure, and HMACSHA256 is the new standard setting on 4.0.
The rest of the post is here:
This can be fixed by specifying the hash type explicitly in web.config. If you decide to use a method like HMACSHA256, make sure you also specify a machine key - otherwise you will run into similar problems as the autogenerated machine key will differ from server to server.
The configuration element you need to change is under <system.web>:
<machineKey decryptionKey="PUT_DECRYPTION_KEY_HERE" 
validation="HMACSHA256" />
You can use this machine key generator to generate random keys in the proper format.

I will go home tonight and try this and update this post in the morning... time passes... ok, so I went home last night and tried a few things. My password hashing algorithm was different between my local machine and Azure. I just used the Azure machine config setting on my local and it all works fine now. Good times!!!


Thursday, 8 March 2012

A look at ASP.NET MVC 4


I am currently getting up to speed on MVC 4. Thought I'd post Scott Guthrie's video on the subject.

Here is the real link, as I didn't want to embed the video because the width was too large:


Monday, 20 February 2012

My weekend (I'm a geek) of deploying my new Kite Boarding site to Windows Azure with an Azure SQL database, membership and blob storage

Good Morning,

I just thought I'd post to my blog after a long period of being too lazy to do it and also being too busy at work to get around to it.

I spent part of the weekend deploying my latest side project to Azure. I won't tell you what it's all about until I have it all polished and shiny as right now it's a definite work in progress.

What I have deployed:
  • MVC 3, HTML5, Razor website - I'm thinking about moving to MVC 4 as I think the beta just came out.
  • The site runs off Entity Framework 4.3 and I used code-first POCO classes with DbContext.
  • Users and Roles are managed by 
  • My custom data is stored in an Azure SQL relational database 
I posted the following questions on MSDN and have now got the answers I need, so I'll update the questions with answers. As I was an Azure newbie two weeks ago, these questions now seem a bit lame to me!!

  • I have used an Azure SQL relational database for my database. They talk a lot about table storage. I assume I have done the right thing in using a relational database?
    Ok, so this one was easy. Whilst I could probably use table storage to store my data, I am not ready to ditch my relational-database way of thinking. My site will have many tables with relationships, so for now I am playing it safe and using SQL Azure. I have, however, used blob storage to store the gallery images I need for my site. So, and I may be wrong here, if I had a list of people who each had a picture attached to their relational database record, I think I could store the blob URL in a field in their record and, when rendering, iterate the records and retrieve the blob image from my storage account.
  • I have used the to manage users and roles. Once I deployed to Azure using the same database in my web.release.config for my custom data AND the membership tables it all works great. I can register a new user using the standard MVC 3 register user view and I can see this new user in my Users table inside the asp membership table.
    My question here then (and I have Googled this but can't find it) is: how do I manage my users and roles on the Azure server as I would when I use asp.netwebadminfiles/default.aspx in Visual Studio? Do I write my own custom membership management code to do this (I plan to do this anyway down the line), or can I use an instance of asp.netwebadminfiles/default.aspx on the server somewhere?
    I figured this one out too. Although there are ways to use the netwebadminfiles tool to manage users and there is also third party code that can do this for you, ultimately I want to manage these users myself. I will build all my user and role code into the site.
  • When I deployed for the first time, it created my web role on Azure AND the SQL database mentioned above. Let's say I have the site live for a few months and want to redeploy; will my membership and custom data get wiped out? What if I deploy schema changes to my database, will these be applied without affecting my data? I know I can use a one-way data sync to get the data onto my local machine, but I'm sure I shouldn't have to worry about this.
    I'm not 100% sure on this one, but I now know that, firstly, I don't have to redeploy / publish the whole site, storage and database every time I make a change. I can use Visual Studio WebDeploy to push new changes to my site. If I needed to sync data from my local machine I think I could use the Data Sync in the Azure portal, and I assume I could manage my users via the SQL membership stored procs until I get all my membership screens written.

Tuesday, 6 December 2011

Saving the internet


Quote from:

"As concerned global citizens, we call on you to stand for a free and open Internet and vote against both the Protect IP Act and the Stop Online Piracy Act. The Internet is a crucial tool for people around the world to exchange ideas and work collectively to build the world we all want. We urge you to show true global leadership and do all you can to protect this basic pillar of our democracies worldwide."

Please sign up to save the internet. It is important for all of us to continue to have an open and uncensored internet. Imagine if the police monitored all your phone calls; you wouldn't like that, would you?


Monday, 7 November 2011

Explanation of Ninject Bindings and where we used them in our MVC Website (Part 1 - MVC Website)


I'm going to summarize what we have done with our Ninject bindings and injection, which allows us to follow the "Inversion of Control (IoC container)" principle in a project we are working on. 

As mentioned we are using Ninject as our IOC container:

The project architecture looks much like the following diagram:

Architecture Diagram

I will work my way from the top to the bottom of the architecture, starting with the MVC web site all the way through to the Microsoft CRM layer. (This post, Part 1 - MVC Website, just refers to the web site.)

One of the two BIG reasons we are using an IoC container (Ninject) is to allow us to separate all our layers (separation of concerns) so that they are unit testable, and also so that we can swap components out at a later date. We may want to replace our CRM layer with a SQL Server layer; doubtful, but we are catering for this anyway, as you never know what will be a requirement in the future. As long as our SQL Server repository layer implements the same interface as our existing CRM repository layer, we can more or less swap out the CRM repository with a new SQL repository.

Mvc Website:
On the website layer we will be using Ninject to inject the following:
  • A GatewayAgent layer into our controller - the GatewayAgent layer separates the concerns of the controller from the concerns of our WCF SOA layer. 
  • An AutoMapper implementation into our GatewayAgent - the mapper will map our presentation entities to our domain entities.
  • Our presentation entity validation classes into our presentation entities - as we are using Fluent validation, we may want to ditch this later and inject a different type of validation.
The following diagram shows part of the web site's global.asax file. This code uses Ninject to inject the above 3 items.

The Ninject Binding in the Global.asax class
Obviously, the web site has a reference to the Ninject.dll.

Most of this is set in stone for our site, but one thing to note: as more development is done we will need to add more validators as we add more presentation entities to our site. So when we add an Invoice presentation entity to the site, a related InvoiceValidator class will need to be added to the ConfigureKernel method, shown above, in our global.asax file.
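As the ConfigureKernel code is only shown in the image above, here is roughly what such a method might look like (the gateway and validator bindings are the ones described in this post; the mapper binding names are my assumptions):

```csharp
private static void ConfigureKernel(IKernel kernel)
{
    // GatewayAgent layer into the controllers.
    kernel.Bind<IContactGateway>().To<ContactGateway>();

    // AutoMapper implementation into the GatewayAgent
    // (IEntityMapper / AutoMapperEntityMapper are assumed names).
    kernel.Bind<IEntityMapper>().To<AutoMapperEntityMapper>();

    // Fluent validators into the presentation entities.
    kernel.Bind<IValidator<Contact>>().To<ContactValidator>();

    // As new presentation entities arrive, their validators go here too,
    // e.g. kernel.Bind<IValidator<Invoice>>().To<InvoiceValidator>();
}
```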

Note: the line:
- kernel.Bind<IValidator<Contact>>().To<ContactValidator>(); 
means, when we call validate on a Contact presentation entity, 
we would like to use the ContactValidator class to carry out the validation

Sample Contact Validator
You can see in the ConfigureKernel method above that we also inject a ContactGateway.
The following line shows this:
- kernel.Bind<IContactGateway>().To<ContactGateway>();

This means that when we refer to IContactGateway we would like to use the concrete class ContactGateway.
So in our controller we see the following code:
Sample controller

Why is this good? Because you can then mock up a ContactGateway in a unit test to test your controller.

A unit test for the controller
As mentioned above, we also inject a mapper implementation into our web site:
This is used to map our domain entities to our presentation entities on the way up, and our presentation entities to our domain entities on the way down.

Why is this good? We may want to use a mapping tool other than AutoMapper in the future. Injection will enable us to swap AutoMapper out for a different mapping tool.

This is the end of Part 1 - Explanation of Ninject Bindings and where we used them in our MVC Website

I will do another couple of posts on the other layers mentioned in the architecture diagram at the start of this post.