How to copy files by name pattern and date under Windows with Robocopy

This is not a post about Windows 10, but a post about how to copy a lot of files inside a folder that contains sub-folders. This can be done with Windows Explorer or with the command-line copy/xcopy tools, but selecting a specific extension to copy, a specific file prefix, and a maximum date is more challenging.

This is where Robocopy shines! This tool is available if you have Windows Vista, Windows 7, Windows 8 or Windows 10. It's free, and it's the next generation of xcopy, with more features. You can find some examples directly on the Microsoft TechNet website.

Robocopy

The case that concerned me was that I wanted to copy, to a backup drive, all .jpg files that start with an underscore and all .NEF files. I also wanted only pictures taken after a specific date.

ROBOCOPY C:\source F:\destination _*.jpg *.NEF /E /MAXAGE:20150806

The first string is the name of the command. You should have this tool available if you open a Windows command prompt. To verify, just type Robocopy /?, which displays the command-line help if Robocopy is installed. The first parameter is the source folder, the second the destination. Both of these parameters can only be folders. If you need to copy one specific file, you will need to use options. Options come after the source and destination folders, in any order. In the example, I specify that I want all files with the jpg extension that start with an underscore; the star is a wildcard for any characters. I also want every .NEF file, so I specify *.NEF with no prefix. The /E switch copies subdirectories (including empty ones). This takes more time but is required if you have folders inside folders. The last option is /MAXAGE, which takes either a number of days or a date in YYYYMMDD (year month day) format.

The last step is to hit Enter, and you will see the files being copied. If by misfortune your destination folder does not have enough space, Robocopy will pause and retry every 30 seconds. This lets you free up space on the destination drive without having to restart the whole process. As you can see, this is a simple, free and powerful tool, already installed on your Windows machine, that can help you copy exactly the files you want.

Improving Asp.Net MVC Page Rendering

The Asp.Net MVC view engine is not very efficient. Usually the database is the bottleneck, but with Asp.Net MVC the view engine can be the part that slows down your system. This is especially true if you are using a lot of partial views, templates and service code, which is a little bit ironic because good patterns suggest dividing your views into small reusable pieces. Nevertheless, we can improve the situation by reducing some of the tax that the rendering pipeline imposes.

First of all, building in release instead of debug will help, but it will not be a game changer. This stunned me for several reasons: mostly because if you search on the Internet this is presented as the big optimization to do, but also because compiling in release does not optimize the rendering as much as you would expect. Here are three screenshots taken with Glimpse. All benchmark results are from the same page that we will optimize during this article. The first one is the web page in debug, without caching any database calls.
EntityFrameworkConnectionTime

The second benchmark has the web application not reaching the database at all; it uses Redis to cache all the data queries.
RedisPerformance

The Redis cache is powerful and gives us an advantage of 10% in the scenario under test. The performance gain can be even bigger with a database holding more values, which is why it is worth it. However, you can still see that it takes 3 seconds to load the page, which is not acceptable.

ReleaseCompilationAndCache

The next image is with the cache enabled and the application compiled in release mode (with the web.config compilation debug attribute set to false). The gain in performance is interesting, with a result of 2.27 seconds, 25% faster. The result is appreciable but still does not make sense. This is unfortunate, because if you search on the Internet this is about all that is suggested to improve the performance of your application. You can also cache the generated output, but that is only good if it is acceptable to serve stale values, which is not really an option in many scenarios. I have never been a big fan of caching the output because it is harder to invalidate. Unlike caching database results or computed results with Redis on the back end, which the back end can invalidate and re-set, the output cache requires an HTTP call to refresh. In any case, the real problem is the rendering of the view.

With the help of Glimpse, I was able to see that the Razor view engine was producing 108 views, all cached. Instead of keeping the system so finely separated (at the view level), the next step of optimization is to build the Html directly in the .cshtml. It is also interesting to see that when we build in debug, the number of calls to the view engines doubles.

After removing most of the display templates, the performance increased. The time dropped from 2.27 seconds to 551ms (debug) / 509ms (release). Most of the remaining time is still spent building Html in the Razor engine. Instead of 108 views, only 8 were used, which is clearly the consequence of reducing the use of display templates. Unfortunately, the performance gain comes at the cost of code that is less reusable: in this new benchmark, the Html of the templates is copy-pasted in some areas of the .cshtml. The next step is to move this reusable code into Html helpers (a sketch follows the screenshots below). Also, even with better performance, I am still far from what I can do with PHP, and this is something that still bothers me. I am benchmarking an application that is a rewrite of a PHP application. The back-end performance (logic + database) is about the same, but the rendering is a lot slower. Of the 509ms, 358ms is spent creating the view. In the PHP application, less than 50ms was spent creating the Html, so Razor is still about 6 times slower.

Build in Debug with reduction of Display Template:
RemoveDisplayForTemplate

Build in Release with reduction of Display Template:
ReleaseWithoutDisplayFor
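
As a rough idea of that Html helper step, here is a minimal sketch of what one of the display templates could become. The names (HtmlHelperExtensions, PictureSummary) and the markup are hypothetical; the point is that the fragment is built once in C# and reused without paying a Razor template lookup for every item.

// Requires System.Web.Mvc. The helper and the markup below are illustrative only.
public static class HtmlHelperExtensions
{
    public static MvcHtmlString PictureSummary(this HtmlHelper helper, string title, string thumbnailUrl)
    {
        // Build the same markup a display template would render, without a view lookup per item.
        var image = new TagBuilder("img");
        image.MergeAttribute("src", thumbnailUrl);
        image.MergeAttribute("alt", title);

        var caption = new TagBuilder("span");
        caption.SetInnerText(title);

        var container = new TagBuilder("div");
        container.AddCssClass("picture-summary");
        container.InnerHtml = image.ToString(TagRenderMode.SelfClosing) + caption.ToString();

        return MvcHtmlString.Create(container.ToString());
    }
}

In the view, @Html.PictureSummary(item.Title, item.ThumbnailUrl) then replaces the @Html.DisplayFor call for that fragment.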

.Net is slowly moving away from rendering Html on the server and toward having the browser render it with JavaScript frameworks. Microsoft embraces WebApi and advocates the use of client-side frameworks like Angular. This is a decision that I respect, but I am still convinced that, since the inception of Razor in 2010, Microsoft could have improved server-side rendering with Razor.

Redis Experimentation with Full List Cache against using a Redis Sorted Set

I am improving the performance of a system right now with Redis and the StackExchange.Redis library. One particular case is that I needed to cache a list of data ordered by rank, where the rank comes from a value that changes often. One requirement is that two items can have the same rank. For example:

Rank - Data   - Value
1    - User 1 - 100
2    - User 2 - 99
3    - User 4 - 99
4    - User 5 - 97

The data is in fact a serialized object that contains multiple classes. To keep this article light, I will just use a string, but keep in mind that this is not a simple string identifier. The value column is what Redis requires when using a sorted set; in reality, this value lives inside the data, in a property.

This information must also be paged, because the list can grow to around five thousand entries.

The first step was to measure the time when using the database. I got an average of 264ms per query for 20 items on a set of 200 items. The database contains thousands of entries; the page uses a clause to filter down depending on other criteria defined inside the data (inside the class that we serialize). The next step was to use Redis as a simple cache: once we get the result from the database, we store it for a short time. The first hit has the same average because it goes to the database, but subsequent requests go to Redis instead. This produced an improvement of 50%, with 125ms on average. The key was built from the type of list, the filter attribute and the page number, for example "MyListOfObjectXXX_PartitionYYY_Page_1". The speed was interesting for me; I was aiming for around 100ms, but I was satisfied with the result. The time also includes deserializing the objects into a generic list of all 20 results. I count the deserialization time in my benchmark because I was also counting the ORM time to instantiate the objects.

My concern with that solution is that every object can change its value at any time, and the value changes the rank by consequence. Since I am also caching the data with a separate key for each instance, I duplicate this information in the cache. The size of the cache can be a problem in the long run, but the bigger problem is that the information becomes desynchronized. In fact, the source of truth in the system is the individually cached version, with keys that look like "MyData_Key_1". I set an expiry on the list cache because it is not the real source of data: instead of invalidating it like the rest of the software does when an entity's values change, I let it expire and then rebuild it. It means that a user who drills down from the list can get data that is more up to date than the list itself. This is the cost to pay (so far): a one minute delay.

db.StringSet("MyListOfObjectFoo_PartitionRed_Page_1", myListOfDataForPage1, TimeSpan.FromMinutes(1));
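
To make the cache-aside flow concrete, here is a minimal sketch of the read path described above. The UserRankEntry class, the repository call and the GetPage method are hypothetical; db is the StackExchange.Redis IDatabase used in the snippets of this article.

// Sketch of the cache-aside read for one page of the ranked list (names are illustrative).
public List<UserRankEntry> GetPage(string partition, int pageNumber)
{
    var key = string.Format("MyListOfObjectFoo_Partition{0}_Page_{1}", partition, pageNumber);

    var cached = db.StringGet(key);
    if (cached.HasValue)
    {
        // Cache hit: deserialize the whole page in one shot.
        return JsonConvert.DeserializeObject<List<UserRankEntry>>(cached);
    }

    // Cache miss: load from the database, then store the serialized page for one minute.
    var page = repository.GetPageOrderedByValue(partition, pageNumber); // hypothetical database call
    db.StringSet(key, JsonConvert.SerializeObject(page), TimeSpan.FromMinutes(1));
    return page;
}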

To overcome this issue, Redis offers the sorted set, a collection that is ordered by a score. What is interesting is that two members can have the same score, which produces the same rank. So far, this is exactly the answer to the problem. However, that solution does not fix the problem of having to duplicate the data in the cache. The sorted set can be queried by range, which is interesting for paging, but not by unique key. Thus, it only solves the problem of desynchronized values, since I can easily push an entry into the sorted set with an updated score.

// Initial push (the whole list goes under a single key; paging is done by range)
db.SortedSetAdd("MyListOfObjectFoo", new[] {
                    new SortedSetEntry("User 1", 100),
                    new SortedSetEntry("User 2", 99),
                    new SortedSetEntry("User 3", 99)});

// Later, when one entity changes to a value of 100. This produces two members at rank 1.
db.SortedSetAdd("MyListOfObjectFoo", serializedObjectToCache, 100); // the member is the serialized entity

This was surprising in many ways. First, the main problem is that if several members have the same score, it is not possible to provide a second ordering value from the object. You are stuck with the score you set, which is a double. This allows some mathematical tricks, but if you would like to sort ties alphabetically by a property, you need to do that second sort manually in C#. I did not dig deeper into that solution because of the second, bigger problem: performance. To get the information, you use the get-by-range method.

db.SortedSetRangeByRank("MyListOfObjectFoo", 0, 19); // first page of 20 entries (ranks are zero-based)

From that, you need to loop and deserialize all values, which is the same tax we pay when caching the whole page in one Redis key-value entry. However, the performance was disastrous: my average over three runs was 1900ms. This was really surprising, and I double-checked everything because it did not make any sense to me. My initial hypothesis was that this scenario would be highly optimized; I was wrong. However, the fault is not with Redis. After some investigation, I found that the serialization, done with the Json.Net library, has a much harder time deserializing a very complex object 20 times than deserializing one list of 20 objects. This is mostly because, when serializing a list, an object that has already been seen is not serialized again; a reference system is used instead. For example, instead of repeating a deep object, Json.Net writes "$ref": "20". This has a huge impact on performance.
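
Here is a small sketch that shows the reference mechanism, using the same PreserveReferencesHandling setting that appears later in this article. The Owner and Entry classes are made up for the illustration: when the same instance appears twice in the list, the second occurrence is written as a "$ref" pointer instead of a full copy, which is what makes one list payload cheaper than 20 independent entries.

// Illustrative classes: two list entries share the same Owner instance.
public class Owner { public string Name { get; set; } }
public class Entry { public int Rank { get; set; } public Owner Owner { get; set; } }

// Somewhere in a method:
var sharedOwner = new Owner { Name = "User 1" };
var list = new List<Entry>
{
    new Entry { Rank = 1, Owner = sharedOwner },
    new Entry { Rank = 2, Owner = sharedOwner }
};

// Same setting as the cache code shown later in this article.
var json = JsonConvert.SerializeObject(list, new JsonSerializerSettings
{
    PreserveReferencesHandling = PreserveReferencesHandling.Objects
});
// The first Owner is written with "$id"; the second occurrence is only a {"$ref": "..."} pointer.
// Caching each entry under its own key loses this sharing, so every value is fully expanded.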

I finally decided to optimize my model classes and use lighter classes for this page. Instead of a list of objects with a lot of rich sub-objects, a simple list of a basic class with plain properties did an awesome job. The list that took 1900ms to get from Redis and deserialize now takes less than .17 ms. That is right, and not a typo: it is less than a single millisecond.
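
A minimal sketch of that lighter shape (the class and property names are hypothetical): a flat class holding only what the page displays, so Json.Net has no object graph to walk.

// Flat projection cached for the ranked list page; no nested rich objects to deserialize.
public class RankedListItem
{
    public int Rank { get; set; }
    public string DisplayName { get; set; }
    public double Value { get; set; }
}
// The page is cached as a List<RankedListItem> instead of a list of full domain objects.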

I am still learning how to get the most out of Redis, and so far I like the flexibility it offers compared to Memcached, which I used for more than a decade. I will keep you informed of any new optimization I find. In the short term, I think a solution may be to cache not the whole complex object but just a part of it, in an aggregate view of objects.

How to Extend Glimpse for Redis

Glimpse is the best real-time profiler/diagnostic add-on you can have for your Asp.Net MVC solution. I will not describe all its capabilities in this article but, in one sentence, Glimpse shows, for your Asp.Net MVC project, the time taken by each call: filters, actions, database calls, etc. Unfortunately, no extension exists for Redis. Nevertheless, creating a custom extension for the timeline is not too hard. However, the documentation is very dry, and it is not obvious what you can and cannot extend. This is really sad, and the extensibility model of Glimpse is pretty limited; for example, you cannot extend the HUD.

The objective of the Glimpse extension we are building in this article is to add to Glimpse's timeline, for every cache call, the start time, end time, duration, and the method name and key used. Here is the end result:
GlimpseExtension

The first thing to note is that the extension will not be specific to Redis but will work for any cache system. In my project I have an abstract Cache.cs class, and my Redis implementation inherits from it. That class contains a lot of methods like Set, Get, Delete, etc. Here is the Set method.

public void Set<T>(string key, T objectToCache, TimeSpan? expiry = null)
{
    if (string.IsNullOrEmpty(key))
    {
        throw new ArgumentNullException("key");
    }
    if (this.isCacheEnable)
    {
        var serializedObjectToCache = Serialization.Serialize(objectToCache);
        if (!this.ExecuteUnderCircuitBreaker(()=>this.SetStringProtected(key, serializedObjectToCache, expiry),key))
        {
            Log.Error(string.Format("Cannot Set {0}", key));
        }
    }
}

As you can see, the method serializes the object to cache and calls the SetStringProtected method. One particularity is that the call is wrapped in a function called ExecuteUnderCircuitBreaker, which implements the circuit breaker design pattern. Whatever the pattern, every call to the cache goes through this function. If we strip out the circuit breaker code, this is where we can add the entry point for the Glimpse extension.

protected bool ExecuteUnderCircuitBreaker(Action action, string key, [CallerMemberName]string callerMemberName = "")
{
   using (var glimpse = new GlimpseCache(key, callerMemberName))
   {
      //Code removed here about circuit breaker (it decides the boolean actually returned)
      action();
      return true;
   }
}

The important part for the moment is that every call to the cache is proxied by this method, which executes the Redis action between the creation and the disposal of a GlimpseCache object. The GlimpseCache class starts a timer when it is constructed and stops the timer when it is disposed.

public class GlimpseCache:IDisposable
{
    private readonly GlimpseCacheCommandTracer tracer;
    public GlimpseCache(string key, string commandName)
    {
        this.tracer = new GlimpseCacheCommandTracer();
        tracer.CommandStart(commandName, key);
    }

    public void Dispose()
    {
        if (tracer != null)
        {
            tracer.CommandFinish(); 
        }
    }
}

The core code is in the GlimpseCacheCommandTracer. The tracer uses IMessageBroker and IExecutionTimer to read Glimpse's configuration from the configuration file (web.config), including whether Glimpse is active or not. It also gives you a hook to start and stop the timer, which allows us to get into the timeline by publishing an event. This class also configures how to display the information: you can define the label, the color and the highlight color.

public class GlimpseCacheCommandTracer 
{
    private IMessageBroker messageBroker;
    private IExecutionTimer timerStrategy;

    private IMessageBroker MessageBroker
    {
        get { return messageBroker ?? (messageBroker = GlimpseConfiguration.GetConfiguredMessageBroker()); }
        set { messageBroker = value; }
    }

    private IExecutionTimer TimerStrategy
    {
        get { return timerStrategy ?? (timerStrategy = GlimpseConfiguration.GetConfiguredTimerStrategy()()); }
        set { timerStrategy = value; }
    }
        
    private const string LABEL = "Cache";
    private const string COLOR = "#555";
    private const string COLOR_HIGHLIGHT = "#55ff55";
        
    private string command;
    private string key;
    private TimeSpan start;

    public void CommandStart(string command, string key)
    {
        if (TimerStrategy == null)
            return;
        this.start = TimerStrategy.Start();
        this.command = command;
        this.key = key;
    }


    public void CommandFinish()
    {
        if (TimerStrategy == null || MessageBroker == null)
            return;

        var timerResult = TimerStrategy.Stop(start);

        var message = new CacheTimelineMessage(this.command, this.key)
                .AsTimelineMessage(command + ": " + key, new TimelineCategoryItem(LABEL, COLOR, COLOR_HIGHLIGHT))
                .AsTimedMessage(timerResult);

        MessageBroker.Publish(message);
    }
}

The CommandFinish method, called by Dispose, stops the timer for this event and builds the message that is added to the timeline. In this example, we display the command and the key. The third and last class you need is the CacheTimelineMessage. This class inherits from Glimpse's MessageBase and implements ITimelineMessage; it is what is used to display information in the timeline.

public class CacheTimelineMessage : MessageBase, ITimelineMessage
{
    public string Command { get; set; }
    public string Key { get; set; }

    #region From Interface
    public TimelineCategoryItem EventCategory { get; set; }
    public string EventName { get; set; }
    public string EventSubText { get; set; }
    public TimeSpan Duration { get; set; }
    public TimeSpan Offset { get; set; }
    public DateTime StartTime { get; set; }
    #endregion

    public CacheTimelineMessage(string command, string key)
    {
        this.Command = command;
        this.Key = key;
    }
}

I am pretty sure we could do something better, and maybe even show more information, but I am satisfied with the insight I now get from adding these few lines of code to Glimpse.

How to Have Json.Net Deserialize Using Private Constructor and Private Setter

You may have a private constructor for your class. This is often useful if you want developers to go through a specific public constructor while still letting your ORM create the entity. The problem is that, when deserializing, Json.Net will use the public constructor and pass null for every parameter. To have the deserializer use the private constructor, like the ORM does, you need to tell Json.Net that it is allowed to use a private constructor.

new JsonSerializerSettings
{
    ConstructorHandling = ConstructorHandling.AllowNonPublicDefaultConstructor
}

The JsonSerializerSettings property required is ConstructorHandling, which needs to be set to AllowNonPublicDefaultConstructor.

Also, you may have a public getter but a private setter, which is often the case when the property value must be set by private logic. However, when deserializing, these values will not be set. To allow Json.Net to use the private setter, you must provide a custom contract resolver.

Json.Net supports custom contract resolvers. The trick is to tell Json.Net that, if a property has a private setter, it should use it.

new JsonSerializerSettings
{
    ContractResolver = new PrivateResolver()
}
//.........
public class PrivateResolver : DefaultContractResolver
{
    protected override JsonProperty CreateProperty(MemberInfo member
                                                 , MemberSerialization memberSerialization)
    {
        var prop = base.CreateProperty(member, memberSerialization);

        if (!prop.Writable)
        {
            var property = member as PropertyInfo;
            if (property != null)
            {
                var hasPrivateSetter = property.GetSetMethod(true) != null;
                prop.Writable = hasPrivateSetter;
            }
        }

        return prop;
    }
}

This is it.
DefaultContract
The CreateProperty method returns a JsonProperty, which holds information about the serialization settings. As you can see in the image above, it is possible to see whether we have a getter and a setter through the Readable and Writable properties of the JsonProperty. The only task is to set Writable to true if it is not already writable and if the property has a setter. This is done with reflection: we cast the MemberInfo to a PropertyInfo and use GetSetMethod to check whether a (possibly private) setter exists.
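
Putting the two settings together, here is a small sketch (the Player class is made up for the example) that round-trips an object exposing only a private default constructor and a private setter.

public class Player
{
    public string Name { get; private set; }

    private Player() { }               // used by the ORM and, with the settings below, by Json.Net

    public Player(string name)
    {
        this.Name = name;
    }
}

// Somewhere in a method:
var settings = new JsonSerializerSettings
{
    ConstructorHandling = ConstructorHandling.AllowNonPublicDefaultConstructor,
    ContractResolver = new PrivateResolver()
};

var json = JsonConvert.SerializeObject(new Player("Patrick"));
var copy = JsonConvert.DeserializeObject<Player>(json, settings);
// copy.Name is "Patrick": the private constructor was called and the private setter was used.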

Using Redis in Asp.Net in an Enterprise System

I wrote about how to integrate Redis into Asp.Net MVC a few days ago. Here is a way to integrate Redis into your solution with dependency injection while abstracting Redis away. This additional layer will be helpful if, in the future, we switch from Redis to Memcached or to any other caching technology.

The first step is to create the interface that will be used.

public interface ICache
{
    void SetString(string key, string objectToCache, TimeSpan? expiry = null);
    void Set<T>(string key, T objectToCache, TimeSpan? expiry = null) where T : class;
    string GetString(string key);
    T Get<T>(string key) where T : class;
    void Delete(string key);
    void FlushAll();
}

This interface exposes the primary operations that can be executed against Redis (or any other cache system). It is possible to enhance this interface with more methods, but these are the basic operations required to run a cache. The first two methods set a value inside the cache: one takes a simple string, the second takes a class of type T and is mostly used to serialize an object. The next two methods retrieve, from a key, the raw string or the deserialized object. The last two methods delete entries: one uses a key to delete a specific object, the other deletes everything from the cache.

A second interface is used. This one lets us query some status: whether the cache is enabled and whether the cache is running properly.

public interface ICacheStatus
{
    bool IsCacheEnabled { get;}
    bool IsCacheRunning { get;}
}

The difference between IsCacheEnabled and IsCacheRunning is that the first one is controlled by us. Normally, you should have a key in the web.config to turn the cache on and off; if you notice a problem with the cache, it is always a good option to be able to turn it off. The second property is about the status of the caching server, Redis. If the server becomes unavailable, it is interesting to surface that status, for example in an administration panel.
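
As an illustration of that administration scenario, here is a minimal sketch of an action that surfaces both flags. The AdministrationController is hypothetical; it only relies on the ICacheStatus interface defined above and on the dependency injection shown later in this article.

// Hypothetical admin endpoint exposing the cache status (requires System.Web.Mvc).
public class AdministrationController : Controller
{
    private readonly ICacheStatus cacheStatus;

    public AdministrationController(ICacheStatus cacheStatus)
    {
        this.cacheStatus = cacheStatus;
    }

    public ActionResult CacheStatus()
    {
        return Json(new
        {
            Enabled = this.cacheStatus.IsCacheEnabled, // the on/off switch from the web.config
            Running = this.cacheStatus.IsCacheRunning  // the state of the Redis connection
        }, JsonRequestBehavior.AllowGet);
    }
}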

Besides these interfaces, we need an abstract class with the logic shared by any cache system (not only Redis). This is where we handle serialization, error logging and the on/off mechanism. This is also where the circuit breaker pattern could be used; I will discuss it in a future article, so keep in mind for the moment that the class below does not include it.

public abstract class Cache : ICache, ICacheStatus
{
    private readonly bool isCacheEnable;

    public Cache(bool isCacheEnable)
    {
        this.isCacheEnable = isCacheEnable;
    }

    public void Set<T>(string key, T objectToCache, TimeSpan? expiry = null) where T : class
    {
        if (string.IsNullOrEmpty(key))
        {
            throw new ArgumentNullException("key");
        }
        if (this.isCacheEnable)
        {
            try
            {
                var serializedObjectToCache = JsonConvert.SerializeObject(objectToCache
                     , Formatting.Indented
                     , new JsonSerializerSettings
                     {
                         ReferenceLoopHandling = ReferenceLoopHandling.Serialize,
                         PreserveReferencesHandling = PreserveReferencesHandling.Objects,
                         TypeNameHandling = TypeNameHandling.All
                     });

                this.SetStringProtected(key, serializedObjectToCache, expiry);
            }
            catch (Exception e)
            {
                Log.Error(string.Format("Cannot Set {0}", key), e);
            }
        }
    }

    public T Get<T>(string key) where T : class
    {
        if (string.IsNullOrEmpty(key))
        {
            throw new ArgumentNullException("key");
        }
        if (this.isCacheEnable)
        {
            try{
                var stringObject = this.GetStringProtected(key);
                if(stringObject  ==  null)
                {
                     return default(T);
                }
                else
                {
                     var obj = JsonConvert.DeserializeObject<T>(stringObject
                         , new JsonSerializerSettings
                         {
                             ReferenceLoopHandling = ReferenceLoopHandling.Serialize,
                             PreserveReferencesHandling = PreserveReferencesHandling.Objects,
                             TypeNameHandling = TypeNameHandling.All
                         });
                    return obj;
                }
            }
            catch (Exception e)
            {
                Log.Error(string.Format("Cannot Set key {0}", key), e);
            }
        }
        return null;
    }

    public void Delete(string key)
    {
        if (string.IsNullOrEmpty(key))
        {
            throw new ArgumentNullException("key");
        }
        if (this.isCacheEnable)
        {
            try{
                this.DeleteProtected(key);
            }
            catch (Exception e)
            {
                Log.Error(string.Format("Cannot Delete key {0}",key), e);
            }
        }
    }

    public void DeleteByPattern(string prefixKey)
    {
        if (string.IsNullOrEmpty(prefixKey))
        {
            throw new ArgumentNullException("prefixKey");
        }
        if (this.isCacheEnable)
        {
            try
            {
                this.DeleteByPatternProtected(prefixKey);
            }
            catch (Exception e)
            {
                Log.Error(string.Format("Cannot DeleteByPattern key {0}", prefixKey), e);
            }
        }
    }

    public void FlushAll()
    {
        if (this.isCacheEnable)
        {
            try{
                this.FlushAllProtected();
            }
            catch (Exception e)
            {
                Log.Error("Cannot Flush", e);
            }
        }
    }

    public string GetString(string key)
    {
        if (string.IsNullOrEmpty(key))
        {
            throw new ArgumentNullException("key");
        }
        if (this.isCacheEnable)
        {
            try
            {
                return this.GetStringProtected(key);
            }
            catch (Exception e)
            {
                Log.Error(string.Format("Cannot Set key {0}", key), e);
            }
        }
        return null;
    }

    public void SetString(string key, string objectToCache, TimeSpan? expiry = null)
    {
        if (string.IsNullOrEmpty(key))
        {
            throw new ArgumentNullException("key");
        }
        if (this.isCacheEnable)
        {
            try
            {
                this.SetStringProtected(key, objectToCache, expiry);
            }
            catch (Exception e)
            {
                Log.Error(string.Format("Cannot Set {0}", key), e);
            }
        }
    }
    public bool IsCacheEnabled
    {
        get { return this.isCacheEnable; }

    }
    
    protected abstract void SetStringProtected(string key, string objectToCache, TimeSpan? expiry = null);
    protected abstract string GetStringProtected(string key);
    protected abstract void DeleteProtected(string key);
    protected abstract void FlushAllProtected();
    protected abstract void DeleteByPatternProtected(string key);
    public abstract bool IsCacheRunning { get;  }
}

As you can see, this abstract class delegates all its methods to protected abstract methods that contain the cache implementation code. The abstract class does not know about any concrete implementation, just about general caching concerns. It also boils the implementation down to a few abstract methods that work only with strings, which means the implementer does not need to care about anything other than strings, while the consumer of the class still has access to Set methods that accept a string or an object. The next class is the one that does the real job: here is a simple Redis implementation of this abstract class.

public class RedisCache : Definitions.Cache
{
    private ConnectionMultiplexer redisConnections;

    private IDatabase RedisDatabase {
        get {
            if (this.redisConnections == null)
            {
                InitializeConnection();
            }
            return this.redisConnections != null ? this.redisConnections.GetDatabase() : null;
        }
    }

    public RedisCache(bool isCacheEnabled):base(isCacheEnabled)
    {
        InitializeConnection();
    }

    private void InitializeConnection()
    {
        try
        {
             this.redisConnections = ConnectionMultiplexer.Connect(System.Configuration.ConfigurationManager.AppSettings["CacheConnectionString"]);
        }
        catch (RedisConnectionException errorConnectionException)
        {
            Log.Error("Error connecting the redis cache : " + errorConnectionException.Message, errorConnectionException);
        }
    }

    protected override string GetStringProtected(string key)
    {
        if (this.RedisDatabase == null)
        {
            return null;
        }
        var redisObject = this.RedisDatabase.StringGet(key);
        if (redisObject.HasValue)
        {
            return redisObject.ToString();
        }
        else
        {
            return null;
        }
    }

    protected override void SetStringProtected(string key, string objectToCache, TimeSpan? expiry = null)
    {
        if (this.RedisDatabase == null)
        {
            return;
        }

        this.RedisDatabase.StringSet(key, objectToCache, expiry);
    }

    protected override void DeleteProtected(string key)
    {
        if (this.RedisDatabase == null)
        {
            return;
        }
        this.RedisDatabase.KeyDelete(key);
    }

    protected override void FlushAllProtected()
    {
        if (this.RedisDatabase == null)
        {
            return;
        }
        var endPoints = this.redisConnections.GetEndPoints();
        foreach (var endPoint in endPoints)
        {
            var server = this.redisConnections.GetServer(endPoint);
            server.FlushAllDatabases();
        }
    }

    // The base class declares DeleteByPatternProtected as abstract, so RedisCache needs an
    // implementation to compile. This is one possible sketch: scan the keys of every endpoint
    // for the prefix and delete them.
    protected override void DeleteByPatternProtected(string prefixKey)
    {
        if (this.RedisDatabase == null)
        {
            return;
        }
        foreach (var endPoint in this.redisConnections.GetEndPoints())
        {
            var server = this.redisConnections.GetServer(endPoint);
            foreach (var key in server.Keys(pattern: prefixKey + "*"))
            {
                this.RedisDatabase.KeyDelete(key);
            }
        }
    }

    public override bool IsCacheRunning
    {
        get { return this.redisConnections != null && this.redisConnections.IsConnected; }
    }
}

The Redis connection gets its settings from the web.config. The Redis object is instantiated through the ConnectionMultiplexer that comes from the StackExchange.Redis API. This one is thread safe, which is why the cache will be a singleton in the dependency container.

    container.RegisterType<RedisCache>(new ContainerControlledLifetimeManager()
                                                                , new InjectionConstructor(
                                                                        Convert.ToBoolean(ConfigurationManager.AppSettings["IsCacheEnabled"])
                                                                )); //Singleton ( RedisCache use thread-safe code)
    container.RegisterType<ICache, RedisCache>(); //Re-use the singleton above
    container.RegisterType<ICacheStatus, RedisCache>(); //Re-use the singleton above

This is how to register the cache with Microsoft Unity. The first line registers the RedisCache class as a single instance shared by every query to the cache, and thus by every request. The next two registrations map the two interfaces to that same instance.

From there, it is possible to use the interface anywhere. It is also easy to unit test, since you can mock the ICache interface, which is the only interface you need to pass through your code. It is clear from the dependency injection code that ICache, and not the concrete RedisCache class, is the interface to use. The cache should not be used in your controller classes, nor in your service or repository classes; it belongs to the accessor classes, which sit between your service and repository classes. Here is a graphic of the layers recommended when using a cache system and a database.

Layers

The idea is that the only layer that knows about the cache is the accessor. The service layer does not know about the cache or the database; it only knows how to get and set through the accessor. The repository does not know about caching; its responsibility is to get the data from the persistent storage, whether with Entity Framework (or any other ORM) or directly with Ado.Net. On the other hand, the cache does not know about the database; it only knows how to store data for fast access. This means that the accessor class is the only one that gets the cache injected. Here is a small example.

public class ContestAccessor: IContestAccessor
{
	private readonly IContestRepository contestRepository;
	private readonly ICache cache;
	public ContestAccessor(IContestRepository repository, ICache cache)
	{
		//...
	}
}

This class can have methods to get specific information. Here is an example to get a contest by id.

public Contest GetById(int id)
{
    var key = string.Format("contest_by_id_{0}", id);
    var contestObject = this.cache.Get<Contest>(key);
    if (contestObject == null)
    {
        contestObject = this.contestRepository.GetById(id);
        this.cache.Set(key, contestObject);
    }
    return contestObject;
}

This is a basic example that gets the contest from the cache; if it is not found there, it gets it from the repository and stores it in the cache for the next call. Either way, we return the object, wherever it comes from. The service layer uses the injected accessor (through the IContestAccessor interface, for example). It does not know anything about the repository or the cache; the service just knows how to get its object by id.
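
For completeness, here is a minimal sketch of a service consuming the accessor. The ContestService class is hypothetical, and Contest is assumed to expose a Name property; the point is that the service depends only on IContestAccessor and has no idea whether the contest came from Redis or from the database.

// Hypothetical service layer class: it only knows about the accessor interface.
public class ContestService
{
    private readonly IContestAccessor contestAccessor;

    public ContestService(IContestAccessor contestAccessor)
    {
        this.contestAccessor = contestAccessor;
    }

    public string GetContestDisplayName(int contestId)
    {
        var contest = this.contestAccessor.GetById(contestId); // may come from the cache or the repository
        return contest == null ? "Unknown contest" : contest.Name; // Name is assumed for the example
    }
}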