Understanding & Implementing Caching in ASP.NET Core

Posted on Oct 29, 2022

Posted in categories: Development, ASP.NET

In a prior post, I discussed Architecting for Scalability & Redundancy. A key method of improving application performance is implementing caching to store critical information and avoid repeated server processing. .NET provides two primary mechanisms for programmatic caching of individual objects: the In Memory Cache and the Distributed Cache. In addition to these traditional caching methods for server-side code, the Response Caching Middleware allows entire HTTP responses to be cached, further improving application performance. Understanding and implementing these concepts is often difficult for those who have not done so on other projects.
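
For reference, a minimal sketch of enabling the Response Caching Middleware in a .NET 6+ style Program.cs might look like the following; the controller, route, and cache duration are illustrative assumptions rather than recommendations.

Response Caching Middleware Example
using Microsoft.AspNetCore.Mvc;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();
builder.Services.AddResponseCaching();

var app = builder.Build();
app.UseResponseCaching(); // must run before the endpoints whose responses it caches
app.MapControllers();
app.Run();

// In a controller, opt an individual response into caching
public class TimeController : ControllerBase
{
    [HttpGet("/time")]
    [ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any)]
    public IActionResult Get() => Ok(DateTime.UtcNow);
}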

When to Start

Often, I will encounter development teams that only consider caching mechanisms after building the application. One of the most common statements defending this practice is "I will not know what to cache until we see the data." Although it is possible to spot issues from real-world data, taking that approach often means chasing red herrings rather than improving your application.

If you start a project with a clear understanding of its future state and the tools/features available, you can leverage them to build a framework with seamless growth potential and minimal initial effort.

Understanding Your Application

By taking the time to understand the fundamentals, it is possible to start an application with a solid approach to caching and improved performance rather than bolting it on after the fact. Although these features are fully documented via the referenced links, their practical application is often overlooked. As we look at cache implementation, we must start by looking at our own application to chart a path to a proper architecture.

Elements Prime for Caching

The best way to start is to look through your application for information that is costly to retrieve AND frequently used. These are the operations that benefit most from a caching strategy; caching them will reduce the overall load on your system and allow for more throughput. It is important to look at the total cost of these operations, as the type of information to cache can differ depending on where, when, and what is most costly. It could mean caching the raw data to limit database load, or it could mean caching the fully rendered results to limit both database and web-server resource usage.

Some things that are commonly identified in this category:

  • Common dynamic menus
  • User specific permission caches

As you identify each of these items, it is important to understand what triggers might change the underlying data, so that you have proper controls in place to clear or otherwise update the cache when those changes complete.
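
As a rough sketch of that idea, using the IMemoryCache API covered later in this post, a hypothetical MenuService could evict its cached entry whenever the underlying data is updated; the class, method, and key below are all illustrative.

Cache Invalidation Example
using Microsoft.Extensions.Caching.Memory;

public class MenuService
{
    private const string MenuCacheKey = "nav-menu"; // hypothetical cache key
    private readonly IMemoryCache _memoryCache;

    public MenuService(IMemoryCache memoryCache) => _memoryCache = memoryCache;

    public void UpdateMenu(/* updated menu data */)
    {
        // ... persist the change to the database ...

        // Evict the cached copy so the next request rebuilds it from fresh data
        _memoryCache.Remove(MenuCacheKey);
    }
}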

Understanding Growth Opportunities

Additionally, when looking at your application, it is critical to review your future plans and scaling considerations. For example, if you are targeting a geo-redundant implementation in the future, your implementation decision is easy: Distributed Cache, always. If you have less lofty goals, you can implement a different style of cache; however, know what limitations you introduce by doing so.

Usage of IMemoryCache

Often the first solution developers find, the In Memory cache is easy to use and has a simple API. Simply inject an IMemoryCache object and use the provided methods to get/set values of your particular object type. The sample below shows retrieving a value from the cache or, when it is not present, computing and caching it.

IMemoryCache Usage Example
if (!_memoryCache.TryGetValue(CacheKeys.Entry, out DateTime cacheValue))
{
    // Not found in the cache; compute the value and store it
    cacheValue = DateTime.Now;
    var cacheEntryOptions = new MemoryCacheEntryOptions()
        .SetSlidingExpiration(TimeSpan.FromSeconds(3));
    _memoryCache.Set(CacheKeys.Entry, cacheValue, cacheEntryOptions);
}
return cacheValue;

As you can see, this cache implementation is simple, strongly typed, and easy to use. The downside is simply that it is memory based and does not work in situations where you scale to multiple machines.
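
For completeness, a minimal sketch of registering the in-memory cache and receiving it through constructor injection might look like the following; the ClockService class is illustrative.

IMemoryCache Registration Example
using Microsoft.Extensions.Caching.Memory;

// Program.cs - register the in-memory cache with the service container
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddMemoryCache();
var app = builder.Build();
app.Run();

// Any consuming class receives the cache via constructor injection
public class ClockService
{
    private readonly IMemoryCache _memoryCache;

    public ClockService(IMemoryCache memoryCache) => _memoryCache = memoryCache;
}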

Usage of IDistributedCache

A similar process is used for IDistributedCache; however, helper methods for typed retrieval do not exist, and setting values is more involved. The following code retrieves a value from the cache.

IDistributedCache Usage Example - Get Value
byte[] encodedCacheValue = await _cache.GetAsync("MyKey");
if (encodedCacheValue != null)
{
    // Cache hit; decode the stored bytes back into a string
    return Encoding.UTF8.GetString(encodedCacheValue);
}
else
{
    return "Not Found";
}

To set or update a value in the cache, you have to do a bit more work.

IDistributedCache Usage Example - Set/Update
// Values must be serialized to a byte[] before they can be stored
var myValueToStore = DateTime.UtcNow.ToString();
byte[] encodedValue = Encoding.UTF8.GetBytes(myValueToStore);
var options = new DistributedCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromSeconds(30));
await _cache.SetAsync("MyKey", encodedValue, options);

This API difference exists because all cached data must be distributable to an external system, so the format needs to be easily serializable and transmittable. The backing store for this cache may be memory, a database, Redis, or something else. It is this additional implementation complexity that often pushes developers to fall back to IMemoryCache, but it is also the primary reason that making this change later is more costly than implementing it this way from the beginning.
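
One way to soften that gap, assuming JSON serialization is acceptable for your types, is a small set of extension methods that hide the byte[] handling; these helpers are hypothetical, not part of the framework.

IDistributedCache Typed Helper Example
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public static class DistributedCacheExtensions
{
    // Retrieve and deserialize a cached object, or default if not found
    public static async Task<T?> GetObjectAsync<T>(this IDistributedCache cache, string key)
    {
        byte[]? encoded = await cache.GetAsync(key);
        return encoded is null
            ? default
            : JsonSerializer.Deserialize<T>(Encoding.UTF8.GetString(encoded));
    }

    // Serialize and store an object using the supplied cache entry options
    public static Task SetObjectAsync<T>(this IDistributedCache cache, string key, T value,
        DistributedCacheEntryOptions options)
    {
        byte[] encoded = Encoding.UTF8.GetBytes(JsonSerializer.Serialize(value));
        return cache.SetAsync(key, encoded, options);
    }
}

With something like this in place, call sites read much closer to the IMemoryCache experience while the backing store remains swappable.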

But I Don't Want to Pay for Redis/SQL

This statement is the most common pushback I hear against using IDistributedCache within a project. However, it is important to note that if IDistributedCache is not given any additional configuration, it will default to an in-memory backing store, requiring no additional resources. Future upgrades are easy.
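
As a sketch, the memory-backed registration is a single line in Program.cs, and moving to Redis later (assuming the Microsoft.Extensions.Caching.StackExchangeRedis package and a reachable Redis instance) is largely a configuration change that leaves consuming code untouched.

IDistributedCache Registration Example
var builder = WebApplication.CreateBuilder(args);

// Start with the memory-backed IDistributedCache (no external dependency)
builder.Services.AddDistributedMemoryCache();

// Later, swap in Redis without changing any consuming code
// (requires the Microsoft.Extensions.Caching.StackExchangeRedis package)
// builder.Services.AddStackExchangeRedisCache(options =>
// {
//     options.Configuration = "localhost:6379"; // placeholder connection string
// });

var app = builder.Build();
app.Run();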

Selecting Your Path

Armed with this information, you will have to make your own selection for your application. I have found that, although it is more work up front, starting with IDistributedCache can result in far less work later when rapid scaling or another change is needed. Share your experience below!