
Making a Thread-Safe Cache in .NET

·398 words·2 mins·

One of the challenges we face as developers is accessing a resource from multiple threads. Handling concurrency and shared state across threads is a delicate and important task. Today we'll look at how to access a cache/dictionary in a thread-safe way.

The Problem

Suppose we have a cache that holds data fetched from another server. A consumer asks the cache for the data associated with a key, and if the cache doesn't already have it, the cache must download it.

static Dictionary<string, DataModel> _cache = [];

public async Task<DataModel> GetValueAsync(string key)
{
  // Return the cached value if we already have it.
  if (_cache.TryGetValue(key, out var data)) return data;

  // Otherwise download the data and store the result.
  return _cache[key] = await DownloadDataAsync(key);
}

This approach is fine on a single thread, but it breaks down once two threads ask for the same key at the same time: each one misses the cache and starts its own download for the same data, and each result overwrites the same cache entry. Even when that doesn't cause an error, the duplicated downloads hurt performance (and a plain Dictionary isn't safe for concurrent writes to begin with, which we'll address below).
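To see the duplication concretely, here is a small sketch of my own (not from the original post), using a hypothetical "user:42" key and assuming it runs inside some async method alongside the GetValueAsync above:

// Both lookups miss because neither download has completed yet,
// so DownloadDataAsync runs twice for the same key.
Task<DataModel> first = GetValueAsync("user:42");
Task<DataModel> second = GetValueAsync("user:42");

DataModel[] results = await Task.WhenAll(first, second);
// Two downloads happened, and the second result silently
// overwrote the first one in the cache.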

The Solution

The solution is interesting and simple. Instead of keeping the value directly in the cache, we keep its future: the cache holds a Task<DataModel>.

static Dictionary<string, Task<DataModel>> _cache = [];

public Task<DataModel> GetValueTask(string key)
{
  // Return the cached task if one exists, completed or not.
  if (_cache.TryGetValue(key, out var dataTask)) return dataTask;

  // Otherwise start the download and cache the task itself.
  return _cache[key] = DownloadDataAsync(key);
}

This way we always get back the same Task for a given key, no matter how many times we call GetValueTask, and awaiting an already-completed Task is cheap. This approach also reduces pressure on the Garbage Collector. Notice that we're not using the async/await keywords here, since we need to return the tasks themselves rather than their results.
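As a quick illustration (again with a hypothetical key, inside some async caller), a consumer simply awaits whatever task the cache returns, and repeated calls for the same key observe the very same Task instance:

// First call starts the download and caches the Task itself.
Task<DataModel> task1 = GetValueTask("user:42");

// Second call returns the same Task, whether or not it has finished yet.
Task<DataModel> task2 = GetValueTask("user:42");

Console.WriteLine(ReferenceEquals(task1, task2)); // True

// Awaiting an already-completed task is cheap and triggers no new download.
DataModel data = await task1;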

To make it truly thread-safe we lock the operation. The lock is held only briefly, for the duration of a lookup and a quick add to the cache, because the actual downloading happens inside the Task rather than in the GetValueTask method.

static Dictionary<string, Task<DataModel>> _cache = [];

public Task<DataModel> GetValueTask(string key)
{
  lock (_cache)
  {
    // Only the lookup and the add happen under the lock;
    // the download itself runs inside the returned task.
    if (_cache.TryGetValue(key, out var dataTask))
      return dataTask;

    return _cache[key] = DownloadDataAsync(key);
  }
}

Although there are ready-made constructs that achieve similar designs, such as ConcurrentDictionary<TKey, TValue> combined with Lazy<T>, it's important to understand the underlying techniques they're built on.
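For completeness, here is a minimal sketch of the ConcurrentDictionary variant (my illustration, not code from this post), wrapping the task in Lazy<T> so the download starts at most once per key even if GetOrAdd's value factory races:

using System.Collections.Concurrent;

static readonly ConcurrentDictionary<string, Lazy<Task<DataModel>>> _cache = new();

public Task<DataModel> GetValueTask(string key) =>
  // GetOrAdd may invoke the factory more than once under contention,
  // but Lazy<T> guarantees DownloadDataAsync runs only once per key.
  _cache.GetOrAdd(key, k => new Lazy<Task<DataModel>>(() => DownloadDataAsync(k))).Value;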

I hope this helps! Ask me any questions you might have about concurrent caches!

Mazdak Parnian
Software Engineer using .NET & GO