ASP.NET Core Caching Explained: Memory Cache vs Redis Distributed Cache

What is caching and why use it?

Caching saves data that takes time or effort to fetch, so the next time it’s needed, it can be returned quickly. This helps reduce load on databases and APIs and improves response times. Caching is especially useful for read-heavy workloads, costly calculations, or handling traffic spikes.

In ASP.NET Core, the most common options are in-memory caching for single-server setups and distributed caching, such as Redis, when the app runs on multiple servers or needs shared cache data.

Quick comparison

  • Scope: Memory Cache lives in a single process; Redis is shared across all app instances.
  • Storage: Memory Cache holds object references in app memory; Redis stores serialized bytes (e.g. JSON).
  • Latency: Memory Cache is fastest (no network hop); Redis adds a small network round trip.
  • Durability: Memory Cache is lost on app restart; Redis entries survive restarts and can be shared across services.

When to use which

  • Memory Cache: Use for single-instance deployments, short-lived data, and ultra-low latency.
  • Redis: Use for multi-instance scaling, shareable cache entries, and cross-service coordination.

Setup project & packages

For the Redis distributed cache, you need Microsoft's provider package:

# Add to your project
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis        

(If you use another provider, install its package instead.)

Configure services (Program.cs)

Add Memory Cache

builder.Services.AddMemoryCache();        

Add Distributed Redis Cache

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // or your redis connection string
    options.InstanceName = "MyApp:"; // optional key prefix
});        

Using IMemoryCache 

Simple usage in a service or controller

using Microsoft.Extensions.Caching.Memory;

public class ProductService
{
    private readonly IMemoryCache _cache;
    public ProductService(IMemoryCache cache) => _cache = cache;

    public async Task<Product> GetProductAsync(int id)
    {
        string cacheKey = $"product:{id}";

        // Try get from cache
        if (_cache.TryGetValue(cacheKey, out Product cached))
            return cached;

        // Miss -> load from DB or API
        var product = await LoadProductFromDbAsync(id);

        // Set with expiration
        var cacheEntryOptions = new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
            SlidingExpiration = TimeSpan.FromMinutes(2)
        };

        _cache.Set(cacheKey, product, cacheEntryOptions);

        return product;
    }
}        

Notes:

  • AbsoluteExpirationRelativeToNow = fixed lifetime from the moment the entry is written.
  • SlidingExpiration = the lifetime resets on each access.
  • IMemoryCache stores object references, so mutating a cached object changes what the cache returns; be careful with large objects, which add memory pressure.
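
The reference semantics can be seen with a standalone MemoryCache (a minimal sketch; the key and list contents are illustrative):

```csharp
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());

var tags = new List<string> { "sale" };
cache.Set("product:1:tags", tags);

// The cache stores a reference, not a copy: mutating the original list
// also changes what every later reader gets back from the cache.
tags.Add("clearance");

var fromCache = cache.Get<List<string>>("product:1:tags");
// fromCache now contains both "sale" and "clearance".
```

If callers may mutate cached objects, cache an immutable copy (or a small DTO) instead of the live instance.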

Using IDistributedCache (Redis)

IDistributedCache stores byte[]; common pattern is to serialize to JSON.

Example service

using Microsoft.Extensions.Caching.Distributed;
using System.Text.Json;

public class ProductService
{
    private readonly IDistributedCache _cache;
    public ProductService(IDistributedCache cache) => _cache = cache;

    public async Task<Product> GetProductAsync(int id)
    {
        string cacheKey = $"product:{id}";
        var cachedString = await _cache.GetStringAsync(cacheKey);
        if (!string.IsNullOrEmpty(cachedString))
        {
            return JsonSerializer.Deserialize<Product>(cachedString)!;
        }

        var product = await LoadProductFromDbAsync(id);

        var options = new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(20)
        };

        var json = JsonSerializer.Serialize(product);
        await _cache.SetStringAsync(cacheKey, json, options);

        return product;
    }
}        

Binary version (for performance)

var bytes = JsonSerializer.SerializeToUtf8Bytes(product);
await _cache.SetAsync(cacheKey, bytes, new DistributedCacheEntryOptions { AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(20) });

var cachedBytes = await _cache.GetAsync(cacheKey);
if (cachedBytes != null)
    product = JsonSerializer.Deserialize<Product>(cachedBytes);        

Caching patterns & tips

Cache-aside (lazy loading): the most common pattern

On a read request, first check the cache. If the data isn’t there, load it from the source, save it in the cache, and then return the value. The examples above follow this flow.

Write-through and write-behind

For write operations, data can be written to both the cache and the database at the same time (write-through), or written to the cache first and saved to the database later (write-behind). This approach is more complex and requires careful handling to keep data consistent.
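
As a sketch, a write-through update for the Product example might look like the following, assuming Product exposes an Id property; SaveProductToDbAsync is a hypothetical stand-in for your data layer:

```csharp
using Microsoft.Extensions.Caching.Distributed;
using System.Text.Json;

public class ProductWriteService
{
    private readonly IDistributedCache _cache;
    public ProductWriteService(IDistributedCache cache) => _cache = cache;

    // Write-through: persist to the database, then refresh the cache in the
    // same operation so readers never see a stale entry after a successful write.
    public async Task UpdateProductAsync(Product product)
    {
        await SaveProductToDbAsync(product); // hypothetical persistence call

        var json = JsonSerializer.Serialize(product);
        await _cache.SetStringAsync(
            $"product:{product.Id}",
            json,
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(20)
            });
    }

    private Task SaveProductToDbAsync(Product product) => Task.CompletedTask; // stub
}
```

Write-behind would instead queue the database write (e.g. to a background worker) after updating the cache, which is faster for the caller but risks losing writes if the queue fails.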

Invalidation

When source data changes, invalidate or update the cache:

await _cache.RemoveAsync("product:123");        

Or update cached value after DB update.

Key design

  • Keep cache keys consistent, for example: entity:{id}.
  • Use versioned keys like products:v2:{id} when you need to invalidate all entries at once.
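
A versioned-key scheme can be a small helper like this sketch (the CacheKeys class and its version field are illustrative; in practice the version might live in configuration or in the cache itself):

```csharp
public static class CacheKeys
{
    // Bumping this version "invalidates" every product entry at once:
    // old keys are simply never read again and age out via their TTLs.
    public static int ProductsVersion = 2;

    public static string Product(int id) => $"products:v{ProductsVersion}:{id}";
}

// CacheKeys.Product(42) => "products:v2:42"
```

After a bulk data change, incrementing ProductsVersion redirects all reads to fresh keys without touching Redis at all.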

Expiration strategy

  • Use short TTLs for data that changes often.
  • Use longer TTLs for data that rarely changes.
  • When it makes sense, combine absolute and sliding expiration.

Serialization

  • UTF-8 JSON is a good default because it’s readable and works across platforms.
  • If performance or payload size matters, consider MessagePack or other binary formats.

Caching lists (avoid stale large data)

When caching lists such as “top products,” cache only the list of IDs and fetch item details by ID, or store smaller DTOs. Avoid caching very large blobs unless there’s a strong reason.
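
A sketch of the IDs-only approach, reusing the per-item ProductService from earlier (the top-IDs key, its TTL, and LoadTopProductIdsFromDbAsync are all hypothetical):

```csharp
using Microsoft.Extensions.Caching.Distributed;
using System.Text.Json;

public class TopProductsService
{
    private readonly IDistributedCache _cache;
    private readonly ProductService _products; // per-item cache-aside from earlier
    public TopProductsService(IDistributedCache cache, ProductService products)
        => (_cache, _products) = (cache, products);

    public async Task<List<Product>> GetTopProductsAsync()
    {
        // Cache only the small list of IDs, not full product payloads.
        var cachedIds = await _cache.GetStringAsync("products:top:ids");
        List<int> ids = cachedIds != null
            ? JsonSerializer.Deserialize<List<int>>(cachedIds)!
            : await LoadTopProductIdsFromDbAsync(); // hypothetical query

        if (cachedIds == null)
            await _cache.SetStringAsync("products:top:ids",
                JsonSerializer.Serialize(ids),
                new DistributedCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
                });

        // Each product is then resolved through the per-item cache,
        // so item updates invalidate one small entry, not the whole list.
        var result = new List<Product>();
        foreach (var id in ids)
            result.Add(await _products.GetProductAsync(id));
        return result;
    }

    private Task<List<int>> LoadTopProductIdsFromDbAsync()
        => Task.FromResult(new List<int> { 1, 2, 3 }); // stub
}
```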

Monitoring and reliability

  • Track cache hit and miss rates.
  • For Redis, keep an eye on memory usage, evictions, and latency.
  • Always plan for cache failures: your application should still function by falling back to the database if the cache is unavailable.
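
The earlier cache-aside read can be hardened into a sketch like this, assuming the same _cache field and LoadProductFromDbAsync method as in the IDistributedCache example:

```csharp
public async Task<Product> GetProductSafeAsync(int id)
{
    string cacheKey = $"product:{id}";

    try
    {
        var cached = await _cache.GetStringAsync(cacheKey);
        if (cached != null)
            return JsonSerializer.Deserialize<Product>(cached)!;
    }
    catch (Exception)
    {
        // Redis is down or unreachable: log it, then fall through to the DB.
    }

    var product = await LoadProductFromDbAsync(id);

    try
    {
        await _cache.SetStringAsync(cacheKey, JsonSerializer.Serialize(product),
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(20)
            });
    }
    catch (Exception)
    {
        // Cache write failed; the response is still served from the DB.
    }

    return product;
}
```

In production you would log the exceptions and possibly short-circuit further cache calls for a while rather than swallowing them silently.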

Quick code comparison

MemoryCache (sync, simple)

_cache.Set(key, value, new MemoryCacheEntryOptions {...});
var value = _cache.Get<MyType>(key);        

DistributedCache (async, shared)

await _cache.SetStringAsync(key, json, options);
var json = await _cache.GetStringAsync(key);        

Best Practices

  • Use MemoryCache for single-instance, low-complexity scenarios.
  • Use Redis (distributed cache) for multiple instances, sessions, or shared caches.
  • Keep cached items small; store identifiers, not entire graphs when possible.
  • Always design for cache failure (graceful fallback).
  • Expire and invalidate deliberately — stale data is worse than no cache.
  • Secure Redis: use password, TLS, and restrict network access.
  • Measure: log hit/miss, TTLs, and eviction events.
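
To put the security bullet into practice, the Redis registration can take a hardened configuration string; this is a sketch in which the host, port, and password are placeholders (ssl and abortConnect are standard StackExchange.Redis configuration switches):

```csharp
builder.Services.AddStackExchangeRedisCache(options =>
{
    // ssl=true turns on TLS; abortConnect=false lets the app start even if
    // Redis is briefly unreachable. Host, port, and password are placeholders.
    options.Configuration =
        "my-redis-host:6380,ssl=true,password=<your-password>,abortConnect=false";
    options.InstanceName = "MyApp:";
});
```

Keep the real connection string in configuration or a secret store rather than in source code.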

Minimal end-to-end example (Program.cs + controller)

Program.cs

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMemoryCache(); // in-memory
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration.GetValue<string>("Redis:Connection"); // e.g. "localhost:6379"
    options.InstanceName = "MyApp:";
});

builder.Services.AddScoped<ProductService>();
builder.Services.AddControllers();
var app = builder.Build();
app.MapControllers();
app.Run();        

ProductsController (using distributed cache)

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    private readonly ProductService _service;
    public ProductsController(ProductService service) => _service = service;

    [HttpGet("{id}")]
    public async Task<IActionResult> Get(int id)
    {
        var product = await _service.GetProductAsync(id);
        if (product == null) return NotFound();
        return Ok(product);
    }
}        

Conclusion

Caching can improve performance but also adds complexity. Pick the right type for your setup: in-memory caching for a single server, Redis for distributed systems. Design your keys and expiration times carefully, and make sure your app handles cache misses and failures gracefully. Set up monitoring to track performance, adjust TTLs, and catch issues early.

More articles by Muhammad Asad