Redis Caching Strategies

Using Redis for caching to improve application performance

Last updated: December 20, 2025

Redis is an in-memory data structure store that can be used as a cache, message broker, or database. When used for caching, Redis significantly improves application performance by storing frequently accessed data in fast memory.

Understanding Redis caching patterns helps you build faster, more scalable applications.

Why Use Redis for Caching

Redis provides several advantages for caching. Because data is stored in memory, reads and writes typically complete in well under a millisecond. It also supports a variety of data structures, such as strings, lists, sets, and hashes, which makes it flexible for different use cases.

Redis can also act as a distributed cache, letting multiple application instances share the same cached data. Per-key expiration (TTL) means cached entries can age out automatically and be refreshed on the next request.

Basic Redis Operations

Redis stores data as key-value pairs. Basic string operations include SET to store a value, GET to retrieve a value, and EXPIRE to set a time-to-live for keys.

const redis = require('redis');
const client = redis.createClient();

await client.connect();

// Set a value
await client.set('key', 'value');

// Get a value
const value = await client.get('key');

// Set expiration (30 seconds)
await client.expire('key', 30);

// Set with expiration in one command
await client.setEx('key', 30, 'value');

Cache-Aside Pattern

The cache-aside pattern is the most common caching strategy. The application checks the cache first, and if data isn’t present, fetches from the database and stores it in cache:

Cache-Aside Flow:

       Request for Data
              │
     ┌────────▼────────┐
     │   Check Cache   │
     └────────┬────────┘
              │
         Cache Hit?
         /        \
       Yes         No
        │           │
        ▼           ▼
   Return Data  ┌─────────────────┐
                │  Fetch from DB  │
                └────────┬────────┘
                         │
                ┌────────▼────────┐
                │ Store in Cache  │
                └────────┬────────┘
                         │
                         ▼
                    Return Data

Here is the same flow in an Express route, assuming the Redis client from above plus express and axios:

const express = require('express');
const axios = require('axios');
const app = express();

app.get('/', async (req, res) => {
  try {
    // Check cache first
    const cacheValue = await client.get('todos');
    if (cacheValue) {
      console.log('Cache hit!');
      return res.json(JSON.parse(cacheValue));
    }

    // If not in cache, fetch from the source API
    const { data } = await axios.get('https://jsonplaceholder.typicode.com/todos');

    // Save to Redis (stringify because Redis stores strings)
    await client.set('todos', JSON.stringify(data));
    await client.expire('todos', 30); // expire in 30 seconds

    console.log('Cache miss — fetched from API');
    res.json(data);
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: 'Failed to fetch data' });
  }
});

This pattern gives applications control over what’s cached and when, but requires managing cache invalidation.

Redis Data Structures

Redis supports various data structures beyond simple strings. Lists are ordered collections useful for queues and stacks:

// Push to left
await client.lPush('Message', '1');
await client.lPush('Message', '2');

// Pop from right
const result = await client.rPop('Message');

Hashes are useful for storing objects:

await client.hSet('user:1', {
  name: 'John',
  email: 'john@example.com'
});

const user = await client.hGetAll('user:1');

Sets are useful for unique collections, and sorted sets maintain order by score.
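
For example (the key names here are just illustrative):

// Sets ignore duplicates, so they suit unique collections like tags
await client.sAdd('article:1:tags', ['redis', 'caching', 'redis']);
const tags = await client.sMembers('article:1:tags'); // ['redis', 'caching']

// Sorted sets keep members ordered by score, e.g. a simple leaderboard
await client.zAdd('leaderboard', [
  { score: 120, value: 'alice' },
  { score: 95, value: 'bob' }
]);
const ranking = await client.zRangeWithScores('leaderboard', 0, -1);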

Cache Invalidation

Cache invalidation is one of the hardest problems in computer science. When data changes, you need to update or remove cached entries.

Time-based expiration is simple but may serve stale data. Event-based invalidation removes cache entries when data changes, ensuring freshness but requiring coordination.

A common approach is to use short expiration times combined with event-based invalidation for critical data.
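
As a sketch, event-based invalidation can be as simple as deleting the cached key whenever the underlying data is written (updateTodoInDatabase is a hypothetical database helper):

app.put('/todos/:id', async (req, res) => {
  // Write to the source of truth first
  await updateTodoInDatabase(req.params.id, req.body); // hypothetical helper

  // Then drop the stale cache entry; the next read repopulates it (cache-aside)
  await client.del('todos');

  res.sendStatus(204);
});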

Cache Warming

Cache warming pre-populates the cache with frequently accessed data. This is useful after application restarts or cache flushes.

Warm the cache during application startup or use background jobs to refresh cache entries before they expire.
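
A minimal warm-up sketch, assuming a fetchPopularTodos() helper that loads the hot data from the database:

async function warmCache() {
  const todos = await fetchPopularTodos(); // hypothetical database query
  await client.setEx('todos', 30, JSON.stringify(todos));
}

// Warm once at startup, then refresh just before the 30-second TTL runs out
await warmCache();
setInterval(() => warmCache().catch(console.error), 25 * 1000);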

Distributed Caching

When multiple application instances share a Redis cache, you need to handle cache consistency. Redis provides atomic operations that help maintain consistency across instances.

Use Redis transactions or Lua scripts for complex operations that need to be atomic.
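
For example, node-redis lets you queue commands with multi() and run them atomically with exec():

// All three commands are applied together via MULTI/EXEC
const results = await client
  .multi()
  .set('user:1:name', 'John')
  .incr('user:1:loginCount')
  .expire('user:1:loginCount', 3600)
  .exec();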

Connection Management

Properly manage Redis connections. Use connection pooling to reuse connections efficiently. Always close connections when done:

await client.quit(); // closes the connection properly

For production, use connection pooling and handle reconnection logic for reliability.
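
With node-redis, a reasonable starting point is a capped-backoff reconnect strategy plus an error listener so a dropped connection doesn't crash the process (the URL here is just an example):

const client = redis.createClient({
  url: process.env.REDIS_URL, // e.g. redis://localhost:6379
  socket: {
    // Retry with a capped backoff instead of failing permanently
    reconnectStrategy: (retries) => Math.min(retries * 100, 3000)
  }
});

client.on('error', (err) => console.error('Redis client error:', err));

await client.connect();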

Error Handling

Cache failures shouldn’t break your application. Implement fallback logic that continues without cache if Redis is unavailable:

async function getData() {
  try {
    const cached = await client.get('key');
    if (cached) return JSON.parse(cached);
  } catch (err) {
    console.error('Cache error:', err);
    // Continue without cache
  }

  // Fall back to the database when the cache misses or fails
  return fetchFromDatabase();
}

Best Practices

Use meaningful key names with prefixes to organize keys. For example, user:123 or blog:456.

Set appropriate expiration times based on how often data changes. Frequently changing data should have shorter expiration times.

Monitor cache hit rates to understand cache effectiveness. Low hit rates indicate the cache isn’t helping much.
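
One way to check this is to read keyspace_hits and keyspace_misses from the INFO stats section (a rough sketch; the parsing assumes both fields are present):

const stats = await client.info('stats');
const hits = Number(stats.match(/keyspace_hits:(\d+)/)[1]);
const misses = Number(stats.match(/keyspace_misses:(\d+)/)[1]);
const total = hits + misses;
console.log('Cache hit rate:', total ? (hits / total).toFixed(2) : 'n/a');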

Size your Redis instance appropriately. Monitor memory usage and set maxmemory policies to prevent out-of-memory errors.
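
Eviction settings are normally configured in redis.conf, but as a sketch they can also be adjusted at runtime with CONFIG SET (the 256mb limit is just an example value):

await client.configSet('maxmemory', '256mb');
await client.configSet('maxmemory-policy', 'allkeys-lru'); // evict least-recently-used keys when full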

Use Redis for data that’s expensive to compute or fetch, frequently accessed, and relatively stable. Don’t cache data that changes constantly or is rarely accessed.

Summary

Redis caching can dramatically improve application performance by reducing database load and response times. The cache-aside pattern is the most common approach, but understanding Redis data structures and proper cache invalidation strategies is essential for effective caching.

The key is to cache the right data with appropriate expiration times, handle cache failures gracefully, and monitor cache effectiveness. When used correctly, Redis caching is a powerful tool for building scalable applications.