Redis Caching

ReliaPulse uses Redis for API response caching to improve performance under high load.

Overview

Public-facing endpoints like the status page API and RSS feeds can receive high traffic, especially during incidents when users frequently refresh. Redis caching reduces database load and improves response times.

Cache Configuration

TTL Guidelines

Endpoint                      TTL         Rationale
Public Status Page            30 seconds  Frequently viewed, needs fresh data
Public Feeds (RSS/Atom/JSON)  60 seconds  Freshness less critical
Organization Settings         5 minutes   Rarely changes
Component Aggregations        10 seconds  Moderate freshness needed
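
These TTLs might be centralized as constants; the object below is a sketch whose property names are assumptions mirroring the cacheTTL import used in the examples further down:

```typescript
// Hypothetical TTL constants (in seconds), mirroring the table above.
const cacheTTL = {
  PUBLIC_STATUS: 30, // public status page
  PUBLIC_FEED: 60,   // RSS/Atom/JSON feeds
  ORG_SETTINGS: 300, // organization settings (5 minutes)
  COMPONENT_AGG: 10, // component aggregations
} as const;
```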

Cache Keys

// Public status page data
cache:public:status:{slug}
 
// Public feeds by format
cache:public:feed:{slug}:rss
cache:public:feed:{slug}:atom
cache:public:feed:{slug}:json
 
// Organization settings
cache:org:settings:{organizationId}
 
// Component aggregations
cache:org:components:{organizationId}
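
A minimal sketch of how the cacheKeys helpers (imported later from @/lib/cache) might build these strings; the exact function names are assumptions based on the usage example below:

```typescript
// Hypothetical key builders matching the patterns above.
const cacheKeys = {
  publicStatus: (slug: string) => `cache:public:status:${slug}`,
  publicFeed: (slug: string, format: "rss" | "atom" | "json") =>
    `cache:public:feed:${slug}:${format}`,
  orgSettings: (organizationId: string) =>
    `cache:org:settings:${organizationId}`,
  orgComponents: (organizationId: string) =>
    `cache:org:components:${organizationId}`,
};
```

Centralizing key construction keeps readers and invalidators in sync: a typo in a hand-built key string would silently miss the cache every time.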

Usage

Basic Caching

import { withCache, cacheKeys, cacheTTL } from "@/lib/cache";
 
// Wrap expensive operations with caching
const data = await withCache(
  cacheKeys.publicStatus(slug),
  cacheTTL.PUBLIC_STATUS,
  async () => {
    // This only runs if cache miss
    return await fetchStatusData(slug);
  }
);
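
One way withCache could be implemented is a get-or-compute wrapper. The sketch below is an assumption about the internals: a Map with expiry timestamps stands in for Redis so the example is self-contained, where the real helper would use Redis GET and SET with an EX option.

```typescript
// Sketch of a get-or-compute helper. The Map-with-expiry store is a
// stand-in for Redis GET/SET EX; only the shape of the API is implied
// by the usage example above.
const store = new Map<string, { value: unknown; expiresAt: number }>();

async function withCache<T>(
  key: string,
  ttlSeconds: number,
  compute: () => Promise<T>
): Promise<T> {
  const hit = store.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value as T; // cache hit: skip the expensive operation
  }
  const value = await compute(); // cache miss: run it once
  store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  return value;
}
```

A second call with the same key inside the TTL returns the stored value without invoking the callback again.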

Manual Cache Operations

import { cacheGet, cacheSet, cacheDel } from "@/lib/cache";
 
// Get cached value
const cached = await cacheGet<StatusData>(key);
 
// Set with TTL
await cacheSet(key, data, 30); // 30 second TTL
 
// Delete cache entry
await cacheDel(key);

Cache Invalidation

The cache is invalidated automatically when data changes:

import { invalidatePublicStatusCacheByOrgId } from "@/lib/cache";
 
// Call after creating/updating/deleting data
// This is fire-and-forget (doesn't block response)
invalidatePublicStatusCacheByOrgId(organizationId).catch(console.error);
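
Internally, an invalidator like this plausibly resolves the organization's slug and then deletes every public key derived from it. The sketch below is an assumption: lookupSlug and the Map-backed cacheDel stand in for the real database lookup and Redis DEL.

```typescript
// Hypothetical invalidation: resolve the org's slug, then drop all
// public cache entries built from it. `cache`/`cacheDel`/`lookupSlug`
// are stand-ins, not the real @/lib/cache internals.
const cache = new Map<string, unknown>();
const cacheDel = async (key: string) => { cache.delete(key); };

async function invalidatePublicStatusCacheByOrgId(
  organizationId: string,
  lookupSlug: (orgId: string) => Promise<string>
): Promise<void> {
  const slug = await lookupSlug(organizationId);
  await Promise.all([
    cacheDel(`cache:public:status:${slug}`),
    cacheDel(`cache:public:feed:${slug}:rss`),
    cacheDel(`cache:public:feed:${slug}:atom`),
    cacheDel(`cache:public:feed:${slug}:json`),
  ]);
}
```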

Automatic Invalidation

The following actions automatically invalidate the public status cache:

  • Components: Create, update, delete, bulk actions
  • Incidents: Create, update, delete
  • Maintenances: Create, update, delete

This ensures users see fresh data after changes while still benefiting from caching during stable periods.

HTTP Caching

In addition to Redis caching, the API sets HTTP cache headers:

Cache-Control: public, max-age=30

This allows:

  • Browser caching for individual users
  • CDN edge caching if deployed behind a CDN
  • Proper cache busting via cache keys
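
A route handler might attach the header like this. The sketch uses the standard Response API; the payload shape and function name are illustrative, not ReliaPulse's actual handler.

```typescript
// Sketch: attach the Cache-Control header to an API response.
// `statusResponse` is a hypothetical helper, not part of @/lib/cache.
function statusResponse(data: unknown): Response {
  return new Response(JSON.stringify(data), {
    headers: {
      "Content-Type": "application/json",
      // Browsers and CDNs may reuse this response for up to 30 seconds.
      "Cache-Control": "public, max-age=30",
    },
  });
}
```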

Performance Impact

With caching enabled:

Metric                        Without Cache  With Cache
Average response time         50-200 ms      5-15 ms
Database queries per request  3-5            0 (on cache hit)
Requests before throttling    ~100/min       ~1000/min

Configuration

Caching uses the same Redis instance as the job queue. No additional configuration is needed beyond setting REDIS_URL:

REDIS_URL=redis://localhost:6379

Monitoring

Monitor cache performance via Redis commands:

# Connect to Redis
redis-cli
 
# View cache keys (KEYS is O(N) and blocks Redis; prefer SCAN in production)
KEYS cache:*
 
# Check TTL on a key
TTL cache:public:status:acme-corp
 
# View memory usage
INFO memory

Best Practices

  1. Short TTLs for critical data: Use 30 seconds or less for status data
  2. Fire-and-forget invalidation: Don't block API responses waiting for cache clearing
  3. Graceful degradation: If Redis is unavailable, bypass cache and hit database directly
  4. Monitor hit rates: Track cache hits vs misses to optimize TTLs
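
Graceful degradation (point 3) can be sketched as a try/catch around the cache read that falls through to the primary data source. The helper name and injected functions below are assumptions for illustration:

```typescript
// Sketch: a cache error (e.g. Redis down) must not fail the request --
// log it and fall back to the database instead.
async function getWithFallback<T>(
  cacheGet: (key: string) => Promise<T | null>,
  key: string,
  fromDatabase: () => Promise<T>
): Promise<T> {
  try {
    const cached = await cacheGet(key);
    if (cached !== null) return cached; // cache hit
  } catch (err) {
    console.error("cache unavailable, falling back to database", err);
  }
  return fromDatabase(); // cache miss or cache failure
}
```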