Advanced Caching Strategies for Next.js and Supabase Applications
Developer Guide

Master caching patterns including Redis integration, ISR optimization, SWR patterns, cache invalidation, and performance optimization for Next.js and Supabase applications at scale.

2026-03-22
30 min read

Caching is essential for building high-performance applications that can scale to millions of users. This comprehensive guide teaches you advanced caching strategies for Next.js and Supabase applications, from basic browser caching to sophisticated distributed cache architectures.

Effective caching reduces database load, improves response times, and enhances user experience. Poor caching strategies lead to stale data, cache stampedes, and performance bottlenecks that become expensive to fix at scale.

This guide covers everything from Next.js ISR optimization to Redis integration, with production-tested patterns you can implement immediately to dramatically improve your application performance.

Caching Architecture Overview

Multi-Layer Caching Strategy

Implement caching at multiple levels for optimal performance and reliability.

// lib/cache/architecture.ts
import Redis from 'ioredis'

export enum CacheLayer {
  BROWSER = 'browser',
  CDN = 'cdn', 
  APPLICATION = 'application',
  DATABASE = 'database'
}

export interface CacheConfig {
  ttl: number
  layer: CacheLayer
  tags?: string[]
  revalidateOnStale?: boolean
}

export class CacheManager {
  protected redis: Redis // protected so instrumented subclasses can reuse the connection
  private memoryCache: Map<string, { data: any, expires: number, tags: string[] }>

  constructor() {
    this.redis = new Redis(process.env.REDIS_URL!)
    this.memoryCache = new Map()
  }

  async get<T>(key: string, config: CacheConfig): Promise<T | null> {
    // Try memory cache first (fastest)
    if (config.layer === CacheLayer.APPLICATION) {
      const cached = this.memoryCache.get(key)
      if (cached && cached.expires > Date.now()) {
        return cached.data
      }
    }

    // Try Redis cache (shared across instances)
    if (config.layer === CacheLayer.DATABASE) {
      const cached = await this.redis.get(key)
      if (cached) {
        return JSON.parse(cached)
      }
    }

    return null
  }

  async set<T>(key: string, data: T, config: CacheConfig): Promise<void> {
    const expires = Date.now() + (config.ttl * 1000)

    // Store in memory cache
    if (config.layer === CacheLayer.APPLICATION) {
      this.memoryCache.set(key, {
        data,
        expires,
        tags: config.tags || []
      })
    }

    // Store in Redis
    if (config.layer === CacheLayer.DATABASE) {
      await this.redis.setex(key, config.ttl, JSON.stringify(data))
      
      // Store cache tags for invalidation
      if (config.tags) {
        for (const tag of config.tags) {
          await this.redis.sadd(`tag:${tag}`, key)
          await this.redis.expire(`tag:${tag}`, config.ttl)
        }
      }
    }
  }

  async invalidateByTag(tag: string): Promise<void> {
    // Get all keys with this tag
    const keys = await this.redis.smembers(`tag:${tag}`)
    
    if (keys.length > 0) {
      // Remove from Redis
      await this.redis.del(...keys)
      
      // Remove from memory cache
      for (const key of keys) {
        this.memoryCache.delete(key)
      }
    }

    // Clean up tag set
    await this.redis.del(`tag:${tag}`)
  }
}
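The tag index that powers `invalidateByTag` can be hard to picture from the Redis commands alone. Here is a minimal in-memory sketch of the same idea — each tag maps to the set of keys stored under it, so invalidating a tag deletes every covered entry. The `TagIndexCache` name and shape are illustrative, not part of the class above:

```typescript
type Entry = { data: unknown; expires: number }

class TagIndexCache {
  private entries = new Map<string, Entry>()
  // tag -> set of cache keys stored under that tag
  private tagIndex = new Map<string, Set<string>>()

  set(key: string, data: unknown, ttlSeconds: number, tags: string[] = []) {
    this.entries.set(key, { data, expires: Date.now() + ttlSeconds * 1000 })
    for (const tag of tags) {
      if (!this.tagIndex.has(tag)) this.tagIndex.set(tag, new Set())
      this.tagIndex.get(tag)!.add(key)
    }
  }

  get(key: string): unknown | null {
    const entry = this.entries.get(key)
    if (!entry || entry.expires <= Date.now()) return null
    return entry.data
  }

  invalidateByTag(tag: string) {
    // delete every key indexed under the tag, then drop the index itself
    for (const key of this.tagIndex.get(tag) ?? []) this.entries.delete(key)
    this.tagIndex.delete(tag)
  }
}
```

The Redis version does exactly this with `SADD tag:<t> key` on write and `SMEMBERS` + `DEL` on invalidation.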

Cache Key Strategy

Design consistent, hierarchical cache keys for easy management and invalidation.

// lib/cache/keys.ts
export class CacheKeyBuilder {
  private static separator = ':'
  private static version = 'v1'

  static user(userId: string, suffix?: string): string {
    const parts = [this.version, 'user', userId]
    if (suffix) parts.push(suffix)
    return parts.join(this.separator)
  }

  static organization(orgId: string, suffix?: string): string {
    const parts = [this.version, 'org', orgId]
    if (suffix) parts.push(suffix)
    return parts.join(this.separator)
  }

  static posts(orgId: string, filters?: {
    status?: string
    author?: string
    category?: string
    page?: number
  }): string {
    const parts = [this.version, 'org', orgId, 'posts']
    
    if (filters) {
      if (filters.status) parts.push('status', filters.status)
      if (filters.author) parts.push('author', filters.author)
      if (filters.category) parts.push('cat', filters.category)
      if (filters.page) parts.push('page', filters.page.toString())
    }
    
    return parts.join(this.separator)
  }

  static query(tableName: string, params: Record<string, any>): string {
    const sortedParams = Object.keys(params)
      .sort()
      .map(key => `${key}=${params[key]}`)
      .join('&')
    
    return [this.version, 'query', tableName, btoa(sortedParams)].join(this.separator)
  }

  static analytics(orgId: string, metric: string, period: string): string {
    return [this.version, 'analytics', orgId, metric, period].join(this.separator)
  }
}

// Usage examples
const userProfileKey = CacheKeyBuilder.user('123', 'profile')
const orgPostsKey = CacheKeyBuilder.posts('org-456', { status: 'published', page: 1 })
const analyticsKey = CacheKeyBuilder.analytics('org-456', 'pageviews', '7d')

Next.js ISR Optimization

Advanced ISR Patterns

Implement sophisticated ISR strategies for dynamic content with optimal performance.

// app/posts/[slug]/page.tsx
import { Metadata } from 'next'
import { createClient } from '@/lib/supabase/server'
import { CacheManager, CacheLayer } from '@/lib/cache/architecture'
import { CacheKeyBuilder } from '@/lib/cache/keys'

interface PostPageProps {
  params: { slug: string }
}

export async function generateMetadata({ params }: PostPageProps): Promise<Metadata> {
  const post = await getPostBySlug(params.slug)
  
  return {
    title: post?.title || 'Post Not Found',
    description: post?.excerpt,
    openGraph: {
      title: post?.title,
      description: post?.excerpt,
      images: post?.image ? [post.image] : []
    }
  }
}

export async function generateStaticParams() {
  const supabase = createClient()
  
  // Generate static params for most popular posts
  const { data: posts } = await supabase
    .from('posts')
    .select('slug')
    .eq('status', 'published')
    .order('view_count', { ascending: false })
    .limit(100) // Pre-generate top 100 posts
  
  return posts?.map(post => ({ slug: post.slug })) || []
}

async function getPostBySlug(slug: string) {
  const cache = new CacheManager()
  const cacheKey = CacheKeyBuilder.query('posts', { slug })
  
  // Try cache first
  let post = await cache.get(cacheKey, {
    ttl: 300, // 5 minutes
    layer: CacheLayer.DATABASE,
    tags: ['posts', `post:${slug}`]
  })
  
  if (!post) {
    const supabase = createClient()
    
    const { data, error } = await supabase
      .from('posts')
      .select(`
        *,
        author:users(name, avatar_url),
        organization:organizations(name, slug)
      `)
      .eq('slug', slug)
      .eq('status', 'published')
      .single()
    
    if (error || !data) return null
    
    post = data
    
    // Cache the result
    await cache.set(cacheKey, post, {
      ttl: 300,
      layer: CacheLayer.DATABASE,
      tags: ['posts', `post:${slug}`, `org:${data.organization.id}`]
    })
  }
  
  return post
}

export default async function PostPage({ params }: PostPageProps) {
  const post = await getPostBySlug(params.slug)
  
  if (!post) {
    return <div>Post not found</div>
  }
  
  return (
    <article>
      <h1>{post.title}</h1>
      <div dangerouslySetInnerHTML={{ __html: post.content }} />
    </article>
  )
}

// Enable ISR with revalidation
export const revalidate = 3600 // Revalidate every hour

Dynamic ISR with On-Demand Revalidation

Implement on-demand revalidation for immediate content updates.

// app/api/revalidate/route.ts
import { NextRequest } from 'next/server'
import { revalidatePath, revalidateTag } from 'next/cache'
import { createClient } from '@/lib/supabase/server'

export async function POST(request: NextRequest) {
  try {
    const { searchParams } = new URL(request.url)
    const secret = searchParams.get('secret')
    
    // Verify secret token
    if (secret !== process.env.REVALIDATION_SECRET) {
      return Response.json({ error: 'Invalid secret' }, { status: 401 })
    }
    
    const body = await request.json()
    const { type, slug, tags, paths } = body
    
    switch (type) {
      case 'post':
        // Revalidate specific post
        revalidatePath(`/posts/${slug}`)
        revalidateTag(`post:${slug}`)
        break
        
      case 'posts-list':
        // Revalidate posts listing pages
        revalidatePath('/posts')
        revalidateTag('posts-list')
        break
        
      case 'tags':
        // Revalidate by cache tags
        if (tags) {
          tags.forEach((tag: string) => revalidateTag(tag))
        }
        break
        
      case 'paths':
        // Revalidate specific paths
        if (paths) {
          paths.forEach((path: string) => revalidatePath(path))
        }
        break
        
      default:
        return Response.json({ error: 'Invalid revalidation type' }, { status: 400 })
    }
    
    return Response.json({ 
      revalidated: true, 
      timestamp: new Date().toISOString() 
    })
    
  } catch (error) {
    console.error('Revalidation error:', error)
    return Response.json({ error: 'Revalidation failed' }, { status: 500 })
  }
}

// Trigger revalidation from Supabase webhook
// app/api/webhooks/supabase/route.ts
export async function POST(request: NextRequest) {
  try {
    const payload = await request.json()
    // Supabase database webhooks deliver { type, table, record, old_record }
    const { table, type: eventType, record: newRecord, old_record: oldRecord } = payload
    
    if (table === 'posts') {
      const slug = newRecord?.slug || oldRecord?.slug
      
      if (eventType === 'UPDATE' || eventType === 'INSERT') {
        // Revalidate the specific post
        await fetch(`${process.env.NEXTAUTH_URL}/api/revalidate?secret=${process.env.REVALIDATION_SECRET}`, {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({
            type: 'post',
            slug: slug
          })
        })
        
        // Revalidate posts list if status changed to published
        if (newRecord?.status === 'published') {
          await fetch(`${process.env.NEXTAUTH_URL}/api/revalidate?secret=${process.env.REVALIDATION_SECRET}`, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
              type: 'posts-list'
            })
          })
        }
      }
    }
    
    return Response.json({ success: true })
  } catch (error) {
    console.error('Webhook error:', error)
    return Response.json({ error: 'Webhook failed' }, { status: 500 })
  }
}

Redis Integration Patterns

Redis Cache Implementation

Implement Redis caching with connection pooling and error handling.

// lib/cache/redis.ts
import Redis from 'ioredis'

export class RedisCache {
  protected redis: Redis // protected so AdvancedRedisCache below can issue raw commands
  protected isConnected = false

  constructor() {
    this.redis = new Redis({
      host: process.env.REDIS_HOST,
      port: parseInt(process.env.REDIS_PORT || '6379'),
      password: process.env.REDIS_PASSWORD,
      retryStrategy: (times) => Math.min(times * 100, 2000), // linear backoff, capped at 2s
      maxRetriesPerRequest: 3,
      lazyConnect: true,
      keepAlive: 30000,
      // Connection pool settings
      family: 4,
      connectTimeout: 10000,
      commandTimeout: 5000,
    })

    this.redis.on('connect', () => {
      this.isConnected = true
      console.log('Redis connected')
    })

    this.redis.on('error', (error) => {
      this.isConnected = false
      console.error('Redis error:', error)
    })
  }

  async get<T>(key: string): Promise<T | null> {
    if (!this.isConnected) return null
    
    try {
      const value = await this.redis.get(key)
      return value ? JSON.parse(value) : null
    } catch (error) {
      console.error('Redis get error:', error)
      return null
    }
  }

  async set(key: string, value: any, ttlSeconds?: number): Promise<boolean> {
    if (!this.isConnected) return false
    
    try {
      const serialized = JSON.stringify(value)
      
      if (ttlSeconds) {
        await this.redis.setex(key, ttlSeconds, serialized)
      } else {
        await this.redis.set(key, serialized)
      }
      
      return true
    } catch (error) {
      console.error('Redis set error:', error)
      return false
    }
  }

  async del(key: string | string[]): Promise<boolean> {
    if (!this.isConnected) return false
    
    try {
      await this.redis.del(Array.isArray(key) ? key : [key])
      return true
    } catch (error) {
      console.error('Redis del error:', error)
      return false
    }
  }

  async exists(key: string): Promise<boolean> {
    if (!this.isConnected) return false
    
    try {
      const result = await this.redis.exists(key)
      return result === 1
    } catch (error) {
      console.error('Redis exists error:', error)
      return false
    }
  }

  async increment(key: string, by = 1): Promise<number> {
    if (!this.isConnected) return 0
    
    try {
      return await this.redis.incrby(key, by)
    } catch (error) {
      console.error('Redis increment error:', error)
      return 0
    }
  }

  async getMultiple<T>(keys: string[]): Promise<Record<string, T | null>> {
    if (!this.isConnected || keys.length === 0) return {}
    
    try {
      const values = await this.redis.mget(...keys)
      const result: Record<string, T | null> = {}
      
      keys.forEach((key, index) => {
        const value = values[index]
        result[key] = value ? JSON.parse(value) : null
      })
      
      return result
    } catch (error) {
      console.error('Redis mget error:', error)
      return {}
    }
  }

  async setMultiple(data: Record<string, any>, ttlSeconds?: number): Promise<boolean> {
    if (!this.isConnected) return false
    
    try {
      const pipeline = this.redis.pipeline()
      
      Object.entries(data).forEach(([key, value]) => {
        const serialized = JSON.stringify(value)
        if (ttlSeconds) {
          pipeline.setex(key, ttlSeconds, serialized)
        } else {
          pipeline.set(key, serialized)
        }
      })
      
      await pipeline.exec()
      return true
    } catch (error) {
      console.error('Redis mset error:', error)
      return false
    }
  }
}

// Singleton instance
export const redisCache = new RedisCache()
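On top of this class, most call sites want the cache-aside pattern: try the cache, fall back to the data source on a miss, and write the result back. The sketch below assumes only the `get`/`set` surface shown above; `CacheLike` and `getOrSet` are hypothetical names introduced here so the pattern can be exercised without a live Redis connection:

```typescript
interface CacheLike {
  get<T>(key: string): Promise<T | null>
  set(key: string, value: unknown, ttlSeconds?: number): Promise<boolean>
}

async function getOrSet<T>(
  cache: CacheLike,
  key: string,
  ttlSeconds: number,
  fetchFn: () => Promise<T>
): Promise<T> {
  const cached = await cache.get<T>(key)
  if (cached !== null) return cached      // cache hit
  const fresh = await fetchFn()           // cache miss: hit the source
  await cache.set(key, fresh, ttlSeconds) // best-effort write-back
  return fresh
}
```

With `redisCache`, a Supabase query would be wrapped as `getOrSet(redisCache, key, 300, () => fetchPostFromSupabase(slug))`. Because `RedisCache.get` returns `null` when disconnected, the helper degrades gracefully to hitting the source directly.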

Advanced Redis Patterns

Implement sophisticated Redis patterns for complex caching scenarios.

// lib/cache/advanced-redis.ts
import { RedisCache } from './redis'

export class AdvancedRedisCache extends RedisCache {
  
  // Cache with automatic refresh
  async getWithRefresh<T>(
    key: string,
    refreshFn: () => Promise<T>,
    ttlSeconds: number,
    refreshThreshold = 0.8
  ): Promise<T> {
    const cached = await this.get<{ data: T, timestamp: number }>(key)
    
    if (cached) {
      const age = Date.now() - cached.timestamp
      const maxAge = ttlSeconds * 1000
      
      // If cache is getting stale, refresh in background
      if (age > maxAge * refreshThreshold) {
        // Don't await - refresh in background
        this.refreshCache(key, refreshFn, ttlSeconds).catch(console.error)
      }
      
      return cached.data
    }
    
    // Cache miss - fetch and cache
    const data = await refreshFn()
    await this.set(key, { data, timestamp: Date.now() }, ttlSeconds)
    return data
  }

  private async refreshCache<T>(
    key: string,
    refreshFn: () => Promise<T>,
    ttlSeconds: number
  ): Promise<void> {
    try {
      const data = await refreshFn()
      await this.set(key, { data, timestamp: Date.now() }, ttlSeconds)
    } catch (error) {
      console.error('Background cache refresh failed:', error)
    }
  }

  // Distributed locking for cache stampede prevention
  async getWithLock<T>(
    key: string,
    fetchFn: () => Promise<T>,
    ttlSeconds: number,
    lockTtlSeconds = 30
  ): Promise<T> {
    // Try to get from cache first
    const cached = await this.get<T>(key)
    if (cached) return cached

    const lockKey = `lock:${key}`
    const lockValue = Math.random().toString(36)
    
    // Try to acquire lock
    const lockAcquired = await this.redis.set(
      lockKey, 
      lockValue, 
      'EX', 
      lockTtlSeconds, 
      'NX'
    )
    
    if (lockAcquired) {
      try {
        // We have the lock - fetch and cache
        const data = await fetchFn()
        await this.set(key, data, ttlSeconds)
        return data
      } finally {
        // Release lock
        await this.releaseLock(lockKey, lockValue)
      }
    } else {
      // Lock exists - wait and retry
      await this.sleep(100 + Math.random() * 200) // Random jitter
      return this.getWithLock(key, fetchFn, ttlSeconds, lockTtlSeconds)
    }
  }

  private async releaseLock(lockKey: string, lockValue: string): Promise<void> {
    const script = `
      if redis.call("get", KEYS[1]) == ARGV[1] then
        return redis.call("del", KEYS[1])
      else
        return 0
      end
    `
    await this.redis.eval(script, 1, lockKey, lockValue)
  }

  private sleep(ms: number): Promise<void> {
    return new Promise(resolve => setTimeout(resolve, ms))
  }

  // Cache warming
  async warmCache(warmingConfig: Array<{
    key: string
    fetchFn: () => Promise<any>
    ttlSeconds: number
  }>): Promise<void> {
    const promises = warmingConfig.map(async ({ key, fetchFn, ttlSeconds }) => {
      try {
        const exists = await this.exists(key)
        if (!exists) {
          const data = await fetchFn()
          await this.set(key, data, ttlSeconds)
        }
      } catch (error) {
        console.error(`Failed to warm cache for key ${key}:`, error)
      }
    })

    await Promise.allSettled(promises)
  }
}

export const advancedRedisCache = new AdvancedRedisCache()
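One caveat on `getWithLock`: the lock-wait path retries with a fixed 100-300 ms sleep and no retry cap, so a long-held lock can recurse indefinitely. A bounded exponential backoff with full jitter is a common refinement; this small helper (illustrative, not part of the class above) computes the per-attempt delay:

```typescript
function backoffDelayMs(
  attempt: number,                     // 0-based retry attempt
  baseMs = 100,
  maxMs = 5000,
  random: () => number = Math.random   // injectable for testing
): number {
  // ceiling doubles each attempt, capped at maxMs
  const ceiling = Math.min(maxMs, baseMs * 2 ** attempt)
  // "full jitter": uniform delay in [0, ceiling)
  return Math.floor(random() * ceiling)
}
```

A retry loop would sleep `backoffDelayMs(attempt)` between lock checks and give up (or fetch uncached) after a fixed number of attempts, rather than recursing without bound.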

SWR and Client-Side Caching

SWR Configuration and Patterns

Implement SWR with Supabase for optimal client-side caching.

// lib/swr/config.ts
import useSWR, { SWRConfiguration, mutate } from 'swr'
import { createClient } from '@/lib/supabase/client'

const supabase = createClient()

// Global SWR configuration
export const swrConfig: SWRConfiguration = {
  revalidateOnFocus: false,
  revalidateOnReconnect: true,
  refreshInterval: 0,
  dedupingInterval: 2000,
  errorRetryCount: 3,
  errorRetryInterval: 5000,
  onError: (error) => {
    console.error('SWR Error:', error)
    // You can add error reporting here
  }
}

// Custom fetcher for Supabase queries
export const supabaseFetcher = async (key: string) => {
  const [table, query] = key.split('|')
  const queryParams = JSON.parse(query)
  
  let queryBuilder = supabase.from(table).select(queryParams.select || '*')
  
  // Apply filters
  if (queryParams.filters) {
    queryParams.filters.forEach((filter: any) => {
      queryBuilder = queryBuilder.filter(filter.column, filter.operator, filter.value)
    })
  }
  
  // Apply ordering
  if (queryParams.order) {
    queryBuilder = queryBuilder.order(queryParams.order.column, {
      ascending: queryParams.order.ascending
    })
  }
  
  // Apply pagination
  if (queryParams.range) {
    queryBuilder = queryBuilder.range(queryParams.range.from, queryParams.range.to)
  }
  
  const { data, error } = await queryBuilder
  
  if (error) throw error
  return data
}

// Custom hooks for common patterns
export function usePosts(orgId: string, filters?: {
  status?: string
  author?: string
  limit?: number
}) {
  const key = `posts|${JSON.stringify({
    select: '*, author:users(name, avatar_url)',
    filters: [
      { column: 'organization_id', operator: 'eq', value: orgId },
      ...(filters?.status ? [{ column: 'status', operator: 'eq', value: filters.status }] : []),
      ...(filters?.author ? [{ column: 'author_id', operator: 'eq', value: filters.author }] : [])
    ],
    order: { column: 'created_at', ascending: false },
    ...(filters?.limit ? { range: { from: 0, to: filters.limit - 1 } } : {})
  })}`
  
  return useSWR(key, supabaseFetcher, {
    ...swrConfig,
    refreshInterval: 30000 // Refresh every 30 seconds for posts
  })
}

export function useUser(userId: string) {
  const key = `users|${JSON.stringify({
    select: 'id, name, email, avatar_url, created_at',
    filters: [{ column: 'id', operator: 'eq', value: userId }]
  })}`
  
  return useSWR(userId ? key : null, supabaseFetcher, {
    ...swrConfig,
    refreshInterval: 300000 // Refresh every 5 minutes for user data
  })
}

// Optimistic updates helper
export async function optimisticUpdate<T>(
  key: string,
  updateFn: (current: T) => T,
  mutateFn: () => Promise<T>
) {
  try {
    // Optimistically update the cache
    await mutate(key, updateFn, false)
    
    // Perform the actual update
    const result = await mutateFn()
    
    // Revalidate with the real data
    await mutate(key, result, true)
    
    return result
  } catch (error) {
    // Revert on error
    await mutate(key)
    throw error
  }
}
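Note that these hooks build keys with `JSON.stringify`, whose output depends on property insertion order: two call sites passing the same filters in a different order produce different SWR keys and duplicate cache entries. A stable stringify that sorts object keys keeps keys canonical — `stableStringify` below is a hypothetical helper, sketched for that purpose:

```typescript
function stableStringify(value: unknown): string {
  if (Array.isArray(value)) {
    // arrays keep their order; only object keys get sorted
    return `[${value.map(stableStringify).join(',')}]`
  }
  if (value !== null && typeof value === 'object') {
    const obj = value as Record<string, unknown>
    const entries = Object.keys(obj)
      .sort()
      .map(k => `${JSON.stringify(k)}:${stableStringify(obj[k])}`)
    return `{${entries.join(',')}}`
  }
  return JSON.stringify(value)
}
```

Swapping `JSON.stringify` for `stableStringify` in the key construction above deduplicates cache entries without changing hook behavior.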

Real-time Cache Synchronization

Synchronize SWR cache with Supabase Realtime for live updates.

// hooks/useRealtimeSync.ts
import { useEffect } from 'react'
import { mutate } from 'swr'
import { createClient } from '@/lib/supabase/client'

export function useRealtimeSync(table: string, cacheKeys: string[]) {
  const supabase = createClient()

  useEffect(() => {
    const channel = supabase
      .channel(`${table}_changes`)
      .on('postgres_changes', 
        { 
          event: '*', 
          schema: 'public', 
          table: table 
        },
        (payload) => {
          console.log('Realtime update:', payload)
          
          // Invalidate related cache keys
          cacheKeys.forEach(key => {
            mutate(key)
          })
        }
      )
      .subscribe()

    return () => {
      supabase.removeChannel(channel)
    }
  }, [table, cacheKeys, supabase]) // pass a memoized cacheKeys array, or the channel resubscribes on every render
}

// Usage in components
export function PostsList({ orgId }: { orgId: string }) {
  const { data: posts, error, isLoading } = usePosts(orgId, { status: 'published' })
  
  // Sync with realtime updates
  useRealtimeSync('posts', [`posts|${JSON.stringify({
    select: '*, author:users(name, avatar_url)',
    filters: [
      { column: 'organization_id', operator: 'eq', value: orgId },
      { column: 'status', operator: 'eq', value: 'published' }
    ],
    order: { column: 'created_at', ascending: false }
  })}`])

  if (isLoading) return <div>Loading...</div>
  if (error) return <div>Error: {error.message}</div>

  return (
    <div>
      {posts?.map(post => (
        <div key={post.id}>
          <h3>{post.title}</h3>
          <p>By {post.author.name}</p>
        </div>
      ))}
    </div>
  )
}

The patterns above form the foundation of a layered caching strategy — in read-heavy workloads they can dramatically reduce database load while cutting response times. The remaining sections cover cache invalidation and performance monitoring, the two areas where caching strategies most often break down at scale.

Cache Invalidation Strategies

Event-Driven Invalidation

Implement sophisticated cache invalidation based on database events and business logic.

// lib/cache/invalidation.ts
import { CacheManager } from './architecture'
import { CacheKeyBuilder } from './keys'

export class CacheInvalidationManager {
  private cache = new CacheManager()

  async invalidatePost(postId: string, orgId: string) {
    const tags = [
      'posts',
      `post:${postId}`,
      `org:${orgId}:posts`,
      'posts-list'
    ]

    await Promise.all(tags.map(tag => this.cache.invalidateByTag(tag)))
  }

  async invalidateUser(userId: string) {
    const tags = [
      `user:${userId}`,
      'users-list'
    ]

    await Promise.all(tags.map(tag => this.cache.invalidateByTag(tag)))
  }

  async invalidateOrganization(orgId: string) {
    const tags = [
      `org:${orgId}`,
      `org:${orgId}:posts`,
      `org:${orgId}:users`,
      `org:${orgId}:analytics`
    ]

    await Promise.all(tags.map(tag => this.cache.invalidateByTag(tag)))
  }

  // Smart invalidation based on data relationships
  async smartInvalidate(table: string, record: any, eventType: 'INSERT' | 'UPDATE' | 'DELETE') {
    switch (table) {
      case 'posts':
        await this.handlePostInvalidation(record, eventType)
        break
      case 'users':
        await this.handleUserInvalidation(record, eventType)
        break
      case 'organizations':
        await this.handleOrganizationInvalidation(record, eventType)
        break
    }
  }

  private async handlePostInvalidation(post: any, eventType: string) {
    // Always invalidate the specific post
    await this.invalidatePost(post.id, post.organization_id)

    // If status changed to/from published, invalidate lists
    if (eventType === 'UPDATE' && post.status === 'published') {
      await this.cache.invalidateByTag('published-posts')
    }

    // If author changed, invalidate author's posts
    if (post.author_id) {
      await this.cache.invalidateByTag(`author:${post.author_id}:posts`)
    }
  }

  private async handleUserInvalidation(user: any, eventType: string) {
    await this.invalidateUser(user.id)

    // If user was deleted, invalidate all their content
    if (eventType === 'DELETE') {
      await this.cache.invalidateByTag(`author:${user.id}`)
    }
  }

  private async handleOrganizationInvalidation(org: any, eventType: string) {
    await this.invalidateOrganization(org.id)

    // If organization was deleted, clean up all related cache
    if (eventType === 'DELETE') {
      await this.cache.invalidateByTag(`org:${org.id}`)
    }
  }
}

Pair the invalidation manager with Postgres triggers that broadcast every change over pg_notify:

-- Create function to handle cache invalidation
CREATE OR REPLACE FUNCTION notify_cache_invalidation()
RETURNS TRIGGER AS $$
BEGIN
  -- Send notification with table name and record data
  PERFORM pg_notify(
    'cache_invalidation',
    json_build_object(
      'table', TG_TABLE_NAME,
      'operation', TG_OP,
      'new', row_to_json(NEW),
      'old', row_to_json(OLD)
    )::text
  );
  
  RETURN COALESCE(NEW, OLD);
END;
$$ LANGUAGE plpgsql;

-- Create triggers for cache invalidation
CREATE TRIGGER posts_cache_invalidation
  AFTER INSERT OR UPDATE OR DELETE ON posts
  FOR EACH ROW EXECUTE FUNCTION notify_cache_invalidation();

CREATE TRIGGER users_cache_invalidation
  AFTER INSERT OR UPDATE OR DELETE ON users
  FOR EACH ROW EXECUTE FUNCTION notify_cache_invalidation();

CREATE TRIGGER organizations_cache_invalidation
  AFTER INSERT OR UPDATE OR DELETE ON organizations
  FOR EACH ROW EXECUTE FUNCTION notify_cache_invalidation();
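On the application side, a listener on the `cache_invalidation` channel (for example via the `pg` library's `LISTEN` support) can feed these notifications into `CacheInvalidationManager.smartInvalidate`. The routing step itself is pure and easy to test; the sketch below mirrors the field names produced by the `json_build_object` call in the trigger (`tagsForNotification` is a hypothetical helper, and the tag names assume the conventions used earlier):

```typescript
interface InvalidationPayload {
  table: string
  operation: 'INSERT' | 'UPDATE' | 'DELETE'
  new: Record<string, any> | null  // row_to_json(NEW); null on DELETE
  old: Record<string, any> | null  // row_to_json(OLD); null on INSERT
}

function tagsForNotification(raw: string): string[] {
  const payload = JSON.parse(raw) as InvalidationPayload
  // DELETE events only carry the old row
  const record = payload.new ?? payload.old
  if (!record) return []
  switch (payload.table) {
    case 'posts':
      return ['posts', `post:${record.id}`, `org:${record.organization_id}:posts`]
    case 'users':
      return [`user:${record.id}`]
    case 'organizations':
      return [`org:${record.id}`]
    default:
      return []
  }
}
```

The listener would then call `cache.invalidateByTag(tag)` for each returned tag.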

Time-Based Invalidation

Implement time-based cache expiration with graceful degradation.

// lib/cache/time-based.ts
import { CacheManager, CacheLayer } from './architecture'

export class TimeBasedCache {
  private cache = new CacheManager()

  async getWithTTL<T>(
    key: string,
    fetchFn: () => Promise<T>,
    ttlSeconds: number,
    staleWhileRevalidate = true
  ): Promise<T> {
    const cached = await this.cache.get<{
      data: T
      timestamp: number
      ttl: number
    }>(key, { ttl: ttlSeconds, layer: CacheLayer.DATABASE })

    if (cached) {
      const age = Date.now() - cached.timestamp
      const isExpired = age > cached.ttl * 1000

      if (!isExpired) {
        return cached.data
      }

      if (staleWhileRevalidate) {
        // Return stale data while revalidating in background
        this.revalidateInBackground(key, fetchFn, ttlSeconds)
        return cached.data
      }
    }

    // Cache miss or expired without SWR - fetch fresh data
    const data = await fetchFn()
    await this.setCachedData(key, data, ttlSeconds)
    return data
  }

  private async revalidateInBackground<T>(
    key: string,
    fetchFn: () => Promise<T>,
    ttlSeconds: number
  ): Promise<void> {
    try {
      const data = await fetchFn()
      await this.setCachedData(key, data, ttlSeconds)
    } catch (error) {
      console.error('Background revalidation failed:', error)
    }
  }

  private async setCachedData<T>(
    key: string,
    data: T,
    ttlSeconds: number
  ): Promise<void> {
    await this.cache.set(key, {
      data,
      timestamp: Date.now(),
      ttl: ttlSeconds
    }, {
      ttl: ttlSeconds * 2, // Store longer than TTL for SWR
      layer: CacheLayer.DATABASE
    })
  }

  // Hierarchical TTL - different TTL for different data types
  getTTLForDataType(dataType: string): number {
    const ttlMap: Record<string, number> = {
      'user-profile': 300,      // 5 minutes
      'posts-list': 60,         // 1 minute
      'post-content': 3600,     // 1 hour
      'analytics': 1800,        // 30 minutes
      'settings': 86400,        // 24 hours
      'static-content': 604800  // 1 week
    }

    return ttlMap[dataType] || 300 // Default 5 minutes
  }
}
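The freshness decision inside `getWithTTL` — fresh, stale-but-servable, or fully expired — reduces to a pure predicate over timestamps. Factoring it out (as a hypothetical `classifyEntry` helper) makes the stale-while-revalidate window easy to unit test:

```typescript
type Freshness = 'fresh' | 'stale' | 'expired'

function classifyEntry(
  timestamp: number,        // when the entry was written (ms epoch)
  ttlSeconds: number,       // logical TTL
  storedForSeconds: number, // how long the entry physically lives (e.g. 2x TTL)
  now: number
): Freshness {
  const ageMs = now - timestamp
  if (ageMs <= ttlSeconds * 1000) return 'fresh'          // serve directly
  if (ageMs <= storedForSeconds * 1000) return 'stale'    // serve + revalidate in background
  return 'expired'                                        // must fetch before responding
}
```

This matches `setCachedData` above, which stores entries for twice their logical TTL precisely so the `'stale'` window exists.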

Performance Monitoring and Optimization

Cache Performance Metrics

Monitor cache performance to optimize hit rates and identify bottlenecks.

// lib/monitoring/cache-metrics.ts
import Redis from 'ioredis'
import { CacheManager, CacheConfig, CacheLayer } from '@/lib/cache/architecture'

export class CacheMetrics {
  private redis = new Redis(process.env.REDIS_URL!)

  async recordCacheHit(key: string, layer: CacheLayer) {
    const date = new Date().toISOString().split('T')[0]
    await Promise.all([
      this.redis.incr(`cache:hits:${layer}:${date}`),
      this.redis.incr(`cache:hits:total:${date}`),
      this.redis.sadd(`cache:keys:${date}`, key)
    ])
  }

  async recordCacheMiss(key: string, layer: CacheLayer) {
    const date = new Date().toISOString().split('T')[0]
    await Promise.all([
      this.redis.incr(`cache:misses:${layer}:${date}`),
      this.redis.incr(`cache:misses:total:${date}`),
      this.redis.sadd(`cache:missed-keys:${date}`, key)
    ])
  }

  async getCacheStats(days = 7): Promise<{
    hitRate: number
    hitsByLayer: Record<CacheLayer, number>
    missedKeys: string[]
    totalRequests: number
  }> {
    const dates = Array.from({ length: days }, (_, i) => {
      const date = new Date()
      date.setDate(date.getDate() - i)
      return date.toISOString().split('T')[0]
    })

    const pipeline = this.redis.pipeline()
    
    dates.forEach(date => {
      pipeline.get(`cache:hits:total:${date}`)
      pipeline.get(`cache:misses:total:${date}`)
      Object.values(CacheLayer).forEach(layer => {
        pipeline.get(`cache:hits:${layer}:${date}`)
      })
      pipeline.smembers(`cache:missed-keys:${date}`)
    })

    const results = await pipeline.exec()
    
    // Process results and calculate metrics
    let totalHits = 0
    let totalMisses = 0
    const hitsByLayer: Record<CacheLayer, number> = {} as any
    const missedKeys = new Set<string>()

    // Parse pipeline results
    let resultIndex = 0
    dates.forEach(() => {
      totalHits += parseInt(results![resultIndex++]?.[1] as string || '0')
      totalMisses += parseInt(results![resultIndex++]?.[1] as string || '0')
      
      Object.values(CacheLayer).forEach(layer => {
        const hits = parseInt(results![resultIndex++]?.[1] as string || '0')
        hitsByLayer[layer] = (hitsByLayer[layer] || 0) + hits
      })
      
      const dayMissedKeys = results![resultIndex++]?.[1] as string[] || []
      dayMissedKeys.forEach(key => missedKeys.add(key))
    })

    const totalRequests = totalHits + totalMisses
    const hitRate = totalRequests > 0 ? totalHits / totalRequests : 0

    return {
      hitRate,
      hitsByLayer,
      missedKeys: Array.from(missedKeys),
      totalRequests
    }
  }

  async getSlowCacheOperations(minDurationMs = 100): Promise<Array<{
    key: string
    operation: string
    duration: number
    timestamp: string
  }>> {
    // This would require custom instrumentation in your cache operations
    const data = await this.redis.zrevrangebyscore(
      'cache:slow-operations',
      '+inf',
      minDurationMs,
      'WITHSCORES',
      'LIMIT', 0, 100
    ) // returns a flat array alternating member, score

    const operations = []
    for (let i = 0; i < data.length; i += 2) {
      const [key, operation, timestamp] = data[i].split('|')
      operations.push({
        key,
        operation,
        duration: parseFloat(data[i + 1]),
        timestamp
      })
    }

    return operations
  }
}
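The result-walking logic in getCacheMetrics is easy to get subtly wrong, because the pipeline returns everything as one flat list. Here is the same parsing pattern in isolation, using mocked results shaped like ioredis's [error, value] tuples; parseDay is a hypothetical helper for illustration, not part of the class above:

```typescript
// Parse a flat list of [error, value] tuples the way getCacheMetrics does:
// for each date, the pipeline returns total hits, total misses,
// then one hit count per cache layer, in a fixed order.
type PipelineResult = [Error | null, unknown]

function parseDay(results: PipelineResult[], layers: string[]) {
  let i = 0
  const hits = parseInt((results[i++]?.[1] as string) || '0', 10)
  const misses = parseInt((results[i++]?.[1] as string) || '0', 10)
  const byLayer: Record<string, number> = {}
  for (const layer of layers) {
    byLayer[layer] = parseInt((results[i++]?.[1] as string) || '0', 10)
  }
  return { hits, misses, byLayer }
}

// Mocked results for one day: 80 hits, 20 misses, split across two layers
const day = parseDay(
  [[null, '80'], [null, '20'], [null, '50'], [null, '30']],
  ['application', 'database']
)
console.log(day.hits, day.misses, day.byLayer.application) // 80 20 50
```

Keeping the per-date field order identical between the pipeline builder and the parser is the invariant that makes the running index safe.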

// Instrumented cache wrapper
export class InstrumentedCache extends CacheManager {
  private metrics = new CacheMetrics()

  async get<T>(key: string, config: CacheConfig): Promise<T | null> {
    const startTime = Date.now()
    
    try {
      const result = await super.get<T>(key, config)
      const duration = Date.now() - startTime
      
      if (result !== null) {
        await this.metrics.recordCacheHit(key, config.layer)
      } else {
        await this.metrics.recordCacheMiss(key, config.layer)
      }
      
      // Record slow operations
      if (duration > 100) {
        await this.recordSlowOperation(key, 'get', duration)
      }
      
      return result
    } catch (error) {
      await this.metrics.recordCacheMiss(key, config.layer)
      throw error
    }
  }

  private async recordSlowOperation(key: string, operation: string, duration: number) {
    // Assumes `redis` is declared protected (not private) in CacheManager,
    // so the subclass can reach it directly
    const timestamp = new Date().toISOString()
    const value = `${key}|${operation}|${timestamp}`

    // Score by duration, then trim so only the 1,000 slowest operations are kept
    await this.redis.zadd('cache:slow-operations', duration, value)
    await this.redis.zremrangebyrank('cache:slow-operations', 0, -1001)
  }
}
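Before wiring metrics through Redis, it helps to see the hit-rate accounting on its own. The sketch below uses a hypothetical in-memory SimpleMetrics (an assumption for illustration, not part of the classes above) to show the same recording logic that InstrumentedCache delegates to CacheMetrics:

```typescript
// Minimal in-memory sketch of the hit/miss accounting behind CacheMetrics.
// The real implementation persists these counters to Redis with daily keys.
type Layer = 'browser' | 'cdn' | 'application' | 'database'

class SimpleMetrics {
  private hits = 0
  private misses = 0
  private hitsByLayer: Partial<Record<Layer, number>> = {}

  recordHit(layer: Layer): void {
    this.hits++
    this.hitsByLayer[layer] = (this.hitsByLayer[layer] ?? 0) + 1
  }

  recordMiss(): void {
    this.misses++
  }

  hitRate(): number {
    const total = this.hits + this.misses
    // Guard against division by zero before any requests are recorded
    return total > 0 ? this.hits / total : 0
  }
}

const metrics = new SimpleMetrics()
metrics.recordHit('application')
metrics.recordHit('database')
metrics.recordMiss()
console.log(metrics.hitRate()) // 2 hits / 3 requests ≈ 0.667
```

A hit rate below roughly 0.8 on hot keys is usually the first signal that TTLs are too short or that invalidation is firing too broadly.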

Cache Warming Strategies#

Implement intelligent cache warming to improve application startup performance.

// lib/cache/warming.ts
export class CacheWarming {
  private cache = new InstrumentedCache()

  async warmCriticalData(orgId: string): Promise<void> {
    const warmingTasks = [
      this.warmUserProfiles(orgId),
      this.warmPopularPosts(orgId),
      this.warmOrganizationSettings(orgId),
      this.warmAnalyticsSummary(orgId)
    ]

    await Promise.allSettled(warmingTasks)
  }

  private async warmUserProfiles(orgId: string): Promise<void> {
    const supabase = createClient()
    
    const { data: users } = await supabase
      .from('user_organizations')
      .select('user_id, users(id, name, email, avatar_url)')
      .eq('organization_id', orgId)
      .limit(50) // Warm up to 50 member profiles; order by recent activity in production

    if (users) {
      const warmingPromises = users.map(async ({ users: user }) => {
        if (user) {
          const key = CacheKeyBuilder.user(user.id, 'profile')
          await this.cache.set(key, user, {
            ttl: 300,
            layer: CacheLayer.DATABASE,
            tags: [`user:${user.id}`]
          })
        }
      })

      await Promise.allSettled(warmingPromises)
    }
  }

  private async warmPopularPosts(orgId: string): Promise<void> {
    const supabase = createClient()
    
    const { data: posts } = await supabase
      .from('posts')
      .select('*')
      .eq('organization_id', orgId)
      .eq('status', 'published')
      .order('view_count', { ascending: false })
      .limit(20)

    if (posts) {
      const warmingPromises = posts.map(async (post) => {
        const key = CacheKeyBuilder.query('posts', { id: post.id })
        await this.cache.set(key, post, {
          ttl: 3600,
          layer: CacheLayer.DATABASE,
          tags: ['posts', `post:${post.id}`, `org:${orgId}:posts`]
        })
      })

      await Promise.allSettled(warmingPromises)
    }
  }

  private async warmOrganizationSettings(orgId: string): Promise<void> {
    const supabase = createClient()
    
    const { data: org } = await supabase
      .from('organizations')
      .select('*')
      .eq('id', orgId)
      .single()

    if (org) {
      const key = CacheKeyBuilder.organization(orgId, 'settings')
      await this.cache.set(key, org, {
        ttl: 86400, // 24 hours
        layer: CacheLayer.DATABASE,
        tags: [`org:${orgId}`]
      })
    }
  }

  private async warmAnalyticsSummary(orgId: string): Promise<void> {
    const key = CacheKeyBuilder.analytics(orgId, 'summary', '7d')
    
    // Check if already cached
    const cached = await this.cache.get(key, {
      ttl: 1800,
      layer: CacheLayer.DATABASE
    })

    if (!cached) {
      // Generate analytics summary
      const summary = await this.generateAnalyticsSummary(orgId)
      await this.cache.set(key, summary, {
        ttl: 1800, // 30 minutes
        layer: CacheLayer.DATABASE,
        tags: [`org:${orgId}:analytics`]
      })
    }
  }

  private async generateAnalyticsSummary(orgId: string) {
    const supabase = createClient()
    
    const { data } = await supabase
      .rpc('get_analytics_summary', {
        org_id: orgId,
        days_back: 7
      })

    return data
  }

  // Scheduled cache warming
  async scheduleWarming(): Promise<void> {
    // This would typically be called by a cron job or scheduled task
    const supabase = createClient()
    
    const { data: organizations } = await supabase
      .from('organizations')
      .select('id')
      .eq('status', 'active')

    if (organizations) {
      // Warm cache for active organizations in batches
      const batchSize = 5
      for (let i = 0; i < organizations.length; i += batchSize) {
        const batch = organizations.slice(i, i + batchSize)
        const warmingPromises = batch.map(org => this.warmCriticalData(org.id))
        await Promise.allSettled(warmingPromises)
        
        // Small delay between batches to avoid overwhelming the system
        await new Promise(resolve => setTimeout(resolve, 1000))
      }
    }
  }
}
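The batch loop in scheduleWarming generalizes to a small utility. processInBatches below is a hypothetical helper that mirrors that loop, shown with a stubbed worker so the batching behavior can be exercised without a database:

```typescript
// Generic batch processor mirroring the loop in scheduleWarming:
// run `worker` over `items` in groups of `batchSize`, pausing between groups.
async function processInBatches<T>(
  items: T[],
  batchSize: number,
  worker: (item: T) => Promise<void>,
  delayMs = 1000
): Promise<void> {
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize)
    // allSettled so one failed warm-up doesn't abort the rest of the batch
    await Promise.allSettled(batch.map(worker))
    // Delay only between batches, not after the final one
    if (i + batchSize < items.length) {
      await new Promise(resolve => setTimeout(resolve, delayMs))
    }
  }
}

// Example with a stubbed worker and no inter-batch delay
const processed: string[] = []
await processInBatches(['a', 'b', 'c', 'd', 'e', 'f', 'g'], 3,
  async id => { processed.push(id) }, 0)
console.log(processed.length) // 7
```

In production this would be called as something like `processInBatches(organizations, 5, org => warming.warmCriticalData(org.id))`, typically from a cron-triggered route handler.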

This guide has covered enterprise-grade caching strategies that scale to millions of users while preserving data consistency: multi-layer cache management, invalidation, monitoring, and warming. Applied together, these patterns let your application serve most requests without touching the database, keeping response times low at every layer of your stack.

Remember that caching is a balance between performance and data freshness. Start with simple strategies and introduce more sophisticated patterns as your application scales and its requirements grow.
