Redis and Rails 8: The Complete Guide for Junior Developers

Introduction: Why Redis will change the way you develop


Are you starting with Rails and hearing about Redis everywhere? In discussions about performance, caching, user sessions... Redis seems to be the magical solution to all problems. But what exactly is it? And above all, how do you use it concretely in your Rails applications?


In this article, we’ll explore Redis from A to Z: what it is, why it’s so fast, how it compares to traditional databases, and above all how to integrate it easily into Rails 8. Get ready to discover a tool that will become indispensable in your developer toolbox!


Some aspects may seem a little complex to understand. That’s normal! This touches on optimization challenges for applications that must handle thousands, even millions of users. Don’t panic :)

What is Redis exactly?

The simple definition

Redis (Remote Dictionary Server) is a key-value database that stores all its data in memory (RAM). Think of it as a giant ultra-fast dictionary where you can store and retrieve information instantly.


Created in 2009 by Salvatore Sanfilippo, Redis is today one of the most popular NoSQL databases in the world. More than 30,000 companies use it, including British Airways, HackerRank and MGM Resorts. In 2025, Redis 8.2 is the latest major version available, delivering impressive performance improvements: up to 91% faster than Redis 7.2!
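
To make the "giant dictionary" idea concrete, here is what the very first commands look like in redis-cli (assuming a Redis server is running locally; the key name is just an example):

# Store a value under a key, then read it back
redis-cli SET greeting "Hello Redis"
# OK
redis-cli GET greeting
# "Hello Redis"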

The key concept: RAM

The fundamental difference between Redis and a traditional database like PostgreSQL or MySQL? Where data is stored.

  1. Traditional database: stores data on disk (HDD or SSD)
  2. Redis: stores data in RAM


This difference may seem minor, but it’s enormous in terms of performance:

  1. RAM access: 100 nanoseconds
  2. SSD access: 100 microseconds (1,000 times slower!)


You might say that 100 microseconds still doesn't sound like much. Consider the following analogy: imagine you're looking for a book in your library. With a traditional database, you have to walk to another room to find it. With Redis, the book is already in your hands.

Redis data structures

Redis isn’t just a simple key-value store. It offers advanced data structures that make it extremely versatile. It’s a bit like having several types of containers to store different kinds of objects.


Below, I’ll show you some examples of how data is structured in Redis.

1. Strings

The most basic structure. A key points to a text (or binary) value.


Example usage:

# Store a view counter
REDIS.set("article:123:views", 42)
REDIS.get("article:123:views")
# => "42"
# Increment directly in Redis
REDIS.incr("article:123:views")
# => 43


Perfect use cases:

  1. Counters (article views, likes)
  2. HTML fragment cache
  3. Temporary session tokens
  4. Configuration values


Capacity: up to 512 MB per value!
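
Since temporary session tokens appear in the use cases above, note that a string can be given a lifetime at the moment it is set. A minimal sketch (the key name and TTL are just examples):

# Store a password-reset token that disappears after 15 minutes
REDIS.set("password_reset:#{user.id}", SecureRandom.hex(20), ex: 15.minutes.to_i)
# Check how many seconds it has left to live
REDIS.ttl("password_reset:#{user.id}")
# => 900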

2. Hashes

Imagine a Ruby hash stored in Redis. It’s perfect for representing objects with several attributes.


Example usage:

# Store user information
REDIS.hset("user:1", "name", "Alice")
REDIS.hset("user:1", "email", "alice@example.com")
REDIS.hset("user:1", "age", "28")
# Retrieve all data
REDIS.hgetall("user:1")
# => {"name"=>"Alice", "email"=>"alice@example.com", "age"=>"28"}
# Retrieve a specific field
REDIS.hget("user:1", "email")
# => "alice@example.com"


Perfect use cases:

  1. User profiles
  2. Object configurations
  3. Multi-tenant metrics
  4. Lightweight structured data


Advantage: very memory-efficient for storing many small objects
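
For the multi-tenant metrics case, hash fields can also be incremented atomically with HINCRBY. A quick sketch (the key and field names are illustrative):

# One hash per tenant, one field per metric
REDIS.hincrby("tenant:#{tenant_id}:metrics", "api_calls", 1)
REDIS.hincrby("tenant:#{tenant_id}:metrics", "emails_sent", 3)
REDIS.hgetall("tenant:#{tenant_id}:metrics")
# => {"api_calls"=>"1", "emails_sent"=>"3"}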

3. Lists

Ordered lists of elements. Think of a Ruby array, but in Redis. Perfect for queues!


Example usage:

# Add tasks to a queue
REDIS.lpush("jobs:queue", "send_email")
REDIS.lpush("jobs:queue", "process_payment")
REDIS.lpush("jobs:queue", "generate_report")
# Process the tasks (FIFO: the oldest task comes out first)
REDIS.rpop("jobs:queue")
# => "send_email"
# Peek at 5 items from the head of the list (the most recently pushed) without removing them
REDIS.lrange("jobs:queue", 0, 4)


Perfect use cases:

  1. Message queues
  2. Activity timelines (news feed)
  3. History of recent actions
  4. Notification system


Important note: operations at the head and tail of the list are O(1) — ultra-fast!
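
In practice, a worker usually waits for the next job with the blocking BRPOP command rather than polling RPOP in a loop. A minimal sketch using the same queue:

# Block for up to 5 seconds waiting for a job, then get [queue_name, value]
queue, job = REDIS.brpop("jobs:queue", timeout: 5)
# => ["jobs:queue", "send_email"] (or nil if nothing arrived in time)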

4. Sets

Collections of unique unordered elements. Exactly like a Set in Ruby.


Example usage:

# Article tags
REDIS.sadd("article:123:tags", "ruby", "rails", "redis")
REDIS.sadd("article:123:tags", "ruby") # Will not be added twice
# Check membership
REDIS.sismember("article:123:tags", "ruby")
# => true
# Set operations
REDIS.sadd("article:456:tags", "ruby", "javascript")
REDIS.sinter("article:123:tags", "article:456:tags")
# => ["ruby"] # Intersection


Perfect use cases:

  1. Tag systems
  2. Unique friend/follower lists
  3. Tracking of unique IP addresses
  4. Simple many-to-many relationships


Bonus: set operations (union, intersection, difference) in O(N)

5. Sorted Sets

The most powerful structure! Unique elements with a score that determines the order.


Example usage:

# Player ranking with their scores
REDIS.zadd("leaderboard", 1500, "player:alice")
REDIS.zadd("leaderboard", 2200, "player:bob")
REDIS.zadd("leaderboard", 1800, "player:charlie")
# Top 3 best players
REDIS.zrevrange("leaderboard", 0, 2, with_scores: true)
# => [["player:bob", 2200.0], ["player:charlie", 1800.0], ["player:alice", 1500.0]]
# Ranking of a specific player
REDIS.zrevrank("leaderboard", "player:alice")
# => 2 # Third position


Perfect use cases:

  1. Leaderboards (rankings)
  2. Priority queues
  3. Rate limiting
  4. Time-ordered data

Performance: all basic operations in O(log N)

Redis vs Traditional Databases: The Match


Now that you know Redis, let’s compare it with PostgreSQL or MySQL to understand when to use one or the other.

Comparison Table

| Criterion | Redis | PostgreSQL/MySQL |
|---|---|---|
| Storage | In memory (RAM) | On disk (HDD/SSD) |
| Read speed | Sub-millisecond | Milliseconds to seconds |
| Data model | Key-value + structures | Relational (tables) |
| Complex queries | Limited | Full SQL, JOINs |
| ACID transactions (*) | Partial | Complete |
| Persistence | Optional | Native |
| Capacity | Limited by RAM | Limited by disk |
| Cost | RAM => more expensive | Disk => cheaper |

(*) A quick look at ACID transactions: what are they?

An ACID transaction is a database operation that respects four essential properties: Atomicity, Consistency, Isolation and Durability. These properties guarantee reliability and data integrity, even in case of error or failure.


🔹 Atomicity

  1. All or nothing: a transaction is indivisible. Either all operations it contains are executed, or none are.
  2. Example: during a bank transfer, if the debit from account A fails, the credit to account B should not occur either.

🔹 Consistency

  1. The database moves from a valid state to another valid state after the transaction.
  2. Integrity rules (constraints, relationships, etc.) must always be respected.
  3. Example: an inventory should never contain a negative number of items after an update.

🔹 Isolation

  1. Concurrent transactions must not interfere with each other.
  2. Each transaction should execute as if it were the only one on the system.
  3. This avoids side effects like reading data that has not been committed by another transaction.

🔹 Durability

  1. Once a transaction is committed, its effects are permanent, even in a system crash.
  2. The modified data is written to durable storage (disk).


Redis offers only partial ACID support: it guarantees atomicity and a form of isolation, but neither consistency nor durability out of the box.


✅ Atomicity

  1. Redis guarantees that the commands of a transaction (MULTI/EXEC) are executed all or nothing, as sketched below.
  2. If an error occurs before execution, nothing is applied.
  3. But beware: Redis does not roll back the other commands if one of them fails during execution.
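
Concretely, a Redis transaction is opened with MULTI and applied with EXEC. From Ruby, a minimal sketch (reusing the article keys from earlier; user_id is illustrative):

# Both commands are queued, then applied together when the block ends
REDIS.multi do |transaction|
  transaction.incr("article:123:views")
  transaction.sadd("article:123:unique_visitors", user_id)
end
# If the connection drops before EXEC, neither command is applied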

⚠️ Consistency

  1. Redis does not automatically enforce integrity constraints like relational databases.
  2. Consistency therefore depends on the application logic, not the Redis engine itself.
  3. In a relational database like PostgreSQL, consistency is enforced by the schema and its constraints, which every insert must satisfy.

✅ Isolation

  1. Redis executes the commands of a transaction sequentially and without interruption by other clients.
  2. This guarantees simple isolation, but not as fine as the isolation levels of relational DBMS.

⚠️ Durability

  1. Redis is an in-memory database, so data lives in RAM. In short, if there’s a crash your RAM is emptied and you lose everything.
  2. There are nevertheless modes that can improve durability (see the configuration sketch after this list), but of course they come at a cost:
  3. RDB (snapshot): periodic backups, risk of loss between snapshots.
  4. AOF (append-only file): better durability, but can be disabled or configured with delays.
  5. In case of a crash, some data may be lost if it hasn’t been written to disk.
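
For reference, the two persistence modes are enabled in redis.conf. A minimal sketch (the values are illustrative; tune them to your tolerance for data loss):

# RDB: snapshot to disk if at least 1 key changed in the last 60 seconds
save 60 1
# AOF: log every write, flush it to disk once per second
appendonly yes
appendfsync everysec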

When to use a traditional database?

✅ PostgreSQL/MySQL is perfect for:

  1. Persistent data: Everything that must survive a restart
  2. Complex relationships: JOINs, foreign keys, constraints
  3. Analytical queries: Aggregations, GROUP BY, complex calculations
  4. Large data: Several terabytes of data
  5. Data integrity: Complete ACID transactions
  6. Complete history: You never want to lose this data

The hybrid approach (the best!)

In reality, you will use both:


# PostgreSQL for core data
user = User.find(params[:id])
# Redis for caching
@posts = Rails.cache.fetch("user:#{user.id}:posts", expires_in: 5.minutes) do
  user.posts.published.includes(:comments).to_a
end
# Redis for real-time counters
@views_count = REDIS.get("user:#{user.id}:profile_views").to_i
REDIS.incr("user:#{user.id}:profile_views")


General principle:

  1. PostgreSQL = Source of truth, permanent data
  2. Redis = Speed layer, temporary or derived data

Examples of using Redis

Here are several concrete examples of using Redis.

  1. Caching: Temporarily store data that is expensive to compute
# Cache a complex query to avoid overloading PostgreSQL
Rails.cache.fetch("user:#{user.id}:dashboard", expires_in: 15.minutes) do
  # This expensive query only runs if the cache is empty
  {
    posts: user.posts.includes(:comments).recent.to_a,
    stats: user.calculate_stats,
    notifications: user.notifications.unread.to_a
  }
end

Why? It avoids hammering PostgreSQL with the same queries. The result is in RAM, instantly retrievable.


  1. User sessions: Ephemeral data that must be ultra-fast
# Session stored in Redis instead of cookies (or PostgreSQL)
# Note: the :redis_store session store comes from a gem (e.g. redis-actionpack / redis-rails)
config.session_store :redis_store,
  servers: ["redis://localhost:6379/0/session"],
  expire_after: 90.minutes

Why? Redis can handle thousands of session accesses per second without flinching. PostgreSQL would slow down with this ultra-frequent access pattern. Bonus: you can invalidate sessions instantly (logout everywhere).


  1. Real-time counters with delayed synchronization: Likes, views, live statistics
# Hybrid pattern: Redis for speed, PostgreSQL for persistence
# 1. Increment in Redis (instant, no lock on PostgreSQL)
REDIS.incr("article:#{@article.id}:views")
# 2. Background job that syncs to PostgreSQL every 5 minutes
class SyncViewCountsJob < ApplicationJob
  def perform
    Article.find_each do |article|
      redis_count = REDIS.get("article:#{article.id}:views").to_i
      if redis_count > 0
        article.increment!(:view_count, redis_count)
        REDIS.del("article:#{article.id}:views")
      end
    end
  end
end

Why? Increment operations are atomic in Redis (no race conditions) and do not create any lock on PostgreSQL. You can absorb 10,000 increments per second without slowing down your main DB. PostgreSQL will keep only the final value.


  1. Job queues: With Sidekiq, for example
# Sidekiq uses Redis Lists as a job queue
SendEmailJob.perform_later(user_id: @user.id)
# In Redis, this creates:
# LPUSH "queue:default" "{class: 'SendEmailJob', args: [123]}"
# RPOP "queue:default" # The worker fetches the job

Why? Redis Lists offer ultra-fast LPUSH/RPOP atomic operations. PostgreSQL cannot beat this performance for high-frequency queues. Redis can handle millions of jobs per hour.


  1. Leaderboards and rankings: Perfect with Sorted Sets
# Redis for TOP N only (e.g., top 1000)
# We do NOT keep all players in memory!
# Score update
def update_leaderboard(player_id, score)
  REDIS.zadd("game:leaderboard", score, player_id)
  # IMPORTANT: Keep only the top 1000 to avoid memory overload
  current_size = REDIS.zcard("game:leaderboard")
  if current_size > 1000
    REDIS.zremrangebyrank("game:leaderboard", 0, current_size - 1001)
  end
end
# Instantly display the top 10 players
top_10_ids = REDIS.zrevrange("game:leaderboard", 0, 9, with_scores: true)
# For a player not in the top 1000
def player_rank(player_id)
  redis_rank = REDIS.zrevrank("game:leaderboard", player_id)
  if redis_rank.nil?
    # Not in Redis top 1000, fallback to PostgreSQL
    Player.where("score > ?", Player.find(player_id).score).count + 1
  else
    redis_rank + 1 # Redis uses 0-based indexing
  end
end
# PostgreSQL stores ALL scores for history and global rankings
class Player < ApplicationRecord
  # Index on score for ranking queries
  # add_index :players, :score
end

Why? Redis Sorted Sets keep order automatically in O(log N) for the top N (1000 players). Perfect for showing leaders. For players outside the top, PostgreSQL takes over.


Important trade-off:

  1. ✅ Top 1000 in Redis: instant access, ultra-fast updates
  2. ✅ All players in PostgreSQL: full ranking available
  3. ⚠️ If you have 10 million players and put everything in Redis, you’ll consume a lot of RAM (avoid!).
  4. 💡 General rule: Redis for the "hot data" (top/active), PostgreSQL for the "cold data" (historic/complete)


  1. Real-time pub/sub: For ActionCable and WebSockets
# Redis as backend for ActionCable
# Enables communication between Rails servers in a multi-instance deployment
config.action_cable.cable = {
  adapter: 'redis',
  url: ENV['REDIS_URL']
}
# When a user posts a message:
ActionCable.server.broadcast(
  "chat:#{room.id}",
  message: message.as_json
)
# Redis instantly distributes it to all connected servers

Why? Redis Pub/Sub is non-blocking and can handle thousands of messages per second. PostgreSQL LISTEN/NOTIFY exists but isn’t designed for this throughput. Redis also enables horizontal scalability for your app (multiple Rails servers sharing the same Redis).
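
Note that the conventional place for this setting is config/cable.yml rather than an initializer; the equivalent YAML looks like this (the channel_prefix is just an example app name):

# config/cable.yml
production:
  adapter: redis
  url: <%= ENV.fetch("REDIS_URL") { "redis://localhost:6379/1" } %>
  channel_prefix: my_app_production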


  1. Distributed rate limiting: Limiting API requests
# Limit to 100 requests per hour per user
key = "rate_limit:user:#{user.id}:#{Time.current.hour}"
count = REDIS.incr(key)
REDIS.expire(key, 1.hour.to_i) if count == 1
if count > 100
  render json: { error: "Rate limit exceeded" }, status: 429
else
  # Process the request
end

Why? INCR + EXPIRE operations are atomic in Redis. No race conditions even with 10 Rails servers in parallel. PostgreSQL would have locking and performance issues with this access pattern.



| Need | Redis | PostgreSQL |
|---|---|---|
| Pure speed | ⚡⚡⚡⚡⚡ | ⚡⚡⚡ |
| Atomic operations | ⚡⚡⚡⚡⚡ (native) | ⚡⚡⚡ (locks required) |
| High-frequency writes | ⚡⚡⚡⚡⚡ | ⚡⚡ (contention) |
| Guaranteed durability | ⚡⚡ (optional) | ⚡⚡⚡⚡⚡ |
| Complex queries | ⚡ (limited) | ⚡⚡⚡⚡⚡ |
| Relations | ⚡ (limited) | ⚡⚡⚡⚡⚡ |
| Capacity | ⚡⚡ (RAM limited) | ⚡⚡⚡⚡⚡ (disk capacity) |
| Cost | ⚡⚡ (RAM expensive) | ⚡⚡⚡⚡ (disk cheap) |


Rails 8 and Redis: A New Era

Rails 8, released in 2024, introduced a major shift in the ecosystem: Solid Cache.

Solid Cache: The new paradigm

Solid Cache is an alternative to Redis for caching, using your database rather than memory. Surprising? Yes. Counterintuitive? Not so much!


The concept: with modern SSDs and in-memory caches built into databases, storing the cache on disk becomes viable. The upside? You can have a huge cache for a fraction of the cost of RAM.


Basecamp numbers:

  1. Read requests 40% slower than with Redis
  2. But cache 6x larger!
  3. Result: 95th percentile response time dropped from 375ms to 225ms


Configuration in Rails 8:

# config/environments/production.rb
config.cache_store = :solid_cache_store


By default, Rails 8 uses Solid Cache instead of Redis. It’s a pragmatic choice to simplify deployment.

So, is Redis dead in Rails 8?

Absolutely not! Redis remains essential for:

  1. Cases where speed matters: Rate limiting, high-frequency sessions
  2. Special data structures: Sorted Sets for leaderboards
  3. Sidekiq: Always based on Redis for jobs
  4. ActionCable in production: Redis recommended for pub/sub
  5. Atomic counters: Ultra-fast increments


The right approach:

# Solid Cache for general caching (fragment cache, queries)
config.cache_store = :solid_cache_store
# Redis for specific rate limiting (in a controller)
rate_limit to: 10, within: 1.minute,
  store: ActiveSupport::Cache::RedisCacheStore.new(url: ENV['REDIS_URL'])

The Rails 8 dev container

Important: Rails 8 no longer adds Redis to the generated dev container by default, because it ships with Solid Queue and Solid Cache instead. If you want a Redis-based setup (for Sidekiq or other use cases), pass --skip-solid when creating the app:


rails new mon_app --skip-solid

Installation and configuration of Redis with Rails

Let’s get practical! Here’s how to install and configure Redis in your Rails project.

Step 1: Install Redis on your system

macOS (with Homebrew):

# Install Redis
brew install redis
# Launch Redis automatically at startup (brew services is built into Homebrew, no tap needed)
brew services start redis
# Check that Redis is running
redis-cli ping
# PONG

Ubuntu/Debian:

# Install Redis
sudo apt update
sudo apt install redis-server
# Start Redis
sudo systemctl start redis-server
# Activate Redis when booting
sudo systemctl enable redis-server
# Check
redis-cli ping
# PONG

With Docker:

# docker-compose.yml
version: '3.8'
services:
  redis:
    image: redis:7.2-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis-data:/data
volumes:
  redis-data:

Step 2: Add the necessary gems

# Gemfile
gem 'redis', '~> 5.0' # Official Redis client
gem 'redis-namespace' # Namespace for isolation (optional but recommended)
# Then, in your terminal:
bundle install

Step 3: Configure the Redis connection

# config/initializers/redis.rb
require 'redis'
require 'redis-namespace'
redis_host = ENV.fetch('REDIS_HOST', 'localhost')
redis_port = ENV.fetch('REDIS_PORT', 6379)
redis_db = ENV.fetch('REDIS_DB', 0)
# Global connection
redis_connection = Redis.new(
  host: redis_host,
  port: redis_port,
  db: redis_db,
  timeout: 5,
  reconnect_attempts: 3
)
# With a namespace for isolation
REDIS = Redis::Namespace.new(
  "#{Rails.application.class.module_parent_name.underscore}:#{Rails.env}",
  redis: redis_connection
)

This initializer gives you a single global connection, exposed through the REDIS constant. Why the namespace? If you have multiple Rails apps using the same Redis server, the namespace avoids key collisions. For example:

  1. Without namespace: user:1
  2. With namespace: my_app:production:user:1

Step 4: Configure Redis as a cache store

To use Redis as cache:

# config/environments/production.rb
config.cache_store = :redis_cache_store, {
  url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/0'),
  namespace: 'cache',
  expires_in: 90.minutes,
  # Reconnection options
  reconnect_attempts: 3,
  reconnect_delay: 0.5,
  reconnect_delay_max: 2.0,
  # Error handling
  error_handler: -> (method:, returning:, exception:) {
    Rails.logger.error("Redis error: #{exception.message}")
  }
}

In development:

# config/environments/development.rb
config.cache_store = :redis_cache_store, {
  url: 'redis://localhost:6379/0',
  namespace: 'cache',
  expires_in: 30.minutes
}
config.action_controller.perform_caching = true

Step 5: Use Redis in your code

Basic cache:

# In a controller or model
Rails.cache.fetch("user:#{user.id}:profile", expires_in: 1.hour) do
  # This query will only be executed if the cache is empty
  user.profile.to_json
end

Simple counter:

# Increment a counter
REDIS.incr("article:#{@article.id}:views")
# Read the counter
views = REDIS.get("article:#{@article.id}:views").to_i

Set for tracking:

# Add a unique visitor
REDIS.sadd("article:#{@article.id}:unique_visitors", current_user.id)
# Count unique visitors
unique_count = REDIS.scard("article:#{@article.id}:unique_visitors")

Leaderboard with Sorted Set:

# Add or update a score
REDIS.zadd("game:leaderboard", @player.score, @player.id)
# Top 10 players
top_players = REDIS.zrevrange("game:leaderboard", 0, 9, with_scores: true)
top_players.each_with_index do |(player_id, score), index|
  puts "#{index + 1}. Player #{player_id}: #{score.to_i} points"
end

Step 6: Configuration for Sidekiq (background jobs)

If you use Sidekiq for asynchronous jobs:

# Gemfile
gem 'sidekiq', '~> 7.0'
# config/initializers/sidekiq.rb
Sidekiq.configure_server do |config|
  config.redis = { url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/1') }
end
Sidekiq.configure_client do |config|
  config.redis = { url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/1') }
end
# config/application.rb
config.active_job.queue_adapter = :sidekiq

Patterns and Best Practices

Now that Redis is installed, here are patterns to know to use it effectively. Some examples are quite advanced. Don’t panic—you won’t have to implement them right away but it’s good to keep them in the back of your mind :)

1. Key naming: be structured!

Principle: use descriptive, hierarchical keys with : as a separator.

# ❌ Bad
REDIS.set("u1", "data")
REDIS.set("userdata", "other")
# ✅ Good
REDIS.set("user:#{user_id}:profile", data)
REDIS.set("user:#{user_id}:stats:views", count)
REDIS.set("article:#{article_id}:comments:count", count)

Recommended convention:

resource:id:attribute
# or
resource:id:sub_resource:attribute

2. Automatic expiration: clean up after yourself

Always define a TTL (Time To Live) to prevent Redis from filling up:

# Expire when set
REDIS.setex("session:#{session_id}", 1.hour.to_i, session_data)
# Or after
REDIS.set("cache:key", data)
REDIS.expire("cache:key", 30.minutes.to_i)
# With Rails.cache (automatic)
Rails.cache.fetch("key", expires_in: 1.hour) { expensive_operation }

Why is this crucial? Redis stores everything in RAM. Without expiration, you risk running out of memory. Expiring frees memory.

3. Atomic operations: be thread-safe

Redis guarantees atomicity of operations. Use it for counters!

# ❌ Not thread-safe
current = REDIS.get("counter").to_i
REDIS.set("counter", current + 1)
# Between these two lines, another process could modify the value!
# ✅ Thread-safe and atomic
REDIS.incr("counter")
REDIS.decr("counter")
REDIS.incrby("counter", 10)

4. Pipelining: batch operations

When you have several Redis commands to execute, use pipelining:

# ❌ Slow (3 network round-trips)
REDIS.set("key1", "value1")
REDIS.set("key2", "value2")
REDIS.set("key3", "value3")
# ✅ Fast (1 single round-trip)
REDIS.pipelined do |pipeline|
pipeline.set("key1", "value1")
pipeline.set("key2", "value2")
pipeline.set("key3", "value3")
end

Performance gain: up to 10x faster for many operations!

5. Cache warming: preheat the cache

The problem of a "cold start":

Imagine your application's Redis has just restarted (maintenance, a crash, or simply the first production deploy). Your cache is empty.


Here’s what happens:

# First user arrives on the homepage
@popular_articles = Rails.cache.fetch("homepage:popular", expires_in: 1.hour) do
  # ❌ CACHE MISS! The query must be executed
  Article.published
         .includes(:author, :comments)
         .order(views_count: :desc)
         .limit(10)
         .to_a
end
# Time: 500ms (heavy query + potential N+1 queries)


The first user experiences all the slowness while the cache fills. Then all other users benefit from the cache for 1 hour. But when the cache expires or Redis restarts, it’s back to square one: a user will bear the penalty.

The solution: Cache Warming

Rather than waiting for a user to trigger the calculation, proactively pre-warm the cache:


Example 1: Warming at application startup


# config/initializers/cache_warming.rb
Rails.application.config.after_initialize do
  # Only in production and not during migrations or console
  if Rails.env.production? && !defined?(Rails::Console)
    CacheWarmingJob.perform_later
  end
end
# app/jobs/cache_warming_job.rb
class CacheWarmingJob < ApplicationJob
  queue_as :low_priority # Don't block critical jobs
  def perform
    Rails.logger.info "🔥 Starting cache warming..."
    # 1. Popular articles for the homepage
    Rails.cache.fetch("homepage:popular", expires_in: 1.hour) do
      Article.published
             .includes(:author, :comments)
             .order(views_count: :desc)
             .limit(10)
             .to_a
    end
    # 2. Site statistics
    Rails.cache.fetch("site:stats", expires_in: 30.minutes) do
      {
        total_users: User.count,
        total_articles: Article.count,
        total_comments: Comment.count,
        active_today: User.where("last_seen_at > ?", 24.hours.ago).count
      }
    end
    # 3. Top 10 authors
    Rails.cache.fetch("authors:top", expires_in: 2.hours) do
      User.joins(:articles)
          .select("users.*, COUNT(articles.id) as articles_count")
          .group("users.id")
          .order("articles_count DESC")
          .limit(10)
          .to_a
    end
    # 4. Categories with article counters
    Category.find_each do |category|
      Rails.cache.fetch("category:#{category.id}:articles_count", expires_in: 1.hour) do
        category.articles.published.count
      end
    end
    Rails.logger.info "✅ Cache warming completed!"
  end
end

Result: When the first user arrives, the cache is already warm. Response time: 5ms instead of 500ms!

Example 2: Periodic warming with Sidekiq-Cron

If your cache has a short TTL or some pages are very important, you can periodically pre-warm before expiration:

# Gemfile
gem 'sidekiq-cron'
# config/schedule.yml (for Sidekiq-Cron)
cache_warming:
  cron: "*/15 * * * *" # Every 15 minutes
  class: "CacheWarmingJob"
  queue: low_priority
homepage_critical:
  cron: "*/5 * * * *" # Every 5 minutes
  class: "HomepageCacheWarmingJob"
  queue: critical


# app/jobs/homepage_cache_warming_job.rb
class HomepageCacheWarmingJob < ApplicationJob
  queue_as :critical
  def perform
    # Warm up critical homepage caches BEFORE they expire
    # This ensures users ALWAYS hit a hot cache
    warm_cache("homepage:hero", 10.minutes) do
      Article.featured.includes(:author, :tags).first
    end
    warm_cache("homepage:trending", 10.minutes) do
      Article.trending_last_24h.limit(5).to_a
    end
    warm_cache("homepage:categories", 30.minutes) do
      Category.with_article_counts.to_a
    end
  end
  private
  def warm_cache(key, expires_in)
    # Force cache refresh even if it already exists
    Rails.cache.delete(key)
    Rails.cache.fetch(key, expires_in: expires_in) do
      yield
    end
  end
end


When to use cache warming?

✅ Use cache warming if:

  1. You have high-traffic pages with heavy queries
  2. Your Redis cache restarts regularly
  3. You want predictable response times
  4. Your queries take > 100ms to compute
  5. You have predictable peak moments (e.g., product launch)

❌ Do not use cache warming if:

  1. Your data changes constantly (real-time)
  2. You have low traffic (no benefit)
  3. Your queries are already fast (less than 50ms)
  4. You have too many data variations to pre-warm (millions of combinations)


Cache warming as a "performance assurance"


Think of cache warming as an insurance:

  1. Cost: a few background jobs (server resources)
  2. Benefit: a consistently fast user experience
  3. Optimal moment: just before cache expiration or after a restart

Analogy: It’s like preheating your oven before putting in the dish. You could wait for it to heat when you put the dish in (the user waits), or preheat it beforehand (the user is happy immediately).

6. Error handling: plan for outages

Redis can go down. Your app should not crash as a result:

# Safe wrapper for Redis operations
module SafeRedis
  def self.with_rescue(default: nil)
    yield
  rescue Redis::BaseError => e
    Rails.logger.error("Redis error: #{e.message}")
    default
  end
end
# Usage
views_count = SafeRedis.with_rescue(default: 0) do
  REDIS.get("article:#{id}:views").to_i
end

For the Rails cache, error handling is built-in:

config.cache_store = :redis_cache_store, {
  url: ENV['REDIS_URL'],
  error_handler: -> (method:, returning:, exception:) {
    # Log the error
    Rails.logger.error("Redis cache error: #{exception.message}")
    # Notify the team if needed
    ErrorNotifier.notify(exception) if Rails.env.production?
  }
}

7. Monitoring: watch the metrics

Useful commands to monitor Redis:

# General information
redis-cli INFO
# Memory usage
redis-cli INFO memory
# Command statistics
redis-cli INFO stats
# Count keys matching a pattern (SCAN-based, safe on large datasets)
redis-cli --scan --pattern "user:*" | wc -l
# Monitor commands in real time
redis-cli MONITOR

In your Rails code:

# Get Stats
info = REDIS.info
memory_used = info["used_memory_human"]
connected_clients = info["connected_clients"]
total_commands = info["total_commands_processed"]

8. Memory management: never forget the limit!

The problem: Redis stores EVERYTHING in RAM. If you fill memory, Redis crashes or starts rejecting commands.


You should start by setting a memory limit in Redis’s configuration

# In redis.conf
maxmemory 2gb
# Eviction policy when the memory limit is reached
maxmemory-policy allkeys-lru


Eviction policies available:

  1. allkeys-lru: Removes the least recently used keys (recommended for cache)
  2. volatile-lru: Removes only keys with a TTL
  3. allkeys-lfu: Removes the least frequently used keys (Redis 4.0+)
  4. volatile-ttl: Removes keys with the shortest TTL
  5. noeviction: Rejects writes when full (dangerous!)
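
These settings can also be changed at runtime without restarting Redis, for example from redis-cli (remember to persist them in redis.conf as well):

redis-cli CONFIG SET maxmemory 2gb
redis-cli CONFIG SET maxmemory-policy allkeys-lru
# Check the current value
redis-cli CONFIG GET maxmemory-policy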

You can also limit the size of your collections

# ❌ Dangerous: Unbounded list growth
def add_to_recent_activity(user_id, activity)
  REDIS.lpush("user:#{user_id}:activity", activity.to_json)
  # This list can grow infinitely!
end
# ✅ Safe: Limit to N items
def add_to_recent_activity(user_id, activity)
  REDIS.lpush("user:#{user_id}:activity", activity.to_json)
  REDIS.ltrim("user:#{user_id}:activity", 0, 99) # Keep only the latest 100 items
end
# ✅ Alternative: Sorted Set with limit
def add_to_leaderboard(player_id, score)
  REDIS.zadd("leaderboard", score, player_id)
  # Keep only the top 1000 players
  total = REDIS.zcard("leaderboard")
  REDIS.zremrangebyrank("leaderboard", 0, total - 1001) if total > 1000
end

Memory usage monitoring


# Estimate memory usage of a single key
REDIS.memory("usage", "user:123:profile")
# => 1024 (bytes)
# Scan and measure memory usage of all keys matching a pattern
total_memory = 0
REDIS.scan_each(match: "user:*", count: 100) do |key|
  total_memory += REDIS.memory("usage", key)
end
puts "Total memory used by user:* keys: #{total_memory / 1024 / 1024}MB"

Managing hot data / cold data properly


# Principle: Redis for "hot" data (frequent access)
# PostgreSQL for "cold" data (historical)
class ActivityLog
  # Keep the last 30 days in Redis
  def self.recent(user_id)
    cached = REDIS.lrange("user:#{user_id}:activities", 0, 99)
    cached.map { |json| JSON.parse(json) }
  end
  # PostgreSQL for full historical data
  def self.all_time(user_id)
    Activity.where(user_id: user_id).order(created_at: :desc)
  end
  def self.add(user_id, activity)
    # Redis for fast access
    REDIS.lpush("user:#{user_id}:activities", activity.to_json)
    REDIS.ltrim("user:#{user_id}:activities", 0, 99)
    REDIS.expire("user:#{user_id}:activities", 30.days.to_i)
    # PostgreSQL for persistence
    Activity.create!(user_id: user_id, data: activity)
  end
end


Golden rules for memory:

  1. 📏 Always set maxmemory in production
  2. ⏱️ Always set TTLs on your keys
  3. 📊 Limit the size of collections (Lists, Sets, Sorted Sets)
  4. 🔍 Regularly monitor memory usage
  5. 🔥 Keep only the "hot data" in Redis
  6. 💾 Use PostgreSQL as the source of truth and archive


Practical cases: concrete examples

Let’s look at a few real implementations you’ll encounter often.

Example 1: Cache of expensive queries

# app/models/user.rb
class User < ApplicationRecord
  def dashboard_data
    Rails.cache.fetch("user:#{id}:dashboard", expires_in: 15.minutes) do
      {
        recent_posts: posts.recent.limit(5).to_a,
        stats: calculate_stats,
        notifications: notifications.unread.to_a,
        activity_feed: generate_activity_feed
      }
    end
  end
  private
  def calculate_stats
    {
      total_posts: posts.count,
      total_comments: comments.count,
      followers: followers.count,
      following: following.count
    }
  end
end

Example 2: Rate limiting

# app/controllers/concerns/rate_limitable.rb
module RateLimitable
  extend ActiveSupport::Concern
  def rate_limit!(key, max_requests: 100, period: 1.hour)
    redis_key = "rate_limit:#{key}"
    current_count = REDIS.get(redis_key).to_i
    if current_count >= max_requests
      render json: { error: "Rate limit exceeded" }, status: :too_many_requests
      return false
    end
    REDIS.multi do |multi|
      multi.incr(redis_key)
      multi.expire(redis_key, period.to_i) if current_count == 0
    end
    true
  end
end
# app/controllers/api/v1/posts_controller.rb
class Api::V1::PostsController < ApplicationController
  include RateLimitable
  before_action -> { rate_limit!("api:#{current_user.id}", max_requests: 1000) }
  def index
    @posts = Post.all
    render json: @posts
  end
end

Example 3: Game leaderboard

# app/models/game_leaderboard.rb
class GameLeaderboard
  REDIS_KEY = "game:leaderboard"
  def self.update_score(player_id, score)
    REDIS.zadd(REDIS_KEY, score, player_id)
  end
  def self.top(limit = 10)
    player_ids_with_scores = REDIS.zrevrange(REDIS_KEY, 0, limit - 1, with_scores: true)
    player_ids_with_scores.map do |player_id, score|
      {
        player: Player.find(player_id),
        score: score.to_i,
        rank: rank(player_id)
      }
    end
  end
  def self.rank(player_id)
    # zrevrank is 0-based, so add 1 to get a human-readable rank
    position = REDIS.zrevrank(REDIS_KEY, player_id)
    position + 1 if position
  end
  def self.score(player_id)
    REDIS.zscore(REDIS_KEY, player_id)&.to_i
  end
  def self.player_stats(player_id)
    {
      score: score(player_id),
      rank: rank(player_id),
      total_players: REDIS.zcard(REDIS_KEY)
    }
  end
end
# Use it
GameLeaderboard.update_score(player.id, 2500)
top_players = GameLeaderboard.top(10)
my_stats = GameLeaderboard.player_stats(current_player.id)

Example 4: Session store for authentication

# config/initializers/session_store.rb
Rails.application.config.session_store :redis_store,
  servers: [
    {
      host: ENV.fetch('REDIS_HOST', 'localhost'),
      port: ENV.fetch('REDIS_PORT', 6379),
      db: 1,
      namespace: 'session'
    }
  ],
  expire_after: 90.minutes,
  key: "_#{Rails.application.class.module_parent_name.underscore}_session",
  secure: Rails.env.production?,
  httponly: true,
  same_site: :lax

Example 5: Pub/Sub for real-time notifications

# app/models/notification_broadcaster.rb
class NotificationBroadcaster
  CHANNEL = "notifications"
  def self.broadcast(user_id, message)
    REDIS.publish(CHANNEL, {
      user_id: user_id,
      message: message,
      timestamp: Time.current
    }.to_json)
  end
  def self.subscribe
    # Note: SUBSCRIBE blocks the connection; run this in a dedicated
    # thread or process with its own Redis connection, not the shared one
    REDIS.subscribe(CHANNEL) do |on|
      on.message do |channel, message|
        data = JSON.parse(message)
        # Broadcast the message to the user's ActionCable stream
        ActionCable.server.broadcast(
          "user:#{data['user_id']}",
          notification: data['message']
        )
      end
    end
  end
end
# Use it
NotificationBroadcaster.broadcast(user.id, "New message!")

Pitfalls to avoid

1. Forgetting that everything is in memory

Problem: Storing too much data without expiration consumes all RAM.

Solution: Always set TTLs and monitor memory usage.

# ❌ Dangerous
1_000_000.times do |i|
  REDIS.set("item:#{i}", large_data) # No expiration!
end
# ✅ Safe
1_000_000.times do |i|
  REDIS.setex("item:#{i}", 1.hour.to_i, large_data)
end


2. Using Redis as the main database

Problem: Redis is not designed to replace PostgreSQL. Persistence isn’t its strong suit.

Solution: Use Redis as a complement, not a replacement.

# ❌ Risky
REDIS.set("user:#{id}:email", user.email) # Critical data stored only in Redis
# ✅ Correct
user = User.find(id) # PostgreSQL = source of truth
REDIS.setex("user:#{id}:cache", 5.minutes.to_i, user.to_json) # Redis = cache

3. Ignoring O(N) operations

Problem: Some commands are slow on large collections.

Commands to avoid on large data:

  1. KEYS * (use SCAN instead)
  2. SMEMBERS on large sets (use SSCAN)
  3. HGETALL on large hashes (use HSCAN)
# ❌ Blocks Redis on 1M+ keys
all_keys = REDIS.keys("user:*")
# ✅ Iterate in batches
REDIS.scan_each(match: "user:*", count: 100) do |key|
  # Process each key
end

4. Not handling serialization

Problem: Redis stores strings. You should serialize complex objects.

# ❌ Doesn't work
REDIS.set("user:data", { name: "Alice", age: 25 })
# => Error or unexpected result
# ✅ Serialize to JSON
REDIS.set("user:data", { name: "Alice", age: 25 }.to_json)
data = JSON.parse(REDIS.get("user:data"))
# ✅ Or use Marshal (beware of compatibility)
REDIS.set("user:data", Marshal.dump(object))
object = Marshal.load(REDIS.get("user:data"))

5. Ignoring security

Problem: out of the box, Redis requires no authentication, and a careless bind configuration can expose it to the whole network.

Production solution:

# /etc/redis/redis.conf
bind 127.0.0.1 # Listen only on localhost
requirepass your_strong_password
# config/initializers/redis.rb
REDIS = Redis.new(
  host: ENV['REDIS_HOST'],
  port: ENV['REDIS_PORT'],
  password: ENV['REDIS_PASSWORD'], # Important!
  ssl: Rails.env.production? # TLS in production
)

SEO and performance optimization

Impact on SEO

Redis indirectly improves your SEO by speeding up your site:

  1. Load time: Google favors fast sites. Shorter response times mean better ranking, and cached HTML fragments make pages feel instant.
  2. Bounce rate: a faster site means users stay longer, with less frustration and more engagement.
  3. Core Web Vitals: Redis helps improve the key metrics: LCP (Largest Contentful Paint) through cached content, FID (First Input Delay) through faster responses, CLS (Cumulative Layout Shift) through more predictable loading.

Example of a cache for SEO

# HTTP conditional caching (ETag / Last-Modified) for SEO
class ArticlesController < ApplicationController
  def show
    @article = Article.find(params[:id])
    # Conditional GET for unauthenticated human visitors
    # (request.bot? is not a built-in Rails method; it assumes a bot-detection helper from a gem)
    if !user_signed_in? && !request.bot?
      fresh_when(etag: @article, last_modified: @article.updated_at)
    end
  end
end
# Fragment caching for expensive partials (in the view)
<% cache ["article", @article, "sidebar"], expires_in: 1.hour do %>
  <%= render "articles/sidebar", article: @article %>
<% end %>

Conclusion: Redis, your new performance ally


We’ve taken a full tour of Redis, from its fundamental concepts to its integration in Rails 8. Let’s recap what you should remember:

What Redis is:

  1. An in-memory, ultra-fast key-value database
  2. A versatile tool with rich data structures
  3. An essential complement to traditional databases


When to use Redis:

  1. ✅ Caching expensive data
  2. ✅ User sessions
  3. ✅ Real-time counters and statistics
  4. ✅ Leaderboards and rankings
  5. ✅ Job queues (Sidekiq)
  6. ✅ Rate limiting
  7. ✅ Pub/Sub for WebSockets


The Rails 8 ecosystem:

  1. Solid Cache by default to simplify deployment
  2. Redis still relevant for specific cases
  3. Hybrid approach recommended


Best practices:

  1. Name your keys in a structured way
  2. Always define expirations
  3. Use the appropriate data structures
  4. Handle errors gracefully
  5. Monitor memory usage

Redis isn’t magical, but it’s a powerful tool that, when used well, will transform the performance of your Rails application. Start small: add caching to a slow request, implement a simple counter, then gradually explore other possibilities.


Next steps:

  1. Install Redis locally and test the basic commands
  2. Add Redis caching to a slow page in your app
  3. Explore Sidekiq for asynchronous jobs
  4. Implement a leaderboard or a rate-limiting system


You now have all the keys (no pun intended 😉) to master Redis in your Rails projects. Happy coding!

Resources to go further

  1. Official Redis Documentation
  2. Redis University - Free courses
  3. Rails Guides - Caching
  4. Solid Cache GitHub
  5. Redis Command Reference