Your database handles every request. Every page load triggers queries. Every API call hits the same tables. As traffic grows, response times climb and your server groans under the load.
Redis fixes this. It's an in-memory data store that sits between your application and your database, serving frequently requested data in microseconds instead of milliseconds. Adding Redis caching to a typical web application can reduce database load by 80% and cut response times from 200ms to under 10ms.
Here's how to set it up.
What Is Redis?
Redis (Remote Dictionary Server) is an open-source, in-memory key-value store. Think of it as a super-fast dictionary that lives in RAM:
Key: "user:1234" → Value: {"name": "John", "email": "john@example.com"}
Key: "page:home" → Value: "<html>...cached homepage...</html>"
Key: "api:products" → Value: "[{product1}, {product2}, ...]"
Because data lives in RAM instead of on disk, Redis reads and writes are orders of magnitude faster than traditional databases:
- MySQL query: 1-50ms
- Redis lookup: 0.1-0.5ms
That difference compounds with every request your application handles.
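To see how the difference compounds, here is a back-of-the-envelope calculation. The numbers are illustrative assumptions (40 lookups per page, a 10 ms database round-trip, a 0.3 ms Redis hit), not measurements:

```python
# Illustrative math: how per-lookup latency compounds per page.
# All three numbers below are assumptions; substitute your own.
LOOKUPS_PER_PAGE = 40
DB_MS = 10.0     # assumed database round-trip
REDIS_MS = 0.3   # assumed Redis hit

db_total = LOOKUPS_PER_PAGE * DB_MS        # 400 ms spent waiting on the database
redis_total = LOOKUPS_PER_PAGE * REDIS_MS  # ~12 ms spent waiting on Redis

print(f"database: {db_total:.0f} ms, redis: {redis_total:.0f} ms per page")
print(f"speedup: {db_total / redis_total:.0f}x")
```

With these assumed numbers, a page that spends 400 ms in the database spends about 12 ms in Redis instead.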
Installing Redis
Ubuntu/Debian
sudo apt update
sudo apt install redis-server -y
sudo systemctl enable redis-server
sudo systemctl start redis-server
Verify Installation
redis-cli ping
# Should return: PONG
redis-cli info server | head -5
Basic Configuration
Edit /etc/redis/redis.conf for production use:
sudo nano /etc/redis/redis.conf
Key settings to adjust:
# Bind to localhost only (security)
bind 127.0.0.1
# Set a password
requirepass your_strong_password_here
# Set max memory (use 25-50% of available RAM)
maxmemory 512mb
# Eviction policy when memory is full
maxmemory-policy allkeys-lru
Restart Redis after changes:
sudo systemctl restart redis-server
The allkeys-lru policy automatically removes the least recently used keys when memory is full — perfect for caching where stale data can be safely evicted.
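To make the eviction behavior concrete, here is a toy single-process model of LRU built on Python's `OrderedDict`. Real Redis uses an approximate, sampling-based LRU rather than a strict ordering, but the effect is the same: the key you touched longest ago is the one that goes.

```python
from collections import OrderedDict

class LRUCache:
    """Toy model of allkeys-lru: evict the least recently used key
    when the cache is full. (Redis's LRU is approximate and
    sampling-based, but behaves like this in aggregate.)"""

    def __init__(self, max_keys):
        self.max_keys = max_keys
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as recently used
        return self.data[key]

    def set(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.max_keys:
            self.data.popitem(last=False)  # evict least recently used

cache = LRUCache(max_keys=2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")         # "a" is now the most recently used
cache.set("c", 3)      # cache full: "b" is evicted, not "a"
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

Note that reading "a" saved it from eviction; access order, not insertion order, decides who gets removed.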
Caching Patterns
Pattern 1: Cache-Aside (Most Common)
Check the cache first. If the data exists (cache hit), return it. If not (cache miss), fetch from the database, store in cache, then return.
Node.js Example:
const Redis = require('ioredis');
const redis = new Redis({ password: 'your_password' });
async function getUser(userId) {
  const cacheKey = `user:${userId}`;

  // Check cache first
  const cached = await redis.get(cacheKey);
  if (cached) {
    return JSON.parse(cached);
  }

  // Cache miss — fetch from database
  const user = await db.query('SELECT * FROM users WHERE id = ?', [userId]);

  // Store in cache for 1 hour
  await redis.setex(cacheKey, 3600, JSON.stringify(user));
  return user;
}
PHP Example (Laravel):
use Illuminate\Support\Facades\Cache;
function getUser($userId) {
    return Cache::remember("user:{$userId}", 3600, function () use ($userId) {
        return User::find($userId);
    });
}
Python Example (Flask):
import redis
import json
r = redis.Redis(host='localhost', port=6379, password='your_password')
def get_user(user_id):
    cache_key = f"user:{user_id}"

    # Check cache first
    cached = r.get(cache_key)
    if cached:
        return json.loads(cached)

    # Cache miss — fetch from database, then store for 1 hour
    user = db.execute("SELECT * FROM users WHERE id = %s", (user_id,))
    r.setex(cache_key, 3600, json.dumps(user))
    return user
Pattern 2: Write-Through Cache
Update the cache whenever you update the database. This keeps the cache always fresh:
async function updateUser(userId, data) {
  // Update database
  await db.query('UPDATE users SET ? WHERE id = ?', [data, userId]);

  // Update cache immediately
  const updatedUser = await db.query('SELECT * FROM users WHERE id = ?', [userId]);
  await redis.setex(`user:${userId}`, 3600, JSON.stringify(updatedUser));
  return updatedUser;
}
Pattern 3: Cache Invalidation
When data changes, delete the cached version so the next request fetches fresh data:
async function deleteUser(userId) {
  await db.query('DELETE FROM users WHERE id = ?', [userId]);
  await redis.del(`user:${userId}`);
}

// Invalidate related caches too
async function updateProduct(productId, data) {
  await db.query('UPDATE products SET ? WHERE id = ?', [data, productId]);

  // Delete specific cache
  await redis.del(`product:${productId}`);

  // Delete list caches that might include this product
  await redis.del('products:all');
  await redis.del('products:featured');
}
What to Cache
Not everything should be cached. Focus on:
✅ Good Candidates for Caching
- Database queries that run frequently with the same parameters
- API responses from external services (rate-limited or slow)
- Computed results (aggregations, reports, search results)
- Session data (faster than file or database sessions)
- Page fragments (navigation, sidebar, footer)
- Configuration that rarely changes
❌ Poor Candidates
- Data that changes every request
- User-specific data that's rarely re-accessed
- Large binary files (use CDN instead)
- Data where staleness causes problems (financial transactions)
Setting TTL (Time to Live)
Choose cache duration based on how frequently data changes:
// Rarely changes — cache for 24 hours
await redis.setex('site:config', 86400, JSON.stringify(config));
// Changes occasionally — cache for 1 hour
await redis.setex('products:featured', 3600, JSON.stringify(products));
// Changes frequently — cache for 5 minutes
await redis.setex('dashboard:stats', 300, JSON.stringify(stats));
// Real-time data — cache for 30 seconds
await redis.setex('active:users', 30, JSON.stringify(count));
WordPress + Redis
For WordPress sites, Redis can dramatically speed up page loads:
# Install PHP Redis extension
sudo apt install php-redis -y
sudo systemctl restart php8.2-fpm  # adjust to your PHP version
Install the Redis Object Cache plugin:
wp plugin install redis-cache --activate --path=/var/www/yoursite
wp redis enable --path=/var/www/yoursite
Add to wp-config.php:
define('WP_REDIS_HOST', '127.0.0.1');
define('WP_REDIS_PASSWORD', 'your_password');
define('WP_REDIS_DATABASE', 0);
This moves the WordPress object cache into Redis instead of rebuilding it from the database on every request, reducing database queries by 50-90% on typical WordPress sites.
Monitoring Redis Performance
Quick Health Check
redis-cli info stats | grep -E "keyspace_hits|keyspace_misses"
Calculate your hit rate:
Hit Rate = keyspace_hits / (keyspace_hits + keyspace_misses) × 100
A healthy cache should have a hit rate above 80%. Below 50% means you're not caching the right things.
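The calculation is easy to script. The two counters below are made-up example values; substitute the `keyspace_hits` and `keyspace_misses` numbers your server actually reports:

```python
# Hit-rate calculation from INFO stats counters.
# These two values are examples only; read yours from
# `redis-cli info stats`.
keyspace_hits = 92_500
keyspace_misses = 7_500

hit_rate = keyspace_hits / (keyspace_hits + keyspace_misses) * 100
print(f"hit rate: {hit_rate:.1f}%")

if hit_rate < 50:
    print("warning: cache is missing more than it hits")
```

With these example counters the hit rate comes out to 92.5%, comfortably above the 80% target.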
Monitor in Real-Time
# Watch all commands in real-time
redis-cli monitor
# Check memory usage
redis-cli info memory | grep -E "used_memory_human|maxmemory_human"
# See how many keys exist
redis-cli dbsize
Common Mistakes to Avoid
1. Caching everything — Only cache data that's expensive to fetch and frequently requested. Caching rarely-accessed data wastes memory.
2. No expiration — Always set a TTL. Without it, stale data lives forever and memory fills up.
3. Cache stampede — When a popular cache key expires, hundreds of requests hit the database simultaneously. Fix with mutex locks or staggered TTLs:
// Add random jitter to prevent stampede
const ttl = 3600 + Math.floor(Math.random() * 300); // 1hr + 0-5min random
await redis.setex(key, ttl, value);
4. No password — Always set requirepass in production. An unprotected Redis server is a security risk.
5. Not monitoring — If you don't track hit rates and memory usage, you can't optimize your caching strategy.
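The mutex-lock fix mentioned in mistake 3 can be sketched as follows. This is a single-process simulation: a plain dict stands in for Redis and `threading.Lock` stands in for a distributed lock (in real Redis you would take the lock with `SET lockkey token NX EX seconds`). The point it demonstrates is that fifty simultaneous cache misses trigger only one database query:

```python
import threading

store = {}               # in-memory stand-in for the Redis cache
lock = threading.Lock()  # stand-in for a Redis-based lock (SET ... NX EX)
rebuild_count = 0        # how many times the "database" is actually hit

def expensive_db_fetch():
    global rebuild_count
    rebuild_count += 1
    return "fresh value"

def get_with_stampede_protection(key):
    value = store.get(key)
    if value is not None:
        return value
    # Cache miss: only one caller rebuilds; the rest wait, then re-check.
    with lock:
        value = store.get(key)  # another thread may have filled it meanwhile
        if value is None:
            value = expensive_db_fetch()
            store[key] = value
        return value

threads = [threading.Thread(target=get_with_stampede_protection, args=("stats",))
           for _ in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(rebuild_count)  # 1
```

The double-check inside the lock is what prevents the stampede: every thread that lost the race finds the value already cached and never touches the database.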
How Much Memory Does Redis Need?
For most web applications:
| Site Type | Recommended Redis Memory |
|---|---|
| Small blog/portfolio | 64-128MB |
| Business website | 128-256MB |
| E-commerce (1K products) | 256-512MB |
| High-traffic application | 512MB-2GB |
Redis is extremely memory-efficient. A million simple key-value pairs use roughly 85MB.
Speed Up Your Application with DeployBase
Redis caching is one of the highest-impact performance improvements you can make. Combined with proper hosting, it transforms sluggish applications into lightning-fast experiences.
At DeployBase, our VPS plans include enough RAM to run Redis alongside your application. Starting at $5/month with NVMe SSD storage, full root access, and 24/7 support, you get the infrastructure your application needs to perform at its best.
Get your VPS at DeployBase → — fast hosting for fast applications.