Redis is the most widely deployed in-memory data structure store in production web infrastructure — used for caching, session storage, message queues, rate limiting, real-time leaderboards, and pub/sub messaging. Running Redis on your Hong Kong VPS eliminates the latency of a remote managed Redis service and gives you direct control over persistence, memory limits, and eviction policies.
This guide covers Redis installation, production configuration, common use case setups, and security hardening for a standalone Redis instance on Ubuntu 22.04.
Step 1: Install Redis
sudo apt update
sudo apt install -y redis-server

For the latest Redis version (7.x), add the official Redis repository:
curl -fsSL https://packages.redis.io/gpg | \
sudo gpg --dearmor -o /usr/share/keyrings/redis-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/redis-archive-keyring.gpg] \
https://packages.redis.io/deb $(lsb_release -cs) main" | \
sudo tee /etc/apt/sources.list.d/redis.list
sudo apt update
sudo apt install -y redis

Verify the installation:
redis-server --version
redis-cli ping

You should see PONG — Redis is running.
Step 2: Production Configuration
sudo nano /etc/redis/redis.conf

Key settings to configure for production:
# Bind to localhost only — never expose Redis to the public internet
bind 127.0.0.1
# Set a strong password
requirepass YOUR_STRONG_REDIS_PASSWORD
# Maximum memory limit — prevents Redis from consuming all available RAM
# On a server dedicated to Redis, up to ~75% of RAM is reasonable;
# on a shared 2 GB VPS, leave headroom for the OS and application:
maxmemory 512mb
# Eviction policy — what happens when maxmemory is reached
# allkeys-lru: evict least recently used keys (best for pure cache)
# volatile-lru: evict LRU keys with TTL set (for mixed cache+persistent use)
# noeviction: return errors when memory full (for queues/persistent data)
maxmemory-policy allkeys-lru
# Enable persistence — AOF (Append Only File) for durability
appendonly yes
appendfsync everysec
# Slow log — log commands taking longer than 10ms
slowlog-log-slower-than 10000
slowlog-max-len 128
# Disable dangerous commands in production
rename-command FLUSHALL ""
rename-command FLUSHDB ""
rename-command CONFIG "CONFIG_b3f8a92c"
rename-command DEBUG ""

Restart Redis to apply:
sudo systemctl restart redis-server
sudo systemctl enable redis-server

Step 3: Use Case — Application Caching
Caching database query results in Redis is the most impactful performance optimisation for database-backed web applications. Instead of executing the same MySQL query on every request, the first request stores the result in Redis and subsequent requests retrieve it from memory.
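Caching also needs a write path: the usual companion to the read-through pattern is to delete the key whenever the underlying rows change, otherwise updates stay invisible until the TTL expires. A minimal sketch using the same key format; invalidate_category is an illustrative helper, not a library function, and r is a connected redis-py client:

```python
def category_cache_key(category_id):
    # Must match the key format used by the read path exactly.
    return f"products:category:{category_id}"

def invalidate_category(r, category_id):
    # DEL is idempotent, so calling this when the key is absent is harmless.
    # Returns the number of keys removed (0 or 1 here).
    return r.delete(category_cache_key(category_id))
```

Call invalidate_category right after the UPDATE/INSERT commits; the next read repopulates the cache from the database.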
Python (Flask/Django) with redis-py:
pip install redis

import redis
import json
r = redis.Redis(host='127.0.0.1', port=6379, password='YOUR_REDIS_PASSWORD', decode_responses=True)
def get_products(category_id):
    cache_key = f"products:category:{category_id}"
    # Try cache first
    cached = r.get(cache_key)
    if cached:
        return json.loads(cached)
    # Cache miss — query database
    products = db.query("SELECT * FROM products WHERE category_id = %s", category_id)
    # Store in cache for 5 minutes
    r.setex(cache_key, 300, json.dumps(products))
    return products

Node.js with ioredis:
npm install ioredis

const Redis = require('ioredis');
const redis = new Redis({ host: '127.0.0.1', port: 6379, password: 'YOUR_REDIS_PASSWORD' });
async function getProducts(categoryId) {
  const cacheKey = `products:category:${categoryId}`;
  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);
  const products = await db.query('SELECT * FROM products WHERE category_id = ?', [categoryId]);
  await redis.setex(cacheKey, 300, JSON.stringify(products));
  return products;
}

Step 4: Use Case — Session Storage
Storing user sessions in Redis instead of database tables or filesystem provides faster session reads and enables horizontal scaling (multiple app servers sharing sessions via a central Redis instance).
Flask session storage with Flask-Session:
pip install Flask-Session redis

from flask import Flask, session
from flask_session import Session
import redis
app = Flask(__name__)
app.config['SECRET_KEY'] = 'your_secret_key'
app.config['SESSION_TYPE'] = 'redis'
app.config['SESSION_REDIS'] = redis.Redis(
    host='127.0.0.1', port=6379, password='YOUR_REDIS_PASSWORD'
)
app.config['PERMANENT_SESSION_LIFETIME'] = 3600  # 1 hour
Session(app)

Django session storage:
pip install django-redis

# settings.py
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://:YOUR_REDIS_PASSWORD@127.0.0.1:6379/0",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
        }
    }
}
SESSION_ENGINE = "django.contrib.sessions.backends.cache"
SESSION_CACHE_ALIAS = "default"

Step 5: Use Case — Message Queue with Redis Lists
Redis lists provide a simple, high-performance message queue for background job processing — pushing tasks from web workers and consuming them in background Celery or custom worker processes.
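One caveat with the plain LPUSH/BRPOP pattern shown below: BRPOP removes the job before it is processed, so a worker that crashes mid-job loses it. For at-least-once delivery, Redis 6.2+ provides BLMOVE, which atomically moves the job onto a processing list; the worker removes it only after success. A sketch, assuming redis-py and the send_email helper used in this section:

```python
import json

def process_one(r, send_email,
                queue="email_queue", processing="email_queue:processing"):
    # Atomically move the oldest job (the tail, matching BRPOP) onto a
    # processing list; if the worker dies here, the job survives in
    # `processing` and a separate reaper can push it back onto the queue.
    raw = r.blmove(queue, processing, timeout=0, src="RIGHT", dest="LEFT")
    job = json.loads(raw)
    try:
        send_email(job["to"], job["template"], job["order_id"])
    finally:
        # Acknowledge: drop the entry from the processing list once handled.
        r.lrem(processing, 1, raw)
    return job
```

The processing list name and reaper strategy are assumptions; the core guarantee comes from BLMOVE being a single atomic command.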
# Producer — add job to queue
r.lpush('email_queue', json.dumps({
    'to': 'user@example.com',
    'template': 'order_confirmation',
    'order_id': 12345
}))

# Consumer — blocking pop (waits for new jobs)
while True:
    _, job_data = r.brpop('email_queue', timeout=0)
    job = json.loads(job_data)
    send_email(job['to'], job['template'], job['order_id'])

Step 6: Monitor Redis Performance
# Connect to Redis CLI with password
redis-cli -a YOUR_REDIS_PASSWORD
# Real-time statistics
INFO stats
INFO memory
INFO clients
# Monitor all commands in real time (use sparingly in production)
MONITOR
# Check memory usage per key pattern
MEMORY USAGE products:category:1
# View slow log
SLOWLOG GET 10

Key metrics to monitor:
used_memory_human — current memory consumption vs the maxmemory limit
keyspace_hits / keyspace_misses — cache hit ratio (target above 80%)
connected_clients — active connections
instantaneous_ops_per_sec — operations per second under load
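The hit-ratio target can be computed directly from the INFO counters. A small sketch using the stats dictionary that redis-py's r.info("stats") returns; the 80% threshold is the guideline from this section, not a Redis constant:

```python
def cache_hit_ratio(stats):
    # `stats` is the dict returned by r.info("stats") in redis-py; the
    # counters are cumulative since the last restart or CONFIG RESETSTAT.
    hits = stats.get("keyspace_hits", 0)
    misses = stats.get("keyspace_misses", 0)
    total = hits + misses
    return hits / total if total else 0.0

def hit_ratio_ok(stats, target=0.80):
    # True when the cache is meeting the target hit ratio.
    return cache_hit_ratio(stats) >= target
```

Run this periodically (cron or your monitoring agent) rather than on every request, since INFO itself has a small cost.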
Step 7: Security Hardening
# Ensure Redis is not accessible from outside the server
ss -tlnp | grep 6379

The output should show Redis binding only to 127.0.0.1:6379 — never to 0.0.0.0 (all interfaces). A Redis instance exposed to the public internet without authentication is a critical security vulnerability commonly exploited by automated scanners.
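As a further backstop, Redis ships with protected-mode (enabled by default since Redis 3.2), which refuses connections from non-loopback addresses when no password or explicit bind is configured. It costs nothing to confirm it is still enabled in /etc/redis/redis.conf:

```
# Refuse external connections unless authentication is explicitly configured
protected-mode yes
```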
Block port 6379 at the firewall level as a defence-in-depth measure:
sudo ufw deny 6379

Conclusion
Redis on a Hong Kong VPS provides sub-millisecond cache responses, persistent session storage, and reliable queue infrastructure — all running locally alongside your application with zero network round-trip overhead. Combined with CN2 GIA routing, a Redis-optimised application stack delivers fast, consistent response times for users across mainland China and Asia-Pacific.
Deploy Redis on Server.HK’s NVMe SSD Hong Kong VPS plans — fast local storage ensures Redis AOF persistence does not become an I/O bottleneck under write-heavy workloads.
Frequently Asked Questions
Should I run Redis on the same VPS as my application or on a separate server?
For most small to mid-size applications, running Redis on the same VPS as your application is practical and efficient — local socket or localhost communication has near-zero latency. Separate the Redis instance onto a dedicated server when your application scales to multiple app servers (all needing shared session/cache access) or when Redis memory requirements compete significantly with application memory.
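When Redis and the application share one VPS as described above, you can also skip TCP entirely and connect over a Unix domain socket, which trims a little per-request overhead. A sketch of the relevant redis.conf lines; the socket path is an assumption, so match it to your distribution's layout:

```
# /etc/redis/redis.conf
unixsocket /var/run/redis/redis.sock
unixsocketperm 770
```

redis-py can then connect with redis.Redis(unix_socket_path='/var/run/redis/redis.sock', password='YOUR_REDIS_PASSWORD'), provided your application user is in a group permitted by unixsocketperm.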
How much RAM does Redis need on a Hong Kong VPS?
Redis’s memory footprint depends entirely on the volume and size of cached data. A cache storing 100,000 typical web application objects (each averaging 1 KB) uses approximately 150–200 MB of RAM including Redis overhead. Set maxmemory to a defined limit and choose an appropriate eviction policy to prevent Redis from consuming unbounded memory.
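The 150–200 MB figure can be reproduced with a back-of-envelope calculation. The per-key overhead (~90 bytes for the dict entry, key string, and expiry metadata) and the allocator-fragmentation factor (~1.5) are rough assumptions, not measured constants:

```python
def estimate_redis_memory_bytes(num_keys, avg_value_bytes,
                                per_key_overhead=90, fragmentation=1.5):
    # Payload plus per-key bookkeeping, scaled by a typical
    # jemalloc fragmentation ratio (assumption, varies by workload).
    return int(num_keys * (avg_value_bytes + per_key_overhead) * fragmentation)

estimate = estimate_redis_memory_bytes(100_000, 1024)  # ~160 MB
```

Compare the estimate against INFO memory's used_memory and mem_fragmentation_ratio on your actual workload before fixing maxmemory.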
Does Redis persistence (AOF) affect performance on NVMe SSD?
With appendfsync everysec (the recommended setting), Redis fsync’s the AOF log once per second — a write pattern that NVMe SSD handles with negligible impact on throughput. The performance cost of AOF persistence on NVMe storage is effectively unmeasurable in production workloads.