Your Node.js app runs on port 3000. Your Python API runs on port 8000. But visitors expect to reach your site on ports 80 (HTTP) and 443 (HTTPS) — without typing port numbers into the URL.
That's where Nginx as a reverse proxy comes in. It sits in front of your application, handles incoming requests, and forwards them to the right backend. Along the way, it adds SSL termination, caching, compression, and load balancing — things your application shouldn't have to worry about.
Here's how to set it up properly.
What Is a Reverse Proxy?
A reverse proxy is a server that sits between clients (browsers) and your backend application. Instead of users connecting directly to your app, they connect to Nginx, which forwards the request to your application and returns the response.
Client → Nginx (port 80/443) → Your App (port 3000)
Why not just run your app on port 80?
- Running on port 80/443 requires root privileges — a security risk
- Your app shouldn't handle SSL certificates, compression, or static files
- Nginx handles thousands of concurrent connections more efficiently than most application servers
- You can run multiple apps on the same server, each on different domains
Prerequisites
- A VPS with Ubuntu 22.04 or 24.04
- SSH access with sudo privileges
- A domain name pointed to your server's IP
- A running web application (Node.js, Python, Go, etc.)
Step 1: Install Nginx
sudo apt update
sudo apt install nginx -y
sudo systemctl enable nginx
sudo systemctl start nginx
Verify it's running:
sudo systemctl status nginx
Visit your server's IP in a browser — you should see the default Nginx welcome page.
Step 2: Configure the Reverse Proxy
Create a new server block configuration:
sudo nano /etc/nginx/sites-available/myapp
Basic Reverse Proxy for a Node.js App
server {
listen 80;
server_name yourdomain.com www.yourdomain.com;
location / {
proxy_pass http://127.0.0.1:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_cache_bypass $http_upgrade;
}
}
Enable the configuration:
sudo ln -s /etc/nginx/sites-available/myapp /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx
Let's break down the important headers:
- Host — Passes the original domain name to your app
- X-Real-IP — Your app sees the visitor's real IP, not Nginx's local IP
- X-Forwarded-For — Preserves the chain of IPs if multiple proxies are involved
- X-Forwarded-Proto — Tells your app whether the original request was HTTP or HTTPS
- Upgrade and Connection — Required for WebSocket support
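On the application side, X-Forwarded-For needs careful handling, since clients can prepend fake entries to it. Here's a minimal Python sketch of one safe way to read it, assuming a known number of trusted proxy hops (the header values and helper name are illustrative, not from Nginx):

```python
# Sketch: recovering the client IP behind a trusted proxy chain.
# Assumes you know how many proxies you control in front of the app.
def client_ip(x_forwarded_for: str, trusted_hops: int = 1) -> str:
    """X-Forwarded-For looks like "client, proxy1, proxy2". With N trusted
    proxies in front of the app, the real client is the (N+1)-th entry from
    the right; entries further left are client-supplied and forgeable."""
    ips = [ip.strip() for ip in x_forwarded_for.split(",")]
    return ips[-(trusted_hops + 1)] if len(ips) > trusted_hops else ips[0]
```

With a single Nginx in front (trusted_hops=1), `client_ip("203.0.113.7, 10.0.0.5")` returns the visitor's address rather than Nginx's.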
Step 3: Add SSL with Let's Encrypt
Never run a production site without SSL. Certbot makes it free and automatic:
sudo apt install certbot python3-certbot-nginx -y
sudo certbot --nginx -d yourdomain.com -d www.yourdomain.com
Certbot automatically modifies your Nginx config to handle HTTPS and redirect HTTP traffic. It also sets up auto-renewal.
Test renewal:
sudo certbot renew --dry-run
Step 4: Serve Static Files Directly
Nginx serves static files much faster than your application server. Offload images, CSS, and JavaScript:
server {
listen 443 ssl;
server_name yourdomain.com;
# SSL config (managed by Certbot)
ssl_certificate /etc/letsencrypt/live/yourdomain.com/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/yourdomain.com/privkey.pem;
# Serve static files directly
location /static/ {
alias /var/www/myapp/static/;
expires 1y;
add_header Cache-Control "public, immutable";
}
location /uploads/ {
alias /var/www/myapp/uploads/;
expires 30d;
}
# Proxy everything else to the app
location / {
proxy_pass http://127.0.0.1:3000;
proxy_http_version 1.1;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
This can reduce your application server's load by 50% or more, since it no longer handles static asset requests.
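That `immutable` hint is only safe when a given URL's content can never change. The standard way to guarantee this is content-hashed filenames, which most build tools generate automatically; here's a minimal Python sketch of the idea (the naming convention is illustrative, not an Nginx feature):

```python
# Sketch: content-hashed filenames, the pattern that makes
# "Cache-Control: public, immutable" safe to serve.
import hashlib
from pathlib import Path

def hashed_name(path: Path, digest_len: int = 8) -> str:
    """app.css -> app.3f2a9c1b.css: any content change yields a new URL,
    so browsers can cache the old one forever without revalidating."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()[:digest_len]
    return f"{path.stem}.{digest}{path.suffix}"
```

Deploys then reference the hashed URL in HTML, and stale assets simply stop being requested.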
Step 5: Enable Compression
Compress responses to reduce bandwidth and improve load times:
# Add to the http block in /etc/nginx/nginx.conf
gzip on;
gzip_vary on;
gzip_min_length 1000;
gzip_proxied any;
gzip_types
text/plain
text/css
text/javascript
application/javascript
application/json
application/xml
image/svg+xml;
This typically reduces transfer sizes by 60-80% for text-based content.
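You can get a feel for those numbers with a quick experiment; the payload below is made up, but its repetitive structure is typical of JSON and HTML responses:

```python
# Sketch: measuring gzip savings on a typical repetitive JSON payload.
import gzip, json

payload = json.dumps([{"id": i, "status": "active", "role": "user"}
                      for i in range(500)]).encode()
compressed = gzip.compress(payload, compresslevel=6)  # nginx's default level is 1
print(f"{len(payload)} -> {len(compressed)} bytes "
      f"({100 - 100 * len(compressed) // len(payload)}% smaller)")
```

Highly repetitive JSON like this compresses far beyond the 60-80% typical for prose-like text.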
Step 6: Add Response Caching
Cache responses from your backend to reduce load:
# Add to http block in nginx.conf
# (avoid /tmp: it is cleared on reboot, and may be private to the nginx service under systemd)
proxy_cache_path /var/cache/nginx/appcache levels=1:2 keys_zone=APPCACHE:10m max_size=1g inactive=60m;
# In your server block
location /api/ {
proxy_pass http://127.0.0.1:3000;
proxy_cache APPCACHE;
proxy_cache_valid 200 10m;
proxy_cache_valid 404 1m;
proxy_cache_bypass $http_authorization;
add_header X-Cache-Status $upstream_cache_status;
}
The X-Cache-Status header helps you debug — check it in browser DevTools to see if responses are being served from cache (HIT) or from your backend (MISS).
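If you ever need to inspect the cache on disk, the layout is predictable: Nginx hashes the cache key (by default $scheme$proxy_host$request_uri) with MD5 and shards files into directories according to the levels parameter. A rough Python sketch of that mapping, with made-up paths and keys:

```python
# Sketch: how nginx maps a cache key to a file on disk.
# The key is MD5-hashed; levels=1:2 builds two directory levels
# from the tail of the hash (last char, then the two before it).
import hashlib

def cache_file_path(base: str, key: str) -> str:
    h = hashlib.md5(key.encode()).hexdigest()
    return f"{base}/{h[-1]}/{h[-3:-1]}/{h}"  # levels=1:2 layout

print(cache_file_path("cache", "http127.0.0.1:3000/api/users"))
```

This is why `rm -rf` on the cache directory is a valid (if blunt) way to purge everything.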
Multiple Applications on One Server
One of the biggest advantages of Nginx as a reverse proxy is running multiple apps on a single server:
# App 1: Main website
server {
listen 443 ssl;
server_name mysite.com;
location / {
proxy_pass http://127.0.0.1:3000;
# ... proxy headers
}
}
# App 2: API server
server {
listen 443 ssl;
server_name api.mysite.com;
location / {
proxy_pass http://127.0.0.1:4000;
# ... proxy headers
}
}
# App 3: Admin panel
server {
listen 443 ssl;
server_name admin.mysite.com;
location / {
proxy_pass http://127.0.0.1:5000;
# ... proxy headers
}
}
Each app runs independently on its own port, and Nginx routes traffic based on the domain name.
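Conceptually, Nginx is doing a lookup on the request's Host header. A toy Python sketch of that decision (domains and ports mirror the config above; the table itself is illustrative):

```python
# Sketch: the routing decision nginx makes from the Host header
# when matching server_name across multiple server blocks.
ROUTES = {
    "mysite.com":       3000,
    "api.mysite.com":   4000,
    "admin.mysite.com": 5000,
}

def upstream_port(host_header: str, default: int = 3000) -> int:
    # Strip any :port suffix a client may send, then match server_name.
    host = host_header.split(":")[0].lower()
    return ROUTES.get(host, default)
```

Requests for an unknown Host fall through to a default — in real Nginx, the default_server block.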
Basic Load Balancing
If your app runs on multiple instances, Nginx can distribute traffic:
upstream myapp {
server 127.0.0.1:3000;
server 127.0.0.1:3001;
server 127.0.0.1:3002;
}
server {
listen 443 ssl;
server_name yourdomain.com;
location / {
proxy_pass http://myapp;
# ... proxy headers
}
}
This round-robin setup distributes requests evenly across three app instances. For Node.js apps, you can use PM2's cluster mode to spawn multiple instances easily:
pm2 start app.js -i 3 --name myapp
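The default round-robin is conceptually just a repeating cycle over the backends. A toy Python sketch (real Nginx also honors per-server weights and skips backends that fail health checks):

```python
# Sketch: round-robin backend selection, as the upstream block does
# with equal weights and all backends healthy. Ports match the config above.
from itertools import cycle

backends = cycle(["127.0.0.1:3000", "127.0.0.1:3001", "127.0.0.1:3002"])

def pick_backend() -> str:
    return next(backends)

print([pick_backend() for _ in range(6)])  # each backend gets every third request
```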
Security Headers
Add security headers at the Nginx level so your application doesn't have to:
add_header X-Frame-Options "SAMEORIGIN" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-XSS-Protection "0" always; # the browser XSS auditor is deprecated; "0" disables it in legacy browsers
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
# Hide Nginx version
server_tokens off;
Rate Limiting
Protect your backend from abuse:
# Define rate limit zones (in http block)
limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;
limit_req_zone $binary_remote_addr zone=login:10m rate=5r/m;
# Apply to locations
location /api/ {
limit_req zone=api burst=20 nodelay;
proxy_pass http://127.0.0.1:3000;
}
location /auth/login {
limit_req zone=login burst=3 nodelay;
proxy_pass http://127.0.0.1:3000;
}
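Under the hood, limit_req is a leaky bucket: excess requests accumulate, drain at the configured rate, and get rejected with a 503 once they exceed the burst. A simplified Python model of that logic (a sketch, not nginx's actual implementation):

```python
# Sketch: leaky-bucket rate limiting, roughly what limit_req with
# nodelay does. `excess` drains at `rate` per second; once it would
# exceed `burst`, the request is rejected (nginx answers 503).
class RateLimiter:
    def __init__(self, rate: float, burst: int):
        self.rate, self.burst = rate, burst
        self.excess, self.last = 0.0, 0.0

    def allow(self, now: float) -> bool:
        # Drain the bucket for the elapsed time, then account this request.
        self.excess = max(0.0, self.excess - (now - self.last) * self.rate)
        self.last = now
        if self.excess > self.burst:
            return False
        self.excess += 1
        return True
```

With rate=10 and burst=2, a short spike of three requests passes, the fourth is rejected, and capacity recovers within a fraction of a second.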
Troubleshooting Common Issues
502 Bad Gateway: Your backend app isn't running. Check with pm2 status or systemctl status yourapp.
504 Gateway Timeout: Your app is taking too long to respond. Increase proxy timeouts:
proxy_connect_timeout 60s;
proxy_send_timeout 60s;
proxy_read_timeout 60s;
WebSocket connections failing: Make sure you have the Upgrade headers:
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
Real IP not showing in app logs: Ensure X-Real-IP and X-Forwarded-For headers are set, and configure your app to trust the proxy.
Testing Your Configuration
Always test before reloading:
# Test config syntax
sudo nginx -t
# Reload without downtime
sudo systemctl reload nginx
# Check error logs if something breaks
sudo tail -f /var/log/nginx/error.log
Deploy Behind Nginx with DeployBase
Setting up Nginx as a reverse proxy is one of the most impactful things you can do for your web application's performance and security. It's the standard architecture used by companies of all sizes, from startups to enterprises.
At DeployBase, our VPS plans come with Nginx pre-installed and SSH access for full configuration control. Whether you're running a Node.js API, a Python Django app, or multiple sites on one server, DeployBase gives you the performance and flexibility to set up your ideal architecture. Starting at $5/month with NVMe SSD storage and 24/7 support.
Get your VPS at DeployBase → — fast hosting, full control, zero compromise.