Your database is the most valuable thing on your server. Code can be re-deployed from Git. Static files can be re-uploaded. But your database — customer records, orders, content, user accounts — is irreplaceable if lost.
Yet most developers don't set up proper backups until after they've lost data. Don't be that developer. Here's how to set up automated, reliable database backups in under 30 minutes.
Why Automated Backups Matter
Manual backups don't work because humans forget. You'll do it religiously for two weeks, then skip a day, then a week, then it's been three months since your last backup — right when your disk fails.
Automated backups run whether you remember or not. Set them up once, verify they work, and sleep better knowing your data is protected.
Method 1: MySQL/MariaDB Backup Script
This covers WordPress, Laravel, and most PHP applications.
The Basic Script
Create a backup script:
sudo nano /home/deploy/scripts/db-backup.sh
#!/bin/bash
# Database Backup Script
# Runs daily via cron
# Fail the whole pipeline if mysqldump fails, not just gzip
set -o pipefail
# Configuration
DB_USER="backup_user"
DB_PASS="your_secure_password"
BACKUP_DIR="/home/deploy/backups/db"
RETENTION_DAYS=14
DATE=$(date +%Y-%m-%d_%H%M)
# Create backup directory
mkdir -p "$BACKUP_DIR"
# Dump all databases
mysqldump -u "$DB_USER" -p"$DB_PASS" \
--all-databases \
--single-transaction \
--routines \
--triggers \
--events \
| gzip > "$BACKUP_DIR/all-databases-$DATE.sql.gz"
# Check if backup succeeded
if [ $? -eq 0 ]; then
echo "[$DATE] Backup successful: $(ls -lh $BACKUP_DIR/all-databases-$DATE.sql.gz | awk '{print $5}')"
else
echo "[$DATE] ERROR: Backup failed!" >&2
exit 1
fi
# Remove old backups
find "$BACKUP_DIR" -name "*.sql.gz" -mtime +$RETENTION_DAYS -delete
echo "[$DATE] Cleanup complete. Remaining backups:"
ls -lh "$BACKUP_DIR"
Make it executable and, since it embeds a password, readable only by its owner:
chmod 700 /home/deploy/scripts/db-backup.sh
Create a Dedicated Backup User
Don't use your root MySQL credentials. Create a dedicated, least-privilege backup user:
CREATE USER 'backup_user'@'localhost' IDENTIFIED BY 'secure_backup_password';
GRANT SELECT, SHOW DATABASES, LOCK TABLES, EVENT, TRIGGER, SHOW VIEW
ON *.* TO 'backup_user'@'localhost';
FLUSH PRIVILEGES;
On MySQL 8.0.21 and later, mysqldump may also need the PROCESS privilege to dump tablespace information; either grant it or add --no-tablespaces to the dump command.
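Before wiring this into cron, it's worth a quick smoke test that the new user can actually run a dump. A sketch, assuming the username and password from the GRANT above:

```shell
# Dry run: dump only the schema of the built-in mysql database to /dev/null.
# A clean exit means the grants above are sufficient for mysqldump.
mysqldump -u backup_user -p'secure_backup_password' \
  --single-transaction --no-data mysql > /dev/null \
  && echo "backup_user grants OK" \
  || echo "grants missing - review the GRANT statement above" >&2
```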
Schedule with Cron
crontab -e
Add this line to run daily at 3 AM. Note that a non-root user usually can't write to /var/log, so create the log file first (sudo touch /var/log/db-backup.log && sudo chown deploy: /var/log/db-backup.log) or log under your home directory:
0 3 * * * /home/deploy/scripts/db-backup.sh >> /var/log/db-backup.log 2>&1
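If you prefer systemd to cron, a timer gets you journald logging and catch-up runs after downtime. A minimal sketch; the unit names db-backup.service and db-backup.timer are my own, and both files go in /etc/systemd/system/:

```shell
# --- /etc/systemd/system/db-backup.service ---
# [Unit]
# Description=Nightly database backup
# [Service]
# Type=oneshot
# ExecStart=/home/deploy/scripts/db-backup.sh
# User=deploy
#
# --- /etc/systemd/system/db-backup.timer ---
# [Unit]
# Description=Run db-backup daily at 3 AM
# [Timer]
# OnCalendar=*-*-* 03:00:00
# Persistent=true
# [Install]
# WantedBy=timers.target

# Then enable the timer and confirm the next scheduled run:
sudo systemctl daemon-reload
sudo systemctl enable --now db-backup.timer
systemctl list-timers db-backup.timer
```

Persistent=true is the piece cron lacks: if the server was off at 3 AM, the backup runs at next boot.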
Test It
Run the script manually first:
/home/deploy/scripts/db-backup.sh
Verify the backup by restoring it, but do this on a disposable VM or container, never on your production server: an --all-databases dump embeds CREATE DATABASE and USE statements, so it recreates the original databases wherever you pipe it, regardless of any database name on the command line.
gunzip < /home/deploy/backups/db/all-databases-2026-03-04_0300.sql.gz | mysql -u root -p
If your data is there, you're good.
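A full restore is the gold standard, but you can also cheaply verify each archive right after it's written. A sketch using a stand-in file so the commands run as-is; point BACKUP_FILE at your real dump:

```shell
# Stand-in archive for demonstration; in practice use e.g.
# BACKUP_FILE=/home/deploy/backups/db/all-databases-2026-03-04_0300.sql.gz
BACKUP_FILE=$(mktemp /tmp/demo-dump-XXXXXX.sql.gz)
printf -- '-- demo dump\n-- Dump completed on 2026-03-04\n' | gzip > "$BACKUP_FILE"

# gzip's built-in integrity check catches truncated or corrupted archives
gunzip -t "$BACKUP_FILE" && echo "archive OK"

# mysqldump writes a "Dump completed" marker as its final line;
# a missing marker means the dump was cut short
if zcat "$BACKUP_FILE" | tail -n 1 | grep -q 'Dump completed'; then
  echo "dump complete"
else
  echo "dump may be truncated" >&2
fi
rm -f "$BACKUP_FILE"
```

Both checks are fast enough to run inside the backup script itself on every dump.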
Method 2: PostgreSQL Backups
For Django, Rails, or Node.js apps using PostgreSQL:
#!/bin/bash
# PostgreSQL Backup Script
set -o pipefail  # surface pg_dumpall failures through the pipeline
BACKUP_DIR="/home/deploy/backups/pg"
DATE=$(date +%Y-%m-%d_%H%M)
RETENTION_DAYS=14
mkdir -p "$BACKUP_DIR"
# Dump all databases (roles and schemas included)
pg_dumpall -U postgres | gzip > "$BACKUP_DIR/pg-all-$DATE.sql.gz" || { echo "[$DATE] ERROR: backup failed" >&2; exit 1; }
# Or dump a specific database
# pg_dump -U postgres myapp_production | gzip > "$BACKUP_DIR/myapp-$DATE.sql.gz"
# Cleanup old backups
find "$BACKUP_DIR" -name "*.sql.gz" -mtime +$RETENTION_DAYS -delete
For PostgreSQL, configure ~/.pgpass to avoid password prompts in scripts:
localhost:5432:*:postgres:your_password
Set permissions:
chmod 600 ~/.pgpass
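As with MySQL, verify the PostgreSQL dump by restoring it, and do so against a scratch cluster or throwaway container rather than production, since pg_dumpall output recreates roles and databases cluster-wide. A sketch; the table name in the spot-check is a placeholder:

```shell
# Restore the full-cluster dump; psql replays the CREATE ROLE /
# CREATE DATABASE / \connect statements the dump contains
gunzip -c /home/deploy/backups/pg/pg-all-2026-03-04_0300.sql.gz | psql -U postgres -d postgres

# Spot-check that the data arrived ("users" is a placeholder table)
psql -U postgres -d myapp_production -c 'SELECT count(*) FROM users;'
```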
Offsite Backups: Don't Put All Your Eggs in One Basket
A backup on the same server as your database isn't really a backup. If the disk dies, you lose both. Send copies offsite.
Option A: Rsync to Another Server
# Add to your backup script
rsync -avz \
/home/deploy/backups/ \
backup@remote-server:/home/backup/myserver/
Be careful with --delete in this direction: it mirrors deletions, so if your local backups are ever wiped, the next sync would wipe the remote copies too.
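For the rsync to run unattended from cron, it needs passwordless SSH. A sketch: create a dedicated key with no passphrase and install it on the backup host (key path and hostname are examples):

```shell
# A dedicated key just for backups is easier to audit and revoke
ssh-keygen -t ed25519 -f ~/.ssh/backup_key -N "" -C "db-backup"
ssh-copy-id -i ~/.ssh/backup_key.pub backup@remote-server

# Point rsync at the key via the -e option
rsync -avz -e "ssh -i ~/.ssh/backup_key" \
  /home/deploy/backups/ \
  backup@remote-server:/home/backup/myserver/
```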
Option B: Upload to S3 or Compatible Storage
Install the AWS CLI:
sudo apt install awscli -y
aws configure
Add to your backup script:
# Upload to S3
aws s3 cp "$BACKUP_DIR/all-databases-$DATE.sql.gz" \
s3://my-backups/database/ \
--storage-class STANDARD_IA
# Remove S3 backups older than 30 days
aws s3 ls s3://my-backups/database/ | \
awk '{print $4}' | while read -r file; do
file_date=$(echo "$file" | grep -oP '\d{4}-\d{2}-\d{2}') || continue  # skip names without a date
if [[ $(date -d "$file_date" +%s) -lt $(date -d "-30 days" +%s) ]]; then
aws s3 rm "s3://my-backups/database/$file"
fi
done
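The loop above works, but S3 can expire old objects for you server-side, which is more robust than client-side date parsing. A sketch using a bucket lifecycle rule; bucket and prefix match the examples above, so adjust to yours:

```shell
# Let S3 delete backups under database/ once they are 30 days old
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-backups \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-db-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "database/" },
      "Expiration": { "Days": 30 }
    }]
  }'
```

Once the rule is in place, you can drop the cleanup loop from the backup script entirely.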
Option C: Rclone (Works with 40+ Cloud Providers)
Rclone supports Google Drive, Dropbox, Backblaze B2, Wasabi, and more:
# Install rclone
curl https://rclone.org/install.sh | sudo bash
# Configure (interactive)
rclone config
# Sync backups to cloud storage
rclone sync /home/deploy/backups/ remote:server-backups/
Monitoring Your Backups
Backups that silently fail are worse than no backups — they give you false confidence.
Simple Health Check
Add this to the end of your backup script:
# Check backup file size (should be > 1KB at minimum)
BACKUP_SIZE=$(stat -c%s "$BACKUP_DIR/all-databases-$DATE.sql.gz" 2>/dev/null || echo 0)
if [ "$BACKUP_SIZE" -lt 1024 ]; then
echo "WARNING: Backup file suspiciously small ($BACKUP_SIZE bytes)" >&2
# Send alert
curl -s -X POST "https://your-webhook-url" \
-d "text=⚠️ Database backup may have failed - file size: $BACKUP_SIZE bytes"
fi
Check Backup Logs
Review your logs periodically:
tail -20 /var/log/db-backup.log
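Log checks rely on you remembering to look. A dead-man's-switch service (healthchecks.io, Cronitor, and similar) inverts that: the script pings a URL after each successful run, and the service alerts you if the pings ever stop. A sketch; the UUID in the URL is a placeholder for your own check's endpoint:

```shell
# At the very end of the backup script, after all checks have passed:
curl -fsS --retry 3 "https://hc-ping.com/your-check-uuid" > /dev/null \
  || echo "WARNING: could not report success to the monitoring service" >&2
```

This catches the failure mode a size check can't: the script never running at all.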
Full Backup Strategy: The 3-2-1 Rule
Follow the industry-standard 3-2-1 backup rule:
- 3 copies of your data
- 2 different storage types (local disk + cloud)
- 1 offsite copy
Here's what that looks like in practice:
Copy 1: Live database (your server)
Copy 2: Local backup files (/home/deploy/backups/)
Copy 3: Cloud storage (S3, Backblaze B2, etc.)
Backup Schedule Recommendations
| Site Type | Frequency | Retention |
|---|---|---|
| Blog / portfolio | Daily | 14 days |
| Business site | Daily | 30 days |
| E-commerce store | Every 6 hours | 30 days |
| SaaS application | Hourly | 7 days + daily for 30 |
Adjust based on how much data you can afford to lose. If losing 24 hours of orders would be catastrophic, back up more frequently.
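The frequencies in the table map to crontab entries like these, using the script path from the earlier examples:

```shell
# Daily at 3 AM (blogs, business sites)
0 3 * * *    /home/deploy/scripts/db-backup.sh >> /var/log/db-backup.log 2>&1

# Every 6 hours (e-commerce)
0 */6 * * *  /home/deploy/scripts/db-backup.sh >> /var/log/db-backup.log 2>&1

# Hourly (SaaS)
0 * * * *    /home/deploy/scripts/db-backup.sh >> /var/log/db-backup.log 2>&1
```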
Quick-Start Checklist
- Create backup script with proper credentials
- Test backup by restoring to a test database
- Schedule via cron
- Set up offsite copy (S3, rclone, or rsync)
- Add monitoring/alerting for failures
- Document your restore procedure
- Test a full restore at least once per quarter
Protect Your Data with DeployBase
Automated backups should be a given, not an afterthought. At DeployBase, every VPS plan includes automated daily backups with 7-day retention — configured and running from the moment you deploy. For mission-critical applications, our managed hosting plans offer hourly backups with 30-day retention and one-click restore.
Combined with NVMe SSD storage, full root access, and 24/7 expert support, DeployBase gives you the infrastructure to keep your data safe without the setup headache.
Get started with DeployBase → — your data is too important to leave unprotected.