$docker.recipes
13 min read · Updated September 2025

Backup Strategies for Self-Hosted Docker Services

A practical guide to backing up Docker volumes, databases, and configurations with automation scripts and off-site backup strategies.

backup · docker · self-hosting · disaster-recovery

01 The Backup Lesson I Learned the Hard Way

In 2022, a failed SSD took down everything: Nextcloud files, my Bookstack wiki, Gitea repositories, and years of photos. I had backups — on a drive connected to the same machine. The SSD failure corrupted the USB controller, taking the backup drive with it. That experience taught me the most important lesson in self-hosting: backups aren't real until they're tested, automated, and stored off-site. This guide covers the strategy I've used since then. No data loss in three years.

02 What to Back Up

Not everything needs backing up. Docker images can be re-pulled, and application code lives in Git. Focus on:

- Docker volumes: your application data (databases, documents, media). The most critical target.
- Compose files and .env files: your infrastructure definition. Store them in Git.
- Database dumps: both filesystem backups AND logical dumps (pg_dump, mysqldump) for portability.
- SSL certificates and secrets: especially Let's Encrypt certs, which have issuance rate limits.
- Custom configs: Prometheus configs, Grafana dashboards, reverse proxy rules.
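A quick inventory of what a host is actually running helps build that list. A minimal sketch, guarded so it is harmless on a machine without Docker:

```shell
#!/bin/bash
# Sketch: enumerate the volumes and containers a backup plan must cover.
if command -v docker >/dev/null 2>&1; then
    docker volume ls --format '{{.Name}}'        # named volumes to archive
    docker ps --format '{{.Names}}: {{.Image}}'  # services whose data they hold
fi
INVENTORY_DONE=1
```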

03 Backing Up Docker Volumes

For services that can tolerate brief downtime, stop, copy, restart:
[backup-volumes.sh]
#!/bin/bash
# backup-volumes.sh: stop each service, archive its volume, restart
BACKUP_DIR="/backups/$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"

for service in nextcloud vaultwarden gitea; do
    echo "Backing up $service..."
    docker compose -f ~/docker/$service/docker-compose.yml stop
    tar -czf "$BACKUP_DIR/$service.tar.gz" \
        /var/lib/docker/volumes/${service}_data
    docker compose -f ~/docker/$service/docker-compose.yml start
done

For databases, prefer logical backups (pg_dump, mysqldump) over volume snapshots. They're more portable and can restore to different database versions.
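A logical dump might look like the sketch below. The container name (nextcloud-db), database, and user are placeholders for your own setup, and the docker call is guarded so the script degrades gracefully where Docker is unavailable:

```shell
#!/bin/bash
# Hypothetical logical backup of a Postgres container. Real setups would
# typically write under /backups; /tmp is used here as a safe default.
BACKUP_DIR="${BACKUP_ROOT:-/tmp/backups}/$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"
DUMP_FILE="$BACKUP_DIR/nextcloud.sql.gz"

if command -v docker >/dev/null 2>&1; then
    # pg_dump runs inside the container; the dump streams out and is compressed
    docker exec nextcloud-db pg_dump -U nextcloud nextcloud | gzip > "$DUMP_FILE"
fi
echo "$DUMP_FILE"
```

Because the dump is plain SQL, it can be restored into a newer Postgres version, which a raw volume snapshot cannot guarantee.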

04 Off-Site Backups with Rclone

Local backups protect against deletion and corruption. Off-site protects against hardware failure and disasters. You need both. Rclone supports 50+ cloud providers. Backblaze B2 is my recommendation — first 10GB free, then $0.005/GB/month:
[terminal]
# One-time setup
rclone config

# Add to your backup script
rclone sync /backups/ b2:my-homelab-backups/ \
    --transfers 4 \
    --log-file /var/log/rclone.log

# Automate with cron (weekly at 3 AM Sunday)
# 0 3 * * 0 /home/user/scripts/backup-and-sync.sh
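The backup-and-sync.sh invoked by that cron line might look roughly like this. It is a sketch: the bucket name and paths are the same placeholders used above, and the rclone call is guarded so the script is safe on machines where rclone is not installed:

```shell
#!/bin/bash
# Sketch of backup-and-sync.sh: run the local backup, then push it off-site.
# Real setups would typically write under /backups; /tmp is a safe default.
BACKUP_DIR="${BACKUP_ROOT:-/tmp/backups}/$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"

# 1. Local backup first (e.g. the backup-volumes.sh script shown earlier)
# /home/user/scripts/backup-volumes.sh

# 2. Then sync off-site; --transfers parallelises the uploads
if command -v rclone >/dev/null 2>&1; then
    rclone sync "$BACKUP_DIR" "b2:my-homelab-backups/$(basename "$BACKUP_DIR")" \
        --transfers 4 --log-file /tmp/rclone.log
fi
echo "done: $BACKUP_DIR"
```

Keeping the date in both the local directory and the remote path means a bad local backup never silently overwrites the last good off-site copy.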

05 Testing Your Backups

A backup you haven't tested is a hope, not a backup. Schedule quarterly restore tests:

1. Spin up a temporary Docker environment
2. Restore volume backups and database dumps
3. Start services and verify they work
4. Check that recent data exists
5. Document the restore procedure

I keep a restore-test.md file with step-by-step restoration instructions. When you need to restore at 2 AM during a crisis, you don't want to figure it out from scratch. Check out our storage and backup recipes for Docker Compose configurations of tools like Duplicati, Restic, and BorgBackup that automate much of this with a web UI.
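The restore drill can be rehearsed in miniature entirely on the local filesystem; the same tar commands then apply to your real volume archives:

```shell
#!/bin/bash
# Miniature restore round-trip: archive a directory, unpack it elsewhere,
# and diff the two. All paths here are throwaway temp directories.
set -eu
SRC=$(mktemp -d)
DST=$(mktemp -d)
echo "sample data" > "$SRC/file.txt"

tar -czf "$SRC.tar.gz" -C "$SRC" .      # back up
tar -xzf "$SRC.tar.gz" -C "$DST"        # restore
diff -r "$SRC" "$DST" && echo "restore OK"
```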

Always verify your database backups aren't empty. Add size checks to your scripts — a backup file under 100 bytes likely contains an error message, not data.
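One way to sketch that size check (the 100-byte threshold and the function name are illustrative):

```shell
#!/bin/bash
# Illustrative size guard: reject backup files below a minimum size,
# since tiny "dumps" are usually captured error messages.
MIN_BYTES=100

check_backup() {
    local file="$1" size
    # stat -c works on GNU coreutils; stat -f %z is the BSD/macOS fallback
    size=$(stat -c %s "$file" 2>/dev/null || stat -f %z "$file")
    if [ "$size" -lt "$MIN_BYTES" ]; then
        echo "WARNING: $file is only ${size} bytes, likely a failed dump" >&2
        return 1
    fi
}

# Demo against throwaway files
good=$(mktemp); head -c 4096 /dev/zero > "$good"
bad=$(mktemp);  echo "error" > "$bad"
check_backup "$good" && echo "good file passes"
check_backup "$bad"  || echo "bad file rejected"
```

Wiring check_backup into the dump script right after pg_dump turns a silent failure into a loud one.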

About the Author

Frank Pegasus

DevOps engineer and self-hosting enthusiast with over a decade of experience running containerized workloads in production. Creator of docker.recipes.