Back up a data-oriented container and a MariaDB/MySQL container, and upload the archives to an FTP/SFTP server using lftp.

This script archives your /data folder (or a custom path) and uses mysqldump (or pg_dump for PostgreSQL) to back up your database.
Configuration is done through environment variables:

- `FTP_USER` - FTP server username
- `FTP_PASS` - FTP server user password
- `FTP_HOST` - FTP server hostname
- `FTP_PORT` - FTP server port
- `FTP_PROTO` - Protocol to use (default: `ftp`)
- `LOCAL_PATH` - Absolute path of the folder to back up (default: `/data`)
- `REMOTE_PATH` - Your FTP backup destination folder
- `COMPRESS` - (Optional) Default: `1`, compress the TAR archive
- `CHUNK_SIZE` - (Optional) Default: `0`, in megabytes; splits the TAR archive into parts so very large archives upload more reliably over FTP
- `PARALLEL_UPLOADS` - (Optional) Default: `3`, only for split archives; maximum number of parallel uploads at the same time
- `DB_USER` - (Optional) MySQL user name
- `DB_HOST` - (Optional) MySQL host name
- `DB_PASS` - (Optional) MySQL user password
- `DB_NAME` - (Optional) MySQL database name
- `PGDATABASE` - (Optional) PostgreSQL database name
- `PGHOST` - (Optional) PostgreSQL host name
- `PGOPTIONS` - (Optional) PostgreSQL options
- `PGPORT` - (Optional) PostgreSQL port
- `PGUSER` - (Optional) PostgreSQL user name
- `PGPASSWORD` - (Optional) PostgreSQL user password
Your PostgreSQL server version must be supported by the bundled pg_dump: version 12.x at most.
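Before the first run, it is worth checking both versions yourself; a minimal sketch, assuming the `psql` client is installed and the `PG*` variables above are set:

```bash
# The server reported here must not be newer than the pg_dump client.
pg_dump --version
# e.g. "pg_dump (PostgreSQL) 12.7"
psql -h "$PGHOST" -p "$PGPORT" -U "$PGUSER" -tAc "SHOW server_version;"
```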
A typical run against a linked MariaDB container looks like this:

```bash
docker run --rm -t --name="backup1" -v my-data-volume:/data:ro \
-e DB_USER="toto" \
-e DB_HOST="mariadb" \
-e DB_PASS="123abc" \
-e DB_NAME="foo_db" \
-e FTP_USER="username" \
-e FTP_PASS="butterfly" \
-e FTP_HOST="foobar.com" \
-e FTP_PORT="21" \
-e REMOTE_PATH="/backups/my-site" \
-e COMPRESS="0" \
-e CHUNK_SIZE="128" \
--link my-mariadb:mariadb ambroisemaupate/ftp-backup
```

If your folder to back up mostly contains JPG, PNG or WebP images (or any already-compressed data), do not compress the archive: it will use a lot of CPU for very little space saved. Set `COMPRESS=0`.

Splitting archives before uploading improves stability and lets transfers resume if you get disconnected from the FTP server: `CHUNK_SIZE=128`.
The `split` command generates `*.partaa` through `*.partzz` files, keeping the parts in order so that `cat` can easily join them back, as sketched below.
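Conceptually, the splitting step boils down to a single `split` call; this is a sketch of the idea, not the image's exact code (`backup.tar` is a placeholder name):

```bash
# Cut the archive into CHUNK_SIZE-megabyte parts; the default two-letter
# suffix yields backup.tar.partaa, backup.tar.partab, ... in sort order.
split -b "${CHUNK_SIZE}M" backup.tar backup.tar.part
```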
If you need to recover the archive from its split parts, use:

```bash
cat 20210727_0904_files/20210727_0904_files.tar.part* >> 20210727_0904_files.tar
```
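If the parts only live on the FTP server, you can fetch them all with lftp before joining them; a sketch reusing the host, credentials and remote path from the example above:

```bash
# Download every part of a split archive before joining it with `cat`.
lftp -u "username,butterfly" -p 21 foobar.com \
    -e "mget -O . /backups/my-site/20210727_0904_files/*.part*; bye"
```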
I use a simple bash script to automate docker backups, along with a `ftp-credentials.sh` script that stores the FTP access data once and for all. I parameterized the docker name so container names are generated automatically, using the following naming policy (see the short sketch after this list):

- The main worker container is called `NAME`
- The database worker container is `NAME` with a `_DB` suffix
- The main data container is `NAME` with a `_DATA` suffix
- The database data container is `NAME` with a `_DBDATA` suffix
- The backup container is `NAME` with a `_BCK` suffix
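In bash terms, the convention expands like this for a given `NAME`:

```bash
NAME="my-docker-container"
echo "${NAME}"         # main worker container
echo "${NAME}_DB"      # database worker container
echo "${NAME}_DATA"    # main data container
echo "${NAME}_DBDATA"  # database data container
echo "${NAME}_BCK"     # backup container
```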
```bash
#!/usr/bin/env bash
# Author: Ambroise Maupate
# File: /root/scripts/bck-my-docker-container.sh
. "$(dirname "$0")/ftp-credentials.sh" || {
    echo "$(dirname "$0")/ftp-credentials.sh";
    echo 'Could not import your FTP config.';
    exit 1;
}
NAME="my-docker-container"
docker run --rm -t --name="${NAME}_BCK" -v ${NAME}_DATA:/data:ro \
-e FTP_USER="${FTP_USER}"\
-e FTP_PASS="${FTP_PASS}"\
-e FTP_HOST="${FTP_HOST}"\
-e FTP_PORT="${FTP_PORT}"\
-e DB_USER="my_docker_container_dbuser"\
-e DB_HOST="mariadb"\
-e DB_PASS="my_docker_container_dbpass"\
-e DB_NAME="my_docker_container_db"\
-e REMOTE_PATH="/docker-bck/${NAME}"\
--link ${NAME}_DB:mariadb ambroisemaupate/ftp-backup
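Before scheduling it, run the script once by hand to make sure the FTP credentials and the database link are correct (plain bash, nothing specific to this image):

```bash
bash ~/scripts/bck-my-docker-container.sh
# or, to trace every command while debugging:
bash -x ~/scripts/bck-my-docker-container.sh
```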
Here is the central FTP credentials script:
```bash
#!/usr/bin/env bash
# Author: Ambroise Maupate
# File: /root/scripts/ftp-credentials.sh
FTP_USER="myFtpUser"
FTP_PASS="myFtpPassword"
FTP_HOST="myFtp.host.com"
FTP_PORT="21"Then all you need is to setup this in your root’s crontab:
0 2 * * * /bin/bash ~/scripts/bck-my-docker-container.sh >> bcklog-my-docker-container.logAnd do not forget to set executable flag on your scripts:
chmod u+x ~/scripts/ftp-credentials.sh
chmod u+x ~/scripts/bck-my-docker-container.shAdd FTP_PORT and FTP_PROTO environment vars.
```bash
docker run --rm -t --name="backup1" -v my-data-volume:/data:ro \
-e DB_USER="toto" \
-e DB_HOST="mariadb" \
-e DB_PASS="123abc" \
-e DB_NAME="foo_db" \
-e FTP_USER="username" \
-e FTP_PASS="butterfly" \
-e FTP_HOST="foobar.com" \
-e FTP_PORT="22" \
-e FTP_PROTO="sftp" \
-e CHUNK_SIZE="512" \
-e REMOTE_PATH="/home/username/backups/my-site" \
--link my-mariadb:mariadb ambroisemaupate/ftp-backup
```

You can also run backups with docker-compose:

```yaml
version: "3"
services:
  db:
    image: mysql:5.7
    volumes:
      - DBDATA:/var/lib/mysql
    environment:
      MYSQL_DATABASE: test
      MYSQL_USER: test
      MYSQL_PASSWORD: test
      MYSQL_RANDOM_ROOT_PASSWORD: "yes"
    restart: always
  backup:
    image: ambroisemaupate/ftp-backup
    depends_on:
      - db
    environment:
      LOCAL_PATH: /var/www/html
      DB_USER: test
      DB_HOST: db
      DB_PASS: test
      DB_NAME: test
      FTP_PROTO: ftp
      FTP_PORT: 21
      FTP_HOST: ftp.server.test
      FTP_USER: test
      FTP_PASS: test
      CHUNK_SIZE: 512
      REMOTE_PATH: /home/test/backups
    volumes:
      - public_files:/var/www/html/web/files:ro
volumes:
  public_files:
  DBDATA:
```
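With this compose file in place you can trigger a backup on demand; standard docker-compose usage, where `--no-deps` avoids restarting the `db` service if it is already running:

```bash
docker-compose run --rm --no-deps backup
```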
You can add as many services as you want to create rolling backups (daily, weekly, monthly):

```yaml
  # DAILY
  backup_daily:
    image: ambroisemaupate/ftp-backup
    depends_on:
      - db
    environment:
      LOCAL_PATH: /var/www/html
      DB_USER: test
      DB_HOST: db
      DB_PASS: test
      DB_NAME: test
      FTP_PROTO: ftp
      FTP_PORT: 21
      FTP_HOST: ftp.server.test
      FTP_USER: test
      FTP_PASS: test
      REMOTE_PATH: /home/test/backups/daily
    volumes:
      - public_files:/var/www/html/web/files:ro
  backup_cleanup_daily:
    image: ambroisemaupate/ftp-cleanup
    environment:
      FTP_PROTO: ftp
      FTP_PORT: 21
      FTP_HOST: ftp.server.test
      FTP_USER: test
      FTP_PASS: test
      STORE_DAYS: 7
      FTP_PATH: /home/test/backups/daily
  # WEEKLY
  backup_weekly:
    image: ambroisemaupate/ftp-backup
    depends_on:
      - db
    environment:
      LOCAL_PATH: /var/www/html
      DB_USER: test
      DB_HOST: db
      DB_PASS: test
      DB_NAME: test
      FTP_PROTO: ftp
      FTP_PORT: 21
      FTP_HOST: ftp.server.test
      FTP_USER: test
      FTP_PASS: test
      REMOTE_PATH: /home/test/backups/weekly
    volumes:
      - public_files:/var/www/html/web/files:ro
  backup_cleanup_weekly:
    image: ambroisemaupate/ftp-cleanup
    environment:
      FTP_PROTO: ftp
      FTP_PORT: 21
      FTP_HOST: ftp.server.test
      FTP_USER: test
      FTP_PASS: test
      STORE_DAYS: 30
      FTP_PATH: /home/test/backups/weekly
  # MONTHLY
  backup_monthly:
    image: ambroisemaupate/ftp-backup
    depends_on:
      - db
    environment:
      LOCAL_PATH: /var/www/html
      DB_USER: test
      DB_HOST: db
      DB_PASS: test
      DB_NAME: test
      FTP_PROTO: ftp
      FTP_PORT: 21
      FTP_HOST: ftp.server.test
      FTP_USER: test
      FTP_PASS: test
      REMOTE_PATH: /home/test/backups/monthly
    volumes:
      - public_files:/var/www/html/web/files:ro
  backup_cleanup_monthly:
    image: ambroisemaupate/ftp-cleanup
    environment:
      FTP_PROTO: ftp
      FTP_PORT: 21
      FTP_HOST: ftp.server.test
      FTP_USER: test
      FTP_PASS: test
      STORE_DAYS: 366
      FTP_PATH: /home/test/backups/monthly
```

Then launch them once a day, once a week and once a month from your crontab:
```
# Rolling backups (stagger the hours so the jobs do not compete for CPU)
# Daily
00 2 * * * cd /mywebsite.com && /usr/local/bin/docker-compose run --rm --no-deps backup_daily
20 2 * * * cd /mywebsite.com && /usr/local/bin/docker-compose run --rm --no-deps backup_cleanup_daily
# Weekly
00 3 * * 1 cd /mywebsite.com && /usr/local/bin/docker-compose run --rm --no-deps backup_weekly
20 3 * * 1 cd /mywebsite.com && /usr/local/bin/docker-compose run --rm --no-deps backup_cleanup_weekly
# Monthly
00 4 1 * * cd /mywebsite.com && /usr/local/bin/docker-compose run --rm --no-deps backup_monthly
20 4 1 * * cd /mywebsite.com && /usr/local/bin/docker-compose run --rm --no-deps backup_cleanup_monthly
```
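Finally, a quick way to verify that the cron jobs actually produce files on the server; a sketch reusing the credentials script from above and the daily path from the compose example:

```bash
. ~/scripts/ftp-credentials.sh
lftp -u "${FTP_USER},${FTP_PASS}" -p "${FTP_PORT}" "${FTP_HOST}" \
    -e "ls /home/test/backups/daily; bye"
```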