A Dockerfile based on Alpine Linux with postgres-client installed. It uses the pg_dump utility to take a backup of a PostgreSQL database and then uses gsutil from the Google Cloud SDK to upload the backup to Google Cloud Storage.
The container is meant to run once and terminate when processing is complete; it has no long-running process to keep it alive.
Find this image on Docker Hub at: thestevenbell/pg_dump-to-google_gcs:latest https://cloud.docker.com/repository/docker/thestevenbell/pg_dump-to-google_gcs/general
-
join or init a Docker swarm, e.g.
docker swarm init
-
set the required environment variables; these are passed as flags to the pg_dump command
export PSQL_REMOTE_HOST=<psql>
export PSQL_USERNAME=<username>
export PSQL_DBNAME=<dbname>
export PGPASSWORD=<userpassword>
export PSQL_REMOTE_HOST_PORT=<port>
export PSQL_SCHEMA=<schema>
export GCS_BUCKET_NAME=<googleCloudStorageBucketName>
export PATH_TO_GCLOUD_SCV_ACCOUNT_CREDENTIALS_FILE=<pathToBucket.json>
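For orientation, the backup step inside the container presumably looks something like the sketch below. This is not the image's actual entrypoint script, just a hedged illustration of how the variables above would feed pg_dump and gsutil; the `mydb` fallback and the dump filename are assumptions.

```shell
#!/bin/sh
# Sketch only: name the dump after the database and the current date.
# "mydb" is a placeholder default, not something the image defines.
BACKUP_FILE="${PSQL_DBNAME:-mydb}_$(date +%Y-%m-%d).dump"

# Run pg_dump only if the client tools are installed; PGPASSWORD is
# read from the environment automatically by pg_dump.
if command -v pg_dump >/dev/null 2>&1; then
  pg_dump \
    --host="$PSQL_REMOTE_HOST" \
    --port="$PSQL_REMOTE_HOST_PORT" \
    --username="$PSQL_USERNAME" \
    --dbname="$PSQL_DBNAME" \
    --schema="$PSQL_SCHEMA" \
    --format=custom \
    --file="$BACKUP_FILE"
fi

# Upload the dump to the configured bucket if gsutil is available.
if command -v gsutil >/dev/null 2>&1; then
  gsutil cp "$BACKUP_FILE" "gs://$GCS_BUCKET_NAME/"
fi
```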
-
create the secrets needed by the container
./manage-secrets.sh
-
start the service using the provided docker-compose.yml file
docker stack deploy -c docker-compose.yml pg_dump2GC
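The repository's docker-compose.yml is not shown here, but a minimal stack file for a run-once backup job could look like the sketch below. The service name matches the one used in the log command later in this README; the secret name and the `restart_policy: none` choice (so the one-shot container is not restarted after it exits) are assumptions.

```yaml
version: "3.7"
services:
  pg_dump-to-google_gcs:
    image: thestevenbell/pg_dump-to-google_gcs:latest
    environment:
      # Passed through from the shell environment set earlier.
      - PSQL_REMOTE_HOST
      - PSQL_REMOTE_HOST_PORT
      - PSQL_USERNAME
      - PSQL_DBNAME
      - PGPASSWORD
      - PSQL_SCHEMA
      - GCS_BUCKET_NAME
    secrets:
      - gcloud_svc_account_creds
    deploy:
      restart_policy:
        condition: none   # run once and exit; do not restart
secrets:
  gcloud_svc_account_creds:
    external: true   # created beforehand by manage-secrets.sh
```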
-
check the logs for the service with
docker service logs pg_dump2GC_pg_dump-to-google_gcs -f
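Once the task has finished, you can verify the result. The checks below are a hedged sketch that assumes docker and gsutil are installed and authenticated on the machine you run them from; `my-bucket` is a placeholder default.

```shell
#!/bin/sh
# Build the bucket URL from the variable set earlier; "my-bucket" is
# only a fallback placeholder for this example.
BUCKET_URL="gs://${GCS_BUCKET_NAME:-my-bucket}/"

# Confirm the one-shot task ran to completion.
if command -v docker >/dev/null 2>&1; then
  docker service ps pg_dump2GC_pg_dump-to-google_gcs
fi

# List the bucket to confirm the dump arrived.
if command -v gsutil >/dev/null 2>&1; then
  gsutil ls "$BUCKET_URL"
fi
```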