
💾 Go s3

Utility to use S3 storage with Go. It can upload content to S3 storage for use as a backup service.

👀 Project status

Badges: action status, coverage, quality gate status, maintainability rating, security rating, bugs, and vulnerabilities.

🔑 Create keys

The access key and secret key must be defined in environment variables. Check example.env for more info about the environment variables.
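
As a minimal sketch, an env file holding the two required keys can look like this (the values are placeholders; the variable names are listed in the Environment variables section below):

ACCESS_KEY=your-access-key
SECRET_KEY=your-secret-key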

⛵ Docker usage

You can use the CLI app in Docker.

🚣 Build Docker image

docker build -t s3 .
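
Assuming the image entrypoint is the gos3 binary (as the Docker usage example further below suggests), the freshly built image can be run with the configuration passed inline; a sketch with placeholder values:

docker run --rm \
  -e ENDPOINT=http://s3.eu-central-1.amazonaws.com:9000 \
  -e ACCESS_KEY=your-access-key \
  -e SECRET_KEY=your-secret-key \
  -e BUCKET=sample \
  s3 -l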

🔰 Environment variables

  • ENDPOINT: Endpoint to connect to; it can be any S3-compatible endpoint and can also be set on the command line with --endpoint or -e (example: http://s3.eu-central-1.amazonaws.com:9000)
  • ACCESS_KEY: Access key to use (required)
  • SECRET_KEY: Secret key to use (required)
  • BUCKET: Bucket name to connect to; the bucket can be created with -c, and the name can also be set on the command line with --bucket
  • MAX_RETRIES: Maximum number of connection retries on failure (optional, example: 3)
  • FORCE_PATH_STYLE: Force path-style addressing (optional, example: true)
  • SSL_ENABLED: Enable SSL connections to the endpoint (optional, example: false)
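
With all of these collected in an env file such as the example.env mentioned above, Docker can load them in one step through its --env-file flag; a minimal sketch that lists the bucket contents:

docker run --rm --env-file example.env s3 -l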

🏁 CLI help output

Docker usage

Using the Docker image from hub.docker.com:

docker run --rm d0whc3r/gos3 --help
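
To back up local files through Docker, they have to be visible inside the container; a sketch using a bind mount (the /data mount point is an arbitrary choice for this example, not a path the image defines):

docker run --rm --env-file example.env -v "$(pwd)":/data d0whc3r/gos3 --bucket sample -b "/data/*" -f backup -c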

CLI usage

Help for go s3

Usage:
  gos3 [flags]

Examples:
  1. List files in "sample" bucket:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -l
  2. Backup multiple files to "backupFolder" folder:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/index.ts -b images/logo.png -f backupFolder
  3. Backup files using a wildcard to "backup" folder:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/* -b images/* -f backup
  4. Backup files using a wildcard and zip them into "zipped" folder; the bucket will be created if it doesn't exist:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/* -b images/* -z -f zipped -c
  5. Backup files using a wildcard and zip them as "allfiles.zip" into "zipped" folder; the bucket will be created if it doesn't exist and the zip file will be replaced if it already exists:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/* -b images/* -n allfiles.zip -f zipped -c -r
  6. Delete files in "uploads" folder older than 2 days and files in "monthly" folder older than 1 month:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -d uploads=2d -d monthly=1M
  7. Delete files in "uploads" folder older than 1 minute:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -f uploads -d 1m
  8. Generate a MySQL dump file, zip it, and upload it to "mysql-backup" folder:
     $ gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -f mysql-backup -m -z

Flags:
  -b, --backup stringArray   Backup files
      --bucket string        Destination bucket (can be defined by $BUCKET env variable)
  -c, --create               Create destination upload bucket
  -d, --delete stringArray   Delete files in foldername older than the given duration
  -e, --endpoint string      Destination url (can be defined by $ENDPOINT env variable)
  -f, --folder string        Folder name to upload file/s
  -h, --help                 help for gos3
  -l, --list                 List all files
  -m, --mysql                MySQL backup using environment variables to connect to the MySQL server ($MYSQL_USER, $MYSQL_PASSWORD, $MYSQL_DATABASE, $MYSQL_HOST, $MYSQL_PORT)
  -r, --replace              Replace existing files when uploading backups
  -z, --zip                  Zip backup files
  -n, --zipname string       Zip name for backup files                      
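
As a concrete illustration of the --mysql flag, the connection variables listed above can be exported before invoking the tool; the values here are placeholders:

export MYSQL_USER=root
export MYSQL_PASSWORD=secret
export MYSQL_DATABASE=mydb
export MYSQL_HOST=localhost
export MYSQL_PORT=3306
gos3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -f mysql-backup -m -z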

Alternatives

The same interface/API implemented in Node.js: node-s3