backup/backup

Keep not working with S3

wasif-saee-1993 opened this issue · 5 comments

The keep option is not working with my S3 settings.

I have almost 700 backups in my S3 bucket, and I am now trying to limit it to keeping only the last 40. I updated my store_with settings and added a keep option with a value of 40, but new backups are still being saved and the old ones are not being deleted.

My config is as follows:

store_with S3 do |s3|
  s3.access_key_id = '....'
  s3.secret_access_key = '.....'
  s3.region = '....'
  s3.bucket = 'abc'
  s3.path = ''
  s3.keep = 40
end

  • Operating system: Ubuntu 14.04.3 LTS
  • Ruby version: ruby 2.5.0p0
  • Tools or services used by backup: PostgreSQL
  • Backup version: 5.0.0.beta.1

I'm seeing this myself too, with both the S3 and Local storage options.
It used to work on my old setup, but stopped after moving to a Dockerised setup, so perhaps there's something about containers that I'm missing?
Some state file or something?

I'm running it via

docker-compose run --rm backup /bin/bash -c '/usr/local/bundle/bin/backup perform -t my_model --config-file /backup/model.rb'

The backup container is running Ruby 2.6.

I had a similar problem.
For me, it was because Backup's persistent information didn't match the actual backup data in S3.

Here's how Backup's Cycler works:

  1. When Backup's Cycler is used, small YAML files are stored in data_path.
  2. It removes outdated backup data based on that information.

So if outdated backup data is somehow missing from the persistent information, it will never be deleted.
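
To make that concrete, here is a rough Ruby sketch of the idea (not Backup's actual code; the file path and the YAML layout are assumptions for illustration):

require 'yaml'

# Assumed location of the Cycler's state file for a model named "my_model",
# using the default data_path.
data_file = File.expand_path('~/Backup/.data/my_model/S3.yml')
keep = 40

# Assume the file holds an array of stored package identifiers, oldest first.
packages = File.exist?(data_file) ? YAML.load_file(data_file) : []

# Only entries listed here can ever become "outdated"; anything that exists
# in S3 but is missing from this list is invisible to the Cycler and will
# never be removed.
outdated = packages.size > keep ? packages.first(packages.size - keep) : []
puts "Would remove #{outdated.size} outdated package(s)"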

How to solve

You have to make the persistent information match the actual backup data.
In my case, I just deleted all the backup data and the persistent information.

The location of the data_path can be configured in config.rb; the default is <root_path>/.data.
For S3 backups, the file would be <root_path>/.data/<backup_name>/S3.yml.
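
If you only want to reset the persistent information rather than deleting everything, something along these lines should work (the path assumes the default root path of ~/Backup and a model named my_model; adjust it to your own data_path and model name):

require 'fileutils'

# Assumed default data_path and a backup model named "my_model".
data_file = File.expand_path('~/Backup/.data/my_model/S3.yml')
FileUtils.rm_f(data_file)

# Note: once this file is gone the Cycler forgets the old backups, so any
# packages already in S3 have to be cleaned up manually (or simply kept).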

stale commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

I don't have much time to dig into this kind of issue. You'll have to send a PR.

dlynam commented

Are you using different storage ids (such as daily/weekly/monthly)?

If so, you need to specify the storage_id in the store_with method, for example:

store_with S3, storage_id do |s3| .... end

Another example here: http://backup.github.io/backup/v4/storages/
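
For instance, a hedged sketch of the same config with an illustrative :daily storage_id (the id, the 'daily' path, and the keep value are examples, not taken from the original config):

store_with S3, :daily do |s3|
  s3.access_key_id = '....'
  s3.secret_access_key = '....'
  s3.region = '....'
  s3.bucket = 'abc'
  s3.path = 'daily'
  s3.keep = 40
end

Each storage_id gets its own cycling data, so daily and weekly backups (for example) would be counted and expired independently.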