ipfs/go-ds-s3

Multiple IPFS instances, one S3 bucket. Is this possible?

Rick7C2 opened this issue · 4 comments

I want to set up multiple IPFS instances around the world that share the same S3 datastore.

Only one IPFS instance would need to be able to write to the S3 bucket. The rest would only read the data on S3 and serve the content via IPFS.

Is this possible with the S3 datastore? If so how would I go about doing this?

I've got two VPSes with IPFS and the S3 plugin set up to use the same bucket. I add and pin a file on one, then shut that IPFS server down.

I then try to retrieve the file from an IPFS client bootstrapped to the second IPFS server attached to the S3 bucket, and nothing happens.


cc @MichaelMure any thoughts or ideas here? It seems like if you don't GC then you could potentially share a blockstore (not the whole datastore) across a number of repos.
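A blockstore-only setup like the one described above can be sketched as a `Datastore.Spec` that mounts only `/blocks` on S3 while keeping the rest of the datastore (pins, keys, etc.) local to each node. The shape follows the go-ds-s3 README; the bucket name, region, and credentials here are placeholders:

```json
{
  "type": "mount",
  "mounts": [
    {
      "mountpoint": "/blocks",
      "type": "measure",
      "prefix": "s3.datastore",
      "child": {
        "type": "s3ds",
        "region": "us-east-1",
        "bucket": "my-shared-ipfs-bucket",
        "accessKey": "",
        "secretKey": ""
      }
    },
    {
      "mountpoint": "/",
      "type": "measure",
      "prefix": "leveldb.datastore",
      "child": {
        "type": "levelds",
        "path": "datastore",
        "compression": "none"
      }
    }
  ]
}
```

With a spec like this, the blocks land in the shared bucket, but each node still tracks its own pin set locally under `/`, so a pin on one node does not protect blocks from GC run on another.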

Seems this was already asked here... #19

Is this just speculation or would this actually work in RO mode?

Is there a way to make IPFS RO?

Not really. Basically, you can share a single S3 bucket, but it's a "you need to know what you're doing" kind of thing: none of the nodes can ever run GC (a GC on one node would delete blocks the others still reference), and only one node should write.