TurboGit/hubicfuse

Unable to download large file

Closed this issue · 2 comments

Hello,

I uploaded a large backup (350 GB) and I would like to download it, but for an unknown reason my transfer crashes at approximately 100 GB when using hubicfuse.

I see that there are multiple segments of my backup file. Is it possible to use those segments to rebuild the original file? (I also created and successfully downloaded a PAR archive in order to protect my backup image.)

Best regards, Nikos

Hard to help, as I'm not using large files with GitHub. Have you checked the /tmp directory? I think it is used to stage the files temporarily. Does it have enough space?

Hello @TurboGit,

I can now answer my own question :)

Since I had a PAR2 archive of my 300 GB file, I was able to check for damaged or missing blocks, and indeed the segments are simply split pieces of my original file.

You can safely download all the segments of your file and then join them using "cat" on a Linux system.
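A minimal sketch of the join step, using hypothetical segment file names (real hubicfuse segments carry numeric suffixes, so concatenating them in lexical order reassembles the original file):

```shell
# Create two hypothetical segment files standing in for downloaded segments.
printf 'part1-' > backup.img.000
printf 'part2'  > backup.img.001
# Concatenate them in lexical order to rebuild the original file.
cat backup.img.000 backup.img.001 > backup.img
```

If the reassembled file has a PAR2 archive alongside it, verifying against that archive is a good way to confirm the join produced the original data.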

Regards,

Nikos