Encryption crashes with large files
Ozwel opened this issue · 8 comments
Hi,
Picocrypt sounds great for encrypting backup files before uploading them to storage platforms such as Azure Storage, AWS S3, etc.
However, while I can encrypt my 1 or 2 TB files with 7z, Picocrypt cannot.
Is there a way to allow large-file encryption with Picocrypt? I guess the limitation comes from my system disk (1 TB SSD), but there should be a way, since 7z has no issue.
Thanks for reporting this issue! There is no designed size limitation in Picocrypt, so this is an interesting one. First, by "not working", do you mean Picocrypt is crashing, or is it showing an error message but still running? These are two very different cases, so I need to know before I can look into it more deeply. Ideally, run Picocrypt in a terminal and screenshot what happens. Also, give me your OS and its version/release/etc., your exact system disk size and usage in GB, and the exact size of the files you are encrypting in GB. Thanks.
Hi,
I performed my "test" again, and I believe the issue is that you are using the system disk as a cache while combining the files prior to compressing/encrypting. Therefore, if the free space on c:\ is smaller than the files to encrypt, the process fails. Since many users have a small SSD as their system disk (500 GB or 1 TB) and keep larger drives for data, this failure will often occur when encrypting backups.
Here is my test to reproduce:
I took my OneDrive folder (1.33 TB) and dragged and dropped it into your tool.
Here is a screenshot of my setup with all the info right before launching the encryption process: https://ibb.co/dttF4SR
Note that in this screenshot there is ~500 GB of free space on my system disk, and there is enough free space on d:\, which will host the encrypted output.
Then I launched the process: I set a password and clicked on the Encrypt button.
Here is a screenshot while the tool is combining: https://ibb.co/qC9mH1F
Here I noticed that the free space on c:\ was quickly vanishing at the rate of the combining speed (~300 MB/s).
Then, when there was no free space left on c:\, your tool stopped as shown here: https://ibb.co/HgjC2by
The message from your tool is "Insufficient disk space".
Thanks for the detailed debugging! I know exactly what is going on now, and it is indeed what you predicted. Good work!
In v1.29 and earlier, I used the target drive as a cache to combine the files into a .zip before encrypting. However, in v1.30, I changed this so that the system temporary folder is used as the cache, since the system disk is almost always the fastest disk, and if anything goes wrong, the temporary folder will eventually be cleared by the OS anyway. This allows much faster and safer combining when the target is a slow storage medium such as an SD card, but it also requires enough free space on the system disk to hold the combined .zip.
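For illustration, here is a minimal sketch of that combining step (simplified, with illustrative names rather than the actual Picocrypt code): the inputs are zipped into a cache file under the system temp directory, which is exactly what fills up c:\ in your case.

```go
package cache

import (
	"archive/zip"
	"io"
	"os"
	"path/filepath"
)

// combineToTempZip zips the given files into a cache file under os.TempDir()
// and returns the path of that cache file. If the inputs are larger than the
// free space on the system disk, the io.Copy below fails with "no space left
// on device".
func combineToTempZip(files []string) (string, error) {
	cache, err := os.CreateTemp("", "picocrypt-cache-*.zip")
	if err != nil {
		return "", err
	}
	defer cache.Close()

	zw := zip.NewWriter(cache)
	for _, name := range files {
		src, err := os.Open(name)
		if err != nil {
			return "", err
		}
		dst, err := zw.Create(filepath.Base(name))
		if err == nil {
			_, err = io.Copy(dst, src)
		}
		src.Close()
		if err != nil {
			return "", err
		}
	}
	return cache.Name(), zw.Close()
}
```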
You are very correct in pointing out the system disk to external disk size ratio. I have a 256 GB SSD in both my laptop and desktop, but my external hard drive is 1 TB. So I can see this being an issue for many people indeed.
The solution to this problem is complicated, because there's no native and reliable way to query free disk space across operating systems. So I can't just look at the system disk usage as you would in a file explorer.
I have a solution in mind, which is basically to use the system disk as cache by default, and if an error occurs, automatically fall back to using the target disk as the cache and do it again. It will be fully automated and you won't need to do anything other than waiting. The only downside of this is that you'll have to wait extra time for the system disk to fill up before Picocrypt detects the issue and switches to the target disk to do the final encryption. But a solution is still better than no solution, right? I will make this change in v1.31, which I expect to release over the upcoming weekend (no guarantees though).
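To make the idea concrete, the fallback could look roughly like this (a sketch only; combineInto stands in for the real "zip everything into this directory" step and is stubbed out here so the snippet is self-contained):

```go
package cache

import (
	"fmt"
	"os"
)

// combineInto is a placeholder for the real combining step (see the earlier
// sketch); it would zip the inputs into a cache file inside cacheDir.
func combineInto(cacheDir string, files []string) (string, error) {
	return "", fmt.Errorf("combining into %s is not implemented in this sketch", cacheDir)
}

// combineWithFallback tries the system temp directory first and, if that
// fails (for example because the system disk fills up), retries with the
// target directory as the cache location.
func combineWithFallback(files []string, targetDir string) (string, error) {
	if path, err := combineInto(os.TempDir(), files); err == nil {
		return path, nil
	}
	return combineInto(targetDir, files)
}
```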
Let me know what you think. Also, you can use v1.29 which uses the target disk as a cache by default, which should allow you to encrypt your 1.33 TB of data.
I guess it would be quite frustrating to know in advance that the app will spend an hour working for nothing (Picocrypt needs ~1 hour to zip 1 TB).
Isn't it possible to check the free space on c:\ and compare it with the size of the files loaded in the app before selecting it as the cache location?
Maybe for 1.32? :-)
That would be ideal, but there isn't a native way in Go to do it. On Windows, I would need to use the Win32 API. On Linux, I would need to use terminal commands like fdisk or something similar (I'm not a Linux expert), and I don't even know how I would get started on macOS. I'll take a deeper look and see if there's a quicker solution, and if I can find one, I'll put it in 1.31. If not, 1.31 will have the hacky but usable solution.
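For reference, if pulling in the golang.org/x/sys module were acceptable (an assumption, not something Picocrypt necessarily does), a free-space check could be sketched with two build-tagged files like this:

```go
// freespace_unix.go
//go:build linux || darwin

package cache

import "golang.org/x/sys/unix"

// FreeBytes returns the free space available on the filesystem containing
// path, using statfs(2).
func FreeBytes(path string) (uint64, error) {
	var st unix.Statfs_t
	if err := unix.Statfs(path, &st); err != nil {
		return 0, err
	}
	return uint64(st.Bavail) * uint64(st.Bsize), nil
}
```

```go
// freespace_windows.go
//go:build windows

package cache

import "golang.org/x/sys/windows"

// FreeBytes returns the free space available to the calling user on the
// volume containing path, via GetDiskFreeSpaceExW.
func FreeBytes(path string) (uint64, error) {
	p, err := windows.UTF16PtrFromString(path)
	if err != nil {
		return 0, err
	}
	var free, total, totalFree uint64
	if err := windows.GetDiskFreeSpaceEx(p, &free, &total, &totalFree); err != nil {
		return 0, err
	}
	return free, nil
}
```

Comparing that number against the total size of the dropped files would then give the pre-flight check you suggested.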
@Ozwel I have a solution! If you are encrypting less than 100 GB, Picocrypt will use the system disk as the cache; if more than 100 GB, it will use the target disk. This gives the best of both worlds: for small and casual files, Picocrypt can use the fast and reliable system disk, which will likely have 100 GB free. For files over 100 GB, I will use the target disk, since if you're encrypting over 100 GB, you are probably storing the output on a large hard drive. I think this is the way to go and will likely be present in v1.31. Let me know if this sounds good.
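As a sketch of that heuristic (names and the exact constant are illustrative):

```go
package cache

import "os"

// Roughly 100 GB, the cutoff described above.
const cacheThreshold = 100 * 1000 * 1000 * 1000

// chooseCacheDir picks the fast system temp directory for small jobs and
// the target directory for anything at or above the threshold.
func chooseCacheDir(totalInputBytes int64, targetDir string) string {
	if totalInputBytes < cacheThreshold {
		return os.TempDir()
	}
	return targetDir
}
```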
Actually, ignore that, I've decided to go back to the way it was done in v1.29, where the target disk is always used. I think people are sensible enough not to encrypt 1 TB of data onto a slow SD card, and using the system disk causes more problems than it solves. For v1.31, I will go back to using the target disk by default and in all cases. So you should be able to use v1.31 to encrypt your large files.
I just released v1.31, which always uses the target disk as the cache, so this issue is solved and I will close it. Please give it a test, and if it doesn't work for some reason, feel free to reopen this issue so we can debug further. Thanks!