How to save large TIFF files?
uxhub opened this issue · 21 comments
Hello,
I get a segmentation fault when saving a large TIFF file. The saving process writes the (float) image up to 8.6 GB and crashes, and the (unsigned char) image up to 2.2 GB and crashes. It works fine with smaller files, and I have no problem using the RAW format with large files.
Any idea how to solve my problem? Is it a libtiff issue? (Btw, I use #define cimg_use_tiff and libtiff.so.5.)
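For reference, a minimal sketch of the setup described above (file names are placeholders, not the actual data):

```cpp
// Build roughly as: g++ -O2 test_save.cpp -o test_save -ltiff
#define cimg_display 0     // no display code needed for a pure I/O test
#define cimg_use_tiff      // route load_tiff()/save_tiff() through libtiff
#include "CImg.h"
using namespace cimg_library;

int main() {
  CImg<float> img("volume.tif");   // load the large volume (~3.1 GB on disk)
  img.save_tiff("copy.tif");       // this is where the crash happens
  return 0;
}
```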
Best regards,
Any way you can get a debug log? Maybe by building with '-fsanitize=address', if you compile with g++?
Having the exact line number where the crash occurs would be nice.
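Something along these lines should do it (the source file name is just an example):

```
g++ -g -O1 -fno-omit-frame-pointer -fsanitize=address -Dcimg_use_tiff test_save.cpp -o test_save -ltiff
```

The '-g' flag is what lets the sanitizer report show file names and line numbers.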
When I add the '-fsanitize=address' option, the writing process doesn't start at all, and I get the following warning:
==8147== WARNING: AddressSanitizer failed to allocate 0x0002dc6c0000 bytes
Not sure if it helps.
So, it looks like a memory allocation problem.
How much RAM do you have?
I don't understand why; I have 125 GB of free RAM. (0x0002dc6c0000 bytes ≈ 11.44 GiB.)
The problem might be that you are trying to write more than 4 GB into a TIFF file. With standard TIFF files this is not possible, because the format uses 32-bit addresses. It became possible with BigTIFF, which uses 64-bit addresses and therefore allows larger files. Newer versions of libtiff should support BigTIFF, but I don't know whether the appropriate functions are called from within CImg...
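For context, libtiff 4.x selects the layout when the file is opened: adding '8' to the mode string requests the 64-bit BigTIFF format, while a plain "w" keeps the classic format with its 4 GB limit. A minimal illustration (not the code path CImg uses, just to show the switch; the file name is a placeholder):

```cpp
#include <tiffio.h>

int main() {
  // "w8" asks libtiff (>= 4.0) for BigTIFF (64-bit offsets);
  // plain "w" would produce a classic TIFF limited to 4 GB.
  TIFF *tif = TIFFOpen("huge_volume.tif", "w8");
  if (!tif) return 1;
  // ... set the usual tags and write the strips/scanlines here ...
  TIFFClose(tif);
  return 0;
}
```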
Well yes, we added support for BigTIFF about a month ago or so.
It's still only in the repository (not released yet), but maybe the OP could try it.
Well, I used the latest CImg.h header file uploaded. Should it work?
Uploaded from where? The download page of the main site, or the git repo?
Try the one in the git repo.
I use the latest one from the git repo.
Ah! So it should work with big files. I tested with a 4 GB file a few weeks ago and it was working as expected. Are you using Windows or Linux (or something else)?
I use 64-bit Ubuntu 14.04.
What I find strange is that the file I read is 3.1 GB, and when saving it as TIFF it writes up to 8.6 GB as a float image and up to 2.2 GB as a char image before it crashes (why this difference?).
The difference is easily explained: a float-valued image requires writing a float (4 bytes) for each value, while an unsigned char image requires only 1 byte per value.
What was the value type used to encode your original file? It could be unsigned short (2 bytes per value) too.
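To put numbers on it: the uncompressed payload is roughly width x height x depth x spectrum x sizeof(T), and the ratio between the 8.6 GB reached with float and the 2.2 GB reached with unsigned char is about 4:1, exactly the ratio of the value sizes. A throwaway check (the value count is back-computed from the 8.6 GB figure, purely for illustration):

```cpp
#include <cstdio>

int main() {
  // Same number of values, different bytes per value.
  const unsigned long long n = 2150000000ULL;  // ~2.15e9 values, illustrative
  std::printf("unsigned char : %6.2f GB\n", n * (double)sizeof(unsigned char)  / 1e9);
  std::printf("unsigned short: %6.2f GB\n", n * (double)sizeof(unsigned short) / 1e9);
  std::printf("float         : %6.2f GB\n", n * (double)sizeof(float)          / 1e9);
  return 0;
}
```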
Yes, OK, but it doesn't stop writing at one fixed file size; that's what I find strange. (By the way, it can write 8.6 GB, so I suppose BigTIFF is enabled.)
Any idea?
I don't have any idea right now, but I'll be at the lab tomorrow and I'll test writing an 8 GB image file.
I'll let you know.
Could you provide a minimal non-working example, or the dimensions of the image (w x h x d x s) and the type of compression (none, lzw, jpeg)? That would help us better diagnose the problem.
I tried this, without much success in reproducing the bug: http://pastebin.com/pd8CzqL7
The only crashes I get are at image creation, when the image becomes too big.
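The pastebin content is not reproduced here, but a stress test of this kind might look roughly as follows (a sketch, not the exact linked code; sizes and output names are placeholders):

```cpp
#define cimg_display 0
#define cimg_use_tiff
#include "CImg.h"
#include <cstdio>
using namespace cimg_library;

int main() {
  // Grow the volume and try to save it as TIFF at each step,
  // to see at which size the failure appears.
  for (unsigned int s = 512; s <= 2048; s *= 2) {
    try {
      CImg<float> img(s, s, s, 1, 0);   // 4 bytes per value
      std::printf("%ux%ux%u 4B mem:%.2fGB none", s, s, s,
                  (double)img.size()*sizeof(float)/(1024.0*1024.0*1024.0));
      std::fflush(stdout);
      img.save_tiff("test_float.tif");  // "none" = no compression requested
      std::printf(": ok\n");
    } catch (CImgException &e) {
      std::printf("allocation failed: %s\n", e.what());
    }
  }
  return 0;
}
```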
I tried your code, with success until the allocation error (like you).
However, I slightly modified it so that the depth increases too: http://pastebin.com/4tGLZ96N
-> I get the same segfault behaviour even though I have a lot of free RAM:
512x512x512 4B mem:0.50GB none: ok
512x512x512 1B mem:0.12GB none: ok
1024x1024x1024 4B mem:4.00GB none: ok
1024x1024x1024 1B mem:1.00GB none: ok
2048x2048x2048 4B mem:32.00GB none: Segmentation fault (core dumped)
With '-fsanitize=address':
512x512x512 4B mem:0.50GB none: ok
512x512x512 1B mem:0.12GB none: ok
1024x1024x1024 4B mem:4.00GB none: ok
1024x1024x1024 1B mem:1.00GB none: ok
==2963== WARNING: AddressSanitizer failed to allocate 0x000800000000 bytes
2048x2048x2048 4B mem:32.00GB none: ok
==2963== WARNING: AddressSanitizer failed to allocate 0x000200000000 bytes
2048x2048x2048 1B mem:8.00GB none: ok
[...]
-> fsanitize indicates it needs to allocate another 0x000800000000 bytes = 32 GiB while saving the 2048^3 image file? Why?
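For what it's worth, 0x000800000000 is 2^35 bytes (32 GiB) and 0x000200000000 is 2^33 bytes (8 GiB): each failed request is exactly the size of one full 2048^3 volume (float and unsigned char respectively), so the saver appears to be asking for a second, image-sized buffer on top of the image already in memory. A quick check of the arithmetic:

```cpp
#include <cstdio>

int main() {
  const double GiB = 1024.0*1024.0*1024.0;
  const unsigned long long asan_float = 0x000800000000ULL;  // failed request, float run
  const unsigned long long asan_uchar = 0x000200000000ULL;  // failed request, uchar run
  const unsigned long long img_float  = 2048ULL*2048*2048*sizeof(float);
  const unsigned long long img_uchar  = 2048ULL*2048*2048*sizeof(unsigned char);
  std::printf("float run: asan %.2f GiB vs image %.2f GiB\n", asan_float/GiB, img_float/GiB);
  std::printf("uchar run: asan %.2f GiB vs image %.2f GiB\n", asan_uchar/GiB, img_uchar/GiB);
  return 0;
}
```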
OK, I think I've found the bug (with the help of Jérome).
Commit 0f1efdc should fix it. Could you try it yourself and tell us whether it works as expected?
Thanks again for the bug report.
We are working on an additional patch to reduce the memory footprint of function save_tiff()
as well.
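Not the actual patch, but to illustrate the general idea of keeping the footprint low: libtiff lets you emit the image scanline by scanline, so the temporary buffer never needs to hold more than one row. A stand-alone sketch (this is not CImg's save_tiff() code; the function and its parameters are made up for illustration):

```cpp
#include <tiffio.h>
#include <algorithm>
#include <vector>

// Write a single-channel float slice row by row; the only temporary buffer
// is one scanline instead of a full copy of the image.
bool write_float_slice(const char *filename, const float *data,
                       unsigned int width, unsigned int height) {
  TIFF *tif = TIFFOpen(filename, "w8");               // "8": BigTIFF, as above
  if (!tif) return false;
  TIFFSetField(tif, TIFFTAG_IMAGEWIDTH, width);
  TIFFSetField(tif, TIFFTAG_IMAGELENGTH, height);
  TIFFSetField(tif, TIFFTAG_SAMPLESPERPIXEL, 1);
  TIFFSetField(tif, TIFFTAG_BITSPERSAMPLE, 32);
  TIFFSetField(tif, TIFFTAG_SAMPLEFORMAT, SAMPLEFORMAT_IEEEFP);
  TIFFSetField(tif, TIFFTAG_PHOTOMETRIC, PHOTOMETRIC_MINISBLACK);
  TIFFSetField(tif, TIFFTAG_PLANARCONFIG, PLANARCONFIG_CONTIG);
  std::vector<float> row(width);                      // one-scanline scratch buffer
  for (unsigned int y = 0; y < height; ++y) {
    std::copy(data + (size_t)y*width, data + (size_t)(y + 1)*width, row.begin());
    if (TIFFWriteScanline(tif, row.data(), y, 0) < 0) { TIFFClose(tif); return false; }
  }
  TIFFClose(tif);
  return true;
}
```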
OK, it's working like a charm now. Good job! Thank you for your responsiveness.
And congratulations on the amazing work done in CImg!
Best regards,
You're welcome. Thanks for the kind words (and the bug report :) ).
Well done David!