phoboslab/qoi

Consider adding an sRGB flag to the header

richgel999 opened this issue · 10 comments

There's no way to tell if the file contains pixels/texels in the sRGB colorspace, or not. It's a single bit, and in some applications this is very valuable.

Or the format could enforce a colorspace, for simplicity's sake

How you use QOI in your App/Game is ultimately your decision. However, we could hint at the exact format that you should use to maximize interoperability.

I like this approach:

The RGB-data should be sRGB for best interoperability and not alpha-premultiplied.
~ http://tools.suckless.org/farbfeld/

I saw this reply in another (closed) bug:

As an interchange format (which QOI is not)

You have a chance to create a new lossless file format which will catch on, which is extremely rare. Once it's added to stb_image.h/stb_image_write.h it'll be automatically supported by a number of apps.

I think QOI is very useful for interchange - just keep it as simple as possible (roughly as simple as it is now). Some sort of sRGB or format hint would be super valuable to some classes of users/applications. The format should also indicate if there's any alpha data or not, which is important for loading speed (otherwise we have to scan the entire loaded image to see if there's alpha).
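To sketch the cost of not having an alpha-presence flag: without one, a loader that wants to take an opaque-only fast path has to scan every pixel first. A hypothetical helper (not part of QOI) showing that O(n) scan:

```c
#include <stddef.h>
#include <stdint.h>

/* Returns 1 if any pixel in an RGBA8 buffer has alpha != 255.
 * Without a header flag, this full scan is the only way to know
 * whether the image actually uses its alpha channel. */
static int image_has_alpha(const uint8_t *rgba, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; i++) {
        if (rgba[i * 4 + 3] != 255) {
            return 1;
        }
    }
    return 0;
}
```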

Also it would be ideal if QOI's license was public domain, like stb_image.h. The value is in the idea, not the code (it's only ~300 lines).

Or the format could enforce a colorspace, for simplicity's sake

Users will want to store normal maps (with or without an alpha channel) into the format, for example. So it shouldn't enforce a particular colorspace. The highest value add is "this image is sRGB" or not.

Sorry, can you please clarify why this sRGB bitflag would be needed? When I load a normal map, wouldn't I know it's a normal map and it should not be treated as sRGB?

Does (texture) compression handle sRGB data differently from linear RGB?

How would my open source, general-purpose image compression/processing tool know it's a normal map (for example)? There is no flag or colorspace information in the file, so the user would have to tell the tool some other way. Whether the input pixels/texels are sRGB controls which colorspace error metrics are used during lossy compression, so the file really needs this metadata. PNG has it with its "gAMA" and "sRGB" chunks:
http://www.libpng.org/pub/png/spec/1.2/PNG-Chunks.html

Another example: KTX2 encodes the colorspace of the image/texture (I could send a link but the spec is huge).

When we load a QOI file into a 3D renderer that uses texture mapping+filtering, we need to know if it's sRGB or not to enable sRGB texture filtering in the hardware:
https://medium.com/@tomforsyth/the-srgb-learning-curve-773b7f68cf7a

It's obviously not fatal if QOI doesn't tell the loader which colorspace the data is in. "Linear vs. sRGB" is the most useful distinction in my experience. Without this information, I wouldn't be able to convert from PNG to QOI without losing key information.

I get it - you want to keep QOI simple. But there is such a thing as too simple. A '90s image file format (PNG) supported this concept, and that format is widely popular, so it's worth considering in some form.

PNG sRGB chunk:
http://www.libpng.org/pub/png/spec/1.2/PNG-Chunks.html#C.sRGB

When my app sees a file that is in the sRGB colorspace it knows the data is not (for example) a normal map, and can treat it properly.

I propose a single bit indicating whether the RGB data is sRGB, or alternatively four bits per channel. A single byte holding 0 or 1 would also be fine.
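As a sketch, the single-byte variant could look like this (struct and field names are hypothetical; the flag values below mirror what QOI eventually adopted in #37, where 0 means sRGB with linear alpha and 1 means all channels linear):

```c
#include <stdint.h>

#define COLORSPACE_SRGB   0  /* RGB channels sRGB-encoded, alpha linear */
#define COLORSPACE_LINEAR 1  /* all channels linear */

/* Hypothetical image header with a one-byte colorspace field. */
typedef struct {
    char     magic[4];
    uint32_t width;
    uint32_t height;
    uint8_t  channels;    /* 3 = RGB, 4 = RGBA */
    uint8_t  colorspace;  /* COLORSPACE_SRGB or COLORSPACE_LINEAR */
} image_header;
```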

Does (texture) compression handle sRGB data differently from linear RGB?

Yes - very differently. If it's sRGB we can use perceptual colorspace metrics, resulting in reduced artifacts and increased compression for the same perceptual error. Many texture compressors (etc2comp, astcenc, crunch, basisu, etc.) support this.

This is now implemented. See #37 for more info.