Dithering with a grayscale palette has "unexpected" results with color image input
makew0rld opened this issue · 4 comments
"unexpected" is in quotes because the math is sound, but the result is almost never what you want. This is a property of dithering, and is not related to the library code.
The bad output can be very severe with error diffusion dithering, but ordered dithering can have issues too.
The solution is to always make an image grayscale before dithering with a grayscale palette. The library should have a convenience function for this, as well as explain it in the README. Should the library detect grayscale palettes and make the image grayscale automatically when dithering? Probably not; the user should still have the option to decide.
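A minimal sketch of that workaround using only the Go standard library; the helper name `toGray` is just for illustration and is not an existing library function:

```go
package main

import (
	"image"
	"image/draw"
)

// toGray returns a grayscale copy of img. Drawing onto an *image.Gray
// converts each pixel through color.GrayModel, which applies BT.601 luma
// weights to the gamma-encoded channels (not the linear-light BT.709
// weighting discussed later in this thread, but usually close enough).
func toGray(img image.Image) *image.Gray {
	gray := image.NewGray(img.Bounds())
	draw.Draw(gray, gray.Bounds(), img, img.Bounds().Min, draw.Src)
	return gray
}

func main() {
	// Dither the result of toGray(src) instead of src itself.
	src := image.NewRGBA(image.Rect(0, 0, 2, 2))
	_ = toGray(src)
}
```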
Example
Original image:
Original image but made to black and white:
Floyd-Steinberg dithering the color image with an 8-bit sRGB grayscale palette of 0, 156, 213, 255. 100% strength:
And the same, but dithering the black and white image:
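For concreteness, the grayscale palette used in these examples can be written with the standard library's color types. This is just a sketch of the values, not necessarily the form the library expects a palette in:

```go
package main

import (
	"fmt"
	"image/color"
)

func main() {
	// The four 8-bit sRGB gray levels used in the examples above.
	palette := []color.Color{
		color.Gray{Y: 0},
		color.Gray{Y: 156},
		color.Gray{Y: 213},
		color.Gray{Y: 255},
	}
	fmt.Println("palette entries:", len(palette))
}
```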
The README now mentions this; I think that's good enough.
This issue has been fixed in the `weighted` branch, which weights each channel to form a proper grayscale. It always produces the correct output (the second dithered image above), whether or not the image has been made grayscale beforehand. For more information on how this fix came to be, see makew0rld/didder#14.
This issue will be closed once the branch is merged, the README is updated to remove mention of this workaround, and the didder code and docs are updated so they no longer automatically convert the image to grayscale when the palette is grayscale.
I don’t know your particular formulas, but the general order of events I would try for this would be:
1. Determine the sRGB tristimulus encoded values.
2. Decode from encoded state to uniform tristimulus.
3. Calculate the weighted sum for BT.709 / sRGB. Four-decimal approximation: (0.2126 * R) + (0.7152 * G) + (0.0722 * B)
4. Take the uniform luminance to a perceptual-like distribution such as L*.
5. Use the output of 4. as your metric of distance for the dithering.
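Here is a minimal Go sketch of that recipe. The constants are the standard sRGB decode, the BT.709 luminance weights, and the CIE L* formula; the function names and the nearest-gray example in `main` are illustrative only and are not part of the library or the `weighted` branch:

```go
package main

import (
	"fmt"
	"math"
)

// srgbToLinear decodes an 8-bit sRGB-encoded channel to linear light (step 2).
func srgbToLinear(v uint8) float64 {
	c := float64(v) / 255.0
	if c <= 0.04045 {
		return c / 12.92
	}
	return math.Pow((c+0.055)/1.055, 2.4)
}

// luminance is the BT.709 / sRGB weighted sum of the linear channels (step 3).
func luminance(r, g, b uint8) float64 {
	return 0.2126*srgbToLinear(r) + 0.7152*srgbToLinear(g) + 0.0722*srgbToLinear(b)
}

// lstar maps linear luminance Y in [0, 1] to CIE L* in [0, 100],
// a perceptual-like distribution (step 4).
func lstar(y float64) float64 {
	if y <= 216.0/24389.0 {
		return y * 24389.0 / 27.0
	}
	return 116.0*math.Cbrt(y) - 16.0
}

func main() {
	// Step 5: use L* as the distance metric. As an example, pick the
	// nearest gray from the example palette for a pure-red pixel.
	palette := []uint8{0, 156, 213, 255} // 8-bit sRGB gray levels
	target := lstar(luminance(255, 0, 0))

	best, bestDist := palette[0], math.Inf(1)
	for _, p := range palette {
		d := math.Abs(lstar(luminance(p, p, p)) - target)
		if d < bestDist {
			best, bestDist = p, d
		}
	}
	fmt.Printf("nearest gray to pure red in L*: %d\n", best)
}
```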