LoadingByte/cinecred

Adding more Output Colorspaces

Opened this issue · 4 comments

It would be great to add more output colorspaces and whitepoints. For professional usage, at least one cinema colorspace would be important. Here's a wishlist of some colorspaces which I use on a regular basis as a professional:

| Gamut | EOTF (Gamma) |
| --- | --- |
| P3D60 | Gamma 2.6 |
| P3D65 | Gamma 2.6 |
| P3DCI | Gamma 2.6 |
| Rec2020 | PQ |
| ACES (AP0) | Linear |

I guess there are a lot more options and combinations which would make sense. Maybe, to keep the UI clean and easy, there could also be a "professional" option where you can select gamut and EOTF separately.

Also: currently you refer to the EOTF as BT.709, which does not clearly define a transfer function. If you are referring to a value of approx. Gamma 2.4, you should reference it as BT.1886.
The allocation of limited and full range should be linked to the output format, not to the color space, or should at least be changeable.

I always expected that this feature request would pop up at some point :)


First of all, regarding BT.709: as far as I know, while BT.709 doesn't define an EOTF, it does define an OETF. As I see Cinecred as kind of a "camera" that (conceptually) creates a linear RGB image and then applies the BT.709 OETF to it for delivery, I think it's justified to refer to this process as "BT.709 Gamma" (I know it should technically be called "transfer function" or even "OETF", but most less technically versed people only know the term "gamma", I think).

However, I'm by no means a digital imaging expert. This is just knowledge I collected from online resources. As you know your stuff, I'd really like to hear your opinion on my thinking of viewing Cinecred as a "camera".
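For concreteness, the OETF I mean is the piecewise curve defined in Rec. ITU-R BT.709; a minimal sketch (the function name is just for illustration):

```cpp
#include <cmath>

// Rec. ITU-R BT.709 OETF: maps linear scene light L in [0, 1]
// to a non-linear signal value V in [0, 1]. Below the knee at
// L = 0.018 the curve is linear; above it, a 0.45 power law.
float bt709_oetf(float L)
{
    if (L < 0.018f)
        return 4.5f * L;
    return 1.099f * std::pow(L, 0.45f) - 0.099f;
}
```

This is the curve a "camera" conceptually applies to linear light before delivery.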


Now, regarding range: at the moment, limited range is used only for BT.709 YCbCr delivery, and full range is used for everything else (i.e., BT.709 RGB, sRGB, and sYCC delivery). Is this a good default? Do you reckon that there's a use case for the user overriding this default? If yes, it would be very easy to add an option for that.
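To make concrete what the range setting actually changes, here is a minimal sketch of 8-bit luma quantization (the function name is made up for illustration): limited range maps the normalized signal onto code values 16–235, full range onto 0–255:

```cpp
#include <cmath>

// Quantize a normalized luma value y in [0, 1] to an 8-bit code value.
// Limited ("video") range reserves headroom and footroom by mapping
// 0..1 onto 16..235; full range uses the whole 0..255 scale.
int quantize_luma_8bit(float y, bool limited)
{
    float v = limited ? 16.0f + 219.0f * y : 255.0f * y;
    return static_cast<int>(std::lround(v));
}
```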


Finally, regarding additional color space options: from a technical point of view, it would be easy to let the user choose the primaries, transfer characteristic, and even the YCbCr coefficients and chroma subsampling location. In addition, there could be a list of preset "color spaces" that preselect these options for the user, just like "sRGB" and "Rec.709" right now.

The last three options would be unproblematic, but the primaries option introduces a problem: For everything apart from embedded videos (which are converted to the delivery color space directly), Cinecred renders in sRGB internally, which of course uses the BT.709 primaries. So even when the user chooses wider primaries for the delivery, all the colors would still reside inside the BT.709 triangle.

Now, there are three options, and I need your thoughts on them:

  1. We say that this behavior is fine. It would of course need to be documented.
  2. I somehow "blow up" the BT.709 colors to the wider gamut. I'm however not aware whether there is a standardized algorithm for this.
  3. We render in the wider gamut from the start. This not only requires switching out the rendering backend (as the current one is sadly limited to 8-bit sRGB), but also introduces two challenges:
    • With the preview: the standard color space assumed by most operating systems is sRGB, and while it's sometimes configurable, that's a deep rabbit hole to dive into; however, by adding DeckLink support or similar, we could at least partially offer a solution.
    • With colors picked by the user: all software I'm aware of provides only sRGB color pickers, and if the user is restricted to picking only sRGB colors anyway, there's no point in drawing in a wider gamut.
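To make the primaries problem concrete, here is a minimal sketch of converting linear BT.709 RGB to BT.2020 primaries, using the matrix from Rec. ITU-R BT.2087. Even fully saturated BT.709 red lands well inside BT.2020, which illustrates why a plain primaries conversion can never produce colors outside the BT.709 triangle:

```cpp
#include <array>
#include <cmath>

// Linear BT.709 RGB -> linear BT.2020 RGB (matrix per Rec. ITU-R BT.2087).
// Pure BT.709 red maps to roughly (0.627, 0.069, 0.016) in BT.2020,
// i.e., well inside the wider gamut.
std::array<float, 3> bt709_to_bt2020(const std::array<float, 3> &rgb)
{
    static const float m[3][3] = {
        {0.6274f, 0.3293f, 0.0433f},
        {0.0691f, 0.9195f, 0.0114f},
        {0.0164f, 0.0880f, 0.8956f},
    };
    std::array<float, 3> out{};
    for (int i = 0; i < 3; i++)
        out[i] = m[i][0] * rgb[0] + m[i][1] * rgb[1] + m[i][2] * rgb[2];
    return out;
}
```

Option 2 ("blowing up" the colors) would be anything beyond this matrix; option 1 is exactly this matrix applied to everything.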

> I always expected that this feature request would pop up at some point :)

Let me say this first: A colleague of mine found your tool on Reddit and forwarded it to me. I had a quick sneak peek and found your tool very easy to understand and very useful. Some of the things Cinecred does automatically would be a real pain if you tried to code them with After Effects expressions. I'd love to give Cinecred a try in some of our productions. But to do this I'd like to sort out some of the things mentioned in the request.

> First of all, regarding BT.709: as far as I know, while BT.709 doesn't define an EOTF, it does define an OETF. As I see Cinecred as kind of a "camera" that (conceptually) creates a linear RGB image and then applies the BT.709 OETF to it for delivery, I think it's justified to refer to this process as "BT.709 Gamma" (I know it should technically be called "transfer function" or even "OETF", but most less technically versed people only know the term "gamma", I think).
>
> However, I'm by no means a digital imaging expert. This is just knowledge I collected from online resources. As you know your stuff, I'd really like to hear your opinion on my thinking of viewing Cinecred as a "camera".
I'm definitely not an image engineer, but I'm an artist with some technical insight. So let's try to work this out.

Yes, you are right. I should have just called it a transfer function. And I see your point about viewing it as a "camera". There are a few things that need to be taken into consideration.
First: Having the titles act as a "camera" means that you always need coherent color management for the titles. Therefore it is not really possible to just insert the titles "headless" into any mastering workflow.
Second: I am used to rendering my work in the desired output color space in which the master will be delivered. That prevents accidental color shifts in the best way possible. But there are a ton of different workflow options.
Third: Using graphic assets scene-referred is pretty uncommon. Usually they are in a display-referred working space. That might also be the reason why I stumbled over the BT.709 transfer function: I expected the Rec.709 option to be delivered with a Gamma ~2.4 (BT.1886) transfer function.

> Now, regarding range: at the moment, limited range is used only for BT.709 YCbCr delivery, and full range is used for everything else (i.e., BT.709 RGB, sRGB, and sYCC delivery). Is this a good default? Do you reckon that there's a use case for the user overriding this default? If yes, it would be very easy to add an option for that.

To be honest, I am not sure about this. As far as I know, most formats "expect" limited or full range: H.264, for example, expects limited range, while TIFF or DPX expect full range. I have to do some more research on this topic to come up with a reliable suggestion. However, I wouldn't mind if there was an override option, though it might confuse inexperienced users.

> Finally, regarding additional color space options: from a technical point of view, it would be easy to let the user choose the primaries, transfer characteristic, and even the YCbCr coefficients and chroma subsampling location. In addition, there could be a list of preset "color spaces" that preselect these options for the user, just like "sRGB" and "Rec.709" right now.
>
> The last three options would be unproblematic, but the primaries option introduces a problem: For everything apart from embedded videos (which are converted to the delivery color space directly), Cinecred renders in sRGB internally, which of course uses the BT.709 primaries. So even when the user chooses wider primaries for the delivery, all the colors would still reside inside the BT.709 triangle.
>
> Now, there are three options, and I need your thoughts on them:
>
>   1. We say that this behavior is fine. It would of course need to be documented.
>   2. I somehow "blow up" the BT.709 colors to the wider gamut. I'm however not aware whether there is a standardized algorithm for this.
>   3. We render in the wider gamut from the start. This not only requires switching out the rendering backend (as the current one is sadly limited to 8-bit sRGB), but also introduces two challenges:
>     • With the preview: the standard color space assumed by most operating systems is sRGB, and while it's sometimes configurable, that's a deep rabbit hole to dive into; however, by adding DeckLink support or similar, we could at least partially offer a solution.
>     • With colors picked by the user: all software I'm aware of provides only sRGB color pickers, and if the user is restricted to picking only sRGB colors anyway, there's no point in drawing in a wider gamut.

Okay, I see your problem. Since my "programming" skills have their boundaries somewhere around a bash script and two lines of Python, I have no idea how much impact this has, but I would strongly recommend switching the render engine. 8-bit precision is not really up to date anymore. It is still fine for many television deliveries and YouTube web shows, but more and more shows, even for German broadcast stations, have to be delivered in HDR, which requires higher precision and a wider color gamut. And it really isn't a good idea to stretch the color space (let alone the transfer function) with 8-bit precision. As long as we are talking about simple credits (white on black, or color on color), there is still a lot of time left until 8-bit precision becomes obsolete.
For now I would recommend sticking to the primaries and transfer functions which are close to 709 and sRGB, to avoid banding or artifacts produced by any color space conversion. There are always some clumsy production-partner logos with gradients in the titles.

The challenges mentioned in your last point: yes, this is a rabbit hole, and every professional who didn't get lost in it at some point is either an engineer or lying. I guess a common solution for most software providers is the use of display color management: the code values of the picked color are always absolute and refer to the internal working color space, but the appearance of the color changes when the color space is changed. This even happens in After Effects, for example.
A solution could be the (additional) use of OCIO-based ACES color management, but this generates new problems, since it isn't easy to understand, even for experienced users.

I'm very grateful for your input! In the following write-up, I've set the open questions that you could comment on in italics.

(A) Swapping out the renderer

I'm currently working on switching to another renderer that natively supports very high bit depth, a linear transfer characteristic, and arbitrarily wide gamuts. Getting this done will provide the foundation for non-sRGB color space support.

(B) BT.709 vs BT.1886 transfer function

I think I've got a good grasp on the biology, physics, and math behind the digital representation of color. Still, how transfer functions apply to typical imaging workflows still eludes me. To remedy this, I'll present my thoughts on the BT.709/BT.1886 workflow now and would be very grateful if you could tell me whether I'm wrong somewhere. For simplicity, I'll assume we're staying in the BT.709 gamut all the way through, so we can really focus on the transfer characteristics.

  1. Let's start simple. A video camera records an image (with a linear light response, as this is how sensors work), passes it through the BT.709 OETF, and then stores it. The video stream is tagged with the ISO/IEC 23091 transfer characteristic code point 1 to mark the usage of the BT.709 OETF. When a TV later plays back the video, it passes the stored image through the BT.1886 EOTF to get a linear light response image that the TV can display.

    Captured Light
           |
           |  Camera applies BT.709 OETF
           v
      Video File
           |
           |  TV applies BT.1886 EOTF
           v
    Displayed Light
    

    Because the BT.1886 EOTF is not the inverse of the BT.709 OETF, displayed light is actually different from scene light by a gamma of about 1.2.

  2. Now presume that someone actually edits the video in an NLE. A non-color-managed NLE would copy the pixels verbatim without applying any transfer function. A color-managed NLE would most likely apply the inverse BT.709 OETF to get linear scene light, and then apply the BT.709 OETF when exporting. Either way, both the NLE's input and output video files have the BT.709 OETF applied. During editing, the reference monitor applies the BT.1886 EOTF to the BT.709 OETF monitor stream provided by the NLE.

        Captured Light
               |
               |  Camera applies BT.709 OETF
               v
          Video File
               |
               |  non-color-managed: no change
               |      color-managed: NLE applies inverse BT.709 OETF
               v
    Internal Representation
               |
               |  non-color-managed: no change
               |      color-managed: NLE applies BT.709 OETF
               v
    Monitor In & Video File
               |
               |  TV or reference monitor applies BT.1886 EOTF
               v
        Displayed Light
    
  3. Let's bring Cinecred into the picture now. If the NLE or finishing tool is not color-managed, I can now see the benefit of passing the linear light titles through the inverse BT.1886 EOTF on export: there is no longer a gamma difference of 1.2 between the linear light representation in Cinecred and the linear light representation on the TV. So if your monitor is properly calibrated and you view it in TV-like conditions, what you see in Cinecred will be exactly what the audience will see on their TVs.

        Rendered Titles
               |
               |  Cinecred applies inverse BT.1886 EOTF
               v
          Video File
               |
               |  non-color-managed NLE: no change
               v
    Internal Representation
               |
               |  non-color-managed NLE: no change
               v
    Monitor In & Video File
               |
               |  TV or reference monitor applies BT.1886 EOTF
               v
        Displayed Light
    
  4. If the NLE or finishing tool is color-managed, it would still be best for Cinecred to apply the inverse BT.1886 EOTF to the render, but at the same time tag the video stream with the code point 1 to pretend that the BT.709 OETF has been applied. This makes the color-managed NLE apply the inverse BT.709 OETF prior to processing the titles. As you can see in the following graphic, the inner BT.709 and the outer BT.1886 applications cancel out, so we get the same what-you-see-is-what-you-get behavior as in the non-color-managed case.

        Rendered Titles
               |
               |  Cinecred applies inverse BT.1886 EOTF
               v
          Video File
               |
               |  color-managed: NLE applies inverse BT.709 OETF
               v
    Internal Representation
               |
               |  color-managed: NLE applies BT.709 OETF
               v
    Monitor In & Video File
               |
               |  TV or reference monitor applies BT.1886 EOTF
               v
        Displayed Light
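The arithmetic behind cases 1 and 4 can be sketched as follows (transfer functions per BT.709 and, assuming an idealized zero-black-level display, BT.1886): chaining the camera OETF and the display EOTF yields an end-to-end exponent of roughly 1.2, while in case 4 the NLE's decode/encode pair cancels, so the titles' linear light survives the whole chain:

```cpp
#include <cmath>

// Rec. ITU-R BT.709 OETF and its inverse.
float oetf_709(float L)
{
    return L < 0.018f ? 4.5f * L : 1.099f * std::pow(L, 0.45f) - 0.099f;
}
float inv_oetf_709(float v)
{
    return v < 0.081f ? v / 4.5f : std::pow((v + 0.099f) / 1.099f, 1.0f / 0.45f);
}

// Rec. ITU-R BT.1886 EOTF for an idealized display with zero black level
// (where it reduces to a pure gamma of 2.4) and its inverse.
float eotf_1886(float v) { return std::pow(v, 2.4f); }
float inv_eotf_1886(float L) { return std::pow(L, 1.0f / 2.4f); }

// Case 1: camera applies the OETF, TV applies the EOTF. The effective
// end-to-end exponent g with displayed = scene^g comes out near 1.2.
float system_gamma(float scene)
{
    return std::log(eotf_1886(oetf_709(scene))) / std::log(scene);
}

// Case 4: Cinecred encodes with the inverse BT.1886 EOTF, the
// color-managed NLE decodes and re-encodes with BT.709 (these two
// cancel), and the monitor applies BT.1886: linear light comes back.
float displayed_title_light(float title_light)
{
    float file    = inv_eotf_1886(title_light); // Cinecred export
    float working = inv_oetf_709(file);         // NLE decode (tagged BT.709)
    float monitor = oetf_709(working);          // NLE encode
    return eotf_1886(monitor);                  // display
}
```

Note that the effective exponent in case 1 drifts slightly across the tonal range; ~1.2 is the commonly quoted midtone figure.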
    

Now, the question is how to present this to the user. We have to consider two cases:

  1. For the color space preset that also includes gamut, *should we call it Rec.709 Display-Referred? Would there be any merit in still exposing the old behavior as a preset called something like Rec.709 Scene-Referred?*
  2. For the separate gamma and gamut selection, *should we call the gamma option inverse BT.1886 EOTF, just BT.1886, or even BT.709 or BT.1886 (tagged as BT.709)? Or Rec.709 Display-Referred as well? How is this usually done in other titling applications? And once again, should we still support the old behavior, and what should it be called?*

(C) More gamuts and transfer functions

While we're at it, let's expand on how the user selects the delivery color space. I'll definitely keep the list of presets that select a gamma and gamut in unison, but also add a "Custom" option. Upon selection of that, two additional dropdowns appear, which offer a good set of gamuts and transfer functions.

  1. *Which gamuts (with white points) do you think Cinecred should support?*
  2. *And which transfer characteristics, apart from the Rec.709 ones already discussed in section (B)?*
  3. *Finally, which combinations of gamuts and transfer functions would make a good list of presets?*

(D) Full vs limited range

From a bit of reading, I think that you're right: for container deliveries, which are actually always YCbCr in Cinecred, we need limited range, while for image sequences, full range is expected. This is already what Cinecred does.

I will however add an additional dropdown for the range. By default, "Auto" is selected, and a hint like "Only touch this if you know what you're doing" will appear next to it, so casual users hopefully just ignore it.
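The "Auto" behavior described above could resolve roughly like this; all names here are hypothetical and not Cinecred's actual code:

```cpp
enum class Range { Auto, Limited, Full };
enum class DeliveryKind { YCbCrContainer, ImageSequence };

// Hypothetical resolution of the "Auto" range option: YCbCr container
// deliveries default to limited range, image sequences to full range;
// an explicit user choice overrides the default.
Range resolve_range(Range selected, DeliveryKind kind)
{
    if (selected != Range::Auto)
        return selected;
    return kind == DeliveryKind::YCbCrContainer ? Range::Limited : Range::Full;
}
```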

I will also remove the range indicator from the color space preset dropdown.

(E) YCbCr matrix coefficients

These depend on the video standard. I think it'll be easy to choose the right ones depending on the selected gamut and gamma. But like for the range, I'll add a dropdown that allows the user to explicitly specify them if need be.
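For illustration, the non-constant-luminance R'G'B' → Y'CbCr transform is fully determined by the two luma coefficients, which is why deriving the matrix from the selected gamut is straightforward (BT.709: Kr = 0.2126, Kb = 0.0722; BT.2020 NCL: Kr = 0.2627, Kb = 0.0593); a minimal sketch:

```cpp
#include <array>

// Non-constant-luminance R'G'B' -> Y'CbCr with the given luma
// coefficients Kr and Kb. Y' comes out in [0, 1], Cb/Cr in [-0.5, 0.5];
// range scaling and quantization would happen afterwards.
std::array<float, 3> rgb_to_ycbcr(float r, float g, float b, float kr, float kb)
{
    float y  = kr * r + (1.0f - kr - kb) * g + kb * b;
    float cb = (b - y) / (2.0f * (1.0f - kb));
    float cr = (r - y) / (2.0f * (1.0f - kr));
    return {y, cb, cr};
}
```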

(F) Exploiting wider gamuts

Delivering in a wider gamut would of course be useless if all the colors remained in the BT.709 gamut. So let's talk about how the user can generate colors that exceed BT.709.

First of all, when the user embeds pictures or videos that come in a wider gamut, Cinecred simply color-manages them to the delivery gamut, so as many colors as possible are retained.

The more challenging part is how to let the user choose colors outside the BT.709 D65 gamut, mostly because we'd need to accurately reproduce these colors to the user. Thanks for the hints you've given me; I'll read more on this and also look into operating system color management. In the end, I probably somehow need to tell the OS that some parts of the window should not be interpreted as sRGB, but some other color space. At the moment, it looks like this is the only remaining issue for which I don't have a clear path forward yet.

The just-released Cinecred 1.6.0 brings most of the requested features, with the notable exception that on-screen previews are limited to sRGB. Still, arbitrary color spaces can be played out via a DeckLink card. Maybe I'll look into color-managed on-screen previews in the future, but seeing as the OS situation is quite complicated, this isn't very high on my list of priorities.

The second thing that's missing is any form of Gamma 2.6. The only relevant CICP code point is that for SMPTE ST 428, which is not pure Gamma 2.6, but includes a scaling factor:

*(screenshot: CICP specification of ST 428)*

Implementations like the one I'm using also implement this scaling factor:

```cpp
float st_428_eotf(float x) noexcept
{
	return x < 0.0f ? 0.0f : 52.37f / 48.0f * zimg_x_powf(x, 2.6f);
}
```
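If Cinecred were to deliver in this EOTF, it would apply the inverse on encode; a minimal sketch of that inverse, mirroring the zimg function above (here L is luminance normalized to 48 cd/m², so the peak code value x = 1 corresponds to L = 52.37/48):

```cpp
#include <cmath>

// Inverse of the SMPTE ST 428-1 EOTF quoted above: maps normalized
// display-linear luminance L back to a code value x, undoing both the
// 52.37/48 scaling factor and the pure gamma of 2.6.
float st_428_inverse_eotf(float L)
{
    return L <= 0.0f ? 0.0f : std::pow(48.0f / 52.37f * L, 1.0f / 2.6f);
}
```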

Now, is this what you would need? In that case, I'll be happy to add support for delivering in the ST 428 EOTF in the next patch release, 1.6.1.