
Conversation

@kasper93 (Member) commented Oct 11, 2025

The sRGB EOTF is a pure gamma 2.2 function. There is some disagreement regarding the sRGB specification and whether it should be treated as a piecewise function. Many displays are actually gamma 2.2, and content mastered for PC is typically affected by that. Therefore, linearize it as such to avoid raised blacks.

See:
IEC 61966-2-1:1999
https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024
KhronosGroup/DataFormat#19
https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/12
https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm
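For reference, the divergence between the two candidate EOTFs is easy to demonstrate. A minimal Python sketch (the function names are illustrative, not mpv code):

```python
def eotf_gamma22(v: float) -> float:
    """Pure power-law EOTF: sRGB-encoded value -> relative linear light."""
    return v ** 2.2

def eotf_srgb_piecewise(v: float) -> float:
    """Piecewise sRGB transfer function (IEC 61966-2-1)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Near black the piecewise curve yields noticeably more light, which is
# where the "raised blacks" complaint comes from.
for code in (0.01, 0.05, 0.10, 0.50):
    print(f"{code:.2f}: gamma2.2={eotf_gamma22(code):.6f}  "
          f"piecewise={eotf_srgb_piecewise(code):.6f}")
```

At code value 0.01 the piecewise decode produces roughly an order of magnitude more light than the pure power decode; the curves converge toward mid-gray and meet at 1.0.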


@Headcrabed (Contributor):

@kasper93 What about enabling this by default on ACM-disabled devices?

Additional reading material that could be added to the documentation: https://projects.blender.org/blender/blender/issues/145022

@na-na-hi (Contributor):

The sRGB EOTF is a pure gamma 2.2 function. There is some disagreement regarding the sRGB specification and whether it should be treated as a piecewise function.

I am not seeing any disagreement here. The sRGB specification is very clear about this:

  • The sRGB reference display has an EOTF of a pure gamma 2.2 function. This is for the reference display ONLY - it is intended to be used by the sRGB reference display to map the sRGB encoded values to light intensity.
  • The piecewise function is used for everything else: converting between sRGB and other color representations like XYZ, which requires the function to be invertible.

The above indicates that the "gamma 2.2" treatment should only be explicitly done during the final output phase when displaying sRGB content on an HDR display (to keep the light intensity the same as on an sRGB display). For an sRGB display that accepts sRGB-encoded values, no conversion should be done. The source should also not be treated as encoded in gamma 2.2 in any situation.
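The invertibility point can be illustrated with the piecewise pair used for conversions to other representations (per-channel part only; the 3x3 matrix to XYZ is omitted, and the helper names are illustrative):

```python
def srgb_decode(v: float) -> float:
    """Piecewise decode, sRGB-encoded -> linear (used for sRGB <-> XYZ)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(L: float) -> float:
    """Exact inverse of srgb_decode, linear -> sRGB-encoded."""
    return 12.92 * L if L <= 0.0031308 else 1.055 * L ** (1 / 2.4) - 0.055

# The pair round-trips exactly, which a pure display EOTF is never
# required to do.
for v in (0.0, 0.02, 0.5, 1.0):
    assert abs(srgb_encode(srgb_decode(v)) - v) < 1e-9
```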

@Headcrabed (Contributor):

The above indicates that the "gamma 2.2" treatment should only be explicitly done during the final output phase when displaying sRGB content on an HDR display (to keep the light intensity the same as on an sRGB display). For an sRGB display that accepts sRGB-encoded values, no conversion should be done. The source should also not be treated as encoded in gamma 2.2 in any situation.

@na-na-hi Similar discussions happened on every link provided above.... Please read some of them.

@kasper93 (Member, Author) commented Oct 11, 2025

@na-na-hi: I provided 4 links with discussions that should explain to you why this option exists and why it is useful when linearizing sRGB to encode in PQ. (also this is a draft for a reason...)

@na-na-hi (Contributor) commented Oct 11, 2025

@na-na-hi Similar discussions happened on every link provided above.... Please read some of them.

I read through these links, and everyone there is confused by the ambiguous use of the terms "EOTF" and "OETF". This is the main reason why they cannot comprehend that both of my points can be correct at the same time, and have to declare one of them wrong to resolve the apparent conflict.

Even Jack Holm, who is mostly correct on this issue, is confused: In his email he said "The sRGB standard does not specify an OETF", while at the same time saying that sRGB->XYZ is specified using a two part EOTF. However, both sRGB->XYZ and XYZ->sRGB conversion formulas are defined, which means that if the conversion specifies EOTF, it must also specify OETF, which conflicts with his other statement.

By applying the strict standard regarding these terms, referring only to conversions between real, physical light and digital encodings, it can be concluded that the sRGB standard indeed does not specify an OETF. It only specifies a gamma 2.2 EOTF. The sRGB->XYZ and XYZ->sRGB conversion is only between digital values and has no bearing on OETF/EOTF in a strict sense.

He is also wrong about gamma 2.2 being an "approximation": nowhere does the standard say it is an approximation, and the gamma 2.2 formula is presented in an unambiguous way. In fact, it refers to Annex A, which directly rejects the usage of the ambiguous "gamma" term, suggesting the intention of an accurate power-law representation. sobotka also argued why this is the case (although also with confusion about EOTF/OETF terminology). This is easier to understand considering that the sRGB->XYZ conversion is NOT an EOTF. Its purpose is to convert sRGB into other colorspace specifications, and that colorspace is converted for display using the appropriate display EOTF.

@na-na-hi: I provided 4 links with discussions that should explain to you why this option exists and why it is useful when linearizing sRGB to encode in PQ

But I did acknowledge the case for mapping sRGB for a target PQ display (and I think this is the correct thing to do, since mpv takes the role of sRGB reference display (exact power 2.2 EOTF) here to map sRGB to light intensity). What is the problem here?

@kasper93 (Member, Author) commented Oct 11, 2025

But I did acknowledge the case for mapping sRGB for a target PQ display (and I think this is the correct thing to do, since mpv takes the role of sRGB reference display (exact power 2.2 EOTF) here to map sRGB to light intensity). What is the problem here?

Sorry, it's probably me misunderstanding.

The source should also not be treated as encoded in gamma 2.2 in any situation.

I think this is what prompted my reply. There is still a bit of an engineering issue on our color management side, because depending on the input and output transfer it needs to decide different things. But similarly to bt.1886, where we use its inverse to linearize the source, I think we can use gamma 2.2 to linearize sRGB. This way we are working in display-space linear light. Now, the engineering part I mentioned is inferring which transfer function to use to delinearize. For example, with target-trc=srgb we would delinearize with gamma 2.2 to preserve the gamma. With target-trc=pq we would use the display-referred linear light directly to convert to luminance values. Targeting linear should also be fine, because it should be encoded for the display response by the compositor, so our "environment encoding" can stay in it.

Does this make sense? If something is not clear, let me know.

EDIT: Of course, we could also do sRGB decoding and convert to display light later, but I think it would effectively be the same for the processing we do in linear light. (And hopefully we only linearize/delinearize once.)
EDIT2: I will also remove this option, because it is in fact not needed.
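The branching described above can be sketched in a few lines of Python. The helper names and the 203 nit reference white are my assumptions for illustration, not mpv's actual code:

```python
# PQ (SMPTE ST 2084) inverse EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance (cd/m^2) -> PQ signal value."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def srgb_to_target(v: float, target_trc: str, ref_white: float = 203.0) -> float:
    linear = v ** 2.2  # linearize sRGB input as display-referred gamma 2.2
    if target_trc == "srgb":
        return linear ** (1 / 2.2)  # re-encode with 2.2: gamma is preserved
    if target_trc == "pq":
        # display-referred linear light, scaled to nits, then PQ-encoded
        return pq_encode(linear * ref_white)
    raise ValueError(target_trc)
```

With `target_trc="srgb"` the function is an exact round trip, which is the "preserve gamma" case.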

@na-na-hi (Contributor):

But similarly to bt.1886, where we use its inverse to linearize the source, I think we can use gamma 2.2 to linearize sRGB. This way we are working in display-space linear light.

This is the incorrect thing to do according to sRGB standard, because you are linearizing to another colorspace for further processing on the linear light intensity values (scaling, shader, etc) and not to a reference display, so it needs to use the piecewise function, not the gamma 2.2 function.

Of course, we could also do sRGB decoding and convert to display light later, but I think it would effectively be the same for the processing we do in linear light.

It is not the same. In general f(g(x)) is different from g(f(x)). Processing should be done in absolute intensity, not display intensity.

@llyyr (Contributor) commented Oct 12, 2025

This is the incorrect thing to do according to sRGB standard, because you are linearizing to another colorspace for further processing on the linear light intensity values (scaling, shader, etc) and not to a reference display, so it needs to use the piecewise function, not the gamma 2.2 function.

Could you point out what section of the spec supports this interpretation? If I understand you correctly, mpv linearizing with power2.2 function as the EOTF would be correct if it were to output sRGB/gamma2.2*, but it would not be correct to do so if mpv were to output some other colorspace, say bt.1886?

*I say sRGB/gamma2.2 but they have the same EOTF function so they should be the same.

@kasper93 (Member, Author) commented Oct 12, 2025

But similarly to bt.1886, where we use its inverse to linearize the source, I think we can use gamma 2.2 to linearize sRGB. This way we are working in display-space linear light.

This is the incorrect thing to do according to sRGB standard, because you are linearizing to another colorspace for further processing on the linear light intensity values (scaling, shader, etc) and not to a reference display, so it needs to use the piecewise function, not the gamma 2.2 function.

Of course, we could also do sRGB decoding and convert to display light later, but I think it would effectively be the same for the processing we do in linear light.

It is not the same. In general f(g(x)) is different from g(f(x)). Processing should be done in absolute intensity, not display intensity.

Well of course, the whole standard is about this.

And yet, this is what is required to preserve the perceptual look of the image as it is intended to be viewed. Similarly, we don't linearize bt.709 with the inverse of the camera OETF; we use bt.1886 instead.

Internally, our rendering pipeline works in display intensities (not encoded ones), HDR-like, with BPC already applied.

This is the incorrect thing to do according to sRGB standard, because you are linearizing to another colorspace for further processing on the linear light intensity values (scaling, shader, etc) and not to a reference display, so it needs to use the piecewise function, not the gamma 2.2 function.

Could you point out what section of the spec supports this interpretation? If I understand you correctly, mpv linearizing with power2.2 function as the EOTF would be correct if it were to output sRGB/gamma2.2*, but it would not be correct to do so if mpv were to output some other colorspace, say bt.1886?

*I say sRGB/gamma2.2 but they have the same EOTF function so they should be the same.

This is what is defined by the sRGB standard, and the "mismatch" is discussed in the introduction:
[image: excerpt from the sRGB standard's introduction]

@llyyr (Contributor) commented Oct 12, 2025

This is what is defined by sRGB standard and the "mismatch" is discussed in introduction

This doesn't explain why it would be incorrect to use power2.2 EOTF if mpv were to output bt.1886 or PQ. (I'm still waiting for nanahi to say that this is what they meant)

From my understanding, such an operation would be as follows:

srgb video -> mpv linearizes using power2.2 -> work in linear light -> mpv delinearizes to pq or bt.1886 or whatever.

From my understanding, nanahi is saying that mpv linearizes using power2.2 step is wrong if mpv would in the end delinearize to any colorspace except sRGB.

@kasper93 (Member, Author):

(I'm still waiting for nanahi to say that this is what they meant)

Ok, sorry. I won't interrupt you anymore.

@kasper93 changed the title from "vo_gpu_next: add --linearize-srgb-as-power22" to "vo_gpu_next: linearize sRGB as a pure 2.2 power function" on Oct 12, 2025
@kasper93 marked this pull request as ready for review on October 12, 2025 at 11:28
@sobotka commented Oct 12, 2025

It is not the same. In general f(g(x)) is different from g(f(x)). Processing should be done in absolute intensity, not display intensity.

As folks are grasping the nuance here, it becomes a question as to what the “correct” composition operation would be.

We can see that if we consider the more obvious case of BT.709 to BT.1886, it is the operator, aka the display transfer characteristic and colourimetry, that determines the relative wattages of the presented stimuli. It would appear to be unsound advice to suggest that compositing the presented stimuli of window A over window B should be decoded to relative wattages using the inverse of the BT.709 encoding characteristic; if the goal were to emulate a composition as though the planes of the windows were physicalist / materialist panels, it is strictly the presented stimuli relative wattages that matter here.

Given that the operator drives the meaning of the data encoding, we can see some sense for considering the PQ or HLG EOTF as driving the compositing of the presentation mechanisms. If we remove the composition out of the equation, the normative presentation of an sRGB two part encoding is a vanilla 2.2 EOTF, and to draw an equivalent presentation, the relative wattages should align. Therefore the presenting stimuli state would logically be the appropriate encoding for the output data state operation; placing the sRGB encoding into the PQ / HLG encoded state as though it were being emitted from a 2.2 EOTF presenting medium.

It follows that if we agree the presentation state is the driving function, then as with the BT.1886 example, the normative 2.2 operator as decoding for presented relative wattages should be employed for composition.

@na-na-hi (Contributor) commented Oct 12, 2025

This doesn't explain why it would be incorrect to use power2.2 EOTF if mpv were to output bt.1886 or PQ. (I'm still waiting for nanahi to say that this is what they meant)

From my understanding, such an operation would be as follows:

srgb video -> mpv linearizes using power2.2 -> work in linear light -> mpv delinearizes to pq or bt.1886 or whatever.

From my understanding, nanahi is saying that mpv linearizes using power2.2 step is wrong if mpv would in the end delinearize to any colorspace except sRGB.

PQ specifies the displayed luminance. It is different from the absolute luminance level converted from sRGB using the piecewise function.

If mpv is outputting to PQ, the correct pipeline should look like this:

sRGB video -> mpv linearizes using sRGB piecewise function -> work in absolute linear light -> processed absolute linear light -> mpv delinearizes using sRGB piecewise function -> processed sRGB video -> mpv linearizes to display light using power2.2 -> mpv delinearizes to pq

Note that the "processed absolute linear light -> mpv delinearizes using sRGB piecewise function -> processed sRGB video -> mpv linearizes to display light using power2.2" step can be a single "absolute linear light -> display linear light" step if we precompute the relationship.

So what this PR should do is the following: when source is sRGB and target display does not accept sRGB encoded values, add the "absolute linear light -> display linear light" step before converting from linear colorspace to output colorspace.
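The precomputed combined step described above could be sketched like this (illustrative names, per-channel only; not mpv code):

```python
def srgb_inverse_piecewise(L: float) -> float:
    """Linear light -> sRGB-encoded value (inverse of the piecewise function)."""
    return 12.92 * L if L <= 0.0031308 else 1.055 * L ** (1 / 2.4) - 0.055

def absolute_to_display_linear(L_abs: float) -> float:
    """Combined middle steps of the pipeline above: re-encode processed
    absolute linear light with the piecewise inverse, then decode it the
    way the reference display would (pure gamma 2.2)."""
    return srgb_inverse_piecewise(L_abs) ** 2.2
```

Near black the combined map lowers the values (the 2.2 decode is darker than the piecewise one), which is exactly the deep-blacks behavior the gamma 2.2 reference display produces.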

@Headcrabed (Contributor):

@kasper93 By setting it to "Both" by default, you are breaking ACM support....

@kasper93 (Member, Author) commented Oct 12, 2025

@kasper93 By setting it to "Both" by default, you are breaking ACM support....

By not setting both by default, we are breaking Wayland support. I'm aware of that, and frankly you should be using PQ output from mpv if your display is in HDR mode.

@sobotka commented Oct 12, 2025

sRGB video -> mpv linearizes using sRGB piecewise function -> work in absolute linear light -> processed absolute linear light -> mpv delinearizes using sRGB piecewise function -> processed sRGB video -> mpv linearizes to display light using power2.2 -> mpv delinearizes to pq

It is not logical to suggest that the direct to PQ relative stimuli wattages should be power 2.2, and that the compositing of two “layers” be calculated differently, using the two part decoding.

If we consider two buffers A and B, which are “unoccluded” and presented under the 2.2 power function simultaneously on the display medium, the relative “linear” wattages follow from the 2.2 power function for both regions.

If we attempt to emulate moving A over B, and occluding B by some degree of A for an "overlapping window" emulation, it is logical that the as-presented relative wattages derived from the 2.2 power function should be employed. The measured energy at the faceplate would indicate the power 2.2 output, not the two-part input.

@Headcrabed (Contributor) commented Oct 12, 2025

@kasper93 By setting it to "Both" by default, you are breaking ACM support....

By not setting both by default, we are breaking Wayland support. I'm aware of that, and frankly you should be using PQ output from mpv if your display is in HDR mode.

No, it won't break. haasn/libplacebo@4d4938d would make it choose a linear color space on Linux and macOS. Only Windows ACM uses VK_COLOR_SPACE_SRGB_NONLINEAR_KHR.

@Headcrabed (Contributor):

Also, it seems that on some devices, SDR 10-bit output only works when ACM is enabled... And PQ won't help here.

@llyyr (Contributor) commented Oct 12, 2025

haasn/libplacebo@4d4938d would make it choose linear color space on Linux and macOS

FWIW linear output is wrong unless you set --target-contrast=inf... except the option is also buggy on mpv and doesn't have the same result as in plplay (if you set min_luma to the 1e-6 minimum). I will look into this sometime soon when I have the time

@Headcrabed (Contributor) commented Oct 12, 2025

haasn/libplacebo@4d4938d would make it choose linear color space on Linux and macOS

FWIW linear output is wrong unless you set --target-contrast=inf... except the option is also buggy on mpv and doesn't have the same result as in plplay (if you set min_luma to the 1e-6 minimum). I will look into this sometime soon when I have the time

But we don't have any better choices... Linear, sRGB, and linear scRGB are the only options on some Wayland compositors, and some of them just implement sRGB piecewise as 2.2.

@mahkoh (Contributor) commented Oct 12, 2025

RE the rest of the discussion here:

One of the goals of the wayland color management protocol (possibly the goal) is to have the same content look the same regardless of the color space conversions performed, as long as the final and all intermediate color spaces can correctly represent the content. For example, consider the following two pipelines:

Pipeline 1:

  1. mpv plays sRGB video
  2. mpv sends the video to the compositor using sRGB color description
  3. compositor converts to PQ
  4. compositor displays on HDR10 display

Pipeline 2:

  1. mpv plays sRGB video
  2. mpv converts to PQ
  3. mpv sends the video to the compositor using PQ color description
  4. compositor displays on HDR10 display

To achieve the goal, mpv and the compositor must agree how to convert sRGB to PQ. If compositors always linearize sRGB content using gamma22, mpv has to do the same if it wants to achieve the same goal. Whether or not that is defined in any standard other than wayland is irrelevant.
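The consequence of disagreement is easy to show numerically. A hedged sketch (the helper names and the 203 nit SDR white level are assumptions for illustration):

```python
def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: absolute luminance -> PQ signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def to_pq(v: float, linearize, white: float = 203.0) -> float:
    return pq_encode(linearize(v) * white)

gamma22 = lambda v: v ** 2.2
piecewise = lambda v: v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# The same dark sRGB code value lands at different PQ levels depending on
# which linearization the converting party picked; pipelines 1 and 2 only
# match if mpv and the compositor pick the same one.
v = 0.05
print(to_pq(v, gamma22), to_pq(v, piecewise))
```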

@Headcrabed (Contributor):

To achieve the goal, mpv and the compositor must agree how to convert sRGB to PQ. If compositors always linearize sRGB content using gamma22, mpv has to do the same if it wants to achieve the same goal. Whether or not that is defined in any standard other than wayland is irrelevant.

And that's why I made this commit. haasn/libplacebo@4d4938d

@kasper93 (Member, Author) commented Oct 12, 2025

To achieve the goal, mpv and the compositor must agree how to convert sRGB to PQ. If compositors always linearize sRGB content using gamma22, mpv has to do the same if it wants to achieve the same goal. Whether or not that is defined in any standard other than wayland is irrelevant.

Yes, that's clear. Currently mpv agrees with what Windows does. This PR will allow matching Wayland's behavior. I like platform-dependent as the default, which will work on both platforms and frankly avoids the headache of deciding what the default should be. Smarter people have already decided.

And that's why I made this commit. haasn/libplacebo@4d4938d

Linear output is currently not supported on Windows. We don't implement scRGB, which would be possible to use on Windows. Also, PQ output from mpv should be compatible with whatever compositors are doing, unless the compositors are wrong, but that's why this option can be adjusted by users to decide for themselves.

@kasper93 force-pushed the srgb_srgb branch 2 times, most recently from 4b5ab5c to 61eaaf5 on October 13, 2025 at 18:39
@kasper93 (Member, Author) commented Oct 13, 2025

Updated PR. This is the version that works mostly correctly in each case.

For everyone's information regarding the sRGB EOTF: I think we can acknowledge that many displays actually implement the piecewise sRGB EOTF, especially in an "sRGB mode" if such a mode is provided. Of course, that mode may only mean the primaries, but from what I have seen, monitors seem to implement the sRGB piecewise function. Now here is the issue: sending an sRGB-encoded signal to a gamma 2.2 display looks fine; after all, this is what the spec prescribes. However, sending gamma 2.2 to an sRGB (piecewise) display looks quite bad; blacks are boosted to the point that it's unwatchable. You can argue that such a display is not spec compliant, but reality is different, as we know. Just letting you know that being technically correct here may not be as pragmatic as you think.
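The boosted-blacks effect described above can be reproduced in a few lines (illustrative sketch, not mpv code):

```python
def encode_gamma22(L: float) -> float:
    """Re-encode linear light for a pure gamma 2.2 display."""
    return L ** (1 / 2.2)

def decode_piecewise(v: float) -> float:
    """What a display implementing the piecewise sRGB EOTF shows."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# A gamma-2.2-encoded signal shown on a piecewise display: shadows come
# out many times brighter than intended.
for L in (1e-4, 1e-3, 1e-2):
    shown = decode_piecewise(encode_gamma22(L))
    print(f"intended={L:.0e}  displayed={shown:.2e}  boost x{shown / L:.1f}")
```

The mismatch vanishes at white (both curves map 1.0 to 1.0) and grows toward black, matching the "unwatchable blacks" description.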

@kasper93 (Member, Author) commented Oct 14, 2025

If it means that mpv will look different than Windows Media Player, so be it.

That said, this is problematic. mpv is a content consumption application, and its goal is to faithfully represent the author's intent for the image.

To do that, we can obviously follow the established specification, but if various parties interpret it differently, the specification doesn't do its job. In that case, we have to fall back to the best guess.

Now, in the case of PQ output, we effectively need to emulate the display characteristics to convert sRGB to PQ. We no longer care what the sRGB response of the user's display is, because PQ sidesteps this, but we have to know how to correctly interpret the input image. If the image was mastered on/for an sRGB piecewise EOTF display, we have to respect that.

I know folks claim that "vast majority of pc monitors are calibrated to gamma 2.2", and that we should therefore be using gamma 2.2 to reproduce sRGB content. While probably true, there are some caveats:

  • Monitors have for years provided various profiles, and if they have an "sRGB mode" it most likely means a piecewise EOTF and a limited gamut. (Some professional monitors also provide a gamma 2.2 profile or custom LUTs...)
  • Monitors in other modes are likely calibrated to G2.2, but then likely use a different gamut, so the image appearance is still wrong.
  • Virtually every HDR display on Windows (for the last few years) effectively uses the piecewise sRGB EOTF in HDR mode. Whether we like it or not, this targets a huge number of users, and content creators / game developers have to make sure their content looks correct. If they don't provide native scRGB or HDR output, anything using an sRGB swapchain is affected. (Ideally Windows would use G2.2, but honestly I don't see this changing.)
  • All screen capture on Windows uses sRGB piecewise encoding (at least for 8-bit captures). To correctly reproduce this, we need to use the piecewise EOTF. This affects all captures and likely livestreaming.
  • (Provided only for context; you can ignore this point.) It was "confirmed" in 2016 that the sRGB EOTF is piecewise. This was of course an unofficial email, but whether you like it or not, a statement like that is enough for companies and interested parties to align their interpretation of the standard. And that's all on this point; please don't discuss it further here. For me it is an engineering problem, and I'd like to stay focused on the present, not the past.

To achieve the goal, mpv and the compositor must agree how to convert sRGB to PQ. If compositors always linearize sRGB content using gamma22, mpv has to do the same if it wants to achieve the same goal. Whether or not that is defined in any standard other than wayland is irrelevant.

Like I said before, mpv's goal is not to align with Wayland, but to provide the best viewing experience. At the same time, Wayland is the only reason this patch exists. mpv itself is too small to step out in front and say everyone else is wrong about this sRGB thing.

I think the patch is good; I'm just thinking out loud about what the default option should be. I think G2.2 by default would be OK; my main problem is that outputting G2.2 to sRGB piecewise looks bad. And IMHO we cannot ignore the existence of hardware that does that.

Also, I don't like the platform-dependent hack, but I guess if Windows ever changes their conversion we can update our code...

(might also add similar hack for =no for Wayland)

@Headcrabed (Contributor):

For me, the current SDR ACM implementation is not acceptable, while the others seem OK. Although ACM always uses piecewise for the monitor TRC, MHC2 LUTs can be used to convert to another monitor gamma curve; using MHC2Gen with the --calibrate-transfer option can achieve this.

@sobotka commented Oct 14, 2025

I think G2.2 by default would be OK; my main problem is that outputting G2.2 to sRGB piecewise looks bad. And IMHO we cannot ignore the existence of hardware that does that.

Any identified EOTF encoding should implicitly yield a radiometrically linear output, the sole responsibility of the application would be to encode with the proper Presentation Transfer Characteristic, with the option for the audience to override should the incorrect EOTF be identified.

It was "confirmed" in 2016 that the sRGB EOTF is piecewise.

For the record, the email "confirming" this was authored by someone who was not involved in the writing of the specification, nor any of the individuals cited within the earlier document listed under acknowledgements. None of that matters however, as the document identifies the Display Input / Output Characteristic clearly, and relying on any external "reference" would be folly.

To this end, normative professional calibration and characterization defaults are likely more useful context for an application such as mpv.

@llyyr (Contributor) commented Oct 14, 2025

For the record, the email "confirming" this was authored by someone who was not involved in the writing of the specification, nor any of the individuals cited within the earlier document listed under acknowledgements. None of that matters however, as the document identifies the EOTF clearly, and relying on any external "reference" would be folly.

I believe that sentence is referring to ITU-R BT.2380, which unfortunately explicitly defines the sRGB EOTF to be the piecewise function. So now we have a standard that is a double standard to deal with...

My personal opinion is that we should (as with everything else in mpv) let this be fully configurable, and stop using the word "sRGB" to avoid muddying the waters even further.

@kasper93 (Member, Author):

To this end, normative professional calibration and characterization defaults are likely more useful context for an application such as mpv.

Yes, I know, I read it all. And then there is https://www.colour-science.org/posts/srgb-eotf-pure-gamma-22-or-piece-wise-function

If you are calibrating your display to the sRGB IEC 61966-2-1:1999 Standard, your calibration target should be the piece-wise function. If you are producing a display compliant with the sRGB IEC 61966-2-1:1999 Standard it should adopt the piece-wise function.

Clearly there is misalignment and it's not easy to get agreement on this topic. I don't argue with the use of G2.2, though.

mpv will provide an option, so users can decide for themselves. Most will not notice the difference, and those who do notice and care can adjust it to their liking.

Monitors have for years provided various profiles, and if they have an "sRGB mode" it most likely means a piecewise EOTF and a limited gamut. (Some professional monitors also provide a gamma 2.2 profile or custom LUTs...)

I did a quick check of some reviews, and it seems that by default monitors are generally factory-calibrated to a pure power response, while depending on the model the real value may vary from 2.1 to 2.4.

The point about "sRGB modes" still stands; those modes are most likely calibrated for the piecewise curve, while some brands like Asus provide more customizability. The dedicated "sRGB Cal" mode is indeed targeting the piecewise function, but separately there is an "sRGB Colour Gamut" option, which only limits the gamut. See https://tftcentral.co.uk/reviews/asus-rog-strix-xg32ucwmg

@aufkrawall:

The DisplayCAL sRGB gamma curve target apparently is also piecewise, btw. Same as what novideo_srgb (an unofficial alternative to ACM) does when explicitly selecting the sRGB curve. Whatever the significance of that, I just wanted to mention it at least once.

to ensure a consistent appearance. Depending on the platform, the sRGB EOTF
used by the system compositor may differ.

The default is ``auto``. (Only for ``--vo=gpu-next``)
Contributor:

I'm thinking about setting this to no by default, as we currently are not dealing with SDR ACM properly under auto.

Member Author:

As we currently are not dealing with SDR ACM properly under auto.

What do you mean?

@Headcrabed (Contributor) Oct 14, 2025:

In the SDR ACM case, we are still doing the piecewise-to-2.2 conversion in auto mode, but if a correct MHC2 LUT is used, this conversion is redundant. I would still prefer to apply this conversion in auto mode only for the ACM-disabled case (on win32).

@kasper93 (Member, Author) Oct 14, 2025:

I don't understand.

I would still prefer to apply this conversion in auto mode only for the ACM-disabled case (on win32).

In the ACM-disabled case, there is no need for any conversion. And in fact, with ACM enabled there is no need for any conversion either. No conversion currently happens in either of those cases.

Member Author:

Could you provide a scenario in which things are not working as expected?

Member Author:

It could be BT.709.

Set correct target-contrast for bt.1886.

source → 1886 EOTF → 2.2 OETF → OS ACM piecewise EOTF → linear scRGB mix → piecewise OETF → HW LUT to monitor-space EETF → monitor output

I don't think they are converting the gamma of sRGB. Also, can you confirm that you see different output with ACM on or off?

Contributor:

source → 1886 EOTF → 2.2 OETF → OS ACM piecewise EOTF → linear scRGB mix → piecewise OETF → HW LUT to monitor-space EETF → monitor output

I don't think they are converting the gamma of sRGB. Also, can you confirm that you see different output with ACM on vs off?

Which part of the pipeline do you mean by "convert gamma of sRGB"?

Also I can't test ACM now :(

Contributor

source -> 1886 EOTF -> 2.2 OETF -> OS ACM piecewise EOTF -> linear scRGB mix -> piecewise OETF -> HW LUT to monitor space EETF -> monitor output

I don't think they are converting the gamma of sRGB. Also, can you confirm that you see different output with ACM on vs off?

If you mean OS ACM's gamma conversion, it's not done in software but by LUTs from an ICC file generated by mhc2gen. mhc2gen can convert an ICC file produced by a calibration tool to the MHC2 format and modify the LUTs from the original ICC file to add an extra conversion from the piecewise curve to the monitor TRC described by the TRC tag in the input ICC file.

https://github.com/dantmnf/MHC2#notes-for-sdr-auto-color-management

Member Author

It still sounds to me that everything is working as expected.

Contributor

It still sounds to me that everything is working as expected.

There's a 2.2 OETF followed by a piecewise EOTF in there. This is not correct.

@sobotka

sobotka commented Oct 14, 2025

Yes, I know, I read it all. And then there is https://www.colour-science.org/posts/srgb-eotf-pure-gamma-22-or-piece-wise-function

That is precisely the post I was speaking of, and should be taken down due to the reasons mentioned. There's a deep, behind-the-scenes history there.

In the end, follow the specification. Per Mr. Shaw's point at Light Illusion, the displays were simply never two-part, so the piecewise reading is an ahistorical stance. Not that this prevents the agenda from being forwarded to this day.

@kasper93
Member Author

kasper93 commented Oct 15, 2025

I mentioned this issue in other threads, but let me reiterate it here, because this is somewhat of a blocker for this PR. (edit: it's not as bad if the correct target-contrast is used.)

Almost all PC captures and livestreams are tagged with CICP 1-1-1, which is bt.709. mpv sees that and applies the bt.1886 EOTF. All good, except this content is not really mastered for bt.1886; it's PC screen capture.

Let's assume the display is calibrated to G2.2, which we all agreed is the most common case (see the bottom for more chatter).

Now there are two ways to consume content like this:
A. Non-color-managed pipeline. No gamma conversion is done, so effectively the bt.1886 tag is ignored and the signal is sent to the display as-is. This looks good, because this is basically how it would look on your screen directly anyway.
B. Color-managed pipeline. Now this is more interesting. BT.1886 is almost the same as sRGB (see [1]) for a display with a peak luminance of 203 nits and 1000:1 contrast, which is the reasonable default used in mpv. That means that even though we do a gamma conversion, the end result is almost the same; there is a difference near black, but visually it looks good. Like I said, this depends on the contrast set in mpv via --target-contrast. Even though the content is tagged incorrectly as bt.1886, the transformation is almost an identity, so we roundtrip and things are ok.

After this PR, if we consider that sRGB really means G2.2 and try to send that to the display instead, we no longer have this roundtrip property and basically apply bt.1886 to content that is not meant for it. Again, see [1] for the G2.2 difference (for default display parameters).

However, this is not limited to this PR: if we use a contrast value other than the standard 1000, we diverge from the sRGB curve.

Technically this is not an mpv issue, because we can't realistically handle all mistagged content "correctly". But at the same time, sRGB screen recordings tagged as bt.1886 are so common that it's really hard to justify breaking all of them. Note that bt.1886 depends highly on setting correct luminance values for your display, and exactly the same issue happens when outputting to PQ. In general you should patch such content and use --vf=format=transfer=srgb for it. Regardless, all of this depends on a correctly configured mpv; if your display uses the sRGB piecewise curve, you have different issues. However, I would like the default config to keep working on a typical SDR display, but maybe I'm trying too much.

Just to put those words into images, let's consider this Diablo 4 screenshot. I'm not sure screenshots are the best way to visualize this, because they will look different depending on your display, but for reference:

sRGB -> sRGB no gamma conversion

mpv -v d4_gamma2p2.png --border=no --pause --no-config --treat-srgb-as-power22=no --target-peak=203 --target-contrast=1000 --target-colorspace-hint-strict=no --vf=format=transfer=srgb --target-trc=srgb --geometry=0x1080
(screenshot)

bt.1886 -> sRGB emulated mistagged sRGB content

mpv -v d4_gamma2p2.png --border=no --pause --no-config --treat-srgb-as-power22=no --target-peak=203 --target-contrast=1000 --target-colorspace-hint-strict=no --vf=format=transfer=bt.1886 --target-trc=srgb --geometry=0x1080
(screenshot)

bt.1886 -> sRGB emulated mistagged sRGB content, with typical OLED reported contrast

mpv -v d4_gamma2p2.png --border=no --pause --no-config --treat-srgb-as-power22=no --target-peak=203 --target-contrast=1000000 --target-colorspace-hint-strict=no --vf=format=transfer=bt.1886 --target-trc=gamma2.2 --geometry=0x1080
(screenshot)

bt.1886 -> gamma2.2 this is what would happen to common livestream with --treat-srgb-as-power22=both

mpv -v d4_gamma2p2.png --border=no --pause --no-config --treat-srgb-as-power22=no --target-peak=203 --target-contrast=1000 --target-colorspace-hint-strict=no --vf=format=transfer=bt.1886 --target-trc=gamma2.2 --geometry=0x1080
(screenshot)

bt.1886 -> gamma2.2 this is what would happen to common livestream with --treat-srgb-as-power22=both and typical OLED contrast

mpv -v d4_gamma2p2.png --border=no --pause --no-config --treat-srgb-as-power22=no --target-peak=203 --target-contrast=1000000 --target-colorspace-hint-strict=no --vf=format=transfer=bt.1886 --target-trc=gamma2.2 --geometry=0x1080
(screenshot)

[1] https://www.desmos.com/calculator/hgmz5xej0e


I was wondering why Microsoft picked the sRGB piecewise curve for the sRGB EOTF, as this produces a mismatch between PQ output and the common G2.2 SDR displays. The answer is simple: all Microsoft Surface tablet/laptop displays have been calibrated to the sRGB piecewise curve since forever; I checked reviews from over 10 years ago. It makes sense for them to pick consistent output on their hardware.

They even mention this in this blog post: https://techcommunity.microsoft.com/blog/surfaceitpro/high-dynamic-range-hdr-in-surface-displays/4100353

Standard Dynamic Range (SDR) is the legacy color processing standard for the Windows OS. In this mode, content RGB pixel values range from 0-255 (or a normalized range of 0.0 - 1.0), and the absolute luminance range is controlled separately by OS brightness control. The default color space and Tone Rendering Curve (TRC) for Windows OS SDR is defined as sRGB, which differs from the traditional gamma 2.2 curve. Accurate content color reproduction for display and content color profiles can be managed by applications such as latest version of Microsoft Photos.

So, while unfortunate, it is unlikely that the Windows implementation will change; it is the expected behavior. They could, though, do something like Apple does, where internal screens are handled differently from external ones.

@kasper93
Member Author

Added --sdr-adjust-gamma option.

@sobotka

sobotka commented Oct 16, 2025

Standard Dynamic Range (SDR) is the legacy color processing standard for the Windows OS. In this mode, content RGB pixel values range from 0-255 (or a normalized range of 0.0 - 1.0), and the absolute luminance range is controlled separately by OS brightness control. The default color space and Tone Rendering Curve (TRC) for Windows OS SDR is defined as sRGB, which differs from the traditional gamma 2.2 curve. Accurate content color reproduction for display and content color profiles can be managed by applications such as latest version of Microsoft Photos.

I'd be incredibly skeptical of this.

Does anyone have a factory default ICC from a Surface?

@kasper93
Member Author

Does anyone have a factory default ICC from a Surface?

I don't have the profiles themselves, but I've been looking at reviews and notebookcheck; one example: https://www.notebookcheck.net/Microsoft-Surface-Laptop-7-15-Lunar-Lake-review-A-slap-in-the-face-for-Windows-on-ARM.985263.0.html

It has two profiles, sRGB and Vivid; both use the sRGB curve, and the Vivid one has a bigger gamut.

Vivid:
(screenshot)

sRGB:
(screenshot)

Comment on lines +7369 to +7371
Additionally, ``bt.1886`` requires the display contrast ratio to be known for
correct rendering, which is often unavailable. Use ``--target-contrast`` to
specify it.
Contributor

@mahkoh mahkoh Oct 17, 2025

Keep in mind that I don't know what I'm talking about.

libplacebo implements the bt.1886 EOTF as follows:

        const float lb = powf(csp_min, 1/2.4f); // black level in 2.4-gamma space
        const float lw = powf(csp_max, 1/2.4f); // white level in 2.4-gamma space
        const float a = powf(lw - lb, 2.4f);    // overall scale
        const float b = lb / (lw - lb);         // black lift
        MAP3(a * powf(X + b, 2.4f));            // apply EOTF per channel

csp_min and csp_max are taken from the pl_hdr_metadata if available. mpv sets these values to the values from the target luminance event.

However, as far as wayland color management is concerned, the guidance is to use the values from the luminances event which can be expected to be significantly different. See here.

For color descriptions created by the client, the values from the set_luminances request would be used.

Member Author

Yes, that's a good point. We may need to update this too. Note that the target luminance values are only fully used in HDR mode, though recently min_luma is also used for SDR. That's because on Windows this descriptor generally gives us the HDR display capabilities.

Contributor

However, as far as wayland color management is concerned, the guidance is to use the values from the luminances event which can be expected to be significantly different. See here.

Are you sure this is right? target_luminance field should only differ from the luminance field when the monitor is in HDR mode, because the values in the EDID are only applicable in the HDR mode.

For bt.1886, both target_luminance and luminance should give us the same values. And I can confirm that this is the case on wlroots compositors.

For pq output, I get the following luminance and target_luminance:

wp_image_description_info_v1#76.luminances(0, 10000, 203)
wp_image_description_info_v1#76.target_luminance(7, 351)

If mpv chooses to use the reference luminance, we'll be tonemapping to a theoretical output that goes from 0-10000 nits while my display only does 351 nits, which would result in mpv doing no tonemapping at all.

IMO the comment should be dropped because there's no way to know the display contrast ratio without a colorimeter, since this information is not provided via the EDID.

Member Author

IMO the comment should be dropped because there's no way to know the display contrast ratio without a colorimeter, since this information is not provided via the EDID.

wp_image_description_info_v1#76.target_luminance(7, 351)

This is your display contrast, no?

Contributor

This is your display contrast, no?

Yes, and as I said, that's fine for PQ mode. This information is only reliable in HDR mode, which only activates when the monitor gets a PQ signal.

For every other mode, the target_luminance and luminance fields should both contain the reference display luminance, because we can't know the specifics of the user's display.

This is my display's EDID https://github.com/user-attachments/files/22932843/test.txt and the luminance info is under the HDR block, those values aren't applicable for SDR mode.

Member Author

@kasper93 kasper93 Oct 17, 2025

With that in mind, and knowing that the reference luminance for SDR is 100 nits, the backlight doesn't need to run as bright so the darker pixels can be darker than they would in HDR mode.

So what's the problem? If the value is brighter than the actual black level, we would raise our black level output a bit, which is fine. It's still better to use 0.02 nits instead of 0.2 nits if the display has a 0.002 nit real black level.

Contributor

Feel free to do what you want, I won't be commenting on this PR again

Member Author

@kasper93 kasper93 Oct 18, 2025

Can you give examples of when it is not correct to use those values?

https://web.archive.org/web/20171201033424/https://standards.cta.tech/kwspub/published_docs/CTA-861-G_FINAL_revised_2017.pdf See 7.5.13

The HDR Data Block indicates the HDR capabilities of the Sink
Byte 7 indicates the Desired Content Min Luminance. This is the minimum value of the content (in cd/m2) that the display prefers for optimal content rendering

The field does not describe the panel's darkest pixel output, it describes the HDR10 mastering spec that the display can display.

Could you provide exact quotes about this? Or is this selective reading? I know the block is called "HDR", but it is well defined to describe SDR too.

(screenshot)

I can agree that this min luminance may not always be accurate, but it is probably a better guess than using a single value for every monitor. Also, if it is not available, the reported value will be empty or substituted by the standard SDR one.

(The same goes for primaries: SDR mode almost never means sRGB primaries, as almost all displays reproduce their native gamut in SDR too.)

In reality, this value can be much lower for SDR modes for OLED models and higher/lower in SDR mode for LCDs. Depending on features like local dimming or type of backlight, this can change in SDR.

My worry is only that it may be too low. You argue for setting it to a hard-coded 0.2 nits, while also arguing that this value should be much lower than what is reported by monitor metadata, which is almost always lower.

Feel free to do what you want, I won't be commenting on this PR again

Ah yes, great technical discussion: stop responding because others don't agree.

Contributor

Could you provide exact quotes about this? Or is this selective reading? I know the block is called "HDR", but is well defined to describe SDR too.

Are you doing selective reading here? You can scroll down to see exactly that Data Bytes 3-22 only define mastering data for ST2084.

Data Bytes 3 – 22 contain the Display Mastering data defined in SMPTE ST 2086

If a Source does not have SMPTE ST 2086 [41] metadata to send, then Data Bytes 3-22 shall be populated with zeroes to indicate the data is not provided

If this data was meant to be used for other modes besides ST2084, why would it be populated with zero if the display doesn't support ST2084?

You argue to set it to hard coded 0.2 nits, while also arguing that this value should be much lower than reported by monitor metadata, which is almost always lower.

Citation needed. Every EDID I have looked at has minimum luminance between 0.15-0.25 nits.

Also, using the minimum luminance from this field doesn't solve the problem for displays that don't have an HDR mode at all, since they won't have this field populated. You said it's so important that we set the minimum luminance (which I agree with), but then why is it fine to leave SDR monitors in the dust? You should set it to a constant 0.2, because that's what the reference display is, instead of guessing based on the moon phase.

Ah, yes great technical discussion, stop responding, because others doesn't agree.

I said that because you ignored the rest of the comment and only responded to one example I cited. It can be lower OR higher than the value in the EDID, depending on the monitor's feature set. It's not about technical discussion; it's about discussing in good faith, and I get the impression you're married to this change for some reason, and nothing I say could change your mind.

I will repeat myself one more time, clearly. I have no problem with setting a min_luma value for SDR. I have a problem with using the HDR min luma value from the EDID, because this is basically equivalent to using rand() in a range of 0.15-0.25. If we use a constant value of 0.2, at the very least we're grounded and have some sort of basis, instead of a completely arbitrary value that happens to be near the reference sRGB display.

Member Author

@kasper93 kasper93 Oct 18, 2025

Let's first try to align on the issue, because I don't even understand why there is a problem here.

Are you doing selective reading here? You can scroll down to see exactly that Data Bytes 3-22 only define mastering data for ST2084.

Data Bytes 3 – 22 contain the Display Mastering data defined in SMPTE ST 2086

If a Source does not have SMPTE ST 2086 [41] metadata to send, then Data Bytes 3-22 shall be populated with zeroes to indicate the data is not provided

If this data was meant to be used for other modes besides ST2084, why would it be populated with zero if the display doesn't support ST2084?

You are quoting a paragraph about ST2086 while talking about ST2084. I think you mistook ST2086 for ST2084.

ST2086 is just a definition of mastering display metadata, which was adopted to describe displays in general. It's not tied to any specific format.
(screenshot)

Wayland does not specify that this is only valid in any given mode. Notice the "should"; it just specifies that the metadata should be compatible with the standard ones.
(screenshot)

If this data was meant to be used for other modes besides ST2084, why would it be populated with zero if the display doesn't support ST2084?

I don't know. ST2084 is not mentioned in this document with regard to metadata, besides the transfer function.

Citation needed. Every EDID I have looked at has minimum luminance between 0.15-0.25 nits.

It depends on panel tech: IPS will likely hover around a 1000:1 contrast ratio, while OLEDs generally report 0.002 or 0.0005 nits. There is no single value. That's why using the reported value gives users some benefit.

None of them really do SDR certification, but you can look at the value ranges for the VESA HDR certification: https://displayhdr.org/performance-criteria/

Also using minimum luminance from this field doesn't solve the problem for displays that don't have HDR mode at all, since they won't have this field populated. You said it's so important that we set minimum luminance (which I agree with), but then why is it fine to leave SDR monitor in the dust? You should set it to a constant 0.2 because that's what the reference display is, instead of guessing based on the moon phase.

A 1000:1 contrast has been the default in mpv/libplacebo since forever: 203 / 1000 = 0.203 nits. This is the default value if no other is provided.

I get the impression you're married to this change for some reason, and nothing I say could change your mind.

Quite the opposite. I'm trying to find out when/how the reported value won't work. I'm not convinced that 0.2 nits works for everyone; at least it doesn't for me. Though I'm aware that the black level depends on many factors, including ambient light conditions. Displays may have different contrast ratios in different modes, like you said. But the big gain of using a consistent black level is consistent rendering of SDR content in both SDR and HDR mode (well, with --sdr-adjust-gamma).

A screenshot will not show this exactly (probably not at all), but if you look at this, the outer border is black at 0.22 nits, while the inner one is black at 0 nits. They may look the same on IPS displays, but definitely don't look the same on OLED. I stop seeing a difference at around 0.008 nits in a brightly lit room, and way lower in a dark room (< 0.001 nits; I can't test in a pitch-dark environment right now). The display reports a black level of 0.0002 nits, which is reasonable; sure, it's a best-case scenario and can be adjusted, but at the same time it wouldn't look like gray. We don't want to emulate an IPS or TN display in libplacebo; that's not our goal.
(screenshot)

I don't mind; we can revert this change. From what I've seen, people were already adjusting target-contrast manually, so for them it would be no different.

@sobotka

sobotka commented Oct 17, 2025

I don't have profiles itself, but I've been looking at reviews and notebookcheck, one example

Doesn't the legend indicate that one curve is the pattern generator and the other is the native response?

@kasper93
Member Author

I don't have profiles itself, but I've been looking at reviews and notebookcheck, one example

Doesn't the legend indicate that one curve is the pattern generator and the other is the native response?

It's the ideal response vs the display's measured response. Not sure what your question is, exactly.

@Headcrabed
Contributor

Headcrabed commented Oct 17, 2025

Standard Dynamic Range (SDR) is the legacy color processing standard for the Windows OS. In this mode, content RGB pixel values range from 0-255 (or a normalized range of 0.0 - 1.0), and the absolute luminance range is controlled separately by OS brightness control. The default color space and Tone Rendering Curve (TRC) for Windows OS SDR is defined as sRGB, which differs from the traditional gamma 2.2 curve. Accurate content color reproduction for display and content color profiles can be managed by applications such as latest version of Microsoft Photos.

I'd be incredibly skeptical of this.

Does anyone have a factory default ICC from a Surface?

Here it is, from my Surface Pro 8. Be aware that its screen is not wide-gamut but sRGB.
color.zip

// sRGB to PQ output. We are not concerned about this case, as it would
// look wrong anyway.
bool target_pq = !target_unknown && target_csp.transfer == PL_COLOR_TRC_PQ;
if (opts->treat_srgb_as_power22 & 4 && target_pq)
Contributor

What I want is to also add a check for ACM here. When ACM is enabled, we should also use the piecewise TRC.

Member Author

That's why gamma correction is disabled for SDR now.

It is impossible to know if ACM will convert gamma or won't do any color adjustment.

Hence why we apply the "correction" only to PQ output.

Similarly we don't know if the display itself is calibrated to sRGB or gamma2.2.

And while we can assume gamma2.2 by default in the non-ACM case, with ACM we don't know if any conversion will happen.

That's why we have an option and you can adjust to fit your setup.

Contributor

OK, you persuaded me. But I want to add some more information to this option's description, like the following:

When set to ``auto``, this option should have no effect by default when using Vulkan on macOS and Wayland. But in Windows SDR ACM cases with D3D11 or Vulkan, if you have calibrated your monitor and used MHC2Gen with the ``--calibrate-transfer`` option to convert your ICC file to the MHC2 format, set this option to ``no`` to keep color accuracy.

More information here: https://github.com/dantmnf/MHC2#notes-for-sdr-auto-color-management

Member Author

That's why I initially wanted to do this conversion only if both input and output are sRGB.

The sRGB reference display is defined as a 2.2 gamma device. To preserve
the look of sRGB content as mastered on such a device, linearize it as such.

Note that the sRGB encoding is piecewise with a linear segment, which creates
a mismatch with the pure power 2.2 function, but the content is intended to be
viewed on such a display.

See:
IEC 61966-2-1-1999
https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024
KhronosGroup/DataFormat#19
https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/12
https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm

7 participants