I got to thinking about the display of color values and the histogram in Lightroom. Internally, Lightroom uses a working space with ProPhoto RGB primaries but a linear gamma. A linear gamma avoids all kinds of possible artefacts and is a good choice if you have the bit precision. For the display of RGB values as percentages, and for the histogram, Lightroom applies an sRGB tone curve to the values so that they don't all end up in the 0-1% range. I was curious what this really means. The original proposal by the group from HP and Microsoft that designed sRGB explains this more or less. sRGB does not use a simple 2.2 gamma; it uses a compound curve with a linear knee in the shadows. The equations are as follows:
If you have a linear R, G, or B value in sRGB primaries, the equations to find the value in nonlinear sRGB space are:

if R ≤ 0.00304:
  Rnl = 12.92 · R
otherwise:
  Rnl = 1.055 · R^(1/2.4) − 0.055

repeated for green and blue, of course.
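To make the piecewise curve above concrete, here is a minimal Python sketch; the function name `linear_to_srgb` is mine, and the constants are the ones from the original HP/Microsoft proposal quoted above:

```python
def linear_to_srgb(x: float) -> float:
    """Apply the sRGB tone curve to a linear value in [0, 1].

    Below the knee at 0.00304 the curve is a straight line with
    slope 12.92; above it, a 1/2.4 power law, scaled and offset so
    the two segments join smoothly.
    """
    if x <= 0.00304:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

# Shadows take the linear branch, everything else the power branch:
print(linear_to_srgb(0.002))  # linear branch: 12.92 * 0.002
print(linear_to_srgb(0.5))    # power branch
print(linear_to_srgb(1.0))    # full scale maps to exactly 1.0
```

Note that the offset of 0.055 makes the power branch hit exactly 1.0 at full scale, since 1.055 · 1 − 0.055 = 1.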
So, apart from a multiplication by 100 to get percentages, this is exactly the math Lightroom uses to convert the linear ProPhoto RGB values to the sRGB-tone-curve-modified ProPhoto RGB display space (I'll call that the Lightroom Value Space, or LVS). This tone curve looks like the following on a double-log plot.
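Assuming the displayed value really is just this curve times 100, as described above, the full conversion from a linear ProPhoto RGB component to an LVS percentage is a one-liner (the function name `lvs_percent` is mine):

```python
def lvs_percent(x: float) -> float:
    """Convert a linear ProPhoto RGB component in [0, 1] to an LVS percentage."""
    nl = 12.92 * x if x <= 0.00304 else 1.055 * x ** (1 / 2.4) - 0.055
    return 100.0 * nl

# A linear mid-grey of 0.18 displays well above 18%:
print(round(lvs_percent(0.18), 1))  # → 46.1
```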
In red the sRGB tone curve, and in blue the curve for a simple 2.2 gamma. You can see that for values below 0.05 in linear space, the deviation from a 2.2 gamma is larger than 5%. Conversely, in the nonlinear LVS space, values below 10% differ significantly from what you would get assuming a simple 2.2 gamma. This means that the values I published before for the MacBeth ColorChecker patches in the LVS system are all wrong. Here are the correct values:
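The deviation between the two curves is easy to check numerically; here is a quick comparison (my own spot check, not part of the plot) of the sRGB tone curve against a plain 2.2 gamma at a few linear values:

```python
def srgb_curve(x: float) -> float:
    """The sRGB tone curve from the original proposal."""
    return 12.92 * x if x <= 0.00304 else 1.055 * x ** (1 / 2.4) - 0.055

def gamma22(x: float) -> float:
    """A simple 2.2 gamma, for comparison."""
    return x ** (1 / 2.2)

# Deep shadows deviate strongly; midtones and highlights barely at all.
for x in (0.01, 0.05, 0.5):
    rel = (srgb_curve(x) - gamma22(x)) / gamma22(x)
    print(f"linear {x}: relative deviation {rel:+.1%}")
```

The deviation grows rapidly as the values approach the linear knee, which is exactly why shadow values read off a 2.2-gamma assumption are so far off.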
Sorry for posting this as a graphic; I still haven't figured out how to make tables display correctly in Blogger. You can clearly see that the values are significantly different. I hope this is useful to somebody!