Gamma correction doesn't look properly corrected, is this linear?


The results you are getting are exactly what is to be expected.

You seem to be confusing the physical radiometric output of the display with the brightness perceived by a human. The latter is nonlinear, and the simple gamma model is one way to approximate it. Essentially, the monitor inverts the nonlinear response of the human visual system, so that the standard (nonlinear) RGB space is perceived roughly linearly: an RGB intensity of 0.5 is perceived as about half as bright as 1.0, and so on.

If you were to put a colorimeter or spectrophotometer on your display while showing your gamma-corrected grayscale levels, you would actually see that the 0.73 step emits roughly 50% of the luminance of the white level in candela/m² (assuming your display does not deviate too much from the sRGB model, which, by the way, does not use a pure gamma of 2.2, but a linear segment near black combined with a gamma of 2.4 above it; 2.2 is only an overall approximation).
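
To make the numbers concrete, here is a minimal sketch (plain Python, no external libraries; the function names are mine) of the exact sRGB decoding just described, compared with the gamma-2.2 approximation. It confirms that an encoded value around 0.73 corresponds to roughly half the physical luminance of white:

```python
# A minimal sketch, assuming the standard piecewise sRGB definition.

def srgb_to_linear(c: float) -> float:
    """Exact sRGB decoding: linear segment near black, gamma 2.4 above it."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def srgb_to_linear_approx(c: float) -> float:
    """The common gamma-2.2 approximation of the same curve."""
    return c ** 2.2

# The ~0.73 gray step from the question decodes to about 50% luminance:
print(srgb_to_linear(0.735))         # ~0.500
print(srgb_to_linear_approx(0.735))  # ~0.508
```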

Now the question is: what exactly do you want to achieve? Working in a linear color space is typically required if you want to do physically accurate calculations. But then, a light source with 50% of the luminance of another does not appear half as bright to a human, so the result you got is basically correct.
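
As an illustration of why the linear space matters for physical math, here is a sketch (reusing srgb_to_linear from the snippet above; linear_to_srgb is simply its inverse) that mixes black and white pixels 50/50. Averaging the encoded values gives 0.5, while averaging the actual light lands at about 0.735, the mid-gray from your measurement:

```python
def linear_to_srgb(l: float) -> float:
    """Inverse of srgb_to_linear (sRGB encoding)."""
    if l <= 0.0031308:
        return l * 12.92
    return 1.055 * l ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

# Averaging the encoded values directly (wrong for physical light):
naive = (black + white) / 2                                   # 0.5

# Averaging the actual light, then re-encoding (physically correct):
physical = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(naive, physical)  # 0.5 vs. ~0.735
```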

If you just want a color space that is linear in perceived brightness, you can skip the gamma correction completely, since that is exactly what sRGB already tries to provide. You might just need some color calibration or a small gamma adjustment to correct for deviations introduced by your display, if you want exact results.
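
For instance, a perceptually even gray ramp is then just equally spaced sRGB code values; decoding them (again with srgb_to_linear from the first sketch) shows how unevenly the corresponding physical luminances are spaced:

```python
# Equally spaced sRGB code values are already roughly even perceptually,
# while the physical luminances behind them are anything but evenly spaced:
for i in range(5):
    c = i / 4
    print(f"sRGB {c:.2f} -> linear luminance {srgb_to_linear(c):.3f}")
# sRGB 0.50 emits only ~21% of peak luminance, yet looks about half as bright.
```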
