Heads up for those who are starting Cyberpunk 2077
Based on what I’ve seen here on Xbox - you might want to turn HDR off if you want the best visual experience. There are problems (and the HDR isn’t really all that great anyway).
Or, if you really can’t be bothered doing that, you need to keep the HDR peak luminance pretty low (and a mid-tone adjustment of around 0.90 will suit most people, I suspect).
And this is when it starts to become problematic:
Setting the mid point value at the default of 2 gives a level of exposure and highlight detail very similar to that of the SDR mode - albeit much brighter overall. Unfortunately, this overall increase in brightness comes at the expense of near-black performance, giving an image that lacks punch and contrast in lower-light scenes.
Dropping the mid tone value lower deepens near-black and resolves more detail in highlights; however, issues with either the tonemapper itself or bugs in the colour treatment reveal themselves more readily.
Here you can see some of the game’s upper values are actually incorrect, creating a weird inversion effect. You can see this in most of the specular highlights in this area.
The floor here suffers from the same problem. This problem doesn’t exist at all in the SDR mode.
Mid point tone map setting
If you are going to play in HDR, somewhere between 1 and 1.50 on this scale may be preferable. (Chances are that 1.8 should be the official mid-grey point, since 18% reflectance is the universally recognised value for mid grey.) But for many, I suspect this won’t be punchy enough.
I suspect that the luminance values are simply scaling up the entire image using a gain/contrast-style adjustment from the base 80-nit (1,1,1 in scRGB) image.
Whilst this is a possible strategy for creating an HDR image, experience tells me that once you get over around 600 nits, things just start to become a little bit too much. Not only do things that are meant to be dim look incredibly bright, but mid values also start to elevate.
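If that guess is right, the maths is simple: everything in the 80-nit image, mid tones included, gets multiplied by the same factor. A rough sketch of what I mean (the scaling model and the numbers are my assumption, not anything from the game’s actual code):

```python
# Sketch of the gain hypothesis: the HDR output is just the 80-nit SDR
# image scaled linearly up to the chosen peak luminance. This model is
# my assumption about the behaviour, not the game's actual code.

SDR_REFERENCE_NITS = 80.0  # scRGB (1,1,1) reference white

def scaled_luminance(sdr_nits: float, peak_nits: float) -> float:
    """Linearly scale an SDR luminance value up to the HDR peak."""
    return sdr_nits * (peak_nits / SDR_REFERENCE_NITS)

# Mid grey (18% reflectance) sits around 14.4 nits in the 80-nit image.
mid_grey_sdr = 0.18 * SDR_REFERENCE_NITS

for peak in (600, 1000, 4000):
    print(f"peak {peak}: mid grey -> {scaled_luminance(mid_grey_sdr, peak):.0f} nits")
```

At a 1,000-nit peak, mid grey would land around 180 nits - far above where a graded HDR image would normally place it - which would explain the elevated mids.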
Light sources pretty much all scale up with this, meaning emissive textures (of which there are many) all look almost uniformly bright.
This is particularly noticeable during the initial sequence of the game: the holograms in the lobby of the office (corpo intro) are intensely red and bright. Obviously these things don’t exist in real life, but there is a well-established visual language for how they look in sci-fi.
In your boss’s office there are fabric lampshades which appear almost as bright as the coloured neon/LED materials.
It’s all just too visually intense, with an apparent lack of variation and nuance between these emissives. This becomes more obvious if you are using brighter max luminance values, which drag even apparently dim surfaces and specular highlights up to super-bright levels.
Something you may be familiar with if you’ve tried Xbox’s Auto HDR or other approaches to SDR-to-HDR conversion.
If you are using a very bright display and you try to match the in-game values to the max luminance, you will have an eye-searing image, which in itself will cause issues for local-dimming algorithms and perhaps exacerbate blooming.
Again, dialling the mid point value down lower actually helps with the mid tones, but you start to lose the haziness around light sources, something that is clearly key to the look of the game.
The PC version has an anomaly: it offers both an HDR10 mode and an HDR scRGB mode, which is weird, as scRGB is not an actual colour space for output.
500 / 1.8 (max luminance / mid point) is what I’m currently running.
Or just save yourself the hassle and run in SDR.
I think some of the issues above have been tweaked and aren’t as bad as they were.
After playing around some more I *think* I have a better understanding of what is going on.
Max brightness doesn’t appear to drive the typical highlight roll-off that we expect a game to provide - much like we have seen in a few other titles, God of War and Destiny 2 being the best known to do the same thing.
Max brightness appears to control the gain, or the contrast setting. I suspect 10,000 nits is in fact the default, as the game’s output is scaled from its base setup to the game’s SDR-tuned output.
The problem with reducing this is that it also brings down the mid-tone brightness by the time you move the white point down to a usable 1,000 nits or so.
The mid tone adjustment has two effects on the image: reducing it makes the darker parts of the image darker, increasing the perceived contrast; increasing it creates more of a bloom-like effect, which smooths out some highlight detail.
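One plausible way to model that slider (purely my guess at the behaviour, not CDPR’s actual implementation) is a power curve over the normalised signal, anchored so black and white stay put and the default of 2 is a no-op:

```python
# Assumed model of the mid-point slider: a power curve over the 0..1
# signal, with the default value of 2 acting as the identity. Lower
# values push mids down (more contrast); higher values lift them
# (more bloom). A guess at the behaviour, not the game's actual code.

def midtone_adjust(signal: float, mid_point: float, default: float = 2.0) -> float:
    """Remap a normalised 0..1 signal; black (0) and white (1) are unchanged."""
    exponent = default / mid_point  # >1 darkens mids, <1 lifts them
    return signal ** exponent

print(midtone_adjust(0.5, 2.0))  # 0.5  - default, no change
print(midtone_adjust(0.5, 1.0))  # 0.25 - mids darkened, more contrast
print(midtone_adjust(0.5, 4.0))  # ~0.71 - mids lifted, bloom-ier
```

Note how black and white are pinned either way - only the mids move, which matches the contrast-vs-bloom trade-off described above.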
Now my objective is to have an image that is
A: Bright enough for general play
B: Not so obviously raised at the dark end of the image
C: Pulls out some additional highlight detail in the very high-nit parts of the image.
So one way around this is to choose an EOTF curve on your display that retains highlights:
On LG sets this is DTM Off - a 4,000-nit roll-off.
On Samsung sets you can change the gamma / ST.2084 setting to -3.
You can then set the game’s maximum output to 3,500-4,000
And drop the mid point value back to somewhere between 1 and 1.8 (depending on whether you want more contrast or more bloom)
This will brighten the image, provide a highlight roll-off, and give you an image that is not so dim and slightly more contrasty.
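To illustrate why handing the roll-off to the display helps (the curve and numbers here are made up for the example - real TVs use fancier tone-mapping curves): hard clipping flattens every highlight above the panel’s real peak into one value, while a soft knee compresses the 3,500-4,000-nit range so some gradation survives.

```python
# Illustrative comparison: hard clip vs a simple soft knee that maps
# signal above the knee into the panel's remaining headroom. Real TVs
# use fancier curves; the numbers here are made up for the example.

def hard_clip(nits: float, panel_peak: float) -> float:
    return min(nits, panel_peak)

def soft_knee(nits: float, knee: float, panel_peak: float, signal_max: float) -> float:
    """Linear below the knee, then compress knee..signal_max into knee..panel_peak."""
    if nits <= knee:
        return nits
    t = (nits - knee) / (signal_max - knee)
    return knee + t * (panel_peak - knee)

# A 1,000-nit panel fed a 4,000-nit signal, knee at 800 nits:
for highlight in (2000, 3000, 4000):
    print(highlight, hard_clip(highlight, 1000), soft_knee(highlight, 800, 1000, 4000))
```

With the hard clip, 2,000-, 3,000- and 4,000-nit highlights all collapse to the same value; with the knee they stay distinguishable, which is the kind of gradation the DTM Off / gamma -3 modes preserve.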
Give this a go and let me know what you think!