Is it just me, or does GT Sport look better with HDR off?

  • Thread starter MarkGT007
  • 95 comments
  • 33,103 views
Quick question: should I put my TV in game mode or HDR mode? Thanks

These are two separate things; one should not affect the other. Turn on both for the optimal gaming experience. Game mode refers to input lag on the TV: the lower the number, the better. You will see a slight difference in video quality in game mode, but it is barely noticeable depending on the quality of the TV. HDR will give a wider color gamut, resulting in a more colorful and vibrant picture.
 
Before I got my current TV (55" Samsung with 4K and HDR) I wasn't really a fan of it, as I'm not keen on graphics / visual details altogether.

My TV with the PS4 Pro looks very good, but to be honest, had I not used my PS4 on the same TV in 1080p without HDR (to match my capture card) and moved to 4K and HDR immediately after, I wouldn't have noticed any difference. But once you notice it, you can't unsee it ;-)
 
I still run an old Panasonic plasma because, to match the image quality, I'd need to spend thousands on an OLED screen. LEDs in general, at consumer prices, have a very fake-looking picture that I cannot warm to at all. And HDR on LED screens looks terrible most of the time unless you have absolutely fine control over the effect.
 
The current LEDs are nearly on par with OLED and superior to old plasma.

The critical thing is calibrating the screen. LEDs look like crap in the store: the store display mode crushes the blacks, the colors are off, and it looks like garbage.
 
I can't speak to GT Sport because I've never compared it running on a non-HDR screen. I run GT Sport in game mode with all the TV's filters off to get less input lag.

But in other games I use it, and I was able to compare Division 2 running with and without HDR on the same brand of TV as mine, but one that doesn't support HDR. I was shocked: the difference is so big that it almost seems like you're looking at two different generations of games. It's almost like Division 2 running on a non-HDR screen looks like Division 1, while on an HDR screen it looks like Division 2 running on a PC with high settings enabled. The colors and the atmosphere of the game are so much richer that it's almost impossible to describe the difference if you're not looking at it with your own eyes.
But HDR is like all other features: in some games the implementation is good, and in others it's just bad.
 
I set my new 50-inch 4K TV to HDR on and calibrated the PS4's HDR options as it suggested. In doing so, any game with any form of motion left a ghost image of whatever was in motion at all times. This was prevalent in GTS and Project CARS 2 especially.

So I've no idea if I got something wrong there or what that was. The TV was in full HDR and game mode. And the end result was that GT Sport looked dull and washed out. And both GTS and PCARS2 had the blurry ghosting of items in motion.

So to me, it definitely looks better "off". And IMO it's an overhyped option for gaming.
 

You're doing something very wrong. It is far from overhyped, as others have said. It's a huge difference and visually far, far superior.



In regard to pixel response time, plasma TVs are lightning fast compared to LED LCD or even OLED.

That's true vs. LED, which has a fastest response time of about 1 ms vs. 0.002 ms for plasma. But OLEDs are down to 0.001 ms.

Despite the small edge in response time over LED, plasma gives up a lot more in the following:

  • No 4K
  • No HDR support
  • Outdated HDMI
  • Higher input lag
  • Far, far lower peak brightness: 100-200 nits vs. 800-1000 nits on modern screens

The plasma has a lower response time but double to quadruple the input lag. Honestly, a sub-4 ms response time is a wash, especially when you factor in the frame-rate issues of the PS4. Considering most screens are sub-4 ms, I'll take the far lower input lag with 4K HDR over 1080p with high input lag.

Best is to go OLED and have everything IMO.
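
To put the "wash" argument in rough numbers (these are illustrative figures I'm assuming for the sake of the example, not measurements of any real set): what you feel is roughly input lag plus response time, so a microscopic response-time edge vanishes once the input lag doubles.

```
# Illustrative only -- assumed figures, not measurements of any real set.
# What you perceive is roughly input lag + pixel response time (milliseconds).
def total_latency_ms(input_lag_ms, response_ms):
    return input_lag_ms + response_ms

plasma = total_latency_ms(40.0, 0.002)    # assuming ~2-4x the lag of a modern set
modern_led = total_latency_ms(15.0, 1.0)  # assuming a decent game-mode figure

print(f"plasma:     {plasma:.1f} ms")      # 40.0 ms
print(f"modern LED: {modern_led:.1f} ms")  # 16.0 ms
```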
 
CAN THIS BE POSTED A LOT IN THIS GROUP PLEASE:

HDR photography is COMPLETELY DIFFERENT from HDR video.

HDR photography takes multiple photos at different exposure levels to mitigate over- or under-exposure in a photograph. This was originally created to work around a camera sensor having its exposure adjusted across the whole sensor, making it behave a little more like a real eye. For example, if you look out of a bright window your eyes can still see the inside of the window frame; a camera cannot. People also use this to edit a photograph to silly levels for artistic effect.

HDR videography is a colour-space and brightness format which enhances the contrast ratio and widens the colour gamut. An HDR camera ups the number of available colours to 1 billion distinct colours (in the case of HDR10) or 68 billion (in the case of Dolby Vision). An HDR display must have either a peak brightness of over 1000 cd/m2 and a black level less than 0.05 cd/m2 (a contrast ratio of at least 20,000:1) or a peak brightness of over 540 cd/m2 and a black level less than 0.0005 cd/m2 (a contrast ratio of at least 1,080,000:1).

This is, I'll admit, confused by smartphone manufacturers enabling 'HDR video' in their camera apps. These are not (unless you own a certain LG phone) HDR video; they merely make use of HDR photography techniques whilst capturing a video.

You also can't post pictures on here demonstrating the difference between HDR and non-HDR, as phones do not kick in HDR (even if they are capable of it) on web pages or still images - only in moving video encoded with a minimum of a 10-bit colour space - so it is POINTLESS.

Sorry, but as someone with a passion for photography and videography it annoys me that the two terms are used to describe completely different things.
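
If anyone wants to sanity-check the colour counts above, the arithmetic is simple enough to run yourself (a quick sketch using the bit depths quoted above):

```
# Colours available at a given bit depth: 2^bits per channel, three channels.
def colours(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(f"{colours(10):,}")  # 10-bit HDR10:        1,073,741,824 (~1 billion)
print(f"{colours(12):,}")  # 12-bit Dolby Vision: 68,719,476,736 (~68 billion)
```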
 
Good points, but what do you mean by 1000 cd/m2 being a minimum? I'm sure you know that depends on the image currently being displayed, and also for how long. I mean, my Samsung Q6FN tops out at 860 at a 10% window (not sustained) and still looks absolutely brilliant in HDR, especially GTS. Contrast is "only" 6,700:1, which is great for an LCD.
 
As far as I know there are THREE HDR standards for monitors.

HDR1000 = 1,000 cd/m2
HDR600 = 600 cd/m2
HDR400 = 400 cd/m2

You want a 1,000-nit unit, but these are the most expensive and the rarest.

eg. Philips 43M6VBPAB 43" 4K UHD Adaptive Sync 4MS HDR1000 VA QLED Monitor

600 is acceptable.

400 is barely HDR, given that 400 cd/m2 isn't a huge step up from the 300/350 cd/m2 of a typical screen.
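
If it helps, here's a trivial sketch of how I'd bucket a monitor against those three tiers (thresholds as I listed them above; real-world panel behaviour is another matter):

```
# The three monitor HDR tiers as listed above: name -> minimum peak cd/m2.
TIERS = [("HDR1000", 1000), ("HDR600", 600), ("HDR400", 400)]

def hdr_tier(peak_nits):
    """Return the highest tier the quoted peak brightness satisfies, or None."""
    for name, minimum in TIERS:
        if peak_nits >= minimum:
            return name
    return None

print(hdr_tier(1000))  # HDR1000
print(hdr_tier(350))   # None -- a typical SDR screen
```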

I'm still kind of looking for the holy grail of a monitor that can do it all...

eg. 27" 1,440p 144hz IPS and HDR1000

Not sure what the PS4/PS5 and GT Sport support.
 
Still... the actual cd/m2 depends greatly on the content being displayed. It ranges from 600-1100 for the 2019 Samsung Q8 (which is quite embarrassing, considering my 2018 Q6 is very close, and even has a better contrast ratio than the 2019 Q8, btw).

So which one are we talking about? 10% window peak brightness, or sustained 50%, or what?
 
IMHO most HDR looks awful unless done carefully by a professional editor.

All HDR does, as per the name, is extend dynamic range, so solid blacks have details revealed and blown-out whites now show some detail. Instead we get this super-bright, super-fake, super-saturated rubbish, complete with halo effects.

Good HDR would mean you barely notice it; otherwise it has been overdone.
 

You are describing HDR in photography... Not in video terms.

Please read my post above and you'll see what I mean.

Still... the actual cd/m2 depends greatly on the content being displayed. It ranges from 600-1100 for the 2019 Samsung Q8 (which is quite embarrassing, considering my 2018 Q6 is very close, and even has a better contrast ratio than the 2019 Q8, btw).

So which one are we talking about? 10% window peak brightness, or sustained 50%, or what?

It doesn't depend on what content is displayed - assuming we are talking about a 'normal' LED-backlit LCD panel. When an LCD panel is in use it is backlit at a constant brightness (on or off, at least), so the 'nit' or cd/m2 rating is based purely on the backlight at full brightness. Obviously there is user control, but it's peak brightness that is 'rated'.

Between the backlight and your eyes is a TFT (or Thin Film Transistor) panel and a colour panel - see the diagram below.

[diagram: LED-backlit LCD structure - backlight, TFT panel, colour panel]


When it comes to an OLED TV it is a little different, as OLED is essentially the backlight, TFT and colour panel all in one, since the pixels directly generate the image. In this instance I don't know how they are measured in terms of cd/m2, but it would stand to reason that the rating is based on a full-white screen, as this would be the brightest output, and a TV's cd/m2 rating is defined as 'peak brightness'.

So, to reiterate, it doesn't matter what's being displayed as the rating is peak brightness and not just 'what it's displaying at this time or that'.

And to anyone looking for an HDR TV, you want to look for two things: one is the peak brightness (in nits or cd/m2 - 1 nit is 1 cd/m2) and the other is compatibility with HDR10 (10-bit colour - 1 billion colours) and Dolby Vision (12-bit colour - 68 billion colours). Secondary to this is contrast ratio.

Good points, but what do you mean by 1000 cd/m2 being a minimum?

With reference to this, and as has been alluded to above, 'true' HDR is 1000 nits or above. In the same way you can buy an HD TV which only displays 720p, you can buy an 'HDR' TV based purely on its ability to achieve a 10-bit colourspace.
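
And for anyone shopping, the two display paths I quoted earlier boil down to a simple check on peak brightness and black level (numbers per the spec I quoted above; manufacturer figures are optimistic, so treat this as a rough sketch):

```
# Rough check against the two HDR display paths quoted earlier:
#   LCD path:  peak > 1000 cd/m2 with black level < 0.05 cd/m2
#   OLED path: peak > 540 cd/m2 with black level < 0.0005 cd/m2
def meets_hdr_spec(peak_nits, black_nits):
    contrast = peak_nits / black_nits
    lcd_path = peak_nits > 1000 and black_nits < 0.05
    oled_path = peak_nits > 540 and black_nits < 0.0005
    return contrast, (lcd_path or oled_path)

print(meets_hdr_spec(1100, 0.04))   # (27500.0, True)   -- LCD path
print(meets_hdr_spec(600, 0.0004))  # (1500000.0, True) -- OLED path
print(meets_hdr_spec(860, 0.13))    # (~6615, False)    -- roughly the Q6FN figures above
```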
 
I don't think brightness is as straightforward as the maximum resolution of a TV. I'm pretty sure it varies slightly for every unit. That being said, how do you measure how many nits a TV can produce, then? Which number is the real one for you when you say an HDR display should be at least 1000 nits bright?
 
Different companies measure their displays differently. We are really just going on the manufacturer's word.

For people like me, I can only go on a display's reviews once it reaches the press... and me being lazy, that means watching a YouTube review.
 
I read TV reviews on rtings.com. They do by far the most comprehensive and detailed reviews, covering the things we talk about here and much more. I would never blindly believe a manufacturer about anything. You can also compare TVs there.
 
IMHO most HDR looks awful unless done carefully by a professional editor.

All HDR does, as per the name, is extend dynamic range, so solid blacks have details revealed and blown-out whites now show some detail. Instead we get this super-bright, super-fake, super-saturated rubbish, complete with halo effects.

Good HDR would mean you barely notice it; otherwise it has been overdone.

And some games simply increase the max brightness instead.

Bad HDR (RDR2 looked better with HDR off)
[screenshot: RDR2 in HDR]

Instead of seeing more detail in the dark and bright areas on an HDR TV, it looked exactly like the picture above: higher contrast with more black crush and white-out.

Good HDR (SotC subtle HDR)
[screenshot: Shadow of the Colossus in HDR]

I can't take a proper screenshot, yet on an HDR TV the boss is in full detail, as is the horse (no black crush), while the flames are both brighter and more detailed.

HDR allows a higher dynamic range in games, yet if the game's render engine isn't made for it, it will look bad like RDR2, just increasing the max brightness without getting any of the benefits.

GT Sport is built from the ground up for HDR; assets are all 10-bit and made with a max of 10K nits in mind (at least 6x higher than the best displays at the moment). You need to set the max HDR range properly in the game settings or GTS will either look washed out or you'll lose detail in the bright areas.

In GTS the difference is most obvious in night races. The signs (100 m boards etc.) are plain white without HDR; with HDR they are very bright in your headlights, but you can still see the numbers. It's even more pronounced with dust/dirt/spray reflecting in your headlights. When following someone at Bathurst in a night race, without HDR you get blinded; with HDR it's bright, but you can still properly see detail through the thrown-up dust, while the darker areas of the screen remain clearly visible. Tokyo in the rain at night is amazing with HDR.
 
I don't think brightness is as straightforward as the maximum resolution of a TV. I'm pretty sure it varies slightly for every unit. That being said, how do you measure how many nits a TV can produce, then? Which number is the real one for you when you say an HDR display should be at least 1000 nits bright?

The panel is measured by the manufacturer in a test - most likely before the TFT and colour panels are added.

It isn't about my opinion; that is the current standard for what constitutes HDR.
 
You didn't answer my question :) I mean, I get it... 1000 nits. But which number is the one?
 
What do you mean "what number is the one"?

1000 nits is defined as true HDR brightness for a panel, anything above is better. 1000 nits = 1000 cd/m2
 
Got a new TV yesterday: a Sony BRAVIA XG83. It's a mid-range 4K TV with an LED LCD IPS panel. And it looks like I'll be sticking to SDR mode.
HDR does look better... but only in bright scenes, or if the TV is in a bright room. In a dark room, the dark scenes glow.

These photos were taken using manual camera settings - the same settings for the SDR and HDR modes, to avoid automatic exposure control.

HDR off.
The backlight is not noticeable. You can use the light sensor function on the TV.
[photo of the TV: HDR off]


HDR on. The backlight is bright. You can't use the light sensor on the TV.
[photo of the TV: HDR on]


And here are the screenshots directly from the game. I have an ordinary PS4.

HDR off.
[GT Sport screenshot: HDR off]


HDR on. The Viper's red looks better without losing detail. The sunset also looks better in HDR. But that backlight glow during dark scenes ruins everything for me. -_-
[GT Sport screenshot: HDR on]


EDIT: Made a sunset comparison.

HDR off.
[GT Sport screenshot: sunset, HDR off]


HDR on, minimum. Looks almost like SDR.
[GT Sport screenshot: sunset, HDR minimum]


HDR on, maximum. Adds more brightness to the sun, and more color and detail to the clouds.
[GT Sport screenshot: sunset, HDR maximum]


But again, it also adds more backlight glow to the dark areas.
 
HDR 5 is about the max you should set it to. HDR 10 is for 10K-nit TVs, which don't exist yet.
Set it too low and you get washed-out scenes plus 'glow' in dark areas.
The glow in dark scenes can also be prevented by calibrating your TV correctly - brightness and contrast.
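
If the in-game scale really is linear (that's my assumption, inferred from the "10 = 10K nits" point above, not an official figure), then picking the setting is just your set's peak brightness over 1,000:

```
# Assumption: GT Sport's HDR slider maps linearly, 1 step = 1,000 nits,
# inferred from "HDR 10 is for 10K-nit TVs" above -- not an official figure.
def hdr_slider(peak_nits, nits_per_step=1000, max_step=10):
    return min(max_step, max(1, round(peak_nits / nits_per_step)))

print(hdr_slider(860))    # 1  -- e.g. the Q6FN peak mentioned earlier
print(hdr_slider(5000))   # 5  -- the "about the max" ceiling above
print(hdr_slider(10000))  # 10 -- a 10K-nit set that doesn't exist yet
```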
 
