Quick question: should I put my TV in game mode or HDR mode? Thanks.
I still run an old Panasonic plasma because, to match its image quality, I'd need to spend thousands on an OLED screen. LED TVs in general, at consumer prices, have a very fake-looking picture that I cannot warm to at all. And HDR on LED screens looks terrible most of the time unless you have absolutely fine control over the effect.
In regard to pixel response time, a plasma TV is lightning fast compared to LED LCD or even OLED.

The current LEDs are nearly on par with OLED and superior to old plasma.
I set my new 50-inch 4K TV to HDR on and calibrated the PS4's HDR options as it suggested. In doing so, any game with any form of motion left a ghost image of whatever was moving at all times. This was prevalent in GT Sport and Project CARS 2 especially.
So I've no idea if I got something wrong there or what that was. The TV was in full HDR and game mode, and the end result was that GT Sport looked dull and washed out, and both GTS and PCARS 2 had the blurry ghosting of items in motion.
So to me, it definitely looks better "off", and IMO it's an overhyped option for gaming.
Check out if something like a "pixel motion" option is enabled in your TV's image/screen settings. If so, turn that off immediately.

I'll look, thanks for the tip.
Hello. What are your picture settings? Brightness, contrast, etc.?

Here is my comparison on GT Sport using pictures taken on a phone: one with HDR on on the phone and HDR on on the TV, and another with both off.
TV used: Sony Bravia X800E 43". Camera: Sony Xperia XZ Premium in 19 MP mode.
[Two attached comparison photos: HDR on vs. HDR off]
Hope this clears stuff up
Good points, but what do you mean by 1000 cd/m2 being a minimum? I'm sure you know that depends on the current image being displayed and also for how long. I mean, my Samsung Q6FN tops out at 860 nits on a 10% window, not sustained, and still looks absolutely brilliant in HDR, especially GTS. Contrast is "only" 6700:1, which is great for an LCD.

CAN THIS BE POSTED A LOT IN THIS GROUP PLEASE:
HDR Photography is COMPLETELY DIFFERENT to HDR video.
HDR photography takes multiple photos at different exposure levels to mitigate over- or under-exposure and the saturation problems that come with it. This was originally created to work around the fact that a camera sensor has its exposure adjusted across the whole frame, making it behave a little more like a real eye. For example, if you look out of a bright window your eyes can still see the inside of the window frame; a camera cannot. People also use this to edit a photograph to silly levels for artistic effect.
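For anyone curious what the photography version actually does under the hood, here is a minimal Python sketch of merging bracketed exposures. The weighting rule (prefer pixels near mid-grey) and the synthetic under-, normal- and over-exposed frames are purely my own illustration, not any camera maker's algorithm.

```python
# Minimal sketch of exposure bracketing/merging (HDR photography).
# The "prefer mid-grey pixels" weighting is an illustrative choice only.
import numpy as np

def merge_exposures(exposures):
    """Blend bracketed exposures, favouring well-exposed (mid-grey) pixels."""
    stack = np.stack(exposures).astype(np.float64)             # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))   # Gaussian around mid-grey
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12      # normalise per pixel
    return (weights * stack).sum(axis=0)

# Illustrative use with synthetic under-, normal- and over-exposed frames.
scene = np.linspace(0.0, 1.0, 256).reshape(16, 16)
under, normal, over = np.clip(scene * 0.5, 0, 1), scene, np.clip(scene * 2.0, 0, 1)
merged = merge_exposures([under, normal, over])
print(merged.min(), merged.max())
```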
HDR videography is a colour-space and brightness format which enhances the contrast ratio and widens the colour gamut. An HDR camera ups the number of available colours to roughly 1 billion distinct colours (10-bit, in the case of HDR10) or 68 billion (12-bit, in the case of Dolby Vision). An HDR display must have either a peak brightness of over 1,000 cd/m2 and a black level below 0.05 cd/m2 (a contrast ratio of at least 20,000:1) or a peak brightness of over 540 cd/m2 and a black level below 0.0005 cd/m2 (a contrast ratio of at least 1,080,000:1).
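Putting rough numbers on that paragraph, here is a quick Python check of the colour counts and the two contrast-ratio thresholds it mentions. The helper names are mine; the bit depths, nit figures and black levels are the ones quoted above.

```python
# Quick arithmetic behind the figures in the post above.

def colours(bits_per_channel: int) -> int:
    """Distinct colours for an RGB signal at the given bit depth per channel."""
    return (2 ** bits_per_channel) ** 3

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

print(f"10-bit (HDR10):        {colours(10):,} colours")            # ~1.07 billion
print(f"12-bit (Dolby Vision): {colours(12):,} colours")            # ~68.7 billion
print(f"1000 / 0.05 nits   -> {contrast_ratio(1000, 0.05):,.0f}:1")     # 20,000:1
print(f"540 / 0.0005 nits  -> {contrast_ratio(540, 0.0005):,.0f}:1")    # 1,080,000:1
```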
This is, I'll admit, confused by smartphone manufacturers enabling 'HDR video' in their camera apps. These are not (unless you own a certain LG phone) HDR video and are merely making use of HDR photography techniques whilst capturing a video.
You also can't post pictures on here demonstrating the difference between HDR and non-HDR, as phones do not kick in HDR (even if they are capable of it) on web pages or still images, only in moving video encoded with at least a 10-bit colour space, so it is POINTLESS.
Sorry, but as someone with a passion for photography and videography it annoys me that the two terms are used to describe completely different things.
As far as I know there are THREE HDR standards for monitors:
HDR1000 = 1,000 cd/m2
HDR600 = 600 cd/m2
HDR400 = 400 cd/m2
You want a 1,000 cd/m2 unit, but these are the most expensive and the rarest.
e.g. the Philips 43M6VBPAB 43" 4K UHD Adaptive Sync 4 ms HDR1000 VA QLED monitor.
600 is acceptable.
400 is barely HDR, given 400 cd/m2 isn't a huge step up over the typical 300-350 cd/m2 of a standard screen.
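Purely as an illustration of those tiers, a tiny sketch like this maps a claimed peak brightness onto the HDR1000/600/400 labels above, using only the cd/m2 thresholds from this post; real certification also checks black level, colour gamut and more.

```python
# Rough tier lookup based on the cd/m2 figures listed above (illustration only).

def hdr_tier(peak_cd_m2: float) -> str:
    if peak_cd_m2 >= 1000:
        return "HDR1000"
    if peak_cd_m2 >= 600:
        return "HDR600"
    if peak_cd_m2 >= 400:
        return "HDR400"
    return "not HDR-rated (typical SDR screens sit around 300-350 cd/m2)"

for nits in (1050, 620, 400, 320):
    print(nits, "->", hdr_tier(nits))
```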
I'm still kind of looking for the holy grail of a monitor that can do it all...
eg. 27" 1,440p 144hz IPS and HDR1000
Not sure what the PS4/PS5 and GT Sport support.
IMHO most HDR looks awful unless done carefully by a professional editor.
All HDR does, as per the name, is extend dynamic range, so crushed blacks have detail revealed and blown-out whites now show some detail. Instead we get this super-bright, super-fake, super-saturated rubbish, complete with halo effects.
Good HDR would mean you barely notice it; otherwise it has been overdone.
Still... the actual cd/m2 depends greatly on the content that's being displayed. It ranges from 600 to 1100 for the 2019 Samsung Q8 (which is quite embarrassing, considering my 2018 Q6 is very close, and even has a better contrast ratio than the 2019 Q8, btw).
So which one are we talking about? 10% window peak brightness or just sustained 50% or what?
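To illustrate why that question matters, here is the sort of per-window brightness table review sites publish, sketched in Python. The numbers are made up purely to echo the 600-1100 cd/m2 spread mentioned above; they are not measurements of any real set. The point is that one TV yields several different "nits" figures depending on window size and whether the reading is a short peak or sustained.

```python
# Illustrative (made-up) brightness readings for one hypothetical TV.

peak_brightness = {        # window size -> peak cd/m2 (short burst)
    "2% window": 1100,
    "10% window": 900,
    "50% window": 700,
    "100% window": 600,
}
sustained_brightness = {   # window size -> cd/m2 held over time
    "10% window": 800,
    "100% window": 550,
}

print("Peak, 10% window:      ", peak_brightness["10% window"], "cd/m2")
print("Sustained, 100% window:", sustained_brightness["100% window"], "cd/m2")
```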
I don't think brightness is as straightforward as a TV's maximum resolution. I'm pretty sure it varies slightly for every unit. That being said, how do you measure how many nits a TV can produce, then? Which number is the real one for you, when you say an HDR display should be at least 1000 nits bright?

With reference to this, and as has been alluded to above, 'true' HDR is 1000 nits or above. In the same way you can buy an HD TV which only displays 720p, you can buy an HDR TV based purely on its ability to achieve a 10-bit colour space.
I read TV reviews on rtings.com. They do by far the most comprehensive and detailed reviews, including the things we talk about here and much more. I would never blindly believe a manufacturer about anything. You can also compare TVs there.

Different companies measure their displays differently, so we are really just going on the manufacturer's word.
For people like me, I can only go on a display's reviews once it reaches the press... and, me being lazy, that means watching a YouTube review.
You didn't answer my question. I mean, I get... 1000 nits. But which number is the one?

The panel is measured by the manufacturer in a test, most likely before the TFT and colour layers are added.
It isn't about my opinion; that is the current standard info on what constitutes HDR.