Gran Turismo 7 PS5 Pro Update: 8K Graphics, Real-Time Ray-Tracing On Track

Which is why I prefer to watch on Blu-ray, and why the PS5 needs a disc drive :s
Streaming is simply the ugly last choice.
I find Blu-ray too compressed, the original standard anyway; I haven't seen the 4K ones. I got really distracted by the colour banding.


xDriver69x

Would you like to wager? Or shall I just help educate? There are RAW 8K video cameras, and there have been for many years. What you think and what is might not be the same thing.
You just linked a compressed stream lol.
 
The previous poster indicated 8K60 was impossible unless compressed on HDMI.
You can calculate the required (uncompressed) bandwidth yourself :D

Required bandwidth = pixels (X-axis) × pixels (Y-axis) × bits per colour channel (8 for SDR, 10 for HDR) × 3 channels (red, green, blue) × frames per second (Hz)

Example with 4K, HDR and 120 Hz:
3840 (pixels on X axis) × 2160 (on Y axis) × 10 (HDR) × 3 × 120 = 29,859,840,000 bits per second (~29.9 GBit/s)

Example with 8K, SDR and 60 Hz:
7680 × 4320 × 8 × 3 × 60 = 47,775,744,000 bits per second (~47.8 GBit/s)

HDMI 2.1 (including 2.1b) offers a maximum of 48 GBit/s total bandwidth, but only about 42 GBit/s is usable for data; the rest is line-coding overhead (the 16b/18b encoding that keeps the link DC-balanced).
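
For anyone who wants to rerun those numbers, here's a minimal Python sketch of the same formula (pixel data only; it deliberately ignores blanking intervals, audio and link-coding overhead):

# Uncompressed video bandwidth, as in the formula above
# (pure pixel data: no blanking, audio or link-coding overhead).
def required_bandwidth_gbps(width, height, bits_per_channel, fps, channels=3):
    return width * height * bits_per_channel * channels * fps / 1e9

print(required_bandwidth_gbps(3840, 2160, 10, 120))  # 4K HDR 120 Hz -> ~29.86 GBit/s
print(required_bandwidth_gbps(7680, 4320, 8, 60))    # 8K SDR 60 Hz  -> ~47.78 GBit/s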
 
Would you like to wager? Or shall I just help educate? There are RAW 8K video cameras for consumers, and there have been for many years. Pro-grade 17K gear is even better. What you think and what is might not be the same thing.
Yes, it's possible, but it will never be a thing for any commercial service or medium. DVDs weren't uncompressed, Blu-rays weren't uncompressed, and no streaming service, be it YouTube, Netflix, Amazon Video, Disney, Hulu, Pornhub or whatever, offers any uncompressed videos AFAIK. So yes, streaming uncompressed 8K may be possible (even now? Not sure about the bitrate/required bandwidth), but as I said, it won't be a thing for any commercial service. So maybe we're just talking about different use cases.

-edit- Found a bandwidth calculator, but didn't check if it's correct. It says that the required bandwidth for uncompressed 8K at 60 FPS is ~47,746 Mbps.

-edit2- @i386 made the calculation himself, so it looks like it's correct.
 

Attachments

  • Screenshot_20240928-213115.png

i386

You can calculate the required (uncompressed) bandwidth yourself :D

Required bandwidth = pixels (X-axis) × pixels (Y-axis) × bits per colour channel (8 for SDR, 10 for HDR) × 3 channels (red, green, blue) × frames per second (Hz)

Example with 4K, HDR and 120 Hz:
3840 (pixels on X axis) × 2160 (on Y axis) × 10 (HDR) × 3 × 120 = 29,859,840,000 bits per second (~29.9 GBit/s)

Example with 8K, SDR and 60 Hz:
7680 × 4320 × 8 × 3 × 60 = 47,775,744,000 bits per second (~47.8 GBit/s)

HDMI 2.1 (including 2.1b) offers a maximum of 48 GBit/s total bandwidth, but only about 42 GBit/s is usable for data; the rest is line-coding overhead (the 16b/18b encoding that keeps the link DC-balanced).
Despite those calculations and data capacities, compression is not used to achieve 8K60 support.
It's on the HDMI link I quoted earlier.



Here is more evidence. FRL (Fixed Rate Link) allows for these transfer rates:
8K60 8-bit: 32 Gbps
8K60 10-bit: 40 Gbps
8K60 12-bit: 48 Gbps
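
For reference, here's a rough Python sketch of how much payload each of those link rates carries once FRL's 16b/18b line coding is accounted for (an assumption consistent with the ~42 GBit/s figure above), next to the raw 8K60 pixel rates at each bit depth for full RGB and for 4:2:0 subsampling, blanking ignored:

LINK_RATES_GBPS = [32, 40, 48]   # the FRL rates quoted above

def frl_payload_gbps(link_rate):
    # FRL uses 16b/18b line coding, so 16/18 of the link rate carries data.
    return link_rate * 16 / 18

def raw_8k60_gbps(bits_per_channel, samples_per_pixel):
    # Raw 8K60 pixel rate: 3 samples per pixel for RGB, 1.5 for 4:2:0.
    return 7680 * 4320 * 60 * bits_per_channel * samples_per_pixel / 1e9

for rate in LINK_RATES_GBPS:
    print(f"FRL {rate} GBit/s link -> {frl_payload_gbps(rate):.1f} GBit/s payload")

for bits in (8, 10, 12):
    print(f"8K60 {bits}-bit: RGB {raw_8k60_gbps(bits, 3):.1f} GBit/s, "
          f"4:2:0 {raw_8k60_gbps(bits, 1.5):.1f} GBit/s")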
 
Wow, there's a lack of something here, or people just talking past each other and ego stroking.

Compression is only relevant when looking to optimize transmission through a bandwidth-limited pipe, or when coupled with encryption for DRM. Folks didn't and still don't fight against DRM, so you've got it everywhere, and that means encryption, which includes compression.

Compression is not relevant though unless you are an engineer trying to get throughput concerns resolved.

Quality is what is relevant, and guess what: there are lossless compression algorithms (Huffyuv, MagicYUV, FFV1, even VP9 has a lossless mode), so there is absolutely ZERO difference between a batch of frames that took a full 48 Gbps of bandwidth and the same losslessly compressed batch of frames that used 15-45 Mbps.
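
To illustrate the lossless point with a minimal sketch (using zlib on a made-up gradient frame rather than a real video codec; the point is only that the round-trip is bit-exact):

import zlib
import numpy as np

# Hypothetical 8-bit single-channel frame (a simple gradient, not real footage).
row = np.linspace(0, 255, 3840, dtype=np.uint8)
frame = np.broadcast_to(row, (2160, 3840)).copy()

compressed = zlib.compress(frame.tobytes())
restored = np.frombuffer(zlib.decompress(compressed), dtype=np.uint8).reshape(frame.shape)

print(np.array_equal(frame, restored))       # True: lossless means bit-for-bit identical
print(frame.nbytes, "->", len(compressed))   # far fewer bytes, zero quality loss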

Now there is also latency; any work done on a signal introduces latency, for any reasonable discussion, and encryption/compression adds latency. RAW signals (aka professional recording), or footage in your own home studio, use the full maximum signal bandwidth and take up an enormous amount of space: terabytes vs gigabytes for a movie, and again, ZERO quality difference. You can store and play back RAW video footage with VLC. Get your Gluster ready.

Now games care about latency, movies do not. Games use trickery to try to offset this, like disabling HDCP. And because the frame/signal generation is dynamic, things like VRR are utilized to reduce quality issues.

I suspect this might still not be clear as day, and I'm sure someone will go off on a tangent despite trying to help clear the muddy waters, but forums.. forums never change.
 
There is nothing to speculate over. Sony advises using the 48 Gbps HDMI cable. The cable has no choice but to support 8K60 uncompressed. It's the definition of what a standard is.

The previous poster indicated 8K60 was impossible unless compressed on HDMI.
The cable can only transmit what it's being fed, and the PS5, also marketed as "8K", famously isn't even capable of outputting enough data to saturate the available bandwidth of the HDMI 2.1 standard.


Like, what are we doing here?
 
My main gripe with the PS5 is that its HDMI bandwidth is limited to 32 Gbps, so in 120 Hz mode HDR is limited.

The Pro will most likely have an updated HDMI output, otherwise what is the point of 8K60 if you're not getting the proper bandwidth?

Ideally they would have just updated the bandwidth on the regular PS5, but that sadly didn't happen :/
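
As a rough sanity check of that gripe, here's a sketch assuming the 32 Gbps figure above, 16b/18b link coding and no blanking overhead (all assumptions, not official figures):

def pixel_rate_gbps(width, height, fps, bits_per_channel, samples_per_pixel):
    # Raw pixel-data rate, ignoring blanking and audio.
    return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

link_gbps = 32                     # the limit mentioned above
usable_gbps = link_gbps * 16 / 18  # payload left after 16b/18b FRL coding

print(f"usable: {usable_gbps:.1f} GBit/s")
print(pixel_rate_gbps(3840, 2160, 120, 10, 3.0))  # 4K120 10-bit RGB/4:4:4 ~29.9, doesn't fit
print(pixel_rate_gbps(3840, 2160, 120, 10, 2.0))  # 4K120 10-bit 4:2:2    ~19.9, fits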
 
Did you see the column called sampling? That's chroma subsampling, a lossy compression for colors (https://en.wikipedia.org/wiki/Chroma_subsampling)
HDMI 2.1b: "8K60A - supports uncompressed mode"
With compression it can go beyond 8K60 uncompressed.
The compression is visually lossless: "The specification incorporates VESA DSC 1.2a link compression, which is a visually lossless compression scheme. VESA DSC 1.2a also can be used to obtain higher resolutions than 8K60/4:2:0/10-bit color, such as 8K60 RGB, 8K120 and even 10K120"
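
For a sense of scale, here's a sketch using a hypothetical DSC target of 12 bits per pixel (the actual compressed rate is negotiated per link, so treat the exact number as an assumption):

def rate_gbps(width, height, fps, bits_per_pixel):
    # Video data rate for a given total bits per pixel, ignoring blanking.
    return width * height * fps * bits_per_pixel / 1e9

uncompressed_bpp = 10 * 3  # 10-bit RGB
dsc_target_bpp = 12        # hypothetical DSC target

print(rate_gbps(7680, 4320, 60, uncompressed_bpp))  # ~59.7 GBit/s: over the link limit
print(rate_gbps(7680, 4320, 60, dsc_target_bpp))    # ~23.9 GBit/s: comfortably under it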
 
"VESA DSC 1.2a also can be used to obtain higher resolutions than 8K60/4:2:0/10-bit color, such as 8K60 RGB, 8K120 and even 10K120"
That's the explanation of how to achieve 8K at 60 Hz: 4:2:0 chroma subsampling, aka lossy compression :D
"VESA DSC 1.2a also can be used to obtain higher resolutions than 8K60/4:2:0/10-bit color, such as 8K60 RGB, 8K120 and even 10K120"
This is how to get full colors (RGB or 4:4:4 chroma subsampling) at 8K and 60 Hz, or 8K and 120 Hz (with 4:2:0 compression).
 
That's the explanation of how to achieve 8K at 60 Hz: 4:2:0 chroma subsampling, aka lossy compression :D

This is how to get full colors (RGB or 4:4:4 chroma subsampling) at 8K and 60 Hz, or 8K and 120 Hz (with 4:2:0 compression).
I am not sure why you are arguing against the people who organise the HDMI standard?
It clearly says it's uncompressed.
"The HDMI 2.1b Specification includes a new cable - the Ultra High Speed HDMI® Cable. It’s the only cable that complies with stringent specifications designed to ensure support for all HDMI 2.1b features including uncompressed 8k@60 and 4K@120."
 
He's not arguing with the HDMI Forum. You just don't actually understand the information you're trying to use to hit people with in this thread. HDMI 2.1 does not support 8K60 HDR at 4:4:4 and 10-bit. The end. It doesn't get any simpler than the math that i386 already posted above, but if even that's too much, there are calculators that will tell you outright what each display standard supports through its actual data throughput. To get that signal you'll still need to use DSC compression (if your hardware actually supports it) to get a virtually identical video signal, or you can step down to 4:2:0 chroma subsampling (which the HDMI Forum probably doesn't consider "compression", because hiding behind chroma subsampling asterisks has been a crutch for a decade now).


And regardless of what cable you use, if both ends of the cable are not connected to something that supports the full HDMI 2.1 standard, even if they are labeled HDMI 2.1, then you will not get full HDMI 2.1 speeds. We know this is not the case because it's already known that the current PS5 does not support full-speed HDMI 2.1 even though Sony plastered "8K" and "HDMI 2.1" everywhere for it; and in fact it even has limitations with its video output that do not affect the Series X. We know this is not the case because of the problems with early televisions following 2.1's initial release that had HDMI 2.1 labeling, HDMI 2.0 speeds, and some combination of HDMI 2.1 features (eARC, usually). We know this is not the case because of the horror stories of computer monitors with "HDMI 2.1" inputs that only allowed high resolution/refresh rate combos with cut-down chroma subsampling (which disproportionately affects image quality with PC usage).
 
He's not arguing with the HDMI Forum. You just don't actually understand the information you're trying to use to hit people with in this thread. HDMI 2.1 does not support 8K60 HDR at 4:4:4 and 10-bit. The end. It doesn't get any simpler than the math that i386 already posted above, but if even that's too much, there are calculators that will tell you outright what each display standard supports through its actual data throughput. To get that signal you'll still need to use DSC compression (if your hardware actually supports it) to get a virtually identical video signal, or you can step down to 4:2:0 chroma subsampling (which the HDMI Forum probably doesn't consider "compression", because chroma subsampling being a crutch that you can use to claim resolution/refresh rate specifications has been a game played for a decade now).
Why did you add "HDR at 4:4:4 and 10-bit" to 8K60? The debate has been over 8K60 on HDMI being impossible. Then compression came into it. And the HDMI 2.1b standard supports 8K60 uncompressed. I am not an expert and don't claim to be, I am just reading what's on the official HDMI specifications. Uncompressed doesn't sound like a vague expression to deceive people. Nor does 8K60.
Have I been deceived by the HDMI foundation?
 
Why did you add "HDR at 4:4:4 and 10-bit" to 8K60?
Because the original post in this two-page thread derail, which is what i386 was responding to in the first place when you said he was wrong, specifically said "2.1 can support up to 10k at 120fps with HDR." True HDR is at least 10-bit video at 4:4:4. The HDMI 2.1-equipped PS5 can't even support that at 4K.

The debate has been over 8K60 on HDMI being impossible. Then compression came into it. And the HDMI 2.1b standard supports 8K60 uncompressed.
Because even if you want to claim that that's what the discussion was always about, compression is still needed, because HDMI 2.1 doesn't have the actual usable bandwidth (which in actuality is around 42 Gbps for video signaling) for an uncompressed 8K 60 FPS stream. Whether that compression is officially named as such, or is a hacky way of lowering bandwidth that might not affect image quality and is frequently concealed from people buying devices, doesn't make it actually uncompressed. I don't know what their testing methodology is, but the HDMI Forum either doesn't consider cutting down the chroma subsampling to be a form of image compression or is doing something otherwise asterisk-worthy like measuring NTSC refresh rates instead of actual 60 FPS. Again, this is such a known quantity that you can feed data into a calculator to tell you what the bandwidth requirements are for uncompressed video resolutions/framerates and whether current cable standards can support them. A bitmap image of a certain resolution is always the same size no matter what is on it. The same is true for uncompressed video.
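
A quick sketch of that point (assuming roughly 42 GBit/s of usable payload and ignoring blanking):

USABLE_GBPS = 42.0   # approximate HDMI 2.1 payload after line coding
SAMPLES_PER_PIXEL = {"RGB/4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def rate_gbps(bits_per_channel, fmt, width=7680, height=4320, fps=60):
    # Raw 8K60 pixel-data rate in the given chroma format.
    return width * height * fps * bits_per_channel * SAMPLES_PER_PIXEL[fmt] / 1e9

for bits in (8, 10):
    for fmt in ("RGB/4:4:4", "4:2:0"):
        rate = rate_gbps(bits, fmt)
        verdict = "fits uncompressed" if rate <= USABLE_GBPS else "needs DSC or subsampling"
        print(f"8K60 {bits}-bit {fmt}: {rate:.1f} GBit/s -> {verdict}")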


Math is math.

Uncompressed doesnt sound like a vague expression to deceive people. Nor does 8k60.
This is the same standards committee that for years allowed manufacturers to label something as HDMI 2.1 so long as it supported at least one HDMI 2.1-exclusive feature. This can include things that are perfectly possible on HDMI 2.0 equipment. Those Vizio TVs from 2020 that only supported 4K60 HDR at 4:2:2 were officially HDMI 2.1 devices, for example; or all of the sound bars that needed a firmware update to "enable" HDMI 2.1 (because all they were using it for was eARC, so any HDMI Ethernet cable dating back to the PS3 launch would work).


And that's even before we get to what happened during COVID:
Have I been deceived by the HDMI foundation?

Where they just changed it so HDMI 2.0 devices would just straight up be called HDMI 2.1; similar to the nonsense the USB Implementers Forum was doing a couple years prior.



The tightening of those standards so manufacturers can't play so fast and loose with the branding is what HDMI 2.1b is actually trying to fix (in essence, fixing a mess that they created themselves), but HDMI 2.1 (sans "b") nowadays barely needs to support anything (if anything) that HDMI 2.0 didn't.
 
Well, Sony does the work so we don't have to. Good job this is not a PC thread. Sony have a recommended/supplied cable, and we will be getting 8K60 in some way or another.
And it's all going to be tested and reviewed by people like Digital Foundry. Looking forward to that.
 
One more thing to add: most TVs only have one good socket, some two, and I'm not sure if any have all four. Point is, you can have your shiny cable and fancy PlayStation Pro, but if you plug it into HDMI 3 (or whatever), you'll see none of what you paid for. I would be completely unsurprised if a good chunk of the "I can't see a difference" crowd got caught out by this.
 
One more thing to add: most TVs only have one good socket, some two, and I'm not sure if any have all four. Point is, you can have your shiny cable and fancy PlayStation Pro, but if you plug it into HDMI 3 (or whatever), you'll see none of what you paid for. I would be completely unsurprised if a good chunk of the "I can't see a difference" crowd got caught out by this.
Yes. I have seen this happen firsthand. I was watching some streamed TV content at a friend's house and said it looked like 1080p, and they angrily said no, they're paying a premium for 4K. They showed me that the stream said the 4K version was playing, but the AV switch-over info popup on the TV said 1080p. I said try changing the HDMI socket on the TV, and suddenly we got 4K...
It must be common. But people should read the manual?
 
One more thing to add: most TVs only have one good socket, some two, and I'm not sure if any have all four. Point is, you can have your shiny cable and fancy PlayStation Pro, but if you plug it into HDMI 3 (or whatever), you'll see none of what you paid for. I would be completely unsurprised if a good chunk of the "I can't see a difference" crowd got caught out by this.
Plenty of TVs have 4 HDMI 2.1 ports, including all LG OLED TVs from the past few years.
 
This is the discussion thread for an article on GTPlanet:

First Impressions of Gran Turismo 7 Running on PS5 Pro at Tokyo Game Show

We've now got a first look at the PlayStation 5 Pro Enhanced version of Gran Turismo 7 in action, courtesy of a couple of bold attendees at the Tokyo Game Show in Chiba...
Watching the video, I noticed that the braking zone marked on the road was also reflecting off the rear license plate. Weird, but kind of cool. I guess the ray tracing assumes the braking zone is part of the road and it acts like a light source.
 
Plenty of TVs have 4 HDMI 2.1 ports, including all LG OLED TVs from the past few years.

I'm just saying, double-check when you're buying and plugging new stuff in. The TV people have been unnecessarily annoying with the sockets for years.


Also, LG has a great picture, a more true image than Sony or Samsung, and yeah... I didn't check their cheaper models, but my G3 has good sockets. Just be wary.
 
I'm just saying, double-check when you're buying and plugging new stuff in. The TV people have been unnecessarily annoying with the sockets for years.


Also, LG has a great picture, a more true image than Sony or Samsung, and yeah... I didn't check their cheaper models, but my G3 has good sockets. Just be wary.
All OLED screens are LG displays, and only Samsung makes QLED, which is the best I've seen on a TV you can buy off the shelf.
 