Gran Turismo 7 PS5 Pro Update: 8K Graphics, Real-Time Ray-Tracing On Track

  • Thread starter Famine
  • 466 comments
  • 49,048 views
Which is why I prefer to watch on Blu-ray, and why the PS5 needs a disc drive :s
Streaming is simply the ugly last choice.
I find Blu-ray too compressed. The original standard, anyway; I haven't seen the 4K ones. I got really distracted by the colour banding.


xDriver69x

Would you like to wager? Or shall I just help educate? There are RAW 8K video cameras, and there have been for many years. What you think and what is may not be the same thing.
You just linked a compressed stream lol.
 
The previous poster indicated 8K60 was impossible over HDMI unless compressed.
You can calculate the required (uncompressed) bandwidth yourself :D

Required bandwidth = pixels (X axis) * pixels (Y axis) * bits per colour channel (8 for SDR, 10 for HDR) * 3 channels (red, green, blue) * frames per second (Hz)

Example with 4K, HDR and 120 Hz:
3840 (pixels on X axis) * 2160 (on Y axis) * 10 (HDR) * 3 * 120 = 29,859,840,000 bit/s (~29.9 GBit/s)

Example with 8K, SDR and 60 Hz:
7680 * 4320 * 8 * 3 * 60 = 47,775,744,000 bit/s (~47.8 GBit/s)

HDMI 2.1 (including 2.1b) offers a maximum of 48 GBit/s total bandwidth, but only about 42.7 GBit/s is usable for data; the rest is line-coding overhead (FRL uses 16b/18b encoding).
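The formula above drops straight into a few lines of code (a sketch for illustration only; the function name is made up for this example):

```python
# Uncompressed video bandwidth, per the formula quoted in this post:
# width * height * bits per colour channel * 3 channels * refresh rate.
def uncompressed_bps(width, height, bits_per_channel, fps, channels=3):
    """Raw bandwidth in bits per second (ignores blanking and link-encoding overhead)."""
    return width * height * bits_per_channel * channels * fps

# 4K, HDR (10-bit), 120 Hz
print(uncompressed_bps(3840, 2160, 10, 120) / 1e9)  # ~29.86 GBit/s

# 8K, SDR (8-bit), 60 Hz
print(uncompressed_bps(7680, 4320, 8, 60) / 1e9)    # ~47.78 GBit/s
```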
 
Would you like to wager? Or shall I just help educate? There are RAW 8K video cameras for consumers, and there have been for many years. Pro-grade 17K gear is even better. What you think and what is may not be the same thing.
Yes, it's possible, but it will never be a thing for any commercial service or medium. DVDs weren't uncompressed, Blu-rays weren't uncompressed, and no streaming service, be it YouTube, Netflix, Amazon Video, Disney, Hulu, Pornhub or whatever, offers any uncompressed video AFAIK. So yes, streaming uncompressed 8K may be possible (even now? Not sure about the bitrate/required bandwidth), but as I said, it won't be a thing for any commercial service. So maybe we're just talking about different use cases.

-edit- Found a bandwidth calculator, but didn't check if it's correct. It says that the required bandwidth for uncompressed 8K at 60 FPS is ~47746 Mbps

-edit2- @i386 made the calculation himself, so it looks like it's correct.
 


i386

You can calculate the required (uncompressed) bandwidth yourself :D

Required bandwidth = pixels (X axis) * pixels (Y axis) * bits per colour channel (8 for SDR, 10 for HDR) * 3 channels (red, green, blue) * frames per second (Hz)

Example with 4K, HDR and 120 Hz:
3840 (pixels on X axis) * 2160 (on Y axis) * 10 (HDR) * 3 * 120 = 29,859,840,000 bit/s (~29.9 GBit/s)

Example with 8K, SDR and 60 Hz:
7680 * 4320 * 8 * 3 * 60 = 47,775,744,000 bit/s (~47.8 GBit/s)

HDMI 2.1 (including 2.1b) offers a maximum of 48 GBit/s total bandwidth, but only about 42.7 GBit/s is usable for data; the rest is line-coding overhead (FRL uses 16b/18b encoding).
Despite those calculations and data capacities, compression is not used to achieve 8K60 support.
It's in the HDMI link I quoted earlier.



Here is more evidence:
FRL (Fixed Rate Link) allows for these transfer rates:
  • 8K60, 8-bit: 32 Gbps
  • 8K60, 10-bit: 40 Gbps
  • 8K60, 12-bit: 48 Gbps
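Running the raw-bandwidth arithmetic from earlier in the thread against those three figures is instructive (a hedged sketch, not anything taken from the HDMI spec): full RGB at 8-bit already needs ~47.8 GBit/s, while 32/40/48 GBit/s line up, to rounding, with 4:2:2 chroma subsampling, which averages 2 samples per pixel instead of 3.

```python
# Compare the quoted 8K60 figures against raw-bandwidth arithmetic.
# Full RGB carries 3 samples per pixel; 4:2:2 averages 2 samples per pixel.
PIXELS_PER_SECOND = 7680 * 4320 * 60

for bits in (8, 10, 12):
    rgb = PIXELS_PER_SECOND * 3 * bits / 1e9   # 4:4:4 / full RGB
    sub = PIXELS_PER_SECOND * 2 * bits / 1e9   # 4:2:2 subsampled
    print(f"{bits}-bit: RGB {rgb:.1f} GBit/s, 4:2:2 {sub:.1f} GBit/s")
```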
 
Wow, there's a lot of talking past each other or ego-stroking going on here.

Compression is only relevant when you're looking to optimise transmission through a bandwidth-limited pipe, or when it's coupled with encryption for DRM. Folks didn't, and still don't, fight against DRM, so you've got it everywhere, and that means encryption, which includes compression.

Compression is not relevant, though, unless you are an engineer trying to resolve throughput concerns.

Quality is what is relevant, and guess what: there are lossless compression algorithms (HuffYUV, MagicYUV, FFV1; even VP9 has a lossless mode), so there is absolutely ZERO difference between a batch of frames that took a full 48 *Gbps* of bandwidth and the same losslessly compressed batch of frames that used 15-45 *Mbps*.

Now there is also latency: any work done on a signal introduces latency, for the purposes of any reasonable discussion, and encryption/compression adds latency. RAW signals (i.e. professional recording, or your own home studio) use the full maximum signal bandwidth and take up an enormous amount of space: terabytes versus gigabytes for a movie, and again, ZERO quality difference. You can store and play back RAW video footage with VLC. Get your Gluster ready.

Now, games care about latency; movies do not. Games use trickery to try to offset this, like disabling HDCP. And because the frame/signal generation is dynamic, things like VRR are utilised to reduce quality issues.

I suspect this might still not be clear as day, and I'm sure someone will go off on a tangent despite my trying to clear the muddy waters, but forums... forums never change.
 
There is nothing to speculate over. Sony advises using the 48 Gbps HDMI cable. The cable has no choice but to support 8K60 uncompressed; that's the definition of what a standard is.

The previous poster indicated 8K60 was impossible over HDMI unless compressed.
The cable can only transmit what it's being fed, and the supposedly "8K" PS5 famously isn't even capable of outputting enough data to saturate the available bandwidth of the HDMI 2.1 standard.


Like, what are we doing here?
 
My main gripe with the PS5 is that its HDMI bandwidth is limited to 32 Gbps, so in 120 Hz mode HDR is limited.

The Pro will most likely have an updated HDMI output, otherwise what is the point of 8K60 if you're not getting proper bandwidth?

Ideally they would have just updated the bandwidth on the regular PS5, but that sadly didn't happen :/
 
Did you see the column called "sampling"? That's chroma subsampling, a lossy compression for colours (https://en.wikipedia.org/wiki/Chroma_subsampling)
HDMI 2.1b "8K60A - supports uncompressed mode"
With compression it can go beyond uncompressed 8K60.
The compression is visually lossless: "The specification incorporates VESA DSC 1.2a link compression, which is a visually lossless compression scheme. VESA DSC 1.2a also can be used to obtain higher resolutions than 8K60/4:2:0/10-bit color, such as 8K60 RGB, 8K120 and even 10K120"
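For scale, DSC is configured around a fixed bits-per-pixel budget rather than a compression ratio; the 12 bpp target below is an illustrative assumption, not a figure from the HDMI or VESA specs.

```python
# Hypothetical DSC budget: compressed link rate is pixels * target bpp * fps,
# regardless of the source's native bit depth. The 12 bpp target is assumed.
def dsc_gbps(width, height, fps, target_bpp):
    return width * height * target_bpp * fps / 1e9

# 8K60 at an assumed 12 bpp DSC target, vs uncompressed 10-bit RGB (30 bpp)
print(dsc_gbps(7680, 4320, 60, 12))  # ~23.9 GBit/s (fits HDMI 2.1 easily)
print(dsc_gbps(7680, 4320, 60, 30))  # ~59.7 GBit/s (does not fit uncompressed)
```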
 
VESA DSC 1.2a also can be used to obtain higher resolutions than 8K60/4:2:0/10-bit color, such as 8K60 RGB, 8K120 and even 10K120"
That's the explanation of how to achieve 8K at 60 Hz: 4:2:0 chroma subsampling, aka lossy compression :D
VESA DSC 1.2a also can be used to obtain higher resolutions than 8K60/4:2:0/10-bit color, such as 8K60 RGB, 8K120 and even 10K120"
This is how to get full colours (RGB or 4:4:4) at 8K and 60 Hz, or 8K and 120 Hz (with 4:2:0 compression)
 
That's the explanation of how to achieve 8K at 60 Hz: 4:2:0 chroma subsampling, aka lossy compression :D

This is how to get full colours (RGB or 4:4:4) at 8K and 60 Hz, or 8K and 120 Hz (with 4:2:0 compression)
I am not sure why you are arguing against the people that organise the HDMI standard?
It clearly says it's uncompressed.
"The HDMI 2.1b Specification includes a new cable - the Ultra High Speed HDMI® Cable. It’s the only cable that complies with stringent specifications designed to ensure support for all HDMI 2.1b features including uncompressed 8K@60 and 4K@120."
 
He's not arguing with the HDMI Forum. You just don't actually understand the information you're trying to hit people with in this thread. HDMI 2.1 does not support 8K60 HDR at 4:4:4 and 10-bit. The end. It doesn't get any simpler than the math that i386 already posted above, but if even that's too much, there are calculators that will tell you outright what each display standard supports through its actual data throughput. To get that signal you'll still need to use DSC compression (if your hardware actually supports it) to get a virtually identical video signal, or you can step down to 4:2:0 chroma subsampling (which the HDMI Forum probably doesn't consider "compression", because hiding behind chroma subsampling asterisks has been a crutch for a decade now).


And regardless of what cable you use, if both ends of the cable are not connected to something that supports the full HDMI 2.1 standard, even if they are labelled HDMI 2.1, then you will not get full HDMI 2.1 speeds. We know this isn't guaranteed because it's already known that the current PS5 does not support full-speed HDMI 2.1, even though Sony plastered "8K" and "HDMI 2.1" everywhere for it, and in fact it even has limitations with its video output that do not affect the Series X. We know this because of the problems with early televisions following 2.1's initial release that had HDMI 2.1 labelling, HDMI 2.0 speeds, and some combination of HDMI 2.1 features (eARC, usually). We know this because of the horror stories of computer monitors with "HDMI 2.1" input that only allowed high resolution/refresh-rate combos with cut-down chroma subsampling (which disproportionately affects image quality in PC usage).
 
He's not arguing with the HDMI Forum. You just don't actually understand the information you're trying to hit people with in this thread. HDMI 2.1 does not support 8K60 HDR at 4:4:4 and 10-bit. The end. It doesn't get any simpler than the math that i386 already posted above, but if even that's too much, there are calculators that will tell you outright what each display standard supports through its actual data throughput. To get that signal you'll still need to use DSC compression (if your hardware actually supports it) to get a virtually identical video signal, or you can step down to 4:2:0 chroma subsampling (which the HDMI Forum probably doesn't consider "compression", because using chroma subsampling as a crutch to claim resolution/refresh-rate specifications has been a game played for a decade now).
Why did you add "HDR at 4:4:4 and 10-bit" to 8K60? The debate has been over whether 8K60 over HDMI is impossible. Then compression came into it. And the HDMI 2.1b standard supports 8K60 uncompressed. I am not an expert and don't claim to be; I am just reading what's in the official HDMI specifications. "Uncompressed" doesn't sound like a vague expression to deceive people. Nor does "8K60".
Have I been deceived by the HDMI Forum?
 