Some things I found while browsing for "PS3 1080p" stuff. Keep in mind, the first quote is from an MS employee, and the second is just quoted commentary on that statement. But they don't bag on the PS3, and actually bring up a lot of good points. I think there was a typo, which I highlighted in red. Read on...
http://ozymandias.com/archive/2006/10/21/Clarifying-Thoughts-on-High-Definition-Game-Rendering.aspx
Ozymandias: Clarifying Thoughts on High Definition Game Rendering
I was talking to Bruce Dawson, one of our senior software design engineers here, about some questions I had around 1080i and 1080p. Frankly, I was particularly curious about why Sony has continued harping on 1080p as being "TrueHD", especially since the 360 has enabled 1080p output as well (coming soon to homes near you!). I was trying to figure out if I was just missing something, and his emailed answer was particularly clear and helpful to me, and since there's nothing confidential here I thought I'd share it with you.
The really interesting statistic that popped for me is how much less time a game console has to render a 1920x1080 scene versus a 1280x720 scene. (Remember this is on the same console, whichever one you like. This is not a comparison of different consoles' rendering capabilities to each other.) Simply put, for a 1080i/p game the console has 55% less time per pixel to render any special effects, anti-aliasing, illumination, etc. than for a 720p game. Yes, even Resistance has fallen off the bandwagon and admitted they can't hit 1080i/p as previously claimed. (It also helps explain why Gran Turismo HD is so underwhelming.)
Anyway, Bruce's text is below. Hope it helps clarify a few things for you!
Many developers, gamers, and journalists are confused by 1080p. They think that 1080p is somehow more challenging for game developers than 1080i, and they forget that 1080 (i or p) requires significant tradeoffs compared to 720p. Some facts to remember:
* 2.25x: that's how many more pixels there are in 1920x1080 compared to 1280x720
* 55.5%: that's how much less time you have to spend on each pixel when rendering 1920x1080 compared to 1280x720 -- the point being that at higher resolutions you have more pixels, but they necessarily can't look as good
* 1.0x: that's how much harder it is for a game engine to render a game in 1080p as compared to 1080i -- the number of pixels is identical so the cost is identical
* There is no such thing as a 1080p frame buffer. (I think they mean 1080i.) The frame buffer is 1080 pixels tall (and presumably 1920 wide) regardless of whether it is ultimately sent to the TV as an interlaced or as a progressive signal.
* 1280x720 with 4x AA will generally look better than 1920x1080 with no anti-aliasing (there are more total samples).
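The arithmetic behind those bullet points is easy to verify yourself. A quick Python sketch of the same numbers (pure resolution math, nothing specific to either console):

```python
# Sanity check of the quoted statistics: pure resolution arithmetic.
pixels_1080 = 1920 * 1080   # 2,073,600 pixels
pixels_720 = 1280 * 720     #   921,600 pixels

pixel_ratio = pixels_1080 / pixels_720      # 2.25x more pixels at 1080
time_saved = 1 - pixels_720 / pixels_1080   # 0.5555... -> the "55.5%" figure

# 720p with 4x anti-aliasing computes more total samples than 1080p with none
samples_720_4xaa = pixels_720 * 4   # 3,686,400 samples
samples_1080 = pixels_1080          # 2,073,600 samples

print(pixel_ratio)                      # 2.25
print(samples_720_4xaa > samples_1080)  # True
```

That last comparison is why 720p with 4x AA can look better than raw 1080: the GPU is actually working on roughly 1.8x more samples per frame.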
A few elaborations:
Any game could be made to run at 1920x1080. However, it is a tradeoff. It means that you can show more detail (although you need larger textures and models to really get this benefit) but it means that you have much less time to run complex pixel shaders. Most games can't justify running at higher than 1280x720 -- it would actually make them look worse because of the compromises they would have to make in other areas.
1080p is a higher bandwidth connection from the frame buffer to the TV than 1080i. However the frame buffer itself is identical. 1080p will look better than 1080i (interlaced flicker is not a good thing), but it makes precisely zero difference to the game developer. Just as most Xbox 1 games let users choose 480i or 480p, because it was no extra work, 1080p versus 1080i is no extra work. It's just different settings on the display chip.
Inevitably somebody will ask about field rendering. Since interlaced formats display the even lines on one refresh pass and then the odd lines on the next refresh pass, can't games just render half of the lines each time? Probably not, and even if you could you wouldn't want to. You probably can't do field rendering because it requires that you maintain a rock solid 60 fps. If you ever miss a frame it will look horrible, as the odd lines are displayed in place of the even, or vice-versa. This is a significant challenge when rendering extremely complex worlds with over 1 million pixels per field (2 million pixels per frame) and is probably not worth it. And, even if you can, you shouldn't. The biggest problem with interlaced is flicker, and field rendering makes it worse, because it disables the flicker-fixer hardware that intelligently blends adjacent lines. Field rendering has been done in the past, but it was always a compromise solution.
http://www.gizmodo.com/gadgets/home-entertainment/1080p-gaming-not-what-it-seems-209311.php
Gizmodo: 1080p Gaming Not What it Seems?
There is no doubt that 1080p is the holy grail of high definition, which is exactly why Sony has pursued that benchmark with such enthusiasm. But exactly how hard is 1080p to render for video game consoles? Here are some telling stats straight from senior software design engineer Bruce Dawson:
* 2.25x: that's how many more pixels there are in 1920x1080 compared to 1280x720
* 55.5%: that's how much less time you have to spend on each pixel when rendering 1920x1080 compared to 1280x720--the point being that at higher resolutions you have more pixels, but they necessarily can't look as good
* 1.0x: that's how much harder it is for a game engine to render a game in 1080p as compared to 1080i--the number of pixels is identical so the cost is identical
* 1280x720 with 4x AA will generally look better than 1920x1080 with no anti-aliasing (there are more total samples).
Resolution is no longer the sole indicator of image quality because of technological advancements that improve graphics at the pixel level, like shaders. If the PS3 is a stronger computing platform that can run 1080p well, I'd like to see them scale the resolution down, max the shaders and give me a mind-blowing image I'm actually capable of displaying on my 720p TV. - Mark Wilson
They both bring up some good points, and really open your eyes to the reality of 1080p with regards to gaming. It is very much a pinnacle of current media, and nothing to scoff at. But, like Mark said in the very last paragraph I quoted, how about just giving us 720p with all the features turned up?
We all love 1080p, it is bad ass. But 720p is very nice as well, and if we could just get games at that res with all the bells and whistles maxed out, it might, and probably would, actually look better. But who knows. If all the hype is true, and the PS3 can do 1080p WITH the goodies turned up, I'm all for it.
CT