If the entire point of this diatribe is that you objected to how I compared GT4's 1080i rendering to a TV scaler to explain why it appeared to achieve better results, then I have to say it was a waste of time on your part.
The entire point of this discussion is that GT4 at 1080i shows more detail than at 480, and your insistence on denying that.
It's hard to see how your posts explain how I'm wrong when I already said, twice, that the game was doubling the image height.
You said a lot of things, and much of it does not make any sense. I'm still not sure that you know what you are talking about.
It's a field-rendered "480p" image,
No. A 480p image is composed of a single field, while at 1080i each image is composed of two different fields. Every resulting 1080 frame on the screen is a combination of two of those "480" images, as seen in the increase of detail in all the 1080i comparisons.
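A minimal sketch of that combination, in Python (the 640x448 field size is taken from the numbers later in this post, and the simple line-by-line weave is only an illustration of the principle, not the exact hardware path):

```python
# Toy illustration: a progressive frame is a single image, while each
# interlaced frame shown on screen is woven from two different fields.

FIELD_W, FIELD_H = 640, 448  # per-field render size quoted below

def render_field(t):
    """Stand-in for the game rendering one field at time step t."""
    return [[(t, y)] * FIELD_W for y in range(FIELD_H)]

def weave(field_a, field_b):
    """Interleave two fields line by line into one displayed frame."""
    frame = []
    for line_a, line_b in zip(field_a, field_b):
        frame.append(line_a)  # lines 0, 2, 4, ... come from one field
        frame.append(line_b)  # lines 1, 3, 5, ... come from the next field
    return frame

frame = weave(render_field(0), render_field(1))
print(len(frame), len(frame[0]))  # 896 640 -> two fields' worth of lines
```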
It's pretty much the same resolution with some clever image rendering used to double the lines shown on screen and trick the television into thinking it's 1080i
No, in practice it is not the same. There is no external video trick involved in an already compliant full-frame 1080i signal.
It's instead the PS2 scaling from the same 480 source; as opposed to the TV's more likely than not awful internal scaler doing so.
Again the incorrect 480 source, and the "scaling" thing used to explain a common interlacing process.
There is no more detail than the 480p image, and if your TV has the hardware to scale it properly or is an older CRT HDTV it would look no different.
Absurd... and already explained and discussed.
the game was field rendering the image (deinterlacing it, then stacking the two images)
The game is not deinterlacing the image; that is the TV's job. Field rendering is a different thing.
have trouble with GT4's image signal (the field rendered extremely narrow interlaced 1080i signal)
What?... As explained previously, there is no exotic resolution output from the PS2; the signal is normalized internally by the console to a full 1080i frame before it reaches the TV.
And you know as well as I do that it is impossible to get a like for like comparison to do so by throwing around YT videos. Youtube doesn't even render videos the same way on their end between the two modes, and that's before you get into things like the equipment used to capture the footage.
When the increase in quality is evident in the video, that sounds more like an excuse not to accept it. If someone wants to try, the video can be played directly in HD on most modern TVs with internet access.
This is another good 1080 example, full screen and at 1080p:
That doesn't answer my question. And, again, it's a photo of a television screen. Just because it's cropped this time rather than the full image you posted before doesn't change that.
More excuses... When the difference in rendered detail is so clear between similar pictures, a better source would not change anything. The definition is enough.
You already explained why:
I mean a practical proof, like mine, not a numbers proof that you do not seem to understand.
GT4 480p (no AA)
640 x 480 = 307200 pixels (1 field at 60Hz, 1 frame)
GT4 1080i (no AA, lower color depth)
640 x 448 = 286720 pixels (1 field at 60Hz, *1/2 frame)
*1 frame = 640 x 896 => 573440 pixels (those are the effective rendered pixels seen in pause, and the ones used by the TV deinterlacer to render every interpolated final frame)
The normalizing of the aspect ratio and the adaptation to the 1080i frame resolution from the numbers above are done later at the PS2 video output, also by the console, and those extra pixels are not counted as native rendered pixels.
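For reference, the same figures as plain arithmetic:

```python
# Quick check of the pixel counts above.
p480       = 640 * 480        # one progressive 480p frame
field_1080 = 640 * 448        # one field as rendered for 1080i output
frame_1080 = 640 * 448 * 2    # two fields combined = effective frame

print(p480)        # 307200
print(field_1080)  # 286720
print(frame_1080)  # 573440
```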
Deinterlacing an image to double the vertical lines isn't going to give it any more detail.
Deinterlacing does not mean copying lines to fill the gaps, or scaling... there are many advanced methods of interpolation and frame blending that have been in common use for many years. I don't want to go on at length, just read:
http://en.wikipedia.org/wiki/Deinterlaced#Deinterlacing_methods
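As a toy sketch of the motion-adaptive idea described there (my simplification, not any particular TV's algorithm): where the picture is static the deinterlacer keeps the real lines from the other field, and only falls back to interpolation where there is motion.

```python
def deinterlace(curr_field, prev_field, threshold=16):
    """Fill the missing lines of curr_field: use prev_field where the
    picture is static, interpolate within curr_field where it moved."""
    height = len(curr_field)
    out = []
    for y, line in enumerate(curr_field):
        out.append(line)                                    # real line of this field
        other = prev_field[y]                               # real line of the other field
        below = curr_field[min(y + 1, height - 1)]
        interp = [(a + b) / 2 for a, b in zip(line, below)]
        # crude per-line motion test: how much the two fields disagree here
        moved = sum(abs(a - b) for a, b in zip(line, other)) / len(line) > threshold
        out.append(interp if moved else other)
    return out
```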
The most it will do is give a quick and dirty pseudo anti-aliasing effect, as shown in the image.
There is no antialiasing in those photos, just more pixel density or resolution.
and a TV upscaling an image that is less than half that;
The video signal from the PS2 is already 1080i compliant, or it could not be displayed on any TV. It is like how PS2 games are rendered internally at 640x448 but the console by default outputs a full NTSC signal of 720x480 to make it compatible with any TV. That type of resolution normalizing is never done by the TV.
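The same point as arithmetic (treating the output stage as a simple stretch is my simplification; the 1080i figures follow the numbers earlier in this post):

```python
# Internal render size vs. the standard signal the console puts out.
# The point is only that this fitting happens before the TV sees anything.

def stretch_factors(render, signal):
    (rw, rh), (sw, sh) = render, signal
    return round(sw / rw, 3), round(sh / rh, 3)

print(stretch_factors((640, 448), (720, 480)))    # default NTSC output
print(stretch_factors((640, 896), (1920, 1080)))  # 1080i output (two woven fields)
```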
"Because it looks better" does not imply "theoretically impossible." You're also putting quite a lot of weight on 60 fps considering the known failings of GT5 in that area.
I'm not putting a lot of weight on the 60fps thing; I'm putting a lot of weight on everything the game is processing compared to the best examples on the same console.