Just finished watching my FREE copy of Casino Royale! Very nice! I can only run 720p, but Blu-ray movies look extremely good!
Chances are you're actually watching the movie in 1080i, since almost all 720p sets accept 1080i signals. Currently the PS3 does not output Blu-ray movies in 720p, IIRC, at least.
What JR said is basically correct, but to be clear: 720p sets are not natively displaying 1080i; they accept 1080i signals and downscale them to 720p.
This is currently the ideal setup for those with 720p sets because, as JR noted, the PS3 does not downscale BD movies from their native 1080p signal to 720p.
Instead, you have to set the PS3 to output 1080i and let your TV do the downscaling, which in many cases results in an image of the same quality as if the PS3 had done the downscaling itself.
To check whether your PS3 is outputting 1080i, select "Display Settings" from the "Settings" bar. If it says "Current Output Resolution 1080i" under "Video Output Settings", then you're in good shape. If it says 720p, then the PS3 will only output a 480p signal when playing BD movies. If that is the case, select "Video Output Settings", then press the right arrow on the D-pad to skip the connector type, which should already be set to the correct one. Then select "Automatic"; the screen will flash until it displays the highest resolution your TV accepts. Select "Yes", and press "X" again to save that setting.
Now, one catch: 720p sets that do not properly deinterlace will lose some vertical resolution, though not as much as a 1080p set loses when it fails to properly deinterlace a 1080i signal.
I know some of this info can seem unwieldy, so let me put it in mathematical terms.
1080p = 1920x1080
1080i = 1920x540 x2 = 1920x1080
Interlacing simply breaks a progressive frame into two fields, one carrying the odd-numbered horizontal lines, the other carrying the even-numbered lines. When properly deinterlaced, the two fields are combined to reconstruct the original 1080p frame.
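As a toy illustration (plain Python, treating a frame as just a list of lines), splitting a frame into two fields and then weaving them back together recovers the original lines exactly, which is what proper deinterlacing of a field pair from one frame amounts to:

```python
# Toy model of interlacing: a "frame" is a list of 1080 horizontal lines.
frame = [f"line {n}" for n in range(1080)]

# Split into two fields, one with every other line starting at the top,
# the other with the lines in between (540 lines each).
top_field = frame[0::2]
bottom_field = frame[1::2]

# Proper "weave" deinterlacing: re-interleave the field pair
# to reconstruct the original progressive frame.
woven = [None] * len(frame)
woven[0::2] = top_field
woven[1::2] = bottom_field

assert woven == frame  # lossless when both fields come from one frame
```

The failure mode described below is when a TV skips this weave step and instead treats each 540-line field as its own picture.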
What often happens, though, is that either the 1080i signal is not properly flagged or the TV simply doesn't deinterlace. In either case, the usual result is that each field is individually scaled to the TV's native resolution and the fields are sequenced instead of being combined. For a 1080p TV this means a loss of 50% of the vertical resolution (each 540-line field gets stretched to 1080 lines). A 720p TV already discards about half the pixels of a 1080-line signal (a third of its vertical lines) just by downscaling; improper deinterlacing then drops the effective vertical detail further, from the panel's 720 lines to a field's 540, a further 25% loss.
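One way to sanity-check those figures is to just count lines. Here is a rough sketch in plain Python (counting vertical lines only, not total pixels); by this reckoning the extra loss on a 720p panel works out to 25%, i.e. 540 effective lines out of the panel's 720:

```python
# Effective vertical detail, counted in lines, under each scenario.
SOURCE = 1080          # vertical lines in a 1080i/1080p signal
FIELD = SOURCE // 2    # each 1080i field carries 540 lines
PANEL_1080 = 1080      # native vertical lines of a 1080p set
PANEL_720 = 720        # native vertical lines of a 720p set

# 1080p set, improper deinterlace: each 540-line field is stretched to
# fill 1080 lines, so effective detail is only 540 lines -> 50% loss.
loss_1080 = 1 - FIELD / PANEL_1080
print(f"1080p set, bad deinterlace: {loss_1080:.0%} vertical detail lost")

# 720p set, proper handling: 1080 lines downscaled to 720 gives the
# panel's full 720 lines of detail. Improper deinterlace: each 540-line
# field is stretched to 720 lines, so effective detail drops from 720
# to 540 -> a further 25% loss on top of the downscale.
loss_720 = 1 - FIELD / PANEL_720
print(f"720p set, bad deinterlace: {loss_720:.0%} further detail lost")
```

Running it prints 50% for the 1080p case and 25% for the 720p case.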
To confirm the exact resolution of any incoming signal, look on your TV's remote for a button labeled "Info", or something like it. Every TV I have ever seen or worked on has had one, so it should be fairly easy to find. Pressing it should display information about the signal currently being shown; at the very least it should give the input type and the resolution of the signal (not what you are actually seeing, as that is always the native resolution of the TV itself, not of the signal!).
So if, for instance, your PS3 is set to output 720p rather than 1080i, then when you watch a BD movie like Casino Royale and press the "Info" button on your TV's remote, it should say "720x480p", which is the resolution the PS3 outputs BD movies in when it is not set to 1080i or 1080p.
If it says "1920x1080i", then you are in good shape: the PS3 is outputting the movie in 1080i, and your TV is then hopefully deinterlacing it properly and scaling it down to 1280x720.
I hope this was more helpful than confusing, but I can understand why many people find all these different forms of resolution so difficult to differentiate.