I didn't notice any screen tearing at Amar's, and I was sitting one meter from his 50'' plasma
I hope you're right; there is obvious screen tearing in Prologue at times.
So let me ask you: did you notice screen tearing in Prologue? If not, then I'm sorry, but I can't trust your observations. It's like the people saying that 720p looks exactly like 1080p <.< or that 30 fps looks as smooth as 60 fps <.<
I'm going to post this AMAZING, crystal-clear, perfect explanation of what screen tearing, vsync and triple buffering are:
http://hardforum.com/showthread.php?t=928593
PS2 games and PC games never had this issue with screen tearing:
PS2: games were designed to never drop under 60 fps (or sometimes 30 fps, i.e. choosing to cap a game that would otherwise run at 30-60+ fps at 30 to keep things stable and smooth with no tearing), then capped at 50/60 fps with vsync (double buffered, I assume) to run smoothly on PAL and NTSC TVs, with the fps rarely dropping to 30.
PC games: vsync, plenty of VRAM for a triple-buffered framebuffer, and great display drivers to deal with it:
Scenario A: game runs f-ing smoothly, vsync off, risk of screen tearing.
Scenario B: vsync on, triple buffering: a more stable framerate, no screen tearing, at a small cost to 'performance' (perceived only).
Scenario C: double buffering, vsync on, no tearing but a heavy impact on fps if it dips under 60 or whatever you capped it at (rough numbers in the sketch below).
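To put numbers on scenarios B and C, here's a minimal sketch of my own (not from the linked thread); the render times and the simplified triple-buffering model are assumptions, but it shows why double-buffered vsync steps straight down to 30/20/15 fps while triple buffering degrades gracefully:

```python
import math

# A minimal sketch (assumed numbers, not from the linked thread) of why
# double-buffered vsync snaps the framerate to divisors of the refresh rate,
# while triple buffering just degrades gracefully.

REFRESH_HZ = 60.0
REFRESH_INTERVAL = 1.0 / REFRESH_HZ   # ~16.7 ms per scanout

def fps_no_vsync(render_time):
    # Buffers swap the moment a frame is done: full speed, but the swap can
    # land mid-scanout, which is exactly where a tear line comes from.
    return 1.0 / render_time

def fps_vsync_double_buffered(render_time):
    # With only two buffers the GPU has to wait for the next refresh before it
    # can start another frame, so missing the 16.7 ms deadline costs a whole
    # extra refresh: 60 -> 30 -> 20 -> 15 fps.
    return REFRESH_HZ / math.ceil(render_time / REFRESH_INTERVAL)

def fps_vsync_triple_buffered(render_time):
    # A third buffer lets the GPU keep rendering while a finished frame waits
    # for scanout, so on average the rate is simply capped at the refresh rate.
    return min(REFRESH_HZ, 1.0 / render_time)

for ms in (15.0, 18.0, 25.0, 40.0):
    t = ms / 1000.0
    print(f"{ms:4.0f} ms/frame | no vsync: {fps_no_vsync(t):4.1f} fps (tearing) | "
          f"double + vsync: {fps_vsync_double_buffered(t):4.1f} fps | "
          f"triple + vsync: {fps_vsync_triple_buffered(t):4.1f} fps")
```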
Current-gen consoles: **** all VRAM, no triple buffering in many games because it doesn't fit in the ****** little 10 MB eDRAM framebuffer on the Xbox 360, and there simply isn't enough VRAM on the PS3 either. Games are NOT made to keep the framerate over 60 fps (this is what ruins it compared to the PS2/Xbox) but instead AIM for 30 or 60 fps with serious frame drops.
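For a rough sense of scale (my own back-of-envelope numbers, assuming a plain RGBA8 colour buffer and a 32-bit depth buffer at 720p, no MSAA or tiling tricks), 10 MB really doesn't leave room to spare:

```python
# Back-of-envelope: how much a plain 720p setup eats out of 10 MB of eDRAM.
# Assumed formats: RGBA8 colour (4 bytes/pixel) and a 32-bit depth buffer.
WIDTH, HEIGHT = 1280, 720
MB = 1024 * 1024

colour = WIDTH * HEIGHT * 4 / MB   # ~3.5 MB per colour buffer
depth  = WIDTH * HEIGHT * 4 / MB   # ~3.5 MB for the depth buffer

print(f"one colour buffer:          {colour:.1f} MB")
print(f"colour + depth:             {colour + depth:.1f} MB")        # ~7.0 MB
print(f"two colour buffers + depth: {2 * colour + depth:.1f} MB")    # ~10.5 MB, already over
```

Anyway, with those constraints, this is what you end up with: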
-> Scenario A: vsync, double buffered, at 30 fps = fps drops down to 15 every time a frame is missed; reviewers bitch about the bad framerate.
-> Scenario B: framerate simply locked at 30 fps, fewer ups and downs making the framerate seem stable, but no vsync, so hello horrendous amounts of screen tearing if the game wasn't designed to never actually drop under 30 fps (Gears, Enslaved; see the tear-line sketch after this list).
-> Scenario C: framerate locked at 30 with vsync, triple buffered. Steady but non-ideal framerate, no screen tearing (several earlier current-gen console games).
-> Scenario D: framerate locked at 60, vsync, double buffered; the framerate regularly dips to 30 because of vsync if the game wasn't optimised to run at a stable 60+. Some perceive this badly, and people complain (many older PS3/Xbox 360 games, some PS2 games).
-> Scenario E: framerate locked at 60, no vsync, game not designed to maintain over 60 fps at all times; it either rarely dips (Prologue) or often dips (MW and a bunch of other muck).
-> Scenario F, the ideal scenario: framerate locked at 60, triple buffering, vsync, game designed to maintain a stable 60+ fps. Stable framerate, any eventual drops caught by vsync to prevent screen tearing, and the fps doesn't have to drop to 30 because there are three buffers (refer to the link for the explanation).
Games using this: buy a PS2, play on PC, or wait for the next gen of consoles.
-> Scenario G: same as F, but the game can't maintain a 60+ framerate; lots of small frame drops, but caught by vsync.
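And since scenarios B and E are where the tearing comes in, here's one more tiny sketch of my own (assumed 60 Hz refresh and 720p output) showing where the tear line actually lands when a buffer swap happens with vsync off:

```python
# Minimal sketch of the tearing in scenarios B/E: with vsync off, the buffer
# swap can happen while the display is part-way through scanning out the old
# frame, and everything below the swap point comes from the new frame.
REFRESH_HZ = 60.0
SCANOUT_TIME = 1.0 / REFRESH_HZ   # time to scan one frame, top to bottom
SCREEN_HEIGHT = 720               # assumed 720p output

def tear_line_row(swap_time_since_refresh):
    """Row at which the image changes if the swap lands mid-scanout."""
    fraction = (swap_time_since_refresh % SCANOUT_TIME) / SCANOUT_TIME
    return int(fraction * SCREEN_HEIGHT)

# A frame that finishes ~7 ms into the scanout tears ~42% of the way down.
print(tear_line_row(0.007))   # -> 302
```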
You can pray that PD went for this option; considering the game still has jaggies and low-res shadows, there's a chance that they used some of the PS3's VRAM for triple buffering instead, and then we won't have to deal with fugly screen tearing.
I can hear you ask, 'Why not design them like on the PS2, with framerates in mind?'
Answer: because they need to pretend that the hardware can keep up with modern PCs, and because they want to port most games between all three systems.
So they sacrifice framerate (and often resolution) for 'next-gen' graphics.
That's what happens when the guys in charge are no longer the same guys who have to program and design the games.