60fps (and 1080p to some extent) limit GT5

  • Thread starter JBturbo
  • 221 comments
  • 19,327 views
So according to your opinion, PD, T10, and every PC / console sim developer are idiots for trying to make their games run at around 60FPS. You say yourself there is no difference between the two. I don't know if you are deluded or just plain trolling. Hopefully it is the latter, otherwise you have some issues in perceiving motion.

I wonder if anyone would take a bet in which I get to play a game with no motion blur at 30fps and then 60 and I tell which is which.

I'd put 100 bucks on it.
 
It's basic human physiology.

We've been studying it for a few thousand years now.

Catch up.

Maybe you should enlighten Polyphony Digital, Turn 10, Simbin, iRacing, Image Space Incorporated, the netKar pro developers and every F1 simulator team for lowering graphics quality in order to achieve the highest FPS. Maybe the people who developed these games have a similar understanding to mine and need to catch up to your level of wisdom. If they only knew that 60FPS makes no difference, we could have 3D trees, shadows with no fuzziness, and higher quality effects and textures. If only PD knew this, they wouldn't have bothered making the game run at 60FPS, and the fanbase wouldn't have cared, as they wouldn't notice the difference. Why did they limit the quality of the game's graphics for no reason, as you claim? Maybe they need lessons from you, and we'll get a graphically better game this gen without a hit to smoothness or response.

Seriously, stop trolling; it does your reputation no good when you present your opinions as facts. No matter what you say, you won't change how humans see things. I notice a huge difference between lower and higher frame rates, game developers do too, and it has been shown that people can notice a difference even between 100FPS and 200FPS. The difference from 30FPS to 60FPS is bigger still.
 
I wonder if anyone would take a bet in which I get to play a game with no motion blur at 30fps and then 60 and I tell which is which.

I'd put 100 bucks on it.

It is a no-brainer for most, but trolls / deluded people will say there is no difference. This is also known as wummery on some other boards and involves trying to wind a person up by arguing against all logic. Some people on here, however, are either heavily deluded or have poor perception of motion.
 
Comparing watching video with playing games is difficult. When a frame in a game is drawn, a snapshot of the data has to be taken from the physics engine at one point in time. At 30fps it then takes 33.3ms to draw the frame based on that snapshot, and that frame is then displayed for 33.3ms. The frame being displayed can therefore be up to 66ms behind the physics engine; at 60fps those times are halved.
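The timing described above can be sketched as a quick calculation (a minimal sketch; the worst case assumed here is one frame time to render the snapshot plus one frame time while it is shown on screen):

```python
# Worst-case lag of the displayed image behind the physics snapshot:
# one frame time to render the frame, plus one frame time while it is shown.
def worst_case_lag_ms(fps: float) -> float:
    frame_time_ms = 1000.0 / fps
    return 2 * frame_time_ms

print(worst_case_lag_ms(30))  # ~66.7 ms at 30fps
print(worst_case_lag_ms(60))  # ~33.3 ms at 60fps
```

Doubling the frame rate halves both the render delay and the on-screen delay, which is why the lag figures halve together.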

There have been some games I've played on the PS3 (I can't remember which, but maybe NFS and Midnight Club) that I've had problems with because of low framerate and/or motion blurring, and I find it extremely irritating on my eyes.
 
This thread reminds me of a thread on a motorcycle board years ago wherein a poster tried to establish that shaft drive bikes, through some law of physics, were physically incapable of a wheelie. That thread went on forever. People posted pictures of shafties pulling a wheelie and there were cries of "PHOTOSHOPPED!!!" People posted personal accounts of actually pulling a wheelie on a shaftie. It became a running gag. This thread is like that, only less funny.

It's OBVIOUS that there is a clear and discernible visual difference between 30FPS and 60FPS. It's not unlike the difference between film at 24FPS and video at 30FPS. THAT is an obvious difference, and that's only a change of 6 frames per second. And no, it has nothing to do with the FPS varying. You can look at Modern Warfare and it's obvious that it's 60FPS. You don't even need a 30FPS frame of reference nearby to tell. It's as clear as the nose on your face.

A couple of years ago, Madden came out for the PS3 and the 360. I believe it was the first or second year of the franchise on the PS3. Which means that the 360 already had one or two iterations of the game. The 360 version was 60FPS. The PS3 version, 30FPS. Now, having a football game at 30FPS doesn't break the game or anything, but it was like night and day seeing the two side by side. But the thing is, you didn't NEED to see them side by side. You could walk into the local Best Buy and see the demo on the 360 and immediately know it was running at 60FPS (or, to be more accurate, you could tell that it was running at a substantially higher frame rate than 30FPS).

I don't even understand how someone could argue that there's no difference between the two. It's like arguing that water isn't wet. Or fire isn't hot.
 
GT5 at 1080p never runs at 60fps. Maybe on very rare stretches of road with no other cars it might. Never, otherwise. It might run smoother than a lot of PS3 games, but it never really reaches 60fps, at least if you're running at 1080p.
 
It's basic human physiology.

We've been studying it for a few thousand years now.

Catch up.


Guys,

First off, let's all calm down. No need for rash comments.

This is actually an interesting thread, and I think that, no matter who you're backing, there's a lot to be learned.

There's a miscommunication here: It seems to me, some of you are talking about motion re: the human visual system, while some of you are concerned about reaction time and the resolution of the controls.

If you take the 'visual system' perspective, it's easy to see that JBTurbo et al are correct. This is backed by a considerable body of literature (NOT pop articles, I'm talking about major contributions from the field of cognitive neuroscience).

If you take the reaction time re: control resolution perspective, it will SEEM like 60fps is naturally better. This is a misconception, and a result of not dissociating these two issues.

So less angry comments, less ego tripping, and let's listen to each other. I think there's much to be learned either way.

JBTurbo.... I'm gonna go out on a limb and assume you're a cognitive neuroscience researcher?
.... actually it's a fairly short limb I'm going out on.... a stump really :)
 
This thread reminds me of a thread on a motorcycle board years ago wherein a poster tried to establish that shaft drive bikes, through some law of physics, were physically incapable of a wheelie.
...

So, we need the frame-rate equivalent of Arto Nyquist to show up in this thread, right? :P


As for all of this cognitive neuroscience stuff, if we're going down that route, how's about some references? I've not critically analysed a publication in a little while... :lol:
By the way, my perspective is what I can see and feel and distinguish, and whether it actually makes a difference to me; which it does. I want the maximum possible / practicable framerate, thanks. 👍
 
Guys,

First off, let's all calm down. No need for rash comments.

This is actually an interesting thread, and I think that, no matter who you're backing, there's a lot to be learned.

There's a miscommunication here: It seems to me, some of you are talking about motion re: the human visual system, while some of you are concerned about reaction time and the resolution of the controls.

If you take the 'visual system' perspective, it's easy to see that JBTurbo et al are correct. This is backed by a considerable body of literature (NOT pop articles, I'm talking about major contributions from the field of cognitive neuroscience).

If you take the reaction time re: control resolution perspective, it will SEEM like 60fps is naturally better. This is a misconception, and a result of not dissociating these two issues.

So less angry comments, less ego tripping, and let's listen to each other. I think there's much to be learned either way.

JBTurbo.... I'm gonna go out on a limb and assume you're a cognitive neuroscience researcher?
.... actually it's a fairly short limb I'm going out on.... a stump really :)


Actually turbo is completely wrong in terms of motion recreation, because he is confusing the ability to recognize an image with the ability to discern discrete unit movement.

Again, your eyes can tell that something was not drawn at all between two points, regardless of whether they can separate the two frames from each other.
 
This is all a bit silly.
I can see both arguments here.

Although, on the PS1, in the first GT, there was a mode that raised the frame rate for one of the SSR x tracks - SSR5, I think. Can anyone remember this? I can't remember what it was called, but it was definitely an unlockable thing in the game.

There was a vast improvement in the smoothness of the visuals and how the game felt because of the higher frame rate.
 
Actually turbo is completely wrong in terms of motion recreation, because he is confusing the ability to recognize an image with the ability to discern discrete unit movement.

Again, your eyes can tell that something was not drawn at all between two points, regardless of whether they can separate the two frames from each other.


Sure. And if we had infinite computing power, I'd always say that the higher the fps, the better, and let your eye-brain do the motion blurring (like real life).

The thing is, we don't have infinite computing power... so one has to use the resources optimally. I think that JBTurbo et al are just saying that all of this can be better optimized.... instead of just going for a higher frame rate just so you can have that on the back of GT5's box (even though there's a lot of other stuff on the box that isn't even in the game:ill:)

The fact of the matter is, no one will miss their braking point because the image was ~17ms late in updating. If you actually perform better given that extra 17ms, then you should be an F1 driver ASAP... not to mention labs around the world will want to know how it is that you can discern that kind of subtlety.
 
Sure. And if we had infinite computing power, I'd always say that the higher the fps, the better, and let your eye-brain do the motion blurring (like real life).

The thing is, we don't have infinite computing power... so one has to use the resources optimally. I think that JBTurbo et al are just saying that all of this can be better optimized.... instead of just going for a higher frame rate just so you can have that on the back of GT5's box (even though there's a lot of other stuff on the box that isn't even in the game:ill:)

The fact of the matter is, no one will miss their braking point because the image was ~17ms late in updating. If you actually perform better given that extra 17ms, then you should be an F1 driver ASAP... not to mention labs around the world will want to know how it is that you can discern that kind of subtlety.

I think you are wrong about what you think they are saying.

What I have read is that there is no value in going 60FPS because at 30FPS each frame is so fast the eye cannot identify each frame.

This has nothing to do with the illusion of fluid motion however...

While your eye can't isolate each individual frame, what it can see is the combination of a few consecutive frames together, and what it will see is a bunch of discrete images, each offset from another.

These discrete frozen images do not recreate real-life fluid motion, because in real life things do not move in 1-inch jumps; they move through every single spot they can possibly be in, with no gaps between "frames".

The higher the framerate, the smaller these gaps will be and the more fluid it will appear.

As you point out with infinite computing power we would have no gaps and it would be perfect. And we don't have that.

But that certainly doesn't mean that there is no improvement between 30 and 60 fps and the math about what the eye can discern is not relevant to whether 60fps is more fluid than 30 fps.

This is evidenced by the fact that 60fps is much more visually fluid than 30fps in almost any game.

As for how much it affects your ability to control... I know that I do perceive a "snappiness" at higher framerates. The numbers make that timing seem inconsequentially small, but in reality I can tell you what I can feel in real-world tests.

While milliseconds sound infinitesimally small, just think how many milliseconds of lag in your ping it takes for the game to feel laggy... and that's with netcode to help overcome it... so is feeling the difference between 16 and 32 ms really that far of a stretch?

It's certainly not something I think I could accurately measure through use and tell you "oh yeah, that's 16 more ms", but it's certainly an overall feeling I can perceive.

Get a video player that can desync the audio. Creep the audio out of sync just a few ms either way... notice how you can tell something is wrong. You can't tell how many milliseconds wrong, and in many cases can't even tell whether it's too soon or too late, but it sure is wrong.
 
I think you are wrong about what you think they are saying.

That's entirely possible. It's not easy to read through angst and defensiveness. You might be right.

As you point out with infinite computing power we would have no gaps and it would be perfect. And we don't have that.

But that certainly doesn't mean that there is no improvement between 30 and 60 fps and the math about what the eye can discern is not relevant to whether 60fps is more fluid than 30 fps.

Actually this is an interesting question: how many samples of the visual field do we need to take with respect to time in order to achieve a faithful representation of a scene?

To my knowledge, this is an unanswered question in the neuroscience of vision. I'm an auditory researcher, and so I tend to think of this as an analog to sampling rate in digitally recorded audio. If that analogy holds, then there are all kinds of things that will determine what the optimal rate should be. The optimal rate will be context-dependent, and should take into account the resources available. Does it, in this case?
 
I'll always prefer 60fps over 30fps, because at least when the game is under high load there is less chance of it dipping below the point where I notice the low frame rate.

This would perhaps be even more atrocious if PS3 was Multi-GPU
 
That's entirely possible. It's not easy to read through angst and defensiveness. You might be right.



Actually this is an interesting question: how many samples of the visual field do we need to take with respect to time in order to achieve a faithful representation of a scene?

To my knowledge, this is an unanswered question in the neuroscience of vision. I'm an auditory researcher, and so I tend to think of this as an analog to sampling rate in digitally recorded audio. If that analogy holds, then there are all kinds of things that will determine what the optimal rate should be. The optimal rate will be context-dependent, and should take into account the resources available. Does it, in this case?

It's an unanswered question because, as you noted, it's situationally dependent.

For instance 2 fps is probably enough to accurately appear fully fluid if the scene is a still life with a slug crawling over an apple.

However 60FPS will still leave huge gaps between frames if something is flying by the camera up close at 200 mph.

I suppose if you take the speed of light in a vacuum as the maximum possible speed anything could travel, and figure out how many frames per second you would need to render to ensure that said item would move only 1 pixel per frame, that would be the most frames per second you would ever need to render fluid motion.

Then you still run into uncertainties, like: is 1 pixel really the smallest unit you need to work with? For instance, it's easy to say that with a 480p display you only need a 480p source for the highest image quality... yet we routinely see that 720p and 1080p content looks better downscaled to 480p than true 480p native content does... so is there some magical subjective effect you have to account for?

Do you need to ensure that your framerate is so high the movement does not exceed 1/4 pixel movement per frame?

At the end of the day, I can say that for me, on a 100Hz display, I find 90fps to be the spot at which things are crisp and response feels instant. I would be interested to test it on faster displays with faster rendering, but I think 100fps is right around the point where no one would be able to appreciate more.
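To put rough numbers on the "flying by the camera up close at 200 mph" case above, here's a back-of-envelope sketch with assumed values (a 1920px-wide image, a 90° horizontal field of view, and an object passing 2m from the camera; none of these figures come from the thread):

```python
import math

# Assumed numbers for illustration: 1920px-wide image, 90° horizontal FOV,
# and a car passing 2 m from the camera at 200 mph.
speed_m_s = 200 * 0.44704                     # 200 mph in metres per second
distance_m = 2.0
screen_px = 1920
fov_rad = math.radians(90)

visible_width_m = 2 * distance_m * math.tan(fov_rad / 2)  # width of the view at 2 m
px_per_second = speed_m_s * (screen_px / visible_width_m)

fps_for_1px_steps = px_per_second             # <= 1 px of movement per frame
gap_at_60fps = px_per_second / 60             # pixels jumped between frames at 60 fps

print(round(fps_for_1px_steps))               # ~43,000 fps
print(round(gap_at_60fps))                    # ~715 px per frame
```

Even at 60fps the object jumps hundreds of pixels between frames, which is exactly the point about fluidity being scene-dependent.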
 
I used to play quake (qw,1,2,3) competitively. I can notice the difference in FPS between 90 and 125.

I would turn down the graphic detail to the point where the game was smooth and crisp.

Granted, GT5 is not a twitch monkey game, but still I'd rather have higher FPS and smooth graphics than better graphics and lower FPS. An option to turn off shadows would be fantastic.

They look horrible; it's the flickering that does it for me. I don't care about the shadow resolution, but when you see black triangles flickering all over the screen it destroys the smoothness. It's a bit of a head-scratcher, really.
 
I used to play quake (qw,1,2,3) competitively. I can notice the difference in FPS between 90 and 125.

I would turn down the graphic detail to the point where the game was smooth and crisp.

Granted, GT5 is not a twitch monkey game, but still I'd rather have higher FPS and smooth graphics than better graphics and lower FPS. An option to turn off shadows would be fantastic.

They look horrible; it's the flickering that does it for me. I don't care about the shadow resolution, but when you see black triangles flickering all over the screen it destroys the smoothness. It's a bit of a head-scratcher, really.

I was a CS player, so not as twitchy... but for the time I did play Quake and QL, I think I might actually have been able to tell at 120+ frame rates if I'd had a good 120Hz monitor.
 
120Hz was the max my monitor could do. The weird thing was, with some Quake games the FPS you capped it at affected the physics. I believe this is not the case with GT5.

Ahh I miss my old 19" CRT.

My plasma is getting B-Spec burn-in from the randomly generated names list on the right. :(
 
I can definitely tell a difference between 30fps and 60fps - in driving games, flight sims, etc. I'll take 60fps any day over 30!!

And GT5 does dip way down at times - just try coming up through the final turns that lead to the start/finish line at Le Mans in the rain, and it starts chugging away trying to keep up but can't. Thankfully it doesn't last very long, but it is noticeable.
 
No studies, science, etc. have ever given a frame rate at which the eye sees, simply because it isn't a camera and doesn't work that way.
I don't even care if it's what I'm seeing, I can easily FEEL a difference in the FPS.
I can tell the difference between 60fps and 95fps in UT3, and my monitor's refresh rate is 60Hz, meaning 35 of those frames are being dropped.

I'm a big FPS (shooter) player, mostly fast-paced FPS. I'm also a big racer... if you want a twitchy racer, go try TrackMania - it's free and you are always going about 400mph through loops, twists, bends, jumps, chicanes, or anything that makes driving stupidly complicated and fun beyond belief.

Both of the above have led me to this conclusion:
For gaming of all variety: high fps > low fps, period.

I had a friend in an old FPS I used to play that swore he couldn't play at anything under 200FPS and I believe him to this day.
 
Both of the above have led me to this conclusion:
For gaming of all variety: high fps > low fps, period.

In general, I'd agree. But I can imagine some scenarios in which, for stylistic/artistic reasons, a game director would NOT want the fluidity that comes along with a higher frame rate. Much like film has a look because it's 24FPS, 30FPS games have a look as well. And I can see a director choosing that look intentionally.
 
Just simple conditional texture replacement, similar to the way you build in LODs.

How do you see this? Something like 360 different textures for the 360 degrees your texture could be moving in? -> needing 360 times the video memory for textures...

Somebody know of games trying to do this (incorporating real!! motion blur)?
 
Hi there,

30fps is more than enough for a game such as GT5.

You do realise, as with MOST sims, GT being no different, the physics engine and the graphics engine not only run on the same clock, but they are directly tied to each other. This was also a priority with Forza, to the point that they actually had to remove a lot of visual effects to keep the fps high so that the physics calculations were both accurate and consistent. The physics engine runs at 60 cycles per second. The graphics engine is tied to that. Running any lower is not an option, as it would lead to many undesirable glitches in the physics.
 
You do realise, as with MOST sims, GT being no different, the physics engine and the graphics engine not only run on the same clock, but they are directly tied to each other. This was also a priority with Forza, to the point that they actually had to remove a lot of visual effects to keep the fps high so that the physics calculations were both accurate and consistent. The physics engine runs at 60 cycles per second. The graphics engine is tied to that. Running any lower is not an option, as it would lead to many undesirable glitches in the physics.

That is not the case with most sims. It's impossible on PCs, as framerates change from one machine to another, and it's probably best to have the physics running at several times the highest expected graphics rate. In GT5, when the framerate drops the physics don't slow down, so it's not the case there either. Generally the physics run at a higher frequency than the graphics, and a snapshot of the data is grabbed at the frequency of the graphics before each frame is rendered. Forza's physics run at something like 300Hz, but I don't know what GT5 runs at. The problem with graphics running at a lower framerate is that the graphics lag further behind the physics; they always lag by at least the time of one frame.
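The decoupled arrangement described above can be sketched as a fixed-timestep loop (a minimal illustration only; the 300Hz figure is the post's example, and the trivial one-unit-per-second "physics" is a placeholder, not any real engine's code):

```python
PHYSICS_DT = 1.0 / 300.0   # fixed physics step, assuming a 300 Hz physics rate

def simulate(render_fps: float, duration_s: float):
    """Advance physics in fixed steps; grab one snapshot per rendered frame."""
    n_frames = round(render_fps * duration_s)
    t_physics = 0.0
    position = 0.0          # stand-in for the full physics state (e.g. car position)
    snapshots = []
    for frame in range(n_frames):
        frame_end = (frame + 1) / render_fps
        # The physics keeps its own fixed rate regardless of the render rate.
        while t_physics + PHYSICS_DT <= frame_end:
            t_physics += PHYSICS_DT
            position += PHYSICS_DT        # trivial "physics": move at 1 unit/s
        snapshots.append(position)        # renderer takes a snapshot for this frame
    return snapshots

frames_30 = simulate(30.0, 1.0)   # one second of snapshots at 30 fps
frames_60 = simulate(60.0, 1.0)   # one second of snapshots at 60 fps
```

The physics advances at the same fixed rate in both runs; only how often the renderer samples it changes, which is why a framerate drop doesn't slow the simulation down.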
 
That's entirely possible. It's not easy to read through angst and defensiveness. You might be right.



Actually this is an interesting question: how many samples of the visual field do we need to take with respect to time in order to achieve a faithful representation of a scene?

To my knowledge, this is an unanswered question in the neuroscience of vision. I'm an auditory researcher, and so I tend to think of this as an analog to sampling rate in digitally recorded audio. If that analogy holds, then there are all kinds of things that will determine what the optimal rate should be. The optimal rate will be context-dependent, and should take into account the resources available. Does it, in this case?

Taking the audio example, the required sample rate is commonly quoted as twice the highest frequency the human ear is said to perceive (as "hearing"), so the analogue for vision would be taking twice the frequency of the shortest wavelength of light visible to us (somewhere in the violet region), which works out at around 1600 terahertz.

Clearly our vision doesn't work in the same way!
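For what it's worth, the arithmetic in that post roughly checks out (a quick sanity check, taking violet light at an assumed ~380nm as the shortest visible wavelength):

```python
c = 299_792_458          # speed of light in a vacuum, m/s
wavelength_m = 380e-9    # roughly the shortest visible wavelength (violet)

f_max_hz = c / wavelength_m      # ~7.9e14 Hz
nyquist_hz = 2 * f_max_hz        # ~1.58e15 Hz, i.e. roughly 1600 THz

print(round(nyquist_hz / 1e12))  # ~1578 THz
```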


Since we're just about agreed that 60fps is superior in feel (for those who can discern the difference) to 30fps, we could probably address the rest of the issues raised by the OP.

The slowdown is a pain, but at least there's no v-sync (I'd rather take the ugly tears than deal with whole-frame stutters!), and it's definitely less of an issue in 720p / 60Hz mode. Perhaps PD could have taken steps to ensure that the 720p mode worked at a flawless 60Hz, just as an "option" for those who care about fluidity and connectivity over absolute (well, sort of) image quality.

Lastly, shadows. I'm no expert, but I would think that a change in implementation technique would be needed to fix the quality issue, which may upset the balance of memory and processor / GPU time they currently require, and hence require other stuff to change too...
 
You do realise, as with MOST sims, GT being no different, the physics engine and the graphics engine not only run on the same clock, but they are directly tied to each other. This was also a priority with Forza, to the point that they actually had to remove a lot of visual effects to keep the fps high so that the physics calculations were both accurate and consistent. The physics engine runs at 60 cycles per second. The graphics engine is tied to that. Running any lower is not an option, as it would lead to many undesirable glitches in the physics.

This is a load of rubbish too. They are not directly tied together in most sims; most sims run the physics engine at around 400Hz (or FPS, as you call it). The reason games like Forza and GT have to lower the visuals is to maintain a higher FPS on the visual side, so you can appreciate the physics engine and react to it quicker through the visuals if you're on a controller. Steering wheel users can feel the force feedback and react to it quicker, but the higher FPS still helps with showing the movement of the car. At 30FPS you see a lot less happening, and therefore your precision and accuracy at braking and turn-in points is worse. 30FPS is not bad for arcade games, but 60FPS is a must for racing sims.

Also, if it was directly tied, then GT5 would be hard to play, as the physics engine would slow down to the visual FPS of below 50FPS on some occasions and then go back up again, giving a strange feeling. However, that is not the case, as most racing sim engines aim for at least 300Hz or more.
 
So, let's say I have a 32'' HD Ready TV (720p).
Now let's say I'm getting a 26'' Full HD (1080p) TV; will the image be of better quality?

Chris
 
Get in your car and go for a drive at 15mph. Fix your eyes on a spot out of your side window. Even at 15mph it will be incredibly blurred. This is in real life!

Look out of the side window and move your eyes/head to follow, say, a hot woman. Your eyes will unblur the woman and blur everything else. That's because the eye can detect up to 300fps when tracking objects. In a 30fps game with built-in motion blur, you can't selectively unblur the objects you want to see by moving your eyes/head. In 60fps games, it's easier to do so. Because we always track moving objects while playing games, more fps is always better.
 
So, let's say I have a 32'' HD Ready TV (720p).
Now let's say I'm getting a 26'' Full HD (1080p) TV; will the image be of better quality?

Chris

In a way it will look worse, as it will have half the anti-aliasing (more jagged lines), but it will look sharper on the 26" screen.

If you ran it in 720p mode on the 26" it won't look as good as 1080p, due to it not being the native resolution, but TVs out now do a good job of upscaling.

Due to the screen being smaller and having a higher resolution, it will have a much higher pixel density, so everything will look sharper and clearer, but it may have more jagged lines.

The smaller the screen, the less you will appreciate the higher resolution and detail; with a 32" at 1080p you would notice the detail more easily. If you are getting a new TV, then get an LED-backlit LCD TV.
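The pixel-density point can be made concrete with a quick calculation for the two sets mentioned (PPI here is just the diagonal pixel count divided by the diagonal size in inches):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 720, 32)))    # 32" HD Ready: ~46 ppi
print(round(ppi(1920, 1080, 26)))   # 26" Full HD: ~85 ppi
```

The smaller 1080p set packs nearly twice the pixel density of the larger 720p one, which is why it looks sharper up close.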
 
Taking the audio example, the required sample rate is commonly quoted as twice the highest frequency the human ear is said to perceive (as "hearing"), so the analogue for vision would be taking twice the frequency of the shortest wavelength of light visible to us (somewhere in the violet region), which works out at around 1600 terahertz.

Clearly our vision doesn't work in the same way!

Interesting line of thinking, but the difference is that a digital audio signal is a series of samples that, when put together, represent the waveform. For video, each pixel is told to emit or let pass (depending on the type of display) light of particular frequencies, so the video signal has nothing to do with generating the waveform of the light. If you want to think in audio terms, it's more like those old pianos with the paper rolls, where a hole in the paper causes a key to be played.
 