What speed does GT4 run at?

  • Thread starter Tornado
  • 15 comments
  • 14,123 views
Now, I know it's a simple question, but after some GT3 time I'm not so sure. I originally thought 60 FPS, but GT3 seems to run noticeably faster than GT4. So, is there any official statement on GT4? I'm thinking 45 FPS.
 
I remember all the previews and reviews when the game came out claiming 60 FPS for GT4, and a quick Googling confirmed this. Whether that actually holds up in some sort of bench test is another matter.
 
Every source I could find from a quick Google search tells me that it's 60fps, and my first reaction is to believe it. For me, there's a fairly substantial difference between 40fps and 60fps, so if anything, GT4 is probably close to 60fps...say, 50-55.
 
Just another thing: do you know how hard it is for a human to notice an FPS difference without computers? Impossible if it's over 30. The human eye can only see up to 30 frames per second. So it really doesn't matter, but sometimes a low FPS can and sometimes will cause slowdown when there's a lot of action, like the corners in GT. But since it has 60 FPS (assuming it does), it doesn't have that.
 

After-image differentiation allows you to see those differences, given a bit of practice. In games it's even more obvious, since control is mostly directly affected by the framerate. For example, I can easily see the difference between 20, 30, 60 and 100+ fps in games.

Anyway, the eye does not take pictures like a camera. The organic nature of the eye allows for subtle details to be noticed. I hear some people can even see the frequency of lightbulbs.... Although that is not an enjoyable experience.

Closing off, I can say GT4 runs at at least 50 fps, probably more. It's also constant, except in photo mode, for some reason.
 
PAL - GT3 and GT4 = 50 fps
NTSC - GT3 and GT4 = 60 fps

It's not possible to run a console game at an arbitrary framerate without glitches in the sync of the TV image (tearing); the framerate of the game needs to work in sync with the TV frequency (NTSC = 60 Hz, PAL = 50 Hz), even during slowdowns.

NTSC framerates = 60/30/15
PAL framerates = 50/25/10
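
To illustrate the point (this is just a rough sketch, not anything taken from the games themselves): if the console always waits for the next TV refresh before showing a new frame, the only steady rates available are the refresh rate divided by a whole number, which is where lists like the ones above come from.

```python
# Sketch: the steady frame rates available when every rendered frame is held
# on screen for a whole number of TV refreshes (i.e. the game always waits
# for vertical sync). Values quoted in the post above are a subset of these.
def steady_frame_rates(refresh_hz, max_hold=6):
    """Frame rates you get by showing each frame for 1..max_hold refreshes."""
    return [refresh_hz / hold for hold in range(1, max_hold + 1)]

print("NTSC (60 Hz):", steady_frame_rates(60))  # 60, 30, 20, 15, 12, 10
print("PAL  (50 Hz):", steady_frame_rates(50))  # 50, 25, ~16.7, 12.5, 10, ~8.3
```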
 
2 things:
  1. There is no way that's true that the human eye cannot see over 30 FPS. I've played the R4 Turbo Mode bonus disc, and it shows without a doubt that 60 FPS is very noticeably different from 30 FPS.
  2. I said that I saw a very real and noticeable difference between GT3 and GT4 overall, which is what I should have said, not just in particular sections. I'm not sure whether GT3 is over 60 FPS, GT4 is under 60 (though I can guarantee it is more than 30), or something else.
 
It's not so much about the fps if it's constant. Television runs at only 25 fps here in the UK, but people don't complain that it looks choppy. Why? Because it's always 25 fps; it never changes, not even by one frame, except sometimes when you're watching a live broadcast like a football match, where there can be delays in the signal, interference, etc. If I played a game that averaged 40 fps but went as high as 50 fps and as low as 30 fps over the course of playing, that game would look a lot choppier than watching TV or playing a game that runs at a constant 30 fps.
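
To put some rough numbers on that (a made-up comparison, not measurements from GT4 or any other game): a constant 30 fps delivers a frame every 33.3 ms, while a stream that swings between 30 and 50 fps has frame times jumping between 20 ms and 33 ms, and it's that frame-to-frame variation that reads as choppiness.

```python
# Hypothetical comparison of frame-time consistency: a constant 30 fps stream
# versus one that averages ~40 fps but swings between 30 and 50 fps.
import statistics

constant_fps = [30.0] * 10
variable_fps = [50, 35, 30, 45, 50, 32, 38, 48, 30, 42]  # made-up samples

def frame_times_ms(fps_samples):
    return [1000.0 / fps for fps in fps_samples]

for label, samples in [("constant 30 fps", constant_fps),
                       ("variable 30-50 fps", variable_fps)]:
    times = frame_times_ms(samples)
    print(f"{label:>20}: avg {statistics.mean(times):5.1f} ms per frame, "
          f"spread {max(times) - min(times):5.1f} ms")
```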
 
You can even see the difference of a single frame per second; at least, most people can. See the difference between movies and a home video? That's because movies are shot at 24 fps and home videos at 25 or 30. That accounts for a lot of the 'professional look' of movies compared to home videos.
 
I am one of those people who can see the "flicker" in incandescent light bulbs... it's more of a slight dimming, but it's very distracting.

Anyway, the big difference between home videos and professional videos isn't the frame rate; it's the steadiness of that frame rate. I've seen some professional productions where a digital camera was used for certain shots due to space requirements, and the change is very noticeable despite the post-production editing.

Also, the lower framerate of movies, in my opinion, adds a slight softness. Action sequences filmed and played back at a steady 24 fps have a little blur, making things feel like they're moving faster. For video games, higher frame rates (within reason; 120 fps is a waste) are better for smoothness, especially with digitally added blurring.

In GT4, I think it's fair to assume 50 fps is the typical rate when playing. With GT4 pushing the PS2 to its processing limit, I think it's obvious that the frame rate drops at some points, but I doubt it goes below 45 fps.
 
The reason you can tell a difference is that higher FPS also means your game is running smoother. Duh, you can see the difference between 100 and 60 fps or whatever. But I got this info about the eye only seeing 30 frames per second from someone who knows a lot about birds. Birds see things really fast; he said a bird can see 300 frames per second. It's crazy. But he also told me about the human eye and 30 fps.

And about you telling the difference, that's simple. Frames per second is another way of saying how smooth or laggy the game is, like the in and out factors, and choke. It all means basically the same thing: how smoothly the game is running. Usually, when a lot is going on, like a burst of action or a crowded section of a track, your choke will rise, along with your in value, and your fps will drop, making your game sluggish, though not in all cases. I once had 15 fps and over 80 choke in Source. That's crazy horrible, but the game was running perfectly. Maybe my registration was off a little, but it's always that way, and maybe my rates needed to be fixed. But to tell you the truth, it's tough to tell the difference. The only real way to get the correct info is with a computer.
 
It seems some are talking about frame rates, and others about refresh rates.

Without going into fine detail over the complex visual dynamics that explain how these key rates affect what humans perceive visually: in very basic terms, frame rates determine our ability to notice motion, while refresh rates determine our ability to see the image as always being 'on'. Humans only need to see about 15 fps to perceive fluid motion, and under normal circumstances cannot tell the difference between 24 fps and above. This is why the standard film rate is 24 fps.

However, for humans to see something that is flashing on the screen as being continually 'on', it has to be refreshed about 50 times per second (Hz). This is why the standard refresh rate is 50/60Hz.

Depending on the brightness, and the distance to width ratio, higher refresh rates may be necessary. For instance, our vision is most sensitive to refresh rates at the edges of our field of view (peripheral vision), and least sensitive at the center of our field of view. Thus, the more our field of view is taken up by the displayed images, the higher the refresh rates need to be for them to look as if they are always 'on'. This is why CRT monitors are usually run at 70Hz or greater, while most consumer CRT TVs refresh at 50Hz and 60Hz.

As far as games go, the following wiki entry may help shed some light on the subject:

Frame Rates in Video Games

Frame rates are considered important in video games. The frame rate can make the difference between a game that is playable and one that is not. The first 3D first-person adventure game for a personal computer, 3D Monster Maze, had a frame rate of approximately 6 fps, and was still a success, being playable and addictive. In modern action-oriented games where players must visually track animated objects and react quickly, frame rates of approximately 25 to 30 fps are considered minimally acceptable.

A culture of competition has arisen among game enthusiasts with regards to frame rates, with players striving to obtain the highest fps count possible. Indeed, many benchmarks released by the marketing departments of hardware manufacturers and published in hardware reviews focus on the fps measurement. Modern video cards, often featuring NVIDIA or ATI chipsets, can perform at over 160 fps on intensive games such as F.E.A.R. This does not apply to all games - some games apply a limit on the frame rate. For example, in the Grand Theft Auto series, Grand Theft Auto III and Grand Theft Auto: Vice City have a standard 30 fps (Grand Theft Auto: San Andreas runs at 25 fps) and this limit can only be removed at the cost of graphical and gameplay stability. It is also doubtful whether striving for such high frame rates is worthwhile. An average 17" monitor can reach 85 Hz, meaning that any performance reached by the game over 85 fps is discarded. For that reason it is not uncommon to limit the frame rate to the refresh rate of the monitor in a process called vertical synchronization. However, many players feel that NOT synchronizing every frame produces better in-game performance, at the cost of some "tearing" of the images.

It should also be noted that there is a rather large controversy over what is known as the "feel" of the game frame rate. It is argued that games with extremely high frame rates "feel" better and smoother than those that are just getting by. This is especially true in games such as a first-person shooter. There is often a noticeable choppiness perceived in most computer rendered video, despite it being above the flicker fusion frequency.

This choppiness is not a perceived flicker, but a perceived gap between the object in motion and its afterimage left in the eye from the last frame. A computer samples one point in time, then nothing is sampled until the next frame is rendered, so a visible gap can be seen between the moving object and its afterimage in the eye.

The reason computer rendered video has a noticeable afterimage separation problem and camera captured video does not is that a camera shutter interrupts the light two or three times for every film frame, thus exposing the film to 2 or 3 samples at different points in time. The light can also enter for the entire time the shutter is open, thus exposing the film to a continuous sample over this time. These multiple samples are naturally interpolated together on the same frame. This leads to a small amount of motion blur between one frame and the next which allows them to smoothly transition.

An example of afterimage separation can be seen when taking a quick 180 degree turn in a game in only 1 second. A still object in the game would render 60 times evenly on that 180 degree arc (at 60 Hz frame rate), and visibly this would separate the object and its afterimage by 3 degrees. A small object and its afterimage 3 degrees apart are quite noticeably separated on screen.

The solution to this problem would be to interpolate the extra frames together in the back-buffer (field multisampling), or simulate the motion blur seen by the human eye in the rendering engine. Currently most video cards can only output a maximum frame rate equal to the refresh rate of the monitor. All extra frames are dropped.

High frame rates are also for creating performance "reserves" as certain elements of a game may be more GPU-intensive than others. While a game may achieve a fairly consistent 60 fps, the frame rate may drop below that during intensive scenes. A higher rendering frame rate may prevent a drop in screen frame rate.
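
Two bits of that quoted entry are easy to sanity-check yourself (the snippet below is just an illustration, not code from any actual engine): the 3-degree afterimage gap is simply the turn rate divided by the frame rate, and a vsync-style cap amounts to making each frame wait out the remainder of the refresh period. Real vertical synchronization waits on the display's vertical blank; sleeping only approximates it.

```python
import time

# 1. Afterimage separation from the quoted example: a 180-degree turn in one
#    second, sampled at 60 fps, moves a still object 3 degrees per frame.
turn_degrees, turn_seconds, fps = 180.0, 1.0, 60.0
gap = (turn_degrees / turn_seconds) / fps
print(f"Per-frame gap: {gap:.1f} degrees")  # 3.0

# 2. A minimal frame limiter in the spirit of vertical synchronization:
#    render, then sleep out the rest of the refresh period so no more than
#    refresh_hz frames are presented per second (extra headroom is discarded).
def run_capped(render_frame, refresh_hz=60, frames=10):
    period = 1.0 / refresh_hz
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < period:
            time.sleep(period - elapsed)

run_capped(lambda: None)  # placeholder render function
```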
 
👍 I always thought that frame rates (FPS) and refresh rates (Hz) were interchangeable. Good info, but I was referring to GT4 just constantly playing slower than GT3. It's not choppy, but GT3 just "feels" faster than 4.
You are not alone, as they can easily be confused for one another, since they are interconnected to a certain degree. The best way to visualize it is to think of each frame and then the number of times it is 'exposed'. Typically with 24/25/30 fps video, each frame is 'exposed' twice, resulting in 48/50/60Hz refresh rates.
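
That frame-versus-exposure relationship boils down to: refresh rate = frames per second × exposures per frame. A quick sketch using the figures above (just an illustration of the arithmetic):

```python
# Refresh rate as frame rate times the number of times each frame is shown.
def refresh_hz(fps, exposures_per_frame):
    return fps * exposures_per_frame

# The double-exposure figures mentioned above:
for fps in (24, 25, 30):
    print(f"{fps} fps shown twice -> {refresh_hz(fps, 2)} Hz")
# 24 -> 48 Hz, 25 -> 50 Hz, 30 -> 60 Hz
```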

As for GT3 & GT4 frame rates, I have no clue. However, the amount of detail in each frame can also affect how 'choppy' an image looks by comparison, so that may also be playing a role in why there is a visual difference between the two.

I really have not given much thought to the technical side of game development and how they generate images for display, but I am finding that the better these games get, the more of a "gamer" I have become, or at least I see my interest growing exponentially. I certainly understand now why people like George Lucas, Peter Jackson, and Robert Rodriguez, whose passions have always been in film, are spending more time and resources on game development than ever before. Games are capturing all the excitement that action and adventure films have to offer, and then putting the player in control of the environment, the character(s), and even the story line.

I suspect, the way things are headed, that the gaming market will grow at an unprecedented rate and will exceed the sales of both box office films and films on video, making it the largest market for consumer entertainment.
 