Not even close, the human eye can only see about 20-25fps.
...
Utterly wrong.
The human eye doesn't see the world in frames at all. It has a temporal response characteristic that depends on the intensity, contrast ratio, and spectrum of the light incident on the entire retina, as well as on the particular retina in question. We're talking about a chemical reaction in the receptor cells of the retina, which naturally have a finite response time. They can also be "fatigued" temporarily, as those "burn-in" and "negative image" tests demonstrate, which is also why our eyes don't like to stay still.
What this means is that if you're watching a film in a dark room on a large screen, where each frame carries motion blur inherent in it from the slow response time of the original recording medium, that blur essentially bridges the gaps in motion on screen, and it looks fine to our eyes. The high relative brightness also burns the image into our retina harder, so its residual lasts longer.
Try doing the same on an LCD screen in a brightly lit room, especially if you omit the blurring, and it will clearly appear to jump and jerk about.
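To see why in-frame blur bridges those gaps, here's a minimal toy sketch (plain Python with NumPy; the setup and names are my own, not from any real renderer) that simulates an open camera shutter by averaging an object's position over many sub-frame samples:

```python
import numpy as np

def render_frame(width: int, obj_x: float) -> np.ndarray:
    """Toy 1-D 'frame': a single bright pixel at the object's position."""
    frame = np.zeros(width, dtype=np.float32)
    frame[int(obj_x) % width] = 1.0
    return frame

def render_with_blur(width: int, x_start: float, x_end: float,
                     samples: int = 16) -> np.ndarray:
    """Simulate an open shutter by averaging sub-frame positions.

    The object gets smeared along its path within the frame, so consecutive
    frames overlap spatially instead of leaving gaps - which is roughly why
    24 fps film can still read as continuous motion.
    """
    acc = np.zeros(width, dtype=np.float32)
    for i in range(samples):
        t = i / (samples - 1)
        acc += render_frame(width, x_start + t * (x_end - x_start))
    return acc / samples

sharp = render_frame(32, 4)            # object appears only at pixel 4
blurred = render_with_blur(32, 4, 12)  # energy spread from pixel 4 to 12
print(np.nonzero(blurred)[0])          # [ 4  5  6  7  8  9 10 11 12]
```

The sharp version leaves an 8-pixel gap between this frame and the next one; the blurred version fills it, which is exactly what a fast, unblurred LCD presentation doesn't do.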
Real scientific tests have shown that certain individuals can perceive smooth motion benefits in high contrast scenes (e.g. a black cube moving over a white background, or vice versa) well above 120 fps. Many individuals won't perceive a benefit over 60 fps, though, which is why 60 fps is the holy grail at this point.
The other major benefit is reduced latency. At 24 fps each frame comes after a delay of about 42 ms; 30 Hz is 33 ms, 60 Hz ~17 ms, 120 Hz ~8 ms. Reaction times are typically in the range of 200-400 ms, but recognition of a change occurs well before any reaction, so there are reaction/interaction benefits up to 120 fps for a large number of people, and beyond that for a significant number. This is because with a lower framerate, something might change immediately after a frame is shown, but you'd still have to wait the full 42 ms before you could see it - faster refresh rates reduce this latency. There's also the feedback loop between control input and on-screen reaction to that input: even though we're remarkably tolerant of large latencies, lower latencies give a much snappier and more involving experience.
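To make that arithmetic concrete, here's a minimal sketch (plain Python, names are mine) of where those numbers come from. The frame interval is just 1000/fps, and it's also the worst-case latency the display adds, since a change occurring right after a frame is shown must wait a full interval to appear:

```python
# Frame interval doubles as the worst-case latency the refresh rate adds:
# an event just after a frame is presented waits one full interval to show.

def frame_interval_ms(fps: float) -> float:
    """Time between successive frames, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60, 120):
    print(f"{fps:>4} fps -> {frame_interval_ms(fps):5.1f} ms worst-case added latency")

# Output:
#   24 fps ->  41.7 ms worst-case added latency
#   30 fps ->  33.3 ms worst-case added latency
#   60 fps ->  16.7 ms worst-case added latency
#  120 fps ->   8.3 ms worst-case added latency
```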
Another example is those frame-insertion tests, where you take a static scene, suddenly inject a single contrasting frame, and then switch back to the original. It might take you a while to react to its presence, but you definitely saw that rogue frame, and you could probably even say what was on it.
You have to ask why the creators of Wipeout, obviously a fast game, decided that 60 fps was far (far) superior to 30 fps. Or why hacking GPL's pre-renderer to run at 60 Hz instead of 36 Hz allowed people to improve their lap times without any significant change in driving style or car setup (although the FFB rate was improved at the same time, the physics itself was unaltered). Of course, many people claimed there was no difference, which only cements the idea that everybody's different. So, to play it safe, it's generally better to have a higher frame rate than a lower one (unless you want to level the playing field, but then please use motion blur).
Interpolation is actually pointless in that it adds no extra detail. It can smooth out certain movements and obviously choppy framerates, but any reaction benefits are likely to be marginal due to the softened transitions. Depending on the exact source material and method of interpolation, it can also give the motion an unnatural feel. For example, I can't watch TV interpolated to 60 Hz; it somehow looks like it's in fast-forward, and it's awful. I haven't tried interpolation with games.
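As a rough illustration of why interpolation adds no new detail, here's a minimal sketch (plain Python with NumPy, assuming simple linear blending, which is far cruder than the motion-estimation real TVs use - but even those can only rearrange information already present in the source frames):

```python
import numpy as np

def interpolate_frames(frame_a: np.ndarray, frame_b: np.ndarray,
                       t: float) -> np.ndarray:
    """Naive linear blend between two frames at position t in [0, 1].

    Every output pixel is just a weighted average of pixels that already
    exist in the two source frames: no new information is created, and a
    sharp moving edge becomes a soft cross-fade rather than true motion.
    """
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

# Example: turn a 30 fps pair into part of a 60 fps stream by inserting
# one blended frame halfway between them.
a = np.zeros((4, 4), dtype=np.float32)       # dark frame
b = np.full((4, 4), 255.0, dtype=np.float32)  # bright frame
mid = interpolate_frames(a, b, 0.5)           # uniform grey: detail of neither
print(mid[0, 0])  # 127.5
```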
If it "works" for your eyes, though, that's fine.
Also, this is pretty interesting, given its age.