But you would play a game at 12 Hz?
It's funny how you have to keep going back to 10 fps, while the discussion is about 30 vs 60, nothing else. And it doesn't matter how much latency you add on top of the cake, because it's still only a difference of 0.017 seconds.
No, the bigger picture is not about 30 Hz vs 60 Hz at all. It's about latency / responsiveness, pure and simple. That was my contribution from the start (go look).
I advocate that end-to-end latency should be minimised by all means available, including framerate. I personally wish for 120 Hz, in combination with a reduction in OS, FS, driver and hardware-interface overheads, to approach the magic figure of (preferably well) under 20 milliseconds, not the 100+ it currently sits at (for PC hardware; many modern-ish TVs are much, much worse).
You keep mentioning 10 Hz, and I've already explained why that is significant: 100 ms. If your latency is a 100 millisecond baseline plus the frame delay, then reducing the frame delay by less than this amount will be less and less noticeable each time (but still noticeable, as you agree: it's "smoother"). Going from 2 Hz to 10 Hz cuts latency by 400 milliseconds (600 ms down to 200 ms), a 67% improvement. Going from 30 Hz to 60 Hz takes you from 133 ms to 117 ms, only a 12% improvement. That is why total latency should be your focus, not a naïve half-understanding of "framerates", "reaction times" and "smoothness".
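To make that arithmetic concrete, here is a quick back-of-the-envelope sketch. The flat 100 ms baseline and the "framerate adds exactly one frame period" model are simplifying assumptions for illustration, not measurements of any particular setup:

```python
# Illustrative only: assumes a fixed 100 ms non-framerate baseline and that
# the framerate contributes one frame period to end-to-end latency.
def total_latency_ms(fps, baseline_ms=100.0):
    return baseline_ms + 1000.0 / fps

for old_fps, new_fps in [(2, 10), (30, 60)]:
    before = total_latency_ms(old_fps)
    after = total_latency_ms(new_fps)
    saved = before - after
    print(f"{old_fps:>2} -> {new_fps:>2} Hz: {before:.0f} ms -> {after:.0f} ms "
          f"({saved:.0f} ms saved, {100 * saved / before:.0f}% less)")
```

The bigger the non-framerate baseline, the smaller the relative gain from any given framerate bump, which is exactly why the baseline is the thing to attack.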
Your original statement that a 250 ms "reaction time" means individual frames at 30 or 60 fps are negligible is erroneous on cognitive grounds, but you retracted that anyway.
Your statement that 30 fps effectively has no delay because it is "smooth" was refuted by the source we discussed in detail. That source was never intended to demonstrate the "superiority" of 60 Hz over 30 Hz (sorry), not least because the information in the test is capped at approximately 3 Hz. But cursor position aliasing still breaks feedback (which is why people like the responsiveness of 120+ Hz displays on PCs, despite that having minimal impact on a 100 ms end-to-end latency figure).
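As a rough illustration of that aliasing: a moving cursor can only be drawn at discrete sample points, so the lower the refresh rate, the further each displayed position jumps between frames, regardless of the total latency figure. A toy calculation (the 1000 px/s cursor speed is an arbitrary assumption):

```python
# Toy illustration: worst-case gap between successive displayed cursor
# positions for a cursor moving at constant speed. Not a model of any
# particular OS or game.
cursor_speed_px_per_s = 1000

for hz in (30, 60, 120, 240):
    step_px = cursor_speed_px_per_s / hz
    print(f"{hz:>3} Hz: cursor jumps ~{step_px:.0f} px between frames")
```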
Furthermore, I provided a discussion on framerates in iRacing (here), where triple-digit framerates are desired, or rates as high as can be locked in (effectively targeting a minimum framerate through graphics settings).
In that link was also a reference to a 2011 book stating that 30-60 fps is a minimum, with higher rates being "desirable" (though that chapter had already cited the great expense of the necessary computing hardware, given the complexity of the simulations, as a barrier even to "interactive" framerates). This is in the context of road-network engineering visualisation and of medical and psychological research using driving simulations (e.g. to inform tuition), which generally place much lower demands on reactions and cognitive function than competitive racing, at least on a moment-to-moment basis (they are fun, though).
In that same link is an explanation of how they disable frame buffering by default (not at all common) in order to reduce latency, despite the fact that it will inevitably increase framerate variation (see: frame buffering). They also advise the use of a framerate cap (it defaults to 84, probably a factor of the physics rate - because aliasing!) for this very reason.
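For a sense of scale, here is a rough sketch of why buffered frames dominate render-side latency, and hence why disabling buffering buys more than the raw frame period would suggest. The "each frame in flight adds one frame period" model is a simplification for illustration, not a description of iRacing's actual pipeline:

```python
# Simplified model: render-side latency is roughly one frame period per frame
# in flight (the frame being displayed plus any pre-rendered/buffered frames).
# Ignores vsync phase, compositor behaviour, etc.
def render_latency_ms(fps, buffered_frames):
    frame_ms = 1000.0 / fps
    return frame_ms * (1 + buffered_frames)

for fps in (30, 60, 84, 120):
    unbuffered = render_latency_ms(fps, buffered_frames=0)
    buffered = render_latency_ms(fps, buffered_frames=2)
    print(f"{fps:>3} fps: ~{unbuffered:4.1f} ms unbuffered "
          f"vs ~{buffered:5.1f} ms with 2 buffered frames")
```

On that model, two buffered frames at 60 fps cost around 33 ms - more than the entire 17 ms frame-period difference between 30 and 60 Hz.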
I think you need to spend some time understanding the real implications of end-to-end latency, and realise that 0.017 seconds is a misleading figure by at least a factor of six for most modern setups. You cannot substantiate your assertion that any reduction from 100+ milliseconds / 0.1+ seconds makes no difference, especially when the industry disagrees with you (a 20 millisecond target, and some could go further). It may be a less noticeable difference, especially in the case of TVs, but that is currently a hardware-dependent problem, and we are talking about the general case.
On top of this, gaming latency was not an issue in the classic days of consoles and arcades. It has only become an issue with complex software, including that embedded in hardware such as TVs. Games have, as a result, become less responsive - as any MAME head will tell you. This can be remedied with hardware and software working together (ironically by becoming more complex still), and the waters don't need muddying with half-truths and no understanding. Once the other overheads are reduced, I bet higher framerates will have a much bigger impact on people's experiences, too.