Not exactly, but vendors agree that the rendering changes required to benefit VR would have a positive impact on gaming overall, for basically the same reason: responsiveness. Maybe also something something share price. Who knows.
Well, except that the "zero delay" case they provide is erroneous in the context of end-to-end latency in a computer system, as I already stated. Regardless, from your pre-existing experience, would you notice a change from 20 to 30 fps? What about 10? I'm not citing it to claim that 60 Hz is "superior", only for their finding that frame delay is analogous to naked lag, which you were denying.
The limitations of that test include the fact that it was run at no higher than 60 Hz (probably hardware-limited), so by Nyquist it cannot reproduce information above 30 Hz without aliasing, but its major flaw is its false attribution of "zero lag". Remember I said that typical latency is of the order of 100 ms - that's a 10 Hz equivalent. In other words, the tests in the 10 Hz-plus range are probably unreliable (at least in the context of what they claim to show); however, the small changes in performance over that range imply that relative changes in delay of "only" ~17% are noticeable.
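To make that arithmetic concrete, here's a minimal sketch (the 60 Hz cap, the ~100 ms baseline and the 17% figure are the ones discussed above; nothing here comes from the paper's own data):

```python
# Minimal sketch of the sampling argument: a 60 Hz test display and an
# assumed ~100 ms end-to-end baseline latency, as discussed above.

def nyquist_limit(sample_rate_hz: float) -> float:
    """Highest frequency a sampled system can represent without aliasing."""
    return sample_rate_hz / 2.0

def equivalent_rate_hz(latency_s: float) -> float:
    """Treat a fixed delay as the period of an equivalent update rate."""
    return 1.0 / latency_s

print(nyquist_limit(60.0))        # 30.0 -> a 60 Hz test says nothing above 30 Hz
print(equivalent_rate_hz(0.100))  # 10.0 -> a 100 ms baseline is a 10 Hz "zero"
print(round(0.017 / 0.100, 2))    # 0.17 -> one 60 Hz frame is a ~17% relative change
```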
Modern methods of measuring lag include high-speed photography, under the catchphrase "motion to photons". This kind of analysis has been the game changer for VR this time around, and I'd argue it reframes the discussion regarding latency for ordinary games as well.
(Sorry, I assumed the .pdf was reachable from that link)
cf. all the guides to maximising framerates ahead of LAN parties / competitions. Granted, that is more prevalent for FPSs than sims, but I am arguing from a general perspective - as always. Notice that
@MatskiMonk already stated a preference to prioritise resolution in racing sims, and I think I would seek a balance between definition and response myself. In fact, I always have, but I would prefer to have the maximum of both in every type of action game.
I recall some interesting discussions regarding iRacing, but I'm no longer a member. I did find this, though:
https://steamcommunity.com/app/241560/discussions/0/34096318669939263/
Even at 70 Hz, hitting braking points is not guaranteed - it's recommended to run faster than that. Of course, with the square relationship between speed and stopping distance, I guess every little ounce of precision helps, even with our powers of anticipation (how do you know you hit your intended braking point if it falls between frames? That's important feedback you don't want to lose to aliasing).
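To put rough numbers on that (the speeds and the 1 g deceleration below are illustrative assumptions, not figures from the thread):

```python
# Minimal sketch: track distance covered per frame (d = v * t) and the
# square relationship between speed and stopping distance (d = v^2 / 2a).
# Speeds and the 1 g deceleration are illustrative assumptions.

def metres_per_frame(speed_kmh: float, fps: float) -> float:
    """Distance covered between consecutive frames."""
    return (speed_kmh / 3.6) / fps

def stopping_distance_m(speed_kmh: float, decel_ms2: float = 9.81) -> float:
    """Halve your speed and you quarter your braking distance."""
    v = speed_kmh / 3.6
    return v * v / (2.0 * decel_ms2)

for fps in (30, 70, 144):
    # At 250 km/h: ~2.3 m, ~1.0 m, ~0.5 m of track passes between frames
    print(fps, "Hz:", round(metres_per_frame(250, fps), 2), "m per frame")

print(round(stopping_distance_m(250), 1))  # ~245.8 m to stop from 250 km/h at 1 g
```

So at 70 Hz your braking marker can fall anywhere inside a metre-wide blind spot, in a braking zone where every metre counts.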
Wipeout springs to mind at this point.
If the "consensus" is reached using the same flawed deployment of software (notice the software devs state that low-latency serial hardware is available, but that paper states they used the standard keyboard...), then maybe. I saw some surprisingly and incredibly shoddy science when I was in research - often it was simply a case of people not being aware of something, and all the existing "science" had to be re-done (fun times). Maybe people assumed computers respond instantaneously, because speed of light.
I'm not an expert in the field, I'll admit, and, as I've said, I've seen other figures for tracing actual neuron firing events which also depend on a lot of things. So "it depends" is my stance, as it was from the start.
My point is that it varies according to what information is being perceived by the brain and other organs. Also, much of it isn't so much focused reaction as reflex or other subconscious processes, which have different mechanisms and different timings - so, for example, "pure reaction time" (race starts) or "focused attention / anticipation" (braking points) is maybe not the best basis to use outright for deciding what framerate is ideal.
As above, that 0.017 seconds is noticeable.
Maybe a sprinter does rely on reflex to get out of the gate quickly, rather than on any real cortical process (of the kind that processing the content of sounds usually requires).
What do you suppose would happen if some comedian shouts "bang!" at the Olympics? Aside from them being escorted out, I mean...
It is both to me. Is that really the limit of discussion here? If so, fine: I prefer higher framerates and I think everyone should have access to them!
Aliasing. Also, apparently (and relatedly), simple physics: distance = speed * time. As in the above link, try converting frames per second to metres per frame for some of your favourite speeds.
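If you want to try that conversion with your own favourite speeds, reusing the metres_per_frame helper from the sketch further up (a hypothetical helper of mine, not anything from the link):

```python
# Usage example for the metres_per_frame helper defined earlier.
for speed_kmh in (100, 200, 300):  # substitute your favourite speeds
    print(speed_kmh, "km/h ->", round(metres_per_frame(speed_kmh, 60), 2), "m per frame at 60 Hz")
# 100 km/h -> 0.46 m, 200 km/h -> 0.93 m, 300 km/h -> 1.39 m between frames
```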
Maybe, but why "gamble" at all? Why not target 60 Hz in the first instance? It's the same process for 60 as it is for 30. Doesn't really say anything about whether 60 Hz is beneficial from an interactivity point of view, just shows how it's an industry inertia issue. My favourite.
Well ditto for your statement. Again, is this the level we're going to work on?
As mentioned, no they didn't: they started with a baseline 100 ms delay and only added to it from there. But at least, as I said, it disproves the idea that a stream at 30 fps has no delay, implying it can be improved upon from a responsiveness and controllability point of view. The errors started to increase significantly above 100 ms of added delay because the zero wasn't zero, it was 100. A 100% relative increase is probably a good definition of "significant".
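Spelled out (the 100 ms baseline is the figure discussed above; the added delays are just the kind of increments such a study would sweep, not the study's own values):

```python
# Sketch: added delay as a relative change against an assumed ~100 ms
# baseline "zero". The added values are illustrative, not the study's.
baseline_ms = 100.0
for added_ms in (17.0, 50.0, 100.0, 200.0):
    total = baseline_ms + added_ms
    print(f"+{added_ms:.0f} ms -> {total:.0f} ms total ({added_ms / baseline_ms:.0%} relative)")
# Errors "increase significantly above 100 ms" of added delay, i.e. above
# a 100% relative increase - because the zero wasn't zero.
```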
That's true, and it's commented on a lot in the research as well: we can adapt very well in perceptual terms (probably related to tying together distant events in sight and sound - it takes sound about 100 ms to travel 34 metres). But that doesn't mean we are still operating optimally, and it doesn't mean that reducing the delay doesn't feel better; the consensus, in fact, is that it does. It doesn't necessarily prevent us from enjoying it, either, which is, as I said, part of the art of the video game.
Indeed, as I mention that: I did happen across some discussion of old arcade emulators going to extreme measures to hack into render processes and driver gubbins on modern platforms to better approximate the crisp response of classic arcade games - some of which ran at 60 Hz raw in hardware, probably little more than 17 ms total latency. In the 1970s. John Carmack once told a story from id Software's early days of porting arcade games to PC (DOS) that involved videoing the native machine and recreating its output (as opposed to its internal code), frame by frame, in order to reproduce its feel! No wonder he landed on "motion to photons" analysis.
I don't think the musical instrument analogy is accurate. The notes sound the very moment physics lets them (that's what makes them so appealing), and it is that which your brain keys into through various feedback mechanisms. That is, you latch onto the timing of a particular point within your physical movements (of playing the note) as being the rhythmic "centre" - otherwise all notes would start when you pick the guitar up. Also, interestingly, playing a note does not really involve a reaction as such (except maybe in jazz), but you do subconsciously react to the instrument if, say, your grip shifts or if the controls are in a different position as you go to play them. You feel where they are before you play them and adjust your input accordingly. That would be analogous to my point about there being effectively hidden feedback involved.
Research shows that most people notice a delay down to 0.02 seconds*, even on a flat screen with the bare minimum of interactivity: a mouse cursor. That's a little more than one frame at 60 Hz, which doesn't leave much room for I/O and driver latency. So to make room for it and still hit that 20 ms target (well, 50 realistically today), you minimise frame delay by maximising framerate - simple. A rough budget is sketched after the footnote.
* which is to say, most people notice an improvement in responsiveness up to that point; some go beyond it. That change in qualitative terminology (delay -> response) probably reflects a change in the (effectiveness of certain) mode(s) of perception once you reduce lag beyond a certain point (100 ms?), which may be related to what I was talking about with my "it depends" comments.
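To put rough numbers on that budget, a sketch (only the frame times are exact; the input, driver and display figures are illustrative assumptions - real values vary widely):

```python
# Rough latency-budget sketch. Only the frame times are exact; the
# other figures are illustrative assumptions.
INPUT_POLL_MS = 4.0   # assumed ~250 Hz controller/keyboard polling
DRIVER_OS_MS = 2.0    # assumed driver/OS overhead
DISPLAY_MS = 3.0      # assumed scanout/pixel response

def end_to_end_ms(fps: float) -> float:
    """I/O and display overheads plus one frame of render delay."""
    return INPUT_POLL_MS + DRIVER_OS_MS + DISPLAY_MS + 1000.0 / fps

print(round(end_to_end_ms(30), 1))   # ~42.3 ms: well over a 20 ms target
print(round(end_to_end_ms(60), 1))   # ~25.7 ms: still over budget
print(round(end_to_end_ms(144), 1))  # ~15.9 ms: under 20 ms, with headroom
```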