Sorry if it sounds overly pragmatic. I can enjoy things on a conceptual level too. In this particular case I’m skeptical; pushing a tech demo to the point it’s not even demo-able sure raises some concerns. I’m genuinely interested in HDR and the workflow PD used for the matter. I can envision how some tools may have allowed more accurate capturing and computing/rendering of spectral data, but with the end product ultimately being about visuals, it leaves me wondering how final grading decisions were, or could have been, made. With no practical tool at hand you’d have to rely on abstract models or guesstimates.
As for other developers, if the nVidia white papers on HDR are anything to go by (selling that many GPUs should at least get their voice heard, I imagine), then sure, more conservative takes on HDR may become more popular: they seemingly suggest sticking to sRGB/Rec.709 primaries and extending the luminance range to a peak of about one thousand nits. That’s potentially more in line with the capabilities of the current generation of SUHD televisions, and probably easier to trim-pass for the majority of existing HD displays out there.
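Just to make that conservative recommendation concrete, here's a rough sketch of what it could look like in an HDR10-style (PQ / SMPTE ST 2084) output path: keep the Rec.709 primaries as they are, cap luminance at a ~1000-nit peak, then PQ-encode. The function names and the clamp-to-peak approach are mine for illustration, not anything lifted from the white papers themselves.

```python
import numpy as np

# PQ (SMPTE ST 2084) constants; PQ encodes absolute luminance up to 10,000 nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Inverse EOTF: absolute luminance (nits) -> PQ signal in [0, 1]."""
    y = np.clip(np.asarray(nits, dtype=np.float64), 0.0, 10000.0) / 10000.0
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def conservative_hdr10_output(scene_nits, peak_nits=1000.0):
    """The 'conservative' take: leave the sRGB/Rec.709 primaries alone and
    simply cap the luminance range at a ~1000-nit peak before PQ encoding."""
    return pq_encode(np.minimum(scene_nits, peak_nits))

# e.g. diffuse white at 100 nits and a specular highlight at 5,000 nits:
print(conservative_hdr10_output(np.array([100.0, 5000.0])))
# the 5,000-nit highlight is simply clipped to the 1,000-nit peak
```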
Pragmatism is always welcome, but I wonder what the extra colour gamut actually costs with respect to the game itself.
From the Ars Technica interview it seems that Kaz was primarily targeting photo mode with the 10 000 nit range, so in essence it is purely for the purposes of expression - always welcome. In that sense, this is really no different from the HDR implementation seen in GT's photo mode to date, even though HDR consumer displays were not yet on the horizon at the time.
I highly doubt that nVidia still thinks 1000 nits is enough now (and bear in mind the nVidia white papers are often "guest-authored" - it's just a platform). That figure would be the compromise I alluded to regarding the loss of contrast and range during the simulated "accommodation" that HDR renderers typically use for "SDR" displays - and that compromise will not be the same for the "average" HDR display in the near future.
In fact, and as I suspected, the current standard (1000 nits), assuming it is such, is nowhere near enough to take advantage of HDR displays going forwards and would, as you say, be "un-demo-able"! The features of HDR rendering as we have come to know them will be absent on most such displays, because the display's output range would exceed the renderer's internal provision (assuming the "standard" is accurate).
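For anyone unfamiliar with the "accommodation" compromise being referred to, here's a very rough sketch of the idea: auto-expose around the scene average, then squash the result through a tone curve so it fits a ~100-nit SDR display. The numbers and the Reinhard curve here are placeholders of my own, not PD's or anyone's actual pipeline.

```python
import numpy as np

def sdr_accommodation(scene_nits, exposure_key=0.18, avg_nits=200.0, sdr_white=100.0):
    """A rough stand-in for the 'accommodation' step HDR renderers use for SDR:
    auto-expose around the average scene luminance, then squash what's left
    through a Reinhard-style curve so it fits a ~100-nit display."""
    exposed = scene_nits * (exposure_key / avg_nits)   # normalise to a mid-grey key
    tonemapped = exposed / (1.0 + exposed)             # Reinhard: compresses highlights hard
    return tonemapped * sdr_white                      # display-referred, roughly 0..100 nits

scene = np.array([10.0, 200.0, 2000.0, 10000.0])       # deep shadow .. bright highlight
print(sdr_accommodation(scene))
# a 5x jump in scene luminance (2,000 -> 10,000 nits) lands at only ~64 -> ~90 nits
# on the display: that's the highlight contrast and range the compromise throws away
```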
In other words, these recommendations are out of date, people have work to do to synergise the software and hardware, and PD would surely serve as an example once again. (Whether that's had an impact on the game is another matter altogether, but their example is good for the technology outright.)
The noise PD (Sony) are making about this is clearly intended to invigorate interest in the concept, and the "excess" range will bring benefits to the photo mode at the very least and won't look too shabby in-game, either - win/win. We've come a long way since the Lost Coast demo.
In short: Rendering at 1000 nits (range) to display at 100 nits (the current "standard") is no less "excessive" than rendering at 10 000 nits to display at 1000 - 4000 nits.
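Putting numbers on that "excess" (using the peaks discussed above, nothing more):

```python
import math

# Headroom between the internal rendering range and what the display can show,
# as a linear ratio and in stops (log2).
cases = {
    "SDR today (render 1,000 -> display 100 nits)":    (1000, 100),
    "GTS / HDR (render 10,000 -> display 1,000 nits)":  (10000, 1000),
    "GTS / HDR (render 10,000 -> display 4,000 nits)":  (10000, 4000),
}
for label, (render_peak, display_peak) in cases.items():
    ratio = render_peak / display_peak
    print(f"{label}: {ratio:4.1f}x headroom ({math.log2(ratio):.1f} stops)")
# 10x / 10x / 2.5x -- rendering well beyond the display is nothing new,
# and a 4,000-nit display actually shrinks the excess.
```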
Since we can expect the hardware standard (thanks to video formats) to be closer to 4000 nits, the current SDR situation is technically even more excessive than what PD have done for GTS, and has been for some time. Because HDR rendering to date has brought more interesting and immersive visuals into wider appreciation, and HDR displays only increase that potential, I think we can conclude that the "excess" is in fact sufficient.