That seems to be the reason why you think it won't be able to do it. If weaker GPUs on PC can play games like iRacing at resolutions higher than 4K Ultra HD at over 60FPS, I don't see any reason why console developers can't.
My first laptop was a Dell XPS M1710. Got it on closeout when Dell was bringing in the M1730. 2.26GHz Core 2 Duo, GeForce Go 7950 GTX, 4GB of RAM, 128GB SSD. That thing could play the hell out of Quake 3 Arena. Max settings, 16x anisotropic filtering forced on, 4xAA forced on, 1920x1200 resolution.
Never dipped below 90fps.
Strangely, it struggled to accomplish the same with Half-Life 2: Episode One. Even stranger is that you're offering an alleged, as-vague-as-possible (and unsourced, of course) benchmark for a game that even its most ardent players wouldn't have called particularly pretty when it was new (in 2008) as proof; but when someone points out the requirements for a much more recent and graphically intensive title, like here:
A German gaming magazine ran Skyrim at 4K. Doing so required a Radeon 7970 Toxic, which itself has more processing power than the PS4 as a whole system. Skyrim used 3.5GB of VRAM, by the way.
And here (and countless times in the past with other recent-ish PC games, if I recall correctly):
I'm also highly skeptical of the 750 Ti running Sleeping Dogs at 60fps/4K. Most reviews I've read have the card at ~50fps at 1080p.
And you just ignore it (and in the latter, go out of your way to not talk about it when quoting said post).
or is it "because PS4" nonsense you think it can't?
Nope. I think it can't without toning down graphical details to the point that it no longer looks like a PS4 game, and I think that because the closest possible comparisons I've seen have shown that it shouldn't be able to. "Because PS4" is still nonsense because it doesn't mean anything. It doesn't mean anything no matter what thread you post it in, what evidence to the contrary you ignore, what discussion you walk out on when it doesn't go your way, or how you reword it. If PD manage to get some sort of trick 4K mode going for a PS4 title, then great. Some guy on a forum who has been acting like it was a done deal for literally years (before GT6 or the PS4 were even announced), yet never actually bothered explaining why, still isn't vindicated.
Let's put it another way:
The PS4, as a system, manages to push about 1.8 TFLOPS of processing power. A stock GTX 680 alone pushes 3.1 TFLOPS, and that's excluding what the CPU contributes and anything you gain from overclocking. All in all, my rig is likely capable of delivering twice the processing power of the PS4, if not more. And my computer can't hope to run recent games in 4K.
The GTX 980, which is far closer to being a 4K-ready GPU, pushes 5 TFLOPS of processing power. That's roughly three times what the PS4 is capable of, on the GPU alone. You need a GPU three times as powerful as the PS4 just to start tinkering with recent games in 4K, and you're really asking why 4K gaming on the PS4 is nonsense?
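For anyone who wants to sanity-check the arithmetic, here's a rough sketch using the ballpark figures above (exact TFLOPS numbers shift with clock speeds, so treat them as approximations, not gospel):

```python
# Rough ratios based on the approximate peak figures quoted in this post.
ps4 = 1.8      # TFLOPS, PS4 (system figure used above)
gtx_680 = 3.1  # TFLOPS, stock GTX 680
gtx_980 = 5.0  # TFLOPS, GTX 980 (figure used above)

print(f"GTX 680 vs PS4: {gtx_680 / ps4:.1f}x")  # ~1.7x
print(f"GTX 980 vs PS4: {gtx_980 / ps4:.1f}x")  # ~2.8x

# 4K is four times the pixels of 1080p, which is why even a ~3x gap in raw
# throughput leaves no headroom for matching detail and framerate.
print(f"4K vs 1080p pixels: {(3840 * 2160) / (1920 * 1080):.0f}x")  # 4x
```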
/edit:
Let's make this a bit simpler, shall we? Sony's selling 4K TVs. They're supporting the format. Why on earth would they limit the PS4 to 1080p if it was capable of 4K? And if the PS4 was capable of 4K at 30FPS, why are developers struggling to keep stable framerates at 1080p? 1080p/60 should be a cakewalk - if the PS4 was anywhere near as powerful as you seem to believe.
Shows you how inaccurate you can be then...
Except, well, I'm not the one who keeps bringing up how the PS4
definitely has the capability to do something just because I believe that it can do it. I'm also not the one pointing to some other time where I was ultimately vindicated about something I repeatedly refused to defend with any tangible reasoning, as if that means anything for this time, where I'm doing the same thing on a completely separate topic.
Other than all that, well, yeah.
*zing*
you can already get 4K Ultra HD streaming content from a few services such as Netflix and Amazon Instant Video.
Bitrate-throttled (HEVC or not) internet video, on a TV that people typically won't sit close enough to anyway, sure sounds like the thing to definitely sell people on the benefits, if an upscaled 1080p image doesn't. That's assuming, of course, that the TV you buy now supports the functionality required to do even that much.
Also, it seems it will be the European standard, going by this:
Link .
Going by that, it's an approved standard and a roadmap that hasn't actually been fully finalized yet. Now it seems they're moving on to testing. And even then:
While great news for 4K enthusiasts, there are some caveats to all this excitement. The main one being that the new specification is completely incompatible with any of today's TV tuners, even the ones inside brand new 4K TVs with HEVC decoding built-in.
Um...
And that's Europe. In the US, with the FCC's broadcast spectrum sell-off, there are currently doubts about whether UHD will ever be able to be broadcast OTA; but even if it can,
ATSC 3.0 still hasn't even been written yet.
"Economies of sale" by itself doesn't mean anything. Are TV manufacturers currently operating at capacity with their 4K screen production? If not, shutting down 1080p production isn't going to make things any better. Is the adoption rate for 4K television currently high enough that attempting to force tan increase in it wouldn't just make the TVs more expensive when sales drop? Has the yield rate on 4K TVs increased enough that shuttering 1080p production wouldn't lead to needlessly throwing money down the toilet?
On top of that, you tried to double down and imply that because things like crappy edge-lit Chinese TVs from two years ago are cheap, modern panels from manufacturers people have actually heard of must also be potentially cheap. It's the same fallacy that was around in 2006 when the PS3 came out, when people wondered why their 1080p Westinghouse HDTVs had terrible picture quality compared to the Sony and Panasonic 720p TVs their friends got.
It likely stays at an effective 120FPS in time trial mode on a course generator track, about an effective 100FPS on a track like the Nurburgring, and the worst case scenario can be about an effective 36FPS, so I don't see where you're getting the information that it can drop from 120FPS to 20FPS in the next second. Do you have a source for that?
A few things. First of all, I was referring to a 20FPS framerate as an actual framerate. Not an "effective" one. My apologies for the confusion.
Second:
It likely stays at an effective 120FPS in time trial mode on a course generator track
Is it not...
supposed to be obvious that you're just lifting what you're saying, once again with no source and in some places almost word for word, from the infamous Eurogamer article?:
Chances are that with a track created with the course maker, in concert with a time trial (to eliminate the other cars), we could see this rise to 60FPS - an effective 120FPS throughput!
Third, it's rather notable that your word choice went from something so definitive:
To something so hypothetical:
As soon as you were questioned about it. Probably because the actual article only went so far as to say:
Meaning a theoretical best case scenario in performance. Certainly still impressive, but far and away different from the original implication that it was indicative of GT5's performance in the context of what NLxAROSA originally said about 4K and 60FPS. It's kinda like how you went from this:
Think it will only make sense for them to do 4K Ultra HD replays and allow photomode use with that resolution.
To this:
I wouldn't be even surprised now if PD do 3840x2160 @ 60Hz gameplay.
Fourth, your copying of the testing parameters and results from the Eurogamer review does a very good job of proving my point, which was that GT5 had fantastic performance under the right, very specific circumstances, but that those had nothing to do with what you would typically experience with the game when you weren't specifically looking for the best framerate. And, yes, as the actual videos in the Eurogamer review show, that includes absolutely humongous, 30+ FPS swings in just regular old gameplay.
Lastly:
Do you have a source for that?
Why don't you just look at the website that you got the above from in the first place?
That's the joke, yeah. Not exactly a new one either.
Looks like someone from LG is saying it's only 60Hz:
Link. Couldn't find anything about it in the manual.
Wal-Mart says it is 120Hz.
Target says it is 120Hz.
Amazon says it is 120Hz.
CNET says it is 120Hz.
Best Buy says it is 120Hz.
Sears says it is 120Hz. I suppose it's still possible that it isn't 120Hz, but I hope that guy on the forum hurries up and alerts all those companies before they get sued.
My comparison is more like-for-like; both have the same number of HDMI ports and both are IPS panels. You also get 2" more display size.
Inputs: 3 HDMI 1.4, 4 USB 2.0, 1 RF, 1 Component, 1 Composite, 1 Digital Audio Out (optical), 1 PC Audio, 1 LAN, 1 Headphone
Inputs: 3 HDMI 1.4, 3 USB 2.0, 1 RF, 1 Component, 1 Composite, 1 Digital Audio Out (optical), 1 PC Audio, 1 LAN
Inputs: 3 HDMI 1.4, 3 USB 2.0, 1 RF, 1 Component, 1 Composite, 1 Digital Audio Out (optical), 1 PC Audio, 1 LAN, 1 Headphone
I'm not even sure which TV is which. And even if the 120Hz television is the closer one, you're still arguing that a 30+% higher price to be an early adopter of a format with barely any content (and where regular content of the kind 1080p has now isn't even out of the planning stages yet) is only a "bit" more.
If you compare, say, budget TVs, then something like this 55" 4K Ultra HD TV for $699.99 or this 39" 4K Ultra HD TV for $339.99 should likely close the gap with comparable budget 1080p TVs.
And would almost certainly be worse than buying a mid-range 1080p TV for around the same $/inch. Possibly even worse than just buying the crappy 1080p sets.
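Just to put rough numbers on that, here's a quick $/inch sketch using the two prices quoted above; the mid-range 1080p set and its price are a hypothetical stand-in for contrast, not a real listing:

```python
# Quick $/diagonal-inch comparison. The 4K prices are the ones quoted above;
# the 1080p entry is a hypothetical mid-range set used purely for contrast.
sets = {
    '55" budget 4K': 699.99 / 55,
    '39" budget 4K': 339.99 / 39,
    '50" mid-range 1080p (hypothetical $500)': 500.00 / 50,
}
for name, per_inch in sets.items():
    print(f'{name}: ${per_inch:.2f} per diagonal inch')
# 55" 4K   -> ~$12.73/inch
# 39" 4K   -> ~$8.72/inch
# 50" 1080p -> $10.00/inch, i.e. the budget 4K sets aren't automatically the
# better buy on size alone once panel quality enters the picture.
```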
The shop sells more than three televisions; they're just the same-size ones I could see at the time.
That... really doesn't
change anything.