4K Gaming: A Sign of the Future, or Not Entirely Possible?

I just have no idea how people think Sony can make a console for $400 that outperforms PC's with TWO graphics cards worth twice as much as a PS4.

Then you clearly haven't read the whole of Teh Internetz :D

For my own two penn'orth, and as I said way earlier in the thread, I don't think the next-gen consoles (now next-gen is this-gen) will even really be consoles, they'll be control-in-video-out interfaces for cloud gaming. That suits everybody. Except gamers, possibly, but our only function from the game companies' perspective is to put the money in.
 
Cloud gaming is something I'm very sceptical about. Bandwidth and broadband internet access haven't been developing as fast over recent years as hardware/processing power has... Dunno, lag-free cloud gaming seems like a big task to tackle.
 
Cloud gaming is something I'm very sceptical about. Bandwidth and broadband internet access haven't been developing as fast over recent years as hardware/processing power has... Dunno, lag-free cloud gaming seems like a big task to tackle.

That's a client perspective though, once enough clients have a good enough (on average) connection I think games companies will find the figures quite palatable. Bear in mind that your "console" would be a pure portal device, no other thinking going on, the real limit would be video download speed. Control out (upstream) would be mere fractions of kbps.

There's a great advantage to sell to gamers; let's say we were racing on Spa. Currently we each simulate Spa. Why not simulate it in a central session with one track rather than (lobby * n) versions of it? We all see the same tyre marks, the same deformed barriers, the same weather and lighting... but we don't have to process a single line of code.

What if we were playing Battlefield all in the same model? I'd shoot a window out and you'd see that with no processing disadvantage to yourself.
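The upstream/downstream asymmetry described above can be sanity-checked with rough numbers. This is a minimal sketch; the packet size, tick rate, and video bitrate are illustrative assumptions, not measured figures:

```python
# Back-of-envelope bandwidth asymmetry for a "portal" cloud console.
# Packet size, tick rate, and video bitrate below are assumptions.

def upstream_kbps(packet_bytes=16, rate_hz=60):
    """Controller state sent upstream: tiny packets at a fixed tick rate."""
    return packet_bytes * 8 * rate_hz / 1000  # kilobits per second

def downstream_mbps(video_mbps=8):
    """A compressed 1080p60 video stream; 8 Mbps is a rough assumed figure."""
    return video_mbps

up = upstream_kbps()      # ~7.7 kbps
down = downstream_mbps()  # 8 Mbps
print(f"upstream ~{up:.1f} kbps, downstream ~{down} Mbps")
print(f"downstream is roughly {down * 1000 / up:.0f}x the upstream")
```

Under these assumptions the downstream video is about a thousand times heavier than the control traffic, which is why "video download speed" would indeed be the real limit.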
 
The only truly exciting thing about 4K may be that it will be one step closer to some sort of eye-resolution ceiling. Once we can also hit the fps ceiling, developer genius moments will be all the more treasured, as there won't be those fall-back advancements on offer. There won't be that inevitability of a hardware boost to rely on.

Sure, I love panning an amazingly detailed landscape, but the way that Metal Gear Solid 2 scared the hell out of me when the colonel was screaming "TURN THE GAME OFF!!!", knowing that Hideo might have been just crazy enough to program in something that did require that?..... Utterly irreplaceable. That's not just storytelling, it's also down to knowing that you're dealing with a creator who asks "Ah, so what if we could....?"

I'm of the Commodore 64 era and still marvel at the technical wizardry and genius moments that went on back then. Dragon Breed with its simple yet magical and inspired realisation of more than 16 pre-defined colours, the incredible stuff that people pulled out of the SID chip in the sound department. When developers realise that they can't just keep on marching forward, they look left, right, up, down, inside themselves, inside others, through night-vision goggles, infrared goggles, through their sense of smell, touch, hearing... anything to find something unique and powerful.

So, bring on 16K or whatever it takes, 240fps (I think that's around what it takes), and we can get on to something far more interesting, and ask our developers to innovate rather than just handing them a stream of hardware progress that masks their lateral cognitive deficiencies.
 
we'll still be having this debate when we get to 128K VR systems :)

Noooooo, you mean when we're on the holodeck surely? :drool:

Sony's marketing department's gotta be one of the best in the world, really.

The number of people who believe that the PS4 can do 4K and outperform computers with multiple times the processing power because of [reasons] is astonishing.
It's not down to Sony's marketing; they just target their marketing because they're aware that there are plenty of idiots out there who will spout anything that comes into their minds. Sony's marketing just happens to put it there (in their minds) :sly:
 
We have never jumped as high as going from 1080 to 2160 in one step in resolution. We went from 480 to 600 to 720 to 1080.
Ah. You're talking TVs then, not computers and their monitors.
 
I just have no idea how people think Sony can make a console for $400 that outperforms PC's with TWO graphics cards each worth twice as much as a PS4.

Sony licensed AMD's IP, so comparing the price to a retail GPU is misleading, IMO. Looking at the PS4's die, it looks like they stuffed all they could onto it (a limitation of 28nm). If the PS4's SoC were made on a 20nm node with a reasonable cost per transistor, imagine how many more GCN cores they could fit on the die, along with the power savings (4K for days). Pretty sure it could still be sold in the $400-500 range under those circumstances too.
 
Ah. You're talking TVs then, not computers and their monitors.
I'm talking monitors. My first one was a 480 in about 1981, and it was a TV, because at home most people did use their TVs back then. After that it slowly progressed to 1080 being what most people used. Even if it was a TV, a TV and a monitor at the same resolution aren't any different as far as the power needed is concerned.
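The size of each step can be quantified. Taking one common full-frame mode as representative of each label (an assumption; the exact historical modes varied), the pixel-count ratios come out as:

```python
# Pixel-count ratio at each resolution step, to put "1080 to 4K" in
# context. The modes chosen for each label are representative picks.
steps = [
    ("480p", 640, 480),
    ("720p", 1280, 720),
    ("1080p", 1920, 1080),
    ("2160p (UHD)", 3840, 2160),
]

prev = None
for name, w, h in steps:
    px = w * h
    if prev:
        print(f"{prev[0]} -> {name}: {px / prev[1]:.2f}x the pixels")
    prev = (name, px)
```

Under these picks, 720p to 1080p was a 2.25x jump in pixels, while 1080p to UHD is a full 4x, the biggest single step in the list.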

All I know is 4k is extremely hard to push at any decent settings on any new game.

Here is a benchmark link showing what a 980 does when you max out some games at 4k.

http://www.pcgamer.com/nvidia-gtx-9...single-gpu-benchmarks-and-impressions/#page-2

It is only hitting 30 fps in maybe one game on this list, and none of these games are the latest and greatest. Even 980s in SLI aren't getting 60fps at 4K.

The way these benchmarks look, I don't see 4K catching on in a big way, with people actually running games at 4K, any time soon.
 
I am seriously questioning what certain people have been smoking. No, serious 4K gaming is not possible on a PS4, just like it won't be possible on a PC with a Core i3 and 4GB of RAM after sticking in a budget GPU that merely allows 4K output.

Dunno, lag-free cloud gaming seems like a big task to tackle.
There's only so much you can do about latency, since we're already pretty much at the level where latency is bound by the laws of physics (speed of light, number of hops) and not by inefficient/slow processing of individual devices. The only way out here is a complete overhaul of internet infrastructure (from the major backbones all the way down to the local cable running into your house and your equipment). And even then you still won't be able to eliminate it.

Which is why technology is being developed to anticipate/predict a gamer's likely next action/move, rather than optimizing latency further. I'm not sure I'm confident this will lead to desirable results, though, especially when it comes to simulators/realism. Then again, realtime big-data analysis is already here, so I'm not saying it's not going to work in the long run. :D
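The physics floor mentioned above is easy to put numbers on. A minimal sketch, assuming signals in optical fibre travel at roughly two-thirds the vacuum speed of light (about 200,000 km/s); routing hops, encoding, and display only add on top of this:

```python
# Lower bound on round-trip latency set by propagation speed in fibre.
C_FIBRE_KM_S = 200_000  # approximate signal speed in optical fibre, km/s

def min_rtt_ms(distance_km):
    """Best-case round-trip time to a server distance_km away, in ms."""
    return 2 * distance_km / C_FIBRE_KM_S * 1000

for d in (100, 1000, 5000):
    print(f"{d:>5} km server: >= {min_rtt_ms(d):.1f} ms RTT before any processing")
```

So a server 1,000 km away costs at least 10 ms round-trip before a single frame is encoded, which is why no amount of faster hardware on either end can make distant cloud gaming feel local.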
 
I'm talking monitors.
Then why are you using TV numbers like 1080, 720 etc instead of monitor numbers like 320x200, 640x480, 1920x1080, etc?

My first one was a 480 in about 1981 and was a TV because at home most people did use their TVs back then.
I've owned computers since the late 1970's and never connected a TV to a computer until just a few years ago. I'll concede, though, that one of my early computers came with a monitor that was basically a TV with the tuner circuitry removed.

Aside from that, yes there was a very popular computer on the market back then that used a standard TV for a monitor. But the resolution on those was abysmal even by the standards of the day.
 
Then why are you using TV numbers like 1080, 720 etc instead of monitor numbers like 320x200, 640x480, 1920x1080, etc?


I've owned computers since the late 1970's and never connected a TV to a computer until just a few years ago. I'll concede, though, that one of my early computers came with a monitor that was basically a TV with the tuner circuitry removed.

Aside from that, yes there was a very popular computer on the market back then that used a standard TV for a monitor. But the resolution on those was abysmal even by the standards of the day.

He's using 'monitor' in the generic sense: any display connected to a computer gets called a monitor, even when it's really a TV, whereas the technical use of the word refers to what you and I know. BTW, the PC only became available in late 1981, and it didn't do graphics anyway, only green text at 80x25; 640x480 only came in around '88/'89(?) with the coming of VGA cards. My first computer was the TI-99/4A, on which I learned to program. Fantastic little machine. :D
 
Then why are you using TV numbers like 1080, 720 etc instead of monitor numbers like 320x200, 640x480, 1920x1080, etc?


I've owned computers since the late 1970's and never connected a TV to a computer until just a few years ago. I'll concede, though, that one of my early computers came with a monitor that was basically a TV with the tuner circuitry removed.

Aside from that, yes there was a very popular computer on the market back then that used a standard TV for a monitor. But the resolution on those was abysmal even by the standards of the day.
I used a TV for my Commodore 64, then used a monitor from then on. I apparently screwed up the res numbers; I was just giving examples. I wasn't sure exactly what they were, but I know they're all close to what a PC monitor has been. All I'm saying is we've never made as big a jump as we are making now, from 1080 to 4K.

Anyway, what did you think of the 4k benchmarks from the link I posted?

I guess my answer to the original post is: yes, 4K is the next step in gaming resolution, but it will happen when the next consoles are released. It's still a ways off.
 
I guess my answer to the original post is: yes, 4K is the next step in gaming resolution, but it will happen when the next consoles are released. It's still a ways off.

How peculiar, 4K cards are available today... as are the monitors. They're Ultra HD rather than 4K, but it's close enough :)

Why do you think that consoles are a driver in the development?
 
Because PC gaming has been a comparatively niche market for the past 15 years. PC developers have been complaining about the PS360's popularity holding PC development back since... what, 2009?
 
Because PC gaming has been a comparatively niche market for the past 15 years. PC developers have been complaining about the PS360 holding PC development back since... what, 2009?

Since forever, really, it took the PS to show IBM where Deep Blue had gone wrong.
 
How peculiar, 4K cards are available today... as are the monitors. They're Ultra HD rather than 4K, but it's close enough :)

Why do you think that consoles are a driver in the development?
I don't think anyone looked at the 4k 980 benchmarks I posted. http://www.pcgamer.com/nvidia-gtx-9...single-gpu-benchmarks-and-impressions/#page-2

That is what a 4k card does at 4k at high settings with decent games.

Consoles really always drive the gaming industry, and the consoles today have no way to ever reach 4K for gaming. Once a console can push a 4K game, then it'll be mainstream.

Just look at the link I posted. GPUs are a long way from running 4K games.
 
I don't think anyone looked at the 4k 980 benchmarks I posted. http://www.pcgamer.com/nvidia-gtx-9...single-gpu-benchmarks-and-impressions/#page-2

That is what a 4k card does at 4k at high settings with decent games.

Consoles really always drive the gaming industry, and the consoles today have no way to ever reach 4K for gaming. Once a console can push a 4K game, then it'll be mainstream.

Just look at the link I posted. GPUs are a long way from running 4K games.

They're all Ultra HD settings; 4K proper is 4096 pixels per line, while these are just 4xHD at 3840 pixels per line... but the article points out that other cards with more overclock will be tested.

Nowhere in that article did it say that a new console would need to be released for the performance to increase... and that's what you were leading us towards.
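The 4K-versus-Ultra-HD distinction above is just arithmetic on the standard frame sizes:

```python
# "4K" vs Ultra HD pixel counts: DCI 4K (the cinema standard) is 4096
# pixels per line, while consumer "4K" displays are UHD at 3840x2160 --
# exactly four 1080p frames tiled together.
full_hd = 1920 * 1080
uhd = 3840 * 2160
dci_4k = 4096 * 2160

print(f"Full HD: {full_hd:,} px")
print(f"UHD:     {uhd:,} px ({uhd // full_hd}x Full HD)")
print(f"DCI 4K:  {dci_4k:,} px ({(dci_4k - uhd) / uhd:.1%} more than UHD)")
```

So the marketing label rounds 3840 up to "4K", and true cinema 4K carries only about 7% more pixels anyway, which is why the two terms get used interchangeably.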
 
They're all Ultra HD settings; 4K proper is 4096 pixels per line, while these are just 4xHD at 3840 pixels per line... but the article points out that other cards with more overclock will be tested.

Nowhere in that article did it say that a new console would need to be released for the performance to increase... and that's what you were leading us towards.
When consoles can get to 4K, that will get more and more people to buy 4K TVs and get devs to make games for them at that res.

As for the article, they were using the most powerful GPUs made. Even 980s in SLI weren't doing that well.

The benchmarks kind of put into perspective what you need for a 4K PC to run games at ultra settings, and according to them it doesn't really exist yet; it looks like it's a ways off.
The benchmarks speak for themselves. Nothing can hit 60fps at 4K on a AAA game.
Maybe tri-980s, but one card is a long way from it. It's not like overclocking a card is magically going to add 40fps to a system. Maybe 10 if you're lucky.
 
I don't think anyone looked at the 4k 980 benchmarks I posted. http://www.pcgamer.com/nvidia-gtx-9...single-gpu-benchmarks-and-impressions/#page-2

That is what a 4k card does at 4k at high settings with decent games.

Consoles really always drive the gaming industry, and the consoles today have no way to ever reach 4K for gaming. Once a console can push a 4K game, then it'll be mainstream.

Just look at the link I posted. GPUs are a long way from running 4K games.

How come those guys are applying AA at 4K? Isn't 4K sharp enough to eliminate aliasing anyway? I guess there's some aliasing on fine lines at a distance. AA can tax the GPU, even on cards like the 980.

And adding to your point on consoles being drivers of the industry, there's also TV. 4K TVs will need to drop to a good price range so people can buy them. As for cables, there's HDMI 2.0, which supports 4K@60fps. Or TVs can have DisplayPort and consoles can have DisplayPort as well.
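The cable question above checks out on paper. A rough sketch, assuming uncompressed 8-bit RGB and ignoring blanking intervals and line-coding overhead (so the real link-rate requirement is somewhat higher than this):

```python
# Rough check that uncompressed 4K60 pixel data fits within HDMI 2.0's
# 18 Gbps maximum. Assumes 8 bits per channel RGB (24 bits per pixel);
# blanking and encoding overhead are deliberately ignored here.
w, h, fps, bits_per_px = 3840, 2160, 60, 24
gbps = w * h * fps * bits_per_px / 1e9
print(f"raw 4K60 pixel data: {gbps:.1f} Gbps (HDMI 2.0 max: 18 Gbps)")
```

About 11.9 Gbps of raw pixel data against an 18 Gbps link, which is exactly why HDMI 2.0 was the first HDMI revision able to carry 4K at 60fps.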
 
Nowhere in that article did it say that a new console would be need to be released for the performance to increase... and that's what you were leading us towards.
It will require a (few) generation(s) of GPUs to make 4K/60fps in a box possible. While the R&D of the hardware that enables this will not be driven by consoles (but by PC, as always), games using this technology will not become mainstream until a console starts using it.
So I guess you both have a point. :)
 
How come those guys are applying AA at 4K? Isn't 4K sharp enough to eliminate aliasing anyway?

Good point... I guess it depends on the textures/images you're using. The textures also need to be 4x the definition (presuming they were HD textures in the first place). It might be cheaper on resources to do some AA instead.

A 55" screen is roughly 121 cm across; that's just over 3 pixels per horizontal millimetre... that would still give you discernible pixel edges in high-contrast images, I think, especially if the image was moving.
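That 3 px/mm figure follows from the geometry. A quick sketch, assuming a 16:9 panel (width derived from the diagonal via the aspect ratio):

```python
import math

# Horizontal pixel density of a UHD panel: derive the width from the
# diagonal using the 16:9 aspect ratio, then divide pixels by width.
def px_per_mm(diagonal_inches=55, h_px=3840, aspect=(16, 9)):
    aw, ah = aspect
    diag_mm = diagonal_inches * 25.4
    width_mm = diag_mm * aw / math.hypot(aw, ah)  # ~1218 mm for 55"
    return h_px / width_mm

print(f'~{px_per_mm():.2f} px/mm horizontally on a 55" UHD screen')
```

That comes out at about 3.15 px/mm, matching the "just over 3" estimate above.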

When consoles can get to 4K, that will get more and more people to buy 4K TVs and get devs to make games for them at that res.

Absolutely, but I still don't think that consoles are likely to push 4K development in gaming right now. The main driver in 4K development currently seems to be video.

For "Consoles" I think we can read PlayStation/Xbox... the next versions may well be 4K-enabled, but how far away are they in realistic terms, and aren't we more likely to be talking about an even higher display 'standard' by then?
 
Because PC gaming has been a comparably niche market for the past 15 years.
Reminds me of an article on RPS. PC gaming might not be as big as the console market, but it seems to be growing instead of declining - which is something.
For "Consoles" I think we can read PlayStation/Xbox... the next versions may well be 4K-enabled, but how far away are they in realistic terms, and aren't we more likely to be talking about an even higher display 'standard' by then?
The PS4/XBOne still have their fair share of games that don't manage to run at 1080/60. I'm pretty sure they'll go QHD for the next generation and call it a day. I have a feeling that going higher wouldn't allow them to sell the consoles with a profit at the 400 - 500 USD price point.
 
Good point... I guess it depends on the textures/images that you're using. The textures also need to be 4x in definition (presuming that they were HD textures in the first place). It might be resource-cheaper to do some AA instead.
I'm not really that great with knowledge in this area, but wouldn't there be room for one of those "genius moves" in the procedurally generated textures department?
Absolutely, but I still don't think that consoles are likely to push 4k development in gaming right now. The main driver in 4k development currently seems to be video.
Yes and no. Trouble is, 4K was apparently the final nail for plasma, but ironically no plasma might be hampering uptake of 4K. Knowing that the end would still be nigh, I can only imagine the ridiculous money videophiles would have been willing to shell out for a massive 4K plasma (massive would have been the only possibility for the marriage). Right now there's the unproven OLED tech (implemented with a superfluous curve, no less), and the very ancient LCD tech. Videophiles, I'm sure, are saying "I'll hold on to the superior panel that I already own, thanks."

Additionally, so many people are still happy with their upscaled DVDs on their five-year-old LCD that they haven't even bothered with Blu-ray, let alone started screaming that there's no physical format for 4K. I would have loved to see a company come out with a 4K Blu-ray (or equivalent) physical format, and bankroll the temporary resurrection of Pioneer to make 80-100 inch 4K versions of the 10th-gen panels that never saw the dark of night. I could have been experiencing that glorious visage right now!!
 
4K is the future answer to a problem nobody asked.

Half-credit, please see me after this decade.
 