> I just have no idea how people think Sony can make a console for $400 that outperforms PCs with TWO graphics cards worth twice as much as a PS4.

And sell it at a profit from launch.
Cloud gaming is something I'm very sceptical about. Bandwidth and broadband internet access haven't been developing as fast in recent years as hardware/processing power has... Dunno, lag-free cloud gaming seems like a big task to tackle.
We'll still be having this debate when we get to 128K VR systems.
> Sony's marketing department's gotta be one of the best in the world, really. The amount of people who believe that the PS4 can do 4K and outperform computers with multiple times the processing power because of [reasons] is astonishing.

It's not down to Sony marketing. They just target their marketing because they're aware that there are plenty of idiots out there who will spout anything that comes into their minds; Sony marketing just happens to put it there (in their minds).
> We have never jumped as high as going from 1080 to 2160 in one step in resolution. We went from 480 to 600 to 720 to 1080.

Ah. You're talking TVs then, not computers and their monitors.
> Ah. You're talking TVs then, not computers and their monitors.

I'm talking monitors. My first one was a 480 in about 1981, and it was a TV, because at home most people did use their TVs back then. After that it slowly progressed to 1080 being what most people used. Even if it was a TV, a TV and a monitor at the same resolution aren't any different as far as the power needed is concerned.
> Dunno, lag-free cloud gaming seems like a big task to tackle.

There's only so much you can do about latency, since we're already pretty much at the level where latency is bound by the laws of physics (speed of light, number of hops) and not by inefficient/slow processing of individual devices. The only way out here is a complete overhaul of internet infrastructure (from the major backbones all the way down to the local cable running into your house and your equipment). And even then you still won't be able to eliminate it.
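For a sense of that physical floor, here's a back-of-the-envelope sketch (the distances and the roughly 2/3-of-c propagation speed in fibre are assumptions, and everything else that adds latency is ignored):

```python
# Back-of-the-envelope lower bound on network round-trip time.
# Assumes signals travel through optical fibre at roughly 2/3 of the
# vacuum speed of light, and ignores routing detours, queuing,
# encoding and processing delays, so real latency is strictly worse.

C_VACUUM_KM_S = 299_792       # speed of light in vacuum, km/s
FIBRE_FACTOR = 2 / 3          # typical propagation factor in fibre


def min_round_trip_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over fibre, in milliseconds."""
    one_way_s = distance_km / (C_VACUUM_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000


# Hypothetical player-to-datacentre distances, purely illustrative:
for label, km in [("same city", 50),
                  ("same continent", 1500),
                  ("transatlantic", 6000)]:
    print(f"{label:>14}: >= {min_round_trip_ms(km):5.1f} ms round trip")
```

Even the ideal transatlantic case (about 60 ms) burns through several 60 FPS frames (16.7 ms each) before any game logic or video encoding has happened.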
> I'm talking monitors. My first one was a 480 in about 1981, and it was a TV, because at home most people did use their TVs back then.

Then why are you using TV numbers like 1080, 720, etc. instead of monitor numbers like 320x200, 640x480, 1920x1080, etc.?

I've owned computers since the late 1970s and never connected a TV to a computer until just a few years ago. I'll concede, though, that one of my early computers came with a monitor that was basically a TV with the tuner circuitry removed.

Aside from that, yes, there was a very popular computer on the market back then that used a standard TV for a monitor. But the resolution on those was abysmal even by the standards of the day.
> Then why are you using TV numbers like 1080, 720, etc. instead of monitor numbers like 320x200, 640x480, 1920x1080, etc.?

I used a TV for my Commodore 64, then used monitors from then on. I apparently screwed up the resolution numbers; I was just giving examples. I wasn't sure exactly what they were, but I know they're all close to what a PC monitor has been. All I'm saying is we've never made as big a jump as we're making now, from 1080 to 4K.
I guess my answer to the original post is: yes, 4K is the next step in gaming resolution, but it will happen when the next consoles are released. It's still a ways off.
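For what it's worth, the size of those jumps is easy to check by counting vertical lines (a rough sketch; the squared pixel estimate assumes a fixed 16:9 frame, which the older sub-HD modes didn't actually use):

```python
# How big is each resolution jump? Counted in vertical lines, with a
# squared estimate for pixels. The pixel figure assumes a fixed 16:9
# frame, so the numbers for the old sub-HD modes are only rough.

steps = [480, 600, 720, 1080, 2160]
for lo, hi in zip(steps, steps[1:]):
    ratio = hi / lo
    print(f"{lo:>4} -> {hi:>4}: {ratio:.2f}x lines, ~{ratio ** 2:.2f}x pixels")
```

Every earlier step multiplied the line count by 1.5 or less; 1080 to 2160 doubles it, which works out to roughly four times the pixels.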
Because PC gaming has been a comparatively niche market for the past 15 years. PC developers have been complaining about the PS360 holding PC development back since... what, 2009?
> How peculiar, 4K cards are available today... as are the monitors. They're Ultra-HD rather than 4K, but it's close enough.

I don't think anyone looked at the 4K GTX 980 benchmarks I posted: http://www.pcgamer.com/nvidia-gtx-9...single-gpu-benchmarks-and-impressions/#page-2
That is what a 4K card does at 4K at high settings in decent games. Consoles have always driven the gaming industry, and today's consoles have no way to ever reach 4K for gaming. Once a console can push a 4K game, it'll be mainstream. Just look at the link I posted; GPUs are a long way from running 4K games.
Why do you think that consoles are a driver of development?
> They're all Ultra HD settings. 4K is four thousand pixels per line; they're just 4xHD at 3840 pixels per line... but the article points out that other cards with more overclock will be tested.

When consoles can get to 4K, that will get more and more people to buy 4K TVs and get devs to make games for them at that resolution.
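The arithmetic behind that Ultra HD / 4K nitpick, for anyone who wants the pixel counts side by side (only the standard resolution figures are used here):

```python
# Pixel counts behind the Ultra HD / 4K naming distinction.
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "Ultra HD (4xHD)": (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}

full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:>9,} px ({pixels / full_hd:.2f}x Full HD)")
```

Ultra HD is exactly four Full HD frames; true DCI 4K adds another ~7% on top of that.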
Nowhere in that article did it say that a new console would need to be released for the performance to increase... and that's what you were leading us towards.
> Nowhere in that article did it say that a new console would need to be released for the performance to increase... and that's what you were leading us towards.

It will require a (few) generation(s) of GPUs to make 4K/60 FPS in a box possible. While the R&D of the hardware that enables this will not be driven by consoles (but by PC, as always), games using this technology will not become mainstream until a console starts using it.
How come those guys are applying AA at 4K? Isn't 4K sharp enough to eliminate aliasing anyways?
> Because PC gaming has been a comparatively niche market for the past 15 years.

Reminds me of an article on RPS. PC gaming might not be as big as the console market, but it seems to be growing instead of declining - which is something.
> For "consoles" I think we can read PlayStation/Xbox... the next versions may well be 4K-enabled, but how far away are they in realistic terms, and aren't we more likely to be talking about an even higher display 'standard' by then?

The PS4/XBOne still have their fair share of games that don't manage to run at 1080/60. I'm pretty sure they'll go QHD for the next generation and call it a day. I have a feeling that going higher wouldn't allow them to sell the consoles at a profit at the 400-500 USD price point.
> Good point... I guess it depends on the textures/images that you're using. The textures also need to be 4x in definition (presuming that they were HD textures in the first place). It might be resource-cheaper to do some AA instead.

I'm not really that great with knowledge in this area, but wouldn't there be room for one of those "genius moves" in the procedurally generated textures department?
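To put rough numbers on why "4x in definition" textures hurt, here's a sketch; the uncompressed RGBA8 format and the mipmap overhead are generic assumptions, not figures from any particular engine:

```python
# Rough memory cost of a square texture at increasing resolutions.
# Assumes uncompressed RGBA8 (4 bytes per pixel) plus roughly 1/3
# extra for the mipmap chain. Real engines use block compression,
# so actual sizes would be several times smaller; illustrative only.

BYTES_PER_PIXEL = 4
MIPMAP_OVERHEAD = 4 / 3


def texture_mib(side_px: int) -> float:
    """Approximate texture size in MiB, mipmaps included."""
    return side_px * side_px * BYTES_PER_PIXEL * MIPMAP_OVERHEAD / 2 ** 20


for side in (1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_mib(side):6.1f} MiB")
```

Doubling a texture on each axis quadruples its footprint, which is why spending the budget on AA can end up being the cheaper option.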
> Absolutely, but I still don't think that consoles are likely to push 4K development in gaming right now. The main driver in 4K development currently seems to be video.

Yes and no. Trouble is, 4K was apparently the final nail for plasma, but ironically no plasma might be hampering uptake of 4K. Knowing that the end would still be nigh, I can only imagine the ridiculous money videophiles would have been willing to shell out for a massive 4K plasma (massive would have been the only possibility for the marriage). Right now there's the unproven OLED tech (implemented with a superfluous curve, no less) and the very ancient LCD tech. Videophiles, I'm sure, are saying, "I'll hold on to the superior panel that I already own, thanks."