Trash talk from Sony compared with trash talk from ATI

  • Thread starter cobragt
  • 13 comments
  • 650 views
ATI
ATI developer relations manager Richard Huddy waxed technical about Microsoft's upcoming Xbox 360 and ATI's role in the machine. The 360 will be using ATI's Xenos graphics processor, and Huddy's job is to chat with potential developers and then help them develop the tools that best use Xenos' capabilities.


With regard to the console's architecture, Huddy says, "It's way better than I would have expected at this point in the history of 3D graphics." He sees the unified pipeline, rather than segregated pixel and vertex engines, giving the 360 a huge advantage in accessible processing power.
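To see why the unified pool matters, here is a toy C++ sketch (hypothetical ALU counts and work units, nothing from ATI's actual specs): with a fixed vertex/pixel split, whichever side is the bottleneck sets the frame time, while a unified pool can throw every unit at the combined workload.

```cpp
// Hedged illustration (not actual Xenos/RSX behavior): why a unified shader
// pool can extract more throughput than fixed vertex/pixel partitions when
// the workload mix shifts from frame to frame.
#include <algorithm>
#include <cstdio>

struct Frame { double vertexWork; double pixelWork; };  // arbitrary work units

// Fixed split: dedicated vertex and pixel units each finish their own queue.
double fixedSplitTime(const Frame& f, double vertexUnits, double pixelUnits) {
    return std::max(f.vertexWork / vertexUnits, f.pixelWork / pixelUnits);
}

// Unified pool: every ALU can take either kind of work, so the pool drains
// the combined queue at full rate.
double unifiedTime(const Frame& f, double totalUnits) {
    return (f.vertexWork + f.pixelWork) / totalUnits;
}

int main() {
    const double total = 48.0;                 // hypothetical ALU count
    Frame geometryHeavy{90.0, 30.0};           // e.g. a dense geometry pass
    Frame fillHeavy{20.0, 100.0};              // e.g. full-screen effects

    for (const Frame& f : {geometryHeavy, fillHeavy}) {
        std::printf("fixed 16v/32p: %.2f   unified 48: %.2f\n",
                    fixedSplitTime(f, 16.0, 32.0), unifiedTime(f, total));
    }
}
```

In the geometry-heavy case the dedicated pixel units sit partly idle; that idle capacity is the "accessible processing power" Huddy is pointing at.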

Huddy goes on to shed some light on backward compatibility. Each Xbox game is written with specific Xbox hardware in mind, and the 360's move to PowerPCs and ATI graphics doesn't jibe with the Xbox's Intel chips and Nvidia graphics processors. To add to the difficulty, the 360 wasn't designed for backward compatibility early on in development.

To solve this, Microsoft has implemented the use of emulator programs that will let the Xbox 360 play Xbox games. According to Huddy, "emulating the CPU isn't really a difficult task. ...the real bottlenecks in the emulation are GPU calls--calls made specifically by games to the Nvidia hardware in a certain way. General GPU instructions are easy to convert: an instruction to draw a triangle in a certain way will be pretty generic. However, it's the odd cases--the proprietary routines--that will cause hassle." Once complete, the Xbox emulators could come preloaded on the unit's hard drive or be downloadable via Xbox Live.
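As a rough sketch of what that translation layer has to cope with (the names and call set below are invented for illustration, not Microsoft's actual emulator), generic commands map one-to-one while the proprietary ones need per-case replacement code:

```cpp
// A minimal sketch of the translation problem Huddy describes: generic GPU
// commands map cleanly to the new hardware, but vendor-specific "fast path"
// calls need hand-written workarounds. All names here are hypothetical.
#include <cstdio>

enum class XboxGpuCall { DrawTriangleList, SetTexture, NvProprietaryBlit };

// Translate one captured original-Xbox GPU command into the new GPU's API.
void translateCall(XboxGpuCall call) {
    switch (call) {
        case XboxGpuCall::DrawTriangleList:
            std::puts("emit generic draw call");      // easy 1:1 mapping
            break;
        case XboxGpuCall::SetTexture:
            std::puts("emit generic texture bind");   // easy 1:1 mapping
            break;
        case XboxGpuCall::NvProprietaryBlit:
            // The "odd case": no direct equivalent on the new GPU, so the
            // emulator has to re-implement the behaviour, often per-game.
            std::puts("fall back to hand-written replacement path");
            break;
    }
}

int main() {
    for (XboxGpuCall c : {XboxGpuCall::DrawTriangleList,
                          XboxGpuCall::SetTexture,
                          XboxGpuCall::NvProprietaryBlit}) {
        translateCall(c);
    }
}
```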

Huddy also dispels the notion that the PlayStation 3's higher graphics clock speed (550MHz versus the 360's 500MHz) means the Sony console will outperform the Xbox 360. He believes that ATI's unified pipeline will make the biggest difference between the Xbox 360 and the PS3. ATI archrival Nvidia, which is providing the RSX graphics processor for the PS3, has chosen not to go the route of a unified pipeline.

"This time around, [Nvidia doesn't] have the architecture, and we do. So they have to knock it and say it isn't worthwhile. But in the future, they'll market themselves out of this corner, claiming that they've cracked how to do it best. But RSX isn't unified, and this is why I think PS3 will almost certainly be slower and less powerful."



Sony
Impress PC Watch: Will the PS3's backward compatibility with the PlayStation and PlayStation 2 be done through hardware?

Ken Kutaragi: It will be done through a combination of hardware and software. We can do it with software alone, but it's important to make it as close to perfect as possible. Third-party developers sometimes do things that are unimaginable. For example, there are cases where their games run, but not according to the console's specifications. There are times when games pass through our tests, but are written in ways that make us say, "What in the world is this code?!" We need to support backward compatibility towards those kinds of games as well, so trying to create compatibility by software alone is difficult. There are things that will be required by hardware. However, with the powers of [a machine like] the PS3, some parts can be handled by hardware, and some parts by software.

IPW: What about the endian (byte order) when emulating CPU codes with software?

KK: The Cell is bi-endian (has the ability to switch between usage of big endian and little endian ordering), so there are no problems.
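For readers wondering what the endian question is about: an emulator running little-endian guest code on a big-endian-only host has to reorder the bytes of every multi-byte value it loads or stores, which is exactly the per-access overhead a bi-endian CPU can avoid by switching modes. A minimal generic illustration (plain C++, not Cell-specific code):

```cpp
// Rough illustration of the endian issue: emulating a little-endian guest on
// a big-endian-only host normally means byte-swapping every multi-byte
// memory access in software.
#include <cstdint>
#include <cstdio>

// The byte swap an emulator would otherwise insert on every 32-bit access.
uint32_t byteSwap32(uint32_t v) {
    return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
           ((v << 8) & 0x00FF0000u) | (v << 24);
}

int main() {
    uint32_t guestValue = 0x12345678u;  // value as the guest program sees it
    std::printf("guest: %08X  swapped for opposite-endian host: %08X\n",
                guestValue, byteSwap32(guestValue));
}
```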

IPW: The Xbox 360's backward compatibility will be done by software, since [there is] no other choice, as they don't manufacture their own chips...

KK: The current Xbox will become antiquated once the new machine comes out this November. When that happens, the Xbox will be killing itself. The only way to avoid that is to support 100 percent compatibility from its [Xbox 360's] launch date, but Microsoft won't be able to commit to that. It's technically difficult.

IPW: The most surprising thing about the PS3's architecture is that its graphics are not processed by the Cell. Why didn't you make a Cell-based GPU?

KK: The Cell's seven Synergistic Processor Elements (SPE) can be used for graphics. In fact, some of the demos at E3 were running without a graphics processor, with all the renderings done with just the Cell. However, that kind of usage is a real waste. There are a lot of other things that should be done with the Cell. One of our ideas was to equip two Cell chips and to use one as a GPU, but we concluded that there were differences between the Cell to be used as a computer chip and as a shader, since a shader should be graphics-specific. The Cell has an architecture where it can do anything, although its SPE can be used to handle things such as displacement mapping. Prior to PS3, real-time rendered 3D graphics might have looked real, but they weren't actually calculated in a fully 3D environment. But that was OK for screen resolutions up until now. Even as of the current time, most of the games for the Xbox 360 use that kind of 3D. However, we want to realize fully calculated 3D graphics in fully 3D environments. In order to do that, we need to share the data between the CPU and GPU as much as possible. That's why we adopted this architecture. We want to make all the floating-point calculations including their rounded numbers the same, and we've been able to make it almost identical. So as a result, the CPU and GPU can use their calculated figures bidirectionally.
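The point about rounding is easy to miss. If the CPU and GPU evaluate the same expression but round intermediates differently, their results stop being bit-identical, and data can no longer be passed back and forth as if either side had computed it. A small generic example (my own numbers, nothing from the PS3 toolchain) using fused versus separate multiply-add:

```cpp
// Hedged illustration of the rounding point: the same expression evaluated
// with one rounding step (fused multiply-add) versus two (separate multiply
// then add) can produce results that are not bit-identical.
#include <cmath>
#include <cstdio>

int main() {
    float a = 0.1f, b = 0.2f, c = -0.020000001f;

    float separate = a * b + c;          // multiply rounded, then add rounded
    float fused    = std::fma(a, b, c);  // single rounding at the end

    std::printf("separate: %.9g\nfused:    %.9g\nbit-identical: %s\n",
                separate, fused, separate == fused ? "yes" : "no");
}
```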

IPW: We were predicting that eDRAM was going to be used for the graphics memory, but after hearing that the PS3 will support the use of two HDTVs, we understood why it wasn't being used.

KK: Fundamentally, the GPU can run without graphics memory since it can use Redwood (the high-speed interface between Cell and the RSX GPU) and YDRAM (the code name for XDR DRAM). YDRAM is unified memory. However, there's still the question of whether the [bandwidth and cycle time] should be wasted by accessing the memory that's located far away when processing the graphics or using the shader. And there's also no reason to use up the Cell's memory bandwidth for normal graphics processes. The shader does a lot of calculations of its own, so it will require its own memory. A lot of VRAM will especially be required to control two HDTV screens in full resolution (1920x1080 pixels). For that, eDRAM is no good. eDRAM was good for the PS2, but for two HDTV screens, it's not enough. If we tried to fit enough volume of eDRAM [to support two HDTV screens] onto a 200-to-300-square-millimeter chip, there wouldn't be enough room for the logic, and we'd have to cut down on the number of shaders. It's better to use the logic in full, and to add on a lot of shaders.
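Some back-of-the-envelope arithmetic (mine, not from the interview) shows why two full-resolution HDTV outputs rule out a PS2-sized eDRAM pool. Assuming 32-bit color, double buffering, and a 32-bit depth buffer:

```cpp
// Rough framebuffer sizing for two 1920x1080 outputs under the stated
// assumptions (32-bit color, front + back buffer, 32-bit depth buffer).
#include <cstdio>

int main() {
    const double bytesPerPixel   = 4.0;
    const double pixelsPerScreen = 1920.0 * 1080.0;
    const double mib             = 1024.0 * 1024.0;

    double colorPerScreen = pixelsPerScreen * bytesPerPixel * 2.0; // front + back
    double depthPerScreen = pixelsPerScreen * bytesPerPixel;
    double perScreen      = colorPerScreen + depthPerScreen;

    std::printf("one 1080p screen:  %.1f MiB\n", perScreen / mib);
    std::printf("two 1080p screens: %.1f MiB (the PS2's eDRAM was 4 MiB)\n",
                2.0 * perScreen / mib);
}
```

That works out to roughly 24 MiB per screen and close to 50 MiB for two, an order of magnitude beyond what a PS2-style eDRAM budget could hold alongside the logic.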

IPW: First of all, why did you select Nvidia as your GPU vendor?

KK: Up until now, we've worked with Toshiba [for] our computer entertainment graphics. But this time, we've teamed with Nvidia, since we're making an actual computer. Nvidia has been thoroughly pursuing PC graphics, and with their programmable shader, they're even trying to do what Intel's processors have been doing. Nvidia keeps pursuing processor capabilities and functions because [Nvidia chief scientist] David Kirk and other developers come from all areas of the computer industry. They sometimes overdo things, but their corporate culture is very similar to ours. Sony and Nvidia have agreed that our goal will be to pursue [development of] a programmable processor as far as we can. I get a lot of opportunity to talk to Nvidia CEO Jen-Hsun [Huang] and David, and we talk about making the ideal GPU. When we say "ideal," we mean a processor that goes beyond any currently existing processor. Nvidia keeps on going in that direction, and in that sense, they share our vision. We share the same road map as well, as they are actually influenced by our [hardware] architecture. We know each other's spirits and we want to do the same thing, so that's why [Sony] teamed with Nvidia. The other reason is that consumers are starting to use fixed-pixel displays, such as LCD screens. When fixed-pixel devices become the default, it will be the age when TVs and PCs will merge, so we want to support everything perfectly. Aside from backward compatibility, we also want to support anything from legacy graphics to the latest shader. We want to do resolutions higher than WSXGA (1680x1050 pixels). In those kinds of cases, it's better to bring everything from Nvidia rather than for us to create [a build] from scratch.

IPW: Microsoft decided to use a unified-shader GPU by ATI for its Xbox 360. Isn't unified shader more cutting edge when it comes to programming?

KK: The vertex shader and pixel shader are unified in ATI's architecture, and it looks good at a glance, but I think it will have some difficulties. For example, some question where the results from the vertex processing will be placed, and how they will be sent to the shader for pixel processing. If one point gets clogged, everything is going to get stalled. Reality is different from what's painted on canvas. If we're taking a realistic look at efficiency, I think Nvidia's approach is superior.
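Kutaragi's clogging worry is essentially about back-pressure between stages. A deliberately simplified toy (my framing, not his) of a bounded post-vertex buffer shows the failure mode: when the pixel side can't drain the buffer fast enough, the vertex side has to stall, and everything upstream stalls with it.

```cpp
// Toy back-pressure demo: a producer (vertex work) feeding a bounded buffer
// that a consumer (pixel work) drains more slowly. When the buffer is full,
// the producer must stall until a slot frees up.
#include <cstdio>
#include <queue>

int main() {
    const std::size_t bufferCapacity = 4;   // hypothetical post-vertex FIFO slots
    std::queue<int> postVertex;
    int stalls = 0;

    for (int triangle = 0; triangle < 12; ++triangle) {
        if (postVertex.size() == bufferCapacity) {
            ++stalls;                        // vertex stage has to wait
            postVertex.pop();                // ...until the pixel stage drains one
        }
        postVertex.push(triangle);
        if (triangle % 3 == 0 && !postVertex.empty())
            postVertex.pop();                // pixel stage only keeps up sometimes
    }
    std::printf("vertex-stage stalls caused by a full buffer: %d\n", stalls);
}
```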


The big comparison
KK: The vertex shader and pixel shader are unified in ATI's architecture, and it looks good at a glance, but I think it will have some difficulties. For example, some question where the results from the vertex processing will be placed, and how they will be sent to the shader for pixel processing. If one point gets clogged, everything is going to get stalled. Reality is different from what's painted on canvas. If we're taking a realistic look at efficiency, I think Nvidia's approach is superior.


"This time around, [Nvidia doesn't] have the architecture, and we do. So they have to knock it and say it isn't worthwhile. But in the future, they'll market themselves out of this corner, claiming that they've cracked how to do it best. But RSX isn't unified, and this is why I think PS3 will almost certainly be slower and less powerful."


Now, while both are speculating, Ken is a lot more professional and logical in his appeal. Instead of sounding professional, ATI sounded like kids: "the PS3 doesn't have what we have, so it's gonna be weaker and slower."
 
The only comparison that can be made is how well these two companies are doing in the PC graphics market.

Nvidia is leading, by far.
 
The GPU doesn't single-handedly determine the power of a console anyway, so the ATi guy has already ballsed up his comment. Although you can judge for yourself that he was talking about the GPU in particular, phrasing it as "this is why I think PS3 will almost certainly be slower and less powerful" is a sly move to undermine the PS3 as a whole. MS already know the PS3 is superior in power; that's why they're scrambling to find other reasons to back the XB360 over the PS3, like the software being better, etc. Games for both platforms will look great, exceptionally great even, but it's down to the developers to make the best use of what they have.

Anyway that was an interesting read, 👍 Cobra.
 
GPU for GPU I'll bet anyone the ATi chip is better, but system for system it looks as if the PS3 is more powerful.
 
All I know is that "backward compatibility" is going to sell PS3 to me, over even a 10% bonus in perceived graphics with Xbox (if there's even any bonus).

And why is Xbox going with a PowerPC chip when Apple just dropped the PowerPC in favor of Intel, citing the speed-per-watt factor?
 
Apple dropped IBM because IBM couldn't meet the demands of Apple. In a retaliatory statement, Steve Jobs said the Cell architecture was dubious. Apple is now supporting an architecture (x86) which is at the end of its life span.

Sony, Microsoft, and Nintendo will all have processors built by IBM. Can they all be wrong or is Apple making another irrational mistake?
 
It's not really a fair comparison, Viper. Although the names of the technology are similar (CPU, GPU, RAM, etc), the architecture of a console is significantly different than that of a PC.

Apple was upset at IBM for not being able to squeeze a G5 chip into a laptop, and they felt that the Cell wasn't an ideal chip for general-purpose computing. Which is true.

But a game console doesn't USE general-purpose computing. So who cares if it ain't that good at it? It's like hiring a cook for a department store vs hiring him for a restaurant. In one environment, he doesn't do so well... in the other, he does.

Basically, Apple wanted something that IBM couldn't provide. It was a business decision, nothing more. And has nothing at all to do with game consoles. So what if they use IBM technology? So what if they continue to use IBM technology for the next ten generations of game consoles?

I would much rather have my game consoles as dedicated gaming machines rather than general-purpose computers.

Apple's decision also doesn't mean the "end of Cell" or anything like that. Cell will have a niche market, even if it's not in every PC in the world. Computers like CGI workstations, which can use the Cell for what it IS. Hell, I'd buy one.
 
Just imagine what super console could be made if Sony and Microsoft joined forces. It's probably never gonna happen, but I like to imagine...
 
Wow, that was a very nice read. I enjoyed it a lot. Sometimes, I wish I was at these interviews with Kutaragi. He's got my respect, no doubt.

Anyway, Microsoft just seems like they're trying to cover up their faults or something, like they've done something wrong and don't want to admit it or show it. Most, if not all, of their statements are contradictory between their CPU and GPU and their comparisons with the PS3. I like to see that Kutaragi is just speaking the truth, not trying to impose crap on the opposition the way Microsoft and ATi seem to be dumping all over Sony and folks.

But, again, it just goes to show you the versatility of the Cell. You could buy Cell-architecture cards and use them for graphics, audio, or anything you please.
 
Omnis
Anyway, Microsoft just seems like they're trying to cover up their faults or something, like they've done something wrong and don't want to admit it or show it. Most, if not all, of their statements are contradictory between their CPU and GPU and their comparisons with the PS3.

Yep. Most of it is due to Sony's ability to actually keep their secrets. Prior to E3, Microsoft had absolutely no idea what Sony was planning with the PS3. All they knew was what we all knew... that PS3 was coming and that it would use the Cell as its CPU. That's it. They vastly underestimated Sony's E3 presence, and got their asses kicked as a result. Now, they're trying to run damage control, downplaying the PS3's capabilities while puffing up their own.
 
They even had to come up with the lame-o supplementary name substitute. Xbox 360 just screams trendwhore to me.
 
Jedi2016
It's not really a fair comparison, Viper.

When was I fair? When was arguing fair?

Apple was upset at IBM for not being able to squeeze a G5 chip into a laptop, and they felt that the Cell wasn't an ideal chip for general-purpose computing. Which is true.

I disagree. One Cell can outperform even multi-core x86 processors by very large margins. You would need a complete overhaul of the x86 architecture to keep up, but what is the purpose of that? A group of Cells can run multiple OSes, especially Linux, since it's not hardware-dependent.

But a game console doesn't USE general-purpose computing. So who cares if it ain't that good at it? It's like hiring a cook for a department store vs hiring him for a restaurant. In one environment, he doesn't do so well... in the other, he does.

The PS3 or Xbox 360 may not be general-purpose, but what makes the Cell that? I don't see a reason why the Cell can't be used in an off-the-shelf computer. It's faster, it's cheaper, and I can emulate Windows (or run Linux). What more do you want?

Basically, Apple wanted something that IBM couldn't provide. It was a business decision, nothing more. And has nothing at all to do with game consoles. So what if they use IBM technology? So what if they continue to use IBM technology for the next ten generations of game consoles?

So, Apple wants to become more like a PC company? I can see Jobs being irrational again and switching to x86 only because IBM is too busy in bed with Sony, Microsoft, and Nintendo. While Jobs is at it, why not take market share away from Microsoft? Buy a Mac and run your Windows applications too!

Apple's decision also doesn't mean the "end of Cell" or anything like that. Cell will have a niche market, even if it's not in every PC in the world. Computers like CGI workstations, which can use the Cell for what it IS. Hell, I'd buy one.

I see it in every PC in the world. In ten years, x86 is dead.
 
Viper Zero
I see it in every PC in the world. In ten years, x86 is dead.

I agree on this. Remember, this is only the FIRST Cell chip that is actually being used in a mass-production project. The architecture's potential is so great... we will have a whole new future of change and progression that would otherwise be unattainable with the x86 chip. I see the architecture evolving into separate job-specific forms and products. The Cell can become a great general-purpose processor, given the effort to develop it as such. Anything is possible. It's the architecture that makes those possibilities very probable, indeed.
 