PS3 General Discussion

I like the idea of Cell processors on different devices being able to connect together; that means buying more Cell devices would give you more processing power all around :D:D
 
BMW318ciC
I like the idea of Cell processors on different devices being able to connect together; that means buying more Cell devices would give you more processing power all around :D:D
I don't think it works like that...

If you're only going for a joke, I apologise... (it's so hard to tell these days)
 
fasj6418
Well, I just realised that I am very poor. :yuck: Simple as that. :grumpy: Right now I'm too afraid of missing out on the next generation because it's way too expensive for me... I only got my PS2 in 2004, four years after it was released, because I couldn't afford one... You have to agree with me, Felipe, the PS2 is still a console for only a few people here... The PS1 is very popular and affordable, but I don't know a lot of PS2 owners... Maybe Blumenau is a very rich city :idea: haha. It just looks like it's growing... the PS2 was more expensive than the PS1, and the PS3 will be way more expensive than the PS2... tough to keep up here... I wish I lived in the US... I would have got mine right after the launch... with original games and everything
I agree the PS2 is expensive for our reality, but there are more people here playing it than you might think. I guess there are at least 5 million units in Brazil, and although that isn't much for a 170-million-people country, it's still a large number of units sold, especially considering it was never officially released here.

If Sony released their consoles over here, they would cost less because they wouldn't be imported items any more, and more people would know about them because they would be advertised. I've heard rumours that MS might do that with the Xbox 360; let's hope so.
 
Solid Lifters
I don't think it works like that...

If you're only going for a joke, I apologise... (it's so hard to tell these days)

I'm not kidding. Here is a link; I'm going to quote a part of it:
The Future: Multi-Cell'd Animals

One of the main points of the entire Cell architecture is parallel processing. The original idea for Cells working across networks, as mentioned in the patent, appears to still be in development, but probably won't be in wide use for some time yet. The idea is that "software cells" can be sent pretty much anywhere and don't depend on a specific transport means.

Want more computing power? Plug in a few more Cells and there you have it. If you have a few Cells sitting around talking to each other via WiFi, the system can use them to distribute software cells for processing. The idea is similar to the above-mentioned job queues, but rather than jobs being assigned locally, they are assigned across a network to any Cell with spare processing capability.
[Image: Cell_Distributed.gif]

The mechanism present in the software cells makes use of whatever networking technology is in use; this allows ad-hoc arrangements of Cells to be made. This system essentially takes a lot of complexity which would normally be handled by hardware and moves it into the system software. This usually slows things down, but the benefit is flexibility: you give the system a set of software cells to compute and it figures out how to distribute them itself. If your system changes (Cells added or removed), the OS should take care of this without the programmer needing to worry about it.

Writing software for parallel processing is usually very difficult, and this helps get around the problem. You still, of course, have to parallelise the program into software cells / jobs, but once that's done you don't have to worry whether you have one Cell or ten (unless you've optimised for a specific number).

It's not clear how this system will operate in practice, but it would have to be adaptive to allow resending of jobs when Cells appear and disappear on a network. That said, such systems already exist and are in very wide use.

This system was not designed to act like a "big iron" machine; that is, it is not arranged around a single shared or closely coupled set of memories. All the memory may be addressable, but each Cell has its own memory and will work most efficiently within it.

It appears the IBM workstation / blade will have 2 Cells, but it can act as an SMP system (i.e. the Cells can share each other's memory). The patent specified a way to connect 8 Cells, but given the size and likely cost of the first generation of Cells, I doubt anything like this will appear soon.
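
To make that last point concrete (the bit about not worrying whether you have one Cell or ten), here's a toy job-queue sketch. It's my own illustration in C++, with threads standing in for networked Cells; this is not the real Cell OS or SDK:

[code]
// Toy sketch only: a shared job queue drained by however many "Cells"
// happen to exist. The jobs never know how many workers there are, so
// one Cell or ten makes no difference to the job code itself.
#include <condition_variable>
#include <functional>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class CellPool {
public:
    explicit CellPool(unsigned cells) {
        for (unsigned i = 0; i < cells; ++i)
            workers_.emplace_back([this] { run(); });
    }
    ~CellPool() {  // drain remaining jobs, then shut the workers down
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
        for (auto& w : workers_) w.join();
    }
    // A "software cell": a self-contained unit of work.
    void submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lk(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !jobs_.empty(); });
                if (jobs_.empty()) return;  // done_ set and nothing left
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // any idle "Cell" with spare capacity picks this up
        }
    }
    std::vector<std::thread> workers_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    CellPool pool(4);  // pretend four Cell devices are on the network
    for (int j = 0; j < 8; ++j)
        pool.submit([j] { std::cout << "job " << j << " done\n"; });
}  // pool's destructor waits for all jobs to finish
[/code]

Swap the threads for network links and the queue for message passing and you have the gist of what the article describes.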
 
code_kev
That's for PCs etc.; it's not like you're going to be able to plug your Cell toaster into your Cell PS3 to give it more oomph.
Can you imagine that? "We need more power to run this program." "Quick, plug the Cell-powered toaster in!" I don't know how far the Cell will go in terms of revolutionising anything. Silicon chips are nearing the end of how far their size/power ratios can keep improving; it's getting to a point where they'll have to start making them bigger to be more powerful. A lot of companies are experimenting with light and other materials, including even carbon, to create CPUs more powerful than silicon.
 
In theory, in the future, if I have the PS3 on and it's doing nothing (not playing games) and I somehow have it connected to my PC (which of course would have a Cell processor), I could use the PS3 to get more power for my PC.
@Live4speed: have you read this article?
http://www.blachford.info/computer/Cell/Cell0_v2.html
It's very good, and from reading it you can tell that the Cell has a great future. Anyway, I'm going to discuss the Cell tomorrow with my professor at university (I'm studying computer engineering :) ), as I want to see what he thinks of it :)
 
BMW318ciC
In theory, in the future, if I have the PS3 on and it's doing nothing (not playing games) and I somehow have it connected to my PC (which of course would have a Cell processor), I could use the PS3 to get more power for my PC.
@Live4speed: have you read this article?
http://www.blachford.info/computer/Cell/Cell0_v2.html
It's very good, and from reading it you can tell that the Cell has a great future. Anyway, I'm going to discuss the Cell tomorrow with my professor at university (I'm studying computer engineering :) ), as I want to see what he thinks of it :)

You are not understanding the concept.

It's not to "give more power" to the PC.

Basically, your PC, in theory, would send tasks, like calculations and other things, to the "daughter cell", and the results would then be streamed back to the PC. But it is not a real-time extension of the PC itself. You aren't going to be playing a game on a Cell-based PC, suddenly realize you need more power to max out the settings, and then try to unload the brunt of it onto the PS3. Unfortunately, that is not how it works.

Also, the software in question must be specifically designed to take advantage of a "daughter cell". It must be programmed to recognize other devices that have the Cell processor in them, and then it will divide the tasks accordingly, more than likely with tasks of lesser importance (low priority) going to the "daughter cell" and high-priority tasks being handled by the "father cell".

Also, I think this theory is based, for the most part, on a single system that has multiple "cell units" working together, not multiple devices with single cell units.
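
Purely as an illustration of that father/daughter split (a hypothetical C++ sketch, not any real Cell API; std::async stands in for the link to the daughter device):

[code]
// Hypothetical sketch of the "father/daughter" split described above.
// High-priority tasks run locally on the "father"; low-priority ones are
// shipped off (std::async stands in for the network hop) and their
// results stream back whenever they are ready.
#include <future>
#include <iostream>
#include <vector>

enum class Priority { High, Low };

struct Task {
    Priority prio;
    int input;
};

int compute(int x) { return x * x; }  // stand-in workload

int main() {
    std::vector<Task> tasks = {
        {Priority::High, 1}, {Priority::Low, 2},
        {Priority::High, 3}, {Priority::Low, 4},
    };

    std::vector<std::future<int>> daughter_results;
    for (const Task& t : tasks) {
        if (t.prio == Priority::High) {
            // The father cell handles it immediately, in line.
            std::cout << "father: " << compute(t.input) << "\n";
        } else {
            // The daughter cell gets it; the result comes back later.
            daughter_results.push_back(
                std::async(std::launch::async, compute, t.input));
        }
    }
    // Collecting results is asynchronous, which is exactly why this
    // can't act as a real-time extension of the father cell.
    for (auto& f : daughter_results)
        std::cout << "daughter: " << f.get() << "\n";
}
[/code]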
 
LaBounti
Thanks for the immature comment, but no one was talking about "stand-alone" anything. The Cell is not a card, it's a chip.

According to this link, which I've read many times, I know a lot about the Cell already.

http://www.blachford.info/computer/Cell/Cell0_v2.html

I keed, I keed, lol.

Anyway, I would imagine the Cell's architecture could easily handle those tasks if implemented on a card.
 
Actually, the designers of the Cell at IBM stated that, ultimately, Cell computing devices would be able to network over the internet in shared arrays. How exactly this science-fiction structure would work, I haven't a clue. I assumed that special software tailored to such tasks would be the basis, but they've given this same pitch a few times, and they make it sound like software isn't what facilitates the Cell linking with other Cells. When this stuff starts appearing in PCs and servers and such, I guess we'll see what Sony, Toshiba and IBM have in mind.
 
Tenacious D
Actually, the designers of the Cell at IBM stated that, ultimately, Cell computing devices would be able to network over the internet in shared arrays. How exactly this science-fiction structure would work, I haven't a clue. I assumed that special software tailored to such tasks would be the basis, but they've given this same pitch a few times, and they make it sound like software isn't what facilitates the Cell linking with other Cells. When this stuff starts appearing in PCs and servers and such, I guess we'll see what Sony, Toshiba and IBM have in mind.

There is no way for the Cell to communicate with another device unless the software allows it to. The processor is not ultimately looking for other Cells with which to share its workload.

Also, it is simply a method of distributing tasks and returning results to the main processor. Too many people believe they combine to make a "super processor", which is not the case. While they may well be capable of processing faster together, it is merely a matter of breaking down certain tasks, assigning them to alternate daughter cell processors, and putting the processed data back together again.

But I must reiterate: without software, there is no way for the devices to communicate with each other. The software must be there to link the processors, then break down and distribute the processes to each Cell, and then bring the results back to the father cell.

I've read up on the theory, and you're close, but you're taking what IBM has said out of context. While it is possible to network the processor over the internet, it is much the same as current-day process sharing. There are many projects you can log into, such as SETI@home, where data is sent to your computer for processing and then sent back to their server. It's not necessarily new, but the Cell is much more efficient, definitely easier to tailor work to, and MUCH more complex. But the theory itself is not completely new.
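
That SETI@home-style pattern is easy to sketch. Here's a toy C++ version (threads stand in for the remote machines; none of this is actual SETI@home or Cell code): one big job is broken into chunks, each chunk is processed separately, and the pieces are put back together at the end.

[code]
// Toy scatter/gather sketch of the SETI@home-style pattern: one big job
// is broken into chunks, each chunk is processed by a separate "cell"
// (threads here standing in for remote machines), and the processed
// pieces are put back together at the end.
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<double> data(1000000, 1.0);      // the big data set
    const unsigned cells = 4;                    // pretend remote Cells
    const std::size_t chunk = data.size() / cells;

    std::vector<double> partial(cells, 0.0);
    std::vector<std::thread> links;
    for (unsigned c = 0; c < cells; ++c) {
        links.emplace_back([&, c] {
            auto begin = data.begin() + c * chunk;
            auto end = (c == cells - 1) ? data.end() : begin + chunk;
            partial[c] = std::accumulate(begin, end, 0.0);  // "remote" work
        });
    }
    for (auto& t : links) t.join();              // results "sent back"

    // The father cell reassembles the processed pieces.
    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "total = " << total << "\n";
}
[/code]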
 
I haven't really been keeping up with the latest news, but watching that video Solid linked confirms region-free software (at the developer's control). Great news :)
 
Well, the Cell is able to render stuff very easily, but it takes middleware to get the end results. The reason the Warhawk demo used the Cell is the volumetric calculations, not the actual rendering, and the procedurally generated sea, meaning the 3D calculations to form a realistic-looking 3D surface; the Cell isn't actually making it render. If you actually listen to the guy speak, he says this: "The sea has its geometry procedurally generated by wave simulation every frame, then sent to RSX for rendering." It's the same with the volumetric clouds: the geometry is created by the Cell using a software volumetric sequencer and rendered by a cluster of SPUs. It's still a very awesome bit of kit.

If you really want to see what the Cell is capable of (rendering-wise), go HERE. These guys are IBM staff who work on the Cell but also work in the gaming division; this is their blog, and the top post is an actual video of the Cell rendering. It's quite impressive. The site has loads of tech stuff on the Cell, but it's not for those who shy away from really heavy info.
 
If it's the same :) why are there two different explanations?

Clearly he says the SPU is rendering and calculating the clouds, and the RSX is rendering the water ripples, which are calculated by the Cell.

And really, this is an interesting paradigm shift for us, because it's the first time we are mixing Cell-based software rendering with RSX-based hardware rendering. Now, certainly you don't need to use the Cell for software rendering; you can do so if you choose. The RSX has plenty of power on its own.
 
I'll have to catch up on all the news this weekend, have an RP to run. But if the clouds and waves are software generated, does that mean the scene will be slightly different each time you play? Wouldn't that be wild!
 
LaBounti
If it's the same :) why are there two different explanations?

Clearly he says the SPU is rendering and calculating the clouds, and the RSX is rendering the water ripples, which are calculated by the Cell.

LaBounti, you answered it yourself with that quote from the dev team: they are mixing hardware and software rendering.

Wikipedia
In the context of rendering (computer graphics), software rendering refers to a rendering process that is unaided by any specialized hardware, such as a graphics card. The rendering takes place entirely on the CPU.

The clouds and sea in Warhawk are actual 3D generated objects which are created on the Cell on the fly, frame by frame, but the clouds are then rendered by a piece of software running on a cluster of SPUs, while the sea is sent to the RSX for rendering.

Clouds -> Cell generates 3D models -> Cell-based software rendering

Sea -> Cell generates 3D models -> RSX-based hardware rendering

Hope that clears it up :). All in all, the Cell creating that geometry frame by frame is quite something and very nice to see.
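
For anyone wondering what "geometry procedurally generated by wave simulation every frame" might look like in code, here's a minimal sketch. It's illustrative C++ only, nothing to do with the actual Warhawk source; waveHeight and sendToGpu are made-up stand-ins:

[code]
// Illustrative only (not actual Warhawk code): the CPU regenerates a sea
// heightfield from a wave function every single frame, then hands the
// fresh geometry to the GPU, i.e. CPU generation, GPU rendering.
#include <cmath>
#include <cstdio>
#include <vector>

const int   kGrid = 64;      // vertices per side of the sea patch
const float kSize = 100.0f;  // world-space extent of the patch

// Made-up sum-of-sines wave simulation; a real one would be fancier.
float waveHeight(float x, float z, float t) {
    return 0.5f * std::sin(0.3f * x + t) +
           0.25f * std::sin(0.7f * z + 1.3f * t);
}

// Made-up stand-in for the upload to the GPU (the "send to RSX" step).
void sendToGpu(const std::vector<float>& verts) {
    std::printf("uploaded %zu vertices\n", verts.size() / 3);
}

int main() {
    std::vector<float> verts(kGrid * kGrid * 3);
    for (int frame = 0; frame < 3; ++frame) {
        float t = frame / 60.0f;
        // Rebuild the whole mesh from scratch: procedural, every frame.
        for (int i = 0; i < kGrid; ++i) {
            for (int j = 0; j < kGrid; ++j) {
                float x = kSize * i / (kGrid - 1);
                float z = kSize * j / (kGrid - 1);
                float* v = &verts[3 * (i * kGrid + j)];
                v[0] = x;
                v[1] = waveHeight(x, z, t);
                v[2] = z;
            }
        }
        sendToGpu(verts);  // generated on the CPU, rendered by the GPU
    }
}
[/code]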
 
I know; the method wasn't in question, just your saying they were the same when they are not.

If you actually listen to the guy speak, he says this: "The sea has its geometry procedurally generated by wave simulation every frame, then sent to RSX for rendering." It's the same with the volumetric clouds: the geometry is created by the Cell using a software volumetric sequencer and rendered by a cluster of SPUs. It's still a very awesome bit of kit.
 
If anything, this proves one thing to me: if Warhawk can handle rendering via both the Cell and the RSX, I see no reason why a talented developer could not attain the level of atmosphere seen in the Killzone PS3 trailer.

Since we have already seen tons of clouds rendered, that makes smoke easy, and water is much more hardware-intensive than character models and other action scripts, so I see no reason why it isn't possible in the future.
 
LaBounti
I know; the method wasn't in question, just your saying they were the same when they are not.

I didn't mean they were exactly the same, just the same in principle: the Cell creates the geometry for both. That's what I meant by "the same"; my way of explaining it wasn't very clear, my bad :)

tha_con
If anything, this proves one thing to me: if Warhawk can handle rendering via both the Cell and the RSX, I see no reason why a talented developer could not attain the level of atmosphere seen in the Killzone PS3 trailer.

Exactly. With all that's been shown to date, I see no reason why Killzone-like visuals won't be done on the PS3 in the 2nd or 3rd generation of games. I can't wait to see GT5 though; it will look stunning, with particle smoke effects and HDR, and with wind physics affecting other cars and foliage. Not to mention the Wipeout and Silent Hill games too; they will look beyond what we could have hoped for, I hope :D
 
The only bottleneck I can see is the speed of video memory. I think using the Cell for simple effects like smoke or clouds is good; they won't clog up the pixel shaders.

I just want to see more hardware-specific games. You know, games that take advantage of the hardware, like the MotorStorm terrain morphing and particle splatter effects.

One more month until some more real-time gameplay.
 
LaBounti
The only bottleneck I can see is the speed of video memory. I think using the Cell for simple effects like smoke or clouds is good; they won't clog up the pixel shaders.

I just want to see more hardware-specific games. You know, games that take advantage of the hardware, like the MotorStorm terrain morphing and particle splatter effects.

One more month until some more real-time gameplay.

Hardware-specific really doesn't matter as long as the end result is polished. I see no reason to care what's rendered on what.
 
Well, some interesting news here :)

Beyond3D.com
The structure in which a processor core has shared registers accessible from other cores is seen in network processors and the like. For what it's worth, the AGEIA co-founder Manju Hegde was an ex-CTO of a network processor manufacturer.

Memory Architecture Without Memory Hierarchy

One of the biggest characteristics of the PPU is its memory architecture. It has a 128-bit memory interface for external memory, but no internal cache memory.

"We don't have cache memory hierarchy of any kind. This is very important because traditional cache is not suitable for physics," says Hedge.

The PPU has no structure like a CPU cache that is synchronized with external memory by the set-associative method and updated automatically. That's because physics simulation has little data locality. They say a memory cache hierarchy is more trouble than it's worth.

"In CPU and GPU, data has locality. But in physics not, as it has to do random access to many objects. Data structures are totally different" says Nadeem Mohammad, who moved from a GPU vendor to AGEIA.

Still, the PPU has a large internal memory of its own. It has various internal memories instead of a cache, with an organization that does explicit, programmable transfers between internal and external memory.

The patent describes memories such as the dual-bank Inter-Engine Memory (IEM) connected to the VPUs, the multi-purpose Scratch Pad Memory (SPM), the DME Instruction Memory (DIM) which does instruction queuing, and so on. Hegde suggested that the memories in the patent are in the actual implementation, saying "they are probably included" in the PPU.

Among those memories, the IEM is used in a way that looks like a traditional data cache. According to the patent, the DME explicitly loads the data set required by the processing units into the IEM. Unlike cache memory, low-latency access is possible in the IEM, and apparently it can implement a large number of I/O ports. As a result, it can achieve huge internal memory bandwidth.

"One of the important factors in a physics architecture is it requires huge on-chip memory bandwidth. Our PPU has 2Tb(Tera-bit)/sec on-chip memory bandwidth," says Hedge.

In short, removing complicated cache control made it possible for the PPU to have L2-cache-sized internal memory with L1-cache latency and huge bandwidth, which according to them is well suited to physics algorithms.

Cell-like Global Structure of PPU

From the overview of the PPU architecture, you'll immediately notice the commonality with Cell. Both of them are parallel processors with huge floating-point processing units, have no cache hierarchy, and manage inter-memory data transfers with software.

If you replace the PPU Control Engine (PCE), the RISC core in the PPU, with the PPE (Power Processor Element), the PowerPC core in Cell, and the Vector Processing Engine (VPE), the PPU's data processing engine, with the SPE (Synergistic Processor Element), Cell's data processor, they almost correspond to each other.

In both architectures, one RISC core does global control and many vector data processors do the data processing in parallel. As for the affinity between the architectures, Hegde said:

"If you look at the very high level they are very alike. Both of them are huge parallel engines, have floating point processing units, and control each internal memory. But the difference is also big. For example, Cell does internal data transfer by a ring bus (so it has limited bandwidth). On the other hand, our architecture has far higher (internal data) bandwidth.

But it's also true that Cell is a relatively suitable architecture for physics processing. In the PS3, the GPU is of "GeForce 7900+" architecture, but the PS3 also has Cell. So the PS3 can do physics on a PPU-like architecture (Cell), not on the GPU."

Looking at the PPU architecture this way, you can imagine that AGEIA has relatively good affinity with PS3 library development.

The current PPU has a transistor count of 125M and is manufactured on TSMC's 0.13 µm process. For chip size it's GeForce FX 5800 (NV30) class and the process is the same; the die size is about 182 mm², a bit smaller than NV30. It won't be far from reality to assume you have an NV30 for physics.

By comparing it with a GPU you can imagine the configuration of the VPUs too. A GPU in the 120M class has 6-12 programmable shaders. Unlike a GPU, the PPU doesn't do texture handling etc., so it should have simpler processing units. It's then estimated that the current AGEIA PPU has 16 VPUs at most.
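
Going back to the memory section above: that "explicit and programmable transfer" style, as opposed to an automatic cache, is easy to picture in code. Here's a toy C++ sketch (memcpy stands in for a DMA engine, and the sizes are invented for the example, not the real PPU spec):

[code]
// Toy sketch of software-managed memory, as opposed to an automatic
// cache: the program explicitly pulls one block at a time from big
// "external" memory into a small scratchpad, works on it there, and
// writes it back. memcpy stands in for a DMA engine; the sizes are
// invented for the example, not the real PPU spec.
#include <cstring>
#include <iostream>
#include <vector>

const std::size_t kScratchWords = 256;  // pretend on-chip scratchpad size

int main() {
    std::vector<float> external(4096, 1.0f);  // big, "slow" external memory
    float scratch[kScratchWords];             // small, "fast" internal memory

    for (std::size_t base = 0; base < external.size(); base += kScratchWords) {
        // Explicit transfer in (what a cache would otherwise do implicitly).
        std::memcpy(scratch, &external[base], sizeof(scratch));

        // All the actual work happens in the low-latency scratchpad.
        for (std::size_t i = 0; i < kScratchWords; ++i)
            scratch[i] *= 2.0f;

        // Explicit transfer back out.
        std::memcpy(&external[base], scratch, sizeof(scratch));
    }
    std::cout << "first = " << external.front()
              << ", last = " << external.back() << "\n";
}
[/code]

The trade-off is exactly what the article describes: the programmer takes on the transfer scheduling a cache would do automatically, and in exchange gets predictable latency and bandwidth.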
 
Here is a link to the GDC videos, for those who haven't got them yet, but also for those who downloaded the really **** versions; these vids have the audio and are a little better quality too. Enjoy.

LINK
 
sprite
Here is a link to the GDC videos, for those who haven't got them yet, but also for those who downloaded the really **** versions; these vids have the audio and are a little better quality too. Enjoy.

LINK
I put those in the first post a few days ago. Best check there first before posting something new. Chances are, I've got it covered already.
 
Solid Lifters
I put those in the first post a few days ago. Best check there first before posting something new. Chances are, I've got it covered already.

No worries, thanks for the heads-up. :) Just posted them up quick and got the hell out of Dodge.
 
PlayStation 3 rumored to be behind closed doors only at E3


Rumors have begun flying about how much content will be playable on the PlayStation 3 at E3 this year. Now more rumors are starting that there may be no playable content for general attendees at all. This would be a major talking point among those who will be attending without "Media" or "Exhibitor" status. SPOnG has spoken to some publishers about their products' availability, receiving the following comments.

"From what we have been hearing, the PlayStation 3 will be shown in a similar way to which the PSP was debuted at E3," one publishing source told SPOnG last week. "Attendees will be able to view the machine in a tightly-restricted environment. People thinking the floor will be awash with playable demo pods will be sore if they walk through the doors expecting to be able to sample a massive range of [PlayStation 3] software at leisure."

Realm Media Networks will be attending this year's E3, so expect some coverage of whatever Sony decides to show. Due to high demand, play time may be extremely limited, or available on an invitation-only basis. We're committed to doing everything we can to get hands-on time with this anticipated console and its first round of games, then making that information available to everyone else.
 