PS3 General Discussion

slackbladder
Just out of interest, if a game runs at 60fps does that mean it has to have double the amount of animations in it than a game running at 30fps? And if so would that take up more memory and space on the disc media?
I don't think so; I think only the processor and graphics chip have to do more work.
 
slackbladder
Just out of interest, if a game runs at 60fps does that mean it has to have double the amount of animations in it than a game running at 30fps? And if so would that take up more memory and space on the disc media?
No. It's just rendered twice as fast. That's why a game slows down when the processor is being overworked.
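To picture why not: animation is typically stored as keyframes and interpolated at whatever the current frame time is, so the same data serves both rates. A minimal sketch (names and numbers made up purely for illustration):

```python
# Hypothetical sketch: the same keyframe data serves any frame rate,
# because poses are interpolated from time, not stored per frame.

def lerp(a, b, t):
    return a + (b - a) * t

def sample_pose(keyframes, time_s):
    """keyframes: list of (time_s, value) pairs, sorted by time."""
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= time_s <= t1:
            return lerp(v0, v1, (time_s - t0) / (t1 - t0))
    return keyframes[-1][1]

# One second of animation stored as just two keyframes,
# e.g. a joint rotating from 0 to 90 degrees.
anim = [(0.0, 0.0), (1.0, 90.0)]

poses_30fps = [sample_pose(anim, f / 30) for f in range(31)]
poses_60fps = [sample_pose(anim, f / 60) for f in range(61)]
# Twice as many samples per second, identical source data on disc.
```

So 60fps costs more GPU/CPU time per second, not more animation data.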
 
Really? When I used to have mine, I thought the cloth was to wipe the screen down. :dopey:

EDIT: Also, another question. Is 60 FPS a standard for every game? Like for example, EA makes NFS: Carbon 30 FPS for all platforms. Would they need to bump it up to 60 FPS for the PS3? Or is it like the 360 with a mix of 30 and 60 FPS titles?


The entire front face is all the same material (clear plastic), no seams from end to end.
 
When again would you touch your PS3?

Considering there is only one button to eject, and the controllers are wireless... I'm not sure exactly why it'll have fingerprints...

But, it's all the same, if you pick it up and lug it around that's awesome :)
 
Jeremy Ricci
But, it's all the same, if you pick it up and lug it around that's awesome :)

Unfortunately that will be the case at our house; the PS3 will be moved nearly every weekend, as we go to a friend's house to play multiplayer games until the early hours. I guess Sony may ship it with a cloth a la PSP, or make it from that fingerprint-proof aluminium, who knows?
 
Jeremy Ricci
That stinks man :( I'll probably just have mine on my TV and play 4 player split screen or online.

Ahh, it's not really a problem, we've always done it since the Amiga days. Well, my dad did, and then I started to go, so it's sort of a tradition. My dad and his friends started it with a couple of my uncles; now it's me, my dad, his friend (it's his house), two of my uncles, sometimes my cousin, and every so often a mystery guest may pop in. It's actually good fun, but seeing as the friend has no internet connection and no console it would be hard to set up online, plus the internet doesn't have people smacking you upside the head for ramming them off the track or killing them :D
 
Anyone know Chairmansteve?

That guy is the owner of PCvsConsole.com, and so far what he's said about the PS2 over the past year has been right. I've been reading the site for a long time; very techy place. I'd recommend it to anyone.

Here's his GPU comparison math, PS3 vs. Xbox 360. Anyway, it might be crap, but you never know.

chairmansteve — June 7, 2006, 5:47:16pm CST (1 of 29)

Let's look at the maximum theoretical numbers for the Xbox 360 and PS3 GPUs.

Triangle Setup
Xbox 360 - 500 Million Triangles/sec
PS3 - 275 Million Triangles/sec

Vertex Shader Processing
Xbox 360 - 6.0 Billion Vertices/sec (using all 48 Unified Pipelines)
Xbox 360 - 2.0 Billion Vertices/sec (using only 16 of the 48 Unified Pipelines)
Xbox 360 - 1.5 Billion Vertices/sec (using only 12 of the 48 Unified Pipelines)
Xbox 360 - 1.0 Billion Vertices/sec (using only 8 of the 48 Unified Pipelines)
PS3 - 1.1 Billion Vertices/sec (if all 8 Vertex Pipelines remain)
PS3 - 0.825 Billion Vertices/sec (if downgraded to 6 Vertex Pipelines)

Filtered Texture Fetch
Xbox 360 - 8.0 Billion Texels/sec
PS3 - 13.2 Billion Texels/sec (if all 24 Pixel Pipelines remain)
PS3 - 11.0 Billion Texels/sec (if downgraded to 20 Pixel Pipelines)

Vertex Texture Fetch
Xbox 360 - 8.0 Billion Texels/sec
PS3 - 4.4 Billion Texels/sec (if all 8 Vertex Pipelines remain)
PS3 - 3.3 Billion Texels/sec (if downgraded to 6 Vertex Pipelines)

Pixel Shader Processing with 16 Filtered Texels Per Cycle (Pixel ALU x Clock)
Xbox 360 - 24.0 Billion Pixels/sec (using all 48 Unified Pipelines)
Xbox 360 - 20.0 Billion Pixels/sec (using 40 of the 48 Unified Pipelines)
Xbox 360 - 18.0 Billion Pixels/sec (using 36 of the 48 Unified Pipelines)
Xbox 360 - 16.0 Billion Pixels/sec (using 32 of the 48 Unified Pipelines)
PS3 - 17.6 Billion Pixels/sec (if all 24 Pixel Pipelines remain)
PS3 - 13.2 Billion Pixels/sec (if downgraded to 20 Pixel Pipelines)

Pixel Shader Processing without Textures (Pixel ALU x Clock)
Xbox 360 - 24.0 Billion Pixels/sec (using all 48 Unified Pipelines)
Xbox 360 - 20.0 Billion Pixels/sec (using 40 of the 48 Unified Pipelines)
Xbox 360 - 18.0 Billion Pixels/sec (using 36 of the 48 Unified Pipelines)
Xbox 360 - 16.0 Billion Pixels/sec (using 32 of the 48 Unified Pipelines)
PS3 - 26.4 Billion Pixels/sec (if all 24 Pixel Pipelines remain)
PS3 - 22.0 Billion Pixels/sec (if downgraded to 20 Pixel Pipelines)

Multisampled Fill Rate
Xbox 360 - 16.0 Billion Samples/sec (8 ROPS x 4 Samples x 500MHz)
PS3 - 8.8 Billion Samples/sec (8 ROPS x 2 Samples x 550MHz)

Pixel Fill Rate with 4x Multisampled Anti-Aliasing
Xbox 360 - 4.0 Billion Pixels/sec (8 ROPS x 4 Samples x 500MHz / 4)
PS3 - 2.2 Billion Pixels/sec (8 ROPS x 2 Samples x 550MHz / 4)

Pixel Fill Rate without Anti-Aliasing
Xbox 360 - 4.0 Billion Pixels/sec (8 ROPS x 500MHz)
PS3 - 4.4 Billion Pixels/sec (8 ROPS x 550MHz)

Frame Buffer Bandwidth
Xbox 360 - 256.0 GB/sec (dedicated for frame buffer rendering)
PS3 - 22.4 GB/sec (shared with other graphics data: textures and vertices)
PS3 - 12.4 GB/sec (with 10.0 GB/sec subtracted for textures and vertices)
PS3 - 10.0 GB/sec (with 12.4 GB/sec subtracted for textures and vertices)

Texture/Vertex Memory Bandwidth
Xbox 360 - 22.4 GB/sec (shared with CPU)
Xbox 360 - 14.4 GB/sec (with 8.0 GB/sec subtracted for CPU)
Xbox 360 - 12.4 GB/sec (with 10.0 GB/sec subtracted for CPU)
PS3 - 22.4 GB/sec (shared with frame buffer)
PS3 - 12.4 GB/sec (with 10.0 GB/sec subtracted for frame buffer)
PS3 - 10.0 GB/sec (with 12.4 GB/sec subtracted for frame buffer)

Shader Model
Xbox 360 - Shader Model 3.0+ / Unified Shader Architecture
PS3 - Shader Model 3.0 / Discrete Shader Architecture

Xbox 360 has the advantage in most cases.

Some PS3 GPU (RSX) specs are still not confirmed. It's assumed to have 24 pixel pipelines, 8 vertex pipelines, 8 ROPS (raster), and 550MHz clock speed. But any of those could change, especially the clock speed.

Are there any other GPU spec categories worth adding?
http://forum.pcvsconsole.com/viewthread.php?tid=19237

Those are just numbers. But remember this post and see how it looks later.
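For what it's worth, the fill-rate rows in that table are just the arithmetic shown in the parentheses. A quick sketch redoing it (using the unconfirmed RSX numbers exactly as posted):

```python
# Reproducing the fill-rate arithmetic from the quoted table.
# RSX figures were unconfirmed at the time; numbers are as posted.

def fill_rate(rops, samples_per_rop, clock_hz):
    return rops * samples_per_rop * clock_hz

xbox360 = fill_rate(8, 4, 500e6)   # 16.0 billion samples/sec
ps3_rsx = fill_rate(8, 2, 550e6)   #  8.8 billion samples/sec

# With 4x multisampled AA, four samples go into each output pixel:
xbox360_4xaa = xbox360 / 4         # 4.0 billion pixels/sec
ps3_4xaa = ps3_rsx / 4             # 2.2 billion pixels/sec
```

The arithmetic checks out against the table; whether the inputs are right is another matter entirely.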
 
Most likely it's BS, since the specs of the RSX are UNKNOWN.

Also, is he saying the Cell would do all the graphics, or is he guessing numbers for the RSX?

That's basically the same kind of comparison MS PR people gave IGN last year: they focus on bogus numbers to favour their 3-core processor, and they always forget that the PS3's Cell is synergistic.

Remember the Dreamcast; as release gets near, the situation gets ever more familiar...

Think of the two consoles (360 and PS3) as two buildings: the 360 has 50 storeys and the PS3 has 100. Right now they're on the 25th floor with the 360 but only on the 10th floor with the PS3. What matters in the long run is who has the higher roof...
 
fasj6418
Most likely it's BS, since the specs of the RSX are UNKNOWN.

Also, is he saying the Cell would do all the graphics, or is he guessing numbers for the RSX?

That's basically the same kind of comparison MS PR people gave IGN last year: they focus on bogus numbers to favour their 3-core processor, and they always forget that the PS3's Cell is synergistic.

Remember the Dreamcast; as release gets near, the situation gets ever more familiar...

Think of the two consoles (360 and PS3) as two buildings: the 360 has 50 storeys and the PS3 has 100. Right now they're on the 25th floor with the 360 but only on the 10th floor with the PS3. What matters in the long run is who has the higher roof...

Exactly. Plus it's only numbers, most of which are made up. Has he got a PS3 in his living room, or more likely a sweaty little bedroom in his parents' house? I think not. I could make it up too: the PS3 is like 50 times more powerful than my PC and does a bazillion Gflops while making a cup of Yorkshire tea.

If anyone believes stuff like this they need to hold their horses. The PS3 isn't out, and until it gets out into the market nobody knows how powerful it is except the Sony boffins; not even IBM or Nvidia know, and they make the chips.

Ha ha ha, I've just read your post (Mr Deap) over again and noticed two things that make it oh so pointless:

1) PCvsConsole.com << this says it all; he's probably an MS fanboy, hating on Sony before he's even had his Shreddies in the morning.

2) maximum theoretical numbers << that's exactly what they are, nothing more; they're made up.

Edit #2: Just been to that site, and boy, is it an MS fanboy site. Wow, I've never seen more hating than on that site; geez, I need a rest after all that negativity. One guy even said it was brave of a poster to mention buying a PS3, which says it all really.
 
Theoretical numbers don't prove anything. The PS2 was supposed to play games with Toy Story-level graphical quality, and that never materialized. The original Xbox was supposed to be the most powerful console on the market, yet a few games on the GameCube pushed more polygons than the Xbox. This guy is talking out of his ass.
 
Toronado
No. It's just rendered twice as fast. That's why a game slows down when the processor is being overworked.
Gotcha. Understood.👍

Now, I'm wondering what the sound will be like on PS3 games. I've read that the Cell architecture is pretty good at sound processing. I'm certainly looking forward to playing favourite PS3 franchise games like MGS4, DMC4, GT5 etc in 5.1 as well as new games like Resistance. But how good can it be? I've always found the 5.1 on Xbox games a little sparse at times with only a few titles really making the most of it. I just hope these next-gen games will give my 500w amp a workout for once!
 
slackbladder
Gotcha. Understood.👍

Now, I'm wondering what the sound will be like on PS3 games. I've read that the Cell architecture is pretty good at sound processing. I'm certainly looking forward to playing favourite PS3 franchise games like MGS4, DMC4, GT5 etc in 5.1 as well as new games like Resistance. But how good can it be? I've always found the 5.1 on Xbox games a little sparse at times with only a few titles really making the most of it. I just hope these next-gen games will give my 500w amp a workout for once!

It should be pretty good, TBH. With Blu-ray they'll have enough room to store really high-quality sound, so I'd expect it to be close to DTS quality. But until we actually play one on a full 5.1 surround system I'm not 100% sure; I'd expect it to be great, though.
 
Recent Comments on Sony PS3


Steven Towns submits: With Sony Corp (SNE) having just reported quarterly earnings last Thursday (click here for summary) buzz about its fall launch of the PlayStation 3 has picked up and brought about some comments that are worth reviewing.


Takao Yuhara, Sony's senior VP in charge of investor relations, responded to press inquiries on Friday saying that Sony wanted to recoup development costs (estimated to be around 500 billion yen ($4.33b)) associated with the PS3 within 5 years -- the same time frame as for the PS2.

Note that Yuhara was quoted in the Jiji press as having said sales of the DVD version of The Da Vinci Code will generate profits for Sony's Pictures segment, and that the segment will without a doubt report higher profits on the year as compared to the year prior.

Yuta Sakurai, a Nomura Securities analyst, has been quoted all over the place based on a note to clients in which he predicted 71 million units of the PS3 being sold by 2011, versus 40 million units for Nintendo's (NTDOY) Wii. Sakurai said, "Hardcore gamers will probably prefer the raw power of PS3. And for all the positive vibes right now, the Wii isn't expected to outsell the PS3." Yet, to give Nintendo some credit, Sakurai said, "Software developers are increasingly interested in creating games for these Nintendo platforms."

Masafumi Oshiden, a money manager at Merrill Lynch Investment Managers in Tokyo said, "PlayStation 3 will be a huge money loser in the beginning." And, "I think Wii will sell better than the PS3." Note that ML analysts have been pessimistic on the PS3 all along, citing high costs and delays.

As for Sony's goal of boosting profitability, Bloomberg.com cited Kazuhara Mura, an analyst at Daiwa Institute of Research, who said, "If the PS3 doesn't sell well, Sony won't be able to recoup the investment in five years." That would obviously keep heavy downward pressure on company margins.

http://ce.seekingalpha.com/article/14685
 
I wouldn't say the PS3 has been delayed; it's been refined... It's clear it wouldn't have had any games if it had come out last spring, and its Blu-ray drive would have been 1x, not capable of an HD video transfer rate, with super-slow load times.
 
PlayStation 3 packs a punch for scientists

By Alpha Doggs on Mon, 07/31/2006 - 7:43pm

Scientists at Lawrence Berkeley Lab in California praised the processor inside Sony's PlayStation 3 as a lower-cost alternative to Opteron and Itanium chips, while being eight times faster and at least eight times more power-efficient.

The scientists have been evaluating the game console's chip, STI Cell, developed by Sony, Toshiba and IBM (the STI in STI Cell) running several scientific-application kernels against other processor architectures.

"Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency," say the LBL scientists who conducted the research. "We also concllude that Cell's heterogeneous multicore implementation is inherently better suited to the [high-performance computing] environment than homogenous commodity multicore processors."

According to an LBL article about the study, the Cell is compelling to the scientific community, which has high compute needs but often limited funds: the STI Cell's intended game market means it will be produced at high volume, making it a lower-cost alternative to conventional processors from AMD and Intel.

http://www.networkworld.com/community/?q=node/6493
 
Solid Lifters
According to an LBL article about the study, the Cell is compelling to the scientific community, which has high compute needs but often limited funds: the STI Cell's intended game market means it will be produced at high volume, making it a lower-cost alternative to conventional processors from AMD and Intel.
That almost makes me wonder, what if IBM makes the Cell so that you can use it as your processor for your PC? I know you need special things (can't remember what the "things" are though) for it, but you could be the ultimate multitasker.
 
Obviously those scientists are dummyheads for not consulting Chairmansteve at PCvsConsole.com. If they had, they would know that the personal computer is the ultimate supercomputer for all your computational needs. Or the XBox360. :D

Duck: the Cell processor is a Reduced Instruction Set Computer (RISC) chip, which means the devices it's plugged into must be able to run on its smaller set of operational instructions. The CPUs in PCs are full instruction set chips, which have all the instructions necessary to run the motherboards and interact with Windows OSes, PC software, printers, displays, hard drives, DX9, network cards, etc.

The reason devices like game consoles, synthesizers and other things use RISC processors like the Cell and its SPEs is that with fewer instructions to sort through, and a MUCH smaller, simpler OS, their performance is better. But those missing instructions are essential for PCs, so it's going to be a while before IBM and Intel produce a Cell-like CPU for PCs. It could be the next gen in CPUs, though.
 
Tenacious D
Obviously those scientists are dummyheads for not consulting Chairmansteve at PCvsConsole.com. If they had, they would know that the personal computer is the ultimate supercomputer for all your computational needs. Or the XBox360. :D

Duck: the Cell processor is a Reduced Instruction Set Computer (RISC) chip, which means the devices it's plugged into must be able to run on its smaller set of operational instructions. The CPUs in PCs are full instruction set chips, which have all the instructions necessary to run the motherboards and interact with Windows OSes, PC software, printers, displays, hard drives, DX9, network cards, etc.

The reason devices like game consoles, synthesizers and other things use RISC processors like the Cell and its SPEs is that with fewer instructions to sort through, and a MUCH smaller, simpler OS, their performance is better. But those missing instructions are essential for PCs, so it's going to be a while before IBM and Intel produce a Cell-like CPU for PCs. It could be the next gen in CPUs, though.

Actually I'm pretty sure that the PPE in the Cell processor is capable of running any current operating system, as was said by Sony and IBM execs.
 
That'd be f-ing sweet. Why are people worried nobody will buy a PS3 for $600? When the iPod first started selling for like $600, look how many people bought that, and it didn't even play games. Sony rules.
 
Some info about the Cell.

Perspective: Cell chip: Hit or hype?

By Michael Kanellos

Published: February 9, 2005, 12:01 AM PST

Will the Cell processor be the new Itanium? At the International Solid-State Circuits Conference on Monday, the joint developers of the long-awaited processor--Sony, Toshiba and IBM--unveiled a number of the details about it amid a surge of dramatic speculation. The New York Times said the chip could create "a new digital computing ecosystem that includes Hollywood, the living room and high-performance scientific and engineering markets."
Others speculated that the chip could drive everything from cell phones to servers, tying them into a grand computing grid.
"We believe a 10x performance over the PC, at the same power envelope, can be achieved," said IBM's Dac Pham, one of the designers of Cell. "It will usher in a new era of media-centered computing."
Intel's limping Itanium debuted with a similar level of fanfare. In 1994, the Microprocessor Report, examining the investment Intel planned to put behind the chip, predicted that it would become commonplace in desktops by 2004. It didn't happen.
Similarly, feelings ran high about the Emotion Engine, the microprocessor inside the original PlayStation 2 game console. Analysts said it could undercut chips from Intel and Advanced Micro Devices in PCs, and become the nerve center for DVD players and other home electronics. Toshiba even created a company, Artile, to license the Emotion.
But the Emotion Engine never migrated outside the PlayStation, and Toshiba snuffed out Artile in 2003. The PlayStation 2, meanwhile, didn't live up to the suggestion that it would serve as a conduit for movies, TV, e-mail and the Internet.
This sort of excitement and speculation about chips is driven by what I call the "Battlestar Galactica" principle. It goes as follows: If the domination of the universe isn't contested on a weekly basis, ratings will go down. Analysts, reporters, consumers and even executives need a gladiatorial contest to keep the job interesting.
The high-public profile of Sun Microsystems can partly be attributed to its role as the William Shatner of computing--donning a new uniform every three seasons to battle a new nemesis.
Put in that perspective, the Cell story starts to look different.
Going by papers presented at ISSCC, Cell looks like a tremendous achievement. However, this is the chip industry: Only a handful of companies--Samsung, Intel, Texas Instruments and Taiwan Semiconductor Manufacturing Co.--consistently produce profits. Most everyone else is seemingly always two steps away from the trailer park. Over the past few years, IBM Microelectronics has often reported quarterly losses. Cell will be a victory if it doesn't lead to layoffs.
In all likelihood, Cell will sell in far greater numbers than the just-as-trumpeted Itanium. Sony will put it into the PlayStation 3 video console. Unless gamers lose interest in stock cars, ninja stars and wiping out space aliens between now and 2006, that thing will sell. IBM and Toshiba will put it in products, too.
Still, whether the chip will be able to enter different markets is another question that hinges on factors such as:
Size: Cell contains 234 million transistors and takes up 221 square millimeters in the 90-nanometer production process. That's about double the size of the 90-nanometer 3.6GHz Pentium 4, with 112 square millimeters and 125 million transistors.
Big chips cost more to produce, can hide more bugs and can be tough to cram into portable devices. Cell will get cheaper when it goes to 65-nanometer production, but so will the alternatives.
Cost: Remember liquid crystal on silicon (LCOS)? The chip that would bring down the price of big-screen TVs? Intel and Brilliant Technologies failed at it. JVC and Sony succeeded. However, the latter two companies sell their LCOS chips to their own television units. The cost of the chip gets absorbed into the TV set.
Sony, Toshiba and IBM don't have to worry about the cost of Cell because they will sell it to themselves. It becomes part of a product that is tagged at a slightly higher price. An expensive Cell, however, will be a tough sell to any other manufacturers.
Alliances: Consumer electronics companies won't want to buy a processor from Sony and Toshiba. Similarly, not a lot of server manufacturers will line up to buy a Cell server chip from IBM. Why invite your rival to your top-secret design meetings?
Power: Cell will have to be air-cooled, IBM said. In other words, fans will probably be required. Ever talk on a cell phone with a fan?
While IBM didn't disclose the exact heat statistics, some at ISSCC said it could run as hot as 130 watts, more than most desktop and notebook chips. If Cell is in this range, kids will really be huddled around the PlayStation 3 at Christmas--for warmth.
On the cool engineering side, however, the chip will come with 10 digital heat sensors to warn of problems and another sensor to regulate temperature.
Memory: Cell comes with an integrated memory controller for high-performance XDR memory from Rambus--which means that the current design works exclusively with this pricey stuff. Sony used an earlier version of Rambus memory in the PlayStation, but it's been a tough sell outside of consumer electronics.
Cell is an outstanding achievement. But we have to wait and see whether it can get a job from someone other than its parents.

Biography

Michael Kanellos is editor at large at CNET News.com, where he covers hardware, research and development, start-ups and the tech industry overseas. He has worked as an attorney, travel writer and sidewalk hawker for a time-share resort, among other occupations.

Sony President Talks Cell, PS3


Submitted by Benjamin Nied
Last update: 06/15/2006


In our ever-expanding coverage of the wild world of Sony executives, Sony President Ken Kutaragi has given yet another interview. Apparently not satisfied with conquering and angering millions of gamers worldwide, Kutaragi discussed plans for putting the Cell processor - the CPU that powers the upcoming PlayStation 3 console - into other consumer electronics, cooling and shrinking the rather oversized PS3, and reducing the PS3's considerable power needs, while hinting that the PS3 might come with an external power supply, something that surely won't sit well with anyone who remembers how disastrous the Xbox 360's power bricks have been.

On the matter of adapting the Cell processor for use in home electronics, Kutaragi said that the Cell processor might be downsized from 8 Synergistic Processor Elements (the multiple cores found in the PS3 Cell CPU) to two, which is all that most consumer electronics would ever need. "Just two [SPEs] are all that's needed... But first we have to computerize home electronics. Currently this is not the case. This is the real problem," he said. "As a terminal, consider the flatscreen televisions in ordinary households. If you turn all these televisions into computers, the potential broadens greatly." He then joked, "One wrong move, and Intel and [Steve] Jobs will walk off with everything."

Kutaragi also commented on cooling the PS3, and from his comments, one could infer that an external power supply might be necessary to keep the heat at the lowest possible levels. "We'll use heat pipes and a custom cooling solution, but the methods used will be common," he said. "We definitely aren't using any proprietary methods... We certainly couldn't fit a liquid cooling system [inside the PS3]... We're spending a lot on heat and electromagnetic interference [management]. The power supply could almost be sold separately."
[Source: GameDaily BIZ]

EDITOR'S EYE

The World of Software Development.

by Jon Erickson

July 31, 2006
STI Cell: More Than a Game

Some people might call it a hack. But to me the STI Cell is just dessert. Originally designed by Sony, Toshiba, and IBM (the "STI" in "STI Cell") as the processor for Sony's Playstation 3 game console, the STI Cell is all of a sudden on track to be a building block for next-generation high-performance systems used in computational science.

To this end, computer scientists at Berkeley Labs are benchmarking the processor's performance in running several scientific-application kernels, then comparing its performance against other processor architectures.
"Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency," report researchers Samuel Williams, Leonid Oliker, Parry Husbands, Shoaib Kamil, Katherine Yelick, and John Shalf. "We also conclude that Cell's heterogeneous multicore implementation is inherently better suited to the [high-performance computing] environment than homogeneous commodity multicore processors."
Cell pairs a high-performance, software-controlled memory hierarchy with the considerable floating-point resources required by demanding numerical algorithms. Cell is different from conventional multiprocessor or multicore architectures. Instead of using identical cooperating processors, it uses a conventional high-performance PowerPC core that controls eight single-instruction, multiple-data cores called "synergistic processing elements" (SPEs), each of which contains a synergistic processing unit, a local memory, and a memory-flow controller.

In addition to its departure from mainstream general-purpose processor designs, Cell is interesting because the intended game market means it will be produced at high volume, making it cost-competitive with commodity central processor units. Moreover, the pace of commodity microprocessor clock rates is slowing as chip power demands increase, and these worrisome trends have motivated the community of computational scientists to consider alternatives like STI Cell.
Berkeley Lab researchers examined the use of the STI Cell processor as a building block for future high-end parallel systems by investigating performance across several key scientific computing kernels: dense matrix multiplication, sparse matrix-vector multiplication, stencil computations on regular grids, and one-dimensional and two-dimensional fast Fourier transforms. According to the research team, the current implementation of Cell is noted for its extremely high-performance single-precision (32-bit) floating-point resources. The majority of scientific applications require double precision (64 bits), however. Although Cell's peak double-precision performance is still impressive compared to its commodity peers (eight SPEs running at 3.2 gigahertz yield 14.6 billion floating-point operations per second), the group showed how a design with modest hardware changes, which they named Cell+, could improve double-precision performance.
They developed a performance model for Cell and used it to show direct comparisons of Cell against the AMD Opteron, Intel Itanium 2, and Cray X1 architectures. The performance model was then used to guide implementation development that was run on IBM's Full System Simulator, in order to provide even more accurate performance estimates.

The researchers argue that Cell's three-level memory architecture, which decouples main memory accesses from computation and is explicitly managed by the software, provides several advantages over mainstream cache-based architectures. First, performance is more predictable, because the load time from an SPE's local store is constant. Second, long block transfers from off-chip DRAM (dynamic random access memory) can achieve a much higher percentage of memory bandwidth than individual cache-line loads. Finally, for predictable memory-access patterns, communication and computation can effectively be overlapped by careful scheduling in software.
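That last point about overlapping communication and computation is classic double buffering. A rough Python stand-in (DMA transfers and SPE compute replaced by plain function calls, purely to show the scheduling pattern):

```python
# Sketch of software-scheduled overlap ("double buffering"): while the
# SPE computes on one buffer, the DMA engine fills the next one.
# dma_fetch and compute are placeholders for real DMA and SPE kernels.

def process_stream(chunks, dma_fetch, compute):
    results = []
    pending = dma_fetch(chunks[0])       # start the first transfer
    for nxt in chunks[1:]:
        buf = pending                     # previous transfer has landed
        pending = dma_fetch(nxt)          # kick off the next transfer early...
        results.append(compute(buf))      # ...and compute while it runs
    results.append(compute(pending))      # drain the final buffer
    return results

# Toy usage: "fetch" is identity, "compute" doubles each chunk's sum.
out = process_stream([[1, 2], [3, 4]], dma_fetch=list,
                     compute=lambda b: sum(b) * 2)
# out == [6, 14]
```

On real Cell hardware the fetches are asynchronous MFC DMA commands, which is what makes the overlap pay off; the Python version only shows the ordering.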
On average, Cell is eight times faster and at least eight times more power-efficient than current Opteron and Itanium processors, despite the fact that Cell's peak double-precision performance is fourteen times slower than its peak single-precision performance. If Cell were to include at least one fully usable pipelined double-precision floating-point unit, as proposed in the Cell+ implementation, these performance advantages would easily double.
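Those throughput figures are easy to sanity-check. Assuming, as commonly reported (my assumption, not stated in the article), that each SPE issues one 4-wide single-precision fused multiply-add per cycle, i.e. 8 flops/cycle:

```python
# Sanity check of the article's peak-throughput figures.
# Assumption: 4-wide SIMD x 2 flops per fused multiply-add per SPE cycle.

spes = 8
clock_hz = 3.2e9

sp_flops = spes * clock_hz * 4 * 2   # 204.8 GFLOP/s single precision
dp_flops = 14.6e9                    # double precision, as quoted

ratio = sp_flops / dp_flops          # ~14x, matching the article's claim
```

So the quoted "fourteen times slower" double-precision figure is consistent with the usual single-precision peak for eight SPEs at 3.2 GHz.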
Games or HPC. Everyone is having fun with this processor.
Posted by Jon Erickson at 10:18 AM

Why Cell?
Ever wonder why the Cell processor has been hailed as the next big thing?



A common question asked by many ChipGeek posters is, essentially, "Why has IBM and Sony's Cell processor received so much press, and why the interest?" The short answer is that it's not just a logical progression, but rather a new way of thinking about computing.

In fact, it's a trend we're seeing in every area of computing these days. Semiconductor companies are facing real challenges in bringing higher-clocked processors to market economically. They've chosen instead to add additional cores and maintain the same or similar clock speeds as older models. This concept of adding more cores is called "going wider." It definitely provides the potential for much greater throughput and processing power, and it has the added benefit of not requiring a much-improved and much more expensive manufacturing process. The latest batch of dual-core processors from Intel and AMD is just one sign of this growing trend, and it is Cell that takes it to an extreme.

Background
To understand why Cell has sparked such great interest in the industry we need to take a look at existing microprocessor designs and their history. For the sake of argument, let's break them down into what they are and what they do.

First, a microprocessor is a device that carries out a workload. The workload is not mechanical, but rather a series of math operations executed in sequence to produce real-world results. The precision with which that workload is carried out literally boggles the mind. It's truly amazing that microprocessors work at all.

Second, the workload is applied to a system that then allows its computed values to change the state of something, such as the data displayed on a monitor, the state of an LED on your keyboard, the magnetic "dots" on a hard drive, the relative position of your speaker horn through its electromagnetic properties, etc.

When you put it all together, it's nothing short of the most amazing piece of music you've ever heard being played by the most skilled orchestra you could imagine. And the funny part is we're not even at the apex yet. As precise as everything is today, there is still--at very small scales--jitter, fluctuation, and all kinds of flat/sharp-inducing qualities that, to the trained ear, would ruin the piece. Still, for the majority of us it is more than enough.

So, if we go back in history we find that in the beginning there was a single core doing all of the work. In fact, the early 80386 models incorporated a three-stage pipeline consisting of "fetch, decode, and execute." This pipeline performed excellently, doing everything necessary to process data. But it was quickly realized that if we changed the nature of how the workload was attacked and processed, even greater speed-ups would be possible.

Stepping forward, we see the 5-stage pipeline evolving into 10, 12, 20, and even 31 stages over time. This multi-year experiment has shown that there is a tradeoff between how deep you can go and how much real-world return you get, given the variable nature of software and its required processing steps. It seems that 10-15 stages is about ideal for today's high-speed microprocessors, with 13-20 stages being desirable only where streaming instruction sequences are possible.
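As a toy illustration of why pipelining looked so attractive in the first place, here is a minimal cycle-count sketch. It is illustrative only: real pipelines stall on hazards, branches, and memory, which is exactly the diminishing-returns tradeoff described above.

```python
# Toy model of pipelining: once a k-stage pipeline is full, it retires
# one instruction per cycle, so N instructions take N + (k - 1) cycles
# instead of N * k. Hazards and stalls are ignored here.

def cycles_unpipelined(n_instructions, stages=3):
    """Each instruction runs all stages before the next one starts."""
    return n_instructions * stages

def cycles_pipelined(n_instructions, stages=3):
    """Stages overlap: fill the pipeline once, then retire one per cycle."""
    return n_instructions + (stages - 1)

if __name__ == "__main__":
    n = 100
    print(cycles_unpipelined(n))  # 300
    print(cycles_pipelined(n))    # 102
```

For large instruction counts the ideal speedup approaches the number of stages, which is why designers kept going deeper until the stall and branch-misprediction costs ate the gains.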

Another practical reality hitting the semiconductor world is this: the faster you go, the more expensive it gets. In order for a processor to run faster the entire orchestra must go faster. Now, whereas for the metaphorical person playing the cymbals this might not truly be an issue, you can rest assured that there's someone in the orchestra who has to do some really, really fast and complex hand movements in order to work those 64th notes. As such, there's a physical limit as to how much faster the orchestra can go.

This example shows us that in order to make a processor go faster, the entire sum of everything inside the processor must be made to go faster. This requires an enormous effort and explains why we don't see faster and faster processors spewed out to the market every other week. It's really an absolutely amazing technological feat, and we should never forget that.

Multi-Core means better throughput
So if it's that much harder to go faster, what's the next best solution? It's the one that doesn't go faster, but still provides greater throughput. Enter the concept of multiple cores and you're halfway to the reason why Cell is so desirable.

In a multi-core scenario a single chip can now be operating on multiple tasks simultaneously, each working on its own set of fixed resources, thereby providing a relatively known increase in performance. One advantage of multi-core systems is that since more than one task can execute simultaneously, more than one program can run at the same time; and a single program whose workload can be broken into multiple isolated parts can be executed more quickly.
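The "broken out into multiple and isolated parts" idea can be sketched with a chunked summation. `ThreadPoolExecutor` is used here only for simplicity of the sketch; for CPU-bound Python code a process pool (or native threads in another language) is what actually occupies the extra cores, but the chunking logic is identical.

```python
# Sketch: dividing one large workload into independent chunks that
# separate cores could execute simultaneously. The result is identical
# to the sequential version; only the wall time changes on real cores.
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def split(n, workers):
    """Divide [0, n) into `workers` contiguous, disjoint ranges."""
    step, chunks = n // workers, []
    for i in range(workers):
        hi = n if i == workers - 1 else (i + 1) * step
        chunks.append((i * step, hi))
    return chunks

def parallel_sum(n, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, split(n, workers)))

if __name__ == "__main__":
    print(parallel_sum(1_000_000) == sum(range(1_000_000)))  # True
```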

The software necessary to take advantage of multiple cores adds an additional layer of complexity and requires greater skill from the software writer, whether via a compiler doing whatever it can or a skilled developer writing custom algorithms that exploit this level of parallelism. In short, it's harder to do. Herb Sutter captured this a while back when he declared that "the free lunch is over," meaning that software developers can no longer count on chipmakers to make their software go faster for them. Now it's up to the developers' skill.

Cell takes the multi-core concept, one born primarily in the world of general-purpose computing, and says, "What if we didn't really need such general-purpose computing, but rather could leverage specific abilities, streamlined and pipelined for the sole purpose of carrying out a particular type of data processing even faster than those general-purpose computers can do?" Cell addresses that question in its architecture.

Cell is Multi-core inside
Cell processors comprise many cores, but each of them is somewhat less than would be found in a straightforward, general-purpose core, such as Intel's Core Duo or AMD's X2 technology. In the Intel and AMD models the full capabilities of doing all kinds of computing at high rates of speed exist, which makes them very well suited for general-purpose computing. The same cannot be said for Cell processors. They do what they do very, very well, but in order to carry out more general-purpose workloads, many more sequences of instructions must be executed, meaning it takes much longer to process data.

Cell carries with it a specialized, or focused idea of computing, and it will be very well suited for its targeted applications. But make no mistake about it: while Cell might be introducing a concept that will eventually morph into what we end up seeing in the general-purpose computing world, it will not be Cell itself that takes the world by storm. It will be a marriage of existing multi-core, general-purpose computing models coupled to the realization that not all workloads require such generalized equipment.

The future
I still believe wholeheartedly that the future of microprocessor design must center on the reality that the designer cannot possibly know all of the varied and variable workloads that will come at microprocessors in the years ahead. As such, the central core must be designed in a completely flexible manner, allowing the plug-in, add-on abilities of whatever specialized pieces of technology are required to carry out the workload as efficiently as possible. I truly believe that with everything I've got inside. It's been my dream for a very long time, and one I wrote down on paper nearly 12 years ago.

Cell technology will change specific types of computing. It will add new power and abilities to specific industries, primarily those that have many sequential, specialized computing needs, and it will be well received once its makers get the manufacturing issues resolved. Today's high-end Cell processors have demonstrated enormous throughput potential for those specialized workloads. And what we are seeing is the promise of something greater, and the door opening to the realization that the single, general-purpose computer, while wholly desirable until about now, is about to change its face as well.

The future will be about multiple, specialized cores running in parallel to carry out the requisite workloads. And today's encapsulated angle of that vision resides in Cell technology. Tomorrow's version will not be quite the same, but it will be an evolution of what we see in Cell now.

Please post your thoughts below on Cell technology and what it has the potential to offer to the world.
 
it's a very good read. shows some different views. and honestly, it's very smart to be reasonable. the Cell will not flop, but it's also not the second coming of sliced bread. i believe it will be very successful, but i don't have the knowledge to say that it will be standard for PCs.

for gaming (and that's what matters most in this forum) i've read a lot of developers say that in the future what they will try to do is assign tasks to each core, kind of like one for the OS, one for graphics, one for AI. makes sense. if that happens then we could see a huge improvement in gaming.

what's interesting is that with the launch getting closer, and also TGS, the press is shifting its focus on the PS3 coverage. past the price point, there's nothing more to criticize, and we're beginning to see that.
 
I can't see the Cell becoming a PC CPU alternative to AMD or Intel, simply because of the FlexIO and XDR RAM needed; they are just too expensive. We already know they are in certain medical and military computers and in the new blade servers from IBM, but it won't make it to mainstream PCs any time soon, if at all.

Nice read Mr. Deap 👍 ohh and nice info Solid, shame everything that Merrill Lynch has to say is total pish.
 
I agree with sprite, I believe that the Cell's most widely used applications will be in devices like televisions, cell phones, and other household devices. This will let them do much more without a lot of power consumption and, for TVs more importantly, possibly lower the cost of HDTVs and raise their efficiency.
 
The Pre 2005 E3 Sony Press Show, available in the first post of this thread, shows how Cell can and will be used for multi-media. It's at the 1:06 mark.
 
I'm evil, so let me highlight the part where it shows it can't really replace a PC.

Cell is Multi-core inside
Cell processors comprise many cores, but each of them is somewhat less than would be found in a straightforward, general-purpose core, such as Intel's Core Duo or AMD's X2 technology. In the Intel and AMD models the full capabilities of doing all kinds of computing at high rates of speed exist, which makes them very well suited for general-purpose computing. The same cannot be said for Cell processors. They do what they do very, very well, but in order to carry out more general-purpose workloads, many more sequences of instructions must be executed, meaning it takes much longer to process data.

Cell carries with it a specialized, or focused idea of computing, and it will be very well suited for its targeted applications. But make no mistake about it: while Cell might be introducing a concept that will eventually morph into what we end up seeing in the general-purpose computing world, it will not be Cell itself that takes the world by storm. It will be a marriage of existing multi-core, general-purpose computing models coupled to the realization that not all workloads require such generalized equipment.

The Cell is a wonderful CPU, but for small devices & lowering the cost of future media devices, like I said some time before. Blu-ray & the Cell go together. Though I don't know how good the Cell will be for gaming.

You guys should remember that Cell isn't Emotion Engine 2. The Cell is made for a totally different purpose than the EE.
 
fasj6418
it's a very good read. shows some different views. and honestly, it's very smart to be reasonable. the Cell will not flop, but it's also not the second coming of sliced bread. i believe it will be very successful, but i don't have the knowledge to say that it will be standard for PCs.

for gaming (and that's what matters most in this forum) i've read a lot of developers say that in the future what they will try to do is assign tasks to each core, kind of like one for the OS, one for graphics, one for AI. makes sense. if that happens then we could see a huge improvement in gaming.

what's interesting is that with the launch getting closer, and also TGS, the press is shifting its focus on the PS3 coverage. past the price point, there's nothing more to criticize, and we're beginning to see that.

The Xenon works the same way. Though each vector won't be fully used & some will be overloaded. Keep in mind that each SPU will be limited to only one kind of task. Like I said before, I really doubt it will use all 7. One of the SPUs will do HDR, so no worry about HDR+AA. The GPU will do the AA & the Cell will do the HDR. The Xenon is not good enough to do HDR. In a way, I can't wait to see what the PS3 will look like, but I don't intend to buy one for 2006-2007.

The games in my opinion will come out very, very slowly in 2007. About one or 2 games a month. There will be only 2 or 3 titles at launch from what I know. What I mean is, it's pointless to buy a PS3 at launch. Though they need to sell so they can have the support of 3rd parties. The 360 is at a great advantage for the beginning of 2007.

Here a neat video I found in Gametrailer.
What the little girl thinks about the 360.
http://www.gametrailers.com/umwatcher.php?id=9449
 
You will never see a RISC CPU in a Windows-based PC. Windows requires so many instructions you would need a special Cell version of Windows to run its applications at similar speeds.

but then comes SPU array processing and parallel processing. SPU array is when all SPUs process the same data in steps; this works like multiple pixel shaders in GPUs. Parallel is when you run individual applications on single SPUs. The Cell is a parallel processor, and you would need software written for a multithreaded CPU; that's the only way the Cell can outperform a current PC CPU, if the OS and software are designed for it.

Mr. Deap, HDR and anti-aliasing are up to the developer and how they want to use them. The RSX can easily do HDR on its own (all Nvidia 6- and 7-series GPUs can)
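The two styles described above can be sketched roughly as follows. The worker functions standing in for SPUs are hypothetical and run sequentially here; the names and structure are illustrative only, not Cell's actual API.

```python
# Sketch of the two processing styles: "array" processing applies the
# same kernel to slices of one data set (like pixel shaders in a GPU),
# while "parallel" processing runs different tasks on different workers.
# Executed sequentially here; on real hardware each slice or task would
# go to its own SPU.

def array_style(data, kernel, n_workers):
    """All workers run the same kernel over disjoint slices of the data."""
    size = -(-len(data) // n_workers)  # ceiling division
    slices = [data[i:i + size] for i in range(0, len(data), size)]
    return [kernel(x) for s in slices for x in s]

def parallel_style(tasks):
    """Each worker runs its own independent task (OS, AI, audio, ...)."""
    return {name: task() for name, task in tasks.items()}

if __name__ == "__main__":
    # Array: one operation, many data elements (SIMD-like).
    print(array_style([1, 2, 3, 4], lambda x: x * x, n_workers=2))  # [1, 4, 9, 16]
    # Parallel: different jobs on different workers.
    print(parallel_style({"ai": lambda: "path found", "audio": lambda: "mixed"}))
```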
 
There are about 15 titles at launch.

And about 6 titles scheduled for release within the first quarter of 2007, 10 in the second quarter, 6 for Q3, and 7 for Q4. That's not including all the "TBA 2007" titles. That's 29 titles in 12 months, more than "2 per month".

Not to mention of those titles, these are all highly anticipated:

Half-Life 2
Assassin's Creed
MGS4
Heavenly Sword
GTA IV
VF5
Unreal 2K7

I can also list the TBA titles that are all highly anticipated...but why waste the time.

Point is, PS3 will have great software support from the start, unlike the 360, where it took months (nearly a year) to really get "good" software support. Only now are AAA titles beginning to see the light of day on the 360.
 
LaBounti
You will never see a RISC CPU in a Windows-based PC. Windows requires so many instructions you would need a special Cell version of Windows to run its applications at similar speeds.

but then comes SPU array processing and parallel processing. SPU array is when all SPUs process the same data in steps; this works like multiple pixel shaders in GPUs. Parallel is when you run individual applications on single SPUs. The Cell is a parallel processor, and you would need software written for a multithreaded CPU; that's the only way the Cell can outperform a current PC CPU, if the OS and software are designed for it.

Mr. Deap, HDR and anti-aliasing are up to the developer and how they want to use them. The RSX can easily do HDR on its own (all Nvidia 6- and 7-series GPUs can)

I may be mistaken, but I'm pretty sure I remember IBM execs talking about how the PPE is capable of running Windows OSes.

The SPEs do not make up the whole processor; they tag along with the PPE. If a motherboard were released that supported the Flex I/O of the Cell and supported XDR memory (IIRC the Cell requires both), then it should be no problem to run a Windows OS on the Cell processor. However, doing this without any additional software support would make the SPEs useless, since they would have no instructions being sent to them.
 