PHaLaNX GTR
CAGiversary!
The PS2 had, by far, the strongest CPU, but nothing resembling a full GPU. Its graphics chip was basically just a rasterizer, so the CPU had to do most of the graphical rendering work in "software," which put the PS2 on an uneven battleground. The other consoles, on the other hand, took that burden off their CPUs by using hardware designed specifically for graphics rendering (a GPU). That gave the Xbox a definitive advantage over the PS2 and let the NGC go head to head with it, and at times pull ahead. Microsoft had Nvidia in its GPU corner and Nintendo had ATI.

Now, with the release of information regarding the next generation of game systems, Sony has just announced that it will have the always venerable Nvidia at its side. If the Cell is as capable for its time as its predecessor, the Emotion Engine, then Sony may just have the one-two punch, if all goes well. Then again, from what I've surmised, MS seems to have something up its sleeve with its next-gen processor too, since it appears to share a few ideas with the Cell, with some kind of parallel processing where multiple processor cores work together as one. So this should be interesting as always. I can't wait to see the combined might of these two processors working together.
Hey, IGN can be useful at times.
http://ps2.ign.com/articles/571/571460p1.html
December 07, 2004 - In an announcement made around the midnight hour earlier this morning, Sony Computer Entertainment and NVIDIA Corporation jointly confirmed that they have teamed up to create the graphics chip for Sony's next videogame console. Though the system has not yet been given an official name, the highly anticipated PlayStation 2 follow-up (known in most circles as the PS3) will certainly have some powerful hardware behind it -- as the SCEI/NVIDIA collaboration will incorporate the next-generation GeForce technology as well as Sony's system solutions for Cell Processor-enabled consoles.
The collaboration itself has been made with a multi-year royalty-driven agreement in mind, with the custom GPU serving as the foundation for the PS3's graphics and image processing functions. Interestingly enough, this agreement will go beyond the confines of just the PS3, however, and will also apply to future Sony digital electronic products as well. The custom graphics chip will be manufactured by Sony's Nagasaki Fab2 group in addition to OTSS -- a joint fabrication facility co-run by Toshiba and Sony.
"In the future, the experience of computer entertainment systems and broadband-ready PCs will be fused together to generate and transfer multi-streams of rich content simultaneously. In this sense, we have found the best way to integrate the state-of-the-art technologies from NVIDIA and SCEI," said Ken Kutaragi, executive deputy president and COO, Sony Corporation, and president and Group CEO, Sony Computer Entertainment Inc. "Our collaboration includes not only the chip development but also a variety of graphics development tools and middleware, essential for efficient content creation."
"We are thrilled to partner with Sony Computer Entertainment to build what will certainly be one of the most important computer entertainment and digital media platforms of the twenty-first century," added Jen-Hsun Huang, president and CEO, NVIDIA. "Over the past two years NVIDIA has worked closely with Sony Computer Entertainment on their next-generation computer entertainment system. In parallel, we have been designing our next-generation GeForce GPU. The combination of the revolutionary Cell processor and NVIDIA's graphics technologies will enable the creation of breathtaking imagery that will surprise and captivate consumers."
Speculation as to when further announcements regarding the PlayStation 3 (or whatever it ends up being called) will come continues to run rampant throughout the industry. Several industry insiders point to E3 for the big unveiling, while others feel that the GDC may provide a surprise reveal early next year. Whatever the case may be, IGN will be there to bring it to you.
-- Jeremy Dunham