[quote name='Lan_Zer0']Good, so we can expect even more from the Wii?[/quote]
I have looked at hardware comparisons for last-gen systems a lot, and there is some stuff about the architecture I don't completely understand. I'll try to piece this together to the best of my ability, but if any proper computer architecture person (preferably post-undergraduate) wishes to correct me on any claims, I'd really appreciate it.
The GameCube had a 485 MHz IBM "Gekko" PowerPC CPU and a 162 MHz "Flipper" GPU (developed by ArtX). (SOURCE) If you compare this to the original Xbox's specs, you will see that in terms of pure numbers alone it ought to outperform the GameCube by a wide margin. (You can judge for yourself how the actual games on the two systems compare.)
The original Xbox had a 32-bit, 733 MHz Mobile Celeron CPU and a 233 MHz "NV2A" GPU. (SOURCE) The improvement over the GameCube is massive, but there are other items in the equation apart from a chip's ability to churn out numbers. I won't make a scientific claim (electrical engineers would be much better qualified for that), and will instead just show the raw numbers for the Wii.
Oh, that's right. We DON'T HAVE THOSE AT ALL. Not officially, anyway. However, there's this IGN quote:
[quote name='IGN']Insiders stress that Revolution runs on an extension of the Gekko and Flipper architectures that powered GameCube, which is why studios who worked on GCN will have no problem making the transition to the new machine, they say. IBM's "Broadway" CPU is clocked at 729MHz, according to updated Nintendo documentation. By comparison, GameCube's Gekko CPU ran at 485MHz. The original Xbox's CPU, admittedly a different architecture altogether, was clocked at 733MHz. Meanwhile, Xbox 360 runs three symmetrical cores at 3.2GHz.[/quote]
An outline of the other technical specs can be found on the Wikipedia Wii page:
PowerPC "Broadway" CPU reportedly clocked in at 729 MHz.
ATI "Hollywood" GPU reportedly clocked in at 243 MHz.
Here's a very nice bit:
[quote name='IGN']Clearly, numbers don't mean everything, but on paper Revolution's CPU falls performance-wise somewhere well beyond GameCube and just shy of the original Xbox. However, it's important to remember that there is no way to accurately gauge the performance difference between GCN's PowerPC-based architecture and the Intel-based CPU of Xbox. Further, even if we could, these numbers are only one part of the equation.
Revolution's ATI-provided "Hollywood" GPU clocks in at 243MHz. By comparison, GameCube's GPU ran at 162MHz, while the GPU on the original Xbox was clocked at 233MHz. Sources we spoke with suggest that it is unlikely the GPU will feature any added shaders, as has been speculated.
"The 'Hollywood' is a large-scale integrated chip that includes the GPU, DSP, I/O bridge and 3MBs of texture memory," a studio source told us.
The overall system memory numbers we reported last December have not greatly fluctuated, but new clarifications have surfaced. Revolution will operate using 24MBs of "main" 1T-SRAM. It will additionally boast 64MBs of "external" 1T-SRAM. That brings the total number of system RAM up to 88MBs, not including the 3MB texture buffer on the GPU. By comparison, GameCube featured 40MBs of RAM not counting the GPU's on-board 3MBs. The original Xbox included 64MBs total RAM. Xbox 360 and PlayStation 3 operate on 512MBs of RAM.[/quote]
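To spell out the memory math in that last paragraph, here's the same kind of sketch (my arithmetic, using the figures IGN reports; note that the 24 + 16 breakdown for the GameCube is the commonly cited 1T-SRAM + A-RAM split, which isn't stated in the quote itself):
[code]
# Summing the memory figures from the IGN quote -- reported, not official.
wii_ram_mb      = 24 + 64   # "main" 1T-SRAM + "external" 1T-SRAM = 88 MB
gamecube_ram_mb = 24 + 16   # commonly cited 1T-SRAM + A-RAM split = 40 MB
xbox_ram_mb     = 64        # original Xbox, unified memory

print(f"Wii:      {wii_ram_mb} MB (+ 3 MB texture memory on 'Hollywood')")
print(f"GameCube: {gamecube_ram_mb} MB (+ 3 MB embedded memory on 'Flipper')")
print(f"Xbox:     {xbox_ram_mb} MB total")
print(f"Wii vs GameCube: {wii_ram_mb / gamecube_ram_mb:.1f}x, "
      f"Wii vs Xbox: {wii_ram_mb / xbox_ram_mb:.2f}x")
[/code]
So the Wii more than doubles the GameCube's RAM and modestly exceeds the original Xbox's, while still sitting far below the 512MBs of Xbox 360 and PS3.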