Sony "helped design" 360 processor

[quote name='zewone']I'm not sure if that's the case. RE5 is already looking better on the 360 than PS3.

http://www.the-horror.com/index.php?id=features&s=bh5demo

Valkyria Chronicles is a gorgeous game because of its art style, not because it's a GPU powerhouse.[/quote]


I saw comparison clips from a guy who played both demos (with mouseovers). I wish I could remember where I saw them, but the two versions looked slightly different yet pretty much the same. It's kind of like GTA IV, where which one is better depends on who you ask.

http://www.gametrailers.com/player/44237.html

And really, Killzone 2 is an average-looking game?
 
[quote name='H.Cornerstone']

And really, Killzone 2 is an average-looking game?[/quote]

It's a good-looking game, but nothing jaw-dropping or eye-opening.
 
[quote name='SL4IN']It's a good-looking game, but nothing jaw-dropping or eye-opening.[/quote]

I said the same thing about Uncharted and then I actually played it, and my jaw did drop. Sometimes, you just need to play a game yourself to appreciate it.

Either way, I'm done, as I don't want to take this thread any further off topic.
 
[quote name='H.Cornerstone']
And really, Killzone 2 is an average-looking game?[/QUOTE]
I don't own a PS3, nor do I know anyone who does, so I don't know what its games typically look like. Maybe it looks good for a PS3 game, but after Crysis and various other shooters with texture-replacement packs on the PC, it just looks average to me.
 
[quote name='H.Cornerstone']

Back to the point of the thread: the Wii uses a form of the PowerPC as well, so you could pretty much say Sony funded its development too. /sarcasm[/QUOTE]

Actually, that would be Ford. The Gekko chip used in the GameCube and revised for the Wii most closely resembles the PowerPC chips used in automotive controllers. The other big user of that version of the PowerPC architecture is, of course, Apple, but there it was Motorola (now Freescale) doing the design work. Moto and IBM ran a shared design center, with both companies getting access to the IP. Many of the vector ops in the Cell derive from work done for Apple under the marketing name AltiVec.
 
[quote name='epobirs']The original concept for the Cell was that Cells would scale with great efficiency as more of them were added. Some of the early Sony claims, long before there was any real silicon, were that the PS3 or another Cell-based device would be able to enlist Cells in other household devices for supplemental processing power. The idea that this could be used for games, especially on a console where homogeneity is critical, is completely absurd. It might be useful for lengthy non-interactive tasks like transcoding HD video, but dedicated silicon for that purpose has always been more cost-effective.

The 'box full of Cells' concept kept going for a few years of development, but two problems kept it from becoming a reality. Just as with most existing CPUs, scaling past two sockets carried a big efficiency hit. Anyone with a quad-core PC has seen this: most apps not specifically designed for it don't make much better use of four cores than of two. The gain for most users is that most systems are running more than one big app these days. In servers, techniques like virtualization allow the resources to be allocated so that no single app really sees all of the processing elements.

The Cell really isn't a multicore processor in the same sense as a Core 2 Quad or a Phenom. It is a single PowerPC core with a bunch of specialized execution units bolted on. It has a lot of power, but that power is used very differently than in something like a system with multiple x86 processors.

So putting a bunch of Cells in a box wasn't going to deliver as planned. This was just as well, because the Cell was proving to be a bitch to manufacture. Much as when the PS2's Emotion Engine was first produced, a smaller process node was badly needed for good yields and pricing. Putting multiple Cells in a product would have made it so expensive as to make the eventual losses from the actual launch PS3s seem minor. So they went shopping for a GPU to use in a more conventional design.

If you look at videos from the first E3 where Sony demoed the PS3, even though no actual PS3 boards yet existed, you'll see two different sets of demos. Some are Cell demos and others are Nvidia GPU demos running on a PC. This is because no systems integrating the two existed yet, so there had certainly been no time to do any programming on one.

If you were starting from scratch today with the available parts, the original PS3 concept could be implemented. It would still have the scaling issues, but compiler support for genuine multi-processor systems is a lot better now than just a few years ago. In general, though, it's unlikely anyone would see it as a good design to pursue for an entertainment system. For other apps it makes more sense to put Cells on blades to be installed in a server rack rather than make a standalone system.[/quote]

Thanks for the info, epobirs. Would you say similar graphical performance could have been achieved if the PS3 had two or three Cells instead of one Cell plus the Nvidia GPU?
 
Can a game that looks as good as Killzone 2 be produced on the 360, or will MS need a new console to get a game that's comparable graphically?
 
[quote name='Maklershed']Hey Thomas, I see you have an xbl name linked to your account now. Did you get in on that XBL deal from Amazon/Walmart?[/QUOTE]

No, I'm broke. My XBL is Thomaticus 305. I need to get Gears 2 and a one-year online subscription; if I had an extra $30 I would have bitten on it. (When I created Thomaticus 305, Thomaticus had been taken, but I think it was me who created it; there's no way to find out, at least not from MS.) I linked it to my CAG account to see if Thomaticus was actually being used by someone, but I don't think it is. Now I need to link Thomaticus 305 to my account.
 
[quote name='rickonker']Thanks for the info, epobirs. Would you say similar graphical performance could have been achieved if the PS3 had two or three Cells instead of one Cell plus the Nvidia GPU?[/QUOTE]

Hard to say. The demos were pretty good for their era but not really graphical splendors; they were more concerned with getting the graphics to do interesting stuff. This is largely the same path the big GPU companies are pursuing: putting more and more general-purpose logic in their products and improving the tool set for exploiting it. Similarly, the new graphics product Intel is developing will inherently be very easy to program for other tasks because it is a collection of modified '90s-era Pentiums. At today's manufacturing sizes for chip features, what was once a big CPU can be tiny and one of many on the same die. Intel doesn't seem to be in any great rush to make this a product, though. I didn't see any reference to it in their CES booth yesterday, but I didn't have as much time as I would have liked before I got horribly ill and had to go back to my car. (I paid ten bucks to take a cab a distance I'd normally have walked.)

I regard the original PS3 concept as an example of what I call '7th Guest Syndrome.' For those who don't remember, The 7th Guest was a PC game first announced during the 386 era that promised to do extraordinary things for its day. They had some remarkable tech demos to back up their claims, but getting from there to a releasable game took far too long. By the time the game shipped, the average machine owned by PC gamers had increased in power considerably, which made the accomplishments of the software much less impressive. Also, other developers had been working toward the same goals but had done so quietly, so they weren't committed to a release date they could never hope to make.

Unless you're trolling for investors, it's a bad idea to talk up technology you don't really have yet.
 
Since they went with a regular GPU for the PS3 anyway, I wonder if they would have been better off with a more conventional CPU design instead of those execution units. If those were intended to help with the graphics, doesn't the Nvidia GPU make them less useful and just harder to use in general? Maybe it was too late to change?
 
[quote name='rickonker']Since they went with a regular GPU for the PS3 anyway, I wonder if they would have been better off with a more conventional CPU design instead of those execution units. If those were intended to help with the graphics, doesn't the Nvidia GPU make them less useful and just harder to use in general? Maybe it was too late to change?[/QUOTE]

Power is power. Only a few areas of the Cell were designed with graphics as the target. This is no different from the Xbox or from x86 CPUs since MMX was introduced. SIMD instructions may be best applied to processing graphics, but they aren't so specific as to eliminate their value in other algorithms.
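
To make that concrete, here's a minimal sketch in C using x86 SSE intrinsics, applying SIMD to something as mundane as summing an array. (This is an analogy only; the Cell's SPEs and the 360's VMX128 have their own vector instruction sets, but the idea carries over.)

[code]
/* Summing an array four floats at a time with SSE.
 * Compile with: gcc -O2 -msse2 simd_sum.c */
#include <stdio.h>
#include <emmintrin.h> /* SSE intrinsics */

float simd_sum(const float *a, int n)
{
    __m128 acc = _mm_setzero_ps();
    int i;

    /* Vector loop: four elements per iteration. */
    for (i = 0; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_loadu_ps(a + i));

    /* Horizontal sum of the four lanes. */
    float lanes[4];
    _mm_storeu_ps(lanes, acc);
    float total = lanes[0] + lanes[1] + lanes[2] + lanes[3];

    /* Scalar tail for any leftover elements. */
    for (; i < n; i++)
        total += a[i];
    return total;
}

int main(void)
{
    float data[10] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    printf("sum = %.1f\n", simd_sum(data, 10)); /* prints 55.0 */
    return 0;
}
[/code]

Nothing about that loop is graphics-specific; the same trick speeds up audio mixing, physics, compression, you name it.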

The problem with the Cell is that it uses a structure most common in '80s supercomputers. Those generally ran a LOT of highly optimized custom programming and were rarely applied to interactive workloads. This meant there was very little precedent in the game development world for how to apply such a beast. The benchmarks are real, but they're not an accurate measure of how readily the power can be harnessed to the needs of a game.

In the long run, the PS3 will almost certainly have a higher performance ceiling than the Xbox 360, but the value of this is questionable. Microsoft had the highest performance ceiling of the previous generation, and by a considerable amount, yet it was the least powerful machine of that generation that held the top position, and by a considerable margin. The PS2 is still selling units, while the original Xbox ended production, and new game development, years ago.

By the time the PS3 achieves a position of dominance, it will be simple for Microsoft to release a new, backward-compatible Xbox that jumps ahead in performance. By 2010 the premium for using a Blu-ray drive instead of DVD will be only about $40, possibly less, depending on how sales of Blu-ray decks go. (Distribution by download will grow, but it will be many years before the need for a cheap, high-capacity retail medium goes away.)

On the chipset front, 32 nm manufacturing will be available, allowing the new silicon to have over twice the transistor count while still launching at a lower price than the original 90 nm 360 chipset. The upgrade path on the GPU side is pretty straightforward, since ATI has continued developing many of the concepts first demonstrated in the 360 GPU. The CPU will be more of a problem if big performance gains are desired. Clock rates just haven't gone up at all for a few years now. The simplest thing would be to double the number of cores and expand the caches all around. That puts a lot more potential power in the box, but scaling is always a problem if all of that power is to be used by a single app like a game.
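
To put rough numbers on that scaling problem, Amdahl's law says the speedup on n cores is 1 / ((1 - p) + p/n), where p is the fraction of the work that can run in parallel. A quick sketch in C (the p values are illustrative assumptions, not measurements of any real game):

[code]
/* Amdahl's law: why doubling cores doesn't double performance.
 * speedup(n) = 1 / ((1 - p) + p / n), p = parallel fraction.
 * The p values below are assumptions for illustration only. */
#include <stdio.h>

static double amdahl(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    double p[] = {0.5, 0.7, 0.9};
    int cores[] = {2, 4, 8};

    for (int i = 0; i < 3; i++) {
        printf("parallel fraction %.0f%%:", p[i] * 100.0);
        for (int j = 0; j < 3; j++)
            printf("  %d cores -> %.2fx", cores[j], amdahl(p[i], cores[j]));
        printf("\n");
    }
    return 0;
}
[/code]

Even at 70% parallel, eight cores only buy you about a 2.6x speedup, which is why the extra silicon tends to get spent elsewhere.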

OTOH, extra cores can be set aside for non-game functions like background downloading and DVR duties. (Selling an add-on package that enables the Xbox to also be a DVR could be a good way of offsetting the hit from selling a newly launched system below cost. In the past this wasn't practical because game systems didn't have the power to spare for recording and playing simultaneously, but future machines will need additional functions to justify all of that power in one box.)

So the PS3 may enjoy a longer lifespan for early adopters, but for the majority of the market, will it matter?
 
Thanks again, epobirs. I think MS demonstrated IPTV DVR software for the 360, but I don't know if it could record while a game was being played. I guess it would have to.
 
[quote name='Paco']I heard that Alan Wake is Duke Nukem's younger brother.[/QUOTE]

I saw a clip of Alan Wake on XBL, and the game doesn't look as good as Left 4 Dead. Also, if "light" is going to be a main weapon in the game, I would hope they improve the lighting from what I saw in the trailer. Alan Wake looks like an N64 game.
 
Killzone 2 is roughly a third-generation PS3 game. If you compare third-generation PS3 games to the third (or fourth) generation of 360 games, there's a difference in graphical quality.
 
[quote name='Maklershed']Look at Resistance 2 and then look at Gears of War ... 1 ... and get back to me.[/quote]

They both have their pluses and minuses. It's hard to compare third-person and first-person shooters like that. I would say Resistance 2 looks better than Gears of War because it has color and much bigger scale. I would say Gears of War 2 looks a little better, but again, only because it doesn't have anywhere near the scale Resistance 2 has.

Gears of War 2 is a third-generation Xbox 360 game, whereas Resistance 2 is second-generation. :)

Let's compare Uncharted 2 and Gears of War 2 at the end of the year. They are much more similar and will make for a better comparison.
 
[quote name='Maklershed']I was saying Gears of War 1 looks better graphically than Resistance 2.[/quote]

I would disagree, though again it's all opinion. I would go with Resistance just due to the sheer scale they are able to pull off while still making the game look great. But they are close, and I think you could attribute that to Insomniac overreaching on the MP and SP.

But again, it's hard to compare an FPS and a third-person shooter. Gears of War doesn't really have as much on screen at once as Resistance.
Now, if you wanted to compare Uncharted and Gears of War 1, that would be a much better comparison, and I think most people would go with Uncharted.
 
I agree that Uncharted is a much prettier game than Gears of War. What's interesting about graphics is just how much design comes into play when we discuss "good" graphics. For example, I think Mario Galaxy is still one of the best looking games this console generation despite the Wii's obvious lack of power comparatively. Best looking? Not quite, but certainly ranks up there.
 
[quote name='Maklershed']I was saying Gears of War 1 looks better graphically than Resistance 2.[/QUOTE]

Gears 1 looks better than Resistance 2? You know what, for the sake of argument, don't use Resistance 2; use MGS4. Gears (1 and 2) is the best the 360 has to offer graphically, and MGS4 is the best the PS3 has to offer. Looking at each console's best, there are two things I can say: 1) second-generation PS3 games look better than third-generation 360 games, and 2) 360 programming is as easy as it's going to get, so where do 360 games go after the third generation? I think it's obvious that PS3 games have improved tremendously over the past few years: Warhawk, to Uncharted, then to MGS4, and on to Killzone.
 
[quote name='elwood731']I agree that Uncharted is a much prettier game than Gears of War. What's interesting about graphics is just how much design comes into play when we discuss "good" graphics. For example, I think Mario Galaxy is still one of the best looking games this console generation despite the Wii's obvious lack of power comparatively. Best looking? Not quite, but certainly ranks up there.[/QUOTE]

All the Mario games have been so well done that their graphics still look good. Obviously you can see the limitations of the console, but the design still holds up. I may be behind the times on this, but I feel that Wave Race 64 still has the best-looking water, and water physics, of any game.
 
[quote name='elwood731']I agree that Uncharted is a much prettier game than Gears of War. What's interesting about graphics is just how much design comes into play when we discuss "good" graphics. For example, I think Mario Galaxy is still one of the best looking games this console generation despite the Wii's obvious lack of power comparatively. Best looking? Not quite, but certainly ranks up there.[/quote]

You are correct: when it comes to making a pretty-looking game, there is much more to it than graphical prowess (which is what we are talking about here); art style and design choices matter just as much. That is one of the reasons God of War is the best-looking game on the PS2: it had both great art style and graphical prowess. Another game we mentioned was Valkyria Chronicles. It doesn't use a lot of power, but it looks great thanks to its art style, and it uses just enough power to pull that style off.

Which goes back to why many think Uncharted looks better. What is going to look better: a black-and-gray planet, or a beautiful jungle with lush plants and giant waterfalls? (To this date, that's still the first time a game made my jaw drop from how good it looked.)

I think Mario Galaxy looks great for a Wii game, but it's hard to compare it to other games on PS3/360.
 
Uncharted looked nicer, but Gears really nails the post-apocalyptic look, as well as the underground, especially Gears 2, since it's more varied and has levels in forests, etc.

The graphics in Uncharted and the Gears series are great and both do what they should for the type of game IMO.

Essentially, the 360 and PS3 are nearly identical graphics-wise, or at least close enough that graphics power shouldn't factor into a decision on which to buy. Just get whichever has the games you want to play; if both have a lot of games you want, and you have the time to play them all and the money to buy them, then buy both.

No sense arguing about which is better at this or that on the internet. Just buy a console (or both) and enjoy the games on it.
 
I was just thinking: some months ago, Crytek (or some company) was showing that they got the Crysis engine to work on the PS3 and 360. Have there been any games that utilize the Crysis engine on the PS3 and/or 360? If not, are there some games in development that will?
 
[quote name='rickonker']Thanks again, epobirs. I think MS demonstrated IPTV DVR software for the 360, but I don't know if it could record while a game was being played. I guess it would have to.[/QUOTE]

There is a really important difference between IPTV and full standalone DVR capability. If you have ever used an integrated DVR with a satellite/cable TV setup, you may have noticed that the quality of the recordings was higher than with a standalone device added to a TV reception system. This is because the integrated DVR sees the compressed data stream from the sender and simply records the bits unaltered as they go by. A standalone box, such as an original TiVo or a Media Center PC, has to take the decompressed video stream and encode it to whatever codec is being used (MPEG-2 in older gear, an MPEG-4 variant in newer machines).

Obviously, just capturing a data stream without having to perform much work on it takes far less horsepower than having to encode video while capturing. A game console that can continue running content downloads in the background while playing a game won't have much trouble including IPTV content in those downloads. It's just another file. Doing the full standalone DVR recording task while allowing games to be played would be far more demanding.
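
To see how cheap the pass-through case is, here's a sketch in C of integrated-DVR-style capture: since the sender's stream is already compressed, "recording" is little more than a copy loop. (stdin and stdout stand in for the tuner and the disk; this is an illustration, not any console's actual API.)

[code]
/* Integrated-DVR-style capture: the broadcast stream is already
 * compressed, so recording it is just copying bytes as they go by.
 * Usage: ./capture < broadcast.ts > recording.ts
 * A standalone DVR would instead have to decode and re-encode
 * every frame, which is vastly more work. */
#include <stdio.h>

int main(void)
{
    unsigned char buf[188 * 64]; /* MPEG transport packets are 188 bytes */
    size_t n;

    while ((n = fread(buf, 1, sizeof buf, stdin)) > 0)
        fwrite(buf, 1, n, stdout);
    return 0;
}
[/code]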

This would require a fair amount of processing power to be dedicated to the task and never made available for game developers to exploit. Otherwise, game developers are going to use it and break the DVR functionality while their game is running. It's their job to get as much as possible out of the machine.

The alternative is a console with hardware features usable solely for a non-gaming function. Not a problem if you feel assured tens of millions of people will want to buy such a unit but in real life you'll get a lot of people saying they already have a device for DVR stuff or just don't care about it, so why can't they have a cheaper model without it?

So having a generational upgrade that doesn't apply to game developers is an iffy thing. Background downloading during gameplay would be nice and that would encompass IPTV material. Beyond that, things get very speculative. This is why just throwing more cores in the new system is not a slam dunk solution. The GPU upgrade path is pretty clear cut since almost everything happening in that business can be applied to gaming, while CPU needs are far more narrow than for a full on PC or server.
 
[quote name='epobirs']There is a really important difference between IPTV and full standalone DVR capability. If you have ever used an integrated DVR with a satellite/cable TV setup, you may have noticed that the quality of the recordings was higher than with a standalone device added to a TV reception system. This is because the integrated DVR sees the compressed data stream from the sender and simply records the bits unaltered as they go by. A standalone box, such as an original TiVo or a Media Center PC, has to take the decompressed video stream and encode it to whatever codec is being used (MPEG-2 in older gear, an MPEG-4 variant in newer machines).

Obviously, just capturing a data stream without having to perform much work on it takes far less horsepower than having to encode video while capturing. A game console that can continue running content downloads in the background while playing a game won't have much trouble including IPTV content in those downloads. It's just another file. Doing the full standalone DVR recording task while allowing games to be played would be far more demanding.

This would require a fair amount of processing power to be dedicated to the task and never made available for game developers to exploit. Otherwise, game developers are going to use it and break the DVR functionality while their game is running. It's their job to get as much as possible out of the machine.

The alternative is a console with hardware features usable solely for a non-gaming function. Not a problem if you feel assured tens of millions of people will want to buy such a unit but in real life you'll get a lot of people saying they already have a device for DVR stuff or just don't care about it, so why can't they have a cheaper model without it?

So having a generational upgrade that doesn't apply to game developers is an iffy thing. Background downloading during gameplay would be nice and that would encompass IPTV material. Beyond that, things get very speculative. This is why just throwing more cores in the new system is not a slam dunk solution. The GPU upgrade path is pretty clear cut since almost everything happening in that business can be applied to gaming, while CPU needs are far more narrow than for a full on PC or server.[/QUOTE]
I guess Sony has tried some things along those lines. An SPE in the PS3 is reserved for the OS so games can't use it. They also tried the PSX, a PS2 with a DVR, but I think it was a flop. In that case, you could say the regular PS2 was the cheaper model without DVR and it was obviously much more popular. On the other hand I think the PSX was really overpriced, like a luxury item.
 
[quote name='rickonker']I guess Sony has tried some things along those lines. An SPE in the PS3 is reserved for the OS so games can't use it. They also tried the PSX, a PS2 with a DVR, but I think it was a flop. In that case, you could say the regular PS2 was the cheaper model without DVR and it was obviously much more popular. On the other hand I think the PSX was really overpriced, like a luxury item.[/QUOTE]

The reason the PSX was so costly was that you were buying an entire DVR with complete functionality independent of the PS2 portion; they just happened to be stuffed in the same box. The PS2 alone was far from capable of handling the encoding in real time without completely tying up the system, meaning no games or watching other recordings at the same time. The PS2 had a complete MPEG-2 decode solution in hardware, but since games tended to use it, it couldn't be enlisted for the DVR stuff. Back then, the makings of a DVR were a lot more expensive.

Sony could do this today for a much smaller premium than was needed then, but now they'd be up against hardware offered directly by the satellite and cable companies, which delivers better quality by recording the compressed data stream directly. Far better for console companies to offer video downloads and VOD, which only require software support on existing consoles.

The reservation of the SPE in the PS3 is largely for OS overhead. There is a thread reserved on the Xbox 360 CPU for largely the same reasons. This is why you can pause a game to go out to the system menu and come back to where you left off. It also handles things like the communications for online gaming, downloads, etc. That makes supporting this stuff much easier on developers than in the previous generation, which in turn makes developers more willing to support it, even in games that don't really demand it.
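
As a rough analogy for how a core or SPE gets fenced off for system work, here's a Linux pthreads sketch that pins a background "system" thread to one CPU. (The consoles enforce the reservation at a much lower level in their OSes; pthread_setaffinity_np is just the nearest everyday equivalent.)

[code]
/* Reserving a core for system tasks, Linux-style analogy.
 * Compile with: gcc -pthread reserve.c */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>
#include <unistd.h>

static void *system_tasks(void *arg)
{
    (void)arg;
    /* Stand-in for downloads, voice chat, the system menu, etc. */
    puts("system thread running on its reserved core");
    sleep(1);
    return NULL;
}

int main(void)
{
    pthread_t t;
    cpu_set_t set;

    pthread_create(&t, NULL, system_tasks, NULL);

    /* Pin the system thread to core 0. Game threads would be
     * created with affinity masks that exclude that core. */
    CPU_ZERO(&set);
    CPU_SET(0, &set);
    pthread_setaffinity_np(t, sizeof set, &set);

    pthread_join(t, NULL);
    return 0;
}
[/code]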
 