[quote name='SteveMcQ']Question: I have a monitor I use for the PS3 in my room that's 1080p. Most games only support 720p and thus look like shit on a 1080p native screen.[/QUOTE]
Depends on the quality of your monitor/TV's scaler.
[quote name='IAmTheCheapestGamer']OK. I'm glad we have that cleared up. I'm still glad I picked up a PS3, since it's a great system, but I've always been a worrywart with my game consoles.[/QUOTE]
And for good reason, given all the issues we've seen lately! That GTA4 crash thing would be spooking me (although maybe it's just a GTA4 thing?). Mine's fine so far...well, one crash in Oblivion.
[quote name='DestroVega']I see they have new 80 GB PS3's... so, now the SACD and BC is gone from the 80?
I kinda want the SACD feature, and had I known about it when I bought my PS3 last year, would have bought the 80 instead.[/QUOTE]
The new 80GB unit is really just the current 40GB unit with a larger hard drive and all the same limitations. It's different hardware from the current Metal Gear Solid 4 bundle 80GB units (and the older 80GB bundle).
[quote name='Hostile']Where's the source for the 80 GB that has BC being discontinued? I always thought the new MGS4 80 GB bundle was the new $500 model.[/QUOTE]
It is; the MGS4 80GB unit is fully featured. They're talking about the new $400 80GB unit coming out in a few months, which isn't really new at all: it's just a 40GB unit with a larger hard drive and all the same limitations as the current 40GB unit.
[quote name='aeauvian']Another question to add. How long will they be manufacturing the current 80 gig MGS4 bundle? That's the one it seems like I'd like the most, and 500 isn't bad (paid the same for my 360 Elite + Halo). I'd rather not get the crippled one coming out in September, though. Does anyone know if they will keep the MGS bundle and replace the current 80 gig with the new one? Or will they phase the bundle out altogether? They seem to be OOS everywhere, so hopefully I can get one before they do get phased out.[/QUOTE]
I'm with Happy, I wouldn't count on it being around long. I'd like to be wrong, but so far Sony seems very uninterested in giving us a PS3 model we want (the 20GB unit was hard to find, the 80GB unit wasn't on the market long, we went over 6 months with only the 'tard pack 40GB unit, etc.)
[quote name='TURBO']A Ferrari is a car in a sense. But you don't call it that. You call it a supercar. eDRAM is not normal RAM, just like a Ferrari is not a normal car.[/quote]
Literally all that "e" stands for is "embedded". It is RAM, and that's what you call it; words have meaning...
I still have no idea why you're talking about what that 10MB is called...
When you talk about RAM you're talking about main memory, not embedded.
No, you specify what you're talking about. You'd say main system RAM, or video cache/RAM/whatever in the case of that 10MB.
You seem to be under the impression the 360's main RAM is slower than the PS3's; you'd be wrong.
Not sure how you got that impression, since I said no such thing. The PS3 does have roughly double the total bandwidth, though, by virtue of using a more conventional split main RAM/video RAM setup.
PS3 (not even taking into account the latency of XDR) reads XDR at 15.5GB/s & writes to it at 10.6GB/s.
That's not what the specs say...
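For reference, here's the rough arithmetic behind the "roughly double" claim, using the commonly published peak bandwidth specs (assumed figures; sustained real-world rates, like the read/write numbers quoted above, are lower):

```python
# Back-of-the-envelope using published peak bandwidth specs
# (assumed figures; sustained rates are lower than these peaks):
XDR_GBPS = 25.6        # PS3 main RAM: Cell <-> XDR
GDDR3_PS3_GBPS = 22.4  # PS3 video RAM: RSX <-> GDDR3
GDDR3_360_GBPS = 22.4  # 360 unified RAM: CPU/GPU <-> GDDR3 (eDRAM not counted)

ps3_total = XDR_GBPS + GDDR3_PS3_GBPS
print(f"PS3 combined peak: {ps3_total:.1f} GB/s")   # 48.0
print(f"360 unified peak:  {GDDR3_360_GBPS:.1f} GB/s")
print(f"ratio: {ps3_total / GDDR3_360_GBPS:.2f}x")  # 2.14x, i.e. 'roughly double'
```

The catch, of course, is that the 360's eDRAM bandwidth sits outside this comparison, which is what the rest of the argument is about.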
Even if the eDRAM didn't exist (which it does), the 360's GPU would still have a significant memory advantage.
Only if all the graphics work can fit into that 10MB. If not, then the less of the graphics work that fits into that 10MB, the less effective bandwidth it has.
Also quit saying PS3 & PC. PC has nothing to do w/ PS3.
For what I'm talking about the comparison is completely valid.
There is no rapid drop off in performance.
How can you possibly argue there's no drop off? If it's not fitting into that 10MB, it's coming out of main system RAM. You seem to be arguing both ways: earlier you said the 10MB RAM's bandwidth gives it an advantage, but now you're claiming there's no dropoff in performance once that 10MB is full, which implies you think its bandwidth doesn't matter at all.
...This is the main reason 360 almost always has at least double the AA in cross platform games
I assume the 360 would use AA because it's got that hardware sitting there that can't do anything else. It's hardwired, and doesn't "cost" much to use. So that's an advantage if you care about 2-4x AA. (Personally I don't as I never use AA in PC games, and have never thought it's particularly useful in any HD resolution, but for some reason lots of people love it...)
... & of course the whole framebuffer isn't going to fit into 10MB; it's not supposed to. Honestly, why do you think there's GDDR3 connected to the GPU if they weren't going to use it? You think it does everything in 10MB eDRAM & then just spits it out to your TV? unpossible. It uses the eDRAM to do 10MB at a time faster than could be done w/ the GDDR3.
This is exactly my point. And the smaller the percentage of the frame that can be done in that 10MB, the less advantage it gives, and the bigger the drop off in performance. A more typical PS3/PC design doesn't have that. I'm not saying it's a bad design at all; it's just less suited to progressively higher resolutions.
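To put numbers on why the framebuffer doesn't fit: here's a sketch of the 720p math, assuming 4 bytes of color plus 4 bytes of depth/stencil per sample (the exact formats vary by game, so treat these as illustrative):

```python
import math

EDRAM_MB = 10.0
W, H = 1280, 720       # 720p
BYTES_PER_SAMPLE = 8   # assumed: 4 bytes color + 4 bytes depth/stencil

for msaa in (1, 2, 4):
    size_mb = W * H * BYTES_PER_SAMPLE * msaa / (1024 * 1024)
    tiles = math.ceil(size_mb / EDRAM_MB)
    print(f"{msaa}x AA: {size_mb:5.1f} MB -> {tiles} tile(s)")
# 1x AA:   7.0 MB -> 1 tile(s)
# 2x AA:  14.1 MB -> 2 tile(s)
# 4x AA:  28.1 MB -> 3 tile(s)
```

So with no AA a 720p frame just fits in the eDRAM, but with AA the GPU has to render the frame in tiles, which is where the "less of the work fits" overhead comes from.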
It also effectively doubles bandwidth because calculations which need to be sent twice (like Halo 3's 2-pass HDR) only have to go through the main memory pipeline once instead of twice.
Which is all fine, but it doesn't change the fact that you lose more and more of its advantages as you have to store more and more of the frame in main RAM.
Regarding texture compression, you just said 3Dc was supported THROUGH DRIVERS for the 7800 series... which is exactly my point: it's NOT actually supported in hardware. They didn't "remove" anything. There's no reason to convert 3Dc to Nvidia's format on a dedicated piece of hardware like the PS3; developers are just going to put textures in Nvidia's native format to begin with.
I have no idea whether Nvidia's compression of the time was as good as ATi's or not, but just because they didn't give it a fancy brand name and/or didn't expose it as a new format doesn't mean it wasn't competitive. At any rate, I'm just saying they're not sitting there with 10 year old hardware. It works well enough from what I've seen.
[quote]You think devs are going to write their games for 2 different compression methods? no, 3Dc is the best out so that's what they use & that's why it's on nVID cards.[/quote]
Something getting supported doesn't necessarily mean it was the best.
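For what it's worth, "best" here is mostly about quality, not size. Using the standard block-compression footprints (8 bytes per 4x4 block for DXT1, 16 bytes per 4x4 block for DXT5 and for 3Dc), the memory cost works out the same either way; this is just a sizing sketch, not a claim about what any particular game shipped:

```python
def tex_mb(w, h, bytes_per_block, block=4):
    """Size of a block-compressed texture in MB (no mipmaps)."""
    return (w // block) * (h // block) * bytes_per_block / (1024 * 1024)

W = H = 1024
print(f"raw RGBA8: {W * H * 4 / (1024 * 1024):.2f} MB")  # 4.00
print(f"DXT1:      {tex_mb(W, H, 8):.2f} MB")            # 0.50
print(f"DXT5:      {tex_mb(W, H, 16):.2f} MB")           # 1.00
print(f"3Dc:       {tex_mb(W, H, 16):.2f} MB")           # 1.00
```

Same 1MB footprint for a normal map whether it's stored as 3Dc or crammed into DXT5 channels; the difference is how well each format preserves the two-channel data.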
Regarding interlacing: if it was so great, again, we wouldn't have moved away from it. If you have a perfect deinterlacer it's not as big a deal, but on a real-world TV/monitor it's not going to look quite as good as the progressive scan equivalent.
I wasn't making it up when I said that the 360 has the better GPU & the PS3 has the better CPU. It's reality. Accept it.
Why is that "reality"? How is it overall better? Why can't it actually produce better results if it's better?