I can't remember getting triple beeps though I may have...
I think it usually writes data to the drive for a few seconds and then shuts down; the whole process can take a little bit, it's not instantaneous for me.
I don't eject the game. You should if you're going to move the system, I guess, but otherwise I've never removed games when turning off the system; you shouldn't need to if the drive is remotely normal.
[quote name='TURBO']calling eDRAM ram is like calling a Ferrari transportation.[/quote]
No, it's like calling a car a car. It's RAM. I'm making no judgment call on what type of RAM it is, how fast it is, or anything else. I'm calling it by the words that describe it. The "e" just stands for embedded, for that matter.
[quote]like i said only the back buffer & z buffer need to fit in it & the whole buffer doesnt have to fit in it at the same time because it can tile[/quote]
You're not understanding what that means. Yes, it can tile. I know that. THAT IS IN FACT MY POINT. The higher the resolution it's trying to render at, the less of the frame fits into that 10MB of RAM, and the more it has to write to main system RAM. Main system RAM is VASTLY slower than that 10MB. The more you have to rely on main system RAM, the slower it gets; the system's effective memory bandwidth drops off more and more the higher the resolution it's rendering at.
In other words, if you were doing 480p and could fit an entire frame in that 10MB, then the effective video bandwidth is whatever that 10MB delivers, and the GPU isn't having to eat up the bandwidth the rest of the system is using. But each time you increase the resolution, a smaller and smaller percentage of the frame fits into that 10MB, so it gives less and less of a benefit; the video system's effective memory bandwidth falls.
This is in ADDITION to the image being harder to render simply because there are more pixels being rendered. There is *NOT* this same kind of rapid drop-off in performance on a more normal PC/PS3 setup, where you have the same effective memory bandwidth regardless of the resolution being targeted because you have a large chunk of video RAM. With a more normal design, performance is going to fall off more or less linearly. The 360 is going to just plummet as soon as it has to start tiling and throwing part of the image into main system RAM.
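To make the drop-off concrete, here's a toy model of my own (a simplification, nothing official, and the GB/s figures are the commonly quoted spec-sheet numbers, not measurements): treat effective video bandwidth as a weighted average of eDRAM speed and main-RAM speed, weighted by how much of the working frame fits in the 10MB.

```python
# Toy model: effective bandwidth = weighted average of eDRAM and main RAM,
# weighted by the fraction of the frame resident in the 10MB eDRAM.
# Bandwidth figures are assumptions from commonly quoted specs.
EDRAM_BW = 256.0   # GB/s, oft-quoted internal eDRAM figure
MAIN_BW = 22.4     # GB/s, shared GDDR3 bus
EDRAM_MB = 10.0

def frame_mb(w, h, bytes_per_pixel=8):
    # 32-bit color + 32-bit Z per pixel, no AA (MSAA multiplies this further)
    return w * h * bytes_per_pixel / 2**20

def effective_bw(w, h):
    f = min(1.0, EDRAM_MB / frame_mb(w, h))   # fraction fitting in eDRAM
    return f * EDRAM_BW + (1 - f) * MAIN_BW

for w, h in [(640, 480), (1280, 720), (1920, 1080)]:
    print(f"{w}x{h}: ~{effective_bw(w, h):.0f} GB/s effective")
```

Under these assumptions 480p sits entirely in eDRAM at full speed, while 1080p spills out and the blended bandwidth falls hard toward the main-RAM figure, which is the plummet I'm talking about.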
[quote](being eDRAM & having that level of speed/lack of latency).[/quote]
First of all, "eDRAM" just means it's embedded; it doesn't mean it's magical or anything. In the 360's case, its video RAM is much faster than main system RAM. What you're ignoring is that the 360 LOSES that speed advantage more and more as less and less of a frame fits into that 10MB.
[quote]on the PS3 if it needs more than 256MB it has to use XDR and doing that incurs a much larger performance hit than anything possible on 360.[/quote]
Why? That's absolutely not the case. The PS3's GPU can freely use main system RAM too. It's a hassle in that, AFAIK, a programmer has to worry about where to store textures, etc. if they need more than 256MB (unless they write a system that manages that for them), but it has the advantage of giving the PlayStation 3 a bit over double the total system bandwidth the 360 has... excluding its 10MB video RAM. And again, that's an issue at higher resolutions, because the higher the resolution, the lower the 360's effective memory bandwidth, whereas the PS3's total bandwidth is always around 50GB/s regardless of the resolution it's targeting (exactly like a normal PC setup). Do *NOT* say "oh, but you can't ignore the 10MB"... I know that. What you keep ignoring is that the 10MB is less and less effective the higher the resolution.
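Rough numbers, using the commonly cited spec-sheet figures (treat them as ballpark assumptions, not gospel):

```python
# Commonly cited bandwidth figures (GB/s); ballpark assumptions, not measurements.
ps3_xdr = 25.6       # Cell <-> 256MB XDR main memory
ps3_gddr3 = 22.4     # RSX <-> 256MB GDDR3 video memory
x360_unified = 22.4  # 360 CPU + GPU sharing one bus to 512MB GDDR3

# PS3 total is constant regardless of target resolution
ps3_total = ps3_xdr + ps3_gddr3
print(f"PS3 ~{ps3_total} GB/s vs 360 ~{x360_unified} GB/s "
      f"({ps3_total / x360_unified:.2f}x)")
```

That's where "a bit over double" comes from: the two PS3 pools add to around 48GB/s no matter what resolution you target, against the 360's single shared bus (again, before you even get into the 10MB eDRAM question).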
[quote]put that 10MB in perspective, it was enough to render 2 640p back buffers per frame for halo 3; high & low pass HDR.[/quote]
Even if that's true, you'll note that's not even 720p, and my entire point is that the effective bandwidth drops off the higher the resolution.
I can't find the articles I've read that show the exact size you need for a given resolution, and I don't trust my own calculations, but with what Microsoft requires, HD resolutions don't fit entirely in that 10MB:
http://www.beyond3d.com/content/articles/4/5
...if I could find the exact figures again you might get a better feel for this, as you can flat-out see the percentage of the image that fits in that 10MB shrinking as the resolution goes up.
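In the meantime, some quick arithmetic of my own (so take the layout as an assumption: a conventional 32-bit color + 32-bit Z buffer, i.e. 8 bytes per sample, with Microsoft's pushed-for 4x MSAA multiplying the per-pixel sample storage by four):

```python
# Back buffer + Z footprint vs the 10MB eDRAM, assuming 32-bit color +
# 32-bit depth per sample (8 bytes). MSAA stores every sample in eDRAM,
# so 4xAA quadruples the footprint.
EDRAM_MB = 10.0
for w, h in [(640, 480), (1280, 720), (1920, 1080)]:
    for aa in (1, 4):
        mb = w * h * 8 * aa / 2**20
        pct = min(100.0, 100.0 * EDRAM_MB / mb)
        print(f"{w}x{h} @ {aa}xAA: {mb:6.2f} MB -> {pct:3.0f}% in eDRAM")
```

Under those assumptions 480p fits with room to spare, bare 720p just squeaks in, and 720p with the required AA (or 1080p at all) has to tile, with a shrinking slice of the image actually living in the fast 10MB.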
[quote]i said PPU not SPUs. & the RSX is not the same as the 7x00 series. it's just evolved from the same prototype. like i said no 3Dc even though the 7800 had it. don't believe me, register at B3D or gamedev.net & ask the guys w/ the dev kits.[/quote]
As I said, from a quick Google, the 7800 series did NOT support 3Dc; it just converted that format to the one it used. And again, you're missing the point that 3Dc is just an ATi thing. It's no shock that Nvidia supports its own compression technology and not ATi's branded technology. That doesn't in and of itself mean it's any worse. It could be better for all we know, but regardless, it's not limited to 10-year-old technology like you're claiming.
[quote]...S3TC was invented by S3, but RSX uses that; why arent they using this nVID invented compression tech? probably because it doesn't exist.[/quote]
I have no idea why you're insisting Nvidia has done nothing with compression technology in the last 10 years.
[quote]if it did nVID would be using it in their own graphics cards (instead of 3Dc).[/quote]
They are. Google is your friend. Do you seriously think they were just sitting there while ATi was implementing updated texture compression? I don't remember what they called it, or even if they gave it some cutesy brand name, but for that matter, that's probably what their drivers were converting ATi's texture compression into.
[quote]it's not a conversion step like analog to digital. there is no data loss. it's digital to digital. 1080i or 1080p they both wait for all the lines to be delivered, assemble them into a frame, then display the frame. the only difference is the i signal is assembled in a different order.[/quote]
Even if you had a deinterlacer that could do it perfectly, you've still got the problem that you end up with half the frame rate in that case. But deinterlacers are NOT flawless.
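The frame-rate half is easy to see with a toy "weave" deinterlacer sketch (pure illustration; the field/line layout here is just the standard even/odd convention, assumed):

```python
# Toy "weave" deinterlacer: pair each even field with the following odd
# field to build one full frame. 60 fields/sec in -> 30 frames/sec out.
# The two fields in a pair were captured at different instants, which is
# exactly where real deinterlacers show combing artifacts on motion.
def weave(fields):
    frames = []
    for even, odd in zip(fields[0::2], fields[1::2]):
        frame = [None] * (len(even) + len(odd))
        frame[0::2] = even   # even field -> lines 0, 2, 4, ...
        frame[1::2] = odd    # odd field  -> lines 1, 3, 5, ...
        frames.append(frame)
    return frames

# 60 half-height (540-line) fields, as in a 1080i signal
fields = [[f"field{i}-line{j}" for j in range(540)] for i in range(60)]
frames = weave(fields)
print(len(fields), "fields ->", len(frames), "full frames")
```

So even the "perfect" digital-to-digital reassembly he's describing turns 60 fields per second into 30 full frames per second, before you get to any of the artifacts real deinterlacers introduce.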