PS3 FAQs: PSN, Gaming, Hardware, Accessories, General Discussions, etc.

[quote name='Thomas96']I started to make a new thread but oh well I'll put it here -

is this video of the PS3's new break-apart controller? What makes me think so is the slip of the tongue during the video when he almost refers to the motion controller as a "Dual ...." something. Him saying "Dual" leads me to believe that perhaps the peripheral is Sony-based... oh well, just pure speculation on my part.

http://www.youtube.com/watch?v=-kjBASqJzzY[/quote]
http://kotaku.com/5024175/is-this-xbox-360s-motion-controller-in-action
it's actually this controller, a third party one that apparently is for either console.
 
I'm so beyond fucking frustrated with the network settings on the PS3 that it's unbelievable.

I'm using a wired connection. I have my PS3 and 360 sitting next to each other and I literally pop the ethernet cable out of one and plug it into the other and go about my business. However, since the 2.4 update, I haven't been able to connect to PSN at all. I had to manually update to 2.41 via thumbdrive. When I check my IP settings on the PS3, it gives off some random IP address and says (private networks only). I don't know what the fuck is going on, and every solution from google searching has led me to believe that power cycling the PS3, modem and/or router will be the magical solution to all my problems, but it didn't work.
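
For reference, that "(private networks only)" status usually just reflects what kind of address the console ended up with. A rough sketch of how you'd read the address it shows, assuming the usual meanings of those ranges (the sample addresses below are made up):
[code]
import ipaddress

def classify(addr: str) -> str:
    """Rough guide to what an IP shown in the console's network status likely means."""
    ip = ipaddress.ip_address(addr)
    if ip.is_link_local:       # 169.254.x.x: self-assigned, DHCP never answered
        return "link-local (DHCP failed - check the cable/router/modem power-up order)"
    if ip.is_private:          # e.g. 192.168.x.x or 10.x.x.x: the router handed out a lease
        return "private LAN address (DHCP worked; look at DNS/gateway settings instead)"
    return "public address (the modem is handing the console a WAN IP directly)"

# hypothetical addresses, just to show the three cases:
for example in ("169.254.73.12", "192.168.1.42", "8.8.8.8"):
    print(example, "->", classify(example))
[/code]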
 
I've got a couple questions about my (fairly new) 80 GB Metal Gear PS3.

Is the disc supposed to spin into and out of the drive? I read online that some consoles insert and eject the disc without spinning it. Is there a reason for the difference?

Also, is it normal for my PS3 to make small beeping sounds while it is on? Maybe "beeping" isn't the right word; I saw a video online where a console was making loud beeps (and that's not what mine does). It's not a click...but I don't know how to describe it. I only really notice it if I'm on the dash, and it's very quiet in my room. Somebody tell me they know what I'm talking about!
 
[quote name='Rig']I've got a couple questions about my (fairly new) 80 GB Metal Gear PS3.

Is the disc supposed to spin into and out of the drive? I read online that some consoles insert and eject the disc without spinning them? Is there a reason for the difference?

Also, is it normal for my PS3 to make small beeping sounds while it is on? Maybe "beeping" isn't the right word; I saw a video online where a console was making loud beeps (and that's not what mine does). It's not a click...but I don't know how to describe it. I only really notice it if I'm on the dash, and it's very quiet in my room. Somebody tell me they know what I'm talking about![/QUOTE]
spinning when taking in & ejecting a BD, yes, is normal.

beeping, i have no idea what you're talking about.
 
So I was looking at the back of some newer releases and noticed some of them only show compatibility up to 720p. The PS3 doesn't upscale (or whatever the proper technical term is) to 1080p? I have a monitor that is natively 1080p, so 720p doesn't look too great.

And another question referring specifically to GT5 Prologue (can't find a specific thread for it). How do you set the correct date in the home screen? I always get an error at the start of the game that the date/time is set incorrectly, but I can't find an option to correct it anywhere.

Thanks.
 
If the game is not 1080p, it's going to be taking a 720p image and your television will be upscaling it to 1080p (its native resolution). If you, say, set your PS3 to output only 1080p, then the PS3 will be upscaling it to 1080p and then displaying that image.

Most likely your television scales better than the PS3, so I would recommend showing everything at its native resolution and then have your television take care of it.
 
[quote name='SteveMcQ']So I was looking at the back of some newer releases and noticed some of them only show compatibility up to 720p. The PS3 doesn't upscale (or whatever the proper technical term is) to 1080p? I have a monitor that is natively 1080p, so 720p doesn't look too great.

And another question referring specifically to GT5 Prologue (can't find a specific thread for it). How do you set the correct date in the home screen? I always get an error at the start of the game that the date/time is set incorrectly, but I can't find an option to correct it anywhere.

Thanks.[/QUOTE]

PS3 does not have a scaler. It has a horizontal stretcher which is only bilinear. very rudimentary. some games which are rendered lower will upscale using the cell, but that usually incurs a performance hit, so it's only really recommended if you have a cheap HDTV (crappy internal scaler) or an HDTV monitor (no scaler, just an HDCP compliant input).
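
To picture what a bilinear horizontal stretch actually does, here's a toy sketch of linear resampling along one scanline (this is just an illustration of the idea, not Sony's actual hardware path):
[code]
def stretch_row(row, new_width):
    """Linearly resample one scanline to a new width -- a toy model of a bilinear
    horizontal stretch (a real scaler also filters vertically and per colour channel)."""
    old_width = len(row)
    out = []
    for x in range(new_width):
        src = x * (old_width - 1) / (new_width - 1)   # map the output pixel back into the source
        left = int(src)
        right = min(left + 1, old_width - 1)
        frac = src - left
        out.append(row[left] * (1 - frac) + row[right] * frac)
    return out

# e.g. stretching a short 4-sample line out to 7 samples:
print(stretch_row([0, 64, 128, 255], 7))
[/code]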

[quote name='yukine']If the game is not 1080p, it's going to be taking a 720p image and your television will be upscaling it to 1080p (its native resolution). If you say, set your PS3 to output to only 1080p then the PS3 will be upscaling it to 1080p and then displaying that image.

Most likely your television scales better than the PS3, so I would recommend showing everything at its native resolution and then have your television take care of it.[/QUOTE]

true. unfortunately a lot of TVs don't scale correctly. many will scale it, but end up cutting off a border of the picture (which should be seen) w/ overscan.

that's part of the reason so many people are angry w/ Sony for touting 1080p so much; harping on "True HD" incessantly & then not even including a hardware scaler when 99% of the games are 720p or below.
 
[quote name='Scorch']I'm using a wired connection. I have my PS3 and 360 sitting next to each other and I literally pop the ethernet cable out of one and plug it in to the other and go about my business. However, since the 2.4 update, I haven't been able to connect to the PSN at all. I had to manually update to 2.41 via thumbdrive. When I check my IP settings on the PS3, it gives off some random IP address and says (private networks only). I don't know what the fuck is going on and every solution from google searching has lead me to believe that power cycling the PS3, modem and/or router will be the magical solution to all my problems, but it didn't work.[/QUOTE]

Could you go back through network setup, maybe, or reset the console's settings? I don't know, mine worked just fine after 2.4 and 2.41, so hopefully nothing went wrong. Could even be that the port broke, or something unconnected with the update? I've got my systems and PC connected to a switch.

[quote name='TURBO']spinning when taking & ejecting BD yes is normal.

beeping, i have no idea what youre talking about.[/QUOTE]

That is weird that they'd spin while ejecting. Mine doesn't and I would have been worried about that too! I don't even see how mine COULD spin then.

I can't recall any beeping, although actually most of the disc based systems I've ever owned beep when they first spin up a disc. No idea why. The PS3 and 360 are about the only two I can remember NOT doing that.

[quote name='SteveMcQ']So I was looking at the back of some newer releases and noticed some of them only show compatibility up to 720p. The PS3 doesn't upscale (or whatever the proper technical term is) to 1080p? I have a monitor that is natively 1080p, so 720p doesn't look too great.
[/QUOTE]

Just FYI, the 360 renders at 720p (or below); it's got a scaler chip in it similar to the one in your TV, so setting it to 1080p just has that chip scale the image rather than the TV. As mentioned, if your TV's scaler is any good it shouldn't really make a difference.
 
[quote name='Wolfpup']
That is weird that they'd spin while ejecting. Mine doesn't and I would have been worried about that too! I don't even see how mine COULD spin then.[/QUOTE]

My brother's 60GB PS3 just accepts/ejects discs straight in/out. But with my 80GB PS3, the disc spins whenever it's inserted/ejected. So maybe it's just because of a new disc drive being used or something.

Anyway, I've got a question and I'm not sure if it's taboo to ask about something like this but... I was wondering, is there a way to rip a DVD and play it on the PS3? I'm not talking about converting into a video file, I'm talking about an actual DVD image. And to clarify, these are legit DVDs that I own! I'm not talking about bootleg/downloaded movies.
 
Again, beeping isn't the right word for it...but I don't know of a better way to describe it.

I only know one other person with a PS3, so I plan on going over and listening to his. I have a feeling it's a simple console noise (natural), but I want to be sure.
 
[quote name='TURBO']PS3 does not have a scaler. It has a horizontal strecher which is only bilinear. very rudimentary. some games which are rendered lower will upscale using the cell. but that usually incurs a performance hit, so it's only really recommended if you have a cheap HDTV (crappy internal scaler) or a HDTV monitor (no scaler just HDCP compliant input).



true. unfortunately a lot of TVs dont scale correctly. many will scale it simply but end up w/ cutting off a border of the picture (which should be seen) w/ overscan.

that's part of the reason so many people are angry w/ Sony for touting 1080p so much; harping on "True HD" incessantly & then not even including a hardware scaler when 99% of the games are 720p or below.[/quote]

Yeah, I have the 360 and PS3 set to 720p since that is closest to the native resolution of my television. I know some people set their consoles to 1080i, but that seems like way too much processing taking place, as then your television would have to downscale it to 768p as well as deinterlace the image.

Interesting that the PS3 can't scale, so when you set it to 1080p or something and a game doesn't support it... it's just outputting 720p and having your television take care of it anyway? Is setting the video output just letting the PS3 know what your television can handle?
 
[quote name='Vinny']My brother's 60GB PS3 just accepts/ejects discs straight in/out. But with my 80GB PS3, the disc spins whenever it's inserted/ejected. So maybe it's just because of a new disc drive being used or something.[/quote]

Yeah, maybe they use different drive models?

I've got a new MGS4 80GB bundle by the way. (No spinning.)

[quote name='yukine']Yeah, I have 360 and PS3 set to 720p since that is closest to my native resolution of my television. I know some people set their consoles to 1080i but that seems like way too much processing taking place as then your television would have to downscale it to 768p as well as deinterlace the image.[/quote]

Yeah, it looks worse. Unfortunately both the 360 and PS3 default to 1080i on a 720p set through HDMI (since that's the "highest" resolution the set is reporting it supports).
 
I don't know if it's been mentioned before, but mkv2vob works really well. I converted some Gundam 00 videos and they look fantastic. It takes like 30 minutes an episode, but that is because it has to transcode and I'm doing it on the highest quality and they are 720p videos. If it doesn't have to transcode it takes a few minutes.
 
[quote name='yukine']Yeah, I have 360 and PS3 set to 720p since that is closest to my native resolution of my television. I know some people set their consoles to 1080i but that seems like way too much processing taking place as then your television would have to downscale it to 768p as well as deinterlace the image.

Interesting the PS3 can't scale, so when you set it to 1080p or something and a game doesn't it support it... it's just outputting 720p and having your television take care of it anyway? Is setting the video output just letting the PS3 know what your television can handle?[/QUOTE]
on the 360 if you set it to 720p it will display at 720p for all games.
the PS3 will render 720p for all games too by default.

however, the problem for the PS3 arises for someone who has a 1080p or 1080i set (i & p are exactly the same btw; the only difference is the order in which the signal is sent to the TV), because it doesn't have a real scaler. if you have all the resolutions selected in your options it will output 720p, & if your TV doesn't accept that it will default to 480p. for most games that support cell upscaling, you have to manually uncheck everything but 1080 to force the upscale using the cell, & then go check them again if you're playing a game which doesn't support it, or else end up not being able to display it.
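
A rough model of the output-selection behaviour described above, written out as plain logic (this is just what the post describes, not Sony's actual firmware code):
[code]
def pick_output(checked, tv_accepts, game_cell_upscales=False):
    """Toy model of the behaviour described above: with everything checked the PS3
    favours 720p, falls back to 480p if the TV rejects that, and only forces the
    cell upscale to 1080 when 1080 is the only box left checked."""
    if checked in ({"1080p"}, {"1080i"}):
        # per the post: the game either supports the cell upscale, or you
        # "end up not being able to display it"
        return next(iter(checked)) if game_cell_upscales else None
    if "720p" in checked and "720p" in tv_accepts:
        return "720p"
    return "480p"

# 1080i-only TV with every resolution checked: ends up at 480p
print(pick_output({"480p", "720p", "1080i"}, tv_accepts={"480p", "1080i"}))
# only 1080i checked and the game supports the cell upscale: 1080i output
print(pick_output({"1080i"}, tv_accepts={"480p", "1080i"}, game_cell_upscales=True))
[/code]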

the simple fact is, despite what Sony's marketing team professed so adamantly in that year after the 360 launched, when they were pushing hard for ppl to wait for the PS3: HD started when the 360 launched, & the PS3 is not delivering on "Full HD" (1080) because it's not powerful enough, plain & simple. The GPU is too weak, & even w/ cell offloading of post processing there isn't much overhead left for other CPU based tasks to make games look good at that res as well as feel good (mainly AI, physics). the cell, being a stream processor, excels at scripted/static scenes but isn't as great at procedural. it could be if it had all 8 cores to work w/ (only 6 are actually working) & parallel processing delivered at 100%. But if you remember the times where they would say "cell isn't being used 100% & it probably never will", making it sound more like its power is unlimited; what they really meant was that Amdahl's Law restricts performance gains. It's something Stanford's Pervasive Parallelism Lab is working to remedy atm. You can watch a presentation by one of AMD's R&D fellows talking about it here.
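
For reference, Amdahl's Law is the limit being referred to: with a parallel fraction p of the work and n cores, the best possible speedup is 1 / ((1 - p) + p / n). A quick worked example with made-up numbers (the 80% / 95% parallel fractions are purely illustrative, not measurements from any real game):
[code]
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup from Amdahl's Law."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# hypothetical workload spread over the 6 SPEs a game can actually use:
print(round(amdahl_speedup(0.80, 6), 2))   # ~3.0x, nowhere near 6x
print(round(amdahl_speedup(0.95, 6), 2))   # ~4.8x even at 95% parallel
[/code]
Even a very parallel-friendly workload tops out well short of 6x once any serial work is left over, which is the point being made.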
 
I gotta say it's really quite disappointing how the hardware, and thus the software, doesn't natively support 1080p. It's really quite annoying having games that only output in 720p when you have a 1080p capable screen. And the argument that the general populace doesn't have 1080p compliant sets, or that the difference between 720p and 1080p is negligible depending on screen size, is garbage. Sony touts all these great technical feats, yet can't back it up with their software or hardware consistently. It's all just a ridiculous marketing ploy to get people to bite when they see another bullet point on the PS3's rap sheet.
 
Just so you guys know though, the 360 is even LESS capable of 1080p. Neither system has the horsepower to do it justice. The 360 just scales its 720p (or lower) output to 1080p using its internal scaler.
That could make 1080p a harder sell on the next gen consoles, since unfortunately most people will probably already think their system does 1080p (though the primary benefit of more power would be more detailed graphics/gameplay anyway, not a higher resolution).

Oh and 1080i/p are NOT the same, since obviously one's interlaced, and is sort of 1/2 the resolution, or half the frame rate, or however you want to think about it.
 
1080p is only needed if you want to sit closer to the screen, as I believe the resolutions all look the same- depending on the TV's placement and your distance from them. :)
 
[quote name='Wolfpup']Just so guys know though that the 360 is even LESS capable of 1080p. Neither system has the horsepower to do it justice. The 360 just scales it's 720p (or lower) output to 1080p using it's internal scaler.
That could make 1080p a harder sell on the next gen consoles since unfortunately most people will probably already think their system does 1080p (though the primary benefit of more power would be more detailed graphics/gameplay anyway, not a higher resolution).

Oh and 1080i/p are NOT the same, since obviously one's interlaced, and is sort of 1/2 the resolution, or half the frame rate, or however you want to think about it.[/QUOTE]
no & no. 360 has a stronger GPU, PS3 has a (potentially) stronger CPU.

look at GTA4 for instance, runs 720p on 360, 640 on PS3; w/ a higher FPS on 360 & w/ AA on 360. most cross platform games thus far have run at either a higher FPS or res w/ better AA & textures on 360. This is based on tests done by eurogamer's "face off" series & the tech guys @ Beyond3D/console technology (the FPS tests being led by grandmaster, the creator of digital foundry; the res by Quaz51, who became internet famous for discovering Halo 3's res). again PS3's problem is parallel programming efficiency coupled w/ the cell needing to compensate for the RSX's shortcomings. that and more memory restrictions (& lack of 3Dc). programming for the PS3 is tricky because to take advantage of stream processing you need events to constantly be triggering the next events, sort of like dominoes but never ending (since it lacks branch-prediction & multi-threading in the SPUs like traditional CPUs). again, part of Amdahl's Law & probably the largest reason why only the first party games w/ large budgets & help from sony's "team cell" have been able to really look good on PS3 so far.

the difference between i & p is just in how the signal is sent. it's only interlaced on CRT TVs. all LCD, LCoS, plasma & DLP sets are inherently progressive; only phosphor based TVs w/ electron guns can display interlaced. i & p when referring to a cable signal is not the same thing as i & p when referring to a TV's display. for instance, many of the early sony LCoS rear projection sets only accepted 1080i inputs; the display is physically incapable of displaying it as interlaced though (& it's 120Hz, capable of 60FPS). 1080p neither transmits more resolution nor a higher frame rate than i. the signal compression & amount of info sent between the 2 is identical. it's only the order in which the signal is sent which differs.
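
A tiny sketch of the "same data, different order" claim for content that starts out as full frames (it deliberately ignores material that is actually shot interlaced, where the two fields come from different moments in time, and says nothing about how well any given TV deinterlaces):
[code]
def split_fields(frame):
    """Split a progressive frame (a list of scanlines) into the odd/even fields
    a 1080i signal would carry."""
    return frame[0::2], frame[1::2]

def weave(top_field, bottom_field):
    """Reassemble the two fields back into a full frame."""
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.extend([top, bottom])
    return frame

frame = [f"line {n}" for n in range(1080)]      # stand-in for a rendered 1080p frame
assert weave(*split_fields(frame)) == frame     # nothing is lost for frame-sourced content
print("fields re-woven into the original frame")
[/code]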
 
[quote name='NamPaehc']1080p is only needed if you want to sit closer to the screen, as I believe the resolutions all look the same- depending on the TV's placement and your distance from them. :)[/QUOTE]
i have a 52" set i sit about 4 feet from when i :joystick:
 
[quote name='TURBO']no & no. 360 has a stronger GPU, PS3 has a (potentially) stronger CPU. [/quote]

What's your reasoning behind that? The PS3's GPU has roughly 50% more transistors too (as does the CPU), not counting the 360's cache/video RAM. It also would drop off in performance at a constant rate, not plummet like the 360 would as less and less of the frame is able to fit in its 10MB RAM. It has more execution hardware on paper. Only advantage I see with Xenos is it's a unified shader architecture, but its execution units are only split up into three groups (and since they're consoles, programmers will make sure execution hardware stays busy regardless).

[quote name='TURBO']look at GTA4 for instance, runs 720p on 360, 640 on PS3; w/ a higher FPS on 360 & w/ AA on 360. most cross platform games thus far have run at either a higher FPS or res w/ better AA & textures on 360.[/quote]

The 360 version runs
 
[quote name='TURBO']its not RAM it's eDRAM[/quote]

Which is RAM.

[quote name='TURBO']& it has nothing to do w/ FPS, it only affects how much can fit in the z-buffer & back buffer before tiling[/quote]

Which has everything to do with how fast it can render. Without that 10MB, the 360 has roughly half the memory bandwidth of the PS3. That fast 10MB RAM helps negate that advantage, but the higher the resolution it's running at, the less of the frame fits into that 10MB, and the more it has to rely on its main system RAM.

[quote name='TURBO']1st of all transistor count is a horrible way to gauge performance.[/quote]

Can be, but it's generally a decent guide, and it's reflected by the amount of actual execution hardware we see in both chips.

[quote name='TURBO']2ndly, xenos has 337M transistors RSX has 300M.[/quote]

No, it doesn't. It's in the low 2xx range, the rest being a physically separate chip that's mostly that 10MB.

The numbers you're giving aren't correct either-that's not 24 execution units on the RSX, that's 24 pathways, each of which has multiple execution units. And you can't directly compare those anyway.

I don't remember where I read about GTA4-I'd suspect one of the tech sites, but don't know. Could be wrong, or maybe I was thinking of the PS3 version (I'll probably never play the game so...)

[quote name='TURBO']the single PPU is actually about 50% weaker than one of the xenon's cores[/quote]

Why do you say that? It's virtually the same thing.

[quote name='TURBO']...if you tried to take a game written for 3 multi-threaded cores on the 360 & run it on the PPU it wouldn't work[/quote]

Well sure it would work, it would just work slower...which is exactly what we saw in early ports, and why I suspect little more than a recompile may have been done in a lot of cases.

[quote name='TURBO']also i said it doesn't have 3Dc, not that it doesn't have compression. it has S3TC, which is not the same thing & nowhere near as good; it struggles w/ normal maps while 3Dc excels. 3Dc also works w/ HDR & reflections. it's 10 year old compression technology vs 4 year old compression technology.[/quote]

A quick search shows the 7800 architecture the RSX is based on also supports more advanced texture compression. No one was sitting around using the same technology for 10 years, they just call it different things. I have no idea which is "better", but I'd suspect they're similar.

[quote name='TURBO']& MGS4 is 1024x768 (anamorphic), far from 1920x1080. the FMV is, but that's video, it has nothing to do w/ rendering. it looks good yes, mostly because of art direction, but it's a horrible example of sony being "more capable" of HD.[/quote]

It sounds like you misunderstood what I said. The point is the 360's performance drops off more quickly as resolution increases because of its more limited memory bandwidth, so hypothetically the PS3 is more capable at higher resolutions. It's still not going to be doing 1080p stuff, it just doesn't have the horsepower for it - if these systems used Geforce 8800GTX GPUs, they STILL should be doing 720p or below, as you'd want to spend that power on better visuals, not just higher resolutions.

Regarding the 1080i stuff, that's just not true. Visit an AV forum if you want to know more about how that works. Any time you're doing an extra conversion step you're losing info (but 1080i doesn't have as much info there to begin with).

[quote name='TURBO']it's ridiculous that you're trying to argue in favor of sony's obviously misleading marketing about HD starting when they say it does, & PS3 being the console of "Full HD" 1080p[/quote]

I didn't argue either of those things. Obviously the 360 was the first HD console.

[quote name='TURBO']But, let's face the facts; the PS3 is not a 1080p console (except for PSN arcade games).[/quote]

Never said it was. Neither console can realistically do 1080p, and I knew that long before either was launched (that's one of the reasons I went ahead and got a 720p set).
 
[quote name='Wolfpup']Which is RAM.



Which has everything to do with how fast it can render. Without that 10MB, the 360 has roughly half the memory bandwidth of the PS3. That fast 10MB RAM helps negate that advantage, but the higher resolution it's running at, the less of the frame fits in to that 10MB, and the more it has to rely on it's main system RAM.



Can be, but it's generally a decent guide, and it's reflected by the amount of actual execution hardware we see in both chips.



No, it doesn't. It's in the low 2xx range, the rest being a physically separate chip that's mostly that 10MB.

The numbers you're giving aren't correct either-that's not 24 execution units on the RSX, that's 24 pathways, each of which has multiple execution units. And you can't directly compare those anyway.

I don't remember where I read about GTA4-I'd suspect one of the tech sites, but don't know. Could be wrong, or maybe I was thinking of the PS3 version (I'll probably never play the game so...)



Why do you say that? It's virtually the same thing.



Well sure it would work, it would just work slower...which is exactly what we saw in early ports, and why I suspect little more than a recompile may have been done in a lot of cases.



A quick search shows the 7800 architecture RSX is based on also supports more advanced texture compression. No one was sitting around using the same technology for 10 years, they just call it different things. I have no idea why is "better", but I'd suspect they're similar.



It sounds like you misunderstood what I said. The point is the 360's performance drops off more quickly as resolution increases because of it's more limited memory bandwidth, so hypothetically the PS3 is more capable at higher resolutions. It's still not going to be doing 1080p stuff, it just doesn't have the horsepower for it-if these systems used Geforce 8800GTX GPUs, they STILL should be doing 720p or below as you'd want to spend that power on better visuals, not just higher resolutions.

Regarding the 1080i stuff, that's just not true. Visit an AV forum if you want to know more about how that works. Any time you're doing an extra conversion step you're losing info (but 1080i doesn't have as much info there to begin with).



I didn't argue either of those things. Obviously the 360 was the first HD console.



Never said it was. Neither console can realistically do 1080p, and I knew that long before either was launched (that's one of the reasons I went ahead and got a 720p set).[/QUOTE]

eDRAM is not normal RAM. it's embedded, which allows enormous bandwidth. you make it sound like the entire frame buffer has to fit in 10MB; it doesn't. & if it didn't exist (which is a ridiculous argument, since it does exist) the 360 would not have half the memory bandwidth of the PS3, because when the PS3 uses XDR it incurs a major latency hit across all the RAM.

transistor count is generally a bad guide, that was my point. i say the PPU's weaker because it is. it lacks any instruction window, only has a 512K L2 cache & has a shallower branch prediction unit. it would not work; if it could not carry out the computations in the correct amount of time the game wouldn't work. things would have to be removed or changed.

a quick search shows what (btw RSX is NOT a 7800, it's a castrated 7800/7600 hybrid w/ redundant areas removed for yield)? what more advanced compression? RSX supports S3TC like i said; what is this more advanced one you're talking about? what's it called? it's not 3Dc, despite the 7800 being the 1st nVID card to have it, the RSX does not; that's documented fact.

don't lecture me on AV, i've had home theaters since the days of laserdisc. there isn't "less info" and there isn't a loss in IQ when deinterlaced. like i said, that only applies to 1080i sourced/shot video material, not a 1080p frame buffer sent as 1080i to a 1080p display.

honestly, i believe you have no idea what you're talking about. i strongly suspect you just don't like that i pointed out sony's hypocrisy & the PS3's weak points. this is all very off topic anyway. i get the feeling you're never going to acknowledge any facts no matter how valid they are. so unless you can back up your argument w/ real technical reasons (which i'm about 100% sure you can't, since a large number of PS3 developers at Beyond3D & GameDev.net, some of which used to be 360 devs, like Joker454, have differing opinions from yours) i suggest an end to this 'debate'.
 
[quote name='metaly']Are these the same guys from a couple months ago? How does this keep happening?[/QUOTE]
this is the first time i've made such an argument. so i doubt it. i just dont like being called a liar; especially when i know the other person has no idea what they're talking about.
 
You guys used a lot of words so I'll just go ahead and agree with both of you.

So anyone here have GT5 Prologue? I don't wanna create a new thread for one question. How do you fix the in-game clock/calendar? It always tells me it's set wrong.
 
[quote name='SteveMcQ']You guys used a lot of words so I'll just go ahead and agree with both of you.

So anyone here have GT5 Prologue? I don't wanna create a new thread for one question. How do you fix the in-game clock/calendar? It always tells me it's set wrong.[/QUOTE]
ditto, i have no idea; i looked in options and didn't see one for it. & i know the time isn't wrong, i used an atomic clock to set my time in the XMB.
 
[quote name='TURBO']eDRAM is not normal RAM. it's embedded which allows enormous bandwidth.[/quote]

Obviously I know it's embedded RAM. Embedded RAM is...RAM. Are you claiming it's not RAM? I'm not sure what or why you're arguing this...(?)

[quote name='TURBO']you make it sound like the entire frame buffer has to fit in 10MB, it doesn't.[/quote]

No, in fact that's exactly the opposite of what I said. To quote myself:

...but the higher resolution it's running at, the less of the frame fits into that 10MB, and the more it has to rely on its main system RAM...

I'm clearly saying the entire frame doesn't fit in that RAM, which is the entire point of my argument.

[quote name='TURBO']if it didn't exist (which is a ridiculous argument, since it does exist)[/quote]

Between this and the first comment you're missing the point of what I'm saying. I'm not sure how else to phrase it-the less of the frame that can be stored in that 10MB, the more the GPU has to write to main system RAM. In other words, the higher the resolution of the image, the less that fits in that 10MB, and the less effective memory bandwidth the system has. It doesn't drop off linearly like in the Playstation 3 or PC parts.

[quote name='TURBO']the 360 would not have half the memory bandwidth of the PS3 because when the PS3 uses XDR it incurs a major latency hit across all the RAM.[/quote]

First, even if that XDR has bad latency, that has nothing to do with its bandwidth, which is what it is. Second, supposedly XDR's latency is fine. I have no idea if it is, and don't particularly care. I hate Rambus, and I'm assuming Sony using them AGAIN has something to do with some licensing agreement rather than any technical issue. But that has nothing to do with anything...

[quote name='TURBO']transistor count is generally a bad guide, that was my point. i say the PPU's weaker because it is. it lacks any instruction window, only has a 512K L2 cache & has a shallower branch prediction unit. it would not work; if it could not carry out the computations in the correct amount of time the game wouldn't work. things would have to be removed or changed. a quick search shows what (btw RSX is NOT a 7800, it's a castrated 7800/7600 hybrid w/ redundant areas removed for yield)? what more advanced compression?[/quote]

I've read every technical document I could get my hands on back in the day, and none of them agree with what you're saying. Unless you just mean that it has fewer ROPs (which is apparently the case, but it's still the same as the 360). (And yes, I know the SPUs aren't as advanced, and never said otherwise, so I have no idea why you were talking about them...)

[quote name='TURBO']RSX supports S3TC like i said, what is this more advanced one you're talking about? what's it called? it's not 3Dc, despite the 7800 being the 1st nVID card to have it, the RSX does not; that's documented fact.[/quote]

Google it yourself. Took about three seconds to find info on it. You're basically arguing that it's worse because it doesn't have ATi's brand name for compression technology and is using Nvidia's instead. I could pull up all kinds of silly marketing terms Nvidia used too to claim Xenos is worse since it doesn't have them. ATi's of that time might be better than Nvidia's, or it might be worse, but it's ludicrous to argue that its compression is the same as a 10 year old part, or worse than an ATi part, because it's not using ATi's brand name.

[quote name='TURBO']don't lecture me on AV, i've had home theaters since the days of laserdisc. there isn't "less info" and there isn't a loss in IQ when deinterlaced.[/quote]

Any time you have to go through a conversion step like that you're losing data. At best it's going to look about the same to the naked eye.
 
[quote name='Wolfpup']...[/QUOTE]

calling eDRAM ram is like calling a Ferrari transportation. like i said, only the back buffer & z buffer need to fit in it, & the whole buffer doesn't have to fit in it at the same time because it can tile (being eDRAM & having that level of speed/lack of latency). having unified ram also means that it won't overflow. on the PS3, if it needs more than 256MB it has to use XDR, and doing that incurs a much larger performance hit than anything possible on 360. if the eDRAM wasn't there it would operate the same as the PS3, only it wouldn't have latency issues w/ split GDDR3 & XDR. your assumption that anything suddenly drops is just wrong. & to put that 10MB in perspective, it was enough to render 2 640p back buffers per frame for halo 3; high & low pass HDR.

sony used XDR because the cell was designed to work w/ XDR. XDR works best w/ the cell, being fast but high-latency. but it works horribly w/ the RSX, which needs fast timings & doesn't need raw speed. Writing to & fetching from XDR slows down the RSX's ability to render. it's that simple.

i said PPU not SPUs. & the RSX is not the same as the 7x00 series. it's just evolved from the same prototype. like i said no 3Dc even though the 7800 had it. don't believe me, register at B3D or gamedev.net & ask the guys w/ the dev kits.

if it took about 3 seconds to get the info, it would have taken you 5 seconds to answer me and tell me what it was called. but you're telling me to google it. google what? just tell me what it is if you found it. 3Dc is not an ATi only technology. ATi just invented & implemented it a generation before nVID adopted it; it's an open standard. S3TC was invented by S3, but RSX uses that; why aren't they using this nVID invented compression tech? probably because it doesn't exist. if it did nVID would be using it in their own graphics cards (instead of 3Dc).

it's not a conversion step like analog to digital. there is no data loss. it's digital to digital. 1080i or 1080p they both wait for all the lines to be delivered, assemble them into a frame, then display the frame. the only difference is the i signal is assembled in a different order.

the end. that's it, this is again all very off-topic. register at B3D, PM me your username & we can discuss this there. maybe if some other people chime in you'll be less resistant to my ideas.
 
Newb question incoming-

I am curious how much space I have on my HDD left, and can't seem to figure it out. What's the easiest way to check free space?
 
[quote name='cdietschrun']Newb question incoming-

I am curious how much space I have on my HDD left, and can't seem to figure it out. What's the easiest way to check free space?[/quote]

Check under System -> System Information. It'll list a whole bunch of stuff, one of which is total/available space.
 
OK, I had a weird experience this morning after playing my PS3 for about 4-5 hours. I had just quit out of GTAIV, ejected the disc, waited about 3-5 seconds, then went to shut down the system.

When I did shut down the system, it beeped 3 times, then shut down after what seemed like an extended shutdown period.

My questions are:

Do I have to sign out of PSN before shutdown? How long is it appropriate to wait before shutting the PS3 down after ejecting a game? And WTFH do the three beeps mean (or could they mean)?

I also had my controller plugged in via the USB cable to let it charge since the battery was low, if that makes any difference.
 
[quote name='IAmTheCheapestGamer']OK, I had a weird experience this morning after playing my PS3 for about 4-5 hours. I had just quit out of GTAIV, ejected the disc, waited about 3-5 seconds, then went to shut down the system.

When I did shut down the system, it beeped 3 times, then shut down after what seemed like an extended shutdown period.

My questions are:

Do I have to sign out of PSN before shutdown? How long is appropriate to wait before shutting the PS3 down after ejecting a game? And WTFH do the three beeps mean(or could they mean?)?

I also had my controller plugged in via the USB cable to let it charge since the battery was low, if that makes any difference.[/quote]
Maybe that's the system going into a low energy consumption mode so the USB ports still charge your controller? If it's the same as the 360 anyway, that's what it does so you can charge while the system is "off".

So do most of you guys take the disc out then wait to shut off the system? I'm usually stuck on one game with my PS3 and shut it off directly from the game screen.
 
Triple beep happens every so often with me. Can't pinpoint it to one thing though. Mostly happens when I try to shut down from in-game, and it's like it can't shut down normally so it has to do it some other way (kind of like in windows when a program dies and you have to hit 'end now', except the ps3 will auto do it for you).
 
[quote name='SteveMcQ']Maybe that's the system going into a low energy consumption mode so the USB ports still charge your system? If it's the same as the 360 anyway, that's what it does so you can charge while the system is "off".

So do most of you guys take the disc out then wait to shut off the system? I'm usually stuck on one game with my PS3 and shut it off directly from the game screen.[/quote]

I usually pop the game disc out and then I normally do wait a bit to shut down, but for some reason last night I waited maybe five seconds before trying to shut the system down after I got back to the XMB after I ejected GTA IV.

I know when I first got the system, I waited until I saw the HDD access light stop flashing completely, but lately I've just been doing things a lot quicker.
 
i've gotten the triple beep before, after 2.40, when GTA froze on me after i tried to launch it. had to put my finger on the power button for like 10 seconds to force it to turn off. i think it might have something to do w/ the system not being shut down right. i rebooted it again after i turned it back on, to make sure everything was loaded correctly back into memory.
 
I can't remember getting triple beeps though I may have...
I think usually it writes data to the drive for a few seconds and shuts down-the whole process can take a little bit, it's not instantaneous for me.

I don't eject the game. You should if you're going to move the system I guess, but otherwise I've never removed games when turning off the system-shouldn't need to if the drive is remotely normal.

[quote name='TURBO']calling eDRAM ram is like calling a Ferrari transportation.[/quote]

No it's like calling a car a car. It's RAM. I'm making no judgment call on what type of RAM it is, how fast it is, or anything else. I'm calling it by the words that describe it. The "e" just stands for embedded for that matter.

[quote name='TURBO']like i said only the back buffer & z buffer need to fit in it & the whole buffer doesn't have to fit in it at the same time because it can tile[/quote]

You're not understanding what that means. Yes, it can tile. I know that. THAT IS IN FACT MY POINT. The higher the resolution it's trying to render at, the less of the frame that fits into that 10MB RAM, the more it has to write to main system RAM. Main system RAM is VASTLY slower than that 10MB. The more you're having to rely on main system RAM, the slower it is-the system's effective memory bandwidth drops off more and more the higher the resolution it's rendering at.

In other words, if you were doing 480p and can fit an entire frame in that 10MB, then the effective video bandwidth is whatever that 10MB is, and also it's not having to eat up bandwidth the rest of the system's using. But each time you increase the resolution, a smaller and smaller percentage of the frame is fitting into that 10MB, it's having less and less of a benefit - the video system's effective memory bandwidth falls.

This is in ADDITION to the image being harder to render just because there's more pixels being rendered. There is *NOT* this same kind of rapid drop off in performance on a more normal PC/PS3 setup, where you have the same effective memory bandwidth regardless of the resolution being targeted because you have a large chunk of video RAM. In a more normal design the performance is going to fall off more or less linearly. The 360 is going to just plummet as soon as it has to start tiling and throwing part of the image in main system RAM.

[quote name='TURBO'](being eDRAM & having that level of speed/lack of latency).[/quote]

First of all, "eDRAM" just means it's embedded, it doesn't mean it's magical or anything - in the case of the 360, its video RAM is much faster than main system RAM. What you're ignoring is that the 360 LOSES that speed more and more the less of a frame that's fitting into that 10MB.

[quote name='TURBO']on the PS3 if it needs more than 256MB it has to use XDR and doing that incurs a much larger performance hit than anything possible on 360.[/quote]

Why? That's absolutely not the case. The PS3's GPU can freely use main system RAM too. It's a hassle in that AFAIK a programmer is going to have to worry about where to store textures, etc. if they need more than 256MB (unless they write a system that manages that for them), but it has the advantage of giving the Playstation 3 a bit over double the total system bandwidth that the 360 has...excluding its 10MB video RAM. And again, that's an issue with higher resolutions, because the higher the resolution, the lower the effective memory bandwidth the 360 has, whereas the PS3's total bandwidth is always around 50GB/s regardless of the resolution it's targeting (exactly like a normal PC setup). Do *NOT* say "oh, but you can't ignore the 10MB"...I know that. What you keep ignoring is that the 10MB is less and less effective the higher the resolution.

[quote name='TURBO']put that 10MB in perspective, it was enough to render 2 640p back buffers per frame for halo 3; high & low pass HDR.[/quote]

Even if true, you'll note that's not even 720p, and my entire point is the effective bandwidth drops off the higher the resolution.

I can't find the articles I've read that show the exact size you need for a given resolution, and I don't trust my own calculations, but with what Microsoft requires, HD resolutions don't fit entirely in that 10MB:
http://www.beyond3d.com/content/articles/4/5
...if I could find the exact figures again you might get a better feel for this, as you can flat out see the percent of the image fitting in that 10MB shrinking as the resolution goes up.

[quote name='TURBO']i said PPU not SPUs. & the RSX is not the same as the 7x00 series. it's just evolved from the same prototype. like i said no 3Dc even though the 7800 had it. don't believe me, register at B3D or gamedev.net & ask the guys w/ the dev kits.[/quote]

As I said, from a quick Google, the 7800 series did NOT support 3Dc, it just converted that format to the format it used. And again you're missing the point that that's just an ATi thing. It's not a shock Nvidia is going to support their own compression technology and not ATi's branded technology. That doesn't in and of itself mean it's any worse. It could be better for all we know, but regardless it's not limited to 10 year old technology like you're claiming.

[quote name='TURBO']...S3TC was invented by S3, but RSX uses that; why aren't they using this nVID invented compression tech? probably because it doesn't exist.[/quote]

I have no idea why you're insisting Nvidia has done nothing with compression technology in the last 10 years.

[quote name='TURBO']if it did nVID would be using it in their own graphics cards (instead of 3Dc).[/quote]

They are. Google is your friend. You seriously think they were just sitting there while ATi was implementing updated texture compression? I don't remember what they called it, or even if they gave it some cutesy brand name, but for that matter that's probably what their drivers were converting ATi's texture compression into.

[quote name='TURBO']it's not a conversion step like analog to digital. there is no data loss. it's digital to digital. 1080i or 1080p they both wait for all the lines to be delivered, assemble them into a frame, then display the frame. the only difference is the i signal is assembled in a different order.[/quote]

Even if you had a deinterlacer that could do it perfectly, you've still got the issue that you have half the frame rate in that case. But deinterlacers are NOT flawless.
 
[quote name='Wolfpup']...[/QUOTE]

a Ferrari is a car in a sense. But you don't call it that. You call it a supercar. eDRAM is not normal ram, just like a Ferrari is not a normal car. understand? When you talk about RAM you're talking about main memory, not embedded. That's the general language in the industry.

You seem to be under the impression the 360's main RAM is slower than the PS3's; you'd be wrong.

The bandwidth between the eDRAM & the Logic (alpha & Z) unit is 256GB/s, the bandwidth between the eDRAM & the GPU is 32GB/s (each way). the bandwidth between the GPU & the GDDR3 is 22.4GB/s (each way.).

PS3 (not even taking into account the latency of XDR) reads XDR at 15.5GB/s & writes to it at 10.6GB/s. the GDDR3 is 22.4GB/s (each way). Even if the eDRAM didnt exist (which it does) the 360's GPU would still have a significant memory advantage.

Also quit saying PS3 & PC. PC has nothing to do w/ PS3. the buses are completely different, and PCs also don't (& don't need to) share main memory, since the vid cards alone usually have as much if not more memory than a console. PowerColor's releasing a 4850 w/ 2GB, & that's a mid range card now.

There is no rapid drop off in performance. the only drop off is in the amount of free AA the eDRAM can offer. & that AA can be dynamic, like Dead Rising or Forza 2, so as to not affect frame rate at all. This is the main reason the 360 almost always has at least double the AA in cross platform games (GRID, GRAW, GTA4, Lost Planet, NBA Street Home Court, Soldier of Fortune Payback, Pro Evo 08, Virtua Fighter 5, etc.). The cell does allow the PS3 an AF advantage over the 360, however.

it's 2 back buffers. that's essentially 2304x1280 pixels per frame (though it doesn't translate exactly to that since it doesn't need to render geometry twice). & of course the whole framebuffer isn't going to fit into 10MB; it's not supposed to. Honestly, why do you think there's GDDR3 connected to the GPU if they weren't going to use it? You think it does everything in 10MB of eDRAM & then just spits it out to your TV? unpossible. It uses the eDRAM to do 10MB at a time, faster than could be done w/ the GDDR3. it also effectively doubles bandwidth, because calculations which need to be sent twice (like halo 3's 2 pass HDR) only have to go through the main memory pipeline once instead of twice.
& the calculations for Back & Z are
Back-Buffer(s) = Pixels * FSAA Depth * Rendering Colour Depth (may include multiple render targets for deferred rendering techniques)
Z-Buffer = Pixels * FSAA Depth * Z Depth (usually 32-bit depth)
that doesn't change anything I've said, because that's what I've said.
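
Plugging numbers into those two formulas shows where the tiling comes from, assuming 32-bit colour and 32-bit Z (the usual back-of-envelope assumption; real titles vary):
[code]
EDRAM_BYTES = 10 * 1024 * 1024   # the 360's 10MB of eDRAM

def buffer_bytes(width, height, fsaa=1, colour_bytes=4, z_bytes=4):
    """Back-buffer + Z-buffer size using the two formulas quoted above."""
    pixels = width * height
    return pixels * fsaa * colour_bytes + pixels * fsaa * z_bytes

for name, w, h, aa in [("480p", 640, 480, 1),
                       ("720p", 1280, 720, 1),
                       ("720p 2xAA", 1280, 720, 2),
                       ("1080p", 1920, 1080, 1)]:
    size = buffer_bytes(w, h, aa)
    tiles = -(-size // EDRAM_BYTES)             # ceiling division: eDRAM tiles needed
    print(f"{name}: {size / 2**20:.1f} MB -> {tiles} tile(s)")
[/code]
So a plain 720p frame just about fits in one pass, while 720p w/ 2xAA or a full 1080p frame has to be split across two tiles, which is the free-AA / tiling trade-off being argued over here.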

The 7800 supported 3Dc in its drivers through embedded V8U8; it was about 50% slower, but it was there. & it's there on the 8800 & up embedded. again, it's not on the RSX, it is on the Xenos. Theoretically 3Dc could be supported by using the cell, but that would force XDR to be used, as the connection between the Cell & the GDDR3 is 16MB/s (not a typo).

I didn't say they didn't do anything. But most of what they've done has been cooperating w/ MS, Intel & AMD doing DXT optimizations; their only DXN has been the V8U8, which isn't 1/2 as good as 3Dc. Which is why they support 3Dc now. You think devs are going to write their games for 2 different compression methods? no, 3Dc is the best out, so that's what they use & that's why it's on nVID cards.

Quit telling me to google things. If you googled it, then provide what you found. & yes, you seem to sort of understand how 3Dc was supported on the 7800 series; drivers & V8U8. but again, all their cards now have 3Dc on the silicon, because it's the best normal map compression & that's what makes it what PC game devs use.

unless your game is running 120FPS the frame rate won't matter. even older 1080i sets run at 120Hz, which is 60FPS displayed at 1080p. i'm not sure why you're obsessing over 1080i either, since the PS3 doesn't have a real scaler. & the games which add this support (like Drake's Fortune) do it for 1080i. That 700+ page massive thread of pissed off PS3 owners at the official playstation site makes that abundantly clear.

i wasn't making it up when i said that the 360 has the better GPU & the PS3 has the better CPU. It's reality. Accept it. If you dont believe me do what i keep suggesting. Register at GameDev.net or Beyond3D & ask other people.
 
I have a question about custom soundtracks. I know it was put into the latest update. If I get a game that supports the feature, can I stream over music from my pc? Or does the music have to be saved on to the hard drive?
 
[quote name='XxFuRy2Xx']I have a question about custom soundtracks. I know it was put into the latest update. If I get a game that supports the feature, can I stream over music from my pc? Or does the music have to be saved on to the hard drive?[/quote]

It can't be streamed. Has to be on your HDD. (Which sucks.)
 
Thanks Rig. I was excited about the custom soundtrack feature, but now I'm bummed to find out that I can't stream from my PC. Why the hell is that anyway?
 
[quote name='XxFuRy2Xx']Thanks Rig. I was excited about the custom soundtrack feature, but now I'm bummed to find out that I can't stream from my PC. Why the hell is that anyway?[/QUOTE]
It's because it's not designed on PS3 to read from the network, just the HDD. Personally, I don't care since all my music goes to the HDD.

It's like this, 360 owners are more so into streaming, while PS3 owners are more so into having a large HDD storing everything (that's what I've been doing, since I do not like having to media stream).
 
[quote name='The Mana Knight']It's because it's not designed on PS3 to read from the network, just the HDD. Personally, I don't care since all my music goes to the HDD.

It's like this, 360 owners are more so into streaming, while PS3 owners are more so into having a large HDD storing everything (that's what I've been doing, since I do not like having to media stream).[/quote]
Sheesh...you really do have an explanation/excuse for everything Sony related, don't you? ;)

Question: I have a monitor I use for the PS3 in my room that's 1080p. Most games only support 720p and thus look like shit on a 1080p native screen. If I check off 720p and 1080p in the XMB settings screen, will it auto-adjust resolutions if a game supports 1080p or will it default to the 720p resolution?
 
[quote name='The Mana Knight']It's because it's not designed on PS3 to read from the network, just the HDD. Personally, I don't care since all my music goes to the HDD.

It's like this, 360 owners are more so into streaming, while PS3 owners are more so into having a large HDD storing everything (that's what I've been doing, since I do not like having to media stream).[/quote]
Wha? I can stream music over when I'm not playing a game, so it doesn't make sense that I can't do it while I am playing a game...
 
OK, I had my second perpetual 'loading' screen on GTA IV tonight. I ended up having to quit the game and the PS3 restarted itself, after three beeps.

What the hell is with this system and 3 beeps?

Either way, I ended up having to turn off auto loading for games, since it kicked me right back into GTAIV and I had to quit the game while it was loading AGAIN.

I mean, I know every console out there has that option now, but why? It's so fuckin' annoying if you're trying to check other stuff (like PSN invites and messages and crap); you get locked out of it until the game is finished loading.

Either way, are the three beeps when you have to quit out of a game that's stuck on a loading screen a really bad occurrence, or am I just overly paranoid (as usual)?
 