Why can't 480p games look as good as dvd?

vherub

I haven't been able to find this answer online.
If a game system outputs to "only" 480p, like the Wii or the Xbox, and 480p can display DVDs (which look pretty damn good), why do none of the games, even those on PS3 or Xbox 360, look as good as a DVD does in 480p?
It would seem to me that 480p is not the limiter, but rather some other link in the graphics output chain is.
 
They do look as good as DVD to me. Or as good as DVD does relative to how each looks on an SDTV.

It can look fine IMO, but much worse than 720p.
 
[quote name='Mookyjooky']Because real people make up a bazillion polys.[/QUOTE]

Oh, yeah. Maybe I was misinterpreting what the OP meant.
 
It has to do with the smoother edges. Your Finding Nemo and Incredibles type graphics were done with computers that are MUCH more powerful than what your game system can do, and there's no math to do at playback since every frame is already rendered on the disc.
Also, if you're using a TV that has a higher resolution than 480p and you're watching with a full screen stretch (basically anything other than 1:1 pixel mapping), then you're asking that 1 pixel to all of a sudden be two or three, and then things get ugly real quick.
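That stretch problem is easy to see in code. This is a hypothetical sketch (not how any console's scaler actually works) of nearest-neighbor stretching a row of pixels by a non-integer factor: some source pixels get duplicated and some don't, which is exactly the unevenness described above.

```python
# Nearest-neighbor stretch of a 1-D scanline. At a 1.5x ratio
# (e.g. 480 lines shown on a 720-line panel), some pixels are
# doubled and some are not, producing uneven, blocky edges.

def nearest_neighbor_stretch(row, out_width):
    """Stretch a row of pixel values to out_width samples."""
    in_width = len(row)
    return [row[i * in_width // out_width] for i in range(out_width)]

row = [10, 20, 30, 40]                      # 4 distinct source pixels
print(nearest_neighbor_stretch(row, 6))     # 1.5x stretch
# -> [10, 10, 20, 30, 30, 40]: pixel 10 appears twice, pixel 20 once
```

A 1:1 pixel-mapped display avoids this entirely, which is why the quoted post singles it out.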
 
Games don't look so hot all the time mainly because of texture resolution and compression techniques. I will add that the meshes used in games could be higher resolution, which would give you smoother looking edges around things.

Higher anti-aliasing (AA) would clean up the edge quality of the meshes regardless of the mesh size. What it does is resample the geometry edges: 2x AA takes two samples per pixel, 4x AA takes four, and so forth. This requires processing power, and if you throw the CPU at it, it can't be doing something else. I believe this function is handled on the graphics chip, but if you're overwhelming it with AA routines, it might take a performance hit trying to do everything else too. It's all a balancing act.
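The idea behind that resampling can be sketched with the simplest form, supersampling: render at double resolution, then average each 2x2 block down to one pixel, so a hard edge becomes intermediate shades. (Illustrative only; real GPUs use smarter multisampling schemes, not this brute-force version.)

```python
# Supersampling-style AA: average each 2x2 block of a high-res
# grayscale render down to one output pixel.

def downsample_2x(img):
    """Average each 2x2 block of a 2-D grayscale image."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) // 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard diagonal edge rendered at 2x (0 = black, 255 = white):
hi_res = [
    [255, 255, 255,   0],
    [255, 255,   0,   0],
    [255,   0,   0,   0],
    [  0,   0,   0,   0],
]
print(downsample_2x(hi_res))
# -> [[255, 63], [63, 0]]: the edge pixels come out as in-between
#    grays instead of a jagged black/white step
```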

The main issue though is the compression on the texture maps. I assume you've seen a really compressed jpeg image before, or even a highly compressed avi, mpeg or mov file. It's the same deal. Images with high compression on them tend to look muddy or chunky. Images with low or no texture compression on them look clean and sharp.
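To make the texture-compression point concrete, here is a hedged sketch in the spirit of block formats like S3TC/DXT1: each block keeps only a couple of endpoint values plus a tiny index per pixel, so fine detail inside the block is thrown away, which is the "muddy or chunky" look. (Grayscale is used for simplicity; real DXT1 works on 4x4 RGB blocks.)

```python
# Block compression sketch: quantize a 4x4 block of grayscale values
# to a 4-level palette interpolated between the block's min and max.

def compress_block(block16):
    """Return (palette, per-pixel 2-bit indices) for 16 grayscale values."""
    lo, hi = min(block16), max(block16)
    palette = [lo, lo + (hi - lo) // 3, lo + 2 * (hi - lo) // 3, hi]
    indices = [min(range(4), key=lambda i: abs(palette[i] - v)) for v in block16]
    return palette, indices

def decompress_block(palette, indices):
    return [palette[i] for i in indices]

block = [10, 12, 15, 200, 11, 13, 180, 210, 14, 170, 205, 215, 160, 200, 210, 220]
pal, idx = compress_block(block)
print(decompress_block(pal, idx))   # every value snapped to one of 4 levels

# Storage math for real DXT1: a 4x4 RGB block is 48 bytes raw but
# only 8 bytes compressed, a fixed 6:1 ratio.
print(48 / 8)  # -> 6.0
```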

Another thing to keep in mind is that when you get a ton of models in a game onscreen at the same time, you have to share the texture memory in the console with all of the models and geometry. If you add in things like reflections and refraction (bending of light through transparent objects), you need to create multiple versions of the same thing.
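The memory-sharing point is just arithmetic. A back-of-envelope budget (the numbers here are illustrative, not any specific console's):

```python
# Hypothetical texture-memory budget math.

texture_ram = 32 * 1024 * 1024       # assume 32 MiB available for textures

# One uncompressed 512x512 RGBA texture:
tex_bytes = 512 * 512 * 4            # 1 MiB
print(texture_ram // tex_bytes)      # -> 32 such textures fit

# Add a reflection and a refraction pass: each render-to-texture copy
# of the scene comes out of the same pool, shrinking the model budget.
reflection_buffer = 512 * 512 * 4
print((texture_ram - 2 * reflection_buffer) // tex_bytes)  # -> 30
```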

Not to mention any other effect that's thrown in there, like glows and shading types to make things look dull or shiny.

A related issue to visual quality is frame rate. When you have too many things onscreen at the same time, probably using more texture memory than is available, the console slows down. While it isn't solely a texture problem, it is related.
 
Also nasum, one of the guys who worked on RenderMan (Larry Gritz), which is what Nemo and the other Pixar movies are rendered with, made his own version of the RenderMan renderer a while back while in grad school, and eventually went to work for Nvidia. Why do I bring this up? Nvidia makes console graphics chips, among other things. His renderer, called BMRT, somehow got bought up by Nvidia if I remember correctly, and Nvidia rolled his tech into their hardware. Nvidia at the time was stating Toy Story graphics would be possible on their boards. There's a new Nvidia product that can approach that level in realtime if I am not mistaken:

http://www.nvidia.com/page/quadroplex.html

http://www.nvidia.com/page/gz_home.html

That second link is very interesting: it lets people use the Nvidia boards in their computers (normal cheap ones) to render their animations at a Toy Story quality level in very little time.

I assume the next gen or two will have graphics that rival Toy Story. That second link is several years old.

[quote name='nasum']It has to do with the smoother edges. Your Finding Nemo and Incredibles type graphics were done with computers that are MUCH more powerful than what your game system can do, and there's no math to do at playback since every frame is already rendered on the disc.
Also, if you're using a TV that has a higher resolution than 480p and you're watching with a full screen stretch (basically anything other than 1:1 pixel mapping), then you're asking that 1 pixel to all of a sudden be two or three, and then things get ugly real quick.[/quote]
 
In terms of PC graphics, they've exceeded Toy Story quality, at least with DirectX 10 and Crysis.

So should the next generation of consoles (though the way things are going, I wouldn't expect that for at least 6 years, unless Nintendo wants to release a next gen system before then).
 
[quote name='DirtRoadSport']Pretty much sums it up perfectly.[/quote]


Polygon count is less the problem than image and edge quality.
 
[quote name='nasum']It has to do with the smoother edge. Your Finding Nemo and Invincibles type graphics were done with computers that are MUCH more powerful than what your game system can do.[/QUOTE]

LIES! Sony has repeatedly said and proven that even the PS2 is capable of realtime CGI quality graphics :roll:
 
Actually, a few years back I saw a realtime version of that horrid Final Fantasy: The Spirits Within movie being played through an array of PS2's.

from: ACM SIGGRAPH 2000: Views from a game developer

Consumer products were on show at the expo, displaying the latest and greatest in hardware, software, and some really cool display technologies. Sony's GSCube was particularly frightening, rendering a shot from "Final Fantasy" the movie in realtime.

http://www.siggraph.org/gen-info/game_outreach.html
 
Wrong question. What you really want to know is why we don't have perfect photorealistic graphics at NTSC levels.

Because it's extraordinarily friggin' hard to do.

Look at the state of the art for attempting this in CG movies that take months or years to render. FFVII: Advent Children was amazing to behold but very few would be fooled by any of the characters in the movie. They looked good and could be very engaging but never mistaken for actual actors performing for a camera.

But that is for a level of detail in which progress is measured in frames per day, with frequent intervention for tweaking by the artists. Even if a single box could render material of that quality at a decent framerate, qualifying as real-time, you'd still have the tweaking. On top of that, it would just be a movie. Adding interactivity bumps up the processing demands an order of magnitude.
 
So reading these replies, I guess my question becomes: 5 years from now, if a brand new console outputs to 480p, can it look better than a PS3 or Xbox 360 that today outputs at 1080, assuming that its processing and graphics power is substantially higher?
Or phrased another way, can a system that is powerful enough to render DVD quality images on the fly, even if it is outputting at 480p, look better than another system outputting at 720 or 1080?
 
[quote name='vherub']So reading these replies, I guess my question becomes: 5 years from now, if a brand new console outputs to 480p, can it look better than a PS3 or Xbox 360 that today outputs at 1080, assuming that its processing and graphics power is substantially higher?
Or phrased another way, can a system that is powerful enough to render DVD quality images on the fly, even if it is outputting at 480p, look better than another system outputting at 720 or 1080?[/QUOTE]

Yes, but IMO there's no way anyone's going to make a system that can't do HD, and any system that crazy powerful would easily be able to do HD.
 