Pixel shading for Gamecube games?

AdultLink

I just got Splinter Cell: Pandora Tomorrow for GC, and I noticed it uses pixel shaders for water :shock:

These apparently are pixel shader 1.0, the crappy kind from GeForce 4, but they are definitely pixel shaders.

I've also noticed them in the squadron games, and one other (not in Mario Sunshine though, it used something else...)

What other GC games use this?

And BTW, why don't other GC games use pixel shaders? This effect can have a HUGE impact on your game, since water can take up a huge area... it can change your game from 'ok' to 'good' or great even! Just look at Morrowind...
 
The GameCube doesn't have dedicated hardware in the Flipper chip to accomplish this function, so it has to be done in software at a high cost in clock cycles. Making this work with lots of other interactive stuff going on at the same time makes for intense amounts of time spent tuning the affected scenes.

Mario Sunshine was far from a graphics showcase, but even so, considering the role water plays in that game, using such effects on the water would have brought some scenes to a crawl. If they were doing the same game today, with the better codebase established for GC developers to access in middleware products, it would probably look a fair bit better, but you can't pay the bills if you never ship any games while waiting for the programmers to come through with another incremental performance gain.

Such effects have been used since early on and frequently on the Xbox because the dedicated shader hardware in the XGPU pipelines makes it almost trivial in terms of overhead. This is the sort of thing where dedicated hardware is much more effective than sheer horsepower and software.
 
Somebody obviously doesn't know their Gamecube very well...

Or even their consoles very well...

Or PCs...

2 things:

1. Most hardware-related things must be encoded into hardware, or can't work, like texture filtering (try getting pixel shading to work on a video card that doesn't allow pixel shading... oh wait, you can't).
2. Even if you COULD somehow get pixel shading to work in software mode, considering it is used in the squadron games, and considering how much goes on in them, running it in software, and at 60 fps, would mean the GC is 3 times more powerful. Don't tell me that. Don't even tell me the GC is more powerful than the Xbox, because it's not.

The truth is, the GC is easy to work with, but the special effects, like pixel shading, are hidden, and very hard to program. My question is, why didn't the programmers at least try? It would make a world of difference!
 
[quote name='AdultLink']Somebody obviously doesn't know their Gamecube very well...

Or even their consoles very well...

Or PCs...

2 things:

1. Most hardware-related things must be encoded into hardware, or can't work, like texture filtering (try getting pixel shading to work on a video card that doesn't allow pixel shading... oh wait, you can't).
2. Even if you COULD somehow get pixel shading to work in software mode, considering it is used in the squadron games, and considering how much goes on in them, running it in software, and at 60 fps, would mean the GC is 3 times more powerful. Don't tell me that. Don't even tell me the GC is more powerful than the Xbox, because it's not.

The truth is, the GC is easy to work with, but the special effects, like pixel shading, are hidden, and very hard to program. My question is, why didn't the programmers at least try? It would make a world of difference![/quote]

Somebody should do a little self-education before throwing insults around.

Hidden? You've got to be kidding. Why would Nintendo hide having full buzzword compatibility on their spec sheet?
http://www.nintendo.com/techspecgcn
The PR value alone for having such a feature makes it pretty much impossible that Nintendo wouldn't be telling anyone who cared that the feature was in there.

The GameCube does not have pixel shader hardware, period. It has a mechanism that is sort of half of a pixel shader. Good programmers can enlist it to make software shaders effective if used sparingly, in the places where they'll be most appreciated.

Perhaps you haven't noticed, but 3D games on computers were thriving before the first 3D video boards for consumer PCs shipped. Starglider on the Atari ST was a big deal when it came out despite being all in vector graphics a la the first Star Wars arcade machine. People were completely stunned when Starglider 2 came out with filled polygons and hidden surface removal but ran on the very same Atari ST (also Amiga, PC, and Mac eventually). Yet those systems had not a single bit of hardware within them for the specific purpose of generating 3D images. It was all software. After showing Starglider 2 to Nintendo, Argonaut suggested that much of their software for doing this stuff could be implemented as a chip inexpensive enough to go in a video game cartridge. This led to the creation of Star Fox, which I tend to think of as Starglider 3 in some ways. Thus the chip that brought the first dedicated 3D hardware to the console market had been implemented in software first.

Pixel shading has been around for a long time, much longer than the first hardware implementations done at SGI. It was done entirely in software for rendering systems where individual frames could take hours each to produce. The Pixar RenderMan software was one of the first commercial packages to allow users to create their own shaders back in the '80s. A general purpose CPU can perform any computational task that dedicated hardware can. It's just a matter of how much time it will take vs. the other things you need to get done.
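To make that concrete, here's a rough sketch in plain C++ of what 'pixel shading in software' boils down to (purely illustrative; this isn't RenderMan code or anything from a console SDK, and the lighting math is just a generic diffuse-plus-highlight example): a small function the CPU runs once per pixel, with the cost scaling directly with resolution and shader complexity.

[code]
// Minimal software "pixel shader" sketch: the CPU runs a small shading
// function once per pixel. Purely illustrative; not code from any SDK.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// The "shader": Lambert diffuse plus a cheap specular highlight.
static uint32_t shadePixel(Vec3 normal, Vec3 lightDir, Vec3 baseColor) {
    float diff = std::max(0.0f, dot(normal, lightDir));
    float spec = std::pow(diff, 16.0f) * 0.5f;           // fake highlight
    auto to8 = [](float c) { return (uint32_t)(std::min(1.0f, c) * 255.0f); };
    return (to8(baseColor.x*diff + spec) << 16) |
           (to8(baseColor.y*diff + spec) << 8)  |
            to8(baseColor.z*diff + spec);
}

int main() {
    const int W = 640, H = 480;                          // NTSC-ish frame
    std::vector<uint32_t> frame(W * H);
    Vec3 light = normalize({0.3f, 0.8f, 0.5f});
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            // Pretend normal; a real renderer would read it from geometry or a map.
            Vec3 n = normalize({ x/(float)W - 0.5f, y/(float)H - 0.5f, 1.0f });
            frame[y*W + x] = shadePixel(n, light, {0.2f, 0.4f, 0.9f});
        }
    return 0;
}
[/code]

At 640x480 that little function runs roughly 300,000 times per frame, or around 18 million times a second at 60 fps, which is why this sort of thing gets budgeted very carefully on a console CPU.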

Modern systems are immensely faster, not just thanks to dedicated hardware functions but in general. Functions that would once have taken hours to perform now take milliseconds. Fast, but it can still slow things down if you do too much of it or don't know how to balance out your rendering loop. What you call pixel shading on the GameCube isn't, by most definitions. It's just very good use of the GC's very capable texturing functions and polygon-level shading. It makes the games where it is applied look good, but it cannot duplicate those things the Xbox can produce without slowing things down terribly. That isn't necessarily a problem for non-interactive sections of a game, such as scenes rendered in realtime by the game engine rather than streamed video from the disc.

If you don't believe sophisticated shading can be done in software, take a look at Metal Gear Solid 2 on the PS2. At the time it was first demoed, people were amazed because the coding team had managed to wring effects out of the PS2 beyond what most people had seen previously in any interactive material. This depended on long hours spent squeezing everything they could from the PS2 to have those scenes look that good while still allowing the player to do things. An interesting twist is how they screwed up the Xbox version. If you were starting from scratch, doing that game on the Xbox would be vastly easier since the pixel shaders would reduce the overhead massively. Instead they tried to port over as much of the PS2 code as possible. This didn't work well at all and largely ignored the dedicated functions of the XGPU by trying to do it the PS2 way.

If you have to do it in software, the Emotion Engine can outperform the Xbox's P-III by a good bit on that type of operation, but that isn't how an Xbox game should be designed. If you ignore the XGPU's features you're going to lose a major portion of the Xbox's advantage. The GameCube doesn't have all the whizbang features that became the major buzzwords in PC gaming around the time it launched. What it does have is a versatile hardware design that allows skilled coders to achieve things not explicitly offered in the hardware specs.
 
[quote name='epobirs']The GameCube doesn't have dedicated hardware in the Flipper chip to accomplish this function, so it has to be done in software at a high cost in clock cycles. Making this work with lots of other interactive stuff going on at the same time makes for intense amounts of time spent tuning the affected scenes.

Mario Sunshine was far from a graphics showcase, but even so, considering the role water plays in that game, using such effects on the water would have brought some scenes to a crawl. If they were doing the same game today, with the better codebase established for GC developers to access in middleware products, it would probably look a fair bit better, but you can't pay the bills if you never ship any games while waiting for the programmers to come through with another incremental performance gain.

Such effects have been used since early on and frequently on the Xbox because the dedicated shader hardware in the XGPU pipelines makes it almost trivial in terms of overhead. This is the sort of thing where dedicated hardware is much more effective than sheer horsepower and software.[/quote]

He's pretty much right. If I remember correctly (epobirs, help me out remembering this game), on the PS1 there was a game with a similar situation where the feature wasn't built into the hardware but was actually coded in software. If I'm correct, they actually did that technique with a couple of games on the PS1.
 
Wow, nice reply.

I also like the way you compared 3D technology, WHICH IS USABLE ON ANYTHING, to pixel shaders, which are hardware-based, as an example of how anything can be software or hardware. That really shows you know what you are talking about :roll:

But I did a bit of looking, and you were right about one thing: It can be emulated in software. There's just one problem...

http://www.shaderx.com/direct3d.net/tutorials/shader/shader4.html

won't run on graphics hardware that doesn't support pixel shaders.

Or specifically, follow this link:

http://www.gamedev.net/community/forums/topic.asp?topic_id=196717

Yes, so somebody who is using pixel shaders FOR SIMPLE THINGS in software mode, ON A GEFORCE 4 MX (probably more powerful than a Gamecube, sadly), is getting... 10 fps.

That REALLY is what the GC is using. And the PS2. Yeah.

I really enjoy it when fools post stuff they have no clue about, which you can easily check online, and then make an ass out of them. Maybe you should go join GameFAQs. They have millions just like you.

BTW, if the PS1 only has a 33 MHz processor and can't do good textures or even emulate texture filtering, there's NO WAY IN HELL it could do pixel shading.
 
You are truly a jackass. You throw around claims that others are ignorant while amply demonstrating that attribute in yourself.

Read slowly so it sinks in: ANY COMPUTATIONAL OPERATION CAN BE PERFORMED IN SOFTWARE USING A GENERAL PURPOSE PROCESSOR. Hardware elements may accelerate portions of the task without having a complete hardware implementation of the function. A very good example of this is the hardware assist features in most modern video chips for reducing the overhead of playing DVD video on a PC. An x86 processor has been able to do this purely in software since the days of the 300 MHz P-II, but it completely consumes the system to do so. Putting a dedicated MPEG-2 decoder in the system is usually considered an unwarranted expense, but placing a subset of that function set that covers the most computationally intense portions gets the job done for much lesser cost. Rather than including a $50 dedicated chip you can add about $5 worth of transistors to a video chip to have 70% of the same effect. There is still some CPU load, but now the system can be available for other tasks while feeding DVD playback out to a TV.
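To give a concrete feel for the kind of per-pixel grunt work those video chips took over, here's a rough C++ sketch of the YUV-to-RGB color conversion that is one small piece of getting MPEG-2 video onto the screen (illustrative only; BT.601-style constants, and a real player does a lot more than this):

[code]
// Software YUV -> RGB conversion, one small piece of video playback that
// hardware assist in a video chip can take over. Illustrative sketch only.
#include <algorithm>
#include <cstdint>

static uint8_t clamp8(int v) { return (uint8_t)std::min(255, std::max(0, v)); }

// BT.601-style conversion for one pixel.
void yuvToRgb(uint8_t y, uint8_t u, uint8_t v,
              uint8_t& r, uint8_t& g, uint8_t& b) {
    int c = y, d = u - 128, e = v - 128;
    r = clamp8(c + (int)(1.402f * e));
    g = clamp8(c - (int)(0.344f * d) - (int)(0.714f * e));
    b = clamp8(c + (int)(1.772f * d));
}
[/code]

At DVD resolution and frame rates that's over ten million of these conversions every second before you've even touched the actual MPEG-2 decode, which is exactly the sort of load a few dollars of extra transistors can lift off the CPU.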

You've really made your failure to pay attention apparent with both of those links. Each is in the specific context of DirectX. The DX8 pixel shader support was largely defined by Nvidia, a company with a vested interest in selling the then-new GeForce 3 rather than providing such operations on earlier hardware. Another big reason as well is the very purpose of DirectX, which is to provide hardware abstraction, enabling all different brands of chips to be addressed by a single code base that is translated to the correct individual instructions by the vendor-specific driver.

Nvidia has always had a big influence on DirectX thanks to being the first chip company to make a commitment to doing things Microsoft's way as opposed to using a subset of OpenGL like the then-leader in gaming cards, 3Dfx. Early on, though, too large a portion of PCs had no 3D hardware at all, and it was deemed a bad idea not to offer software mode support to those systems when it was viable in the particular game. This turned out to be a largely wasted effort as many publishers soon took to placing '3D Accelerator Required' stickers on boxes. (3Dfx wisely financed this as part of their PR budget. Games with the label and a recommendation for a Voodoo card got co-op advertising funds, similar to the 'Intel Inside' campaign.) By the time of DirectX 8, years later, it wasn't seen as worth the trouble to provide a software shader fallback function. Nvidia had such code they'd developed as part of the process for designing the GeForce 3 hardware, but as mentioned before they had a vested interest in seeing the new chip get sold by encouraging developers to go whole hog with effects that would be overwhelming in a software fallback. Nvidia has similar code for their DX8 class chips they used in the process of designing the DX9 generation. But that doesn't mean PC game developers weren't rolling their own. Plenty of PC games at the time had moments in the game where frame rate wasn't an issue and a software effect could be used for a bit of extra dazzle. It didn't get spoken of as pixel shading because few people outside of the CG industry knew the term. It was just 'that nice effect in that one part.'

Now we come to the big question: Why do you think the self-imposed limitations of DirectX have anything whatsoever to do with GameCube programming? Are you under the belief Nintendo is shipping a Microsoft-owned API on their platform? GameCube developers are free to do whatever they can squeeze out of the system. This, on every platform, includes effects that would have much less overhead if they were hardware functions but are nonetheless worth doing if they make the game look better without disrupting play.

Apparently you aren't too up on the Nvidia product line. The GeForce 4 MX, despite the name, is not a cut-down GeForce 4. It is an improved GeForce 2, mainly with upgrades to video playback assist functions. At NTSC resolution it is in no way superior to the GameCube. At higher resolutions it could have an advantage, but since the GC is only intended to drive NTSC/PAL displays that difference is meaningless. Unlike the XGPU, the GC was designed from the ground up as a game console for TVs and doesn't have a lot of modes for things it would rarely be called upon to perform. This is one of the reasons for its lower cost to produce.

I wasn't kidding when I said Pixar was shipping animation software with a shader language built in long, long before Nvidia made it a buzzword term. Have a look at this from 1992:
http://www.renderman.org/RMR/Books/sig92.course21.pdf
The DirectX and OpenGL shader concepts are direct descendants of this material.

For another cutting edge example of a function that has long been purely software and a source of major CPU loads, there is physics. The situation is the same. You have a set of mathematical functions that are used in different combinations over and over to achieve a given goal. A realistic animated rock slide in a game, for instance. If those complex functions are implemented in hardware you could get a lot more rocks tumbling towards the player and offer more realistic behavior in their movement. The trick is being able to identify those mathematical functions that are most needed and create a silicon version that has a low enough cost to sell to consumers. High-end simulation systems have long used expensive FPGA setups for such acceleration. A $5,000 board is reasonable for a film studio rendering such effects for a movie. To get this into computer games the price has to be a lot, lot lower. One company thinks they have the first such product: http://www.ageia.com/index.html
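As a rough idea of what that repeated math looks like, here's a tiny C++ sketch of stepping a pile of falling 'rocks' forward in time (illustrative only; nothing to do with any particular physics product, and a real engine adds collision detection between objects, which is where the cost really explodes):

[code]
// Tiny rigid-particle physics step: the same few math operations applied
// over and over to every object, every frame. Illustrative sketch only.
#include <vector>

struct Rock {
    float px, py, pz;   // position
    float vx, vy, vz;   // velocity
};

void stepPhysics(std::vector<Rock>& rocks, float dt) {
    const float gravity = -9.81f;
    for (Rock& r : rocks) {
        r.vy += gravity * dt;          // integrate acceleration
        r.px += r.vx * dt;             // integrate velocity
        r.py += r.vy * dt;
        r.pz += r.vz * dt;
        if (r.py < 0.0f) {             // crude ground bounce
            r.py = 0.0f;
            r.vy = -r.vy * 0.4f;       // lose energy on impact
        }
    }
}
[/code]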
 
If you guys are going to fight, get to your points quicker.

: ) just kidding.

I don't know which post to read because you're both calling bs on one another.
 
You might also want to read this:
http://www.extremetech.com/article2/0,1558,1152642,00.asp

Again, GameCube doesn't have all the whizzy stuff that was coming to consumer products at the time of its design, but it does have some good functionality in there for those with the ability to work at that level. Not every developer can boast such talent. Those effects can be achieved by those of lesser skill writing Xbox games, but as we've seen that is no guarantee of quality. In terms of what Nintendo was seeking to accomplish in the hardware, ArtX delivered very well. They weren't ignorant of the idea of programmable shaders in hardware. Some of the same engineers had created boards for accelerating shaders while working at SGI. (ArtX was mostly the same SGI personnel who created the N64 coprocessor.) The problem is that the cost in additional silicon real estate would have driven up the cost considerably.

Things may not have worked as Nintendo hoped, but their planning was at least rational. Low cost of entry was a major issue and that meant keeping down chip costs. The GameCube's failure to manage greater market penetration cannot be blamed on a lack of hardware shading functions.
 
[quote name='epobirs']ANY COMPUTATIONAL OPERATION CAN BE PERFORMED IN SOFTWARE USING A GENERAL PURPOSE PROCESSOR.[/quote]

You know, you're really not playing fair; you're using facts to prove your point just because all the facts are clearly on your side.

I mean, what's the other guy supposed to do? Go learn about this stuff?
 
I see, I don't have the facts, and I'm ignorant, when people who actually USE the software mode, get 10 FPS.

On a small area. Using a Geforce4mx.

On a next-gen area, they'd get 1-5 fps.

Why don't YOU pay attention dumbass: Using software mode is slow as hell.

If I knew there was a software mode I could've told you this a LONG time ago.
 
I'd like one of you, just ONE, to explain how the GC would stay at 60 fps, LIKE MY ORIGINAL point, if a better video card uses software pixel shaders on just a sphere and gets 10 fps...

I'm waiting... Of course, you'll ignore it and call me dumb, as is the way of fools.
 
I didn't want it to come to this, but I see you aren't going to let me by without... a reply. I really didn't feel like it.

But whatever.

Hardware elements may accelerate portions of the task without having a complete hardware implementation of the function.
Uh huh... First let me congratulate you on making 1 out of every 3 words a big, 'intellectual' word! You're obviously trying to influence somebody's opinion about you and your intelligence...

But enough about that, lets get to the good stuff:

A very good example of this is the hardware assist features in most modern video chips for reducing the overhead of playing DVD video on a PC.

Ah, so we are now comparing DVD viewing to shaders? Why don't we compare stencil shaders to bilinear texture filtering?! We'd get the same results as this garbage.

You've really made your failure to pay attention apparent with both of those links. Each is in the specific context of DirectX. The DX8 pixel shader support was largely defined by Nvidia, a company with a vested interest in selling the then-new GeForce 3 rather than providing such operations on earlier hardware. Another big reason as well is the very purpose of DirectX, which is to provide hardware abstraction, enabling all different brands of chips to be addressed by a single code base that is translated to the correct individual instructions by the vendor-specific driver.

You took all this just to say that DirectX does what all normal drivers do? Wow, I'm sexually aroused! :roll:

By the time of DirectX 8, years later, it wasn't seen as worth the trouble to provide a software shader fallback function. Nvidia

I see, so they must provide SUPPORT in software to let you encode in software?! Are you really trying to feed me this BS?

GameCube developers are free to do whatever they can squeeze out of the system.

You're acting as if they have unlimited framerates. When you encode in software, it kills it.

Apparently you aren't too up on the Nvidia product line. The GeForce 4 MX, despite the name, is not a cut-down GeForce 4. It is an improved GeForce 2, mainly with upgrades to video playback assist functions. At NTSC resolution it is in no way superior to the GameCube. At higher resolutions it could have an advantage, but since the GC is only intended to drive NTSC/PAL displays that difference is meaningless. Unlike the XGPU, the GC was designed from the ground up as a game console for TVs and doesn't have a lot of modes for things it would rarely be called upon to perform. This is one of the reasons for its lower cost to produce.

A GeForce 4 MX was an offshoot of the top-of-the-line GeForce 2, with a clock speed upgraded to almost a regular GeForce 4. I said it because it's the best video card without hardware pixel shaders. And yes, if an Xbox is equal to around a GeForce 3 (which, in fact, is what it was based on), then obviously the GC wouldn't be better than an MX, even if it's just an upgraded GeForce 2.

For another cutting edge example of a function that has long been purely software and a source of major CPU loads, there is physics. The situation is the same. You have a set of mathematical functions that are used in different combinations over and over to achieve a given goal. A realistic animated rock slide in a game, for instance. If those complex functions are implemented in hardware you could get a lot more rocks tumbling towards the player and offer more realistic behavior in their movement. The trick is being able to identify those mathematical functions that are most needed and create a silicon version that has a low enough cost to sell to consumers. High-end simulation systems have long used expensive FPGA setups for such acceleration. A $5,000 board is reasonable for a film studio rendering such effects for a movie. To get this into computer games the price has to be a lot, lot lower.

And you needed to go off topic and say this, why? For any other reason than attempting to influence others into thinking you're 'intelligent'?

If I owned a store, and was talking to someone about, god I dunno, their day, and then started trying to sell them something, isn't that just a bit...
 
[quote name='AdultLink']I see, I don't have the facts, and I'm ignorant, when people who actually USE the software mode, get 10 FPS.

On a small area. Using a Geforce4mx.

On a next-gen area, they'd get 1-5 fps.

Why don't YOU pay attention dumbass: Using software mode is slow as hell.

If I knew there was a software mode I could've told you this a LONG time ago.[/quote]

Dude, you keep missing the obvious. You keep trying to compare apples and oranges. The performance strengths and weaknesses of different platforms vary wildly. The PC is no barometer of how other systems will perform at the same task. Even different brands of x86 processors have dramatically different performance at the same clock rate, as seen in AMD products sold under a performance rating to indicate what Pentium 4 clock rate is equivalent in performance. For instance, the Athlon 64 rated at 3000+, indicating performance equivalent or better in most areas to a 3.0 GHz Pentium 4, runs at just 2 GHz. Other processor architectures can substantially outperform an x86 at the same clock rate. This is especially true of the PowerPC line which, surprise, surprise, we find in the GameCube. On top of this the Gekko was substantially customized by IBM for console game applications. For instance, 64-bit registers that would have been useless for nearly all game use were changed to instead perform SIMD operations on 32 and 16-bit values, which added great value.
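To illustrate the paired idea, here's a bit of plain C++ standing in for the concept (this is not Gekko assembly or any IBM intrinsic, just an illustration): instead of one multiply-add at a time, a 'paired' operation applies the same math to two 32-bit values at once, which maps nicely onto things games do constantly, like transforming X/Y pairs or mixing stereo samples.

[code]
// Conceptual sketch of a "paired single" operation: one instruction-like
// helper applies the same multiply-add to two 32-bit floats at once.
// Plain C++ illustration only; not Gekko assembly or compiler intrinsics.
struct PairedSingle { float a, b; };

// Multiply-add applied to both lanes: result = x * y + z.
inline PairedSingle ps_madd(PairedSingle x, PairedSingle y, PairedSingle z) {
    return { x.a * y.a + z.a, x.b * y.b + z.b };
}

// Example: scale and offset a 2D point in one "paired" step instead of two
// separate scalar multiply-adds.
inline PairedSingle transform2D(PairedSingle point, PairedSingle scale,
                                PairedSingle offset) {
    return ps_madd(point, scale, offset);
}
[/code]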

You've made a few assumptions that have led you astray. First, you've assumed what you saw in certain GameCube games is pixel shading of the sort produced on the Xbox and in PC games in recent years by dedicated hardware units. In turn you've assumed the GC has hardware features that Nintendo has chosen to hide from public knowledge for unfathomable reasons and furthermore has hidden these features from most developers or somehow made them reluctant to use them in their projects. (Feel free to show some documentation for the existence of a pixel shader in the Flipper chip.) You may find this of interest, an interview in which a very reputable GameCube developer speaks of being able to apply the Flipper's pipeline in a manner that produces output equivalent to a shader. There is much software work required but he asserts being able to match the Xbox: http://www.planetgamecube.com/specials.cfm?action=profile&id=203 In other words, a versatile design can often match dedicated functions where it matters. It just takes talent and work. Not every developer can manage that. Thus the lack of wider usage.

Second, you've assumed the PC was the ideal platform for attempting this operation in software compared to all other platforms. The only reason you'd see most people doing their experimenting on a PC (or Mac under OpenGL) is because that is the only open platform that encourages such work without having to become a registered developer. Sure, there are people hacking consoles with DIY dev tools, but doing it on a PC is much simpler if you're just seeking an understanding of the technology as opposed to any particular implementation. The x86 doesn't dominate the PC market because it's the best architecture and won over developers. It's there because the competing platforms were badly managed, thanks to being confined to a single company, and died with their makers. The use of PCs in business guaranteed the platform wasn't going anywhere, and the multiple companies selling compatible machines further assured its stability. In the era when competing platforms were common the Motorola 680x0 series was much more popular, almost to the point of exclusivity among 16-bit systems including Apple, Atari, and Commodore. One of the few notable exceptions was the Acorn Archimedes, which contained the first ARM chip. It was a good deal more powerful than the 80286 chips then common in PCs but was relatively unknown outside of the UK. The CPU used in the GBA, DS, and most PDAs and cellphones today is a direct descendant.

Not only can another processor have greatly better performance at a specific operation despite a lower clock speed, but much in the way a minor amount of functionality accelerates DVD playback, a specific feature of a system that accelerates a major portion of an otherwise software operation can make a major difference. This is amply demonstrated by running two versions of an app on a modern processor with SIMD extensions. Assuming the application makes heavy use of the functions the SIMD set is intended to accelerate, the code that uses SIMD rather than doing each operation individually will be much faster. Without having dedicated shaders, a set of CPU and coprocessor functions that speed up a subset of the operation can provide enough of an improvement to make it usable in applications where another platform lacking these traits would not be able to deliver. As indicated by Julian Eggebrecht in the interview linked above, the GameCube does offer such an ability to accelerate such things despite lacking a complete solution in hardware.
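The same principle is easy to show on the PC side with the SSE extensions in x86 chips of that era (used here purely to illustrate the SIMD idea, nothing console-specific): the vector loop does four multiplies per instruction where the scalar loop does one.

[code]
// Scalar loop vs. SSE SIMD loop doing the same work: multiply two float
// arrays element by element. Illustration of the SIMD idea only.
#include <xmmintrin.h>   // SSE intrinsics

void scaleScalar(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] * b[i];                 // one multiply per iteration
}

void scaleSIMD(const float* a, const float* b, float* out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4) {              // four floats per iteration
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_mul_ps(va, vb));
    }
    for (; i < n; ++i)                        // leftover elements
        out[i] = a[i] * b[i];
}
[/code]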

(Side note: you may have noticed x86 processors have nearly no presence in video game systems. There was a British company, Konix, that had a system with a very elaborate reconfigurable controller that used an 8086 CPU, but the company lacked the funds to reach launch. http://home.wanadoo.nl/hessel.meun/konix/konix-main.htm The next closest thing to an x86 console was the Tandy VIS, a machine Microsoft and Tandy created as a competitor to CD-i, as if CD-i represented a market worth grabbing at. It was OK for CD-ROM edutainment sort of stuff but essentially a historical footnote.)

Third, you made several assumptions about the swShader application. You assume, on the basis of no data, that it is the best that can be done on the platform in question. You also make assumptions about the equipment of the person who said he was getting 10 FPS when he didn't offer any such information at all. There is no telling what he is using and thus no basis for any comparison to anything attempting the same task in software. Then finally you do something that goes beyond assumption; it is an outright mistake. The site for swShader states plainly that it emulates the DX9 class shaders. If you've ever looked into the specs of DX9 chips you'll see they're massively larger in terms of transistor counts than their DX8 predecessors. This is because a 2.0 and higher shader is much, much more complex than a DX8 shader. It uses floating point at high accuracy levels end to end (though Nvidia and ATI have differed in minimum accuracy requirements), supports a much fuller instruction set, and supports massively larger shader programs. To produce a DX9 shader in software involves almost two orders of magnitude greater processing than doing a DX8 shader in software. In other words, you aren't comparing the GameCube to an Xbox, you're comparing it to the functionality of a GPU almost four years more advanced than the XGPU.
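To give a feel for the gap, here's a rough C++ illustration of the difference in per-pixel work (the two functions are made up for the example and aren't real shader model code): a DX8-class effect emulated in software might be a handful of 8-bit integer operations per pixel, while a DX9-class effect means full floating point lighting math per pixel, with normalizes, dot products, and a pow() thrown in.

[code]
// Rough illustration of the per-pixel work gap between a DX8-class and a
// DX9-class effect when emulated in software. Not real shader model code.
#include <cmath>
#include <cstdint>

// "DX8-style": one texture sample modulated by a vertex color, 8-bit math.
uint32_t dx8StylePixel(uint32_t texel, uint32_t vertexColor) {
    uint32_t r = (((texel >> 16) & 0xFF) * ((vertexColor >> 16) & 0xFF)) >> 8;
    uint32_t g = (((texel >> 8)  & 0xFF) * ((vertexColor >> 8)  & 0xFF)) >> 8;
    uint32_t b = (( texel        & 0xFF) * ( vertexColor        & 0xFF)) >> 8;
    return (r << 16) | (g << 8) | b;
}

// "DX9-style": full float lighting per pixel -- normalize, dot products,
// a pow() for specular -- many times the work of the function above.
struct V3 { float x, y, z; };
static float dot3(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static V3 norm3(V3 v) { float l = std::sqrt(dot3(v, v)); return {v.x/l, v.y/l, v.z/l}; }

float dx9StylePixel(V3 normal, V3 lightDir, V3 viewDir) {
    V3 n = norm3(normal);
    float diffuse  = std::fmax(0.0f, dot3(n, lightDir));
    V3 h = norm3({ lightDir.x + viewDir.x, lightDir.y + viewDir.y,
                   lightDir.z + viewDir.z });                 // half vector
    float specular = std::pow(std::fmax(0.0f, dot3(n, h)), 32.0f);
    return diffuse + specular;                                // intensity only
}
[/code]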

So if you look at the situation with some clarity, it becomes apparent the GameCube has a lot of power but not everyone knows how to apply it fully or is willing to do the work.
 
[quote name='AdultLink']I see, I don't have the facts, and I'm ignorant[/quote]

I'm glad you finally came to this conclusion, the rest of us had a while ago.

As you said in the same post that I quoted from before, doing functions entirely in software is possible, though it slows down the application tremendously. This goes for any function, whether it be three-dimensional image rendering, radiosity, pixel shading, ray tracing, etc. Just because your graphics hardware does not have instructions hardcoded in it that would help to optimize and accelerate the execution of your graphical function does not mean that it cannot be done. I won't go any further, since epobirs explained all of this to you and then some, but you won't, you know, read it.
 
[quote name='"AdultLink"']I didn't want it to come to this, but I see you aren't going to let me by without... a reply. I really didn't feel like it.

But whatever.

Hardware elements may accelerate portions of the task without having a complete hardware implementation of the function.

Uh huh... First let me congratulate you on making 1 out of every 3 words a big, 'intellectual' word! You're obviously trying to influence somebody's opinion about you and your intelligence...

If you're having difficulty following my responses this might explain why you so badly misunderstand the issues involved.

But enough about that, lets get to the good stuff:

A very good example of this is the hardware assist features in most modern video chips for reducing the overhead of playing DVD video on a PC.

Ah, so we are now comparing DVD viewing to shaders? Why don't we compare stencil shaders to bilinear texture filtering?! We'd get the same results as this garbage.

Because it is a universal comparison. As I mentioned before, any computational problem can be performed on a general purpose processor. It doesn't matter what the operation is; the advantages of lesser or greater hardware implementation are the same. The question is whether the operation merits the cost for the hardware. Long ago it was a common part of PC benchmarks to time the recalculation of a large spreadsheet. This ceased to be useful because systems became so fast that nobody cared about it anymore. So long as it happened faster than human perception everyone was happy, and much greater loads were needed to measure things. Applications like video encoding soon became more popular benchmarks that were more in tune with what users wanted to know. This is a function that is readily performed by dedicated hardware, but at current costs most people don't want to pay for such in the consumer market. So the capability of a general purpose CPU for this task matters, even if it performs at substantially less than realtime and takes five minutes for every minute encoded. Since the reader is going to own a PC regardless, getting it to do the job slowly is better than not being able to do it at all. Thus, even after a hardware solution is available (ATI's Theater 550 chip's main improvement is hardware encoding where its predecessor relied on the system CPU, for instance), the demand for software versions remains if the hardware represents too great an investment.

You've really made your failure to pay attention apparent with both of those links. Each is in the specific context of DirectX. The DX8 pixel shader support was largely defined by Nvidia, a company with a vested interest in selling the then-new GeForce 3 rather than providing such operations on earlier hardware. Another big reason as well is the very purpose of DirectX, which is to provide hardware abstraction, enabling all different brands of chips to be addressed by a single code base that is translated to the correct individual instructions by the vendor-specific driver.

You took all this just to say that DirectX does what all normal drivers do? Wow, I'm sexually aroused! :roll:

There you go again, Ignorance Lad. DirectX is an API layer, not a driver. Drivers are unique to the hardware they support and differ greatly among different manufacturers whose hardware has the same objective but entirely different internal designs. The point of an API layer is to allow developers to ignore the differences between devices of the same class from different companies. If you can remember that far back, the early 3D video card scene was chaotic in terms of software standards. Almost every chip company had their own API for talking to their products. 3Dfx, Rendition, Matrox, and others all had features that couldn't be used without writing separate code for that card. 3Dfx did the best job early on of evangelizing their Glide subset of OpenGL, and thus most games looked best on Voodoo-equipped PCs since all other cards received much lesser support.

DirectX provided almost full abstraction of the major chip features so software could be written to a generic virtual card and provide nearly the best of each card's capabilities while writing only one code base. Vendors who wanted to push new features had to make an effort to convince Microsoft they were worth supporting or run the risk of having those features ignored by developers. Nvidia was the first chip maker to make a deep commitment to DirectX and thus had little difficulty getting most things they proposed supported by Microsoft. In many ways the unsuccessful NV1 was the founding platform for DirectX, but it began to pay off with the RIVA 128 getting picked by a lot of PC vendors in numbers developers couldn't ignore.


Continued next post
 
This thread is unbelievable. Two nerds/geeks (don't worry, we all mostly are, I included) going at it in a duel of wits on a messageboard. Back and forth, who knows who will come out on top, of course the one with the facts, epobirs :D!
 
By the time of DirectX 8, years later, it wasn't seen as worth the trouble to provide a software shader fallback function. Nvidia

I see, so they must provide SUPPORT in software to let you encode in software?! Are you really trying to feed me this BS?

You misunderstand, which seems to be the rule. Since the point of DirectX is to write one code base and avoid time spent in support of individual brands of hardware, part of this is to provide an automatic substitute for when a hardware feature is missing. While developers could query the features list DirectX provided to see if the function in question was present and decide if the game shouldn't do it at all, the idea was to make it transparent in those cases where the developer chose to leave it up to the player. If you've played PC games much you should be familiar with the menus that allow different features of the game to be adjusted to fit the performance level of the computer.
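For anyone curious what that check looks like in code, here's a rough DirectX 8 sketch in C++ (abbreviated, error handling trimmed; the two water functions are made-up placeholder stubs, not part of any real API): ask the device for its caps, and if the card doesn't report pixel shader support, fall back to your own path.

[code]
// Sketch of a DirectX 8 capability check: does the installed card report
// pixel shader support? Abbreviated example; the two render-path functions
// below are hypothetical stubs, not part of any real API.
#include <windows.h>
#include <d3d8.h>

void UseHardwareShaderWater() { /* set up the GPU shader path (stub) */ }
void UseTextureTrickWater()   { /* set up a multitexture fallback (stub) */ }

bool HardwarePixelShadersAvailable(IDirect3D8* d3d)
{
    D3DCAPS8 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    // PixelShaderVersion is packed major.minor; 1.1 was the GeForce 3 era baseline.
    return caps.PixelShaderVersion >= D3DPS_VERSION(1, 1);
}

void ChooseWaterPath(IDirect3D8* d3d)
{
    if (HardwarePixelShadersAvailable(d3d))
        UseHardwareShaderWater();
    else
        UseTextureTrickWater();
}
[/code]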

Since in the early days many computers had no 3D hardware at all but the games were often still viable running entirely in software, this was considered very valuable. Nowadays, the developers instead have to check for the class of video board in the system and adjust to which shader level is available among those the game supports. In time, more and more games will appear that simply do not run on anything less than DX9 level hardware, but for now that would limit the market too much. Such transitions are part of the hassle in PC gaming.

There are occasions when a developer wishes to optimize for a particular video chip or use a feature not yet offered as part of DX. (Money from the chip company is usually involved.) In this case they'd want to bypass DirectX and sometimes even the driver. DirectX has commands to announce when you're going 'off-road' so that this can be managed without abandoning DX entirely. This method also allows developers to implement operations that the hardware doesn't support at all but can be described in code. In the past this has often been tweaks to the texture operations to get a certain look but that is becoming increasingly less likely with newer hardware providing full programmability.


GameCube developers are free to do whatever they can squeeze out of the system.

You're acting as if they have unlimited framerates. When you encode in software, it kills it.

Julian Eggebrecht, who knows far more about the GameCube innards than almost anyone, apparently disagrees. The games produced by his company, which you yourself noted earlier, serve as examples.

Apparently you aren't too up on the Nvidia product line. The GeForce 4 MX, despite the name, is not a cut-down GeForce 4. It is an improved GeForce 2, mainly with upgrades to video playback assist functions. At NTSC resolution it is in no way superior to the GameCube. At higher resolutions it could have an advantage, but since the GC is only intended to drive NTSC/PAL displays that difference is meaningless. Unlike the XGPU, the GC was designed from the ground up as a game console for TVs and doesn't have a lot of modes for things it would rarely be called upon to perform. This is one of the reasons for its lower cost to produce.

A GeForce 4 MX was an offshoot of the top-of-the-line GeForce 2, with a clock speed upgraded to almost a regular GeForce 4. I said it because it's the best video card without hardware pixel shaders. And yes, if an Xbox is equal to around a GeForce 3 (which, in fact, is what it was based on), then obviously the GC wouldn't be better than an MX, even if it's just an upgraded GeForce 2.

You are perhaps under the belief that the GeForce 2 was some worldbeater. In fact, it was outperformed quite well by concurrent 3Dfx boards. At best, the GeForce 2 was the first video chip in which hardware T&L became genuinely effective. The original GeForce 256 was a great disappointment in failing to outperform more traditional designs from the then-leader. In any case, unless that board has features which serve to accelerate a software shader, the performance of that board is inconsequential for a CPU-intensive task.

Continued next post
 
For another cutting edge example of a function that has long been purely software and a source of major CPU loads, there is physics. The situation is the same. You have a set of mathematical functions that are used in different combinations over and over to achieve a given goal. A realistic animated rock slide in a game, for instance. If those complex functions are implemented in hardware you could get a lot more rocks tumbling towards the player and offer more realistic behavior in their movement. The trick is being able to identify those mathematical functions that are most needed and create a silicon version that has a low enough cost to sell to consumers. High-end simulation systems have long used expensive FPGA setups for such acceleration. A $5,000 board is reasonable for a film studio rendering such effects for a movie. To get this into computer games the price has to be a lot, lot lower.

And you needed to go off topic and say this, why? For any other reason than attempting to influence others into thinking you're 'intelligent'?

If I owned a store, and was talking to someone about, god I dunno, their day, and then started trying to sell them something, isn't that just a bit...[/quote]

No, this was entirely on-topic and germane to the issue. Given wide adoption, the hardware described above could lead to a time in a few years when complex physics in games involving thousands of objects becomes commonplace. So much so that most games appearing at that future point would become hopelessly bogged down without it, despite this only being a task that had been entirely software before. Thus things evolve and become used more deeply as it becomes economically feasible to devote transistors to the function.
 
This thread just shows why epobirs is my favorite poster on this site.

He makes intelligent comments and debates issues without resorting to "YoUr MoThEr is the Sux0rzzz" or other such comments all the time, and just about every time he posts something I learn a little bit.
 
I know there's a connection, and I'm going to prove it.....

[three attached images]
 
I don't know... a good argument has two sides you may partially agree with; you see both viewpoints. This thread is pretty much somebody teaching a new subject to somebody else.
 
Come on guys!!! Everyone knows that shading on the gamecube and most PC's are done by the magical gnomes that randomly appear within the system to give it a magical boost.

Sometimes it's just more fun to be a smartass.
 
[quote name='Stoneage']I know there's a connection, and I'm going to prove it.....

[three attached images]
[/quote]

Great detective work sherlock. I can't believe that I missed that connection :? :lol:
 
[quote name='jkam']Come on guys!!! Everyone knows that shading on the gamecube and most PC's are done by the magical gnomes that randomly appear within the system to give it a magical boost.

Sometimes it's just more fun to be a smartass.[/quote]

I think they should call it Blast Processing!
 
[quote name='WhipSmartBanky'][quote name='jkam']Come on guys!!! Everyone knows that shading on the gamecube and most PC's are done by the magical gnomes that randomly appear within the system to give it a magical boost.

Sometimes it's just more fun to be a smartass.[/quote]

I think they should call it Blast Processing![/quote]

Gotta love that.

"We've been selling this bugger for four years now and somebody in engineering found a page that was supposed to be in the dev kit doucmentation but it fell behind the copier. Apparently we have some miracle called Blast Processing in this thing. I think we should demonstrate it to the public by running commercials where the actual play part of our flagship game is compared to the title screen of the competition's current hit."
 
Word to the wise, AdultLink... you don't argue with epobirs about technical stuff; you will not win. The man has an endless amount of knowledge about it. Just as you do not argue with JSweeney, although I haven't seen him lately, you most likely won't win an argument with him either. These two people, I feel, are in the elite status of CAG. So back off and throw in the towel.
 