"Great spirits have always encountered violent opposition from mediocre minds."
- Albert Einstein

Humus takes a look at the world today
Thursday, November 21, 2002 | Permalink

I guess by now no one has really missed the launch of the GeForce FX. This next chip from nVidia, which is supposed to bring the King of 3D title back to nVidia, has been met with mixed feelings from media, gamers and developers. We have heard everything from "rocks" to "sucks". Here's my take ...

I'm certain the card will rock. There are clear strong points of this chip where it outdoes the Radeon 9700, like longer fragment programs, up to 32 bits/channel operation, more flexible math, etc. There are also weak points, like floating point textures being limited to the texture_rectangle target only, while the 9700 can use any of 1D/2D/3D/cubemap/texture_rectangle. All this is nice and dandy, but there's a trend that worries me. I had been expecting the graphics market to mature over the years, but I guess I've been proven wrong, with both good and bad consequences ...
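To make the texture limitation a bit more concrete, here's a minimal sketch of what the difference looks like at the API level. It assumes a GL context is already set up, that <GL/glext.h> provides the GL_NV_float_buffer and GL_ATI_texture_float tokens, and that the respective extension has been checked for; the function name, sizes and data pointer are just placeholders.

    /* Sketch only: assumes an existing GL context and that the relevant
       extension has been verified. 'pixels' is placeholder data. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    void upload_float_texture(const float *pixels, int w, int h, int is_nv30)
    {
        if (is_nv30) {
            /* GeForce FX path: float textures only through the rectangle
               target, which means no mipmaps, no repeat wrapping and
               non-normalized texture coordinates. */
            glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_FLOAT_RGBA32_NV,
                         w, h, 0, GL_RGBA, GL_FLOAT, pixels);
        } else {
            /* Radeon 9700 path: float formats work with the ordinary 2D
               target (and 1D/3D/cubemap/rectangle just as well). */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI,
                         w, h, 0, GL_RGBA, GL_FLOAT, pixels);
        }
    }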

The good part first: I expected graphics progress to slow down a little, with smaller performance boosts between generations and fewer new and exciting features, much like where the audio card industry is today. The Radeon 9700 really proved me wrong. It outperformed anything on the market, especially with anisotropic filtering and antialiasing maxed out, where it beat earlier and competing chips by an order of magnitude. It also came with a large set of new features, and I currently find myself with more possibilities than ever. The step feels larger than going from my G400 to the Radeon, or from the Radeon to the Radeon 8500.

The bad part, which worries me a little, is that the extremely competitive nature of the graphics industry is leading us (maybe with no way back) in a direction we don't really want to go. What I'm thinking of is the vacuum cleaner nVidia has attached to their GeForce FX chips.

Let's go back in time a little. Back in the old days, not even CPUs had active cooling. When chips that required active cooling started to appear, people had a hard time accepting it at first, but over time they have gotten used to it. Today everyone takes for granted that a CPU should have a monstrous heatsink with a loud fan attached to it, and no one reflects on the fact that the little thing uses more power than a lightbulb. I guess electricity is simply too cheap for anyone to care, but many do care about the noise. I certainly do.
Now take a look at graphics cards. My Voodoo2 was fine with passive cooling. So was my G400. My Radeon had a small fan. My Radeon 8500 had a small fan. My Radeon 9700 has a slightly larger fan and requires extra power from a HD connector. The Voodoo 5 had a similar solution, but people bashed it for its ridiculous power consumption. No one complained about the 9700 requiring it, maybe because it was so far ahead of everything else, something you couldn't say about the Voodoo 5.

Now the GeForce FX arrives, also requiring extra power from a HD connector. Additionally, it comes with a vacuum cleaner attached to it to keep it cool. Of course nVidia pushes the thing as a revolutionary cooling solution, and some people will drool over it as if it's the coolest thing ever. But I'll tell you something: this is NOT progress. This is a step towards somewhere we don't want to go. Computers are already noisy enough. I was delighted to read that the new Athlon 64 runs quite cool thanks to the SOI process. That's progress! If a vacuum cleaner attached to your graphics card is what it takes to beat a competitor, then I think it's time to get back to the drawing board and rethink your strategies.

It keeps amazing me that performance is still the main selling point of graphics boards. One would expect that in this day and age people would start to value other attributes a little higher; most games already run fine in high resolution with anisotropic filtering and antialiasing. It's time to move along, and graphics companies need to start pushing other attributes. There's no shame in being slower, even if you release your card later. The GeForce FX has enough new cool stuff to sell without requiring its own ventilation system and filling both the AGP slot and a PCI slot. Just clock that darn thing at a level it can handle! Power consumption scales roughly linearly with clock speed and quadratically with voltage, so lower the clock a little, which might let you turn the voltage down a little too. The result is slightly less performance, but no need for a vacuum cleaner, and thus probably a more affordable price for the customer in the end. There are so many other attributes of the card that nVidia could push instead, like features, driver quality, Linux support (a large market I'm still waiting on ATi to give a rat's ass about) and so on. It doesn't need to outperform the 9700 to be worth its price.

In all honesty though, the "silent computing" feature is progress. That's a really good feature I hope to see from more vendors: there's no need to run the fan at full speed when the chip doesn't have a whole lot of work to do. But I don't buy the claims that it doesn't matter if the thing is noisy during a gaming session. Sure, that's partly true for high speed action games like UT, but not for other games. You don't want air flow noise to disturb you while playing Unreal/Doom3 style games.
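To put rough numbers on the clock and voltage argument above, here's a back-of-the-envelope sketch using the usual dynamic power relation (power roughly proportional to frequency times voltage squared). The 10% clock reduction and 5% voltage reduction are made-up illustration figures, not anything measured on an actual GeForce FX.

    /* Back-of-the-envelope sketch: dynamic power ~ f * V^2.
       The scaling factors below are hypothetical, for illustration only. */
    #include <stdio.h>

    int main(void)
    {
        double clock_scale   = 0.90; /* run the core 10% slower */
        double voltage_scale = 0.95; /* which might allow ~5% lower core voltage */

        double power_scale = clock_scale * voltage_scale * voltage_scale;

        /* Prints about 0.81: roughly 19% less power for 10% less clock. */
        printf("Relative power: %.2f\n", power_scale);
        return 0;
    }

In other words, a small performance sacrifice can buy a disproportionately large drop in heat output, which is exactly the trade-off a quieter cooler needs.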
Anyway, lots of words here, but what I really wanted to say is just this: please don't make noisy computing the norm of the future.

- Humus
