High-Def GPU Showdown: nVidia or ATI/AMD?


Now that GPU manufacturers are touting their ability to tackle high-def content, it’s a good idea to investigate exactly which GPU you should spend your money on. The two main camps, nVidia and ATI/AMD, each have several GPUs on the market at several different price points, so choosing one is quite a hassle. Thankfully, AnandTech put a stack of them to the test, seeing how they fared in various videophile-geared image-quality tests and how they performed with various CPU configurations. It’s actually disgustingly thorough, and unless you’re really into PC-based high-def watching, you’ll probably have no idea what’s going on.

If there’s a one-line summary, it’s this: choose a GPU that gives you options. The ATI/AMD GPUs are designed in such a way that the end user has no control over settings like noise reduction, including how strongly it’s applied to the image. Conversely, nVidia lets you fiddle away with setting after setting. That matters for video freaks; normal people can probably pass right by “Go.” Another point, though it’s not exactly a state secret: if you plan on watching high-def content on your PC, don’t be a tightwad and settle for a “value” card. Those cards are complete rubbish as far as high-def goes.

HD Video Decode Quality and Performance Summer ’07 [AnandTech]