Video cards are supposed to be supporting actors in computers; for the most part, their only job is to take the strain off the processors that do "real" work when users play video games. But these components are now becoming as complex and expensive as the Intel or AMD chips that still take top billing in computer ads -- and to many gamers, they're even more important.

Nvidia's latest graphics card, the top-of-the-line GeForce 6800, for example, cost $300 million and took 18 months to develop. It sells for about $500, a price that some hard-core gamers will gladly pay for the thrill of seeing more realistic water or shadows in a computer game.

Producing such capabilities has taken a toll on the industry. Five years ago, there were more than 50 makers of graphics cards. Now there are just two major players, ATI and Nvidia, which enjoy roughly equal standing in the market.

That's probably good news for consumers. The earlier proliferation of cards made PC gaming confusing, because some games displayed properly only on one manufacturer's graphics card and not another's.

Graphics card companies today work so closely with the game industry that some cards are starting to come with stamps of approval from game makers. For instance, Id Software recently highlighted certain cards for the upcoming release of its Doom 3 game.

Cards with 256 megabytes of onboard memory are the current standard, but graphics card makers say users might be better off considering what "architecture" a card was designed for. This usually refers to the various versions of a Microsoft multimedia-software framework called DirectX.

The current release is DirectX 9, or DX9 for short. It's been on the market for two years (its successor isn't expected until 2006), but game developers are only now beginning to take advantage of it. Most wait until enough capable cards are in circulation before pushing the envelope -- otherwise they risk limiting sales of a new game to the early adopters who snap up the latest hardware.
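
In code terms, the distinction is concrete: a game can ask the DirectX runtime what the installed card supports and switch effects on or off accordingly. Here is a rough C++ sketch against the Direct3D 9 API (the messages and the shader-model cutoff are illustrative choices, error handling is trimmed, and the program links against d3d9.lib):

    #include <d3d9.h>   // Direct3D 9 interfaces and capability structures
    #include <cstdio>

    int main() {
        // Create the Direct3D 9 interface object; failure here means
        // the DirectX 9 runtime itself isn't installed.
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) {
            std::printf("DirectX 9 runtime not available.\n");
            return 1;
        }

        // Query the capabilities of the primary display adapter.
        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                         D3DDEVTYPE_HAL, &caps))) {
            // Pixel shader 2.0 support is a rough dividing line
            // between a DX9-class card and an older one.
            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
                std::printf("DX9-class card: advanced effects available.\n");
            } else {
                std::printf("Older card: falling back to basic rendering.\n");
            }
        }

        d3d->Release();
        return 0;
    }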

What kind of difference does a card make in practice? Play a game built for DX9 with an older card and you'll miss some of the eye-popping details that new cards are designed to display. "You'll still be able to play but you won't see certain things," said David Roman, vice president of corporate marketing at Nvidia. In a new golf game, for example, "the light shimmers, the grass undulates in the winds, and you see the shadows of Tiger Woods on himself as he moves certain ways."

Card makers typically cater to the gaming crowd, but users of other graphics-intensive programs can also benefit from the advances. Apple's Mac OS X operating system, for instance, relies on graphics cards to render its slick visual effects. Most everyday applications, however, don't need the capabilities of high-end cards.

Even when a high-end card is needed, the differences between competing offerings at the same price point can be hard to notice. William O'Neal, hardware editor at Computer Gaming World, recently compared two cards now hitting the market from ATI and Nvidia and had a hard time telling them apart. "It's pretty much a tossup at this point," he said.

Benchmark tests showed a difference of a mere two frames per second between the two cards.
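
For the curious, a frame-rate figure like that typically comes from timing a repeatable rendering workload and counting completed frames. A minimal C++ sketch of the arithmetic follows -- not the magazine's actual methodology, and render_frame is a stand-in for a game's real rendering work:

    #include <chrono>
    #include <cstdio>

    // Stand-in for drawing one frame; a real benchmark would call
    // into the game's render loop here.
    void render_frame() { /* draw calls would go here */ }

    int main() {
        using clock = std::chrono::steady_clock;
        const auto start = clock::now();
        long frames = 0;

        // Render for roughly five seconds, counting completed frames.
        while (clock::now() - start < std::chrono::seconds(5)) {
            render_frame();
            ++frames;
        }

        const double elapsed =
            std::chrono::duration<double>(clock::now() - start).count();
        std::printf("Average: %.1f frames per second\n", frames / elapsed);
        return 0;
    }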

Maybe that's why, at this point, some shoppers simply get the card with more freebies. Last year, for example, some ATI cards shipped with a $50 coupon for Half-Life 2, a blockbuster action title that itself hasn't shipped yet.