May 31, 2013

Graph-a-Bits Soup

Filed under: Main — Tags: — admin @ 12:01 am

One of the most mysterious aspects of computer technology is the video hardware. It’s a soup of acronyms, a knot of numbers, a maze of mystery. It’s always been that way.

It must be a tradition, because the industry has always used weirdo terms and numbers to describe graphics inside a PC.

I bring up this issue because I recently upgraded my PC with a brand spankin’ new graphics adapter. I don’t even recall off the top of my head the card name or number, because it’s just so bothersome to remember that stuff.

My system was crashing, but just the video. The error logs showed the video driver was to blame (which I wrote about here), but rolling back the driver didn’t work in the long run. So I ordered a new graphics card.

For advice, I turned to an expert, my son Jeremiah who is very good (no kidding) at video games. (He routinely gets banned because the other players think he’s cheating. He’s not; he’s just very good.)

Jeremiah recommended that I get the NVIDIA GeForce GTX 690. At about $1,000, that was a wee bit too spendy. I can buy an entire computer for $1,000. So I opted for the less expensive but still quite impressive GTX 680. The sucker occupies two slots on the motherboard and requires two extra power hookups. Sheesh.

The card works great, of course, and the games and graphics are lovely. But what the heck is a GTX 680? The card I removed — the older card — was an NVIDIA Verto 9600 GT. Is it just me, or wouldn’t the 9600 graphics card be a better, faster card than the 680?

Apparently not.

That’s because the card names aren’t really graphics standards, but rather brands. How NVIDIA successfully entices gamers and others to know — instantly, just as Jeremiah knew — that the 690 is a better card than the 9600 is beyond me.

When the IBM PC was introduced, it had one graphics standard: CGI, the Computer Graphics Interface. The graphics were very primitive by today’s standards. (The other standard was no graphics, just monochrome text.)

After CGI came EGA, then VGA, then the acronym dam burst and anyone wanting to save their sanity gave up on remembering what the graphics standards stood for.

I’ve attempted to commit to memory the NVIDIA graphics standards, but had to give up as they keep changing them. Their nomenclature is bereft of logic. I ranted over the issue a few years ago.

As with most high-tech things, the bottom line always seems to be price. A $1,000 graphics card, no matter what its acronym, is probably going to smoke. The $400 model, which may have the same amount of video RAM and come from the same manufacturer, is going to be good but not as good. Forget the acronyms; buy based on price!

2 Comments

  1. Be thankful that wasn’t an IBM Professional Graphics Controller, which used three slots on the motherboard from all I’ve heard.

    Also, just to nit-pick, the standard was Color Graphics Adapter, or CGA 😉

    Comment by linuxlove — May 31, 2013 @ 6:42 am

  2. Yeah, you nailed me on that one. It was CGA. I suppose my brain is too crowded with acronyms!

    Comment by admin — May 31, 2013 @ 8:07 am

