Are you getting Double Vision?

PCs and diversity go hand in hand, whether it’s how we choose for them to look, feel and perform, or what we expect them to do. As well as business machines, they have become gaming machines, communicators and media centres. They have also become smaller and more mobile. Part of the evolution enabling all this choice is the ongoing …


This topic is closed for new posts.
  1. Trevor_Pott Gold badge

    Graphics cards are about more than FLOPS

    Graphics cards exist to do one thing: display information. In the breakneck rampage towards "FLOPS, FLOPS, FLOPS!!!!!!!!!!!!" some critical bits have been cast aside. An example is all the additional elements that allow you to (for example) calibrate a suitable monitor using a colorimeter.

    Where I work, this is a Big Thing. We need cards capable of calibration; our livelihoods depend on it! Getting these cards in a laptop is miserably difficult at best, damned near impossible when you start tossing this hybrid crap in there.

    A few things are /required/ for auto-calibration to work, things that keep getting left by the wayside in the rush for numbers.

    Auto calibration is completely dependent on a Display Data Channel (DDC) connection. You need the right equipment in the video card itself, a real DVI-D cable, and a monitor (such as a LaCie) that will actually respond.

    You need a colorimeter and appropriate software (such as Blue Eye Pro.)

    And most importantly, you need your video drivers to not be sacks of rancid monkey excreta.
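    To make the DDC requirement above concrete, here is a minimal sketch of the sort of data that travels over that channel: the monitor's EDID block, which the host reads over the DDC wire (on Linux it sits at i2c address 0x50 via the i2c-dev interface) before any calibration conversation can happen. The parsing below follows the published EDID layout; treating it as a standalone validator, rather than part of any vendor's calibration tool, is my own framing.

    ```python
    # The 128-byte base EDID block the host fetches over DDC: it starts with a
    # fixed 8-byte header, and all 128 bytes must sum to 0 mod 256 (the final
    # byte is a checksum chosen to make that true).
    EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    def edid_block_valid(block: bytes) -> bool:
        """True if block looks like a well-formed base EDID block."""
        return (
            len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0
        )

    def manufacturer_id(block: bytes) -> str:
        """Bytes 8-9 pack three 5-bit letters (A=1), big-endian."""
        word = (block[8] << 8) | block[9]
        return "".join(
            chr(((word >> shift) & 0x1F) + ord("A") - 1)
            for shift in (10, 5, 0)
        )
    ```

    If the card, cable or monitor breaks the DDC link, this read simply fails or returns garbage that won't checksum, and everything downstream (colorimeter software included) is dead in the water.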

    Apparently Intel is completely and utterly incapable of putting DDC hardware into their gear. Or, if they do, they couldn’t write a working display driver if you were deorbiting a small moon over their heads.

    nVidia are almost as bad. Their drivers are crap…but at least one in every three or so releases is workable. The problem is that nVidia isn’t exactly what I would call “strict” about how cards based on their chips must be built. You can find eight cards from different manufacturers, all running the top-of-the-line chip, and whether a given one will actually calibrate a LaCie monitor is about as random as neutrino detection.

    ATI, on the other hand, are consistent. Their ?5?? series is a crap shoot: half will calibrate, half won’t…but if it is a ?6??, then you are good to go. It /will/ work. Drivers are almost never an issue, and ATI seems to force its partners to meet minimum standards of not sucking.

    What do hybrid solutions do to something like this?


    You have your crappy little Intel or dirt-bottom AMD in-CPU chip, with no extra blue crystals whatsoever. Then somewhere behind it (and a few layers of extra obfuscation, just for fun), you have something that can crunch numbers faster than the crappy “mini-GPU.”

    Nowhere, however, does this address the issues of “we cheaped out on the non-number-crunching hardware” or “in order to enhance shareholder value, we fired all our driver devs.”

    Hybrid graphics processors are TERRIBLE. They are going to drag mobile graphics down to the lowest common denominator, rather than the existing market, where actual competition at least makes some differentiation possible.

    We now enter a world where the interface to the outside world is a crappy Intel-GPU-class front end, unless you dig REALLY deep and get a FireGL or equivalent. If you want more than raw number crunching, this hybrid nonsense is destroying the midrange.

