X2 in CrossFire?
So can you put two X2s together, in CrossFire, for a four-GPU pixelfest?
(Not that I have two of them, just curious...)
Everything about the Sapphire ATI Radeon HD 3870 X2 graphics card is big, including the model name, so we’re going to stick to calling it the X2 here. The principle behind the X2 is simple enough. AMD's ATI Radeon HD 3870 is a decent graphics card, but it's having a job competing with Nvidia's GeForce 8800 family, which is the …
It looks possible for a number of reasons: CrossFire was designed from the start to support up to four GPUs, and the pictures in the review also show a spare bridge interconnect at the top of the card.
Given the difficulty of getting games to take advantage of even two cards in parallel, though, I imagine this is rather taxing to pull off in the real world (not to mention rather expensive...).
The one thing that puts me off cards like these is the price. I could buy a PS3 for that...
Yes you can, no problem. Unfortunately, though, you can't hook up four of them for eight GPUs, as each card has only one spare CrossFire connector. Regular cards have two CrossFire connectors, and the only things stopping you putting six or eight cards in a system are the lack of x16/x8 slots on a motherboard and driver support.
"The problem is that CrossFire can be erratic"
Just like SLI... and the dual-chip Nvidia cards? STILL NOT SUPPORTED properly in Vista, even though they've been out for two years...
£150 for a PSU with six-pin and eight-pin graphics connectors? Err, no - more like £60. ALL decent SLI-ready or modern PSUs have these connectors, and if they don't, they have the power to drive them via the adaptor that's always included.
Oh, and to answer Rob - yes, yes you can run two X2s together for quad-GPU power.
El Reg, can you please up the quality of your hardware reviews? They need a bit more work to be top notch.
I find it hard to believe that the reviewer has overlooked the fact that this card is fed by a single x16 PCIe slot, creating a potential bottleneck when the card is pushed to the limit.
Two 2900 XTs in CrossFire get at least one x16 PCIe slot plus one x8 PCIe slot, giving 150 per cent of the bus bandwidth to play with for highly intensive graphics.
I may be mistaken - the two 2900 XTs may even get two x16 PCIe slots, further improving bus throughput.
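A quick back-of-the-envelope sketch in Python of where that 150 per cent comes from - the per-lane figure of roughly 250MB/s (one direction, PCIe 1.1) is an assumption, but it cancels out anyway, since only the lane counts matter:

# Relative bus bandwidth: one x16 slot vs an x16 + x8 pair
PER_LANE_MBPS = 250  # assumed PCIe 1.1 rate per lane, one direction

def slot_bandwidth(lanes, per_lane=PER_LANE_MBPS):
    # Aggregate one-way bandwidth of a PCIe slot in MB/s
    return lanes * per_lane

single_x16 = slot_bandwidth(16)                      # one X2 in a single slot
dual_cards = slot_bandwidth(16) + slot_bandwidth(8)  # x16 + x8 CrossFire pair

print(single_x16, "MB/s for a single x16 slot")
print(dual_cards, "MB/s for the pair -",
      round(100 * dual_cards / single_x16), "per cent of a single x16")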
Nvidia fan? Oh, please. Most high-end power supplies have a bunch of six-pin PCIe connectors, but eight-pins are rarer than a rare thing. If you can provide a couple of makes and model numbers suitable for overclocking the X2, then I'll eat my words.
Brian and bus bandwidth: in principle you are correct, with a great big 'However'.
I don't know how much bandwidth a PCIe graphics card actually requires. I know the speeds of PCIe 1.1 and 2.0 and how many lanes the various PCIe slots provide, but not how much bandwidth, say, a GeForce 8800 GT needs. The Intel P35 chipset supplies only four lanes of PCIe 1.1 to the second graphics slot - one quarter of the main slot, and one eighth of the bandwidth of the PCIe Gen 2.0 slots you'll find on the latest crop of motherboards - yet even so it seems to be adequate in most cases.
Turning that the other way round, a pair of HD 3870 chips sharing a single x16 slot, a la the X2, get the equivalent of eight lanes of PCIe each, provided the bridge chip and drivers do their jobs correctly - in which case I doubt the graphics slot will be the bottleneck in performance.
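To put rough numbers on that, here's a minimal sketch - the per-lane rates of about 250MB/s for PCIe 1.1 and 500MB/s for 2.0 (one direction) are the usual approximations, not measured figures:

# Rough one-way PCIe bandwidth per GPU under the configurations above
RATES = {"1.1": 250, "2.0": 500}  # assumed MB/s per lane, one direction

def per_chip_bandwidth(lanes, gen, chips=1):
    # Bandwidth available to each GPU sharing a slot, in MB/s
    return lanes * RATES[gen] // chips

print("8800 GT in a P35 second slot (x4, Gen 1.1):",
      per_chip_bandwidth(4, "1.1"), "MB/s")
print("Single card in an x16 Gen 2.0 slot:",
      per_chip_bandwidth(16, "2.0"), "MB/s")
print("X2 - two GPUs sharing one x16 Gen 2.0 slot:",
      per_chip_bandwidth(16, "2.0", chips=2), "MB/s each")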
I had two ATI cards in the past (7500- and 9500-based). Unless things have changed, you'll be waiting a long time for ATI to "straighten out" the drivers, because they didn't do a very good job back then with (supposedly) "release" drivers.
That's one of the reasons I switched to Nvidia. I haven't had nearly the driver agony since.
(I know some will say "That's all fixed now!" - but, well, it wasn't fixed when I had the cards. That whole thing about leaving a lasting impression...)