AMD ATI Radeon HD 5870 and 5850 DirectX 11 GPUs

Before we dive into our review of the HIS Digital HD 5850 and Sapphire HD 5870 graphics cards, let’s take a quick look at the technology behind AMD’s new family of graphics chips. Sapphire's Radeon HD 5870: a 50 per cent gaming performance boost... These DirectX 11 graphics cores use the same …


This topic is closed for new posts.
  1. Matt 58

    Still waiting

    I'm still waiting for some DX10 games that look much better in 10 rather than 9. OK, they look better, but not "good lord, look at that" better. Personally I don't think DX9 has had its limits pushed, and DX10 games need some work... so DX11? No need, other than to make DX10 cards cheap as chips for your average gamer.

  2. Leo Waldock


    Matt, there are plenty of features in DX11 to stir the juices, but personally I'm not all that thrilled about cloth textures and the like. For me the main point of interest is threading, as DX11 promises to unlock multi-core processors and let 'em rip. At present you need clock speed and very little else. If DX11 delivers on its promises we will see a real benefit from quad cores.

  3. Annihilator Silver badge
    Thumb Down


    Is it just me or are gfx cards just getting more and more unwieldy in terms of size? Me, I watercool and wouldn't let one of these behemoths anywhere near my system.

  4. Cameron Colley


    Yawn. How's their OpenGL support?

  5. Bannor


    Very informative review, and it's nice to see the power consumption coming down.

    As someone who has tried to keep up with new graphics cards over the years, I've got to wonder if the number of truly good games released justifies the purchase of a £300 or even a £200 card. Are we finally seeing the end of the PC as a high-end gaming platform?

  6. Anonymous Coward
    Anonymous Coward

    ... given up the game ...

    Mild rant that has little to do with this review, but something I have to get off my chest RIGHT NOW dammit.

    I got my last graphics card a year ago and have since completely given up on this game, as it's begun to get ridiculous and bloody expensive.

    My first 3D card was a Voodoo1, followed by a Matrox, a TNT1, a series of TNT2 cards, a switch to ATi, back to Nvidia to the Geforce and a series of Geforce cards since then.


    All to play the latest, coolest, graphically amazing PC games.

    The cost?

    A stupid amount of money - some cards costing as much as an entire games console.

    But it's not just the video card; sometimes you need to upgrade your motherboard to ensure you're getting the most performance.

    I can't believe I followed the course I did for so long - literally thousands of pounds over the years just to eke a bit more performance out of a game.

    No more - I've had enough - it's time to switch to console gaming.

    Will this ATI/Nvidia battle ever end?

    I think at some point, it has to - PC gaming is becoming more and more an elitist area, due to the high cost factors involved. The more elitist it becomes, the smaller the target market becomes and the more expensive the hardware.

    Rant over.

  7. bigphil9009

    Windows 7 RC?

    How come you are still using the RC? Surely a site like The Register would have a Technet or MSDN account? The final version has been on there for ages!

  8. Greg D
    Thumb Up

    I know what my next graphics card will be

    At £199, the 5850 is a definite next upgrade for me!!

    I'm just hoping I don't regret it. I've stuck with nVidia for years, as each time I've gone to ATI I've had problems. But that was before AMD bought them out, so hopefully they've addressed the poorly written drivers!

  9. Inachu

    Computer fab processes need to be remade.

    I am very concerned about the heat output these newer video cards make.

    Currently the only politically correct PC case is the ANTEC 900 case.

    But even still, I would like to see the video card next to the power supply instead of at the opposite end.

    I'd also like to see a rocker switch on all devices that use fans, so we can decide how to keep our PC the coolest as we see fit. As it is now, the video card ejects the heat behind the PC and it goes up, and some of it gets sucked back in via the power supply.

    I am thinking of making a tailpipe that truly would keep all heat from the back getting sucked back in. The Antec 900 is not a 100% cure to heat problems but so far is the best case that truly addresses heat issues as much as possible.

    My current quick fix is to place the PC over an AC vent or, on wintry days, let it sit on the window sill.

    Next time please report on how much heat these new cards put out.


  10. adnim

    @Matt 58

    I agree, DX10 has been nothing but hype to sell windoze Vista and the latest GPUs. With the vast majority of PC games being poor console ports, I wouldn't expect any full-featured DX10/11 titles until 12-18 months after next-gen consoles hit the shelves. By that time PC graphics quality/features will again be well beyond what those next-gen consoles offer... and it will be another case of déjà vu. PC gaming may not be dead, but PC gaming innovation appears to be.

  11. jon 44

    @still waiting

    I think it all boils down to the Vista argument. Vista brought two things to the table... DX10 and resource hogging. Oh, and it is terrible, much like Win 7... *cough*

    Now win 7 is out (7 days officially), the DX 10 market will grow because DX 10 cards are now ultra cheap.

    Programming for DX9 and DX10 together was always a problem, as it required more investment from developers to develop SM4 shaders that gave no real benefit over DX9 from the gamer's perspective. (What they don't know won't hurt them!)

    DX11 is a big leap in technology and can be used in more areas within a game engine. This will get the technical leads involved (mind share/curiosity), and games supporting full DX11 AI/physics should be quick to arrive even if they have to do alternative paths for the DX9/10 users.

  12. Jason Togneri
    Thumb Up

    @Matt 58

    "no need other than to make DX10 cards cheap as chips for your average gamer."

    Sounds like a good plan for me, given that I only swapped my P4/2GB/N6600 for a dual-core system last Christmas - and that only due to bad caps on the P4's board. Anything that drives down the price of circa-2007 technology can only be good for me, since I'm living a perpetual 3-4 years behind the technology curve... when C2D came out, I'd only just gotten a P4. When i7 came out, I'd only just gotten C2D. I'm sure that when I get an i9, some equally amazing step forward will already have long since trumped it...

  13. b 3

    5850 for me! :D

    yup the 5850 is a sweet spot for the idle power draw as well, about time!

    my tired old 9800GTX can be retired..

    REALLY waiting for Battlefield: Bad Company 2! (March 2010, ARG!)


  14. Allan Rutland
    Thumb Up

    @Greg D

    Used to be the same, as all the old ATIs I had tried were awful in the driver department. But after two exploding 8800s I thought I'd give them a try with a 4850... which ran cooler, faster and cost half the price. But the real surprise was the drivers. Utterly fantastic! Well written and utterly rock solid. It has taken me utterly by surprise after suffering for years with the crap nVidia's been shovelling.

    Glad the mad dash bleeding-edge charge is over, and it's time for the graphics card makers to tweak and customise these things. And no doubt an overclocked 5850 will be around £150 come the new year... and that's a much nicer option for me :) And the DirectCompute bonus with Win7 - gotta love it!

  15. Anton Ivanov

    I read till the bottom of the first page and stopped

    I read till I reached 190W draw and 90W idle. Sorry, that is absolutely disgusting power management. If I want to buy a convection heater, I will buy a proper convection heater, not one that pretends to be a video card as well.

    Disclaimer - not a single one of the systems in my house uses more than 80W total and they are perfectly usable and fit for purpose.

  16. Chris iverson

    @Anton Ivanov

    Maybe I missed it, but the numbers are lower for the new cards, somewhere around 27W idle; then again I may have misread. Also, as a PC gamer, the power draw from the GPU and the system in general is a fact of life. That's why the netbook does the day-to-day stuff and the desktop only gets cranked over for game time. It's like the guy who has the V12 Jag in the garage and drives the Civic to work. Both can do the same job within reason; one does it faster while the other does it more economically.

  17. Ermie Mercer

    "Next time please report on how much heat these new cards put out"

    The power draw is entirely converted to heat. Take the power consumption of the video adapter, say 188W, and use a calculator like the one below to convert it to the units of heat you want.

    Of course, some of the heat is vented outside the case, and some inside. Information about that might be useful.
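For anyone who wants to skip the online calculator, the conversion is a one-liner: 1 W ≈ 3.412 BTU/hr, since essentially all of the power a card draws ends up as heat. A quick sketch:

```python
# Convert a video card's power draw (watts) into heat output (BTU/hr).
# All electrical power consumed is ultimately dissipated as heat.
WATTS_TO_BTU_PER_HOUR = 3.412142

def watts_to_btu_per_hour(watts):
    return watts * WATTS_TO_BTU_PER_HOUR

print(round(watts_to_btu_per_hour(188)))  # 188 W -> ~641 BTU/hr
```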

  18. Marco Alfarrobinha

    Or you could... get an Xbox 360/PS3. No need to upgrade every year... (HDD excluded).

  19. Anonymous Coward
    Paris Hilton

    @Cameron Colley

    Errr.. how's the OpenGL? Horrid, especially if you're not a Windows user.

    AMD's linux drivers still cause endless hard locks- I use an NVidia card in my main work desktop, and it has an uptime defined by the time between interesting kernel updates.

    The old joke about Radeon cards being like London buses still holds: they're big, red, and have terrible drivers. This is unfortunate, as it makes NVidia lazy and greedy... well, more so than usual.

  20. Sorry that handle is already taken. Silver badge

    Re: I read till the bottom of the first page and stopped

    And then you didn't even manage to read the bottom of the first page correctly.

  21. Leo Waldock

    Windows 7 RC and power draw

    Phil, The Register may or may not have an MSDN account. Your humble freelancer does not.

    Anton, 'fit for purpose' is a fraught topic. If I were building an HTPC with a low-power CPU, integrated graphics, DDR3 and an SSD, I would expect a tiny power draw of less than 50W. On the other hand, a system power draw of 225W for a Core i7 PC with an HD 5870, or 310W with dual cards in CrossFireX, strikes me as quite remarkably low.

    I should probably have shouted long and loud that this is the power draw for the entire system and not just for the graphics card.


Biting the hand that feeds IT © 1998–2021