Intel’s first discrete GPUs won't be a home run

As Intel's first wave of discrete Arc GPUs slowly makes its way out into the wild, the chipmaker is making clear that its latest attempt in the graphics market won't challenge the best from Nvidia or AMD. The semiconductor giant signaled this on Friday when it said one of its upcoming flagship graphics cards for desktops, the A750, …

  1. This post has been deleted by its author

    1. Proton_badger

      Re: To be fair...

      Yeah, I doubt anyone really expected their first try to be any better than this. They have always hyped up their new GPU archs, whether integrated or discrete, and then come out with something less exciting.

    2. Geez Money

      Re: To be fair...

      This article is entirely ridiculous, implying "knocking it out of the park" means making a card nobody actually buys, because it costs more than a car, just for the internet updoots. The *060 series has long been the most popular price point in nVidia's lineup, their performance in the current gen is pretty outstanding, and they're what actually make the money. If Intel can actually provide a competing product in terms of performance and quality in their first discrete gaming GPU, they've knocked it into the stratosphere.

  2. John Robson Silver badge
    WTF?

    3060

    Low end?

    Not quite sure I'd call a £350-450 card low end, though I'll admit it's been a while since I've cared about GPU performance.

    1. Joe W Silver badge
      Pint

      Re: 3060

      It's like a €400 / $400 smartphone being "entry level"...

      I'd rather buy a couple of those --->

      (and @John Robson can have one too)

      1. Tom 7

        Re: 3060

        My 16 yr old thought I'd hand over £600 for a new phone to replace the one that's less than a yr old and only failing through percussive interference. A charge was not made for the beer that came out my nose.

      2. georgezilla Silver badge

        Re: 3060

        " ... I'd rather buy a couple of those ---> ... "

        At " €400 / $400 ",

        If that's true, I'd give up drinking!

    2. Anonymous Coward
      Anonymous Coward

      Re: 3060

      To be fair, I'd call it mid-range, but a generation behind. So that makes the Intel part "mostly adequate" at its best. Plus, expect a ton of compatibility/driver issues.

      But it will squeeze AMD on the "we need something better than IRIS graphics" front for all the biz machines out there. So it's probably playing its part in keeping the status quo of the duopoly intact. Also, it will probably run Windows 11, if for some reason you want to. So there's that at least.

      1. Wade Burchette

        Re: 3060

        If this had come out in January or February, it would have been a hit even with driver problems. But it didn't. It is coming out now, when both Nvidia and AMD are primed to release new models soon. Instead of competing against the current generation, it will soon be competing against the last generation.

        Because I couldn't get a video card at MSRP last year or earlier this year, I decided to wait until both Nvidia and AMD release their next models. I might have bought this Intel GPU earlier this year, but I won't buy it now. And since I have had time to think while waiting for prices to drop, I decided to try to get a next-generation GPU that uses about the 150W of my RX 480 -- or at least one I can downclock to 150W.

        Still, a third player is always welcome. I wish Intel much success in the future. I hope they can compete with Nvidia and AMD one day, so that neither can price gouge us.
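
        For what it's worth, that downclocking doesn't need a full clock tweak on Linux: the amdgpu driver exposes a board power cap through hwmon. A minimal sketch, assuming the card shows up as card0 (the hwmon index varies per machine, and writing the cap needs root):

        ```python
        #!/usr/bin/env python3
        """Read (and optionally set) an amdgpu board power cap via hwmon.
        A sketch only: assumes the GPU is card0; the hwmon index varies."""
        import glob

        # amdgpu reports power1_cap in microwatts
        paths = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_cap")
        if not paths:
            raise SystemExit("no amdgpu power cap found under card0")
        cap_path = paths[0]

        with open(cap_path) as f:
            print(f"current cap: {int(f.read()) / 1e6:.0f} W")

        # To cap at 150 W (needs root), uncomment:
        # with open(cap_path, "w") as f:
        #     f.write(str(150 * 10**6))
        ```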

    3. Neil Barnes Silver badge

      Re: 3060

      Can't recall ever having bought a video card since the days of VGA, but then, I'm not a gamer. Whatever the second-hand business laptop comes with seems to work for me...

      (Though I have an interesting problem with one laptop's video: it works on the laptop screen *only* until the user logs in; thereafter I can only get video via the HDMI port. The display management shows the laptop screen but doesn't do anything useful with it. Using a bootable ISO on a USB stick, things appear to work. Using Cinnamon it fails; using Xfce it works. I conclude something weird has happened to the video acceleration...)
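
      For anyone chasing something similar: a quick way to tell whether a session has silently fallen back to software rendering is to check the OpenGL renderer string. A minimal Python sketch, assuming a Linux box with glxinfo installed (e.g. from mesa-utils):

      ```python
      #!/usr/bin/env python3
      """Check whether the current session is using hardware OpenGL
      acceleration or Mesa's software fallback (llvmpipe/swrast)."""
      import subprocess

      # glxinfo -B prints a brief summary, including the renderer string
      out = subprocess.run(["glxinfo", "-B"], capture_output=True, text=True).stdout

      for line in out.splitlines():
          if "renderer string" in line.lower():
              renderer = line.split(":", 1)[1].strip()
              soft = any(s in renderer.lower() for s in ("llvmpipe", "softpipe", "swrast"))
              print(renderer)
              print("software fallback" if soft else "hardware acceleration")
      ```

      If Cinnamon reports llvmpipe while Xfce reports the actual GPU, that would fit the broken-acceleration theory, since Cinnamon leans on GPU compositing much harder than Xfce does.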

    4. Loyal Commenter Silver badge

      Re: 3060

      Prices are a bit lower than that now, since the recent price drop (largely fuelled by the falling Ether price, rather than any increase in supply). However, that is the low-end price, when the mid-range cards are in the £600-£700 range, and the high-end ones will set you back well over a grand.

    5. localzuk Silver badge

      Re: 3060

      My thought too. I thought the GTX 16XX series was the low-end one these days. RTX series starts at mid-range and goes up to "I have a private jet to fly my butler to the shop" range.

  3. Anonymous Coward
    Anonymous Coward

    Those that strive for mediocrity often find it.

    Intel's problems with GPUs run deep and appear to be as much about culture as technology. Unlike the Skull Trail team, which gets it, the graphics team is culturally stuck in "middle of the roadism". Some of the Skull Trail stuff is overhyped trash, but it is internally aspirational to kick ass. Sometimes it gets there, sometimes it's a miss. But what do they say about the swings you don't take, right?

    For years Intel has been the main pusher of defective shared-memory graphics chipsets. A tragic crapfest that has crippled many a value-conscious laptop or SFF PC. Worse, tons of parts for desktop PCs had the same class of graphics as an obligatory pack-in, leading to years of systems that shipped with a misconfigured BIOS and dedicated graphics hardware that wasn't actually used, because the on-die GPU was set as the default.

    While it may take a while for a new entrant to get in the game, I'd feel less like this is just history repeating if they at least showed they were trying to swing for the fences, not aspiring to bunt at every pitch. This feels like little-league ambition driving t-ball engineering. Even if it's a few years out, I want to see them shooting for where AMD or NVIDIA will be a year from now, not where they were 5 years ago.

    1. ecofeco Silver badge

      Re: Those that strive for mediocrity often find it.

      Not sure why all the down-votes.

      My experience of over a decade with Intel graphics has convinced me never to buy one again. And I thought nVidia was bad.

      The CPUs as well. Intel lost the plot years and years ago.

  4. Anonymous Coward
    Anonymous Coward

    I suspect these will be a little hard to find: Intel have declined to deliberately hobble their mining performance.

    1. Loyal Commenter Silver badge

      The LHR limiter on Nvidia cards was cracked a couple of months back, probably related to the hack Nvidia suffered earlier in the year and the (alleged) sale of their internal data to Nicehash.

  5. ilmari

    These probably won't make a difference to the chip shortage or surplus, since all GPU chips are made in the same fabs regardless.

    1. FIA Silver badge

      Possibly not GPUs, no. However, Intel also has its own fabs, so taking up capacity that could be used by your competitors, whilst also having the capacity to manufacture your other stuff yourself, could be a smart move.

  6. Anonymous Coward
    Anonymous Coward

    Why does Intel use 3, 5, 7 for processors and now graphics cards? Does anyone else find it odd?

    1. Sampler

      I can't even believe you went there..

    2. Sceptic Tank Silver badge
      Go

      It's a prime example of how to name things. I guess they stole the idea from BMW.

      1. Gordon 10

        Wait til they expand the model range like BMW did. Things will even out in the end...

        1. MrDamage Silver badge

          Wait til they expand the model range like BMW did

          Subscription based thermal management?

  7. GraXXoR

    Proof just how effective modern marketing is

    When someone calls a $400 card "entry level".

    Feels like, if the price of something has only three digits, it's mid-range at the very best.

    An FTW3080 still costs about 40,000 yen more than it did when I bought mine in October of 2020 (the weak yen is not helping, I'll admit).

    Same goes for phones. In Japan we pay the same for a basic iPhone 13 Pro Max 256GB model as we do for the MacBook Pro M2 16GB/512GB.

    Still, competition is generally a good thing and I'm very happy that we have Teams Red, Green and now Blue in the GFX Card space.

    1. Loyal Commenter Silver badge

      Re: Proof just how effective modern marketing is

      Still, competition is generally a good thing and I'm very happy that we have Teams Red, Green and now Blue in the GFX Card space.

      Sounds like we need to have team Alpha too to add some transparency to the market...

      1. deep_enigma

        Re: Proof just how effective modern marketing is

        > Sounds like we need to have team Alpha too to add some transparency to the market...

        Those might be a bit hard to see on the shelf though...

  8. Charlie Clark Silver badge

    Dodgy premise

    The addition of a third GPU vendor will also be a welcome sight after the semiconductor industry experienced more than two years of shortages.

    This completely ignores the enormous market for GPUs in mobile phones, with ARM, Qualcomm and others, including Apple, developing extremely potent GPUs. Just because these are not available as discrete chips doesn't mean they're not affecting the markets for both gaming and machine learning.

    1. Richard 12 Silver badge

      Re: Dodgy premise

      The fact that those are only available integrated into an Arm chip does mean they're not affecting the gaming market.

      Arm is basically non-existent in desktop/laptop gaming.

      macOS gaming is long dead and buried because Apple refused to update OpenGL or implement Vulkan, so M1/M2 isn't even a rounding error.

      Arm is tiny in serverland (and so in ML)

      1. Charlie Clark Silver badge

        Re: Dodgy premise

        You moved the goalposts: from "gaming" in general to "desktop/laptop" gaming.

        Mobile gaming is bigger than desktop/laptop/console gaming and growing faster, even if individual titles have smaller budgets. Mobile phones might have physically smaller screens, but they often push as many pixels as desktops, and do it on a battery.

        1. georgezilla Silver badge

          Re: Dodgy premise

          I'm sorry ( well no, not actually ), but .............

          Gaming on a 6 inch screen? Really?

          Come on.

          REALLY?

          But then I'm not really a gamer.

  9. Filippo Silver badge

    I wouldn't criticize Intel for not playing at the top of the range. The range that surrounds the 3060 may not be the top, but it's still very interesting to many gamers. Everything depends on the price. If they can compete there, it doesn't really matter that they can't (for now) make anything that runs the very latest games at 120fps and 4K.

    1. Loyal Commenter Silver badge

      Given that most gamers don't have 120Hz monitors, and your average user is playing games on a 1080p 60Hz monitor, chucking out any more frames than this is pretty moot.

      The pair of 1440p monitors I use can be forced to run at 75Hz, and I have a graphics card that can make them do so, supplying that frame rate at full settings in most games. OK, so it's a few steps up from an RTX 3060 (it's a 3060 Ti), but your average user isn't going to splash out on an expensive high frame-rate 4K monitor, and Intel aren't going to care about the ones who do. Those who can afford the top-end gaming gear will be buying the top-end graphics cards as well, and then wondering why they've got a fan heater on their desk...

      I would imagine the market is very much a Poisson distribution, with a relatively small number of high-end users who can both afford and want the expensive stuff. Intel want to be in the middle of that curve, where all the sales are. Given that the RTX 3060 is the most commonly used graphics card, that is very much the sensible market segment to try to take a bite from.
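
      To make that shape concrete: a Poisson distribution piles most of its mass around the mean, with a thin tail of big spenders. A toy sketch, with the mean tier picked arbitrarily for illustration rather than from any real market data:

      ```python
      #!/usr/bin/env python3
      """Toy illustration of the 'most buyers cluster mid-range' idea:
      a Poisson distribution concentrates its mass around the mean with
      a thin upper tail. The mean here is arbitrary, not market data."""
      from math import exp, factorial

      mean_tier = 4  # pretend tier 4 is roughly an xx60-class card

      def poisson_pmf(k, lam):
          return lam**k * exp(-lam) / factorial(k)

      for tier in range(11):
          share = poisson_pmf(tier, mean_tier)
          print(f"tier {tier:2d}: {'#' * round(share * 100):<25} {share:5.1%}")

      # Roughly 80% of the mass lands within a couple of tiers of the
      # mean, which is where an A750-class card would be aimed.
      ```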

  10. Binraider Silver badge

    Presumably these are being made in the same oversubscribed chip fabs as everything else. The pricing probably reflects a combination of bad timing, Intel tax and development overhead. 2nd and 3rd gen cards will lose some of that penalty.

    Of course, if 3060s or equivalents are readily available, the Intel will sit on the shelf gathering dust, waiting for a stock liquidation. This probably doesn't matter to Intel for the first or second gen cards. As with music, the first album might get a band noticed. It's the third album that will determine if it survives.

  11. CommonBloke

    You don't aim for top performance if you want to sell in bulk

    The reason the Intel Celeron, despite being one of the worst processor lines, sells so damn well is that it's cheap and produced in bulk. It's also why the vast majority of laptops come with it, plus whatever shit integrated graphics Intel threw together at the last second.

    Now, graphics cards like these have their uses both for gaming and for more serious work, such as render farms, 3D modellers and various CAD users. For the serious work, Intel is possibly in a better position than Nvidia to make bulk sales of "work ready" stations, something I don't know if AMD already does and, if they do, how profitable it ends up being.

  12. Steve Todd

    It will be interesting to see if they have the same problem as the A370

    The review I’ve seen says that it suffers from fairly noticeable glitches in frame rate, and that it NEEDS a CPU able to handle Resizable BAR. That, and the fact that older games (using DX11 or earlier) perform poorly in comparison. Intel still seem to have a way to go with their drivers.
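
    On the Resizable BAR point, a quick way to see whether it's actually in effect on a Linux box is to scan lspci's verbose output. A rough sketch, assuming a reasonably recent pciutils; run it as root so the capability list is fully decoded, and note the exact strings vary by version:

    ```python
    #!/usr/bin/env python3
    """Scan lspci -vv output for Resizable BAR info on the GPU.
    String matching is version-dependent; treat this as a rough check."""
    import subprocess

    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

    in_gpu = False
    for line in out.splitlines():
        # Device headers start in column 0; detail lines are indented
        if line and not line[0].isspace():
            in_gpu = "VGA compatible controller" in line or "3D controller" in line
        elif in_gpu and ("resizable bar" in line.lower() or "current size" in line.lower()):
            print(line.strip())
    ```

    If the GPU's BAR shows a current size of only 256MB, ReBAR typically isn't enabled in firmware.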

  13. georgezilla Silver badge

    It's still something ..............

    ... which people seem to be missing.

    And seeing as how I'm still running an AMD 580, a 3060 competitor that's good enough, at the expected $325 USD (approx.) price point, really isn't that bad (I paid $550 USD for my "renewed" 580).

    And competition is a "good" thing.

    Isn't it?

    I'm not a "gamer". And I use Linux. So as long as it works with Linux, I'm good.

    And yes, yes, I am going to wait a few months and see if/how well it does.

    Then there's also the expected 770 at $399 USD.

    So you know ............................
