Two years after entering the graphics card game, Intel has nothing to show for it

Add-in board (AIB) market share figures for Q2 2024 are out, and despite an uptick in overall sector shipments, relatively recent entrant Intel registered at zero percent. The data compiled by Jon Peddie Research (JPR) reveals a significant surge in global AIB volumes, up 47.9 percent year-on-year to 9.5 million units and up 9. …

  1. Zippy´s Sausage Factory

    Probably because their integrated graphics have been such poor performers, they have a mountain of image problems to overcome before any of the target market for add-in cards would consider them.

    1. zimzam

      The fact that it was designed to only accelerate DX12 games and had to emulate DX11 and older was probably the biggest killer. They assumed that PC gamers only play the latest and greatest live service games. Weird, I don't know anyone else in the industry who... sorry I just spat over my monitor.

      1. Pascal Monett Silver badge

        I really don't care about the details. In my experience, Intel graphics has only ever been good enough to get a computer running until I can slot in a true graphics card (Nvidia or AMD, given the moment).

        If Intel thinks that putting its "graphics engine" on an add-in card is going to impress me, I'll put it right in the bin with the Matrox 3D card that I wasted money on way back when.

        1. Marcelo Rodrigues
          Boffin

          "I really don't care about the details. In my experience, Intel graphics has only ever been good enough to get a computer running until I can slot in a true graphics card (Nvidia or AMD, given the moment)."

          These first-generation AIB cards from Intel weren't bad. No, they couldn't face the best AMD had to offer (never mind Nvidia) - but they weren't bad entry cards. At the very beginning they had a ton of driver and optimization problems - but they did their homework, and this isn't a problem anymore.

          If I remember correctly, their Arc A770 would be about the same level as an RTX 3070, maybe a 3060 Ti. Far from stellar, but not bad per se either. After all, I'm running an RTX 3060 (non-Ti). As are about 5.86% of Steam gamers this month - and that's the most popular graphics card in the Steam survey. The second one is... the RTX 4060.

          They had a good video encoder too, and supported AV1 hardware encoding when no one else did (NVidia started hardware AV1 encoding on the 4000 series).

          To me the reality is simple: it was a first try, people were scared and money was short, they had a lot of teething problems, and although "everybody" buys something around the 4060/4070 series, reviewers praise the 4090 monsters. This steers people away from what they would realistically buy. Who cares if the 4090 is (say) 3x faster than the 7900 XTX? I won't buy either! What matters to me (and to the vast majority of buyers) is which is better: the RTX 4070 or the 7700 XT?

          1. Spazturtle Silver badge

            In terms of die size and transistor count it was closer to an Nvidia XX80-class GPU, but with the performance of an XX60 class. It is massively underperforming for what the hardware should be capable of.

        2. Groo The Wanderer Silver badge

          I was quite pleased with my Matrox card. Early ATI adopter, too. But they served my needs at the time.

          Without a card designed for 2HD and 4K monitors, Intel was never under consideration when I opted for my RTX4070Ti.

          1. Sandtitz Silver badge

            "Early ATI adopter, too."

            Hey, I had an ATi EGA Wonder too!

        3. mirachu Bronze badge

          Some discrete Intel GPUs definitely have a reason to exist. You can't get an AV1-encode-capable card cheaper than an A310.

      2. gladhand

        I'll get a mop.

      3. williamyf

        Your statement is false.

        Intel emulated DX9 only.

        There is NATIVE support for OpenGL, Vulkan, DX12 and, crucially, DX11 too.

        I also play older games, but you are lucky I was not the project manager for Alchemist. I'd have focused on DX12 and Vulkan ALONE, handled OpenGL via Zink, and handled DX9/11 via BOTH Microsoft's emulators (DX9on12 and DX11on12) AND DXVK (on a per-game basis, whichever works best).

        1. SVD_NL Silver badge

          Exactly. DX11 was released in late 2009 alongside Windows 7; of course adoption in games took a couple of years, but the point is that this card natively supports a DX version that's over a decade old!

          If I want to play some older DX7/8 games on my 3080 Ti, I also need to use dgVoodoo or a similar emulation layer. Technically the games run, but it can be a rather painful experience without dgVoodoo or unofficial patches (which often implement DX emulation).

          I think having proper emulation and a compatibility layer in place will beat natively supporting an older API in the long run.

          1. Groo The Wanderer Silver badge

            All I know is that only the most ancient of my games, going back to the early 90s, won't run any more under Windows 11 with an Nvidia RTX 4070 Ti. What I'm currently learning is which of my favourites run well under Linux via Steam's Proton Experimental support. So far I'm impressed; Experimental can handle a good 90% of what I've thrown at it!

    2. Mage Silver badge

      Re: their integrated graphics

      For 30 years Intel integrated graphics have meant poor performance and dodgy drivers. Maybe in the last 6 or 8 years or so they've been OK for basic laptops running word processing.

      1. Yankee Doodle Doofus Bronze badge

        Re: their integrated graphics

        I have an Intel NUC with an 11th-gen i5, and under Linux, at least, the integrated graphics are sufficient to run Grand Theft Auto 5 at between 40 and 45 frames per second at 1080p with default settings. Yes, that's a nearly decade-old game, but I was still surprised that it was able to pull this off.

      2. Sandtitz Silver badge

        Re: their integrated graphics

        "For 30 years the Intel integrated graphics have been poor performance and dodgy drivers. Maybe in the last 6 or 8 years or so OK for basic laptops running wordprocessing."

        Nonsense. The first IGP, i810, was released in 1999, 25 years ago. It and its successors have been perfectly fine for regular 2D office work or web browsing and such, and I can't remember any great hassle due to driver bugs.

        1. SVD_NL Silver badge

          Re: their integrated graphics

          Exactly! Whenever I need to debug graphics card driver issues, it's a godsend to have integrated graphics, because it always works!

          The only issue I have with Intel integrated graphics drivers recently is when Windows Update decides to update the driver in the background and a restart is needed, but is this really Intel's fault?

          1. nintendoeats

            Re: their integrated graphics

            My experience developing graphics systems was that the only one that "always works" is LLVM's software OpenGL implementation :p

            I've had fair chunks of my life stolen by both Intel and Nvidia driver bugs. ESPECIALLY Intel.

        2. Zippy´s Sausage Factory

          Re: their integrated graphics

          For office work and web browsing, maybe. But they're trying to target games.

          And drivers for Intel graphics have always been a nightmare. You download them from Intel and they say "these won't install, go get them from your vendor". But they won't tell you who the "vendor" is, and if you've no idea which Chinese OEM was used for the specific chipset on your motherboard, you're stuck with Windows' default drivers.

          Honestly, the main reason they don't sell is, imo, their reputation. Which is awful.

      3. GraXXoR Bronze badge

        Re: their integrated graphics

        I have a 2015 MacBook Air that I bought last month. It has a dual-core i5 and a pokey graphics card that is somehow enough to run Stellaris until late game.

        Is it NVIDIA? No, but it's fine for casual gaming and office tasks. Not sure where you're getting your dodgy-drivers opinion from; it's really not true.

    3. Snake Silver badge

      RE: two years, nothing to show, poor performers

      Maybe you are correct but you are missing the point [I tried to make in the Intel AI comment section]:

      nobody (reasonably) expects the first product from any company to be a 'home run' in terms of hitting all targeted market points. Intel's problem isn't that their first card wasn't a knockout punch to the industry; Intel's problem is that they haven't built upon that base and released better cards. The market is waiting for an alternative to Nvidia and AMD but Intel is just sitting on its hands.

      But now Intel wants to push AI instead, and since everyone else's AI compute power is built upon their GPU architectures, Intel is behind the 8-ball but trying to push through with promises and hopeful dreams.

      In the GPU field, Intel's problem seems to be one of commitment, not ability - it is reasonable to assume that Intel has the ability to make more powerful GPU cards but simply hasn't done so, letting opportunity slip from their grasp. Nobody will sit around waiting and hoping for some future time when Intel decides to actually make power moves in the GPU market, so they get left behind.

      There is still promise in Intel's GPU abilities...if they ever decide to get off their buttocks and actually be aggressive in trying to attain any real market share.

      1. Yet Another Anonymous coward Silver badge

        Re: RE: two years, nothing to show, poor performers

        >In the GPU field, Intel's problem seems to be one of commitment, not ability -

        That was always Intel's problem.

        They were an early backer of OpenCL but never actually supported it to the point that it was usable as an alternative to CUDA.

        They were/are the monopolist in CPUs but can't seem to offer any sort of commitment to anything else.

      2. Anonymous Coward
        Anonymous Coward

        A different way of looking at why they failed.

        This was foreseeable and expected. While a first-gen product isn't likely to realize the full potential of its architecture, the curse of Intel graphics is that they have ALWAYS strived to be just barely good enough. So when they fall short of their mark, they are a useless annoyance.

        Intel couldn't have executed on a successful design regardless; their fab situation would have killed any project no matter how ambitious. But even if the fabs were firing on all cylinders, they would have failed because they tried launching a product that merely matched the back of the pack as it stood when they started. AMD and Nvidia, on the back of TSMC, were locked in an all-out engineering war. Intel couldn't even compete on price, probably even dumping product at a loss. There were probably somewhere between a pair and a dozen largish accounts with workloads their architecture favored, and Intel hoped it could grow from there. So their lack of ambition meant public ridicule, loss of investor and managerial enthusiasm, and starting over from scratch to try to hammer out a viable product.

        If instead, when they started the design, they had been shooting for the next year's Nvidia 60-series as their hopeful performance target, then even if they were late or underperformed a little they might have survived to refine the design.

  2. eswan

    The i740 of video cards.

  3. elsergiovolador Silver badge

    Approach

    The problem with Intel is their approach to design.

    It starts somewhat like this:

    We have a 50 sq ft room at 15C. The PSU can give 300W of power. Create a rectangle that will heat the room to 25C in an hour and disguise it as a GPU.

    You are free to use past designs if you relabel them.

    Now someone asks: why bother with the GPU thing and not just create a heater?

    Rumour is that Intel has been banned from all heating magazines, so they had to be creative.

    1. A. Coatsworth Silver badge
      Happy

      Re: Approach

      Upvoted! But the mixture of imperial and metric threw a wrench into my head.

      So, to ensure compatibility, it would be changing the temperature from -0.5 Hilton to 0.5 Hilton, for a room of 0.2236 nanoWales in 0.0009 Truss.

      1. The Dogs Meevonks Silver badge
        Thumb Up

        Re: Approach

        You get the upvote just for the creative use of the lettuce as a measurement of time.

    2. katrinab Silver badge
      Mushroom

      Re: Approach

      To be fair, NVidia GPUs can also get a bit toasty.

      1. Yet Another Anonymous coward Silver badge

        Re: Approach

        NVidia laughs at a 300W PSU

  4. Anonymous Coward
    Anonymous Coward

    How hard can it be?

    It's not like AMD and Nvidia spent decades of time and billions of dollars on research... Oh wait.

    1. Annihilator

      Re: How hard can it be?

      Well, more accurately ATI did...

      1. Sorry that handle is already taken. Silver badge

        Re: How hard can it be?

        At this point it's almost as long since AMD acquired ATi as it was from ATi's founding to the acquisition.

        1. Annihilator
          Meh

          Re: How hard can it be?

          Well if that's not a terrifying statistic that's made me ponder my own mortality, I don't know what is. The 1990s were 20 years ago, right?...

  5. abend0c4 Silver badge

    Intel has nothing to show for it...

    ...except inventory, presumably.

    1. An_Old_Dog Silver badge

      Re: Intel has nothing to show for it...

      Re Intel video AIBs -- I'll look for these at the Super Deal Bin Store on €3 Mondays. Awesome surplus/remainders store. I got a couple of RPi cases there which look like tiny NES consoles, for €6 apiece.

  6. sarusa Silver badge
    FAIL

    Their graphics are worse than my 8-year-old Nvidia

    I just bought a new laptop to replace my 8-year-old laptop - it's for productivity, not games, so I didn't really pay attention to the graphics card, just all the other specs (64 GB RAM!). And of course the CPU is much, much faster. Turns out it has Intel Arc graphics.

    So just out of curiosity I put some games on it, and... it actually runs them worse than the Nvidia mobile card in the 8-year-old laptop (it had a discrete mobile card). CPU-bound games are of course much faster now, but anything that's fill/poly-constrained is slower than on the old laptop!

    I'm still quite happy with this for what I bought it for, but if you actually care about graphics I would never recommend Intel graphics to anyone.

    1. cherries

      Suspicious of your comment...

      Is the CPU on your laptop a Core Ultra/Meteor Lake? Or is it just Alder Lake/Raptor Lake (a.k.a. normal 12th/13th/14th gen)? Because the graphics on the latter is actually far slower, while the graphics on the former is literally equivalent to a GTX 1650 from Nvidia.

  7. mostly average
    Facepalm

    I have one

    Arc A750. It's good for games but not much else, because pretty much everything else requires CUDA. SYCL is fantastic, if the software actually acknowledges its existence. You have to compile everything from source because no one distributes binaries with anything but CUDA support. If you're lucky it's just a compiler flag; otherwise you have to switch out libraries, modify code and debug. It's all a headache carefully managed by Nvidia. Intel does try really hard to make it easy to switch to SYCL, but everyone uses CUDA because everyone uses CUDA. Suffice it to say I have buyer's remorse. Not that I could possibly afford anything supporting CUDA anyway. It's definitely better than nothing. Usually.
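
    For readers who haven't seen SYCL, a minimal vector-add sketch along these lines shows the shape of the non-CUDA path - an illustrative example, not taken from the commenter's setup, assuming Intel's oneAPI DPC++ compiler (built with icpx -fsycl):

        #include <sycl/sycl.hpp>
        #include <iostream>
        #include <vector>

        int main() {
            // The default selector picks a GPU (e.g. an Arc card) if one is
            // available, and falls back to the CPU otherwise.
            sycl::queue q{sycl::default_selector_v};
            std::cout << "Running on: "
                      << q.get_device().get_info<sycl::info::device::name>() << "\n";

            constexpr size_t N = 1024;
            std::vector<float> a(N, 1.0f), b(N, 2.0f), c(N, 0.0f);

            {   // Buffers copy results back to the host vectors when they go out of scope.
                sycl::buffer<float, 1> bufA(a.data(), sycl::range<1>(N));
                sycl::buffer<float, 1> bufB(b.data(), sycl::range<1>(N));
                sycl::buffer<float, 1> bufC(c.data(), sycl::range<1>(N));

                q.submit([&](sycl::handler& h) {
                    sycl::accessor A(bufA, h, sycl::read_only);
                    sycl::accessor B(bufB, h, sycl::read_only);
                    sycl::accessor C(bufC, h, sycl::write_only, sycl::no_init);
                    h.parallel_for(sycl::range<1>(N),
                                   [=](sycl::id<1> i) { C[i] = A[i] + B[i]; });
                });
            }

            std::cout << "c[0] = " << c[0] << "\n";  // expect 3
            return 0;
        }

    The kernel itself is no harder than the CUDA equivalent; the complaint above is about the ecosystem around it, not the language.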

    1. katrinab Silver badge
      Meh

      Re: I have one

      If you are doing video encoding, Quicksync is pretty good.

      1. mostly average

        Re: I have one

        It is indeed very good at transcoding. It's the backbone of my Jellyfin server.

    2. Yet Another Anonymous coward Silver badge

      Re: I have one

      They had an opportunity with OpenCL.

      But nobody in the consortium really committed to it. They either secretly pushed their own proprietary bare-metal architecture to give themselves a commercial advantage or treated it as a box-ticking exercise to meet some purchasing requirement.

      Whatever you think of NVidia (and CUDA) they did bet the company on GPU compute - before AI was a thing - and it paid off.

  8. IGotOut Silver badge

    Oh yeah...

    ...I forgot they launched some graphics cards.

    Maybe they should launch exactly the same cards but just put AI in the card name. Bound to work.

  9. karlkarl Silver badge

    > Two years after entering the graphics card game, Intel has nothing to show for it

    The fact that it is instantly as high as 3rd place in the dedicated GPU space for Linux and Windows is certainly a good thing to show.

    Who is going to steal its position exactly? It was an easy win for Intel to get on the leaderboard and earn some easy money.

  10. BinkyTheMagicPaperclip Silver badge

    Definitely unfair to say they've nothing to show for it

    They've got a series of GPUs that are extremely price-competitive, even with AMD (who refuse to seriously compete with Nvidia), and offer decent performance providing you're running recent software. There are open source drivers, even if their quality is inconsistent.

    Intel aren't ploughing massive amounts of money into it, so the drivers target DirectX 12 and Vulkan, and emulate older APIs, which is quite sensible.

    Sadly, like many others my use case requires the precise areas Arc is awful at: older games, VR, and non-Windows/Linux systems. Also, given the spotty quality of games, continual video card driver updates are required for optimal performance, and that requires a sizable and well-funded driver team.

    1. Like a badger

      Re: Definitely unfair to say they've nothing to show for it

      I concur, and I doubt Intel expected much within a few years; what they have delivered from a very low baseline is pretty impressive even if it isn't wowing the buyers. They knew they'd not produce either a technology leader or a value leader in two years.

      The upcoming Battlemage cards could be very interesting, and might produce a value proposition. That of course assumes that the people running Intel don't simply cancel the graphics programme outright to try and please Wall Street. As current news elsewhere shows, Big Tech are pretty happy to cancel programmes they invested huge amounts in.

  11. williamyf

    What was the market share of Innosilicon and Moore Threads?

    Higher than Intel's, or lower?

  12. ecofeco Silver badge
    FAIL

    LOL, no surprise here

    Intel graphics have always sucked. From the user interface to the driver, it's always been subpar on a good day.

    I'm not surprised their latest foray is tanking because it's baked into their corporate culture. Intel is a dinosaur.

    1. collinsl Silver badge

      Re: LOL, no surprise here

      And yet by market share of PCs I'd wager they're over 90% of the total market due to the fact that their inbuilt GPUs are on pretty much all office machines in the world (AMD not having much of a share of the corporate space).

      Most laptops and most desktops for corporate functions run Intel GPUs.

      1. ecofeco Silver badge

        Re: LOL, no surprise here

        What's the old saying? Just because a million people do something wrong does not make it right.

        My experience is first hand on literally tens of thousands of corporate PCs. Intel graphics is an effing nightmare POS.

        1. Sandtitz Silver badge
          Meh

          Re: LOL, no surprise here

          "My experience is first hand on literally tens of thousands of corporate PCs. Intel graphics is an effing nightmare POS."

          Well, my experience is also first hand on literally tens of thousands of corporate PCs. Intel graphics are quite harmless.

          Care to tell what your problem with Intel graphics is? Please be specific.

  13. Anonymous Coward
    Anonymous Coward

    Functional...

    ...but not good value for money.

    1. mirachu Bronze badge

      Re: Functional...

      Depends. Show me a non-Intel GPU in the 100€ bracket that does hardware AV1 encoding. (Could probably game on my A310 too but it's in a server, gaming boxen are separate.)

  14. CGBS

    It's not Intel's fault

    Hey, Pat Gelsinger had that entire department occupied every morning from 8-12 for Bible Study classes. Don't blame the Xe team for the leadership being too busy preaching, offshoring more jobs, and selling off parts of the company like it was a hunk of shawarma meat on some street vendor cart in the Middle East all while telling people about what it was like at Intel in the 1950s. Also, Raja. Where is that little grifter these days?

  15. cherries

    The comment section is quite full of.... weird people.

    Like, some people in here don't seem to know that Intel has moved on from the era of awful integrated graphics or something. It's been nearly a full year since the launch of Meteor Lake, and another year or so (I don't remember the exact launch dates) since the launch of the Arc dedicated GPUs, so how could they think that Intel graphics is still "awful"?

    1. Sorry that handle is already taken. Silver badge

      Re: The comment section is quite full of.... weird people.

      Maybe there are fewer gamers on The Register than I thought...

      Although even the current top-end Arc GPUs really are only competing with AMD and Nvidia's entry-level AIBs. And the driver issues have been severe at times*, which didn't do their reputation much good.

      * They might well have sorted that out but I haven't looked into it

      1. Anonymous Coward
        Anonymous Coward

        Re: The comment section is quite full of.... weird people.

        True, the drivers were really poor at launch. They have got a lot better since. Failure to cater for old stuff, though, is a problem for lots of users with 15-year Steam catalogues.

        I appreciate there being competition to try and keep AMD and Nvidia in check, though it's not strong enough to be doing that just yet.

    2. Dan 55 Silver badge

      Re: The comment section is quite full of.... weird people.

      Perhaps it might be because they botched the launch of the flagship model (which wasn't that much of a flagship), discontinued it, then haven't launched anything in the 7xx range since then?

    3. Irongut Silver badge

      Re: The comment section is quite full of.... weird people.

      > how could they think that Intel graphics is still "awful"?

      Because everything Intel touches these days is awful. For instance the last two generations of their flagship CPUs that are burning out in short order...

    4. Ilgaz

      Re: The comment section is quite full of.... weird people.

      The image matters too. Not the image on screen or the OSS drivers - the Intel graphics image. Once that kid says "never mind, their stuff doesn't perform", it is over.

      For example, when people had all given up hope on the 68K architecture, Motorola had stuff that could even be compared to the Pentium. Nobody except Amiga or Atari fans cared, because it had been decided that the i486 "won" and Motorola gave up on 68K for computers.

    5. Anonymous Coward
      Anonymous Coward

      Bad is better than awful, but no good.

      The best they produced wasn't really even mediocre. So yeah, it may be better than the system-crippling built-ins with shared memory from days gone by, but they are uninspiring even for desktop apps and web content.

      As Windows insists on operating in 3D mode most of the time now, the gulf between an Arc and even a cheap AMD card from a couple of years ago is jarring.

  16. amacater

    Parabolic Arc

    The hype was high, the launch spectacular - up like the rocket, down like the stick.

  17. RantyDave

    Made a new plan, Stan?

    Perhaps they've looked at the mega dollars around AI and "pivoted" to that. Would make sense?

  18. naive

    They should have designed an integrated and cost effective gaming solution with ARC

    The only way to get market share in a space controlled by two dominating parties is to offer something both of them do not have.

    It could be price/performance ratio or factors like energy use.

    Intel failed to achieve any of these; they ended up, performance-wise, at the lower end of the Nvidia/AMD offerings without being significantly cheaper.

    The gaming/enthusiast PC market is a market of its own. Everyone believes one needs rigs exceeding $2000, with 400W CPUs and graphics cards, to run the latest and greatest.

    As a company, Intel would have been able to create a great gaming rig costing clearly under $1000 that would allow more people access to high-end games.

    A strong and price-optimized gaming rig, with a simple but fast 4-core CPU and integrated Arc, would have great opportunities, since not everyone is able to fork out four digits for a gaming PC. Something like a gaming NUC achieving Nvidia 4060-level performance would be a big hit.

    Probably the future will be ARM anyway; who wants Intel CPUs guzzling endless power, with dozens of idle P/E cores nobody needs in games?

    1. BinkyTheMagicPaperclip Silver badge

      Re: They should have designed an integrated and cost effective gaming solution with ARC

      Why should Intel do that? It'd be gutting their own market.

      There's something to be said for creating a sensibly priced GPU, and the Arc series *were* sensibly priced in terms of price/performance, at least until recently: trading blows with higher-end Nvidia cards, supplying ray tracing that worked better than AMD's, and costing under 300 pounds, which isn't exactly cheap but was cheaper than the Nvidia/AMD alternatives.

      There are names for a 'great gaming rig under $1000 that allows people access to high-end games': they're 'PlayStation 5' and 'Xbox Series X'.

    2. mirachu Bronze badge

      Re: They should have designed an integrated and cost effective gaming solution with ARC

      4 cores isn't really enough.

  19. tiago.pelicari

    Intel's business revolves around milking patents. Period. They don't know how to produce, and I don't believe they know how to do anything other than marketing.

  20. Steve Kerr

    Late to the party - I've got an Intel Arc A770 with 16GB RAM.

    For what I'm using it for it runs fine (mostly Eve); it runs two clients at 1920x1200 OK, which is what my monitors are.

    Is it a world beater? Far from it. Is it flaky? It certainly was in the beginning, but it's much better now - pretty much OK.

    Price-wise, it was cheap compared to the others on the market in the same space, and I balked at paying 2x-3x the price for what was supposedly similar performance.

    So, for the moment, for the price I paid, it does the job I want and for me has been pretty good value!

    One day I will probably go back to Nvidia or AMD but like I said, it does what I need for the moment.

    Doesn't help that it's in a PC with an AMD CPU, so that was quite entertaining in the beginning.

    1. Anonymous Coward
      Anonymous Coward

      I would seriously have considered one too; however, the software just wasn't quite there at the time I needed to swap cards.

      Intel were always going to have a rough time with the first or even second-gen cards. If they can put out a usable competitor to 3060/7600-class cards at a price point that isn't stupid, they will be onto a winner. But crucially, timing: the product needs to be there at the point of the consumer refresh cycle, which - based on a sample size of not many - is probably about five-yearly, barring failures or a feature deficit being a blocker.

  21. PenfoldUK

    A tangential question

    One thing that always puzzles me about graphics drivers is why they have to be updated so often, apparently even for individual games.

    I can get that with things like DLSS a game-by-game approach must be taken. But driver updates seem to be required even when rendering natively.

    Is this a case of newer games exposing bugs in existing drivers, which then need updating?

    Or are drivers being altered to accommodate individual games, and if so, why?

  22. ecofeco Silver badge
    Gimp

    LOL

    I see the Intel fanbois are strong around here.
