Probably because their integrated graphics have been such poor performers, they have a mountain of bad image to overcome before any of the target market for add-in cards would consider them.
Two years after entering the graphics card game, Intel has nothing to show for it
Add-in board (AIB) market share figures for Q2 2024 are out and despite an uptick in overall sector shipments, relatively recent entrant Intel registered at zero percent. The data compiled by Jon Peddie Research (JPR) reveals a significant surge in global AIB volumes, up 47.9 percent year-on-year to 9.5 million units and up 9. …
COMMENTS
-
-
Wednesday 2nd October 2024 15:07 GMT zimzam
The fact that it was designed to only accelerate DX12 games and had to emulate DX11 and older was probably the biggest killer. They assumed that PC gamers only play the latest and greatest live service games. Weird, I don't know anyone else in the industry who... sorry I just spat over my monitor.
-
Wednesday 2nd October 2024 16:36 GMT Pascal Monett
I really don't care about the details. In my experience, Intel graphics has only ever been good enough to get a computer running until I can slot in a true graphics card (Nvidia or AMD, given the moment).
If Intel thinks that putting its "graphics engine" on an add-in card is going to impress me, I'll put it right in the bin with the Matrox 3D card that I wasted money on way back when.
-
Wednesday 2nd October 2024 20:05 GMT Marcelo Rodrigues
"I really don't care about the details. In my experience, Intel graphics has only ever been good enough to get a computer running until I can slot in a true graphics card (Nvidia or AMD, given the moment)."
These first-generation AIB cards from Intel weren't bad. No, they couldn't match the best AMD had to offer (never mind Nvidia) - but they weren't bad entry cards. At the very beginning they had a ton of driver and optimization problems - but they did their homework, and this isn't a problem anymore.
If I remember correctly, their Arc A770 would be about the same level as an RTX 3070, maybe a 3060 Ti. Far from stellar, but not bad per se either. After all, I'm running an RTX 3060 (non-Ti). As are about 5.86% of Steam gamers this month. And that's the most popular graphics card in the Steam survey. The second one is... the RTX 4060.
They had a good video encoder too, and supported AV1 hardware encoding when no one else did (NVidia started hardware AV1 encoding on the 4000 series).
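For what it's worth, a reasonably recent FFmpeg build exposes the Arc encoder through Quick Sync as av1_qsv (on the command line that's just -c:v av1_qsv). As a rough sketch - assuming an FFmpeg built with QSV support; this little probe is mine, not anything official - you can list the AV1 encoders your build actually has, though whether av1_qsv then works still depends on the GPU and driver:

    // build: g++ av1check.cpp $(pkg-config --cflags --libs libavcodec)
    extern "C" {
    #include <libavcodec/avcodec.h>
    }
    #include <cstdio>

    int main() {
        void *it = nullptr;
        const AVCodec *c;
        // Walk every codec FFmpeg knows about and keep the AV1 encoders
        // (av1_qsv is the Quick Sync one an Arc card would use).
        while ((c = av_codec_iterate(&it)) != nullptr) {
            if (av_codec_is_encoder(c) && c->id == AV_CODEC_ID_AV1)
                std::printf("AV1 encoder: %s (%s)\n", c->name, c->long_name);
        }
        return 0;
    }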
To me the reality is simple: it was a first try, people were scared and money was short, and they had a lot of teething problems. And although "everybody" buys something around the 4060/4070 series, reviewers praise the 4090 monsters. This steers people away from what they would realistically buy. Who cares if the 4090 is (say) 3x faster than the 7900 XTX? I won't buy either! What matters to me (and to the vast majority of buyers) is which is better: the RTX 4070 or the 7700 XT?
-
-
Wednesday 2nd October 2024 21:03 GMT williamyf
Your statement is false.
Intel emulated DX9 only.
There is NATIVE support for OpenGL, Vulkan, DX12 and, crucially, DX11 too.
I also play older games, but you are lucky I was not the project manager for Alchemist. I'd have focused on DX12 and Vulkan ALONE, handled OpenGL via Zink, and handled DX9/11 via BOTH Microsoft's emulators (D3D9On12 and D3D11On12) AND DXVK (on a per-game basis, whichever works best - see the sketch below).
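Purely to illustrate the "per game, whichever works best" bit - this is a made-up sketch, not how any real driver does it - the idea is just a profile table mapping game executables to whichever translation path benchmarks best, with a sane default for everything else:

    // Illustrative only - the game names and entries are invented for the example.
    #include <string>
    #include <unordered_map>

    enum class Layer { D3D9On12, D3D11On12, DXVK, Zink };

    // Pick the translation layer for a given pre-DX12/OpenGL game executable.
    Layer layer_for(const std::string& exe) {
        // Hypothetical per-title profiles, filled in from QA benchmarking.
        static const std::unordered_map<std::string, Layer> profiles = {
            {"OldRacer.exe",  Layer::DXVK},       // DX9 title that happens to run best on DXVK
            {"StrategyX.exe", Layer::D3D9On12},   // DX9 title that prefers Microsoft's layer
            {"ShooterY.exe",  Layer::D3D11On12},  // DX11 title routed through 11on12
            {"GLGame.exe",    Layer::Zink},       // OpenGL title run over Vulkan via Zink
        };
        auto it = profiles.find(exe);
        // Unprofiled titles fall back to DXVK; DX12/Vulkan titles never consult this table.
        return it != profiles.end() ? it->second : Layer::DXVK;
    }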
-
Thursday 3rd October 2024 10:08 GMT SVD_NL
Exactly. DX11 was released in late 2009 alongside Windows 7; of course adoption in games took a couple of years, but in reality this card natively supports a DX version that's over a decade old!
If I want to play some older DX7/8 games on my 3080 Ti, I also need to use dgVoodoo or a similar emulation layer. Technically the games run, but it can be a rather painful experience without dgVoodoo or unofficial patches (which often implement DX emulation).
I think having proper emulation and a compatibility layer in place will beat natively supporting an older API in the long run.
-
Friday 4th October 2024 11:50 GMT Groo The Wanderer
All I know is that only the most ancient of my games, going back to the early 90s, won't run any more under Windows 11 with an Nvidia RTX 4070 Ti. What I'm currently learning is which of my favourites run well under Linux via Steam's Proton Experimental support. So far I'm impressed; Experimental can handle a good 90% of what I've thrown at it!
-
-
-
-
-
Wednesday 2nd October 2024 19:53 GMT Yankee Doodle Doofus
Re: their integrated graphics
I have an Intel NUC with an 11th-gen i5, and under Linux, at least, the integrated graphics are sufficient to run Grand Theft Auto 5 at between 40 and 45 frames per second at 1080p with default settings. Yes, that's a nearly decade-old game, but I was still surprised it was able to pull this off.
-
Wednesday 2nd October 2024 21:05 GMT Sandtitz
Re: their integrated graphics
"For 30 years the Intel integrated graphics have been poor performance and dodgy drivers. Maybe in the last 6 or 8 years or so OK for basic laptops running wordprocessing."
Nonsense. The first IGP, i810, was released in 1999, 25 years ago. It and its successors have been perfectly fine for regular 2D office work or web browsing and such, and I can't remember any great hassle due to driver bugs.
-
Thursday 3rd October 2024 10:11 GMT SVD_NL
Re: their integrated graphics
Exactly! Whenever I need to debug graphics card driver issues, it's a godsend to have integrated graphics, because it always works!
The only issue I have with Intel integrated graphics drivers recently is when Windows Update decides to update the driver in the background and a restart is needed, but is that really Intel's fault?
-
Tuesday 8th October 2024 10:11 GMT Zippy´s Sausage Factory
Re: their integrated graphics
For office work and web browsing, maybe. But they're trying to target games.
And drivers for Intel graphics have always been a nightmare. You download them from Intel and they say "these won't install, go get them from your vendor". But they won't tell you who the "vendor" is, and if you've no idea which Chinese OEM was used for the specific chipset on your motherboard, you're stuck with Windows' default drivers.
Honestly, the main reason they don't sell is, imo, their reputation. Which is awful.
-
-
Thursday 3rd October 2024 11:40 GMT GraXXoR
Re: their integrated graphics
I have a 2015 MacBook Air that I bought last month. It has a dual-core i5 and a pokey integrated GPU that is somehow enough to run Stellaris into the late game.
Is it Nvidia? No, but it's fine for casual gaming and office tasks. Not sure where you're getting your "dodgy drivers" opinion from; it's really not true.
-
-
Thursday 3rd October 2024 13:24 GMT Snake
RE: two years, nothing to show, poor performers
Maybe you are correct but you are missing the point [I tried to make in the Intel AI comment section]:
nobody (reasonably) expects the first product from any company to be a 'home run' in terms of hitting all targeted market points. Intel's problem isn't that their first card wasn't a knockout punch to the industry; Intel's problem is that they haven't built upon that base and released better cards. The market is waiting for an alternative to Nvidia and AMD, but Intel is just sitting on its hands.
But now Intel wants to push AI instead, and since everyone else's AI compute power is built upon their GPU architectures, Intel is behind the 8-ball but trying to push through with promises and hopeful dreams.
In the GPU field, Intel's problem seems to be one of commitment, not ability - it is reasonable to assume that Intel has the ability to make more powerful GPU cards but simply hasn't done so, letting opportunity slip from their grasp. Nobody will sit around waiting and hoping for some future time when Intel decides to actually make power moves in the GPU market, so Intel gets left behind.
There is still promise in Intel's GPU abilities...if they ever decide to get off their buttocks and actually be aggressive in trying to attain any real market share.
-
Thursday 3rd October 2024 15:57 GMT Yet Another Anonymous coward
Re: RE: two years, nothing to show, poor performers
>In the GPU field, Intel's problem seems to be one of commitment, not ability -
That was always Intel's problem.
They were an early backer of OpenCL but never actually supported it to the point where it was usable as an alternative to CUDA.
They were/are the monopolist in CPUs but can't seem to offer any sort of commitment to anything else.
-
Thursday 3rd October 2024 21:36 GMT Anonymous Coward
A different way of looking at why they failed.
This was foreseeable and expected. While a first-gen product isn't likely to realize the full potential of its architecture, the curse of Intel graphics is that they have ALWAYS strived to be just barely good enough. So when they fall short of their mark, they are a useless annoyance.
Intel couldn't have executed on a successful design regardless; their fab situation would have killed any project, no matter how ambitious. But even if the fabs were firing on all cylinders, they would have failed, because they launched a product that only matched the back of the pack as it stood when they started. AMD and Nvidia, on the back of TSMC, were locked in an all-out engineering war. Intel couldn't even compete on price, probably even dumping product at a loss. There were probably somewhere between a couple and a dozen largish accounts with workloads their architecture favored, and Intel hoped it could grow from there. So their lack of ambition meant public ridicule, loss of investor and managerial enthusiasm, and starting over from scratch to try to hammer out a viable product.
If instead, when they started the design, they had been shooting for the next year's Nvidia 60-series as their hopeful performance target, then even if they were late or underperformed a little they might have survived to refine the design.
-
-
-
Wednesday 2nd October 2024 14:32 GMT elsergiovolador
Approach
Problem with Intel is their approach to design.
It starts somewhat like this:
We have a 50 sq ft room at 15C. The PSU can give 300W of power. Create a rectangle that will heat the room to 25C in an hour and disguise it as a GPU.
You are free to use past designs if you relabel them.
Now someone asks: why bother with the GPU thing and not just build a heater?
Rumour is that Intel has been banned from all heating magazines, so they had to be creative.
-
Wednesday 2nd October 2024 16:17 GMT sarusa
Their graphics are worse than my 8-year-old Nvidia
I just bought a new laptop to replace my 8-year-old laptop - it's for productivity, not games, so I didn't really pay attention to the graphics, just all the other specs (64 GB RAM!). And of course the CPU is much, much faster. Turns out it has Intel Arc graphics.
So just out of curiosity I put some games on it, and... it actually runs them worse than the Nvidia mobile card in the 8-year-old laptop (that one had a discrete mobile part). CPU-bound games are of course much faster now, but anything that's fill/poly-constrained is slower than on the old laptop!
I'm still quite happy with it for what I bought it for, but if you actually care about graphics I would never recommend Intel graphics to anyone.
-
Thursday 3rd October 2024 05:35 GMT cherries
Suspicious of your comment...
Is the CPU in your laptop a Core Ultra/Meteor Lake, or is it just Alder Lake/Raptor Lake (a.k.a. normal 12th/13th/14th gen)? Because the graphics on the latter are actually far slower, while the graphics on the former are about equivalent to a GTX 1650 from Nvidia.
-
-
Wednesday 2nd October 2024 16:23 GMT mostly average
I have one
Arc A750. It's good for games but not much else, because pretty much everything else requires CUDA. SYCL is fantastic, if the software actually acknowledges its existence. You have to compile everything from source because no one distributes binaries with anything but CUDA support. If you're lucky it's just a compiler flag; otherwise you have to switch out libraries, modify code and debug. It's all a headache carefully managed by Nvidia. Intel does try really hard to make it easy to switch to SYCL, but everyone uses CUDA because everyone uses CUDA. Suffice it to say I have buyer's remorse. Not that I could possibly afford anything supporting CUDA anyway. It's definitely better than nothing. Usually.
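To be fair, plain SYCL itself isn't the painful part. As a rough sketch (assuming the oneAPI DPC++ compiler and a working Level Zero or OpenCL runtime for the card), a minimal vector add looks something like this:

    // build: icpx -fsycl vadd.cpp -o vadd
    #include <sycl/sycl.hpp>
    #include <iostream>
    #include <vector>

    int main() {
        constexpr size_t n = 1024;
        std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

        // gpu_selector_v should land on the Arc card if the runtime sees it.
        sycl::queue q{sycl::gpu_selector_v};
        std::cout << "Running on: "
                  << q.get_device().get_info<sycl::info::device::name>() << "\n";

        {
            sycl::buffer<float> ba(a.data(), sycl::range<1>(n));
            sycl::buffer<float> bb(b.data(), sycl::range<1>(n));
            sycl::buffer<float> bc(c.data(), sycl::range<1>(n));

            q.submit([&](sycl::handler& h) {
                sycl::accessor xa{ba, h, sycl::read_only};
                sycl::accessor xb{bb, h, sycl::read_only};
                sycl::accessor xc{bc, h, sycl::write_only, sycl::no_init};
                h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                    xc[i] = xa[i] + xb[i];
                });
            });
        } // buffers go out of scope here, so results are copied back into c

        std::cout << "c[0] = " << c[0] << "\n"; // expect 3
    }

The headache is everything around it: the CUDA-only dependencies and build systems, exactly as described above.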
-
Thursday 3rd October 2024 16:06 GMT Yet Another Anonymous coward
Re: I have one
They had an opportunity with OpenCL.
But nobody in the consortium really committed to it. They either secretly pushed their own proprietary bare-metal architecture to give themselves a commercial advantage, or treated it as a box-ticking exercise to meet some purchasing requirement.
Whatever you think of Nvidia (and CUDA), they did bet the company on GPU compute - before AI was a thing - and it paid off.
-
Wednesday 2nd October 2024 18:10 GMT karlkarl
> Two years after entering the graphics card game, Intel has nothing to show for it
The fact that it is instantly as high as 3rd place in the dedicated GPU space for Linux and Windows is certainly a good thing to show.
Who is going to steal its position exactly? It was an easy win for Intel to get on the leaderboard and earn some easy money.
-
Wednesday 2nd October 2024 18:23 GMT BinkyTheMagicPaperclip
Definitely unfair to say they've nothing to show for it
They've got a series of GPUs that are extremely price-competitive, even against AMD (who refuse to seriously compete with Nvidia), and offer decent performance provided you're running recent software. There are open source drivers, even if their quality is inconsistent.
Intel aren't ploughing massive amounts of money into it, so the drivers target DirectX 12 and Vulkan, and emulate older APIs, which is quite sensible.
Sadly, like many others, my use case hits the precise areas Arc is awful at: older games, VR, and non-Windows/Linux systems. Also, given the spotty quality of games, continual video card driver updates are required for optimal performance, and that requires a sizeable and well-funded driver team.
-
Wednesday 2nd October 2024 19:04 GMT Like a badger
Re: Definitely unfair to say they've nothing to show for it
I concur. I doubt Intel expected much within a few years, and what they have delivered from a very low baseline is pretty impressive, even if it isn't wowing the buyers. They knew they'd not produce either a technology leader or a value leader in two years.
The upcoming Battlemage cards could be very interesting, and might produce a value proposition. That of course assumes that the people running Intel don't simply cancel the graphics programme outright to try and please Wall Street. As current news elsewhere shows, Big Tech are pretty happy to cancel programmes they invested huge amounts in.
-
-
-
Thursday 3rd October 2024 15:10 GMT collinsl
Re: LOL, no surprise here
And yet by PC market share I'd wager they're over 90% of the total, because their built-in GPUs are in pretty much every office machine in the world (AMD not having much of a share of the corporate space).
Most laptops and most desktops for corporate functions run Intel GPUs.
-
-
Thursday 3rd October 2024 19:48 GMT Sandtitz
Re: LOL, no surprise here
"My experience is first hand on literally tens of thousands of corporate PCs. Intel graphics is an effing nightmare POS."
Well, my experience is also first hand on literally tens of thousands of corporate PCs. Intel graphics are quite harmless.
Care to tell what your problem with Intel graphics is? Please be specific.
-
-
-
-
Thursday 3rd October 2024 01:37 GMT CGBS
It's not Intel's fault
Hey, Pat Gelsinger had that entire department occupied every morning from 8-12 for Bible Study classes. Don't blame the Xe team for the leadership being too busy preaching, offshoring more jobs, and selling off parts of the company like it was a hunk of shawarma meat on some street vendor cart in the Middle East all while telling people about what it was like at Intel in the 1950s. Also, Raja. Where is that little grifter these days?
-
Thursday 3rd October 2024 05:35 GMT cherries
The comment section is quite full of.... weird people.
Like, some people in here don't seem to know that Intel has moved on from the era of awful integrated graphics or something. It's been nearly a full year since the launch of Meteor Lake, and another year or so before that (I don't remember the exact launch dates) since the launch of the Arc dedicated GPUs, so how can they think Intel graphics are still "awful"?
-
Thursday 3rd October 2024 06:51 GMT Sorry that handle is already taken.
Re: The comment section is quite full of.... weird people.
Maybe there are fewer gamers on The Register than I thought...
Although even the current top-end Arc GPUs really only compete with AMD's and Nvidia's entry-level AIBs. And the driver issues have been severe at times*, which didn't do their reputation much good.
* They might well have sorted that out but I haven't looked into it
-
Thursday 3rd October 2024 07:17 GMT Anonymous Coward
Re: The comment section is quite full of.... weird people.
True, the drivers were really poor at launch. They have got a lot better since. Failure to cater for old stuff, though, is a problem for the many users with 15-year Steam catalogues.
I appreciate there being competition to try and keep AMD and Nvidia in check, though it's not strong enough to be doing that just yet.
-
-
Thursday 3rd October 2024 18:03 GMT Ilgaz
Re: The comment section is quite full of.... weird people.
The image matters too. Not the image on screen or the OSS drivers: the Intel graphics image. Once that kid says "never mind, their stuff doesn't perform", it is over.
For example, by the time everyone had given up hope on the 68K architecture, Motorola had parts that could even be compared to the Pentium. Nobody except Amiga or Atari fans cared, because it had been decided that the i486 "won" and Motorola gave up on 68K for computers.
-
Thursday 3rd October 2024 21:44 GMT Anonymous Coward
Bad is better than awful, no good.
The best they produced wasn't really even mediocre. So yeah, it may be better than the system-crippling built-ins with shared memory from days gone by, but they are uninspiring even for desktop apps and web content.
As Windows insists on operating in 3D mode most of the time now, the gulf between an Arc and even a cheap AMD card from a couple of years ago is jarring.
-
-
Thursday 3rd October 2024 10:00 GMT naive
They should have designed an integrated and cost-effective gaming solution with Arc
The only way to get market share in a space controlled by two dominating parties is to offer something both of them do not have.
It could be price/performance ratio or factors like energy use.
Intel failed to achieve any of these; they ended up, performance-wise, at the lower end of the Nvidia/AMD offerings without being significantly cheaper.
The gaming/enthusiast PC market is something on its own. Everyone believes one needs rigs exceeding $2000, with 400W CPUs and graphics cards, to run the latest and greatest.
As a company, Intel would have been able to create a great gaming rig costing well under $1000 that would give more people access to high-end games.
A strong, price-optimized gaming rig, with a simple but fast four-core CPU and integrated Arc, would have great prospects, since not everyone is able to fork out four digits for a gaming PC. Something like a gaming NUC achieving Nvidia 4060-level performance would be a big hit.
The future will probably be Arm anyway; who wants Intel CPUs guzzling endless power, with dozens of idle P/E cores nobody needs in games?
-
Thursday 3rd October 2024 14:57 GMT BinkyTheMagicPaperclip
Re: They should have designed an integrated and cost-effective gaming solution with Arc
Why should Intel do that? It'd be gutting their own market.
There's something to be said for creating a sensibly priced GPU, and the Arc series *were* sensibly priced in terms of price/performance, at least until recently: trading blows with higher-end Nvidia cards, supplying ray tracing that worked better than AMD's, and costing under 300 pounds, which isn't exactly cheap but was cheaper than the Nvidia/AMD alternatives.
There are names for a 'great gaming rig under $1000 that allows people access to high-end games': they're 'PlayStation 5' and 'Xbox Series X'.
-
-
Thursday 3rd October 2024 15:28 GMT Steve Kerr
Late to the party - I've got an Intel Arc A770 with 16GB RAM.
For what I'm using it for (mostly Eve) it runs fine - two clients at 1920x1200, which is what my monitors are.
Is it a world beater? Far from it. Is it flaky? It certainly was in the beginning, but it's much better now - pretty much OK.
Price-wise it was cheap compared to the others on the market in the same space, and I balked at paying 2x-3x the price for what was supposedly similar performance.
So, for the moment, for the price I paid, it does the job I want and for me has been pretty good value!
One day I will probably go back to Nvidia or AMD but like I said, it does what I need for the moment.
Doesn't help that it's in a PC with an AMD CPU, so that was quite entertaining in the beginning.
-
Saturday 5th October 2024 18:55 GMT Anonymous Coward
I would seriously have considered one too; however, the software just wasn't quite there at the time I needed to swap cards.
Intel were always going to have a rough time with the first- or even second-gen cards. If they can put out a usable competitor to 3060/7600-class cards at a price point that isn't stupid, they will be onto a winner. But crucially, timing: the product needs to be there at the point of the consumer refresh cycle - which, based on a sample size of not many, is probably about every five years, barring failures or a feature deficit being a blocker.
-
-
Thursday 3rd October 2024 23:03 GMT PenfoldUK
A tangential question
One thing that always puzzles me about graphics drivers is why they have to be updated so often, apparently even for individual games.
I can get that with things like DLSS a game-by-game approach must be taken. But driver updates seem to be required even when rendering natively.
Is this a case of newer games exposing bugs in existing drivers, which then need updating?
Or are drivers being altered to accommodate individual games, and if so, why?