Re: To be fair...
Yeah, I doubt anyone really expected their first try to be any better than this. They have always hyped up their new GPU archs, whether integrated or discrete, and then come out with something less exciting.
As Intel's first wave of discrete Arc GPUs slowly makes its way out into the wild, the chipmaker is making clear that its latest attempt in the graphics market won't challenge the best from Nvidia or AMD. The semiconductor giant signaled this on Friday when it said one of its upcoming flagship graphics cards for desktops, the A750, …
This post has been deleted by its author
This article is entirely ridiculous, implying "knocking it out of the park" means making a card nobody actually buys because it costs more than a car, just for the internet updoots. The *060 series has long been the most popular price point in nVidia's lineup, their performance in the current gen is pretty outstanding, and they're what actually make the money. If Intel can actually provide a competing product in terms of performance and quality in their first discrete gaming GPU, they've knocked it into the stratosphere.
To be fair, I'd call it mid-range, but a generation behind. So that makes the Intel part "mostly adequate" at its best. Plus expect a ton of compatibility/driver issues.
But it will squeeze AMD on the "we need something better than Iris graphics" front for all the biz machines out there. So it's probably playing its part keeping the status quo of the duopoly intact. Also, it will probably run Windows 11, if for some reason you want to. So there's that at least.
If this came out in January or February, it would have been a hit even with driver problems. But it didn't. It is coming out now when both Nvidia and AMD are primed to release new models soon. Instead of competing against current generation, it will soon be competing against last generation.
Because I couldn't get a video card at MSRP last year or earlier this year, I decided to wait until both Nvidia and AMD release their next models. I might have bought this Intel GPU earlier this year, but I won't buy it now. And since I've had time to think while waiting for prices to drop, I've decided to try to get a next-generation GPU that uses about the same 150W as my RX480 -- or at least one I can downclock to 150W.
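For anyone curious, on Linux the amdgpu driver exposes a power cap through sysfs (in microwatts), so you can cap a card like the RX480 at roughly 150W without any vendor tool. A rough sketch; the card index, hwmon path, and the need for root are assumptions that vary by system:

```python
# Sketch: cap an amdgpu card's power limit via sysfs (needs root).
# The card index and hwmon path are assumptions; check your own system.
import glob

def watts_to_microwatts(watts: int) -> int:
    # amdgpu's power1_cap file expects the limit in microwatts.
    return watts * 1_000_000

def set_power_cap(watts: int, card: str = "card0") -> None:
    caps = glob.glob(f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_cap")
    if not caps:
        raise FileNotFoundError("no amdgpu power1_cap file found for " + card)
    with open(caps[0], "w") as f:
        f.write(str(watts_to_microwatts(watts)))

# set_power_cap(150)  # would cap the card at 150 W
```

The cap resets on reboot, so you'd want it in a udev rule or startup script to make it stick.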
Still, a third player is always welcome. I wish Intel much success in the future. I hope they can compete with Nvidia and AMD one day, so that neither can price gouge us.
Can't recall having ever bought a video card since the days of VGA, but then, I'm not a gamer. Whatever the second-hand business laptop comes with seems to work for me...
(Though I have an interesting problem with one laptop's video: it works on the laptop screen *only* until the user logs in; thereafter I can only get video via the HDMI port. The display management shows the laptop screen but doesn't do anything useful with it. Using a bootable ISO on a USB stick, things appear to work. Using Cinnamon it fails; using Xfce it works. I conclude something weird has happened to the video acceleration...)
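One quick way to test that hunch: compare the reported OpenGL renderer in the working and failing sessions. When a session falls back to software rendering, glxinfo reports Mesa's llvmpipe rasterizer instead of the real GPU. A small sketch, assuming glxinfo (from mesa-utils) is installed:

```python
# Sketch: check whether the current desktop session is using software
# rendering. Assumes glxinfo (mesa-utils) is installed; run it once from
# the working session (Xfce) and once from the failing one (Cinnamon).
import subprocess

def renderer_string() -> str:
    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if line.startswith("OpenGL renderer string"):
            return line.split(":", 1)[1].strip()
    return "unknown"

def is_software_rendered(renderer: str) -> bool:
    # Mesa's software rasterizers show up as llvmpipe or softpipe.
    return "llvmpipe" in renderer.lower() or "softpipe" in renderer.lower()

# print(renderer_string())  # real GPU name if accelerated, llvmpipe if not
```

If the failing session shows llvmpipe, that would fit the pattern: Cinnamon requires hardware acceleration, while Xfce happily runs without it.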
Prices are a bit lower than that now, since the recent price drop (largely fuelled by the falling Ether price, rather than any increased supply). However, that is the low-end price, when the mid-range cards are in the £600-£700 range, and the high-end ones will set you back well over a grand.
Intel's problems with GPUs run deep and appear to be as much about culture as technology. Unlike the Skull Trail team, which gets it, the graphics team is culturally stuck in middle-of-the-roadism. Some of the Skull Trail stuff is overhyped trash, but it is internally aspirational to kick ass. Sometimes it gets there, sometimes it's a miss. But what do they say about the swings you don't take, right?
For years Intel has been the main pusher of defective shared-memory graphics chipsets. A tragic crapfest that has crippled many a value-conscious laptop or SFF PC. Worse, tons of parts for desktop PCs had the same class of graphics as an obligatory pack-in, leading to years of systems that shipped with a misconfigured BIOS, with dedicated graphics hardware that wasn't actually used because the on-die GPU was set as the default.
While it may take a while for a new entrant to get in the game, I'd feel less like this is just history repeating if they at least showed they were trying to swing for the fences, not aspiring to bunt at every pitch. This feels like little-league ambition driving t-ball engineering. Even if it's a few years out, I want to see them shooting for where AMD or NVIDIA will be a year from now, not where they were 5 years ago.
When someone calls a $400 card "entry level".
Feels like, if the price of something has only three digits, it's mid range at the very best.
A FTW3080 still costs about 40,000 yen more than it did when I bought mine in October of 2020. (weak Yen is not helping, I'll admit).
Same goes for phones. In Japan we pay the same for a basic iPhone 13 Pro Max 256GB model as we do for the MacBook Pro M2 16GB/512.
Still, competition is generally a good thing and I'm very happy that we have Teams Red, Green and now Blue in the GFX Card space.
The addition of a third GPU vendor will also be a welcome sight after the semiconductor industry experienced more than two years of shortages.
This completely ignores the vast market for GPUs in mobile phones, with Arm, Qualcomm and others, including Apple, developing extremely potent GPUs. Just because these are not available as discrete chips doesn't mean they're not affecting the markets for both gaming and machine learning.
The fact that those are only available integrated into an Arm chip does mean they're not affecting the gaming market.
Arm is basically non existent in desktop/laptop gaming.
macOS gaming is long dead and buried because Apple refused to update OpenGL or implement Vulkan, so M1/M2 isn't even a rounding error.
Arm is tiny in serverland (and so ML)
You moved the goalposts: from "gaming" in general to "desktop/laptop" gaming.
Mobile gaming is bigger than desktop/laptop/console gaming and growing faster, even if individual titles have smaller budgets. Mobile phones might have physically smaller screens, but they often push as many pixels as desktops, and do this on a battery.
I wouldn't criticize Intel for not playing at the top of the range. The range that surrounds the 3060 may not be top, but it's still very much interesting for many gamers. Everything depends on the price. If they can compete there, it doesn't really matter that they can't (for now) make anything that runs the very latest games at 120 fps and 4K.
Given that most gamers don't have 120Hz monitors, and your average user is playing games on a 1080p 60Hz monitor, chucking out any more frames than this is pretty moot.
The pair of 1440p monitors I use can be forced to run at 75Hz, and I have a graphics card that can make them do so, and supply that frame rate at full settings on most games. OK, so it's a few steps up from an RTX 3060 (it's a 3060 Ti), but for your average user, who isn't going to splash out on an expensive high frame-rate 4K monitor, Intel aren't going to care. Those who can afford the top-end gaming gear will be buying the top-end graphics cards as well, and then wondering why they've got a fan heater on their desk...
I would imagine the market is very much a Poisson distribution, with a relatively small number of high-end users who can both afford and want the expensive stuff. Intel want to be in the middle of that curve, where all the sales are. Given that the RTX 3060 is the most commonly used graphics card, that is very much the sensible market segment to try to take a bite from.
Presumably these are being made in the same oversubscribed chip fabs as everything else. The pricing probably reflects a combination of bad timing, Intel tax and development overhead. 2nd and 3rd gen cards will lose some of that penalty.
Of course, if 3060s or equivalents are readily available, the Intel cards will sit on the shelf gathering dust waiting for a stock liquidation. This probably doesn't matter to Intel for the first or second gen cards. As with music, the first album might get a band noticed; it's the third album that will determine if it survives.
The reason why Intel Celeron, despite being one of the worst processor lines, sells so damn well is that it's cheap and produced in bulk. It's also why the vast majority of laptops come with it and whatever shit integrated graphics Intel threw together at the last second.
Now, graphics cards like these have their uses both for gaming and more serious work, such as render farms, 3D modellers and various CAD users. For the serious work, Intel is possibly in a better position than Nvidia to make bulk sales of "work ready" stations, something I don't know if AMD already does and, if they do, how profitable it ends up being.
The review I’ve seen says that it suffers from fairly noticeable glitches in frame rate, and it NEEDS a CPU able to handle resizable BAR. That, and older games (using DX11 or prior) perform poorly in comparison. Intel still seem to have a way to go with their drivers.
... which people seem to be missing.
And seeing as I'm still running an AMD 580, a 3060 competitor at the expected $325 USD (approx.) price point really isn't that bad (I paid $550 USD for my "renewed" 580).
And competition is a "good" thing.
Isn't it?
I'm not a "gamer". And I use Linux. So as long as it works with Linux, I'm good.
And yes, yes, I am going to wait a few months and see if/how well it does.
Then there's also the expected 770 at $399 USD.
So you know...