Intel debuts Arc discrete GPUs for laptops

Intel hopes to compete against Nvidia and AMD in the discrete GPU market with its newly launched standalone graphics chips for laptops. These components are said to feature a slew of technologies designed to provide smoother gaming experiences and faster content-creation performance. The chipmaker kicked off its Intel Arc A-Series …

  1. TM™

    Seems weird that, when AMD and the game consoles have demonstrated that APUs can produce (consumer) state-of-the-art 3D graphics, Intel would go in the opposite direction. If anything needs the space-, power- and heat-saving qualities of an APU, it's a laptop. The only reason I look for a laptop with discrete graphics is that I need to program in Nvidia's proprietary CUDA platform. Then I look for the lowest-power dGPU variant. I'd much rather be carrying around a ThinkPad X1 Carbon than a ThinkPad X1 Extreme (not that I can afford either).

    Yes, I know everyone is different, just talking about the average laptop use case - a computer you carry around everywhere with you and use away from a power source. Not talking about luggable workstations.

  2. DS999 Silver badge

    Those are supposed to be mobile GPUs?

    With that much power going to just the GPU? Those high end ones will throttle to hell except in some nightmarish laptop that allows its fans to drown out a 747's main engines.

    I'm old enough to remember when 120 to 150 watts was considered slightly insane for a desktop GPU. And when drawing over 100 watts for everything including the display was only available in 10 lb DTRs used by people who had to run CAD software on the go.

    1. Anonymous Coward

      Re: Those are supposed to be mobile GPUs?

      Have you looked at the power requirements for Nvidia's 20- and 30-series laptop graphics?

  3. Pascal Monett Silver badge

    "This is sort of the future of rendering as you know"

    Of course, Intel.

    Nvidia and AMD have been slugging it out over video performance for 3 decades, but you are the future.

    Yeah, sure.

    Meanwhile, Nvidia's RTX 3080 draws 320 W of power, has 8704 "stream processors" (damned if I know what that is), and can display any game on a 3840 x 2160 screen without trouble.

    I'm left wondering if your piddly little 150W part is good enough to play Minecraft on.

    But hey, competition is a Good Thing™, so you go for it. Who knows, maybe 32 ray-tracing units is the future?

    1. quxinot

      Re: "This is sort of the future of rendering as you know"

      I am so glad to know that I'm not the only one with an immediately cynical response to this.

      It's Intel. It'll cost drastically more than any other option, or it will perform like trash. That's held true for their CPUs, SSDs, and motherboards--I see little hope in expecting otherwise.

      1. ArrZarr Silver badge

        Re: "This is sort of the future of rendering as you know"

        I'm almost curious to see what the price will be if the Intel cards are priced higher than their Nvidia counterparts.

    2. Dave 126 Silver badge

      Re: "This is sort of the future of rendering as you know"

      >"This is sort of the future of rendering as you know"

      Read it again in context, Pascal. They didn't say they [Intel] were the future of rendering, but that some sort of ML upscaling [already implemented in one form or another by AMD and Nvidia] was [a part of] the future of rendering.

    3. Boothy

      Re: "This is sort of the future of rendering as you know"

      Quote: "Meanwhile, Nvidia's RTX 3080 draws 320 W of power, has 8704 "stream processors" (damned if I know what that is), and can display any game on a 3840 x 2160 screen without trouble.

      I'm left wondering if your piddly little 150W part is good enough to play Minecraft on."

      Bear in mind these are the mobile parts. So you need to compare against the mobile 3080, not the desktop version.

      The mobile RTX 3080 draws 80W to 150W (depending on configuration and laptop), only has 6144 "stream processors", and runs at a max boost of around 1,710MHz. (The current top mobile 3080 Ti pulls around 175W.)

      The top-end Intel A770M draws 120 to 150W and clocks at 1,650MHz.

      So, at least in clock speed and power, it's very close to matching Nvidia's mobile 3080.

      Obviously we don't know how Intel's new Xe cores compare with Nvidia's stream processors yet, but I'm sure we'll find out once outlets like Hardware Unboxed and Gamers Nexus get hold of one or more of these to actually test.

  4. Binraider Silver badge

    I'm quietly hopeful for Intel on this one. Intel driver support on alternative OSes has generally been good, and competition will help eat into the duopoly's price gouging.

    The onboard graphics on Skylake were, when it launched, actually quite reasonable; they could run Elite Dangerous playably.

    I might not buy one myself, but anything to spread out demand is a good thing.

    1. Fursty Ferret

      Quite reasonable in comparison to the existing Intel graphics (UHD 620), which can't draw a (Windows or Linux) desktop without stuttering. The latest Intel Xe is roughly comparable to the GeForce MX150, Nvidia's cheapest, nastiest low-power discrete GPU from nearly six years ago.

      1. Tom 38

        UHD 630 here, happily driving two 4K@60Hz screens and a 1080p screen on a Linux desktop.

  5. knarf

    By the time Intel pulls its finger out it will be too late.

    They had a chance and they blew it; they took too long to get to market.

    Six to 12 months ago, as they promised, they could have made an impact and gained some customer loyalty and buy-in, but it's too late: their offerings are lackluster at best, and far too late for a market that is now swamped with overstock.

    The folk that would have purchased a GPU for 2-3 times what it is worth already have a GPU, or have waited out the shortage.

    Hopefully those scalping &£&&^& &^&^&:@ gits have lots of very overpriced stock they can't sell and make massive losses.

  6. VoiceOfTruth Silver badge

    I've never had much faith in Intel graphics

    -> Intel hopes to compete against Nvidia and AMD in the discrete GPU market with now-launched standalone graphics chips for laptops

    Yeah. One of my requirements when buying a laptop (I've been looking today so am up to date) is NOT having Intel graphics. I would prefer Nvidia but will consider AMD. But Intel Iris or Intel anything for graphics is a deal breaker. Every time.
