AI PC hype seems to be making PCs better – in hardware terms, at least

What is an AI PC? What is its killer app? Both are open questions, although we do know that running AI models locally – not over the network like when you're playing with ChatGPT – requires significant grunt on your desk or lap. It seems certain that to run AI models well, PCs must either become more powerful or AI models …

  1. tiggity Silver badge

    Do I need "AI" on my computer

    For what use?

    To offer to "improve" a photograph I have taken of relatives? If Aunty Sue wants "AI" to conceal her grey hairs or remove some wrinkles / lbs, that's her choice & she can do it (or get someone else to) - I'm just going to be uploading and then emailing photos I took at a family meetup. I'm not a "tweaker" of photos (Cromwell - warts and all - approach here, but applied to photos instead of art).

    Maybe an "AI" will offer to "improve" my grammar, spelling, writing style etc when I type a document. My writing has lots of flaws but I would sooner have a letter of mine where the recipient can tell it was written by me rather than it having a soulless, homogenized "recommended style".

    Do I want Copilot code suggestions to improve my dev work? Currently no: I have played with it and it can suggest some very iffy looking code (TBF, occasionally OKish looking code too) & critically you have no idea what licence implications (if any) apply to the code fragments it suggests, which is the deal breaker (commercial software lawyers are risk averse and really like to know there are no licence issues, as they live in fear of licensed code sneaking in & that then forcing the reveal of proprietary money-making code).

    1. Anonymous Coward
      Anonymous Coward

      Re: Do I need "AI" on my computer

      This question is irrelevant. UNFORTUNATELY.

  2. Neil Barnes Silver badge

    to make it an enjoyable experience

    Is it possible to have an enjoyable AI experience?

    1. bufferDuffer

      Re: to make it an enjoyable experience

      <cough> asking for a friend...<cough>

    2. NLCSGRV

      Re: to make it an enjoyable experience

      Share and enjoy!

      1. The Oncoming Scorn Silver badge
        Thumb Up

        Re: to make it an enjoyable experience

        Go stick your head in a pig!

  3. Doctor Syntax Silver badge

    Over the last few years PC makers have had the problem that PCs have become good enough and upgrades were no longer driving sales - hence the W11 & TPM saga. But if they were "good enough" before, why do we suddenly "need" such a hike in performance (and power consumption)? We're in danger of finding the typical business PC grossly over-specced and over-priced for what's actually needed, because it's expected to run a love-child of Clippy and Cortana. I suppose there'll be some excellent bargains of good-enough stuff around in the near future.

    1. Pete Sdev Bronze badge
      Thumb Up

      Graphics cards are rapidly becoming larger than the rest of the system.

      It's not like electricity prices are going down either.

      For most users, MLM processing makes more sense being run on a server somewhere on-demand rather than locally.

      1. Handlebars

        Is MLM a Freudian slip?

        1. Pete Sdev Bronze badge

          Is MLM a Freudian slip?

          Nope. MLM = Machine Learning Model

          1. The Oncoming Scorn Silver badge
            Joke

            MLM

            Phew, I thought someone was trying to sell Amway.

    2. LybsterRoy Silver badge

      -- We're in danger of finding the typical business PC grossly over-specced and over-priced for what's actually needed --

      Hate to tell you but we're already there.

  4. chuckufarley Silver badge

    Reaching Critical Mass...

    ...will need more than hardware solutions. Most generative AI models a person can run on their own hardware use CUDA or OpenCL for inference and not the Tensor Cores that have been built into the latest generations of GPUs. Keeping the total power draw and the waste heat to a minimum will likely involve making sure computers are using every software and hardware optimization available. Of course, the Extra Oomph we are seeing could just be "Windows Dressing", and most users could find themselves being billed for cloud services when little Timmy asks his computer to write an essay about the dangers of plagiarism.
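
    On the optimization point, whether an inference run actually touches the tensor cores mostly comes down to how the framework is configured. A minimal sketch, assuming PyTorch on a recent NVIDIA card (the flags below are standard PyTorch settings, not anything specific to any particular model):

    import torch

    # Tensor cores need a reasonably recent GPU (compute capability 7.0+, i.e. Volta or later)
    if torch.cuda.is_available():
        major, minor = torch.cuda.get_device_capability()
        print(f"GPU compute capability: {major}.{minor}")

        # Allow TF32 so float32 matmuls/convolutions can use tensor cores (Ampere and newer)
        torch.backends.cuda.matmul.allow_tf32 = True
        torch.backends.cudnn.allow_tf32 = True

        # Half-precision inference can also use tensor cores and roughly halves memory per op
        model = torch.nn.Linear(4096, 4096).cuda().half()
        x = torch.randn(8, 4096, device="cuda", dtype=torch.float16)
        with torch.inference_mode():
            y = model(x)
        print(y.shape)
    else:
        print("No CUDA device found; inference will fall back to the CPU")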

  5. Missing Semicolon Silver badge

    What a waste of sand

    and power.

  6. DS999 Silver badge
    Facepalm

    The hype is so extreme

    I just saw an article yesterday where some Wall Street analyst was claiming that if Apple didn't introduce an "AI Mac" this year that the Mac platform would be effectively dead.

    Never mind that Apple is the only PC OEM where every model sold has "AI" built into the CPU. They are way ahead of Intel & AMD on that front. But let Microsoft talk about Windows 12 and the "AI PC" for a few weeks and suddenly AI is a must-have that every rube is going to be asking for when buying a new PC? Not that I don't expect someone like Dell to advertise their PCs as "AI ready" to sell the (no doubt upscale) models containing the upcoming Intel and AMD CPUs that add NPU functionality.

    1. anonymous boring coward Silver badge

      Re: The hype is so extreme

      I don't really care about "AI", but what is this built-in AI? Asking since I have a MacBook.

      1. Anonymous Coward
        Anonymous Coward

        Re: The hype is so extreme

        On M-series you have the neural engine cores dedicated to AI:

        https://www.apple.com/fi/newsroom/2020/11/apple-unleashes-m1/#:~:text=The%20M1%20chip%20brings%20the,15x%20faster%20machine%20learning%20performance.
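
        For anyone curious how that actually gets used: on Apple silicon the Neural Engine is normally reached through Core ML rather than addressed directly. A minimal sketch, assuming PyTorch plus Apple's coremltools package (the tiny model here is just a placeholder):

        import coremltools as ct
        import torch

        # Placeholder network; any traced PyTorch model would do here
        model = torch.nn.Sequential(torch.nn.Linear(128, 64), torch.nn.ReLU()).eval()
        example = torch.randn(1, 128)
        traced = torch.jit.trace(model, example)

        # Convert to Core ML and request the CPU + Neural Engine compute units, so
        # the runtime schedules supported ops on the ANE rather than the CPU/GPU
        mlmodel = ct.convert(
            traced,
            inputs=[ct.TensorType(name="x", shape=example.shape)],
            convert_to="mlprogram",
            compute_units=ct.ComputeUnit.CPU_AND_NE,
        )
        mlmodel.save("tiny.mlpackage")  # run later via mlmodel.predict() or from Xcode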

        1. anonymous boring coward Silver badge

          Re: The hype is so extreme

          Oh.. That's some extremely generous definition of what "AI" is.

          But thanks for clarifying where this nonsense claim comes from.

  7. anonymous boring coward Silver badge

    My gaming PC has 32GB. Don't see a reason to have any less.

    I regard 16GB as a minimum spec for a laptop, but have had to settle for less than minimum spec on my M1 (as I can't justify the steep second hand price jump).

    I do like having a lot of stuff going on at the same time. For example "Preview" is using about 3GB due to many PDF manuals being open. (And it's so sh*t at releasing memory.)

    1. DS999 Silver badge

      Apps won't release memory if there is still free memory left. If you run more stuff and use up that free memory (including a good chunk of the file cache, which is also free memory even though it isn't listed that way), then stuff will start getting paged out and Preview won't be using 3 GB. Since that's an app that views read-only documents, it could ALL be paged out without a problem, other than a brief (probably unnoticeable, if you have an SSD) pause if you click on that app and start paging through the documents.
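
      A minimal illustration of why that works, using Python's mmap (this sketches demand paging in general, not how Preview itself is implemented): pages of a read-only, file-backed mapping are loaded on first access and can be dropped and re-read at any time, so nothing ever needs to go to swap.

      import mmap
      import os

      # Write a throwaway "document" to disk
      path = "example.bin"
      with open(path, "wb") as f:
          f.write(os.urandom(16 * 1024 * 1024))  # 16 MB of data

      # Map it read-only: pages fault in lazily on first access, and because the
      # mapping is backed by the file the OS can evict them under memory pressure
      # and simply re-read them later, without writing anything to swap.
      with open(path, "rb") as f:
          with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
              first = m[0]              # touching a byte pages it in
              last = m[len(m) - 1]
              print(first, last)

      os.remove(path)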

      1. aerogems Silver badge

        Exactly. This is one of those things that a lot of people just don't seem to understand. Free RAM is wasted RAM. It's like buying a huge mansion and then cramming everything into a single room, leaving the entire rest of the house empty. What is the point? Apps should be kept in RAM as long as possible to improve performance. If you have some app that has been open for a long time, but hasn't been used recently, you can always dump that to disk if some other app needs more space, but if you can keep everything in memory that's going to be significantly faster than even NVMe SSDs.

        It's amazing how much of that old DOS upper and lower memory mentality has survived, and even been instilled in people who were born well after the death of DOS.

        1. chuckufarley Silver badge

          This is why I like keeping things on a central file server accessed with a stupidly fast network. I can access my pics or my games or some other part of my data and even though the laptop might only have 4GB of RAM the iSCSI server has 128GB plus a dedicated 256GB NVMe filesystem cache. The flash storage and the RAM are faster than my 10Gbit network and those are much faster than the SATA SSD in the laptop.
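
          A rough way to sanity-check that kind of setup is to time large sequential reads on each path. A minimal sketch with hypothetical test files on the local SSD and on the iSCSI mount (drop or bypass the page cache between runs, or a second pass will just measure RAM):

          import time

          def read_throughput(path, chunk=8 * 1024 * 1024):
              """Read a file sequentially and return MB/s."""
              total = 0
              start = time.perf_counter()
              with open(path, "rb", buffering=0) as f:
                  while True:
                      data = f.read(chunk)
                      if not data:
                          break
                      total += len(data)
              return total / (1024 * 1024) / (time.perf_counter() - start)

          # Hypothetical paths on the local SATA SSD and the iSCSI mount
          for label, path in [("local SATA SSD", "/home/testfile.bin"),
                              ("iSCSI over 10GbE", "/mnt/iscsi/testfile.bin")]:
              print(f"{label}: {read_throughput(path):.0f} MB/s")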

        2. anonymous boring coward Silver badge

          You display a very superficial understanding yourself. (Even bringing DOS into it...)

          I would be careful about saying things like "a lot of people don't seem to understand", in your case.

        3. doesnothingwell

          'Free RAM is wasted RAM.'

          Tell that to my "xdg-desktop-portal" process, which grows endlessly and after a couple of weeks will use all my memory. The PC gets really slow and is unusable for many minutes if I don't kill the process. So yeah, I want 25% free memory until all my software learns to play nice, which is mostly never. I even replaced the whole PC with an R7 5800X and got the same crap; kernel and version updates don't fix it. Linux Mint MATE 21.3, 16GB.

      2. anonymous boring coward Silver badge

        Nah, that's not paged out memory.

        Well written apps should release memory, even if there's memory left.

        Some moron spread the gospel that they shouldn't but that's just nonsense. (A misinterpretation of some more enlightened views.)

        As I recall, Firefox programmers refused to understand that they should play nicely with the rest of the system.

        And even if these apps held on to memory to re-use it, that's not what actually happens. They forget they have it and just claim more memory. A.k.a. "a memory leak".

        1. anonymous boring coward Silver badge

          P.S:

          The Activity Monitor says memory pressure is high. A lot of RAM has been compressed (at a cost) or swapped out (at a cost). But does Preview release memory? No it doesn't.

          Most of the viewed pages haven't been accessed for some time and could easily be freed up and re-loaded on demand. Not rocket science to implement.

  8. PhilipN Silver badge

    "Can't do that, Dave"

    My GPU is not powerful enough.

  9. Anonymous Coward
    Anonymous Coward

    AI is nearly as big as the discovery of electricity

    Guys, when talking about AI PCs you need to think beyond Photoshop, cats and dogs. Think about a PC where the whole OS is supported by AI, making many current OS UI features completely obsolete (context menus, drag & drop, folder structures etc).

    Think of an OS that can interact with your home router and switch, and configure virtual LANs, VPNs etc, completely tailored to your network. These are just a couple of examples - the invention of AI is nearly as big as the discovery of electricity!

    1. Scotthva5

      Re: AI is nearly as big as the discovery of electricity

      Will "AI" improve the recipe of the drugs you're on?

    2. LybsterRoy Silver badge

      Re: AI is nearly as big as the discovery of electricity

      I just tried your suggestion - any good recommendations for suicide?

      1. The Oncoming Scorn Silver badge
        Alien

        Re: AI is nearly as big as the discovery of electricity

        ZEN Repair monitors report explosive device attached to primary power channel.

        BLAKE Where?

        ZEN Hold three, access duct seven.

        BLAKE Can the automatics neutralize it?

        ZEN No.

        BLAKE Why not?

        ZEN There is no damage.

        AVON Computer logic. Until the bomb explodes there is nothing for the repair system to repair. Zen, can you reprogram the automatics?

        ZEN Preemptive interference in crew activity is forbidden.

        BLAKE Oh, he'll clear up after us, but he won't stop us making a mess.

        AVON You made this mess.

        JENNA We're all in it, Avon.

        AVON Yes, aren't we.

    3. jdiebdhidbsusbvwbsidnsoskebid Silver badge

      Re: AI is nearly as big as the discovery of electricity

      I wanted to downvote because that future you describe sounds horrible and I don't want it to happen. But I upvoted you because as ghastly as that future sounds, I think you're right and it will happen.

      And that's the whole point of the article: regardless of whether it is wanted or needed, AI will be everywhere and will drive hardware demands up and up. The analogy of how Windows drove hardware in the same way is something I view as a lesson from history that we have failed to learn. Compared to 20 years ago we now have massively powerful laptops and desktops that are little more productive, because the extra power is diverted to pointless GUI shininess or other useless guff (Chrome on my laptop runs 7 processes and swallows a quarter of a gig of RAM merely to exist and do nothing). Gaming and hard-core number crunching like CAD are legitimate use cases, with usable outputs. AI proliferating as suggested is not.

  10. Andrew Hodgkinson

    It's not going to make PCs better; it's just going to make software worse

    If the base expectation goes from 8GB to 16GB, then what people did in 8GB today will take 16GB tomorrow.

    There won't be any more you can get out of the more powerful hardware after a year or two; it'll just run two or three more layers of bloated and buggy abstractions in order to do exactly the same kinds of tasks it's been doing for the last 10 or 20 years.

    1. aerogems Silver badge

      Re: It's not going to make PCs better; it's just going to make software worse

      Not saying I disagree with what you say, but those abstractions also make it a lot easier to crank out software on a faster time scale. Imagine trying to write an entire modern OS in assembly, or machine code if you're really hardcore. Sure, the results would probably be pretty impressive, but how long would it take you to write it? And then what if you spend all that time writing an OS in x86 assembler, only for the world to move to ARM or RISC-V? Pretty much everything now has to be thrown out and you need to start from scratch. Those abstractions give developers something like what Java promised: write once, run anywhere.

      It is a fair point that the creators/maintainers of the abstraction layers are often derelict in their duties to make sure they are optimized and bug free, but the whole point of abstractions is so that you don't have to constantly reinvent the wheel. Someone makes a good library for wheels, and then everyone can use that instead of rolling their own. Saves a lot of time and effort. And, in theory at least, it means one set of developers can really focus on refining that wheel lib to make the best wheel possible, which then means everyone using that wheel lib benefits.

      If we went back to the days where developers were obsessively refactoring code to eke out every last bit of performance, we'd be lucky to get new Windows versions every decade. And as much as someone is bound to say, "Good, I hate upgrading," rarely do their actions bear that out. A lot of the time you'll find these people upgrade almost immediately when some new version is available. Just imagine if we were still using the iPhone 5 because new devices only came out once a decade. Try buying a computer running Windows 95 off eBay or something and try using it, just to see how much you can't do with it. You'll probably want to throw it off the nearest cliff within a week. We like new shiny things; it's hard coded into our brains. Rarely are people the luddites they proclaim to be. They certainly wouldn't be reading a rag like El Reg if they were.

      1. anonymous boring coward Silver badge

        Re: It's not going to make PCs better; it's just going to make software worse

        "If we went back to the days where developers were obsessively refactoring code to eke out every last bit of performance, we'd be lucky to get new Windows versions every decade."

        No need to be obsessive. Just some basic testing and pride in the work needed. The memory leaking trash that's churned out now is just an embarrassment to the industry.

  11. aerogems Silver badge

    Finally, a practical application

    The idea of allowing NPCs in games to be a bit more dynamic is at least... not quite useful, but a good use case at least. It still doesn't solve the fact that most games these days have graphics as their first through fiftieth priorities, and gameplay falls somewhere between 51 and never. It's been about 30 years since Wolfenstein 3D created the modern FPS and very little has changed in that time. If you've played any FPS made in the last 5 years, you could sit down and figure out Wolf3D in about 10 seconds or less, even if it was released before you were born.

    Having NPC helper monkeys who aren't complete idiots would be kind of nice, or enemy characters that can adapt as opposed to always following scripted routines. So, if that actually pans out, it'll at least be one application for AI that a number of people will find interesting. Bravo nVidia for coming up with something even if it's a bit niche.

    1. ldo

      Re: allowing NPCs in games to be a bit more dynamic

      Better than a real person can do it?

      (A minor classic from over a decade ago.)

  12. BinkyTheMagicPaperclip Silver badge

    Stick it in 'the cloud'

    Leaving aside whether AI is actually going to be worth it for the majority of users (a very large if), that doesn't equate to needing gobs of local compute power.

    Stick in a hosted or departmental server with a few GPUs in it, offload tasks to that. There is no need for every user to have a powerful GPU and huge amounts of memory when it will remain unused 98% of the time.
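
    That model is simple enough on the client side: the desktop just makes a request to whatever inference box the department runs. A minimal sketch, assuming the shared server exposes an OpenAI-compatible completions endpoint (as several common local inference servers do); the host and model names below are made up:

    import json
    import urllib.request

    # Hypothetical departmental inference server; nothing heavy runs on the client
    SERVER = "http://ai-box.internal:8000/v1/completions"

    def complete(prompt, max_tokens=128):
        """Send a prompt to the shared GPU server and return the generated text."""
        payload = json.dumps({
            "model": "some-local-model",   # whatever the server has loaded
            "prompt": prompt,
            "max_tokens": max_tokens,
        }).encode("utf-8")
        req = urllib.request.Request(
            SERVER, data=payload, headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["text"]

    print(complete("Summarise this meeting note in two sentences: ..."))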

    No argument that 8GB isn't enough, though. My rarely used main system has 64GB, because DDR3 was cheap and I like virtualisation. This fanless Dell system used for browsing has 20GB in it, because again, one 16GB stick was cheap. It's currently using 9GB just to run FreeBSD with Wayland and Firefox with 25 tabs open! That includes 5GB of ZFS cache, and zero use of swap.

  13. Badgerfruit

    Are you sure about this?

    I mean, what could go wrong with providing AI-capable machines to anyone who wants one? Terrorists, bad actors etc.?

    No? It's a good idea because it boosts some company's bottom line. Right. Yeah, sorry, ignore me.

  14. Anonymous Coward
    Anonymous Coward

    Two Questions...

    1. Will an AI computer be able to fix Skyrim every time Steam changes its own permissions to several levels above administrator and forces an update, or will I still have to spend as many evenings and weekends re-re-...-re-re-fixing my mod setup every time Bethesda issues a trivial patch absolutely nobody wants?

    2. Will anyone reading the article still be alive by the time all the own-goal problems that undeserved faith in AI will create have been sorted out and AI becomes a boon to the common human, or is it all just a downward spiral from here?
