To paraphrase Dilbert:
"There's an AI solution to every problem". Maybe this from the Telegraph sums up why we all need AI-enabled computing.
The AI-capable PC is coming to save a shrinking market, according to Canalys, although vendors need to be far clearer about any benefits to charge higher margins for the devices. The definition of an AI-capable PC is, says Canalys in its "Now and next for AI-capable PCs" report, at minimum, "a desktop or notebook possessing a …
If someone wants to make a bundle, they'll make motherboards with a connector that can go to a mechanical switch on the laptop or desktop case that disconnects the AI chip(s). Kind of the reverse of the "Turbo" switches of decades back.
The bigger problem will be the "AI" rubbish embedded in the CPU/GPU.
Yeah, the bandwagon is both wide and long on this one—plenty of room for all the chancers, PR-droids, and money grubbers to climb onboard to boost their lies and honeytraps.
Like most (ahem) tools, generative 'AI' has its uses, and within a limited envelope of 'usefulness' it can be very useful indeed. Outside that envelope: utterly unreliable drivel.
Of course, it'll take a while, and a few deaths and other tragedies, before we all settle down and let the tool have its sensible place.
Looking forward to the many UI and OS modifications from intelligent people who cut this shit out of Windows to put the Personal back into PC.
The harder you push, the more resistance you encounter. I'm betting that this is going to raise shields like Borkzilla has never encountered.
For Apple, probably less. Jobsians are already brainwashed; they'll just think that this is the new reality distortion field and will be fine with it.
In my limited experience of machine learning, nearly all of the processing power is required to train the model on a huge data-set. Once that is done then a decent PC is fine to run the model. If the intention is to train new models on the PC, then each PC will need a ridiculous amount of power that will be unused almost all the time. Also, where is the huge data-set coming from and where is it being stored and will its contents be available to each PC user? I suppose this is where the "missing use-case" comes in. Maybe some form of tweaking models based on data generated on and local to the PC. Can't really picture how this benefits anyone myself but I don't work in PC sales so my imagination is limited.
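To put rough numbers on that training-versus-inference gap, here's a back-of-the-envelope sketch in Python. The parameter count, corpus size and accelerator speed are all assumptions picked for illustration, and the FLOP rules of thumb are only approximations:

    # Why inference fits on a PC but training a model from scratch does not.
    # Common approximations: ~2*N FLOPs per token for a forward pass through
    # an N-parameter model, ~6*N FLOPs per token when training it.
    PARAMS = 7e9            # a notional 7-billion-parameter model
    TRAIN_TOKENS = 1e12     # order of magnitude of a modern training corpus
    PROMPT_TOKENS = 1_000   # one interactive query
    NPU_FLOPS = 45e12       # a hypothetical 45-TFLOPS laptop accelerator

    inference_flops = 2 * PARAMS * PROMPT_TOKENS
    training_flops = 6 * PARAMS * TRAIN_TOKENS

    print(f"one query:     {inference_flops / NPU_FLOPS:.1f} s of accelerator time")
    print(f"full training: {training_flops / NPU_FLOPS / (86400 * 365):.0f} years of accelerator time")

On those (very rough) assumptions, a single query costs a fraction of a second on the local accelerator, while training the model from scratch would take decades of it.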
I'm glad that I always build my own PCs and only pay for the bits I need. That won't save my employer from this madness though (for those few 'power users' that successfully argue that they couldn't possibly work on a virtual desktop).
"Also, where is the huge data-set coming from and where is it being stored and will its contents be available to each PC user?"
That's where the mandatory subscription fee just to boot Windows 12 comes in. You don't expect that cloud storage to be free, do you?
If the so-called AI (it isn't, but that's another question) wants to build a model based upon what I do on my machine, you can be sure that it will NOT remain private to my machine.
This is just nirvana to the likes of Google, Amazon and MS. They don't have to do any work to get the AI model delivered to them on a plate.
As I avoid all three like the plague, I won't have an issue but if you want what you do to remain secret then tell them to STFU and disable (or NUKE) the model before it even starts working.
Alexa will be regarded fondly if this load of bovine excrement takes off.
So how hard is this going to hit the electricity bill?
My old Mum uses a PC to order some shopping, write the odd email to renew a prescription, and play cards. How on earth is AI to be of use to her?
The PC she currently uses is massively overpowered, but we have to scrap it as it won't run Windows 11. And now they want to add a psychotic Clippy room warmer to it.
It is sad how out of touch these companies have become with real people.
Hey, someone's even been playing with an RPi 5: "What I learned from using a Raspberry Pi 5 as my main computer for two weeks". It'll be nice when everyone can tell Microsoft to fuck off.
I suspect that a lot of us software developers will throw in the towel at this juncture. Hand-crafted, efficient code will be replaced by brute-force, extremely wasteful code, in both energy and hardware terms. The latter will occasionally spit out some random result as part of its AI basis, but guess what: from my own experience with "trends", sheeple will shrug and say that it is acceptable.
Depressing.
The alternative is to start marketing the "no AI in our products" mantra.
"will be replaced by brute-force, extremely wasteful"
Grammatical error - I think you need the past tense, not the future, cf the discussion of Wirth's Law that we have had recently.
Actually, my thought was that this is yet another scheme to stop me from installing Linux on my own computer hardware.
So companies thought: why should we invest in GPUs if we can offload training to customers' machines?
Then you slide a brown envelope in front of a policymaker and they say:
"I can't believe it's NOT theft! But it's not! Because I say so." *then they go browsing beachfront properties*
Hmm. Which would I rather buy? A shiny new Windows laptop with AI installed? Or last year's top-end model, recycled by some exec who simply has to have the latest and greatest?
Hint: option two, followed by the immediate installation of penguins, is my normal way of buying computers...
I know this was meant as a joke, but it’s going to be all too real: running these AI engines will likely require a Linux kernel. So even though your “AI PC” will still be sold as running “Microsoft Windows”, it will be Windows with WSL2 as a mandatory part of the install.
I predicted that something along these lines would happen, from the moment WSL2 appeared. Not from any conscious plan on the part of Microsoft, but because building on top of the Linux kernel will be the path of least resistance for new technologies like this.
AI is something I sometimes use for fun - getting a film suggestion, writing an imaginative story or news report on some concept in another language so I can learn the language - but would never use it in a serious capacity. The thought of it on my machine makes me shudder, it would be like having Peter Sellers or Jim Carrey running around the system...
OK, with all the negativity being hurled at AI here, I'm going to make a case for how this might be useful for something other than letting the OEMs shift more gear.
The machine learning systems that get hyped as AI at the moment have a stack of major problems, but high on the list are three: they are typically poorly tuned to individual users, they typically need you to be on-line to use them, and (as a result) they are terrible for privacy. With a capable, reasonably efficient neural-network accelerator in your laptop, it becomes possible not only to run decent models while off-line, but also to do local, privacy-preserving training, and to distribute the burden of fine-tuning for personalisation, which makes it more feasible at scale.
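As a minimal sketch of what that local, privacy-preserving fine-tuning could look like (assuming PyTorch, with the backbone and the user's data as stand-ins), you could freeze a shipped, pretrained model and train only a tiny user-specific head on data that never leaves the machine:

    import torch
    import torch.nn as nn

    # Stand-in for a pretrained backbone shipped with the OS or application.
    backbone = nn.Sequential(nn.Linear(512, 512), nn.ReLU())
    for p in backbone.parameters():
        p.requires_grad = False          # never updated, never uploaded

    head = nn.Linear(512, 10)            # tiny, user-specific layer
    optimiser = torch.optim.Adam(head.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Stand-in for features/labels derived from the user's own activity.
    local_batches = [(torch.randn(32, 512), torch.randint(0, 10, (32,)))
                     for _ in range(100)]

    for features, labels in local_batches:
        optimiser.zero_grad()
        loss_fn(head(backbone(features)), labels).backward()
        optimiser.step()
    # Only the head's few thousand parameters change, and they stay on the machine.

Nothing about the user ever has to be uploaded; the personalised part is a few kilobytes of weights sitting on local disk.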
I don't know if there will ever be a "killer app" for AI, but there are plenty of things for which it's rather useful. Generative AI for chat is at best a gimmick and at worst a fountain of BS, but generative language models are rather good at grammar checking, paraphrasing and writing-style adjustment. Transformer language models have allowed a step change in the quality of speech-to-text systems (go track down Whisper.cpp on GitHub; it's better at STT than most commercial systems, and the code and models are open source). Neural networks are great at network and process anomaly detection, which can make malware detection more responsive.
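To illustrate the anomaly-detection point, here's a rough sketch (PyTorch again; the 64-dimensional feature vectors of process/network telemetry are hypothetical): train a small autoencoder on normal activity, then flag anything it reconstructs badly.

    import torch
    import torch.nn as nn

    # Small autoencoder: learns to reconstruct "normal" behaviour.
    model = nn.Sequential(nn.Linear(64, 8), nn.ReLU(), nn.Linear(8, 64))
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

    baseline = torch.randn(5000, 64)     # stand-in for normal telemetry
    for _ in range(50):                  # a few passes over the baseline
        optimiser.zero_grad()
        loss = ((model(baseline) - baseline) ** 2).mean()
        loss.backward()
        optimiser.step()

    def is_suspicious(sample, threshold=2.0):
        # High reconstruction error = behaviour unlike anything seen in training.
        with torch.no_grad():
            error = ((model(sample) - sample) ** 2).mean().item()
        return error > threshold         # threshold would be tuned on held-out data

Small enough to run continuously on an on-device accelerator, and none of the telemetry needs to go anywhere near the cloud.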
Yes, there's a boatload of hype for AI at the moment, and the "AI PC" is currently a solution in search of a problem, but I think that there are plenty of problems out there for which it can help. I don't know if the rest of you are just going to sit around and complain; I for one am going to go write some code in anticipation of having a 45 teraflop accelerator in my laptop.
Imagine if Spotify sold their music-selection algorithm, but tweaked so that it wasn't just your music choices it could anticipate, and not just new bands it could (generally correctly) suggest. Instead it could anticipate what programs you wanted to run at the same time or next (in my hobby case Fusion 360 and then Cura), what sites you wanted to have open, and what you named that file you were working on when the phone and the door both went, and where you saved it (or, better still, named it appropriately and put it in a more appropriate folder, perhaps with the option to go through and rearrange your documents etc. into a workflow that suits you).
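A toy version of that "anticipate the next program" idea could be as simple as a first-order Markov model over launch history (pure Python; the history below is made up):

    from collections import Counter, defaultdict

    history = ["fusion360", "cura", "browser", "fusion360", "cura",
               "mail", "fusion360", "cura"]

    transitions = defaultdict(Counter)
    for current, following in zip(history, history[1:]):
        transitions[current][following] += 1

    def predict_next(app):
        counts = transitions.get(app)
        return counts.most_common(1)[0][0] if counts else None

    print(predict_next("fusion360"))   # -> "cura": pre-warm it before it's asked for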
For gaming it could assess what games you generally play, look for alternatives or future games, find the best price, and rank a set of candidates giving pros and cons of each. Alternatively, you could ask it to find you something new to play, and to purchase and install it when you are x% through the current game or after x hours (as you often lose interest).
The options are myriad, and there are probably plenty that people aren't even thinking of.
What I ideally want is Jarvis or F.R.I.D.A.Y. to deal with the mundane and effectively act as a personal assistant does for a CEO - book restaurants, arrange travel (book me a trip somewhere warm, sunny and secluded, not the USA, under x amount), manage my calendar, etc.
Yes, it could attempt to understand things you liked and offer more of what you might like.
The downside is that if it had an agenda (companies paying to tilt the results) it could try to mould your preferences according to that agenda, and hide from view anything that is "unsponsored".
Because there is no oversight on how it arrived at the data presented to you, corruption will be rife. With algorithms as they are now, it is possible to look into the code that presents your choices and check for a level playing field. That is not going to be possible with AI systems.