Re article title
“… just not for the consumer class”
Haven’t you heard? Nvidia don’t make consumer GPUs any more - well they do, but only to vaguely pretend to care
And they certainly don’t give a shit about Jo the consumer
One thing I find odd is the number of times someone takes an existing problem (nuclear fusion simulation, for example), then (apparently) just “adds some AI” and the result is immediately “better” (or at least “wow! It’s AI”)
I know stuff-all about nuclear fusion simulation but I bet the (non “AI”) algorithms used have taken many many years (decades) to develop. How come new “AI” algorithms (which are obviously “better”. Obviously) seem to be instantly available? Or at least available as soon as the new “AI” machine has been built? I’m thinking “AI” algorithms must be massively different when compared to the boring old algorithms so how come they seem to get developed and written in an afternoon?
Or is the “AI” bit just …well …total bollocks?
It’s remarkable isn’t it? I don’t think I’ve ever read that Apple has voluntarily reduced its fees in the UK or Europe after having a little chat with the local regulator
In fact, prices don’t get reduced even when threatened - after the obligatory 2 year “investigation into the bleedin’ obvious” and 5 years of court appeals, naturally
I bought a cuckoo clock from Germany many years ago. I always thought they were German in origin.
It used to drive any guests we had absolutely nuts! We (me and family) learned to switch off from its hourly (actually it might have been quarter-hourly?) cuckoo. Guests were not so lucky and would inevitably stop the pendulum at some point in the night :-)
These laws will be completely unenforceable. The only way they could be enforceable is if the OSes and applications in the firing line were limited to the US states in which the laws apply. But they are not, and nobody is going to voluntarily (because that is what it would take) accept and implement these laws outside of those states just for the convenience of those states’ legislatures
It’s a storm in a teacup and it won’t matter how much the politicians of these US states stamp their little feet, it’s not going to happen
Some years ago, Intel removed ECC support from all its non-Xeon processors. This was a purely marketing thing to get industrial users to buy Xeons which are, of course, much more expensive.
Since then, there has been a prevailing attitude that “consumer grade” systems don’t need ECC. Which is total bollocks. Unfortunately, this attitude seems to be very widely held; not just by the likes of Intel and AMD
All AMD processors support ECC but motherboard support is optional - none of the non-server MSI motherboards support it for example. I think (not sure) Intel are also now starting to put ECC back into their “consumer” chips
ECC memory is also more expensive. But it doesn’t have to be - if all computers used ECC (and they should) then it would be “normal” and cheap (ignoring the ridiculous RAM prices caused by nVidia and friends at the moment).
So it’s a crap state of affairs and an artificially made problem. If I remember correctly, Mr Torvalds had some very justified harsh words directed at Intel at the time of their stupid decision
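For anyone who thinks ECC is some exotic server-only magic, here’s a toy sketch of the idea - a Hamming code over a 4-bit nibble, purely for illustration; real DIMMs do the same trick with 8 check bits over a 64-bit word, but the principle is identical:

```python
# Toy SECDED-style illustration: Hamming(7,4) single-error correction.
# Real ECC RAM protects 64 data bits with 8 check bits the same way.

def encode(nibble):
    """Encode 4 data bits [d1,d2,d3,d4] into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4          # covers codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7

def correct(word):
    """Locate and flip a single flipped bit; return (data, error_position)."""
    w = word[:]
    s1 = w[0] ^ w[2] ^ w[4] ^ w[6]
    s2 = w[1] ^ w[2] ^ w[5] ^ w[6]
    s3 = w[3] ^ w[4] ^ w[5] ^ w[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 0 = clean, else 1-based error position
    if syndrome:
        w[syndrome - 1] ^= 1
    return [w[2], w[4], w[5], w[6]], syndrome

data = [1, 0, 1, 1]
word = encode(data)
word[5] ^= 1                          # simulate a cosmic-ray bit flip
fixed, pos = correct(word)
assert fixed == data and pos == 6     # flipped bit found and repaired
```

Three extra XOR bits per nibble (eight per 64-bit word in practice) - hardly exotic, which rather underlines how artificial the consumer/server split is.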
It would be better if they extended the law to ban exporting of all things “AI”
Go on - you know it makes sense - it would close any and all loopholes and would make for a much cleaner law. Which must be a good thing, yes?
Unfortunately, the rest of the world wouldn’t have access to American slop …errr “AI”. A real shame but I think a price worth paying
Agreed. He should have created a whole new project with a new name.
The thing I find troubling is that unless you follow the story closely, you wouldn’t necessarily know that the latest version is “AI” generated - which is a bugger if you are actively trying to avoid such slop (regardless of any licensing issues)
A couple of points
Firstly, the fact that the rewrite involved an LLM is irrelevant (except for any accusations of plagiarism). It’s a rewrite. It doesn’t matter who or what rewrote it - it’s a different implementation. I think the LLM aspect of this is bogus (again, except for any allegations of plagiarism) - and just for the record, I am no fan of “AI” whatsoever
Which brings me on to the comments from the original developer saying that the rewrite is STILL under LGPL because it was not a clean room undertaking. That’s bollocks. It’s a rewrite. As long as there is no actual copying of the original code then you cannot claim the new version must follow the old version’s licence terms. That’s just nonsense - people make new versions of programs all the time, and not always under the same licence
And saying the rewrite must be public domain because copyright can’t be attributed to an LLM is just silly and trivially fixed - just claim authorship of the work yourself; the LLM isn’t going to complain. I don’t remember his name now but there was someone a year or two ago trying to legally argue that an LLM could obtain copyright on a work - he lost but I never understood the point of the argument; just claim ownership and copyright for yourself - what does it matter?
“maybe it's not bleedin obvious to the wearers”
In that case, those users should be euthanised to take them out of the gene pool. The mind boggles. Every day in the tech press there seems to be yet another story along the lines of “bloody stupid people doing bloody stupid things that probably ought to be illegal anyway, are complaining about the consequences of the stupid things they are doing”. Arrrrrrggghhhhhh!!!!
Rather than “concerning”, what about “bleedin’ obvious”? If you walk around all day pointing a camera and microphones at everything and everyone, then obviously you are going to record stuff that really shouldn’t be recorded
These things should just be made illegal outright
That’s quite the overhead for a simple utility. And I would seriously question the value of rewriting a utility that has been about for decades and has been used countless times - do the benefits really outweigh the (minuscule, approaching zero) risk of leaving it alone?
Also, I appreciate that most programs bring in one or more libraries (even if it’s just libc) - in my mind, that’s fair enough. I start to get annoyed when an application brings in (and obviously has dependencies on) megabytes of slop from dozens of libraries, often just to use one or two functions from it; “framework”-type libraries being a particular annoyance - just code the bloody thing yourself!!
I think I’m more depressed than I was earlier!
From your first link, “write once and run anywhere” - where have we heard that before? I’ve not touched it in decades but I am reliably informed that Java is now so good (and compiles to native code) that it’s probably a prelude to the second coming (no sarcasm intended). I might even follow up on my colleague’s recommendation and take another look at it
Something else that made me smile (maybe wince) was in the second link; there is a picture of three GUI styles from times past to present - guess which one I thought immediately stood out as being clear and simple to use? Yep
And writing applications in html, css and fucking JavaScript is as brain dead as this industry gets - and it’s up against some pretty stiff competition - “AI” [cough]
God I’m depressed….
If you take a radio system with the intention of improving its spectrum efficiency, then you either need to change the modulation scheme, or the media access protocol or something similar. These things must be done at the RF level - in this day and age, often via some FPGA signal processing magic. Regardless, it’s right at the coal face - exactly the bit that all SDR models ignore because “it’s too hard”
Shoving an LLM (because that’s what “AI” means these days) into your upper software layers isn’t going to change this. All it will do is eat up power and processor bandwidth and memory for no benefit other than to make some AI outfit richer.
But it’s the RF bit that’s the most complex and expensive. If you’re not changing that bit in your new version of your SDR then you’re just twiddling with the upper layers. Ok - that might be all you need to do for your application but you probably don’t need all the baggage that the typical SDR framework brings with it to achieve that - in fact you’re almost certainly better off without it
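To put a number on why spectrum efficiency lives down at the modulation level, here’s a back-of-envelope sketch: the raw bits-per-Hz come straight from the constellation size, which no amount of fiddling in the upper software layers can change (idealised figures, ignoring coding, pilots and guard overheads):

```python
import math

# Idealised spectral efficiency of square M-QAM: log2(M) bits per symbol,
# so (with Nyquist signalling and zero coding overhead) log2(M) bit/s/Hz.
# Real systems give some of this back to FEC, pilots and guard intervals.

def spectral_efficiency(m):
    """Raw bits per symbol for an M-ary constellation."""
    return math.log2(m)

for m in (4, 16, 64, 256, 1024):
    print(f"{m:>5}-QAM: {spectral_efficiency(m):.0f} bit/s/Hz (ideal)")
```

Going from 64-QAM to 256-QAM buys you a third more bits per Hz - and it can only be done in the modulator, right at the coal face.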
This classic article is also an amusing read…
https://arstechnica.com/information-technology/2012/06/how-to-blow-6-billion-on-a-tech-project/
Firstly, “… ever since 5G came to dominate in the earlier part of the decade”
Maybe I’ve been living in a cave, but this is news to me. I thought 4G was still much more widely used than 5G
As for the “AI” shite, I would love to know how sprinkling a turd shower of “AI” on to the mobile network is going to magically change fundamental radio modulation schemes - the number one distinguishing feature between each ‘G’ from 1 to 5 so far. Saying that “AI” will allow for updates (say to 7G) without reinventing the hardware is total bollocks. It’s why “software defined radio” in general is such a half-arsed concept - because it too completely ignores the fundamental RF side of things
“… our goal is to ensure developers and users have immediate access to the latest performance improvements, fixes and new capabilities.”
This sounds like tech-speak for “we’re “agile” (‘cos that’s cool) so we don’t test anything and rely on our users to run our half-cooked beta code to find the bugs in it”
Thankfully, I don’t use Chrome and never will
I tend to agree. There is a common mentality that a program must keep acquiring features to be useful. Fix bugs, sure. But keep bolting on more and more features until it collapses under its own weight? This is exactly the problem with C++ (yes, it’s a spec, not a program but the principle is the same) - it’s now so ridiculously complex that many people have given up on it - I know I pretty much have
Look at many of the base unix tools - they’ve not changed in decades but are as useful as ever
If I was going for a job at Accenture, I would ask the question “pre the AI hype, if I told you I write software by picking stuff out of GitHub and Stack Overflow, gluing it together and just using it without any consideration for its accuracy, would you employ me?”
You can guess the follow-up….
LLMs do not deliberately make things up. They also do not deliberately tell the truth. They predict a probability distribution over possible next tokens and sample from it, one token at a time. Sometimes the result happens to match reality. Sometimes not. It is how they work and what they do. Attributing any sort of “understanding” is just wrong
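For anyone who doubts it, here’s a toy sketch of a single next-token step (the vocabulary and scores are invented purely for illustration) - note there is no “truth lookup” anywhere in the loop:

```python
import math
import random

# Toy next-token step: the model produces a score (logit) for every token
# in its vocabulary, turns the scores into probabilities, and samples one.
# Nothing in this process consults reality.

def softmax(logits, temperature=1.0):
    """Convert raw scores into a probability distribution."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

vocab = ["Paris", "London", "Toulouse", "cheese"]
logits = [3.1, 1.2, 0.8, -2.0]        # made-up scores for illustration

probs = softmax(logits)
token = random.choices(vocab, weights=probs, k=1)[0]
print(token)   # usually "Paris", but nothing guarantees it
```

Crank the temperature up and the low-probability tokens get picked more often - same mechanism, just a flatter distribution; there is no mode in which it “checks its facts”.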
Why does it surprise anyone that the scenario described here is possible or even likely?
“This means that Lenovo Group, operating under Chinese jurisdiction, ‘can use this data to build detailed dossiers on US residents, identify psychological or financial vulnerabilities, and target individuals in sensitive roles – such as jurists, military personnel, journalists, politicians, or dissidents’”
“…. triggering trackers that violated "his reasonable expectation of privacy"
So exactly what Googlies and Microslop and Faecesbook have been doing for years then? The only difference is that the complainant thinks this stuff is being sent to China - the China bit being the only thing that separates the two scenarios. I love the way American law completely excuses local businesses from brazenly exploiting their customers but is up in arms if anyone else tries the same (and I’m not saying Lenovo is guilty or not - I have no idea)
There is a theory that octopods are actually alien. Seriously. It is based on the fact that they are so totally different to anything else on the planet - no identified evolutionary ancestors and very few similarities with anything else
Then again, maybe we are actually genetically synthesised beings sent here by some alien race to totally fuck-over the planet for some unknown nefarious reason. We’re pretty bloody good at it though
Sad for the students who will be lulled into a false sense of achievement. Instead of wasting the students’ time on this nonsense, the college(s) should be teaching real software know-how that will actually be worth something.
What is going to happen when a “graduate” (I use the term loosely) of this nonsense starts a job and is asked to (say) look at an existing system and find out why “it fails to perform function A after half an hour of uptime. We think it’s something to do with the comms between these two systems here but we’re not sure. It could be that the interrupt handler isn’t being triggered - dunno why that would be though”? They won’t have a clue where to start
Why the hell would anyone think the answer to supporting one type of shit is to support MORE types of shit? No - the answer is to support NO shit - force Faecesbook to remove the “AI”
This is akin to the idiots in the US that think the answer to people shooting up schools and nightclubs and churches is to distribute MORE guns
FFS