Re: Oh Jaysus
There's a sitcom in there somewhere.
375 publicly visible posts • joined 4 Nov 2016
"During an outage in September, the Matrix.org homeserver went offline following a hardware failure. Organizations with their own homeservers, such as governments, were unaffected. While embarrassing, the incident also inadvertently demonstrated the strength of a decentralized approach."
Didja read the same article I did?
To be fair, the head of Palantir's UK operations pointed out you don't need an ID card. You just create a database mapping NHS number, NI number, etc. to some unique key and be done with it. Palantir already have Gotham in use by the Met and now presumably the MoD, plus all the NHS data since Covid. Their spook stack does not require ID cards; that's kind of the point.
Here's a new one: undisclosed vulnerability details. So, sitting on vulnerabilities rather than closing them urgently?
That sounds like standard responsible disclosure: withholding vulnerability details from publication until the vendor has had a chance to patch them. The security industry has done this since the bad old days, when researchers would release a PoC and it would get weaponized by script kiddies.
The other issue is that security is very, very hard to get right. You are running a machine with who knows how many millions of lines of code, and all it takes is for one of them to be wrong. Defence is far harder than offence: you need to stop every single attack, including ones you may not yet be aware of, while the attacker only needs one payload to land correctly.
Because legislators are too dim to see the cause and effect: a rushed age verification scheme, outsourced with zero privacy or security requirements to third-party companies, was always going to end this way. They simply do not understand how the tech landscape works, or the sheer number of chancers in the industry willing to sell businesses a service they cooked up in a weekend for a quick buck.
(e.g. it doesn’t try to coerce/trick you into using unwanted cloud services, browsers, ads etc.). It handles all roles perfectly (general office productivity, *nix-land development etc.)
You sure there? Certainly the first thing it tries to do is connect you up to iCloud (vs Drive), demands your credit card info (what?), and foists Safari (vs Edge) upon you.
Granted it's not constantly notifying you about Xbox Game Pass, but I'd say the shitware creep is fairly high here too. I wouldn't exactly call it polished either; Apple's software quality has been degrading for over a decade. The difference between Apple and Microsoft is primarily that Apple have a slightly more coherent vision of what the OS should do, versus whatever the hell Microsoft are doing.
To your second point: the hardware is often well made, but I'd counter that these days basically everything is glued in, so maintenance and upgradability suffer.
Goldman Sachs discovers model collapse and thus hurtles into the 2nd quarter of 2023. Just wait until they start reading 2024's research, because it doesn't get more optimistic with time outside of the marketing spiel.
somewhat "correct" answers seems just to be: "make it bigger" for the past several years.
But with such rapidly diminishing returns of performance versus size, and the really obvious elephant in the room (there have been no practical improvements in this technology since what was always simply an intellectual curiosity escaped the labs a couple of years ago), it doesn't take a genius to figure out that spending trillions on a 0.5% performance increase in some random benchmark, resulting in an AI that "now only gets shit wrong 45% of the time", is not a great investment.
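The diminishing-returns point can be put in back-of-envelope terms. Assuming loss falls as a power law in parameter count (the exponent below is invented purely for illustration, not measured from any real model), each 10x scale-up buys a smaller absolute improvement than the last:

```python
# Toy power-law scaling curve: loss(N) = N ** -alpha.
# alpha is made up for illustration; real curves vary by model and data.
alpha = 0.05

def loss(n_params):
    """Hypothetical loss as a function of parameter count."""
    return n_params ** -alpha

# Each 10x increase in parameters yields a smaller absolute gain.
for n in (1e9, 1e10, 1e11, 1e12):
    print(f"{n:.0e} params -> loss {loss(n):.3f}")
```

Under any curve of this shape, the gap between successive 10x jumps shrinks every time, which is the whole problem with "make it bigger" as a strategy.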
Seat belts? I'll take my chances and jump out the door at this point.
He clearly didn't consult the AWS engineers. God knows how many square miles of portacabins with network engineers scurrying around constantly replacing hardware as it inevitably fails. If he can automate them away, then he can start thinking about space deployment. But I think we can all agree that hardware fucks up on a regular basis, and tens of millions of dollars per callout is just not financially viable BEFORE we even begin to worry about practicalities like cooling all that in a vacuum (good luck with that, maybe you can break thermodynamics while you're at it, Jeff).
It must be nice to be rich and clueless.
There is a slight bias towards the left in all LLMs, yes. The bias is inherent in the training data because, on average, the internet has always leaned a fair bit left. It's not a frickin' conspiracy you loon: they borrowed more money than the GDP of some countries, shoveled up the internet, poked it with a stick, and this is what they got. It's objectively pants at everything, because no thought or nuance was ever put into it, just data, which is why you had a bad time with it too.
AI looks to be one of those things like GUIs around 1970...
To be fair, AI looked to people in 1970 exactly as it looks to you now. The problem is it's now a few decades later, and it's still "just around the corner."
Any sufficiently advanced technology is indistinguishable from a rigged demo.
"Atos, Capgemini, CGI, Cognizant, Methods, PA Consulting, Scrumconnect, Transform UK, Solirius Consulting, and Version 1 were the main suppliers of the services affected since 2016."
There are a few usual suspects that stick out here. But we will continue to suffer howlers like this until the government reforms its IT tendering process so it doesn't reward failures. Locking out contractors on huge projects like this due to IR35 is also not a great solution, because that means That One Guy Who Knows Everything About Software X Because He Literally Wrote It doesn't get to sit in the meetings and steer the corporate drones clear of the ravines.
it's very good for getting information and ideas, like a super search engine.
Is it though? It has no creativity, and the only reason it looks like a 'super search engine' is that search engines have become a huge pile of shite over the last five years. Researching what other people have actually bet the farm on and succeeded with, before pushing in a particular direction, is infinitely more reliable than an LLM, which will spurt out any old vapid nonsense and then double down on its bullshit when pressed.
The only thing an LLM is good at is tiring you out until your critical thinking skills are no longer able to fight back against its nonsense.
If the last two AI winters were anything to go by- absolutely not.
Although in the run-ups to the last two cataclysms, the academics were loudly warning businesses about their overoptimism and hype. This time around I am honestly aghast at the apparent silence. Guess those departments don't fund themselves.
PICK... and APL.
I have not heard those names in a very long time.
You may be interested in J, definitely the closest spiritual successor to APL, although it's ascii based, so maybe that takes some of the charm away.
As for PICK... gosh. The last time I used that in anger was I think D3, in the 90s, on a Unix that some may have had the misfortune of hearing the name of before... SCO.
There are plenty of more effective ways to do this kind of narrow domain inference that have been in use for decades. In fact there are entire journals dedicated to specific classes of them.
Prodding an LLM with a stick and hoping it turns into something useful seems expensive and unlikely to work at this stage; they've already plateaued and we don't have a single working use case for them.
Unfortunately this can't work in the context of neural networks, due to Zipf's Law. Unless your problem falls into that sweet, sweet low-hanging-fruit basket, like "Can bees sting?", you simply cannot ask enough questions to have anything but statistical noise in the vast majority of your training weights.
This is a fundamental limitation of ML, and it cannot, in any real-world situation, be addressed in a purely data-driven fashion.
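The Zipf point can be sketched numerically. Vocabulary size, token budget, and the "enough samples" threshold below are all illustrative assumptions, not measurements; the shape of the result is what matters:

```python
# Sketch: under a Zipfian (frequency ~ 1/rank) distribution, the
# overwhelming majority of a vocabulary sits in a rarely-seen tail.
def zipf_counts(vocab_size, total_tokens):
    """Expected count for each rank under frequency proportional to 1/rank."""
    harmonic = sum(1.0 / r for r in range(1, vocab_size + 1))
    return [total_tokens / (r * harmonic) for r in range(1, vocab_size + 1)]

counts = zipf_counts(vocab_size=50_000, total_tokens=1_000_000)

# Items seen at least 100 times -- a made-up bar for a usable estimate.
head = sum(1 for c in counts if c >= 100)
tail = len(counts) - head
print(f"head: {head} items, noisy tail: {tail} items")
```

Even with a million tokens, under 2% of this toy vocabulary clears the bar; everything else gets exactly the statistical noise described above, and no amount of extra data changes the shape of the curve.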
Medical practitioners do a hell of a lot more than trial and error, but of course, the AI corpos want you to believe that super human intelligence is 'just around the corner'. It is not.