Re: My Five Accomplishments this Week
> You can "clean house" with a flamethrower
Good Ghod man, don't say things like that!
The last thing he needs is to be reminded he has a warehouse full of The Boring Company's unsold goods!
> and resources," HUD spokesperson Kasey Lovett told The Register. "Appropriate action will be taken for all involved."
Presumably he means the waste that resulted in the apparent lack of security in his systems.
And the appropriate action will be - to whine about the evil people who did nothing more dangerous than show a not-too-unsafe-for-work video, instead of hiring and paying the right people to actually secure his screens.
Oops, sorry, you are entirely correct.
I did mean to only mention squashing flora[1] - although getting fauna juice all over you is no fun either.
I have No Reply At All to explain this Misunderstanding, Please Don't Ask, other than being too distracted by the chance for a Genesis reference. More Fool Me for not proofreading better, but Like It Or Not the edit window had closed after my Trespass into the wrong kingdom. The comment was all Coming Together nicely before then, but I Can't Dance around my mistake. Anyway, thanks for not taking The Knife and driving it In Too Deep. As Sure As Eggs Is Eggs there will be more mistakes made if I comment more; it is well past Dusk, so best I'm just In Hiding as a Lurker on the forum For A While. Squonk.
[1] as it is easily spreadable, even when chilled
> If there are no toxic materials in the debris then why does there need to be a cleanup effort?
Random people fiddling with a non-toxic mangled bit of debris can still get injuries from sharp edges - including Tetanus. The debris may not be toxic in and of itself but it can dig up stuff you want to stay clear of: simply squishing the local fauna can mean it is covered in juices that'll ruin your day (giant hogweed isn't just a Genesis track). Or bits break off and stab at you (the photos of the debris don't make it look structurally sound).
Basically, these guys are fly tipping, and you'd still go in and say you are doing a "clean up" if the only thing by the wayside is obviously an old chest of drawers, with nails in unexpected places.
Heck, if they don't go in wearing full bunny suits and one of them comes home with a nasty rash from stinging nettles there are people who'd scream it was nasty spacey stuff.
OTOH If they do go in looking like bleached Minions then da 'Webs are claiming that means a cover up...
It does appear that Art has an accurate point here - IF the OP is referring to "Lord" Christopher Monckton[1] then there is a definite lack of (online) references[2] to indicate that he has changed his mind - about *any* of the total bollocks that he has spouted over the years.
If there really *is* a citation that this particular Monckton has been convinced to change his tune on at least the topic at hand then that would be an excellent thing to share here.
But so far I am doubtful that The Mad Monckton of MAGA[3] has changed his mind.
[1] other wikis are available
[2] not that I'm claiming to have especially good google-fu, but it normally suffices
[3] just the opening frame is enough, no need to punish yourself by watching that YT video
> Presumably someone thought that if they did a runner, you could use sniffer dogs to catch them? It always struck me as a rather over-complicated and impractical system.
Sniffer dogs are over-complicated and impractical? Pretty sure we still use sniffer dogs - and FLIR cameras on helicopters weren't really a thing in the days of the Stasi.
Can't be the "keeping things in jars", that is a fairly normal way of keeping stuff whose whiff you want to retain: trying to concentrate that down to spray-on Eau-de-Dissident would be rather costly whereas carrying a jar to the edge of the woods before releasing the hounds isn't a great burden. Compared to, well, bring in a personnel lorry of dogs and handlers.
And don't ignore the basic effect of hunting someone down with dogs, slavering beasts with sharp fangs who are promised a treat if they drag your battered body through the undergrowth. That'd put the frighteners into most people. And fright is the Stasi Fashion, if they don't get there in time to make you quietly vanish away.
> how do I prove that it was mid transaction when the power is removed?
Put a relay into the power lines, tied to a GPIO line [1], and flick it off mid-transaction. Depending upon what is doing the transactions, put this call into your C code, or use your external function interface from your database, or - you figure it out.
You can even measure the latency from the call to power off and move the trigger to just before the actual transaction.
You can even leave this code in, if you don't want a separate test build - just don't wire up the relay! Put a manual toggle switch into the test box to disable the relay when you are doing another test.
If you are really into it, put a relay onto the mains side and another onto the PSU outputs, then you can see if the PSU caps can keep things going *just* long enough.
Then go wild with setting it up to only trap certain calls, triggering the power on and off in rapid succession (how long until your PSU gives up the ghost?) etc. Think Evil.
Note: solid state relays for a quick response time.
[1] from an FTDI chip on a comm port - preferably not USB, to keep the latency down - or on a hardware serial port - check the motherboard connectors, they still exist - control lines or even the motherboard port for controlling RGB LEDs!
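A minimal sketch of that harness, in C++ - the relay call is a stub (the real one would flick the FTDI/serial control line), the names are invented for illustration, and the transaction is whatever your real code does:

```cpp
#include <atomic>
#include <chrono>
#include <functional>
#include <thread>

// Stub for the hardware side: in real use, set() would flip the GPIO or
// serial control line wired to the solid-state relay.
struct Relay {
    std::atomic<bool> powered{true};
    void set(bool on) { powered = on; }   // replace with FTDI/serial call
};

// Measure the latency from "decide to cut" to "relay told to open", so the
// trigger point can be moved to land just before the actual transaction.
double cut_power_and_measure_us(Relay& relay) {
    auto t0 = std::chrono::steady_clock::now();
    relay.set(false);
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::micro>(t1 - t0).count();
}

// Fire the cut from a second thread, part-way through the transaction.
void run_with_midway_cut(Relay& relay, std::function<void()> transaction,
                         std::chrono::microseconds delay) {
    std::thread killer([&] {
        std::this_thread::sleep_for(delay);
        cut_power_and_measure_us(relay);
    });
    transaction();   // on the real rig, this never gets to complete
    killer.join();
}
```

With the manual toggle switch wired in series, the same build does double duty for the other tests.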
> The good ones have on-drive RAM cache, cheaper ones use host memory.
Hmm.
*GOOD* ones would provide, say, an internal supercap to allow the cached writes to complete (you know, like industrial CompactFlash cards).
Cheaper ones that just want to game the average user's speeds will have their own cache that will fill and cause your write speed to plummet if you do anything "out of the ordinary".
The most cost effective & BEST will honestly report what they can do and allow you a way to mitigate that by sharing system RAM and allowing processes to be written to take all the factors into account (good old fashioned Producer/Consumer interactions). Giving you the opportunity to make your system consistent and predictable.
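That producer/consumer interaction, sketched in C++ as a bounded buffer - nothing here is a real drive API, just the shape of the mitigation: the producer (incoming writes) blocks once the consumer (the drive draining its cache) falls behind, so throughput degrades predictably instead of falling off a cliff:

```cpp
#include <condition_variable>
#include <cstddef>
#include <deque>
#include <mutex>

// Bounded write buffer: producers block when it is full, instead of an
// invisible on-drive cache filling up and write speed collapsing unannounced.
template <typename T>
class BoundedBuffer {
    std::deque<T> q_;
    std::size_t cap_;
    std::mutex m_;
    std::condition_variable not_full_, not_empty_;
public:
    explicit BoundedBuffer(std::size_t cap) : cap_(cap) {}

    void put(T item) {                  // producer: the process doing writes
        std::unique_lock<std::mutex> lk(m_);
        not_full_.wait(lk, [&] { return q_.size() < cap_; });
        q_.push_back(std::move(item));
        not_empty_.notify_one();
    }

    T take() {                          // consumer: the drive draining it
        std::unique_lock<std::mutex> lk(m_);
        not_empty_.wait(lk, [&] { return !q_.empty(); });
        T item = std::move(q_.front());
        q_.pop_front();
        not_full_.notify_one();
        return item;
    }
};
```

The capacity is the knob: size it from the honestly-reported drive behaviour and the system stays consistent.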
> You didn't know that? HOW?!
> utilise all the levels of multi-level cells.
Not everybody who uses computers (enough to have them on a LAN with a router - low bar, these days) actually gives a damn about those details, they just want their boxes to Do All The Things. And they've been told that SSDs are so much faster/better than spinning rust - after all, anyone can easily understand the principle of "wait until it spins and puts the right bit of disc under the head" - but the flash doesn't suffer from (at which point, the salesman kicks the engineer who was about to spill the beans).
Sadly, it is not just the politicians.
I caught a bit of morning news, a couple of days ago, where the sofa dollies were getting a young lady from the NSPCC to tearfully declare that the existence of E2EE (without a backdoor) was what was preventing them from protecting children.
Of course, absolutely no discussion of feasibility or what happens when the backdoor leaks.
Tables of seemingly random numbers that get loaded into hardware registers to initialise something or other. Ok.
> The source code for a work means the preferred form of the work for making modifications to it.
Clearly, you have not worked alongside the same hardware devs as I have.
They just LOVE writing their hardware setup directly into magic numbers! In a mixture of bases, just to keep us on our toes.
I've spent ages writing macros, with comments, so that I can write out "set (named) sub-register to (meaningful enumerated value) and set (other named sub-register) to (range-checked-at-compile small integer)". Only to come back the next week to find it all replaced by a single 32-bit hex constant! Yes, yes, the code is a lot shorter now, thank you for that...
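For the avoidance of doubt, the sort of thing those macros buy you - sketched in C++ for a made-up peripheral, with every register name, field position and width invented for illustration:

```cpp
#include <cstdint>

// Hypothetical control register layout -- names and bit positions are
// invented for illustration, not any real chip.
constexpr std::uint32_t CLK_DIV_SHIFT = 0;   // bits 7:0, clock divider
constexpr std::uint32_t MODE_SHIFT    = 8;   // bits 10:8, operating mode

enum class Mode : std::uint32_t { Idle = 0, Run = 3, Test = 5 };

// Range-checked at compile time when used in a constexpr context: an
// out-of-range divider makes the expression non-constant and the build fails.
constexpr std::uint32_t ctrl_word(Mode mode, std::uint32_t clk_div) {
    return (clk_div > 0xFF)
        ? throw "clk_div out of range"
        : (static_cast<std::uint32_t>(mode) << MODE_SHIFT)
          | (clk_div << CLK_DIV_SHIFT);
}

// Readable:  ctrl_word(Mode::Run, 25)
// Opaque:    0x00000319u  -- the "single 32-bit hex constant"
static_assert(ctrl_word(Mode::Run, 25) == 0x319, "register layout check");
```

Same bits in the register either way; only one of the two survives a datasheet revision.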
You might like to educate yourself over the difference between drivers for hardware and the kernel proper.
Unless you can demonstrate that EVERY single use of Linux MUST include at least one of those files then your claim is incorrect: it would be (is!) entirely possible to build and run an entirely free & open copy of Linux, even according to your standards.
Once past that - yes, some hardware does require opaque binary blobs to run. Some hardware requires you to go to the manufacturer's website and download a precompiled x86 executable (tough luck, foolish Arm/RISC-V/etc owner). Some require you to sign NDAs and hand over your firstborn into escrow. Nobody is even attempting to claim those are all open in any sense, let alone open source. But they run within Linux - and you are totally at liberty to shun them with all the snoot you can muster.
But the existence of any of those drivers does not impinge upon the kernel that they would run atop.
The onus is on you to back up your claim - get to it: demonstrate that EVERY Linux system MUST have the files you have mentioned.
> Does the proposed additional language require an interpreter or LLVM
Can you clarify, please?
By "LLVM" you do mean "low level virtual machine" rather than "anything coming out of the LLVM project"?
100% agree with "no (more) VMs in the kernel" but, if that isn't your meaning, apart from the fact that the sources are (currently) only fit for consumption by GCC (AFAIK), is there anything intrinsically wrong with the LLVM languages? Which might be taken as a dig at the current Rust compiler!
> They'll have to take another more pragmatic look at the "no C++" rule ...
Whilst I count C++ as my favourite language (at the moment, it may change - again) I can sympathise with not wanting it in.
Recent work has provided memory safe ways of working - especially for new code where you aren't stuck with loads of lines using another idiom - and whilst I (hope to) account for every byte used (gotta love microcontrollers), I have seen far too many cases of C++ code that has utterly dreadful performance.
From not apparently even *knowing* you can pass a const ref[1] to (in the most recent years) writing "good" code that "always did everything with an STL string" that ended up doing megabytes (I kid you not) of memory copies[2] just to spit out a couple of kilobytes of (badly formatted) data. The code "looked ok" but they just had no idea what was going on, when copies would occur... And that was just strings! What you can do with dictionaries of maps of dynamic vectors...[4]
It is *so* easy in C++ (and other languages, let's be fair) to not understand what is actually going on[3] and follow "the current style" - that may lead to a usable application but I can see kernel devs just banging their heads, repeatedly, over so many occurrences of things like that.
C may (!) be less than perfect, but you find it so much harder to hide your little memcpy fetish.
You *can* carefully introduce C++ into a project, and get a good result without (too much) friction. But you can also, all too easily, introduce something that sweeps through the codebase in a massive search and replace, like a tub of sugar-free gummy bears, and suddenly everyone is doing C++, ready or not. At least the Rust code is segregated...
[1] hey, you could speed up MFC code just by going through the sources with a "make parameter const ref" macro on speed dial; CTime, looking at you
[2] including places where the compiler would happily have concatenated those constants, once, at compile time, if they'd just written them as #def'ed and then (LEADIN "this unique bit" LEADOUT)
[3] and trying to get anyone to write an instrumented version of their code to just print out (only when the correct #if is triggered, like your debug build - you do have a debug and a release build, don't you) all the sizes of things being instantiated, copied, destroyed...
[4] for anyone keeping count, from earlier in the week, yes, yes I have just done a rant about "bad code I have seen" in a way that is likely to discourage a newbie from trying to join in an open source project
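For anyone who hasn't hit the const-ref thing: a minimal C++ sketch of the trap, with a hypothetical `Tracked` type standing in for "an STL string" so the copies can be counted:

```cpp
#include <cstddef>
#include <string>

// Stand-in for a fat payload type that counts how often it gets copied.
struct Tracked {
    static int copies;
    std::string data;
    explicit Tracked(std::string d) : data(std::move(d)) {}
    Tracked(const Tracked& other) : data(other.data) { ++copies; }
};
int Tracked::copies = 0;

// The innocent-looking version: every call copies the whole payload.
std::size_t by_value(Tracked t) { return t.data.size(); }

// Same call syntax, zero copies -- the "make parameter const ref" pass.
std::size_t by_const_ref(const Tracked& t) { return t.data.size(); }
```

Call `by_value` in a loop over a megabyte-sized payload and the instrumented build[3] will show you exactly where the time went.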
> environmental change that provokes genetic change to at least compensate for, or, more productively, take advantage of the change.
Bullshit.
Genetic change comes from mutation - which includes copying errors, not just "direct damage" from Cosmic Rays, non-Cosmic rays, interloper chemicals and so forth. True, the number of mutations that give advantageous results is low - BUT there are LOTS of mutations occurring in individuals and (in a population that is able to evolve) there are a LOT of individuals doing a LOT of reproducing. Of the rest, the majority are not harmful to the organism - not enough to be of any concern w.r.t. just getting on with the job of living - and reproducing. These LOTS and LOTS and LOTS all multiply together to give - a very slow process of change!
Oh, and add in changes due to sexual reproduction (i.e. a mix from both gives a new, unique, combination). Which speeds the whole process up and was a very advantageous change for the lucky blob whose forebears accumulated the mutations to allow it to happen.
To reach the final individual, add in changes in expression due to epigenetic factors (including, but not limited to, chemicals from the environment - but not all chemicals are toxic!). But those are not in any way "provoking genetic change": the 'ability' to react to that feature of the environment already existed, it just happens to be being expressed today. That expression may even be in terms of which genes get passed onto the offspring - but I do not have any examples of that, so, wild surmise, if David Attenborough hasn't told us of it, it isn't occurring enough to be one of your cornerstones of evolution.
Posting links to The Eye of Argon[1] and Atlanta Nights? What kind of sicko does that? Don't you realise some people might actually go and try to read those? Would you have that on your conscience?
[1] or the version with added bits
Unless you pay one beeellion dollars, we will crash your economy by forcing everybody to stay at home all day, watching the online map showing their delivery spiralling around the neighbourhood without getting any closer, spiralling, spiralling, until - we ring the bell once but have gone before you can answer it!
That plan is fool proof, I shall write it down now - What?! Mr Bond, you have broken all my pencils? Fool, I merely have to order a new box - see, the van is almost here - it has gone past - NOOOO!
Except that the whole thing will be AI created direct to overly-compressed files (suitable for streaming) with nary a 35mm frame to be found.
When the time comes for the 25th anniversary remastered version they'll be arguing whether the strange rectangular edges in the over-stretched shadows are remnants of original set designs[1] and we'll be able to buy action figures with carefully recreated compression artifacts.
[1] massive crystal spires
> Without support, our future STEM workforce will suffer, with major economic impact to the USA.
If the STEM education gets much worse and the rise to Theocracy continues, Musk is going to be fighting to keep the crowds away from his launches - not the sightseers, but the Pious who are enraged at his claims to be able to breach The Firmament.
It'd be a darn shame if Elon gets caught up in his own web of over-promising and isn't ready in time before the crowd crashes the gates and begins beating the ground crew with shiny new Trump Bibles [1] and tryin' ta lassoo dat eeenormoos heathen phallic idol standin' yonder.
[1] 2nd Edition, with all the weak-ass Commie nonsense taken out (thur won't no "mount" an' sure weren't no "sermon" thayr) and an extra 50 verses in Relevations.
We are still waiting for Musk to show us his sub working: “We will make [a video] of the mini-sub/pod going all the way to Cave 5 no problemo.”
Presumably it will be FSD itself down there.
> Typically US government contracts don't work that way...
> there are clauses that allow the government to cancel the contract at will.
At will, there is the key. Trump would allow a contract to his bestie to be cancelled just for some silly little reason like the ISS isn't there any more?
Not that Trump'd know, it'd be DOGE in charge of deciding which payments are a boondoggle and which are in the benefit of the American People, sorry, Person.
It will be - interesting - to hear what is said when astronauts Wilmore and Williams return to Earth and aren't showing signs of having been "trapped" in space, but merely the usual physical tiredness of a crew returning from a perfectly normal (plus a few days) length of mission on board the ISS.
If this were a bad SciFi movie, I'd expect the billionaire in control of their re-entry vehicle to arrange "a little accident", maybe just in their individual air supplies, "see, they suffered from having been up there too long". Or maybe a bit more dramatic (have the rest of Crew-9 been "behaving appropriately"?).
Hopefully, instead all we will have is tall tales from certain quarters, praising the two for "putting on such a brave face" and how "we must let them rest from their ordeal and not take anything they say as anything more than ramblings of clearly exhausted, but brave - oh, very brave - individuals".
Whatever happens, Mister X will try to put so much spin on it that we'll believe he is trying to beat that centrifuge launch company at their own game. Or he'll not bother and just call them names.
See JoshOvki's suggestion above: if Musk is contracted to supply NASA until 2030 but the ISS comes down early, SpaceX then show *their* half of the contract was held up (pointing to a stack, apparently ready to go) and demand NASA keep up their half (i.e. PAY).
Literally money for nothing (just keep the stack dusted and add a few theatre smoke machines for that "boiling off" look).
> will we be back in another 10 years wondering who is measuring the business value of all that AI in which organizations have invested billions?
Yes.
Yes, of course we will.
At least with "Big Data" you might be able to find somebody to take it off your hands[1] (but do it quick before everyone else realises they can't really make money from any old data either).
But what you gonna do with your "AI investment"?
Most of you don't own the kit, just bought a subscription, so by then you're just stuck with a decade's worth of random output. Bet you can't even track what text & prompts were used to generate it - or even know what parts of what emails and documents were generated by machine in the first place! You know you got some outputs, the bill says 45 million tokens worth - but just where has it all gone?
[1]"Awrite guv, 'kin gives yer a pony per terabyte, that's me limit. Ok? Roight, Jim, start grabbing drives. Yeah, yeah, we'll just file down the table headers then flog the lot to DoubleClick".
The editor and debugger are black text on white, the shell windows white on black - i.e. all just left on their defaults.
Whenever anyone came up and volunteered that "they agreed with my choice", I'd just bring the other window to the top and watch the confusion creep across their face...
You don't accept that anything deserves the label "graphic novel", they are all just "soapy paper"?
So you discount (or are simply ignorant of) the work of Bryan Talbot in recent years? The "Grandville" books, "The Tale of One Bad Rat", "Alice in Sunderland", "Dotter of Her Father's Eyes" (winner of the Costa Biography Award) or the recent "Armed With Madness" (those latter written with Professor Mary Talbot)? What about "Persepolis" by Marjane Satrapi? "They Called Us Enemy", George Takei et al (if you feel that biographic tales of modern life in Iran or of the internment camps in America are just soap operas, well...)? None of which were published as partworks.
For that matter, what is wrong with publishing as partworks? Do you really find that that diminishes the writing to the point it can be dismissed out of hand, as you feel it did "Watchmen"? You feel "Maus" is some 'soapy paper'? Do you equally dismiss the stories of Arthur Conan Doyle, Charles Dickens? Every book that started life in the pulps? Asimov? No doubt you also sneer at that feeble partwork, "Sense and Sensibility: A Novel. In Three Volumes. By a Lady"?
> 1) This shouldn't apply to LEDs. They should last for absolutely ages, and failures appear to be due to cheap electronics.
Go watch Big Clive as he disassembles LED bulbs, pops in a resistor and gets them running at lower currents, so they no longer get pointlessly hot.
Well, pointless unless you have some reason to want to have them fail early...
Shush.
*We* want them to sue - that can lead to all the paperwork, showing that foisting the AI on their users was a bad idea, into the public record. With luck, including correspondence with other peddlers that shows they were also of the same opinion, opening up a crack there. Fingers crossed.
Suing also makes for nice juicy news stories, that can drag on a bit, keeping fresh in the public mind (and the minds of the AI peddlers en masse) that this stuff is not living up to the hype.
Wait until he hears about the trans-istors that come into the US, smuggled inside the "chip packaging"[1].
Not to mention how many of them are bi-polar (although some of those also identify as hetero-junction as well, which is very confusing).
Then there is all the tunneling these things get up to once they are powered up inside the US borders...
[1] crisp packaging to rightpondians.
We all "know" that the only way to be efficient is to let independent companies take on tasks that the gubbermint is too wishy-washy to do properly. After all, it is clearly Musk who is doing everything[1] that NASA was supposed to do.
From the article TFA linked to:
> Three U.S. officials who spoke to The Associated Press said up to 350 employees at the National Nuclear Security Administration were abruptly laid off late Thursday, with some losing access to email before they’d learned they were fired, only to try to enter their offices on Friday morning to find they were locked out. The officials spoke on the condition of anonymity for fear of retaliation.
> One of the hardest hit offices was the Pantex Plant near Amarillo, Texas, which saw about 30% of the cuts. Those employees work on reassembling warheads, one of the most sensitive jobs across the nuclear weapons enterprise, with the highest levels of clearance.
> The hundreds let go at NNSA were part of a DOGE purge across the Department of Energy that targeted about 2,000 employees.
*Clearly* what was intended[2] was for Musk "I probably know more about manufacturing nuclear weapons than anyone alive today"[3] to take over reassembling warheads[4], which he could then "borrow" to equip Starship[4] for his 2022 Mars colonisation flights.
[1] well, apart from all the robotics, observations, collaborations, Science, piffling little things like that.
[2] wot, me speculate?
[3] ok, ok, but you know he'd come out and say that if asked
[4] why create something if you can get it ready to assemble?
> Go read the explanation of how Spectre works ... Go look at...
I have, ta. But any new refs are, of course, happily accepted.
> because there's more than one processes sharing the hardware resources
Well, yes, I was going to make that my immediate response: once you have a hole opened up by Spectre et al then it is potluck which bits of the shared resources are then also vulnerable. That is not specifically an issue with SMP:
(0) Spectre et al does *NOT* require SMP to operate - they happily affect single-core CPUs in a single-CPU system
(1) SMP does *NOT* imply all cores are on one die (and hence potentially vulnerable to Spectre) - see chiplets, multi-CPU mainboards, all the way up to multi-cabinet monsters
(2) Shared on-die resources does *NOT* imply SMP - there is no (logical) reason (at the programmer's-model level where SMP occurs) for separate processors (they need not even be CPUs) not to share the lowest-level hardware resources; there is nothing in the scheduling requirements that prevents that[3]
> threads are not separated off into a separate address space.
Well, no, that is the definition of what "threads" are. We use threads when we *WANT* to share resources between them. Curiously, we also have "processes" which *do* separate address spaces (and, where available, make use of the MMU to enforce that - until we use explicit mechanisms to create sharable resources).
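A minimal C++ sketch of that distinction: the two workers below see the same counter precisely because threads share one address space, and the atomic is the price that sharing obliges (a plain int here would be a data race). Two *processes* would each get their own copy instead.

```cpp
#include <atomic>
#include <thread>

// Threads deliberately share the address space: both workers bump the same
// counter, and both see each other's increments.
int shared_count_demo() {
    std::atomic<int> counter{0};
    auto worker = [&counter] {
        for (int i = 0; i < 1000; ++i)
            counter.fetch_add(1, std::memory_order_relaxed);
    };
    std::thread a(worker), b(worker);
    a.join();
    b.join();
    return counter.load();   // 2000: one shared counter, not two private ones
}
```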
> libraries that sprung up around C
And around every other language that provides an inviolable single-line-of-execution model to the programmer - you know, those poor fallible beings who have to actually *use* all this stuff - and we see enough problems moaned about around here because so many still seem to struggle with *that* model :-(
> The whole point of architectures such as CSP ...
True. And it is a shame they haven't taken off as much in the mainstream. Then again, the last Transputer project I was (peripherally) involved with, down in Brizzle, failed because the time spent in comms was vastly outweighing the time spent in actual compute. This may well have been a problem with the programming - *BUT* as they needed a result that actually worked better and they could do that without the Transputer...[3]
> The problem caused in designing hardware that panders to C's "see all memory"
Hmm, I really don't see why you have this idea that C has this "issue" that needs to be "pandered to": a C program is given a chunk of memory and works on that; at root, said program expects to be single-core[1]. But your program only sees the memory that it has been assigned - malloc() can fail before the entire hardware RAM has been exhausted, stack space be constrained!
Really, at bottom, you just seem to have a big issue with on-chip multiprocessing that has tried (and, as Spectre etc shows, failed) to create optimisation algorithms for the hardware (yes, it is an algorithmic failure, it just happens to be implemented "in the hardware" - well, the microcode and the bits below that) but then want to blame that solely upon the use of SMP (see above - nope) and then blame the popularity of SMP upon C (again, nope).
We could all do without the ills of Spectre - and of RAM-smashing exploits and everything else - but trying to scapegoat C is getting daft. There are other languages around that could have become the low-level lingua franca, C just happened to get there first (and, IMO, not because it was C but because it was dragged along in the wake of popular - not even "the best" - software that happened to be written in C).
The programmer's model (which does NOT have to match the reality beneath) of a single-line-of-execution is one that has proven highly productive. The provision of (various) multi-process models on top of that, which are not forced upon anyone, has - issues - but not anything specifically related to SMP (from the p.o.v. of the programmer, that just happens to be the next simplest model up, so, yeah, it'll be popular, go figure) and we have (or can have) very-close-to-C languages that provide sugar to alleviate them (but NOT totally remove; you can work around even the best planned language's restrictions the very second that you allow it to - tada - call into the operating system!).
[1] and? so what? many languages do - in fact, the coder's model for *most* languages does - some higher level ones are clever enough to "do better"[2] but we still want the C-level to build up to those tools.
[2] Prolog can be coded to do parallel searches - but beware, because they have to be resynchronised *and* if the code didn't need to backtrack after all then you can end up dropping the results of a lot of pre-emptive computation; where have I heard that before?
[3] Given the way that Spectre etc crept in, I wouldn't put it past hardware optimisers to be able to create a similar problem within the actual implementation of (some of) those designs, even the Transputer. "The programmer's model hasn't been changed", they'll say...