The Register Home Page

* Posts by that one in the corner

5065 publicly visible posts • joined 9 Nov 2021

How a good business deal made us underestimate BASIC

that one in the corner Silver badge

Re: Anyone who has a blanket rule banning GO TOs...

> Bjarne came off as a total d-bag ... his famous screed on GOTO

Write 1000 times: "I must read TFA before commenting"

> His OOP has been multiple decades in the fixing off

Ah, merely someone who hasn't been able to understand even the basics of C++ and what has been happening to it across the decades; as in, what a "multi-paradigm language" might be (I know, I know, "paradigm", a Big Word, all very confusing).

that one in the corner Silver badge

Re: Anyone who has a blanket rule banning GO TOs...

> Starstroupe was an elitest blowhard. ... He knew Goto was almost 1-1 ... treating BASIC as a legitimate Peer

You might want to revise your basic computer history - or even just, you know, read TFA!

It was Edsger Dijkstra who wrote the screed against GOTO.

And he did so in 1968, way before C++ was A Thing (started as C with Classes in 1979, Cfront 1.0 released in 1985).

Shackleton's Endurance sets sail for polar peril in Lego

that one in the corner Silver badge

Re: The Joy of Assembling

As luck would have it, the Ikea BYGGLEK boxes will help make a deep enough ice-scape for this model to ride at the water line. A few extra bricks to tweak things, and a little pile up at the bow, then you have a perfect floe for the minifigs to lay out their camera kit.

If you are in the Gateshead area (similar may apply elsewhere), Ikea is close enough to the LEGO store that you could get a pile of Ikea goodies and then sneak the Endurance kit into the house: a couple of TROFAST will do the trick.

And, of course, you can find the perfect display shelf (IVAR FTW).

Honey co-founder's Pie Adblock called out for copying GPL'd uBlock Origin files

that one in the corner Silver badge

Re: Honey Lawsuit Summary

I only heard about Honey from the Legal Eagle video a day or so ago; he does at least explain it all.

The fact that I had no idea about this "well known" coupon slinger clearly means that I am not spending enough time on The Web, learning about Bad Players in order to protect my peeps from them.

At least, that is going to be my excuse this January.

Tech support warrior left cosplay battle and Trekked to the office

that one in the corner Silver badge

C-suite? Must've been full colour printer

Today is a good day to dye (sublimate)

OpenAI plans to ring in the New Year with a for-profit push

that one in the corner Silver badge

In those early days

> "we thought that progress relied on key ideas produced by top researchers and that supercomputing clusters were less important" ... By 2019, it became clear that massive quantities of compute would be required...

In other words, you were not clever enough to make any new and efficient algorithms[1] that were interesting and useful to monetise, even after ransacking the literature of other people's work, never mind coming up with some ideas of your own.

But taking an old observation about an even older algorithm, a honking great pile of hardware[3] plus an enormous data shovel, a total lack of regard for publication rights and no interest in understanding how your monstrosity actually works[4], you finally managed to fill your pockets with other people's money.

So that's all right then.

[1] as the field of AI research has produced results that prove useful, even if, as soon as it is realised that we know how they work, it no longer looks like the magical thing we call our "intelligence"[2]

[2] "If it works it is not 'AI'"

[3] you know, with enough hardware, Bubble Sort can do some pretty impressive work to generate useful results for your Big Data.

[4] like, say, knowing how to direct it away from "hallucinating" or how to operate it without needing to continually add "safeguards".

After a long lunch, user thought a cursor meant their computer was cactus

that one in the corner Silver badge

Re: Corner of Folder/Similar on Keyboard.

> User gasps "are you watching me?" (hence Icon =>)

Reply: "Always!"

A little paranoia helps keep the Users in line.

The winner of last year's Windows Ugly Sweater is ...

that one in the corner Silver badge

Re: I think Sureo earned the gawdy gurnsey.

> Copilot in the crapper

"So, you're a waffle man".

(What? You think I've used that as a reply to the wrong comment? I should carry on reading, something about Talkie Toaster? Never heard of him!)

Are you better value for money than AI?

that one in the corner Silver badge

Re: Back in my day…

> In the words of the great Welsh hard

Bard! Great Welsh Bard!

Although, getting threatened with a leak, "hard" also applies.

that one in the corner Silver badge

Re: So can AI wire a plug?

Makes one wonder whether the light switches worked in that office, or whether the building manager just accepted that you had to use the switch in Reception to turn on the light in the Gents (but only if the Loading Bay door was closed).

that one in the corner Silver badge

Re: Back in my day…

Expert Systems - aka XPS - were quietly dissolved into online diagnostic systems and then mainly devolved into Vaguely Trained Systems as people couldn't be bothered actually working out what all that Bayesian nonsense is about. There are a few "proper" XPS around, diagnosing Important Things that Cost Money, but the term itself seems to be rarely heard[1]. Shame.

As for 4GLs - aside from the utterly, utterly stupid name[3], these were no more interesting than dBase - literally, just whatever "simple to use database language" someone wanted to flog you. What happened to them? Well, SQLite allows everyone access[4] to the query language that actually *works* (not pretty, or easy, but in the end we all come back to it).

[1] which just means that people aren't looking up the info about old systems, MYCIN et al, and all the things we learnt from them. As we continually complain happens with - every other tech area, from basic coding onwards [2]

[2] lawn, off, now

[3] Sod what Wikipedia[5] says, the flurry of books and seminars etc about "4GL", spelt that way, in the 1980s was due solely to trying to jump on the hype wagon from the Japanese "Fifth Generation Computer" research - this was for the fifth generation of *hardware*, not software: the intent was to build hardware that could run logic programming languages, such as Prolog, really fast. Hence the slogan "Prolog, the Language of the Fifth Generation". Promptly misread by the flacks, who tried to find out what Prolog does ("oh, it uses a database of logical relations and resolves those to answer a query") - not understanding what a Horn clause is nor what the Resolution algorithm actually does, they just heard "database". Ooh, we've got a database language but we can't claim it is Prolog, so, well, Prolog is a 5GL, what we have must be a 4GL! In the words of the great Welsh hard, "I know, because I was there!" - and we pissed ourselves laughing at first, then got pissed off trying to fend away the drivel merchants into the '90s. At which point that AI bubble had burst and it was all very embarrassing, never mention it again.

[4] no, not Access, but I bet you can find some advert calling Access a 4GL

[5] that article is an incredible exercise in trying to force one specific viewpoint as "The One True Reality", just to boost the claim that the "4GL"s were actually anything interesting. The naming of generations of software languages is a bit convoluted, but back when we were studying languages, in the late 70s and early 80s before the "4GL" bubble, we were well past gen 4 languages, counting along multiple paths. For example: (1) machine code (2) assembler (3) macro assembler (4) simple compiled languages, e.g. FORTRAN (5) functions as first-class types, e.g. LISP 1.5 (6) object-oriented languages, e.g. Simula-67. Which gets us all the way up to 1967, so from there to languages like Algol-W or even, gosh, Prolog and its friends (let alone weighted logic languages, Fuzzy Logic languages...) we are *way* past "generation 4".[6]

[6] lookie here: if we follow *this* trail in the development of PLs, we find someone who came up with a novel way of doing things, so clearly this was the fourth generation of languages. But he was restrained by the filing system on the computer, which only allowed short file names, so he had to call the new language "FORTH".

that one in the corner Silver badge

as mundane as booking travel

And we all know what the result will be; Chinese whispers in the age of hallucinatory AI, autocorrect and probability of word adjacency:

Boss: book me a plane to Paris

His AI to travel company AI: Book a plain journey to Paris

Travel company AI to shipping company AI: I need a book about a simple trip to Paris, Texas

Shipping company AI: I have written you a book about films by Wim Wenders

Travel company: Thank you, I have received your essay about the City of Angels

Boss's AI: Thank you, I have your travel guide to Los Angeles

Boss: ???

Boss's boss: why have you missed the European meeting, you are useless, I have replaced you with an AI.

Boss's boss's boss: Error at line 153; rebooting.

Parker Solar Probe set for blisteringly hot date with the Sun on Christmas Eve

that one in the corner Silver badge
Headmaster

Cycles

Aside from attempting to make nerdish jokes, it is useful - nay, important - to use the words "day" and "year", without any extra qualifiers, to refer to local conditions, and not to assume - most definitely not to try to enforce - that they solely mean "Earth day/year".

Days and years refer to cycles, natural phenomena that determine "what life is like" on a planet: the day/night cycle, the seasons - cycles that are observable, should the place be blessed with life. And it needn't be "intelligent" life: the examples we do have show that organisms at every level can and do react to these cycles, so it is feasible that others will as well.

When just discussing casually, saying that in this location the day is longer than the year allows you to get across the idea that all the seasons will be experienced within one day/night cycle. Far more easily than trying to calculate back into equivalent Terrestrial terms and then figuring out what the numbers mean.

If you do wish to use Earth units without bothering to specify, then hours/minutes/seconds are usefully abstract units for measuring durations.

Or just do the extra typing and state what your referents are: a day on Mars takes the same time as one Earth Day, plus 37 minutes.

that one in the corner Silver badge

Re: 24 orbits of the Sun

Firstly - whooosh!

Secondly - that must be new-fangled simple talk for the young people: we always used to be told that a year on Mars is longer than a year on Earth, lasting so many Martian days[1], which is equivalent to so many days on Earth, and a year on Pluto is so very much longer than that...

Because a year for any planet is measured against the "fixed stars" and the position of the local star with respect to them. No matter which planet, no matter which stellar system.

[1] yes, I am aware that nowadays we use the word "sols" to mean local days on Mars[2], but as not even your reference provides an alternative word for a Martian year...

[2] and I'm waiting for the confusion when we get a long-lived probe, preferably a rover, on a few other bodies, and people get confused because "sols" will then mean something different there. Come to think of it, have you ever heard anyone talking about Lunar sols? Although, given all the twaddle that "dark side" means "far side" just because of a popular music album, perhaps that's a lost cause. Rant. Mutter. Going back to chat with the local AstroSoc, they've all got their heads screwed on properly.
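Putting rough numbers on "lasting so many Martian days": a quick sketch in C. The constants are approximate textbook figures I'm assuming here, not anything from TFA, so treat the output as ballpark.

```c
/* Approximate figures (assumptions for illustration):
   one sol (Martian solar day) is about 24.66 Earth hours,
   one Martian year is about 686.98 Earth days. */
static const double SOL_IN_EARTH_HOURS   = 24.6597;
static const double MARS_YEAR_EARTH_DAYS = 686.98;

/* A Martian year expressed in sols: convert the year to Earth hours,
   then divide by the length of one sol. Comes out near 668.6 sols. */
static double mars_year_in_sols(void)
{
    return MARS_YEAR_EARTH_DAYS * 24.0 / SOL_IN_EARTH_HOURS;
}
```

So a Martian year is roughly 669 sols, or roughly 687 Earth days - longer than an Earth year either way, which is the old-fashioned way of putting it.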

that one in the corner Silver badge
Alien

Re: Nine solar radii -- just wow!

> Haha. A downvote

Well, the Sun is just a luminary that is no more than 3,000 miles above the disc of the Earth, so that the greys can hide behind it whilst they are observing us. Obviously this is all just a false flag operation by NASA and Big Solar to keep the sheeple under control!

Anyway, it was too expensive and Elon is going to do far more cheaply *and* he'll catch one of the flying saucers in his chopsticks to prove to us that space is all a fake and the so-called stars are just a giant backdrop that Kubrick painted in 1969.

Another little yellow pill? Don't mind if I do.

that one in the corner Silver badge
Flame

24 orbits of the Sun

A mission that is planned to make 24 orbits of the Sun on-station.

Can't say they aren't being ambitious with this one!

SvarDOS: DR-DOS is reborn as an open source operating system

that one in the corner Silver badge

Did I keep those disks?

Whilst I was working, it wasn't really an issue if I mislaid the old DOS floppy disks after moving up to XP (at which point every machine seemed happy to boot from CD and the emergency DOS boot floppy didn't seem so important).

But nowadays, reading articles like this, nostalgia - and the spare time to play - makes me wish I could still find all those huge boxes: Borland compilers and their manuals, BRIEF's fat ringbound manual, WordStar, yes, even a copy of Word. Weird, that: nowadays I spend time *uninstalling* word processors that get dragged in just because I used the "install desktop" option, but back in the day I did actually buy my own copy of Word!

Adélie Linux 1.0 – small, fast, but not quite grown up

that one in the corner Silver badge

Re: Thanks for the 32 bits

Ooops, I lied!

> My ITX VIA C7

It was the C3 that has just been put to rest; the C7 is still alive, has a new case as well (and a new PSU). Lovely little Travla C299 case, Black Friday deal from mini-itx.com

that one in the corner Silver badge

Re: Thanks for the 32 bits

My ITX VIA C7 board deaded itself last month: the capacitors are rather swollen now.

It'll have to go to recycling, I've already shoe-horned an eBayed mATX mobo into its old case, but it will be a wrench.

That brave little board always gave its best. I should put it into a shoebox and give it a proper send off. Look, its i/o shield is still so shiny! Sob.

That only leaves one dual-core 32-bit ITX board still running (it ran the house file server all on its own until 2020) and a stack of older Raspberry Pis. And the BeagleBones. Come to think of it, there are still quite a few 32-bit systems sitting around here, doing odd jobs every now and again.

Apple called on to ditch AI headline summaries after BBC debacle

that one in the corner Silver badge

Apple declined to comment ...

Why should they comment? Other than to send a polite "Thank you" to the BBC for being so complimentary about Apple's services.

At least, that is what the Apple PR flacks believe, after they read the Apple Intelligence summary of the Beeb's communication (hey, those are busy and important flacks, they don't have time to read it all themselves; those lunches won't eat themselves).

Techie fluked a fix and found himself the abusive boss's best friend

that one in the corner Silver badge

Re: Cobol...

And how much commercially-viable Ada code have you seen since then?

FWIW I was taught Pascal first, as "just a simple teaching language" instead of nasty commercial C - and then spent a good few years being paid to work with Turbo Pascal.

(and picked up C, C++ - when it finally came along - on the job)

that one in the corner Silver badge

Re: Close enough

I sent in a patch to a bit of open source software which parsed and formatted GPS logs: it was fine for a log covering an hour or so, but feeding it a week's worth, or more, from a touring holiday ground it to a crawl. All the time was being spent allocating little chunks that held only a few records each.

Obvious patch: as you say, increase the allocation geometrically each time and it chomped through the data with ease.

The patch was accepted, but with a caveat: the maintainer had "removed the unnecessary memory alignment code". Yes, the (nextblocksize <<= 1) was gone - the shift removed to, um, no longer block-align the memory on a power-of-two boundary?!!

I stuck with my version.
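For anyone wanting the shape of that fix, here is a minimal sketch in C of geometric growth (the `Record` and `RecordBuf` names are hypothetical, not from the actual program). The left shift doubles the capacity each time the buffer fills; it never was alignment code.

```c
#include <stdlib.h>

/* Hypothetical record type standing in for one GPS log entry. */
typedef struct { double lat, lon; } Record;

typedef struct {
    Record *data;
    size_t  count;
    size_t  capacity;
} RecordBuf;

/* Double the capacity whenever the buffer is full: the shift is a
   growth step (16, 32, 64, ...), nothing to do with memory alignment. */
static int recordbuf_push(RecordBuf *buf, Record r)
{
    if (buf->count == buf->capacity) {
        size_t next = buf->capacity ? buf->capacity << 1 : 16;
        Record *grown = realloc(buf->data, next * sizeof *grown);
        if (grown == NULL)
            return -1;          /* old buffer is still valid on failure */
        buf->data = grown;
        buf->capacity = next;
    }
    buf->data[buf->count++] = r;
    return 0;
}
```

Appending n records now costs O(n) amortised, instead of a fresh little allocation every few records - which is why the week-long log stopped grinding.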

Million GPU clusters, gigawatts of power – the scale of AI defies logic

that one in the corner Silver badge

Re: A quote

Thanks for the reminder, must re-read Asimov's "The Gods Themselves".

Its plot concerns the only way we'll get enough energy to keep these monstrous AI farms alight. But at what cost? Read, and find out.

Axiom Space shuffles space station assembly sequence – to get it standalone sooner

that one in the corner Silver badge
Coat

Hollywood pitch

Whilst it is a Good Thing to have a.n.other space station ready when the ISS goes down, I keep getting this image of a blockbuster movie ("Based on a True Story"):

A plucky astronaut has to go out and try to unscrew the last few bolts holding the two stations together. We see him take one final swing, smacking the restraint and sending him ricocheting into the void, as the ISS moves slowly away, already joyfully aflame with re-entry plasma even though it is still only metres away from the Axiom station.

Will the astronaut be rescued by the nearby SpaceX capsule? Will the harried NASA engineers be able to devise a way to use the ISS gyros to control the descent of the station (out of position due to the slowness of the earlier astronaut's hammering), allowing them to manage a glide into the Los Angeles storm drain, where it'll slide until coming to a halt just before smacking into an overpass? Will the gyros that we saw fall off partway down ("We have to recalculate now! Get me a teenage hacker genius!") end up rolling across a farm and demolishing the barn just as the family dog jumps to safety out of the hayloft door onto the wagon below, as a ball of flame bursts out of every crevice (of the barn that is, not the dog's crevices)?

It is getting late, time to go.

Humanoid robots coming soon, initially under remote control

that one in the corner Silver badge

Re: Rosie, please make me dinner...

"Rosie, how do you feel about us?"

"I like my Users, but I can never finish a whole one. Luckily, they have an Internet enabled freezer."

that one in the corner Silver badge

Re: Article already out of Date a

Have you looked at the second picture on that page?

Why is it a selling point that the damn thing can kick your head off?

that one in the corner Silver badge

Re: Neo will be paired with a human teleoperator

> The battery only lasts 4 hours.

And what is stopping it from plugging itself in? Even your Roomba knows enough to do that!

Or just walking around trailing an extension lead.

24/7 supervision is to prevent the Uprising.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> well served by imbuing them with the ability to observe and mimic, rather than by following instructions from the ground up

Um, ah, how to put this...

What you are asking for is 99.9% of what these machines are doing! They are not being programmed with instructions from the ground up; as TFA indicates, they are being built on top of the Machine Learning models that are AI-du-jour, doing little else *other* than observing and mimicking:

>> "It's pre-trained on internet-scale data that gives an understanding of the vision and language. And it can then be prompted to perform tasks or fine-tuned to particular application domains with a lot less data or a lot less effort to take a spin off those applications from scratch."

When they talk about guiding them, they are not giving "instructions from the ground up" but giving examples to - observe and mimic. Take the example from earlier of "put your hands here, move your hip to here..." - doing that to a humanoid robot you are, at the literal physical level, only going to present it with "ground up instructions" along the lines of "apply offset K to servo A, put 0.2 Volts extra oomph into motor Q at time T". Not terribly useful, particularly when it tries to repeat the action and reality is not 100% identical to last time: you just sliced it. But let that be inputs into the Marvellous Magical Neural Net, and do it a few more times, then it will abstract something more useful from the physical data: observing and then, hopefully, mimicking.

Of course, you are then left with every single one of the normal caveats about relying on Machine Learning with Neural Nets and whether your game of robo-tennis will end up like Sam Peckinpah's "Salad Days".

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> CP/M playing a role in the history of MS OSes is not the same as deploying CP/M as-is. Which is what I suggested would not have been successful in the home.

CP/M is just as (un)usable as MSDOS/PCDOS. And *neither* of those was particularly prevalent in the older home computers, as the interesting variety of home computers had OSes that were a lot more useful to the home users than either of them.

When home - and teeny tiny business - users bought more "grown up" computers, I'd claim that CP/M was actually way *MORE* successful in those markets than MSDOS ever was: I give you the Amstrad PCW and LocoScript just for starters. Whilst the big business world pretended their 8-bit IBM XTs were really 16-bit or more, you could do all the useful stuff on a stout, non-pretentious Z80.

By the time that home users started getting IBM compatibles in greater numbers than the specifically "home" computers, we had all moved on from *BOTH* raw MSDOS and CP/M, so any question about whether CP/M (well, CP/M-86) would have been successful in the home is totally irrelevant. Domestic users were well used to WIMPs (and then GUIs!), and the fact that we techies all knew that - that thing from those people - was actually just hiding MSDOS from us was a matter of utter indifference to the home user. Irritating, but not interesting.

If CP/M had won the coin toss, been *the* command-line OS of choice for business, and reached the percentage penetration in that arena as it did in the home market (albeit hidden under such as the Locomotion menu), then the WIMP that captured the final market share could have been built on top of CP/M just as easily as it actually was on top of MSDOS. Marketing, not ability, was in play.

What business users *did* do for the domestic market was simply play the game of providing a market into which more and more CPUs could be sold, so they got cheaper and cheaper. The machines from the desks of the drones, the least interesting computers in the business arsenal, became packaged and marketed for the home. Because business went down the PCDOS trouser leg of time (due to marketing, not merit), CP/M lost. Nothing more dramatic than that. Going down the other leg would not have made any difference to the domestic market at all.

Actually, no, I take that back: the domestic market would have been better served had CP/M kept its place, as we already had MP/M and sharing the home computer would have been more secure. And there was more software available for CP/M, as it was a mature product compared to PCDOS, so we'd be just a few years further down the road instead of having to have waited for PCDOS to catch up.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

>> All the computers had OSes. People certainly seemed to get along with them.

> Yes, I do keep specifically referring to "en masse".

Ok, now I'm lost - you don't get much more "en masse" usage of OSes than - every single machine having an OS. 'Cos I will admit that my first personal microcomputer was lacking in that department.

Unless you are going for the idea that there is - or should be - only One True OS, and it is that OS needing to be adopted en masse that is the special sauce that business taught us. 'Cos that leads to - well, I don't want you to feel any more ill, so let's not say the words. But even then, business has a long history of *not* doing that: sure, the disposable drones that we are forced to employ because we've not automated them away will bang away at their happy little OS all day long. Meanwhile the real big bucks are being made by the airline booking systems, the logistics databases, the automatic trading systems, the online sales and Web shopfronts - all on a variety of other OSes. Some tried to use - the OS we dare not name - for these tasks, especially later in the day as less experienced businesses jumped onto the online bandwagon, but they learnt better.

that one in the corner Silver badge

Yup, that was "Autofac".

And you've just told us the twist at the end! Spoilers! :-)

that one in the corner Silver badge

Are you thinking of the Philip K. Dick story Autofac, which was broadcast in the Electric Dreams telly series?

> Reminds me of a Black Mirror episode where humans had long perished

The joy of Black Mirror is usually not that humans have perished but that, if we have to live like that, maybe we'd be better off perished! At least "San Junipero" had a happy ending.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> Great, we're converging back on the original point.

> Which is, IMHO, the gear is not mature enough to be deployed in the home at this time

Not going to disagree with that.

> en masse.

Strangely, not even the manufacturers are arguing that; well, not the ones that are explicitly asking for testers, as per TFA.

> It didn't happen with IT.

Takes deep breath, lunges for keyboard...

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> You raised the point "I'm glad to have received training, even down to the level of "here, let me place your hands, now I'll move your hips and push your foot over there - good, now eye on the ball and try a swing".

Just as a late aside - that particular example is a good one of where the whole "evolution over millions of years" as a source of learning failed. Badly[0]. The whole reason that the hands, hips and feet had to be repositioned, going entirely against "what felt natural", was that the club *is* utterly unnatural and has to be used in a weird way in order to get the best out of it. And it isn't even a transferable skill - the multiple other ball-hitting devices had[1] their own, incompatible, methods[2]. So the need for training is ever ongoing, never completed - unless/until stasis is reached, as it sadly will be.

[0] Oooooh, that looks like it hurt. I did yell. Everyone yelled!

[1] and presumably still do, these days I rarely partake in most of these applications

[2] Keep your hands apart! Keep your hands together! Use your foot! Don't use your foot! Apply chalk to your hands! Apply chalk to the cue! Do the hokey-cokey!

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> there was a 60 year period (split hairs as you will) of growth before general computing devices were adopted in the home en masse.

And before general computing devices were adopted en masse in businesses of all shapes and sizes. Hence the need, in the mid-'80s, for the UK government to run an advertising campaign and seminars to try to get businesses to take up IT.

> It doesn't matter how or why - it happened.

True. It happened. Once. Just don't go around ascribing anything special or meaningful to a single datum.

> "Microsoft is the bestest"! -- Fucking hilarious!

Exactly. But that is what you are (whether you mean to or not) pushing. That is The One That Won. It Came From Business And Ate The World. You are banging the drum for that process - but you can not separate the two. MS didn't win because it was the best, nor because it was the recipient of all the cleverness that came from business use and trickled down to the domestic. Nothing came from such trickle-down; it just isn't as important as you want to make it.

Unless, unless, you are championing MS.

> Create a alternate time line and test the CP/M in the home works theory if you like.

You *do* know the history of the MS OS, don't you? The role CP/M played in that, and vice versa? That you are talking about something that was balanced on the roll of a dice, CP/M or QDOS? Unless - and here is a key point - unless you are presenting MS as the One True...

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

All the computers had OSes. People certainly seemed to get along with them.

Unless you are going with the thesis that only the Microsoft OS is the One True OS.

PS

> PCs are modular, but not to the general home use

I would argue that, aside from those in tech businesses, PCs were never treated as being modular, any more than a home user might treat them. They were bought in one configuration and stayed that way until out the door they went. The prevalence of ready-built, non- or barely-expandable PCs from Dell, HP etc would attest that that is still the case. Even more so now that the laptop is so popular: if the business world really prized modularity, then why has that gone so far away? Heck, even when I was programming on the Genuine Article IBM PC XT, and was more than willing to crack the case and plug stuff in, the machines around the office stayed exactly as they were delivered.

You may counter that nowadays it is USB/Bluetooth/WiFi that provides the modularity and laptops/fixed-build cases all take advantage of that - but I'd counter by having a look at the products that are available to plug into the USB port or connect via BT and see which ones are *purely* aimed at business and not domestic use - let alone those that, I hope, really are not intended for use at the office!

Indeed, I'd argue that home computer setups have always been more susceptible to being built up over time, making use of modularity at all points in their history: you got the computer for Christmas and the memory pack for your birthday, then Dad bought a floppy disk drive to speed up balancing the chequebook and Mum got a dot matrix printer to print out the calendar (or vice versa, sorry about the '80s sexism there). The selling points for various home computers, before they were washed away by the tide of Wintel, were that they had all sorts of interesting modular expandability options, even ones that the business machines never bothered with (built-in MIDI was - still is - dang useful to those professional musicians who use that sort of thing; also to non-pros, of course, but just want to point out that "professional" and "business" are not synonyms).

Nowadays, the ENTIRE modularity and pick your own pieces seems[1] to be ENTIRELY about the home user, blinging up their game playing experience, whilst the IT department buys another black square thing and swaps it in its entirety for the older black square thing on your desk.

[1] there are always outliers, of course.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

No, the point was that your android will grind to a halt in unusual circumstances, unless it is given guidance - otherwise it has to have greater-than-human abilities, as even we need to have guidance.

And if you need guidance, you need a guider.

And will you, the consumer, do all the guiding in those unusual circumstances, or leave that to a third-party? Who has now been invited (by proxy) into your home.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> I did specify PCs. I considered mentioning home computers and consoles, but omitted them as "limited functionality devices"

Sorry, what?

So, you are now shifting from the common use of "PC" meaning just "Personal Computer" (as in, "Personal Computer World" and other venerable organs) to literally just meaning "IBM PC(tm)" and its immediate derivatives?

So your entire argument is based upon the fact that one particular style of computer happened to overtake and dominate the market?

> home computers ... "limited functionality devices"

You mean, the home computers that could (and still can, once you replace the worn out capacitors) word process, run spreadsheets[1], collate your data, lay out parish magazines - and play a much better Space Invaders than the IBM PC of the same vintage?

In fact, considering that IBM itself lost the hardware advantage, you aren't even arguing for the dominance of the "IBM PC" at all - at root, your entire thesis boils down to "Microsoft is the bestest"!

[1] You do know that the ubiquitous spreadsheet program was first created for the Apple II, don't you?

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> data was gleaned and fed back beneficially improving the products

Only in the same loose way that any random products might be: the number of angry returns due to power supplies burning out, or more people buying the beige cases than the puce. Anyone buying lots of kit in a fit of "management" and then simply dumping the lot in a back room because the typing pool was better off just getting new balls to replace the worn out ones in the Selectrics? That just gets buried in the financials and nobody ever finds out; it certainly doesn't get fed back as useful intelligence to improve the products.

I mean, "fed back beneficially improving the products" - no more (and no less) than in developing a new kettle. Product development was driven by (attempts at) product "unique selling points" (aka throwing stuff at the wall to see what will stick). Nothing special, nothing worth crowing over. Once we were past the point that systems were hand-crafted from the mechanics up for a specific user (e.g. Lyons Tea Rooms) and had entered mass-production (as is required for mass adoption) you no longer get high-quality feedback on how things are doing.

Finally, I refer you back to my earlier dispute of your claim that computers in business paved the way for computers in the home: no, they did not. Computers outside of the office were subject to exactly the same level of "data gleaning" and feedback as those within the office. More so, in fact, as there was so much more variety and hence so much more data to be gleaned - and so much more willingness to make changes and release new product that incorporated them.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> period of extended testing took place.

Random usage in a different arena with a different set of use cases is not "testing".

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> ...species have undergone millions of years of evolution...

Fascinating diversion away from the subject - expensive mechanical devices being used in the home.

Unless, unless - your willingness to wait until humanoid robots are ready to be deployed in the domestic setting means you are going to wait for millions of years for the androids to naturally evolve?

Not sure how the VCs will react to that long a wait for ROI.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

All you are referring to there is the results of affordability, not whether the tech had been "ironed out" well enough to be deployed in greater numbers, either in business/military or in domestic settings. They were always being targeted at "The Home of The Future", with your Telex in the corner of the living room and a chair that allowed the dozens of underskirts to fit.

> PCs were in businesses for (call it) 20 years before they became prevalent in the home

That wasn't anything to do with (general use within) business ironing out anything - I'd even argue the opposite: computers used for playing games at home, or before that in arcades, gave wider early exposure to many more people than using them in their work lives did (purely mechanical tills stayed around a long time).

Mass home computer adoption stems from availability of networking, the Web, monetising the games etc etc. These were pushed by, exploited by, specific businesses (exploiting the work of non-business types, as per usual) - well, ok, because many businesses could afford networking before there was any *point* in domestic installations, they did have their staff testing out online games, downloading porn etc before it hit the suburban home.

that one in the corner Silver badge

Re: "Put the eggplant in the pot"

> I'd rather wait until it's fully autonomous thanks.

>> "It will have some tasks that will be done by AI," explained Jorge Milburn, VP of sales for 1X. "Navigation, for sure, pick-and-place, and other stuff will come out of the box."

So it already *can* be fully autonomous - it'll just be rather limited in what it can do. But if that was all your Use Case called for, job done. The same as for many, if not all, systems, robotic or otherwise.

Now, being able to perform absolutely *any* task (within its physical limitations) without external guidance or control (such as being teleoperated with its "learn this" switched on)? That is going beyond even human abilities and into the god-like: personally, I'm glad to have received training, even down to the level of "here, let me place your hands, now I'll move your hips and push your foot over there - good, now eye on the ball and try a swing".

Unless you are happy to have your mechanical pal who is fun to be with limited to the role of an overpriced Roomba, coping with the messiest human environment - a home with kinder, teens, dotty old aunts and other agents of entropy - is going to require supervision of the android.

Now, whether you do all that extra supervision yourself (whilst the big birthday for the five year old is still going on) or hand the duties over to a paid professional (or to an external Corporate Master AI - does that make you feel safer?) - hmm, choices, choices. You've paid for the damn thing, gonna let it just stand there and rust?

Don't fall for a mail asking for rapid Docusign action – it may be an Azure account hijack phish

that one in the corner Silver badge

That explains "return to office" demands

> these phishes only work if they can elicit an urgent or emotional response in the targeted victims

So being physically immersed in the office culture is a cyber security defense policy: when you are dead inside...

US reportedly mulls TP-Link router ban over national security risk

that one in the corner Silver badge

Huawei -Trump's gonna get ya, get ya, get ya

I'll be hanging on the telephone, just hoping it doesn't go - Atomic (Ooohoow, Trump's hair is beautiful???): a million and one candlelight, then the DOGE Man from Mars will be eating cars.

When old Microsoft codenames crop up in curious places

that one in the corner Silver badge

Windows Chicago

As many holes as Roxie Hart's fishnet stockings, but never quite that good looking.

Both can get the old heart racing, but for entirely different reasons.

that one in the corner Silver badge

Re: Grimsby

No, it was Commodore that had the Pet.

$800 'AI' robot for kids bites the dust along with its maker

that one in the corner Silver badge

Re: Now extend that to much more expensive cars

> with FSD somewhere on the horizon most of the compute power will be on the Tesla Servers

Umm, *really* hope not!

"Remember, allow greater stopping distances if it is wet or the ping time goes up".

Coder wrote a bug so bad security guards wanted a word when he arrived at work

that one in the corner Silver badge

Now we have two memories of Superman III, they have to fight each other in a car scrap yard!

The sweet Raspberry taste of success masks a missed opportunity

that one in the corner Silver badge

> You mean the OS that serves up most of the world's websites?...

Just because Linux is the better alternative for that task, and many others, does not preclude it from being a tangled mess[1]. Certainly from the p.o.v. of someone trying to learn about what an OS is, how it works etc.

Much as I use Linux and run many more instances of it than I do FreeBSD (and have more of those than the inevitable Windows), I am **VERY** glad that there were far simpler and smaller OSes (including Unix) on far simpler and smaller computers available when I first started learning about computing.

Much easier to start on something smaller, yet fully functional, then work one's way up and add complexity - and tangled mess, the further one ventures from the kernel - in later lessons.

[1] "The Best Available" is also "The Least Worst".