* Posts by that one in the corner

2380 publicly visible posts • joined 9 Nov 2021

Odysseus probe moonwalking on the edge of battery life after landing on its side

that one in the corner Silver badge

Re: Failure is an option

Just seal the balloon's feed tube (automatic cable/jubilee clip) and cut it on the balloon's side (sharpen the clip on that side).

The balloon will then (literally) rocket away, spinning merrily.

"In Space, no-one can hear you making whoopee cushion noises"

Starting over: Rebooting the OS stack for fun and profit

that one in the corner Silver badge

Re: In the absence of files...

From the first message, I was having a hard time understanding just what problem this is trying to solve.

Now you have given two use-cases, neither of which seem to warrant such a complicated scheme:

1) " if you uninstall an application you can lose access to data without realising it."

Is that something that happens a lot and is a real killer, so much so it needs this complexity? The file is still there, you can see it is still there; presumably you recall what application you uninstalled and can just re-install it? Or just look at the name and type of the file and install anything that claims to be able to read it (if you have no idea what was uninstalled, this may require a bit of searching, or just asking your friendly IT guy - but is it something that occurs often?). Indeed, if you are as totally application-oriented as this whole idea suggests, your thoughts would more likely be "Ah, I need this data that I usually access via SuperApp, oh dear, SuperApp is not installed, let's just install it then".

2) an idea that apparently "improves security because it prevents applications from opening data objects willy-nilly"

> An attempt to open secret_db.sql with - say - a hex editor will be impossible because you can't navigate to it.

Security for whom? We already have (unless you are suggesting throwing them away) mechanisms for providing security for the data based on the access rights granted to the User (many such mechanisms, of varying strengths); e.g. if your User does not have read-access to a directory containing secret_db.sql then you can't even see it exists, let alone navigate to it.

If an application is able to prevent data being opened then the only thing you are going to protect is the implementation details of that application's data files! That is adding a lot of complexity to the User's system to add a feature that is doing absolutely nothing for the User.

I'd also add that those two ideas can clash, badly. If you have uninstalled an application that can read a particular file - or never installed one in the first place - then methods to decide what type the file is include running the 'file' command and having a look inside with a hex dump. If the file is prevented from being opened because the correct application is missing and you can't identify the file type (no, file extensions are not always reliable) then - ooops. You really *have* lost access to that data!

> or as now when you try to open the file in your mail client the operating system suggests a compatible application

Once you have made your choice from the list of all the compatible applications that the OS has suggested (your OS only suggests one?), that isn't making the selected application "responsible" for the data; the email client just wants to let you view the data it can't render. In fact, if it is anything like my email client, the attachment as received is still in the email store, all you have is a temp file: if you want to keep that as a separate file, save it somewhere sensible now, as otherwise it'll get cleaned up along with the rest of the temp files at some arbitrary time after you've stopped viewing it.

> you open a compatible application and import it

Ugh, I absolutely loathe applications that "import" your data: either they just mean they'll open it for you (in which case, just say "open") or are making a note of its location for future reference (in which case, just say "noted") *or* they are moving the file, renaming it, trying to take it over completely and prevent you from ever daring to make use of it in any other program. I keep as far away from that kind of program as I possibly can. For example, having tried it, twice, I'll manage all my ebooks (of which I have many - thanks to Humble Bundle) in a simple dir tree rather than *ever* install Calibre again (unless they stop "importing" and just start "taking note")!

If that "import" behaviour is the sort of thing that you like then I can see why I'm having a hard time with this concept.

> The browser would talk to Inkscape to render the SVG

Doing things like providing a rendering service for SVG to any program that wants it is a reasonable thing to do - and, as you mentioned, there are and have been ways of doing that (OLE being one such attempt). But they are totally divorced from any idea of "being responsible" for the data.

Clearly I am not understanding what this concept is supposed to provide to the User and how it warrants such complexity - and, as far as I can see, such ambiguity over "what application is responsible for the data" and the potential for duplicating data with clones for no good reason. Ah well. If it is a good idea that I'm just not understanding then I'm sure it'll appear somewhere and, seeing it in action, the penny will drop.

that one in the corner Silver badge

Re: In the absence of files...

> I observe that a number of music applications seem to have given up on the idea of, for example, selecting a music album and playing it from start to finish

And applications that take your carefully arranged files, one directory per album, some of the (newer) files even containing all the MP3 metadata (so even the files know which album they belong to) and carefully ignore all of it.

Our car radio does precisely this - it'll find all the files, put them in alphabetical order and then play them. So the Christmas albums USB memory stick becomes unbearable - we get every. single. version. of. "Come All Ye Faithful" one. after. another. And end the trip with "We Three Times 11 Kings".

Even worse, trying to play one story from a USB stick with a collection of Blake's 7 audio plays? All the "01 - Title Music" together, then all the "02 - Credits" and so on. OTOH if you find the plots a little simplistic then you'll probably enjoy trying to follow every single one of them at the same time!

> surely, the producer of the album has already curated it for you

Yes, and, as with the audiobooks, made sure that the Concept of the album is made clear by the order of the tracks.

Clearly, this is all our fault for not finding out what format of playlist this player can use (including whether it allows dir names within the playlists) and taking the time to write a program to create them all, because that is something everybody can do.



This is relevant to the bigger picture, as it highlights the point: ok, you want us to label our data - but does everything that insists on reading those labels use the same format as everything else? Or do we have to keep on adding fields when trying to move from one application to the next?

that one in the corner Silver badge

Re: In the absence of files...

> they prefer categorisation by labels.

> Witness some of the comments here

What I notice, particularly in the Reddit comments example, is that they appear to be largely relying on labels that have been attached by a third party to widely-disseminated data (in that particular case, TV and Film, using things like the age ratings).

Other sources of data can also add labels (aka metadata) on behalf of the user: a photo on a mobile 'phone can[1] be GPS tagged, then looked up on an online map to add the venue name, and the shot sent to Facebook to have face recognition applied, tagging all of your friends; whilst my boring old camera will write a UTC timestamp and then anything else is left to me. So my photos are all stored in subdirs by date, with manual dir names added to give a brief description (venue, event, whatever is meaningful to me), if for no other reason than that is consistent over *all* my photos, no matter how old (including transfers from film), and I'm used to it by now.

Similarly, I have a load of MP3s that don't contain any metadata 'cos when they were ripped from the CD we were just glad to get an audio file. So now all my music is sorted using a directory structure that suits my needs, I'm happy with it - and don't really care whether or not the files that make up the latest album that I bought, directly in digital form, have any extra metadata in them.

Perhaps a major difference between the beardies and the youngsters is that the youngsters were never faced with doing all of the organising and labelling by hand, so never *needed* to learn any other organisational skills, such as creating their own hierarchic taxonomy of subdirectories, just out of necessity?[2]

[1] and, if the scenario given above is realistic, the youngsters would be less wary of just throwing *all* the photos at places like Facebook?

[2] although I do admit to also using a wiki full of notes (which can be searched by text, have category tags attached to pages, ...) where I also keep references to specific locations in the massive archive dir tree - but the amount of material that has been "tagged" in that way is minuscule, mainly 'cos the effort to do that is far too high - and it would lose the metametadata that those items are *so* super-special that they are worth the extra time to add them to the notes.

that one in the corner Silver badge

Re: Hit-and-Miss

> I mean, seriously, if it couldn’t, how could so many important ideas have been originated in it?

Well, version control is all about letting you make a complete mess of things and then have some hope that you can recover.

You don't *need* a VCS to create an amazing piece of software and within that originate many marvellous concepts.

It is just a *lot* safer and less nerve-wracking to do that with the VCS, especially if there is potentially more than one person working on a section of code[1]. DVCS extends that, making it easier (understatement) to spread your team around; it still isn't *necessary*[2], it just helps maintain the sanity of those involved.

[1] The first commercially shipped, shrink-wrap software I worked on was written by a team of half a dozen and we had nothing that could be called "version control" - we each worked on dual-floppy PCs and it was only *just* possible to compile and link the entire program on the single hard drive equipped PC that we all shared. The closest we got to version control was someone running grandfather-father-son backups. We were careful. And delivered. /macho

[2] Barring things like management refusing to allow the longer timescales due to posting floppy discs around the globe - those sorts of things control whether you'll get the resources to write the software, not whether it is possible to ever write it under those conditions.

that one in the corner Silver badge

Re: In the absence of files...

> Whether you call something a directory or a folder it's still an abstraction of a place in which to store other things, some of which may also be files or folders depending on the vocabulary you choose to use. You're confusing the abstraction with its implementation.

No - a directory is *not* "a place to store something" - it is an indirection, a reference to where the thing is stored.

Joe Bloggs is not "stored" in any of the paper directories - finding a reference to Joe then points to his physical location (or to another reference that you can then follow to finally reach him physically - e.g. a 'phone number).

>> Folders are a physical box to put things in. I put the thing in box A, it's still in that box later. I move it to box B, it's not in box A anymore.

That is why adding or removing a reference to Joe does not change the way he is stored, nor does it change any of the other references.

Similarly, a file system's directory entry is *not* the file itself, it is a reference to where the file can be found. You will often "move" a file from one directory to another, but that is conflating two directory operations into a single command: add the new dir entry, then delete the old dir entry. At no point[1] does the file itself change its location during this move.

The semantics of a "folder" and a "directory" are very different, but everyone continuing to use the UI term "folder" is clearly confusing/limiting people's thinking.

[1] ignoring later optimisations that were added to the file systems, like storing the contents of a tiny file inside its directory node - which then causes all sorts of fun when you create another link to reference that data
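If you want to see that for yourself, here's a minimal sketch in Python, assuming a POSIX-style file system (`st_ino` is the inode number, i.e. the file's real identity on disk):

```python
import os
import tempfile

# A "move" within one file system only rewrites directory entries;
# the file's inode - its actual identity and storage - is untouched.
tmp = tempfile.mkdtemp()
os.mkdir(os.path.join(tmp, "a"))
os.mkdir(os.path.join(tmp, "b"))

src = os.path.join(tmp, "a", "fred.txt")
with open(src, "w") as f:
    f.write("hello")

inode_before = os.stat(src).st_ino

dst = os.path.join(tmp, "b", "fred.txt")
os.rename(src, dst)  # add new dir entry, delete the old one

inode_after = os.stat(dst).st_ino
print(inode_before == inode_after)  # the file itself never moved
```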

that one in the corner Silver badge

Re: Well, at least this reminded me to have a play

> Python is the only language I'm aware of that needs a "pass" statement


I am still such a Python neophyte that, seriously, I only learned of "pass" last week - and then it took hours of scouring to find it - it is buried hundreds of pages deep in "Learning Python", skipped over entirely in lesser beginner's guides (as they sneakily avoid situations where it is needed) and just what the heck do you google for if you don't already know the name but are hoping something like it exists?
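For any fellow neophyte still hunting for it: `pass` is Python's explicit do-nothing statement, for the places where the grammar insists on *some* statement. A couple of typical uses:

```python
# 'pass' does nothing - a placeholder wherever Python's grammar
# demands a statement but you have nothing to say (yet).

class NotReadyError(Exception):
    pass  # an exception type that adds nothing beyond its name

def todo():
    pass  # stub out a function while sketching the design

# Also handy for deliberately ignoring an exception:
try:
    todo()
except NotReadyError:
    pass  # swallowed on purpose

print("still here")
```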

that one in the corner Silver badge

Re: The argument for keeping different horses for different courses runs thus...

> we'll have another look to see whether all that work to recreate exactly what we already have was really worth it.

Um, why would you, or anyone, want to recreate *exactly* what we have now? We have Call of Duty 6, you can play it now, why recreate it - and do so by changing over to a new language at the same time?

Maybe you are thinking that CoD6 is deserving, in the future, of the cult status that many retro games already have, so we must be sure to be able to run it, exactly as it is now, on the Machines Of The Future? Well, maybe, but then we'd use the same methods as we do for the current retro games - fire up a VM and run it in that.

In the course of time, unless you know of some fundamental reason why not (in which case, please share it), there is no reason to think that one couldn't build a video game of the scale of CoD totally natively to a Lisp-based system[1].

But it will get to that point piecemeal, just as CoD did on the PC (what, CoD came into being as a single project, not in any way pieced together from and building upon existing materials?). But did anyone promise the new platform as a Gamer's Paradise from Day One? Maybe, if it happens, it'll take over from a different starting point and, for the first couple of years after it trickles down to all the home users, there will be a resurgence in console gaming whilst the new stuff ramps up behind the scenes.

As for the database - why recreate it, especially as it is running on a server and we can just talk to it over the network? Heck, if it is a big industrial database it is probably already running on a different OS than its clients are!

But we can write some new database clients using Smalltalk - after all, how many people actually write the database server, compared to the number that write new databases to live in that server and new clients to access that data in User-appropriate fashion?

Or did you actually mean that you don't see any purpose to a new programming idea until it has been used to totally replace the current crop, in one fell swoop?

[1] Lisp-based doesn't restrict you to only writing raw Lisp, of course; just like the use of C in CoD doesn't limit it to being written in C (in fact, Googling it, C isn't even mentioned - only C#, C++ and, oh, Lua, which is implemented in C but whose users aren't writing C, they are just using the language built on top of it). Many a domain specific language has been implemented in Lisp.

that one in the corner Silver badge

Re: Hit-and-Miss

> That's application level detail

Hmm, nope. It is the difference between tacking version numbers onto files (which you can do manually, VMS just did it automatically) and "version control" as per a "version control system" or VCS. There are "application level" features built on top, some of which are so common that we associate them with the concept of a VCS, even though they are totally generic facilities: most importantly, diff and merge (which you can do to anything, with varying usefulness, not just the contents of a VCS).

Ignoring the use of compacted storage (e.g. just saving the results of the diff) - which is a pretty big thing to ignore, but never mind - version numbers do not a VCS make.

Even the crudest VCSs, which operated on the single-file level, such as SCCS and RCS, provided more than VMS versions did when you saved the next revision - basically, more metadata was added. Some of that VMS retained - e.g. the timestamps on each version.

But nowadays, we tend to shudder at the thought of going back to RCS and file-by-file versioning. The use of CVS (and later) features, such as tracking renaming and grouping changes to multiple items as a single atomic change, are beyond what VMS would do for you.

Perhaps cheekily, let us not forget that automatically saving FRED:001, FRED:002, FRED:003 etc does not even provide the level of meaning we attach to "saving the copy with a new version number", such as FRED:1.2.0, FRED:1.2.1 and FRED:2.0.0 (yup, the third copy was a major change in the file).
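As a toy illustration of that last point (the helper names here are made up for the sketch, nothing standard): a dotted version string can be compared component by component, so the numbering itself tells you how big a change was - which FRED:001-style counters never can:

```python
# Hypothetical sketch: sequential copy numbers carry no meaning,
# whereas dotted version numbers encode *how big* a change was.

def parse(ver):
    # "1.2.0" -> (1, 2, 0), so tuples compare component by component
    return tuple(int(part) for part in ver.split("."))

def is_major_change(old, new):
    # a bump in the first component signals the big rewrite
    return parse(new)[0] > parse(old)[0]

print(is_major_change("1.2.0", "1.2.1"))  # False - just a patch
print(is_major_change("1.2.1", "2.0.0"))  # True - a major change
```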

that one in the corner Silver badge

Re: Hit-and-Miss

> The cost of PMEM is not such a critical aspect as, one might presume, it becomes just a tier in a seamless pool of storage

We can already do this without the PMEM: you just[1] have to memory map the entire raw disk, without worrying about creating a file system first. You can even leave all that to hardware and not bother the (programmer's model of the) CPU at all, giving you a transparent memory model from L1 associative cache down to spinning rust. Which I'm pretty sure a few applications have done, albeit ending up laying down a data structure that you could easily just call a task-specific file system (e.g. a database taking over a raw partition instead of using an OS-provided file system).

Which is fine and dandy, until you wanted to use more storage than the CPU could address (ok, we hit that immediately for 16-bit CPU addressing, very quickly for 32-bit but, whilst 64-bit made a mockery of the 48-bit LBA hard drives and their feeble 100-odd terabyte limit, by then we were rather stuck in our ways).

But once you do have your memory-mapped drive *or* your PMEM, to protect against software (and some hardware) crashes you apply the same methods to data structures in all of memory as we do now to those on the hard drive: i.e. journalling, atomic writes to switch a data structure from one consistent representation to a newly copied-and-modified-then-verified-consistent representation, and so forth. Which means a *lot* of discipline by programmers (not everyone actually writes database programs all day, so they aren't very used to thinking about doing everything by read-modify-commit transactions) - or, preferably, compiler/language runtime support (which sadly isn't, as far as I know, fundamentally built into Smalltalk or its ilk, as per the suggested languages in the article; anyway, how do you automatically decide when a totally arbitrary data structure is in a new and consistent state?).

[1] he says, glossing over the issue of unexpected power loss, but keep on reading.
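The mechanism itself is easy to play with from Python's standard library - here memory-mapping an ordinary temp file as a stand-in for that raw disk (a real system would map the block device itself, and worry about all the caveats above):

```python
import mmap
import os
import tempfile

# Sketch: treat a file as directly-addressable "memory". This uses an
# ordinary temp file as a stand-in for a raw device, just to show the
# mechanism; nothing here handles power loss or consistency.
path = os.path.join(tempfile.mkdtemp(), "fake_disk.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # a 4 KiB "disk"

with open(path, "r+b") as f:
    mem = mmap.mmap(f.fileno(), 0)  # map the whole thing
    mem[0:5] = b"hello"             # plain memory writes...
    mem.flush()                     # ...explicitly pushed to storage
    mem.close()

with open(path, "rb") as f:
    print(f.read(5).decode())  # hello
```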

that one in the corner Silver badge

Re: In the absence of files...

> If another application wants that data it has to go through the owning application.

So do I *have* to have Word installed, as it is described as the "owner" of the file that you sent me? Or can Libre Office be used? What if I have both installed, because I'm supporting Users who each choose a different program?

Is that SVG file "owned" by Inkscape or by the web browser from which I used "save link"? What about the SVG file I created using Inkscape but want to view in the browser? What about when I uninstall Inkscape? What about when I just open the SVG in a plain text editor to do a quick search and replace of an element?

Is that JPEG owned by Irfanview, my favoured quick'n'dirty viewer, or by the Sony program that copied it from the camera? What about the JPEG copied across from the Olympus DSLR? Or the images from my mate's Canon? Maybe they are owned by Lightroom whilst I'm in an editing frame of mind, trying to correct the colour balance? Or back to Irfanview as all the editing required is a simple rotate and crop?

that one in the corner Silver badge

Re: In the absence of files...

> This is probably because there is no physical analog.

There is - a directory! Or a set of directories.

You can use the directory of local trades, or the directory at Companies House, or the local phone directory of every subscriber, or the directory of local council contractors, or the directory of members of the plumbers & gasfitters union, or the directory of Corgi[1] registrations - all of which can lead you to Joe Bloggs, of Bloggs and Sons (gas) Ltd, who fitted the new mayoral kitchenette.

Which is why computers actually have directories full of files, not "folders" - that is just a weird UI choice made by someone who created a GUI, and it unfortunately stuck (a great shame, as we definitely still had lots of examples of real, paper, directories in use when that silly choice was made).

[1] old name, but far more memorable than whatever it is now

that one in the corner Silver badge

Well, at least this reminded me to have a play

Despite not having had success applying Lisp or Smalltalk in the Real World[1], this article has reminded me how exciting all this was.

Now that I have some more time on my hands, I have at least made a few notes, reminders to dig back into the piles of books (Blue, Green, Red and Purple - I still have them all in dead tree form), look up the latest installer for Squeak (and Pharo, why not) and give the middle mouse button some attention.[2]

I might even dig out my hacked copy of Budd's "A Little Smalltalk" (although that reeks of masochism these days, for a while it was the best - yikes - that was actually available)!

Thank you, Liam, for a definitely thought-provoking article.

[1] we tried, back in the '80s; I was even hired 'cos I could namedrop "object browser" and draw a picture to explain CADR, although they did demand I came back looking a little less like a programmer and a bit more City of London!

[2] But first, I am still gritting my teeth and trying to convince myself that semantic white space is bearable and Python *can* be learnt! Oh, why didn't they release the "Raspberry Lua"?

that one in the corner Silver badge

Re: Ah, LISP

Try "The Implementation of POSTGRES" - I wasn't presented with a paywall.

>> Our feeling is that the use of LISP has been a terrible mistake for several reasons. First, current LISP environments are very large. ... As noted in the performance figures in the next section, our LISP code is more than twice as slow as the comparable C code. Of course, it is possible that we are not skilled LISP programmers or do not know how to optimize the language; hence, our experience should be suitably discounted.

>> ...

>> However, none of these irritants was the real disaster. We have found that debugging a two-language system is extremely difficult. The C debugger, of course, knows nothing about LISP while the LISP debugger knows nothing about C. As a result, we have found debugging POSTGRES to be a painful and frustrating task.

that one in the corner Silver badge

Re: NV RAM never entirely went away & predates Optane

> NV RAM / PRAM. It was static memory with a lithium coin cell. Microcontrollers could be powered off and resume.

As used in the Palm Pilot PDAs (although the power was triple-As, with a small backup whilst you switched in new batteries - which was *not* a daily occurrence!)

> Some could "sleep" due to having a static design. You could literally stop the CPU external clock to pause execution.

The CDP1802 COSMAC ELF was a brilliant first computer to learn assembler on, for exactly that reason. Make a simple two-transistor logic probe[1] and watch it all happening, one clock at a time.

[1] rich sods built a couple of dozen probes and soldered them onto the system busses; flash gits. Mutter.

that one in the corner Silver badge

Re: Windows NT

Under all the Usual Suspects operating systems, you can happily memory map a file across the network. With caveats.[1]

You should be able to use memory-mapped files in any generic programming language - Python, Smalltalk, Lisp etc. - unless they are being a pain in the backside and not allowing you to take advantage of the OS features.

[1] the most obvious being consistency[2] if multiple processes are mapping the same file and at least one is writing to it: across the network, those processes can now be on distinct machines, so unless you (use a library to) do the hard work, you are not guaranteed that both processes will see the same contents at the same time.

[2] actually, I suppose *the* most obvious is that the server can be unplugged without the client being told and your next attempt to page the file will cause an exception! But that is hard to prevent just by linking in a helpful library.
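A quick Python sketch of the happy case on a single machine - two mappings of the same *local* file do stay consistent, because the OS backs both with the same page cache; it is exactly this guarantee that evaporates once the mappings live on different machines:

```python
import mmap
import os
import tempfile

# Two mappings of one local file: a write through one mapping is
# immediately visible through the other, since the OS serves both
# from the same page cache. Across a network FS, no such promise.
path = os.path.join(tempfile.mkdtemp(), "shared.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 16)

f1 = open(path, "r+b")
f2 = open(path, "r+b")
m1 = mmap.mmap(f1.fileno(), 0)
m2 = mmap.mmap(f2.fileno(), 0)

m1[0:2] = b"hi"                 # write through one mapping...
print(bytes(m2[0:2]).decode())  # ...read back through the other: hi

m1.close(); m2.close(); f1.close(); f2.close()
```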

that one in the corner Silver badge

Re: Costs and Benefits......

The Lisa required far more hardware than the Mac, and definitely way more than the early IBM PC, which helped push its price way up there. Who on Earth would ever want a computer that could take 2 MB of RAM and a hard drive? One floppy and up to 512kB was enough to get the work done all around the office.

And a soft power-off function? Pah, the BRS is cheaper and all you need!

(The Lisa GUI was a lot nicer than the IBM PC, but I'd been spoiled - the PERQ had a bigger monitor)

that one in the corner Silver badge

Chickens and swans: they are harder to tell apart than one might think

Is there *really* some magical difference between using the fondleslab and the PC? And I'm admitting here to using Windows most days[5]

> You have a grid of icons and you can't directly see your "documents". Open an app and it picks up where it was. Turn off the fondleslab and when you turn it on it's right where it was, although some things will be slower to open at first.

You know, that sounds like my day-to-day experience with the PC for, ooh, decades now. *The* "productivity tool" I use is a plain old programmer's text editor - and those have been reloading the last state for donkey's years, including cursor position[3]! And way better than that, tracking a whole lot of different sets of states that I can switch to easily.

Just quickly comparing the Android[1] devices sitting next to the PC, on both in their normal configuration:

* I have to start up a special program in order to 'directly see my "documents"' (although I can create an icon that will directly open a particular one): xplorer on the PC, Ghost Commander on 'phone & tablet.

* Some programs will automagically reload the previous state, some won't; the web browser is the obvious example (which started doing that on the PC back in - not sure, was it Opera around about 1997?) but the PDF reader(s) will as well[2]

* Other programs steadfastly refuse to reload the previous state, but do something more useful (e.g. load the night sky as it looks today, right now, by default)

* Other programs don't (and I sometimes think I'll hunt down something that will, but it is more faff to do that than not bother; hey, I probably just need to check the options): on both, opening the photo browser program directly means I then have to navigate to whatever I want to see today, even if it is the same as that thing yesterday.

* When I turn on the PC[4], it starts all the bits I *want* it to start, in their previous state (if that is appropriate)

Oh well, maybe my PC is an outlier, running so much weird stuff that it is unrecognisable to Joe Bloggs.

Okay, on the PC it is a *lot* easier to drop into the Really Oldie Days way of doing things, and I do do that (frequently) as, strangely enough, using a command line can be a really efficient way of doing *some* parts of the task. So maybe that is the big difference between the two? But if I decided to load

[1] full disclosure, haven't used an iOS device for more than a few tens of minutes, but I'm told the experiences are similar

[2] actually, both the PC versions win here, as they reload *all* the docs I had open last time, right at the relevant page; great for hardware refs.

[3] seriously, donkey's years; the *current*, latest, exe for my favourite editor, the one I open first in the morning, CodeWright, is dated 01/08/2003

[4] if I used my laptop more, it would hibernate and come back up in its last state even faster!

[5] I have a Real OS running as well, honest, Windows is just in a VM - with sole access to this GPU, so it *is* all very hackery and clever; stop looking at me like that!

that one in the corner Silver badge

Re: Hit-and-Miss

> Let's talk about "version control" and LISP and Smalltalk. Oh, they don't have that, so we're back to BASIC-style "SAVE PRG001.LSP", "SAVE PRG002.LSP", etc.

Absolutely. And forget about diff'ing the saved images to get a meaningful view of what has changed from one version to the next.

And, for that matter, wanting to run different images alongside each other, for the sake of comparison: at that point, you are into loading VMs and selecting the images to run, which is a sudden jump in complexity (especially in Bygone Days).

As much as I had wanted to like Smalltalk, since reading the dedicated Byte issue, that was one of the ideas that I really was never comfortable with, so never really got to grips with "the proper Smalltalk experience" enough to complete a product in it - not helped by being asked how to remove all the debugging and object examination features to turn it into something that was deemed shippable![1].

(FWIW All my LISP work was done the normal way - editing the source in the same editor used for Pascal, roff etc; and I got a couple of the Squeak books - and so many versions downloaded - hoping to "really get into it this time" but, well, not so far, despite being one of those C++ programmers who still really likes using OOP)

[1] There probably was a way to do that but it didn't seem to be documented at the time - maybe we hadn't bought the full "commercial-grade package"? - and, unlike nowadays, you couldn't Google and find a YouTube video entitled "21. Tutorial. How to distribute a Smalltalk application".

Judge slaps down law firm using ChatGPT to justify six-figure trial fee

that one in the corner Silver badge

Re: Second time in two days ...

> But it does seem as those most of the hate here is based on fear

Hmm, but you said

> I have already openly admitted that I have experienced ChatGPT be very wrong about very cut and dry factual matters.

So it must be fear, not a desire to have reliability, especially when there are apologists around (hint: mirrors exist) to wave away the unreliability as unimportant.

So, unreliability is unimportant in something that is being literally thrust upon the masses, who have no way of even judging whether what they see came from that source? Let alone, if it did come from ChatGPT, in which direction it has decided to be unreliable today.

Relying on the unreliable is a common trait, but at least you *know* that you've just read an Info Wars article.

that one in the corner Silver badge

Re: Second time in two days ...

> That is, generally, a good thing.

But not a thing that is related to the point doublelayer was making.

Reacting to the context of the conversation would indeed be a useful feature, if it were based on a solid foundation.

Although, what is the conceptual difference between "reacting to the context of the conversation" and simply re-starting from scratch and feeding in the whole conversation again as the prompt?

In other words, "reacting to the context" is really an amazing feature only insofar as they are spending the resources just to keep your session alive whilst waiting patiently for you to type something, rather than dropping it and letting another process use the expensive hardware.
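To make that point concrete - a minimal sketch, not any particular vendor's API (the `fake_llm` stand-in and the message format are both made up): with a typical stateless model endpoint, the "context" is nothing more than the client replaying the whole transcript each turn.

```python
# Sketch of how "conversation context" typically works with a stateless
# LLM endpoint: the model keeps no memory at all - the client resends
# the entire transcript as the prompt on every turn.
# (Illustrative only; fake_llm is a hypothetical stand-in.)

def fake_llm(prompt: str) -> str:
    # Stand-in for the model: a real call would send `prompt` over HTTP.
    return f"reply to {prompt.count('User:')} user turn(s)"

def chat_turn(history: list[dict], user_msg: str) -> str:
    history.append({"role": "user", "text": user_msg})
    # The whole history is flattened and replayed from scratch each time.
    prompt = "\n".join(f"{m['role'].title()}: {m['text']}" for m in history)
    reply = fake_llm(prompt)
    history.append({"role": "assistant", "text": reply})
    return reply

history = []
chat_turn(history, "Hello")
print(chat_turn(history, "And again"))  # the model only "remembers" via replay
```

So "reacting to the context" and "starting from scratch with the conversation so far as the prompt" are, mechanically, the same thing.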

that one in the corner Silver badge

Re: Bullshit

> a fairly new technology

New? In comparison to other computer tech?

Neural Nets predate WYSIWYG wordprocessors. By decades.

The only "new" thing here is the scope of the hype, the way it has reached pretty much everyone who uses computing devices on a daily, nay, minute-by-minute basis.

Actually, even that is barely true. As noted, NNs are old and people have been building them and offering them to compulsive computer users pretty much the whole time. Without much success, beyond making money before the bubble bursts (to the point of just-inside-the-letter-of-the-law fraud[1]).

There have been - still are - places where NNs have done good - have saved lives[2] - so there is certainly no reason for thinking that there is some irrational anti-Net bias: but those genuine successes are not the subject of the massive hype machine.

The change now is that so many, many more of us are now in the minute-by-minute computer users category and the economy of scale means that spending one dollar per user has allowed a (very small number) of players to build some very big 'nets.

But being big doesn't mean the maths and logic behind these things has in any way suddenly changed[3]. All the flaws are still there[4].

[1] e.g. 1980s/90s, as automated trading took hold: take a stock exchange feed, train up as many 'nets as you can get PCs to run - don't worry, the data rates are low enough and you don't need to waste 'net nodes on pseudo-parsing natural language - then switch them to output mode before trading opens the next day: ta-da, predictions for the market that day. Repeat for, say, a month. Most models' predictions made losses - but this handful won Big Time! With a totally straight face, sell copies of that handful of 'nets (with witnessed guarantees that they did make those predictions!) to anyone with a big old pot of cash.

[2] 'nets used in medical image screening - *but* we just happen(!) to give very high value to the true positives, enough that we are willing to forgive the false positives - and quietly shrug our shoulders over the false negatives, because we are still catching more than we did previously.

[3] there are good reasons why AI research spent time on more than just NNs - and it wasn't just because the hardware wasn't available: Moore coined his law a while ago (and the current LLMs are appearing in a time where we are concerned that said law is losing steam)

[4] big bugbear: inability to explain their reasoning *and* to have that path tweaked to improve the results; instead, next run (irrespective of whether you change the prompt) you get a totally new output and need to go over every inch of it again to look for flaws.
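Footnote [1]'s trick, by the way, is trivially cheap to reproduce without any market data at all - a quick sketch (every number here invented for illustration) showing pure survivorship bias at work:

```python
# Footnote [1]'s scam in miniature: generate many coin-flipping
# "predictors", keep the few that happened to do well, and they look
# like geniuses. Pure survivorship bias - no market data needed.
import random

random.seed(42)
DAYS, NETS = 20, 1000

def track_record():
    # Each "net" guesses market direction; here a guess is right
    # by coin flip, i.e. the nets know nothing at all.
    return sum(random.random() < 0.5 for _ in range(DAYS))  # correct calls

scores = [track_record() for _ in range(NETS)]
winners = [s for s in scores if s >= 16]  # "right 80% of the month!"
print(f"{len(winners)} of {NETS} know-nothing predictors beat 80% by luck alone")
```

Sell the winners, quietly forget the other ~99%.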

that one in the corner Silver badge

> it makes up nonsense, but it suggested a line of attack he hadn't thought of, which worked.

These anecdotes, where someone - anyone - has used ChatGPT and "it didn't solve the problem, but I came away with a new idea", make me wonder if they could get the same result by using "Rubber Duck Debugging" and a random choice of new approach to the problem by, say, drawing a card from Brian Eno's Oblique Strategies deck[1]. Cheaper, far fewer resources needed.

Now, if you have a citation, including the transcript, so we can see that it was, indeed, ChatGPT that came up with the idea out of the blue, without either being led by the nose[2] or it being one out of dozens of otherwise useless ideas[3] then we can reevaluate the level of incredulity.

[1] hmm, to make it more Modern and Computery, we could drop a copy of those cards into ELIZA and save on buying rubber ducks; I mean, have you seen how much they charge nowadays for a decent model, like a Shakesduck or an Isambard Kingduck Brunel?

[2] I've had some useful info out of ChatGPT, because, unlike Google, after it has come up with the latest, most repeated/hyped, suggestion, you can tell GPT "no" and it'll dig out something a bit more obscure; did that a few times and it finally spat out a reference to some 1970s tech that turned out to be what I wanted to find.

[3] carry on saying "no, that isn't what I want" and it'll happily apologise and then spit out yet another semi-random selection, throwing stuff on the wall and seeing what will stick. Then a simple bit of confirmation bias[4] will get you to remember - or just retell - the sticky bit and forget to mention the mess piled up on the floor or the bits that were so off-target they flew out the window.

[4] even the smartest people are subject to this; I'd even wager the smart ones are more used to having lots and lots of thoughts about a problem that they are really used to just forgetting about all the ones that never lead anywhere that they may not even notice how many such fly past.

that one in the corner Silver badge

"was misbegotten at the jump"

At the "jump"? Misbegotten from the start, surely, at the very point someone on the legal team even had an inkling to use[1] ChatGPT for this purpose, let alone the point where, having prepared everything, they finally made the jump and submitted the prompt.

[1] would say "misuse" here, but that would imply that there is a sensible way for these people to "use" ChatGPT in the normal day to day[2], and it is difficult to come up with a scenario where anyone in the legal profession should ever hand over responsibility for writing anything to ChatGPT, let alone something that is intended to be examined by the one person who has such power over the case!

[2] but what about simple, repetitive, things, like writing flyers to leave under windscreen wipers, I hear you cry? Well, if the interns aren't doing the simple assignments properly first, they'll not be getting the practice needed to get the meaty stuff right. Though if they are in a firm that advertises by interfering with parked cars...

China breakthrough promises optical discs that store hundreds of terabytes

that one in the corner Silver badge

Re: Read/write speed may be the barrier to adoption here

With 100 layers, add as many heads as you are willing to pay for. The rotational latency will remain the same, but you wanted throughput.
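Back-of-envelope only, with every number an assumption of mine rather than anything from the paper (a Blu-ray-ish 18 MB/s per head, ideal linear scaling across heads):

```python
# Back-of-envelope: seek/rotational latency stays the same however many
# heads you add, but streaming throughput scales with heads reading
# independent layers. All figures are made-up assumptions, not specs.
CAPACITY_TB = 200            # "hundreds of terabytes" per disc (assumed)
PER_HEAD_MB_S = 18           # assumed per-head streaming rate

def full_read_hours(heads: int) -> float:
    rate = heads * PER_HEAD_MB_S          # MB/s, assuming ideal scaling
    return CAPACITY_TB * 1e6 / rate / 3600

print(f"1 head:    {full_read_hours(1):,.0f} h to read the disc")
print(f"100 heads: {full_read_hours(100):,.0f} h to read the disc")
```

Months versus a weekend, under those (generous) assumptions - hence heads being the thing you pay for.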

AI comes for jobs at studio of American filmmaker Tyler Perry

that one in the corner Silver badge

> Hasn't he heard of matte painting...

You can't expect a man of Perry's stature to know anything about the history[1] of the craft, he is a Going Forwards only kind of guy.

[1] "history" of moving film? Good grief, it is so young that it barely has history! Compared to theatre, say... Next, you'll be telling me people involved in something newer, like computers, don't bother with their "history" either! Preposterous!

Europe's data protection laws cut data storage by making information-wrangling pricier

that one in the corner Silver badge

> The cost savings appear to assume that there was no benefit to holding the information. Doesn't sound likely.

If you're not concerned with any privacy implications, it is (initially, at least[1]) dirt cheap just to grab every bit of PII you can and hang onto it, rather than spending time & effort trying to decide if there really is any benefit to you from hanging onto that data: it looks juicy and interesting, we'll think of a use for it.

A lot of stuff, of all sorts, gets hung onto without any benefit, or the apparent benefit evaporates. Haven't you ever been a part of the occasional "emptying of all the cupboards and dusty storage bins" in the office/factory?

[1] as time passes and the company/clients/customers increase, the unnecessary information can take up enough space to become notable - by which time it continues out of habit, assuming *someone* knows why it is being kept around.

China could be doing better at censorship, think tank finds

that one in the corner Silver badge

Was starting to feel sympathetic towards the censors

The article painted an all too familiar picture: a large organisation, nobody seems to be actually taking overall control (but are basking in the claims that they are in total control), resources being spent over here to duplicate effort whilst over there is lacking the necessary to get the job done.

Then I was finally reminded at the end: nope, just for once, this is what we *want* to have happen - we don't want efficient censorship.

Still have a bit of fellow feeling for the grunts at the bottom, just trying to get the day's work done, despite it all.

Orgs are having a major identity crisis while crims reap the rewards

that one in the corner Silver badge

Social engineering, the second oldest profession

Making phone calls, convincing people to bypass the mechanical security...

Have we become so focussed on the clever coding and machine based security that we've been forgetting to train everyone to look out for the old tricks? Human Factors Engineering, if you will.

(Ok, some of it is just good old fashioned laxity - not using your TFA - which is just the back-end Human Factors kicking in)

Neuralink patient masters mind-mouse maneuvers – if Musk is to be believed

that one in the corner Silver badge

Long term care after the tech is declared obsolete

> However, there is no information on what might constitute an adverse reaction nor how the implant would be removed if required.

We are rightly concerned about how Neuralink are going to behave if things go wrong, but we should also be asking about their plans if things go right.

How long are these implants intended to work? What will happen when they reach their natural end of life (physical breakdown of parts)? What about when they are declared obsolete and are unnaturally declared EOL by Neuralink - having moved onto the Next Big Thing with a cohort of Bright Young Minds who have no idea how the existing system works?

Are patients going to receive an email saying, sorry, but in the next hour your ability to communicate with the outside world is going to be switched off? You did keep up practising with the clunky old pick-a-word board, didn't you?

The recipients of Second Sight's visual implants have been through this - are there signs that Neuralink (or any other commercial implant company) are taking their long-term responsibilities seriously?

At a bare minimum, placing *full* details of the device and support systems into escrow, in hopes that someone is able to at least try and help the victims.

that one in the corner Silver badge

masters mind-mouse maneuvers

I was relieved when the first sentence pointed out that this is referring to a *computer* mouse.

I had visions of Neuralink just connecting the human patient's device to the lab wifi and wondering why the caged mice were suddenly so attentive - and why the patient was asking about cheese.

Superapp Gojek fine-tunes each new error message for a week. What? Why?

that one in the corner Silver badge

"motion designer" is in some ways just a new way to say "animator."

> The difference is that the format and style has changed over the years.

It has changed enough "over the years" to warrant a new name, something that "animator" just doesn't conjure up. Hmm, interesting, tell us more.

> In the past, animations were heavy on being verbal explainers. "They used to have a character and say things like 'this is Bob …', but they don't do that anymore.

Why does this sound suspiciously like a description of low-effort early-morning children's TV cartoons? Do you think that is their entire reference pool for the work of animators?

> Higher-end studios prefer something more abstract," he explained.

What? What! They prefer something "more abstract" than anything those olden times (/sarcasm) animators turned out?

Something more abstract than Prelude Taking A Line For A Walk? One of my personal favourites, the marvellous Erica Russell's Feet Of Song and Triangle (you may remember her style from a number of UK TV ads and some music videos back in the '80s and '90s). Or go and dig out some of the drawn-on-film animation, such as the nobody-ever-saw-it (/sarcasm) A Colour Box ad for the GPO that was shown in cinemas across the UK (i.e. these abstracts were never intended, nor treated, as some niche product hidden away from the masses - which makes it all the stranger that someone who "knows enough about animation to realise what they are doing now is so very different it warrants a new name" seems so ignorant).

> A wordless animated image may also be a clever way for a business that operates in five different countries

"Wordless" - well, although all the above integrate the sounds with the visuals, they are wordless. Okay, the dance pieces do rely on the soundtrack, so how about something from Jan Švankmajer: Passionate Discourse - you could even use part of that as a progress bar for, say, a diff/merge operation!

All of the really great animations work without narration or text (ok, the aforementioned A Colour Box has some text in it, so I probably should have left it out - except it is such a nice example of dissemination of the abstract; ok, ok, I like it and want to share it). If you switch off the sound and just watch the narration-heavy The Man Who Planted Trees you are still shown a story with a depth, from the abstractions, that a straight filmed version would struggle to capture (sorry, sorry, getting a bit into film reviewer mode there).

> "It goes back to ten to twenty years ago when we surfed online and there was nothing to tell you if a website was loading. Then they introduced a preloader – a bar that progressively was filled in, so visually the user doesn't think the browser is hung," mused the creative director.

As Dan 55 pointed out, this "creative" director has missed *another* piece of history, the 30 year old browser progress indicator (and still that ignores the pre-browser progress bars, let alone the even older "print out a dot every now and again, to let the User know we haven't crashed" precursor to the Mosaic planet icon).

So not only do we *not* need 'a new way to say "animator"', attempting to coin one shows a total lack of awareness of the subject (gosh, some of those references are almost 80 years old, they can't possibly be of relevance to Today's Exciting New Apps) and is actually insulting to those people who pioneered the methods.


For more, start with the 4Mations YouTube list.


Once more, I am merely a programmer, not a Creative, let alone a Creative Director, so what do I know; we operate in totally different areas of work and I can not possibly expect to hold them up to the same requirements about depth of research or general knowledge that are needed for the boorish work I've done to earn a buck.

Chunks of deorbiting ESA satellite are expected to reach the ground

that one in the corner Silver badge

Live and Direct

The Zik Zak corporation turned this sort of thing into an annual festival, Sky Clearance Day: bring your best metal umbrella and join the fun!

Max Headroom, prr-prr-prrredicting the future - well, 20 minutes into it, at least.

that one in the corner Silver badge

Re: If you'd asked me ten minutes ago...

Don't know about the Spinal Tap guy, but Maggie O'Connell's boyfriend was killed by a falling satellite in Northern Exposure - that was a particularly hardy piece of kit, as it still had its antennae intact at the funeral (they just don't make 'em like that anymore).

Forgetting the history of Unix is coding us into a corner

that one in the corner Silver badge

Re: Man pages

You had enough space to keep all the man pages (unformatted) online all the time? Luxury!

If only Ikea had expanded internationally earlier, they would have made a killing with a Billy perfectly sized for the ring-binder man pages.

Attempts to demolish guardrails in AI image generators blamed for lewd Taylor Swift deepfakes

that one in the corner Silver badge

Re: why is nobody...

And the things they've done to ducks. Oh dear heaven, the ducks!

Developer's default setting created turbulence in the flight simulator

that one in the corner Silver badge

Re: sort of on topic...

Was it one of those highly educated sorts who somehow manages to get themselves into endless scrapes on an almost weekly basis?

OpenAI reassures: GPT-4 gives 'a mild uplift' to creators of biochemical weapons

that one in the corner Silver badge

One step closer to taking over the world

> OpenAI compared results produced by the two groups, paying close attention to how accurate, complete, and innovative the responses were.

Accurate? Complete?

OpenAI already has people on-hand who are expert enough to judge the accuracy of instructions for creating bioweapons?

Or did they just work through all of the answers and see which ones gave working results on their test groups?

Either way, are OpenAI taking the Ernst Blofeld approach to Market Domination?

Microsoft seeks Rust developers to rewrite core C# code

that one in the corner Silver badge

Re: Rockstar

> Maybe the move is to attract young, brave, naive, rockstar developers

Or to attract grey beards who have been pissing themselves with laughter over C-octothorpe since it appeared, by replacing it with something that demonstrably helps with memory management.

The literal Rolls-Royce of EVs is recalled over fire risk

that one in the corner Silver badge

Re: Required edits:

> British marque's' -> 'German marque's


British brand's' -> 'German brands's

As neither of those are France.

that one in the corner Silver badge

The marque that sparks

This isn't just any EV...

Zen Internet warns customers of an impending IP address change

that one in the corner Silver badge

> most of our customers primarily use just one IP address

And how many of their customers only *have* just one IP address?

Like those who joined after Zen were well established and had stopped giving out multiples as an incentive...

that one in the corner Silver badge

Re: Hard Work

> sorting out their situation with Zen or moving to a new provide

And we go around and around again - I changed to Zen in the first place on the advice of El Reg comments

We only have the one IP and haven't (yet) received this notice, so fingers crossed, but with the tales of customer service degrading...

Then again, with OpenReach seeming to have taken the government's deadlines for death of POTS and supply of fibre as a way to push back providing fibre on our exchange, there is time to take a breath before the next forced change to our service and contract "upgrade".

40 years since Elite became the most fun you could have with 22 kilobytes

that one in the corner Silver badge

Re: Better than Dangerous

I backed out and took the refund when they dropped standalone single-player mode and insisted it all had to be massive multiplayer online or nothing. I know, very much not down with da yoof and all that, but I'm quite happy to play video games only sporadically and take my own sweet time to progress (one day I may even finish Riven, no hurry).

I was quite annoyed about that change (as an original BBC Elite player I was looking forwards to the new sparkly bits and pretty pictures) but at least they did refund us.

Elon Musk's brain-computer interface outfit Neuralink tests its tech on a human

that one in the corner Silver badge

Re: This could be a really big deal

> I could also see fighter pilots and the pilots of Musk's Mars ships connected directly into their crafts[1], no joy sticks, screens or any of that, direct input and control.

No screens?

No *screens*?!

You want this thing to pump data back *into* the pilot's mind? Forget Firefox, worry about Firewall Down!


[1] their craft! The plural of "aircraft" is: "aircraft"! Unless you were thinking they would learn macrame and the carving of scrimshaw[2] whilst the craft's FSD flew them (in)to Mars?

[2] perhaps a scene of a privateer with four or five cannon along the port side.

[3] at this point, trying to list all the synonyms for disbelief suitable as a response to this, my cortically implanted thesaurus overheated and shut down and just emitted a single "yikes" before emailing to schedule a service.

Linus Torvalds flames Google kernel contributor over filesystem suggestion

that one in the corner Silver badge

Re: I am Ignorant.

> Off the top of my head, find and tar need unique inodes

True. And if you read a few of the messages, Linus accepts that.

What he isn't accepting is that there is any good reason to support someone running tar over the particular FS in question, eventfs (and/or tracefs, the distinction is a bit confusing). Nor does he worry about people having tar fail when trying to archive procfs! If you try to tar eventfs, the argument appears to go, even if it "worked", what good is the result?

The additional argument (over inodes having to allow for processes in tight loops doing mkdir/rmdir all the live long day) is - more than I want to worry about! Came back and read some more Register articles instead.

Fairberry project brings a hardware keyboard to the Fairphone

that one in the corner Silver badge

Sharp Zaurus

Purely for the nostalgia, I just picked up my Zaurii, pulled down the slider on each and revelled in the feel of the keyboards - the sleek silver of the 5500 and the bold black 6000: so much easier than a touch keyboard and far less likely to just slip from the grasp. A decent single-handed thumb typing experience (and being able to use a stylus, just like the Palm Pilots, made for simple precise pointing as well, but that is another discussion, about the cost of a Wacom-enabled 'phone...)

I hear tales that a modern 'phone can be used with the thumb of the hand holding it, but I have yet to manage that without speaking out loud all the gt£%@4ghy%)'$°] that inevitably results.

The pen is mightier than the keyboard for turbocharging your noggin

that one in the corner Silver badge

Re: Why aren't you taking notes?

> I have to exercise and practice many times to get things to stick.

Aside from the simple "horses for courses" and "diff'rent strokes" from person to person, you are also talking about needing to put different amounts of effort into different parts of the process.

In other words, "taking notes" and "exercise and practice" are not in conflict with each other but act in different ways (and with different strengths between individuals).

So my experience differs from yours: if I am in a lecture and don't take a continuous stream of notes then the words won't even enter short term memory, but will tend to just wash past me; the act of note taking engages focus (for me). Otherwise anything else going on may capture focus (the people nattering in the audience, what that person is doing on their phone, the architecture of the lecture room).

Once past the hurdle of focus, the info bounces around short term memory so the writing can be notes (in full sentence form) rather than simple dictation.

Reviewing the notes provides the chance to see if there are any questions left at the end (and maybe the chance to ask them).

But if I then want to use the new information later on, I still have to practice - looking over the notes again in the following weeks, applying what they say to exercises, comparing it to other material in related areas. Drilling it into long term memory and tying it in to everything else (if it can't be tied into other learnt material then it isn't going to stay for the long haul - don't ask me to memorise a list of kings of England).

We put salt in our tea so you don't have to

that one in the corner Silver badge

Re: Ramble, blither

> Most butter in Tibet is rancid

According to my in-depth studies of an old Journal Of Record[1], the supply of Tibetan rancid yak butter has been endangered by the import of modern detergents, as the yaks are no longer rancid.

[1] The Beano

What is Model Collapse and how to avoid it

that one in the corner Silver badge

What steps should the machine learning community take?

A: stop blindly scraping everything off the Web.

Even without seeing your own output again, scraping everything was never going to provide a totally sane dataset in the first place.

But it was cheap and easy, because the only way they know to build these models is to keep on shovelling into the training maw.

But labelling - manually attaching provenance and "truthiness" values - is expensive (and gets away from the Tech Boys Building Bigger Machines), and puts smaller companies at an unfair disadvantage.

Shame they don't have a better way of building their models.
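For the curious, the collapse itself is easy to demo in miniature - a toy sketch of mine, not the actual experiment from any paper: fit a Gaussian to samples drawn from the previous generation's fit, i.e. train each generation on the last one's output, and on average the spread decays away.

```python
# Toy "model collapse": fit a Gaussian to samples drawn from the previous
# fit, generation after generation - i.e. training on your own output.
# The log of the spread does a random walk with negative drift, so most
# runs end up with less diversity than they started with.
import random, statistics

random.seed(1)

def final_sigma(generations: int = 100, n: int = 20) -> float:
    mu, sigma = 0.0, 1.0
    for _ in range(generations):
        samples = [random.gauss(mu, sigma) for _ in range(n)]
        mu, sigma = statistics.fmean(samples), statistics.stdev(samples)
    return sigma

runs = [final_sigma() for _ in range(50)]
shrunk = sum(s < 1.0 for s in runs)
print(f"{shrunk}/50 runs ended with less spread than they started")
```

No web scraping required to see the problem coming.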