* Posts by MarkMLl

159 publicly visible posts • joined 29 Jun 2019


Moon hotel startup hopes you get lunar lunacy, drop $1M deposit for 2032 stay

MarkMLl
Mushroom

Anybody who has watched 2001

would know that space station management was franchised (a Hilton logo was prominent) but that there was no hotel on the Moon and that the Lunar outposts were national assets.

Gmail preparing to drop POP3 mail fetching

MarkMLl

Re: Thunderbird for the win

I agree with Liam FWIW: even if Thunderbird isn't perfect, it provides a useful focal point for developers to collaborate when working out how to talk to commercial services such as Gmail.

Another thing is that its message threading is better-than-half-decent (possibly because of its legacy of being used for Usenet) and its searching facility is similarly fairly good. I mention this in the context of having had to trawl through my Gmail account to find stuff relating to various "legal crap", and I'm well overdue for moving all routine stuff over to a non-Gmail address with local storage.

MarkMLl

Re: that's called privacy

> The only other mail server it's likely to pass through is the sender's.

Disagree. As policy, we've been sending time-sensitive emails directly for the last 30 years or so to people who pay for our service, and by now we are very much in the minority: most people assume that anything not using one of the big service providers as a "Smart Host" can be blacklisted.

The difference is that when we send mail directly we can find out when something's jammed and /why/. If we used a Smart Host, even one provided by a specialist like AAISP, there would be no way that we could find out what was going on without 'phoning them or waiting for the timeout.

Porsche panic in Russia as pricey status symbols forget how to car

MarkMLl
FAIL

Old news.

Russia's in the middle of turning off its 3G, on which the VTS relies to talk to the security servers.

Nothing to do with satellites.

Tuxedo Computers slams lid on Arm Linux laptop after 18 months of pain

MarkMLl

Re: So what's the meat of the article?

OK. So what you're saying- if I understand you properly this time round- is that Tuxedo tried to put Linux on laptops which other people brand as Medion, but found it wasn't feasible and pulled out.

MarkMLl

Re: So what relevance does Mediatek have to this article?

> Medion, not Mediatek (the latter wasn't even mentioned in the article).

>

> Medion is a brand name of computers and electronics products sold across Europe in chain stores like Aldi, Lidl and a number of consumer electronic stores.

>

> Mediatek is a manufacturer of SoCs, some of which use ARM IP.

My apologies to Liam and everybody else for confusing the two.

Mea maxima culpa.

MarkMLl

Re: So what's the meat of the article?

> You surprise me, Mark.

That's been on my resume for 30 years or so :-)

So what relevance does Mediatek have to this article? Are you actually saying "if you want an ARM-based laptop, look at this one from Mediatek"? Because what you wrote was

"[...]and in a somewhat unusual move, it hinted that the Medion SPRCHRGD 14 was the model it aimed to OEM."

which implied that Tuxedo intended to stay involved with the architecture and form-factor by buying something in.

MarkMLl

So what's the meat of the article?

So what are you actually saying here: that they've dropped their custom design based on a Qualcomm chip and instead intend to buy in a laptop based on a Mediatek chip?

Canonical pushes Ubuntu LTS support even further - if you pay

MarkMLl
Big Brother

But pay how much?

...or are we back in mainframe territory where mortals aren't allowed to discuss such things?

UK asks cyberspies to probe whether Chinese buses can be switched off remotely

MarkMLl
Coat

So what are downvotes for anyway?

> I'd be interested to see comments from the downvoters.

Regrettably, there are a lot of people convinced that a coherent posting highlighting an issue with which they are uncomfortable demands a downvote.

MarkMLl

Re: Well... yeah, but nah?

The fundamental issue is that it should be the fleet operator's responsibility to update firmware etc., and the manufacturer should not have been allowed to retain direct access.

Obviously the operator should be open to the possibility that there are "backdoors", but the fact that the manufacturer is open about having access means that something went very badly wrong in the procurement process.

Ironclad OS project popping out Unix-like kernel in a unique mix of languages

MarkMLl
Meh

Wirth not involved?

> Although the late great Niklaus "Bucky" Wirth wasn't directly involved, Ada's syntax – and its strong typing - are visibly inspired by Pascal.

At least some of the ALGOL-68 "Minority Report" authors were involved in the early days.

"The European community was especially responsive, particularly valuable since language research had been more active there than in the United States over the previous decade. ONR London paid several academics (Dijkstra, Hoare, Wirth, ...) to provide inputs, but we got more valuable aid from European industry." -- http://archive.adaic.com/pol-hist/history/holwg-93/holwg-93.htm

In practice it owes more to ALGOL-68 and Modula-2 than to Pascal.

Famed software engineer DJB tries Fil-C… and likes what he sees

MarkMLl

Re: Type checking and compatibility

> Compiled language don't do the kind of type conversions that run-time-typed languages do (can't say about Python - its years since I did a few experiments with it). I've actually used both Algol-W and Pascal (long long ago), and neither does that sort of thing.

But (choosing the trivial example) they do promote byte to word and so on, sometimes with very unclear definitions of sign propagation: I was reading the ALGOL-W manual and some of Hoare's contributions a few days ago.

One of the peculiar things about the Pascal community is that the members almost always express the view "everything Wirth did is perfect!" until somebody suggests that some part of the language be upgraded to be compatible with some of his later ideas.

MarkMLl

Re: Type checking and compatibility

> TRUE, FALSE, UNINITIALIZED, and ERROR

I once worked on a preprocessor for documents to be submitted to Ventura Publisher, and working from memory there were something like seven states of a flag. I really can't remember the details after 35 years or so, but it's very easy to imagine there being separate states for "undefined", "undeclared", "true-but-may-be-overridden" and so on.

MarkMLl
Coat

Type checking and compatibility

I was thinking the other day that it's interesting that some of the earliest "surviving" languages (ALGOL-W and Pascal) plus of course C have type as an attribute of a variable and massage the result of an expression to allow it to be assigned. This automatic type conversion is a real hazard, particularly where assignments are chained.

As I understand it Python v3 does the exact opposite: variables are initially untyped but an expression has a type.

And Wirth's later languages tightened up the assignment and automatic conversion rules a great deal.

It appears to be fashionable for commentators to argue that type-safe languages aren't needed since the majority of recent security vulnerabilities etc. are logic errors rather than simple off-by-one or use-after-free. But that ignores the fact that a high proportion of current software is 20+ years old and that dozens of errors have been weeded out over time: sometimes resulting in real damage to individuals and companies using them.

Strong typing would have prevented many of those errors getting into production code. As such anything that can be done to promote it is a Good Thing, particularly if it has some chance of being applied to legacy code since rewriting it risks introducing more subtle logic errors.

Gadget geeks aghast at guru's geriatric GPU

MarkMLl

Re: Oh, Linus is running a recent graphics card!

That's actually quite an endorsement: imagine what would happen to the developer who broke support for it!

Firefox is fine. The people running it are not

MarkMLl

To lose one is an accident....

There are so many groundbreaking products that the Mozilla Foundation has "misplaced", that in any reasonable industry the executives would be accused of pilferage.

HTML editor: at least that lives on as Kompozer, and has actually been enhanced over the years to support CSS.

Visual Javascript: vanished without trace, possibly inside Sun.

Various server-side stuff: ditto.

At the very least, they could have been a viable competitor to WordPress. As it is they're a one-trick pony that gets a bit of new bling every time it's passed on to a new management team.

Wikidata: Attempting to bridge FOSS ideals and direct democracy

MarkMLl

Cat out of bag...

"Multiple other projects also use the vast linked data store that underpins ubiquitous internet encyclopaedia Wikipedia, and some of them are helping the fight for democracy."

So considering that El Reg reports that the US executive has its sights on people who don't agree with it in the Library of Congress and Copyright office, one has to ask: what jurisdiction does the Wikimedia Foundation operate under, and to what extent could the US government^H^H administration^H^H no let's be honest here TRUMP screw it around now that its underlying agenda has been highlighted?

OpenDylan sheds some parentheses in 2025.1 update

MarkMLl
Coat

Re: Inspirational

> The road less traveled of good crazy could have provided more insights imho (if practicable at all). Homoiconically represented programs (eg. LISP) are trees, with trunks, branches and leaves (directed acyclic graphs). There's no need for a Program Counter (PC) to go through them, one just traverses the tree (data structure), and any part of that code can be moved anywhere without bother, even out of what would be sequential PC order, as long as the tree structure is preserved (this FPGA project may have followed that philosophy).

> [...] As Tenstorrent's Jim Keller said in interview "Everybody who describes [a] program, they describe a graph, and the graph needs to be lowered [...] to the hardware". So, the closer the programming language is to being graph-homomorphic, and the closer the hardware is to being a graph processing beast, the better off we might eventually end up, imho! (maybe ...)

And the nearer you get to the hardware, the more you need an understanding of how the program counter, stackframes and so on hang together. Far too much tosh has been written by people who know how to wave their abstract hands, but have no expertise when it comes to getting them dirty: other than milking investment from the gullible.

MarkMLl
Meh

C syntax as "magic"

> C syntax is magical programmer catnip. You sprinkle it on anything and it suddenly becomes "practical" and "readable".

Looking at this in the context of the ALGOL family, the thing is that there's so many different issues. The first one that people think about is using braces rather than begin/end: but in the overall scheme of things that's trivial (potentially just a macro expansion) and is immediately eclipsed by such things as the dangling-else issue (fixed in some ALGOL-derivatives, hacked around in others, ignored in many including C and Pascal).

Then there's things like declaration order: some languages put a type followed by things to be associated with that type, others do it the other way round.

But the elephant in the room is infix notation: C (and most other ALGOL derivatives) embraced that since it was basically a convention that every schoolboy had been familiar with for centuries, LISP (and APL, Forth and Smalltalk) thought they could do better and- I'm tempted to say- arguably harmed the industry and the underlying technology by fragmenting the notation.

MarkMLl

You and I have argued about this several times, and I've usually pointed out that if a peripheral can't adequately distinguish between parentheses and braces then one should look at its font handling and question whether it really is the best tool for the job.

After all, there's plenty of more serious related issues: 0 vs O, l vs I, l vs 1 and so on: and those comparisons will look different to every one of us depending on browser and OS.

MarkMLl
Coat

Re: Ran out of memory

> So am I reading the PDF right, that LISP-2 stalled when they ran out of memory on their development machine?

>

> Shame, that.

Leaving aside the situation at SDS, according to Waychoff John McCarthy was seriously pissed by the fact that Burroughs wouldn't tell him how to circumvent their mainframe memory protection, and as a result replaced Stanford's B5500 with an IBM S/360. And it was years before IBM had working virtual memory, hence the old joke about the letters standing for "I'd Buy Memory"...

Exif marks the spot as fresh version of PNG image standard arrives

MarkMLl

Re: APNG?

No, MNG is something different entirely: as you'd see if you took the trouble to check Wikipedia before posting.

The problem appears to be that MNG is "all singing, all dancing", with relatively large files and a complex decoder as a result. APNG is generally simpler, and has the advantage that a decoder which doesn't understand the extension will still display the first frame, since an APNG remains a valid PNG.

In any event, the belated endorsement of APNG has to be applauded since it will hopefully avoid further duplication of effort.

The elusive goal of Unix – or Linux – simplicity

MarkMLl

Re: "Advocacy..."

Most PDAs and featurephones had some sort of pointing device: a four-way rocker or similar, and multiple buttons.

What is really so difficult about the idea that screen areas that /can/ be clicked on should be visually distinct, and should have popup hints telling the user what he is about to do, at least until he is familiar with the UI?

Smartphone screens are- and always have been- capable of far more resolution than early Macs, Windows or GEM systems: and arguably outperform the Xerox workstations on which the WIMP metaphor started off. There's really no excuse for walking away from well-established design principles that allow anybody familiar with one family of systems to quickly adjust to some other.

When Windows started pushing a common user interface in the early '90s there was a lot of hot air asserting that it would "stifle innovation". I was no lover of Microsoft (having dealt with them commercially) but I certainly never promoted that viewpoint, and I think that history demonstrates that a system based on menus and a two- (or possibly three-) button pointing device is vastly superior to one in which every application program requires the operator to memorise an arcane list of key-combinations that grew larger with every release.

I remember a specialist wordprocessor called the Redactron, from a company led by a woman and promoted as freeing women from drudgery. But /boy/: despite having a fancy keyboard the poor girl operating it had to memorise a truly obscene number of shortcuts.

By all means: /allow/ keyboard shortcuts in the design philosophy. By all means, /allow/ fancy context-sensitive areas of the screen (multi-finger zoom etc.). But for people who do not use that particular piece of software dozens of times a day, provide the universally-understood menu system as a fallback.

However, I have to admit at this point that perhaps I am being reactionary, and perhaps I am advocating a "traditional" solution because I am unfamiliar with the design guides published by the various 'phone OS suppliers (Apple, Google) and the people who would like their app to look like it works on a 'phone even if running on the desktop or in a browser.

But I'm still left bothered by the suspicion that most smartphones are used only as terminals to Facebook and Twitter, so really those are the only UIs that the vast majority of users need to be familiar with.

MarkMLl

"Advocacy..."

Half the problems start when one or more of the people involved in the debate does so from a position of incomplete information and limited experience. They get worse when they insist that their viewpoint is so important that they are entitled or even obliged to argue from a position of ignorance.

I remember a friend who worked at- IIRC- Red Hat, who started discussing user interfaces with some colleagues and was surprised that none of them had heard of the CUA (Common User Access) guidelines: which had of course been invented by their corporate parent IBM, and for a long time were broadly respected by Microsoft.

Now that obviously was ancient history, and he's observed in the past that many of the currently-active "professionals" are actually younger than Windows '95 and NT which form the foundations of "Windows as we know it". But one would have thought that /somewhere/ there would be a record of hacks that had been found to work and idioms which were universally understood, and that this would be taught.

Which leads us to the demise of the organised WIMP user interface, in favour of the utter mess which- IMO- we have when considering Android etc. Why did it happen: was it simply because nobody recognised that the UI worked and was worth respecting? Was it simply because nobody had managed to /explain/ to the new generation that it worked? Was it because corporates felt that they couldn't attract new blood if they weren't given groundbreaking work to do?

And I wonder how many times this has happened in the past. Mainframes were swept away (until people realised that they needed centralised databases). Minis were swept away (until people realised that they needed some sort of multitasking and interprocess communications). Windows- at least in its classic form- has been swept away, and replaced by "UI du jour" either implemented directly on the screen or in a browser. Script-based unix startup has been swept away and replaced by systemd...

And every damn time, there are people eager to argue the notional advantages of the new system. Whether or not they understand the old one.

Liz Warren, Trump admin agree on something: Army should have right to repair

MarkMLl

> The rule of thumb is that you can expect the printed part to only have 30% of the strength of the same part in the same plastic, but injection molded.

That rule of thumb needs a further term, even for plastic. "You can expect the printed part to only have 30% of the strength of the same part to the same design in the same plastic, but injection molded."

Using 3D printing you can add strength members to a part in locations and at angles that would be completely unattainable in a (single-part) moulding, and fill relatively-unstressed parts of the body with a structural foam of appropriate (and varying) density.

So in practical terms you might be able to make something which fits into the same volume, but by reconfiguring reinforcing ribs etc. is not only of comparable strength but is also significantly lighter.

MarkMLl

Leaving aside the current crap about 3D printers, predatory behaviour from John Deere and the like: this is something that's been brewing for years after the army brought certain facts to the attention of a Congressional committee.

Basically, they'd been put in a position where they were obliged to buy off-the-shelf kit, which invariably had a "return to supplier" repair policy.

The example given at the hearing was that of portable generators failing in IIRC Afghanistan, which the army's own mechanics were entirely capable of repairing but weren't allowed to.

Frankly, I'm surprised it's taken this long to sort itself out.

A new Lazarus arises – for the fourth time – for Pascal programming fans

MarkMLl

Re: I might give this a spin

Definitely worth investigating. You will find a high level of compatibility with the various "standard" Pascal implementations (ISO, Turbo, Delphi and so on), and it's fairly trouble-free particularly on Windows or Linux.

And the Lazarus IDE gives you very good debugging facilities.

There's mailing lists and a forum which is generally helpful, until some retard starts off yet another "Why isn't Pascal more popular when it's better than everything else?" thread.

My position is that it will probably "see me out". But I'm not entirely happy with the bloat that's crept into the language, or- as I've said somewhere above- the documentation situation.

MarkMLl

Re: No OOP in the new book?

> And if there is something OOP makes harder, is reading code, since you have to follow inheritance.

You clearly haven't tried reading code heavily based on generics :-(

MarkMLl

Raspberry Pi and ARM

Actually, FPC and Lazarus have supported all Raspberry Pis from v1 onwards, and before that other ARM platforms like the NSLU-2.

They've never supported IBM mainframes and a number of others- notably the Itanic- haven't got very far, but apart from that their platform coverage is comprehensive.

MarkMLl

Re: No OOP in the new book?

> I need to mention here that the book does not go into Windows programming, OOP, software components, or the Lazarus GUI builder.

More seriously, it explicitly says that it is omitting all consideration of the RTL (FPC standard libraries) and FCL (FPC Class Libraries).

These are, by now, utterly immense, and suffer from patchy documentation (much machine-generated), sparse indexing, and members of the user community who tell newcomers that they should be using some facility that is completely unfindable unless you know where to look.

GCC 15 is close: COBOL and Itanium are in, but ALGOL is out

MarkMLl

Re: ALGOL-68 is out

> I have in front of me the 1973 "Algol Primer for Burroughs B6700" by de Souza and Manley of Otago University, and the I/O doesn't strike me as strange. There's no memory access as such; that definitely existed in ESPOL but I don't have an ESPOL manual.

Even at the application level, Burroughs ALGOL was a bit odd because they'd borrowed FORTRAN-style format notation. And code written by Burroughs themselves often grouped all the formatting in a table at the start of the deck, which could make it very difficult to follow.

There's ESPOL documentation at Bitsavers.

MarkMLl

Re: ALGOL-68 is out

> Boroughs algol was always a direct algol 60 descendent. Not a hint of '68 to it.

But that does not necessarily apply to whatever's currently being shipped by Unisys, which is why I checked.

MarkMLl

Re: ALGOL-68 is out

> I once had a job writing a translator from Burroughs Algol to PL/1. As I remember it (it was a long time ago) the Algol dialect was somewhat weird, with instructions for extracting bits from the OS' memory, and a very strange IO system.

That sounds suspiciously like the B5700, which had a completely separate set of "stream instructions" for character processing which- apart from anything else- bypassed all memory protection.

I take it that you're aware of Paul Kimpel's emulator.

MarkMLl

Re: ALGOL-68 is out

> I'm fairly sure it was never a real influence on Ada and that like, or they would have been the better for it.

The authors of the "Minority Report" on ALGOL-68, i.e. Wirth et al., were contracted by the HOLWG to advise on the early stages of Ada development.

As such, even if they didn't like what ALGOL-68 matured into, they were well-informed on the issues that needed to be addressed by any successor to ALGOL-60.

MarkMLl

ALGOL-68 is out

ALGOL, i.e. -60, was never in: and they're very much different.

However an interesting question is whether the Unisys (née Burroughs) implementations of ALGOL hew to the -60 or -68 language: I've got limited time to delve into the manuals right now.

I was, however, looking at a 1970ish Burroughs ALGOL compiler a couple of days ago and noticed that they'd not adopted a trivial tweak to the language to eliminate the notorious "dangling else" issue: and that tweak was published by the CACM as "Revised Report on the Algorithmic Language ALGOL 60" in 1963.

So extrapolating from that unfortunate example I'd expect them to still be using ALGOL-60 "warts and all", and a very quick perusal of their ALGOL manual dated 2023 appears to confirm that.

https://public.support.unisys.com/aseries/docs/ClearPath-MCP-21.0/86000098-519/86000098-519.pdf 4-61 p259

with their systems programming language NEWP being structurally similar.

So while ALGOL-68 remains of broad interest to language history nerds, only its influence- on Ada, PL/SQL and so on- is really relevant.

Free95 claims to be a GPL 3 Windows clone, but it's giving vaporware vibes

MarkMLl

Text mode?

Appears to be hardcoded to drive VGA at 80x25.

If somebody just wanted text-mode Win-32, they could do far worse than revisit Sanos which could at least run its own toolset.

Type-safe C-killer Delphi hits 30, but a replacement has risen

MarkMLl

Re: The bottom line...

> Rust will just be VS Code and an ever growing choice of extensions, which is how it’s done now.

"slow compilation and debugging difficulty remain big challenges."

https://devclass.com/2025/02/18/state-of-rust-survey-2024-most-rust-developers-worry-about-the-future-of-the-language/, cited by El Reg 20th February.

However history shows that you can't just stick unrelated plugins into a tool, and expect them to work together. As a specific example it's not too difficult to have a form designer and it's not too difficult to have a debugger, but there are very few IDEs which have managed to integrate a debugger into a form's event handler: and most of those either came from or were heavily influenced by Borland.

MarkMLl

Re: What a trollish subheading.

Why should that be trollish? It's a statement of fact: it was basically Pascal which introduced the idea of a distinction between floats and integers, and moved the idea of records from being solely related to I/O to being a core part of the language.

I'm up to here ^ with arguing the merits of different languages, but the fact remains that most of the ideas that were introduced by Wirth in Pascal (and tightened up in Modula-2 etc.) have subsequently been embraced by most other languages- or their successors, or by things that claim to be the same language while in fact being unrecognisable (I'm looking at you, free-form FORTRAN).

We don't have to like Pascal or use it, but that doesn't mean we shouldn't respect it and recognise its place in history.

MarkMLl

Re: Compared to Delphi, FPC and Lazarus are a joke

> Compared to Delphi, FPC and Lazarus are a joke

>

> They are mostly stuck at the Delphi 7 era of many eons ago.

In that case make a one-time posting to the forum or mailing list, telling the developers what's gone wrong.

I use FPC/Lazarus, but that does not necessarily make me a shill for it.

MarkMLl

Re: A colleague of mine uses Delphi/Lazarus

> (/me not knowing Object Pascal or Lazarus): does Object Pascal and/or Lazarus not let you use the forward declaration of regular Pascal to work around that problem?

Yes.

MarkMLl
Flame

The bottom line...

...is that "more sexy" languages have nothing remotely comparable with the Lazarus IDE, which as well as including a form designer etc. has fully-integrated debugging.

Rust, in particular, is going to find that a very high bar to clear.

MarkMLl

Re: Not

> It did install, but after seeing a warning about a non-existent lazarus directory, I bailed and uninstalled.

That was probably the warning it gives you when it's about to set up a local configuration directory (~/.lazarus or similar) to hold the IDE's state.

Most people would consider that a courtesy...

However I would say that most users get it from the Lazarus repo rather than from Debian/Ubuntu, since that way you've got some level of confidence that you've got a version with bugfixes etc.

Legacy systems running UK's collector are taxing – in more ways than one

MarkMLl

Yes, but WHAT SORT of legacy systems? Are we talking about an ICL mainframe, something more recent by Fujitsu or ICL, multiple racks full of x86 servers...

Or is it unmaintainable because somebody doesn't want to touch the software any more (Solaris on SPARC anybody)?

'Maybe the problem is you' ... Linus Torvalds wades into Linux kernel Rust driver drama

MarkMLl

Re: Fair comment by Linus

> ...if the C API developers change their C API that breaks the Rust wrappers, the kernel will fail to compile because another component in the kernel - the Rust wrappers - are broken now and won't compile. Therefore either the C API developers will have to fix the Rust wrappers themselves to allow a full kernel compilation, or take steps to exclude a part of the kernel - the Rust wrappers...

If an API is changed in a way not detectable by the toolchain then any code not aware of that change will break.

The kernel community as a whole ought to be welcoming any methodical attempt to codify the API between subsystems: not necessarily in Rust, but in something testable. And they should definitely be welcoming any attempt to avoid the sort of edge cases that were aired when this stuff was discussed in the context of bcachefs.

Because as things stand I'd say that there's a real risk that somebody- probably Poettering- will fork the kernel in order to be able to favour Rust, and that any distro that is interested in commercial acceptance will use the one better able to withstand regulatory scrutiny.

How Windows got to version 3 – an illustrated history

MarkMLl

Re: Terminal network...

There's a considerable degree of internal support- "hooks"- in DOS from at least v3 onwards, as you can see from the data structures etc. in Ralf Brown's Interrupt List.

I can't speak for MS-DOS, but you could certainly get a networking layer from IBM easily and cheaply, and that gave you streams (?) and mailslots which were compatible with those later implemented by Windows for WorkGroups and Workgroups for DOS: I've programmed them for interprocess communications, e.g. to write an SMS server.

What DOS lacked was server capability, for which you originally had to go to some poorly-understood (to the average sales/support people) product from 3Com+MS. However the significance here is that it was this combination that became LAN Manager, and to at least some extent it was the LAN Manager API which gave both OS/2 and Win-32 their filehandling and interprocess communications APIs... I'm a little unclear whether that includes the "godothisoncompletion()" callback but that was certainly in place by OS/2 v1.

Another significant strand is whether the DLL structure came from OS/2, LAN Manager, or something even older: i.e. "European" multitasking DOS-4. I've actually come across a development tool from that era which backported DLLs onto straight DOS in lieu of using overlays.

So irrespective of the extent to which IBM and MS were able (through wisdom or luck) to plan things to each other's disadvantage, some of the ancient history might have roots even deeper than Nina Kalinina realises.

The latest language in the GNU Compiler Collection: Algol-68

MarkMLl

Re: I would try it, but...

Actually, APL was much easier when it used a printing terminal with overstrikes: memorising the 30ish base characters was relatively easy.

MarkMLl

The seminal structured languages

In any event, and irrespective of their relative standing as the heir to ALGOL-60, the really important thing is that ALGOL-68 and Pascal reflect a recognition of the importance of types and data structures in a language.

Many machines of the day actually treated integers as a subset of floating point numbers, and most of them supported the concept of a record as the fundamental structure of input and output. But a robust codification of type and record handling was novel, and it is worth remembering- and hopefully teaching, if only in passing- where such things originated.

MarkMLl

Re: Lead to a bunch of stuff at what was RSRE Malvern

> ... should have been combined as they are in other languages like Java or Pascal.

Although regrettably today's leading Pascal implementation (Free Pascal, with the Lazarus IDE) still stubbornly refuses to provide any facility to embed SQL etc. inline.

Having read a substantial amount of an ALGOL-60 dialect that relies excessively on tables declared at the start of the card deck, I find that such separation makes me cringe.

MarkMLl

Re: I would try it, but...

> why are there no programmer focused keyboards out there on the market?

You mean like https://web.archive.org/web/20200217111612/http://www.aplusdev.org/keybdBW.html ?
