Re: Excellent Work !
Burroughs ALGOL-based mainframe emulation: https://retro-b5500.blogspot.com/
But one of the most significant things about Smalltalk as described in the "Blue Book" was that it went to considerable lengths to encourage commenting etc. to make large-scale programming manageable.
Kay, unlike many who attempt to design a language, was intimately familiar with the state of the art: warts and all.
I'm disappointed that your editorial staff aren't sufficiently on the ball to have picked up the lisp/list inconsistency.
The more time I spend with software projects' excuse for documentation, the more I respect the rigour that the Ada community attempted.
I've dug around the history a bit, and in actual fact the Strawman requirements did mention LISP's indeterminate-length lists as something useful to have. Other than that it was ignored, and in the end they explicitly based the language on Pascal (i.e. as distinct from ALGOL-60 etc.).
And then it appears that the DoD's HOLWG actually hired Dijkstra, Hoare and Wirth as consultants: they being the prime movers behind the "Minority Report" which pointed out flaws in ALGOL-68 as first defined.
All of which could be very easily interpreted as an aggressive dismissal of a whole bunch of ivory tower academics including John McCarthy and van Wijngaarden's coterie.
But the bottom line is that neither Ada nor ALGOL-68 had an easy/cheap implementation which allowed an engineer or project specifier to take a copy home or run it standalone on his office workstation. And that's probably why C (and, in its day, Turbo Pascal etc.) outsold it something like 5,000-to-1. Hell, I've seen more copies of LISP sold than Ada...
"software to run your factory"... customised for you by a specialist, built on an Oracle database after they bypassed the technical departments and made a pitch to the directors. The price goes up exponentially every year, and eventually you're told that you have to move to "the cloud" because they're no longer going to support on-site servers. And since you've not been investing in your own people to handle the maintenance, you don't have the slightest idea how to disentangle things and move to an alternative.
Once you're ensconced in the cloud, somebody cuts a cable at the other end of the country and a chunk of the national telecoms infrastructure goes down. So your factory stops.
Well done Liam, very good indeed.
The one thing I'd add is that generally speaking ownership of a particular piece of free software /has/ been retained by the developers, or delegated to somebody who is expected to act in the project's interest. That is why non-compliance with the selected license can be policed.
However, the end user can fairly claim to own the binaries that he is running, to have a non-revocable right to continue running them, potentially to move them between computers, and to turn to whoever he chooses for support and maintenance.
This is something that Liam and I have been sparring over for the last ten years or so.
The first thing I'd say is that on Linux- in fact I'd hazard any modern unix- everything /isn't/ a file: network interfaces aren't files, sockets aren't files, USB devices aren't files... and even in the case where some device or OS interface /does/ exist as a name in the filespace it very often needs a specialist API which controls it via IOCTLs.
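To make that concrete: on Linux you don't open() a network interface by name at all; you create a socket as a handle and then drive the interface through ioctls. A minimal, Linux-only sketch (the interface name "eth0" is merely an assumption for illustration):

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <net/if.h>

int main(void) {
    struct ifreq ifr;
    /* No /dev node to open: a socket serves as the handle... */
    int s = socket(AF_INET, SOCK_DGRAM, 0);
    if (s < 0) return 1;

    memset(&ifr, 0, sizeof ifr);
    strncpy(ifr.ifr_name, "eth0", IFNAMSIZ - 1);   /* assumed interface name */

    /* ...and an ioctl, not read()/write(), does the actual talking. */
    if (ioctl(s, SIOCGIFHWADDR, &ifr) == 0) {
        const unsigned char *m = (const unsigned char *)ifr.ifr_hwaddr.sa_data;
        printf("%s MAC %02x:%02x:%02x:%02x:%02x:%02x\n", ifr.ifr_name,
               m[0], m[1], m[2], m[3], m[4], m[5]);
    }
    close(s);
    return 0;
}

And even where the open() does work- /dev/ttyS0, say- you're straight into termios ioctls the moment you want to set the baud rate.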
Second, if we do magically decide that we can do without secondary storage and have everything inside the application program(s), like a 1980s home computer or a PDA, how do we organise it and ensure that it will scale?
I can sympathise with Liam's uneasiness at the idea of having data which isn't immediately accessible to the CPU. However, what is the alternative? There really does have to be some sort of organisation even for something which has a single-level address space, and if we assume that the leading contenders are environments like Lisp or Smalltalk we have to ask: how is internal data organised, and in particular how is any form of security implemented?
The original Smalltalk books (Goldberg et al.) casually remarked on cases where system owners were free to change low-level details. However the early non-PARC implementors were quick to point out that such things made systems virtually unmanageable since there was absolutely no way that a program (some species of object bundle) could make any assumptions about what already existed on the system.
To the best of my knowledge, there is no persistent single-level environment where every object has an owner and well-defined access rights. Hence there is no way of saying "this stuff belongs to the user and can be read by nobody else", "this stuff belongs to the underlying environment system and can only be updated by its maintainers", and "this stuff is the intellectual property of some middleware bundle released as open-source, and if modified it can no longer be passed off as The Real Thing".
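Just to be clear about what I mean by "an owner and well-defined access rights": something along these lines would have to be attached to every object in the image. A purely hypothetical sketch- nothing like it exists in the classic Smalltalk or Lisp environments:

#include <stdint.h>

/* Hypothetical per-object metadata for a persistent single-level store. */
typedef uint16_t principal_t;            /* a user, the base environment, a middleware bundle... */

typedef struct persistent_object {
    principal_t owner;                   /* "this stuff belongs to the user"                 */
    principal_t maintainer;              /* who is allowed to ship updates to it             */
    uint8_t     readable_by_others;      /* crude stand-in for real access rights            */
    uint8_t     modified_since_release;  /* "can no longer be passed off as The Real Thing"  */
    void       *payload;                 /* the object graph itself                          */
} persistent_object;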
As such, I have reluctantly decided that the idea of file-less flat systems is a non-starter.
> I agree with Tony Hoare.
"Many years later we asked our customers whether they wished us to provide an option to switch off [bounds] checks in the interest of efficiency on production runs. Unanimously, they urged us not to—they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
The bottom line is that by ignoring most of the available hardware protection capabilities in an attempt to make software appear to run slightly faster, Microsoft et al. have left their systems vulnerable to any amount of bugs and malware. Even ignoring the cost of theft and extortion, the time wasted scanning for potential problems and fixing any that get through is vastly greater than the time they were able to save.
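For anybody who hasn't seen it at first hand, this is all the check amounts to, written out by hand in C because the compiler won't supply it. A rough sketch with invented function names:

#include <stdio.h>
#include <stdlib.h>

#define N 10

/* The C way: no check, so a bad subscript silently reads or corrupts
   whatever happens to lie next to the array. */
static int unchecked_read(const int *a, int i) {
    return a[i];
}

/* The check Hoare's customers refused to have switched off: fail loudly
   at the faulty subscript instead of running on with corrupt data. */
static int checked_read(const int a[N], int i) {
    if (i < 0 || i >= N) {
        fprintf(stderr, "subscript %d out of range 0..%d\n", i, N - 1);
        abort();
    }
    return a[i];
}

int main(void) {
    int a[N] = {0};
    printf("%d\n", checked_read(a, 3));     /* fine                                   */
    printf("%d\n", unchecked_read(a, 37));  /* garbage, or worse, with no diagnostic  */
    printf("%d\n", checked_read(a, 37));    /* caught here, before any damage is done */
    return 0;
}

The checked version costs a compare and a branch, which is trivial next to the cost of hunting down the corruption the unchecked one allows.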
> You mean like the Manchester MU5 of the late sixties?
...which shipped in '74, some fifteen years after Burroughs started working on their descriptor-based architecture which was released commercially in I think '63.
I admit to being slightly dubious about the ultimate provenance of that architecture. The tag that indicated whether a word in memory was data or a descriptor might have been a side-effect of designing hardware to run ALGOL, which later turned out to have broader applicability.
Well, I remember people running WordPerfect multiuser on CCP/M-86.
But just what are we looking at here? Is this a fairly straightforward compilation of available sources, or is there some sort of shim involved, in the same way that Linux used to offer shims to support the APIs of some of the established unix flavours? And how's the UI handled: curses/termcap or something more esoteric?
Confirmation that a modern Linux distro remains compatible with legacy software is at least as newsworthy as the availability of Lotus or WordPerfect for a current OS.
Jenny List, writing at https://hackaday.com/2021/01/29/why-blobs-are-important-and-why-you-should-care/ , suggests that many controller ASICs contain bought-in IP which might include firmware provided under NDA, the source of which quite simply can't be made public.
As others have said, this is hardly a new problem. In addition, Debian does have various binary firmware collections in the "non-free" area of its main repository.
The real problem is when installation requires non-free firmware... and the target system has no supported removable medium from which it may be loaded (not to mention the virtually-undocumented naming conventions etc.). Or even (and I've seen this on SPARC systems... anybody remember SPARC?) when the installation CD contained a blob for the SCSI controller which it didn't actually install.
I agree. Basically, Pascal was a rush job, designed between April, when Wirth threw his toys out of the pram and resigned from the ALGOL-68 committee, and the intake of new graduates in the autumn.
But it made too much of a splash, and Wirth was never able to really popularise the "done right" version that was Modula-2.
What is not generally known is that Borland had an 8-bit Modula-2 that they decided not to sell themselves but licensed to IIRC Echelon (Ciarcia's company). However they didn't license the documentation, which meant that it was essentially unsellable.
The 16-bit implementation became TopSpeed, published by JPI and later Clarion.
I believe that Cray also used Pascal-based OSes. However I'd caution that it's likely that the implementation language was a long way from (standard) ISO Pascal or (de-facto standard) Turbo Pascal, in the same way that Burroughs' implementation language for the MCP (ESPOL) was distinct from the ALGOLs that they used for application programming.
Noting Torvalds' public fulminations regarding C++ (http://harmful.cat-v.org/software/c++/linus), I'm not sure it's prudent to describe Haiku as being written in a relatively modern programming language.
I would certainly agree that using a language which offers decent type checking and modularisation is better than using assembler or K&R C. But let's face it, just about /everything/ has learnt those lessons from Wirth over the last 50 years, and adopting a toolset that offers too much in the way of prepackaged objects can contribute to code which is both enormously bloated and incredibly difficult to maintain when something goes wrong in a layer which the developers normally take for granted.
I don't know who advises the ACM grandees on this sort of thing, but the charitable assumption is that he was working from home and on the wrong side of the paywall that protects the ACM archives from plebs like us.
Everybody agrees that Aho (with Sethi) and Ullman wrote and maintained a comprehensive book describing compiler writing. But crediting them with major contributions to the field?
What about Irons (recursive ascent), Grau and Waychoff (recursive descent), Knuth (who famously wrote a mainframe ALGOL compiler over his summer vacation), and Wirth (an undeniable "doer" and influential on just about every major language)? What about Schorre (compiler-compilers) and Richards in the UK (BCPL)? What about Hoare and Dijkstra, who laid much groundwork even if they were not significant compiler authors in their own right? Hell, what about Alan Kay (Smalltalk)?
Crediting Aho and Ullman as substantial contributors of original work is vastly wide of the mark, and smacks of the current tendency to lionise "media personalities" and to listen to those who make the most noise.
Thanks for the more recent photos, which I think just about wrap this up. I'd comment that Google Maps shows a cluster of (maintenance?) buildings in the area from which the original Twitter photo was almost certainly taken, so presumably somebody got out quickly to take a photo, or there might have been a camera set up taking timelapse shots.
Finally, I think we could all do with remembering the island and people of Puerto Rico, which is much more than just a convenient foundation for the USA's astronomical facilities.
That's been "photoshopped". If you look carefully the cables have completely vanished, and the towers in the "after" shot have fewer tiers than in the "before" one.
Genuine photo at https://astronomy.stackexchange.com/q/38585/7982
Which just goes to show that one shouldn't take stuff on Twitter at face value.
The problem with ALGOL-68 was that when Wirth's ALGOL-W was rejected he threw his toys out of the pram and resigned from the committee. If he'd stayed but voted against the proposed standard then it's likely that a majority of the committee would have followed his lead... it was basically Wirth+Hoare vs van Wijngaarden and a number of his students whom he'd co-opted.
And what exactly McCarthy was doing there is unclear, since he'd already abandoned ALGOL for Lisp.
MarkMLl
There was something odd with the magstripe on those early dispensers: Burroughs used some sort of "pepperpot encoding" which I suspect relied more on obscurity than cryptographic rigour.
The TC500 was an interesting brute. A mechanical golfball driven by steel tapes (to circumvent IBM's patent which used steel wires), and a small disc for main memory to circumvent GOK how many patents on using a drum (in particular, I suspect, the LGP-30 which was a similar serial-logic machine).
And software was typically loaded into it using paper tape made out of recycled bog roll. Which was also used for diagnostic software, which more often than not would tear or jam with a customer breathing down your neck.
The bottom line is that we should be asking ourselves: if we can't make it, should we be using it?
International telecoms and cross-manufacturer interoperability were long ensured by CCITT standards, and I'm pretty sure that they weren't beholden to FRAND agreements. If some technique is so obtuse that somebody can claim that it's protectable by patent then it's probably not the sort of thing that we want in an international standard, even if it would appear to be desirable to keep the teenagers happy.
And while 99% of 5G will probably be teenage chatter and porn, it's the 1% which is business and emergency services that really matters.
MarkMLl
I'm entirely OK with the idea that a header file is a factual description of a library (etc.) which is otherwise opaque. The function names are a statement of fact, the types- in conjunction with documented compiler behaviour- are a statement of fact, and the order and type of parameters are a statement of fact. The names of function parameters are not necessarily a statement of fact, but can be changed without altering the API.
What I'm rather less OK with is when a header file contains macros, which are themselves functional and as such are not a simple statement of fact. In my opinion complex macros have no place in published header files, since they blur the distinction between the purely factual definition and the implementation detail.
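By way of illustration- the names are invented and nothing to do with any real library- the declarations at the top of this sketch are the sort of factual statements I mean, while the macro drags the implementation's own data structures into what is supposed to be a description:

/* frob.h -- hypothetical library header, names invented for illustration */

/* Statements of fact about the library: names, types, parameter order. */
int  frobnicate(int widget_id, double gain);
extern const int FROB_MAX_WIDGETS;

/* The sort of thing I'd rather not see: a functional macro.  Note that it
   forces the header to expose the implementation's own table layout, and
   it evaluates its first argument twice into the bargain. */
struct frob_entry { double gain; int dirty; };
extern struct frob_entry frob_table[];

#define FROBNICATE_FAST(id, g) \
    (frob_table[(id) & 0xFF].gain = (g), frob_table[(id) & 0xFF].dirty = 1)

The declarations can be verified against the library's documentation; the macro can only be verified against its source.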
MarkMLl
> For decades now we've had people needing to program as part of their day job and they simply don't have the education or aptitude to be able to use something like C correctly. But that doesn't mean that they can't write perfectly reasonable code in something else that internally relies on C.
It also doesn't mean that their preference for Python or Javascript should carry as much weight as the opinion of somebody who through long experience and study knows what he's doing.
> What do the inexperienced programmers code in to become experienced programmers?
Anything that's available, but not on live projects.
That might be the reason that so many large projects these days run into enormous problems: industry and government aren't prepared to support an infrastructure of dummy runs and experiments, most of which are expected to fail. So instead of cutting their teeth on- for example- a moderate-sized program which is one of many running on a bank's mainframe, novice programmers are expected to be effective members of a team writing a comprehensive integrated system: and if their collective inexperience breaks it there's no fallback position.
> Yes. But for the practical purposes of this discussion (?), the vast majority of Perl use cases is pattern matching and substitution.
I'd suggest that that's unfair, and I'd point out that most of the substantial corpus of add-ons in CPAN can't be invoked directly from a search, which implies that most developers recognise that Perl- once it had started to move away from its AWK heritage- was something more general.
It certainly has exemplary pattern-handling capabilities for something spawned in the 1990s. So exemplary in fact that most languages developed since are at least as comprehensive in that area.
> > Python's major problem is the significance of tabs and indentation [ ... ]
> That. We tried this once with Make. Everyone hated, and still hates, it. But no, Python had to do it again, because once is not enough. And it has nothing to do with helping code readability.
Actually, we tried it with FORTRAN which required that ordinary statements started in col 7. And in ALGOL and COBOL which might have been more relaxed about start column but in most early implementations still reserved a chunk of each input record for a line number.
"Some [preferred] the use of spaces for indentation, in the style of Python or Haskell. However, we have had extensive experience tracking down build and test failures caused by cross-language builds where a Python snippet embedded in another language, for instance through a SWIG invocation, is subtly and invisibly broken by a change in the indentation of the surrounding code. Our position is therefore that, although spaces for indentation is nice for small programs, it doesn't scale well, and the bigger and more heterogeneous the code base, the more trouble it can cause." -- Rob Pike https://talks.golang.org/2012/splash.article
And finally, for another 2d worth, I'd add that Dijkstra's comment about "coding bums" AIUI echoed John McCarthy's words and sentiment which were originally inspired by a certain type of student who obsessed about shaving an instruction or so off a sequence of operations in the same way that a downhill "ski bum" obsessed about shaving a second off his run.
I'm quite sure that you can write impenetrable code in any language, and would suggest that if a language comes along that prevents you from turning a hard-to-understand algorithm into a difficult-to-follow instruction sequence, there are probably some severe downsides to it. But while I'd certainly agree that some of the things in Perl give sloppy workers much more rope than they deserve, I think that the real problem is managerial: most fields of applied science /require/ adequate documentation and review, and the complexity and pervasiveness of computer software makes that requirement even more important.
Ultimately, saying "this code works even though nobody really understands it" is neither an excuse to allow the situation to continue, nor adequate justification to rip it out and redo from scratch. And selecting Python or Javascript for the sole reason that they're popular with inexperienced users is asking for trouble.
AWK is fundamentally different, in that it was designed to match patterns and insert replacements rather than to perform sequential operations with its own robust flow control. Virtually every "shop" had something like that which it used to do things like systematic modification of source files before they were compiled/assembled: I used something called Stage2 (W.M.Waite) for that sort of job.
And irrespective of whether it really was the first, by the time Perl reached v4 it offered a vast improvement when compared with the alternatives. /Much/ more significant than that offered by Python when compared with Perl etc.
Perl's major problem, leaving aside the perennial tendency of twits to advertise their superiority by writing impenetrable one-liners, was the sigil prefixes. Python's major problem is the significance of tabs and indentation, which can make it virtually unusable as a component of some larger system which emits code fragments. Quite frankly, the World deserves something better.
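To put some flesh on that last point, here's a rough sketch (in C, names invented) of why anything which emits Python fragments has to care about splice indentation in a way it never would for a brace- or keyword-delimited language:

#include <stdio.h>

/* Hypothetical code generator: a larger system which emits Python fragments
   has to get the indentation of every spliced line exactly right, because
   the wrong prefix can still be syntactically valid and simply mean
   something else. */
static void emit_fragment(FILE *out, const char *indent, const char *lines[]) {
    for (int i = 0; lines[i] != NULL; i++)
        fprintf(out, "%s%s\n", indent, lines[i]);
}

int main(void) {
    const char *fragment[] = { "if x > 0:", "    total += x", NULL };

    puts("total = 0");
    puts("for x in values:");
    emit_fragment(stdout, "    ", fragment);  /* spliced inside the loop: sums the positives */
    emit_fragment(stdout, "",     fragment);  /* same fragment, wrong prefix: still valid
                                                 Python, but it now runs once after the loop */
    return 0;
}

Both splices parse without complaint; only one of them does what the author of the fragment intended.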
132 little electromagnet assemblies, each delivering enough impulse to a hammer to bang the paper and ribbon onto the right character of the rotating drum and then get out of the way /fast/. Printing long sequences of the same character in the same column was particularly cruel- especially when you allow that Burroughs at that time wasn't exactly renowned for its R&D expenditure and the printer electronics was basically 1960s technology.
Even worse was the "comb" of magnets interleaved with the hammers. This was inherently fragile and was inclined towards "rapid unscheduled disassembly": I'm sure I've got a few bits in a gash box since in their day they were quite usefully strong.
Don't get me going on Burroughs's power supplies, which typically had lots of transistors in parallel which were inclined to "unzip" releasing things into the atmosphere that today would be considered Very Bad News Indeed.
The Burroughs TD830 (certainly the "J" variant) was based on a 6800 and similarly allowed a binary to be loaded: somebody used the ISO2047 (?) character set- the one with lightning bolts etc.- for a game of Space Invaders.
Allowing that this was decades before code-signing, encryption on the leased line back to the mainframe and so on, the security implications are horrendous.
"...easier to debug": that is of course an interesting point, since neither Delphi nor Lazarus/FPC are entirely happy when debugging an app split out into DLLs on account of the large amount of memory management (strings passed as parameters etc.) behind the scenes.