"If it's old, it's obsolete; and if it's obsolete, it needs to die."
That's a bit harsh on some of us, eh what?
Business IT is driven by the need for the new. Not necessarily your business's need, but certainly that of vendors and service providers desperate for new revenue, the dismissal of the old once it's done its real job, and the inevitable prying open of the corporate checkbook. If it's old, it's obsolete; and if it's obsolete, …
Now all you need to do is recreate the 6502 in Doom, like this adding machine.
The 6502 is in fact still being manufactured, but you can't (legally) get a 6501 (because Motorola sued and won). So I'm glad I got my 6501 before the gavel fell.
That said, it took some years to get me to drop the practice of writing code that could run both on the early 650[12]s, which had no ROR instruction, and on the newfangled ones.
Yes, indeed, and if it hasn't got enough flaws then we can definitely trust Microsoft to add in a few new ones.
I am still reeling from being caught out by their ASR rules cockup from last Friday. Seeing icons disappearing from the desktop (and the Quicklaunch bar - yes, I still prefer to use that!) as well as programs disappearing from the start menu was quite horrifying. That little lot lost me more than a day of earnings.
It's funny, I worry that I spend too much time feeding and caring for my Ubuntu MATE installation that I use for work. Then I see the monstrous stuff that happens to Windows users, and realise that while I probably spend too much time on system administration, at least I can keep working, unlike the Windows users faced with a non-functional machine courtesy of another Microsoft snafu.
There's enough programs that are good enough for most of us. There's enough special stuff you will have a hard time replacing. So.... good luck, I guess.
I'm Linux only at home (with some BSD in a VM, which doesn't do much for now). Most of the stuff I do at work could be done on Linux as well. Not all. And my employer is a Windows shop (except for servers, but then they have a ton of Oracle, and that has its own issues).
I'm pretty sure you meant "cheap packing tape" (unless you really mean "the kind of duct tape suitable for actually using on ductwork", although even that stuff is decently tough). If MS wrote duct-tape software, it might look ugly as hell but it would at least work correctly instead of falling over because it's Tuesday. Or because it's not Tuesday.
Mint Mate 20.3 here. Though I confess to running Win10 in a VM all day today, because SDRplay doesn't yet have (c'mon, guys) a native Linux version of SDRuno...
But I do most everything on Linux, because MS has been getting significantly more annoying with each release of Windows, which I must use for work. Imagine yourself beavering along, creating a spreadsheet, or PowerPoint, or whatever, when all of a sudden a largish rectangular popup (which halts your work) informs you that "you can incorporate photos from your Android device into documents...try it!"...or some such useless invitation to try a feature I never wanted. You need to explicitly click on these before you can get back to what you were doing. AND this is on the volume-licensed corporate edition of Win10 and Office!
Interesting that you spend time feeding and caring for your Ubuntu installation, my employer issued Windows laptop just gets switched on and used and switched off. No time spent feeding or caring. It just works day in day out so I barely know it’s there.
However, on my personal machines I have completely rejected Ubuntu in favour of its ancestral distros, for pretty much the same reason that you aim at Windows. In fact my goal is to be rid of both Windows and Linux for personal use.
Just a single data point here.
I have "converted" two non-technical users from Windows to Linux (I started with Ubuntu, now Mint), because I tired of having to drive to their places to help them adjust to something that Microsoft "updated", which somehow damaged their workflow. This would happen approximately every three months, and it was beginning to wear on me.
After conversion to Linux, and after the obviously necessary adjustment period, calls for service are minimal... one per year, if that.
Your mileage may vary, but in my experience, Linux doesn't suck any worse than Windows.
my employer issued Windows laptop just ... No time spent feeding or caring
Perhaps on your part, but unknown to you most of the time, if the employer has a half-decent IT setup then there will be an army of geeks doing the feeding and caring for you. I've had times when it's been hard to get work done between the prompts to close everything for a software update pushed down from the IT bods with little respect for how the users actually might like to use their machines (e.g. the assumption that no-one might leave something running past 6pm and it's OK to just close everything (without saving) and start a software update).
If you spent £$ Several Million developing a system back in the 80s that still does the job then it can be an entirely sensible thing to keep that running, albeit on an emulated version of the original OS because the original machines have long since been rendered down for their gold and tin. It's sobering to see systems you worked on that used to occupy a room full of mainframe nodes and storage drives running in a small corner of a modern linux box.
Sadly, sometimes the businesses who understand this correct sentence add on a second one: "If the code's still running then it's sensible not to keep a staff of 1980s programmers around". This leads to the multimillion system collapsing and nobody being around to fix it, which usually leads to finding someone who wrote code in the 1980s in the hope that their familiarity with the unusual environment will mean they can fix it fast. While this is good news for the person who can probably command a nice compensation package, it's not great for anyone else. By all means keep the old system around, but have a plan of how to maintain it and a plan of how to replace it and execute one of them with the other in reserve.
> It's sobering to see systems you worked on that used to occupy a room full of mainframe nodes and storage drives running in a small corner of a modern linux box.
I rather liked the idea that we could get a linux box to replace the obsolete AccuScore scoring/lane control system at the local bowling alley (which ran on a PDP-11/73), and as part of my University job I had one of the students kick off looking into it for a final-year project.
Unfortunately the lad kept himself to the scorekeeping software part, which, while it had lots of nice features that made it very specific to the BUSA league that year and our local league history up to that point, was felt by reviewers to be lacking in documentation and guidance on adapting it for general use.
(As mentioned briefly earlier, when the machine was finally binned I did get to take some of the remaining bits up the road for the admins to see, which was appreciated)
El Reg could set up a new site... geriatric.com or similar, where like-minded types can while away the hours discussing ancient stuff, like when datacentres were mere shoeboxes in the road, pissed on by whippets.
What should be happening is El Reg fronting and calling out bullshit and fads instead of cutting and pasting marketing feeds from "friends".
Sorry, didn't mean to rant, but some of the stories on the register are a week old. That's a long time in computing and it's getting really old, really slowly. I'll shut up now.
What you're describing is what I used to know as 'churning', the endless recycling of the same old ideas under the guise of them being brand new.
The unfortunate fact of life is that software doesn't wear out. Hardware platforms might age but even they can have unlimited life if designed properly. This leads to a marketing problem -- once you've gone through the 'expanding into a vacuum' growth phase how do you continue growing? Hence the churning. This is most obvious with the constant updates that are described by journalists in breathless, exciting articles about how icons have moved here or menus there, all vitally important operating system design features (of course). Security's the same -- I figure that if there were no viruses then they'd have to be invented to keep hordes of people in a job (which would be largely redundant if the systems were designed and tested correctly in the first place).
"All sound and fury, signifying nothing"
the endless recycling of the same old ideas under the guise of them being brand new
When I started my working life, computing meant using a terminal (they were actually not dumb those IBM ones) to interact with "someone else's computer" "somewhere else".
Then "personal" computers came along and we could do our own thing.
Then networks came along, and we could share "stuff".
And now, the fashion is to use a terminal (not dumb, but not actually doing any of the work) to interact with "someone else's computer" "somewhere else". Only now, it's called "cloud" or "web" and the youngsters think it's new. It's really the same old stuff - but with faster wires, faster processors in both the terminals and "other people's computers", and better displays.
Where's the old fart icon? Guess I'll have to settle for one of these.
For those wishing to try, you may be interested in this little book on Object Oriented FORTH.
Took me 30 seconds to find a pdf download.
On my bookshelves I've got three Forth books that I've had for yonks(*), plus the 1981 BYTE book on Threaded Interpretive Languages by Loeliger. I've also got copies of most of Anton Ertl's papers on TIL implementation. Don't need any more.
(*) 40+ years for at least one. I'm a high priest of vintage tech, most of which was new when I learnt it. Now get off my lawn.
FORTH is inherently object oriented, which is one reason why many proper programmers don't like it. It's an inside-out language where you define the problem in terms of words that describe the nature and properties of the things that you're working with. The end result is a program along the lines of "MAKE IT WORK". Most programmers who have fiddled with it treat it like a conventional programming language in terms of functions and libraries, which is why it got the rather disparaging description of "The World's First Write Only Language".
I'm not advocating its widespread use (although I suspect the underlying mechanism behind it is also used in the virtual machines that support semi-interpreted languages). There is a lot to be learned from knowing it, though, and it's actually quite useful for low-level jobs like debugging complex bare metal, enabling complex functionality to be created, and used, with minimal programming effort.
the rather disparaging description of "The World's First Write Only Language"
This was based on a total misrepresentation - of other languages as being usefully readable.
And that arose from confusing the word "readable" with "comprehensible".
Of course ChatGPT is going to bring this happy situation to the whole rest of human existence.
> FORTH is inherently object oriented which is one reason why many proper procedural programmers don't like it.
Object-oriented programming requires a way of thinking that is difficult for those brought up on procedural programming. One of my Uni lecturers' point of view was that we should be teaching logical and declarative programming languages (such as Prolog) as people's first language, which is a small step to object-oriented...
Object-oriented languages are fine for UI, not very useful for real-programmers that use only the command-line!
Most of the windows on my screen are terminal emulators running shells + two emacs windows so I'm old school. One of the systems I've worked on(*) was a fully OO GIS. OO is very useful for modelling real world objects, which is after all where it came from (Simula being the spiritual precursor to Smalltalk). As such it's good for handling real world utility networks and assets. We had a lot of customers all over the world, many with terabytes of data (and this was in the 90s when disks were measured in gigabytes).
(*) Designed and wrote the language and system-level code for it, to be accurate.
God, that's troll bait if ever I saw it
I find it much easier to program using OO because I only have to define the interface to the class and let the class worry about what's going on inside.
Which makes it much easier to create/debug/improve the guts of the class so long as it still uses the same interface, plus makes extending the classes a doddle if you need to add various functions to vary the behaviour of the class
Eg
I define an RS232 output class, it gets the data stream and fires it out through the RS232, then extend that into a Siemens class which defines the data stream/setup to cope with Siemens controls, or a Fanuc class which defines the data stream/setup somewhat differently.
Of course I could build each one as an individual module, but using inheritance makes life so much easier. Especially when you find another bug in the RS232 class.
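To make that concrete, here's a minimal C sketch of the same base-class-plus-override idea, using a function pointer as the overridable part (the names rs232_link, siemens_format and fanuc_format, and the framing strings, are invented for illustration -- a real class hierarchy would do the same job):

```c
#include <stdio.h>

/* Hypothetical "base class" for RS232 output: the shared code only knows
   about the interface; the device-specific framing is the overridable bit. */
typedef struct rs232_link {
    /* turn the raw block into whatever wire format a given control expects;
       returns the number of bytes written into out */
    size_t (*format)(const struct rs232_link *self,
                     const char *data, char *out, size_t outsz);
    int baud;
} rs232_link;

/* "Derived" behaviour: Siemens-style framing (entirely made up here). */
static size_t siemens_format(const rs232_link *self,
                             const char *data, char *out, size_t outsz)
{
    (void)self;
    return (size_t)snprintf(out, outsz, "%%_N_%s_MPF\n", data);
}

/* Another "derived" behaviour: Fanuc-style framing (also made up). */
static size_t fanuc_format(const rs232_link *self,
                           const char *data, char *out, size_t outsz)
{
    (void)self;
    return (size_t)snprintf(out, outsz, "%%\n%s\n%%\n", data);
}

/* The shared "base class" code: fix a bug here and every control type
   benefits at once, which is the point being made above. */
static void rs232_send(const rs232_link *link, const char *data)
{
    char buf[256];
    size_t n = link->format(link, data, buf, sizeof buf);
    fwrite(buf, 1, n, stdout);   /* stand-in for the real UART write */
}

int main(void)
{
    rs232_link siemens = { siemens_format, 9600 };
    rs232_link fanuc   = { fanuc_format,   4800 };
    rs232_send(&siemens, "G01 X10 Y20");
    rs232_send(&fanuc,   "G01 X10 Y20");
    return 0;
}
```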
But like any programming technique, it depends on the problem you are trying to solve
"...its actually quite useful for low level jobs like debugging complex bare metal, enabling complex functionality to be created, and used, with minimal programming effort."
Which is why a significant number of embedded systems I have worked on have had, at least at some point in their development, a Forth/RPN style command line interface.
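For anyone who hasn't met one, a Forth/RPN-style debug monitor really is tiny. Here's a toy sketch in C (the word set and the fake "memory" array are invented for illustration; a real one would peek and poke actual device registers):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Toy RPN-style monitor: numbers are pushed on a stack, words consume them. */
static long stack[16];
static int sp;
static unsigned char fake_mem[256];   /* stand-in for device memory */

static void push(long v) { if (sp < 16) stack[sp++] = v; }
static long pop(void)    { return sp ? stack[--sp] : 0; }

static void eval(const char *word)
{
    char *end;
    long v = strtol(word, &end, 0);
    if (*end == '\0') { push(v); return; }           /* a number: push it */

    if (strcmp(word, "+") == 0)      push(pop() + pop());
    else if (strcmp(word, ".") == 0) printf("%ld\n", pop());
    else if (strcmp(word, "@") == 0) {               /* addr @      -> fetch */
        long a = pop(); push(fake_mem[a & 0xFF]);
    }
    else if (strcmp(word, "!") == 0) {               /* val addr !  -> store */
        long a = pop(), val = pop(); fake_mem[a & 0xFF] = (unsigned char)val;
    }
    else fprintf(stderr, "unknown word: %s\n", word);
}

int main(void)
{
    char line[128];
    while (fgets(line, sizeof line, stdin)) {
        for (char *w = strtok(line, " \t\n"); w; w = strtok(NULL, " \t\n"))
            eval(w);
    }
    return 0;
}
```

Feed it something like "0x2A 0x10 ! 0x10 @ ." and it stores 0x2A, fetches it back and prints it -- which is about all you need to poke at bare metal from a serial port.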
"For those wishing to try, you may be interested in this little book on Object Oriented FORTH."
Dick Pountain ... a flashback to the old Byte magazine ... I am getting old.
I first saw an OO programming language for early Windows (based on an OO Forth), called Actor, in a Byte issue.
Too tied to Windows and hamstrung by Windows memory management to survive but was interesting for the time.
I think (Open)VMS is still kicking and is probably an example of the author's contention. I have read that VMS was a pretty decent OS design and implementation even by today's standards. At the time I found Sun's keeping its corporate data on a VAXcluster amusing.
Just worked out how to add an icon. Went for Jimmy Edwards (Whack-O!)
Refurbishing an intel intellec 4 mod 40 microcomputer from 1975.
Just got some 1702A EPROMs (new, old stock) from my memory main man in Scandinavia, so time to build a programmer.
Only thing that I'm missing is a copy of the paper tape native 4040 assembler that loaded (very slowly) via the Teletype...
Even intel don't have any copies or even details of where to look!
(Why didn't I keep a copy? And why did I throw away a lot of legacy kit that was "just taking up room", as the wife said at the time?)
Any Vulture aficionados have any comments?
(Why didn't I keep a copy? And why did I throw away a lot of legacy kit that was "just taking up room", as the wife said at the time?)
From personal experience (and similar regret), the final clause has a lot to do with it. Especially if said in a particular tone of voice.
You could just put together a simple cross assembler first. For a simple processor like this one you could actually build it as a set of macros for another assembler; it's been done before (e.g. for 8048/8051), but it largely depends on what you have to hand. If you have to write the thing from scratch then the programs flex and bison will pretty much do it for you -- flex parses the program stream into tokens, numbers and the like and bison, although designed to parse syntax, can be used to plant appropriate bytes in a file.
Once you've got the cross then you could write the assembler in assembler.
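For anyone wanting a feel for the "from scratch" route before reaching for flex and bison, here's a hand-rolled C sketch of the table-driven core -- not flex/bison, just the idea. The mnemonics and opcode values are placeholders, not the real 4040 encodings; those would come from the data book:

```c
#include <stdio.h>
#include <string.h>

/* Sketch of the core of a one-pass cross assembler: a mnemonic table and
   an emit loop. Encodings below are placeholders, NOT the real 4040 ones. */
struct op { const char *mnemonic; unsigned char opcode; int operand_bytes; };

static const struct op optab[] = {
    { "NOP", 0x00, 0 },   /* placeholder encodings */
    { "LDM", 0xD0, 0 },
    { "JUN", 0x40, 1 },
};

static const struct op *lookup(const char *m)
{
    for (size_t i = 0; i < sizeof optab / sizeof optab[0]; i++)
        if (strcmp(optab[i].mnemonic, m) == 0)
            return &optab[i];
    return NULL;
}

int main(void)
{
    char mnemonic[16];
    unsigned operand;
    unsigned addr = 0;

    /* read "MNEMONIC [operand]" lines from stdin, emit a hex listing */
    while (scanf("%15s", mnemonic) == 1) {
        const struct op *o = lookup(mnemonic);
        if (!o) { fprintf(stderr, "unknown op %s\n", mnemonic); return 1; }
        printf("%03X: %02X", addr++, o->opcode);
        if (o->operand_bytes) {
            if (scanf("%x", &operand) != 1) return 1;
            printf(" %02X", operand & 0xFF);
            addr++;
        }
        putchar('\n');
    }
    return 0;
}
```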
I've been a victim of "just taking up room". My lost device was a complete "in a box" Lisa system that was gifted to me (because it was just "taking up room").
FWIW I've got a copy of the 1972 Intel data book for these parts. It's more of a catalog than a technical handbook, though.
The idea is to write a simple assembler model in 'c', being aware of the restrictions in the 4040 processor (there's a toy sketch of the corresponding checks below):
All tables and their handling code must fit into a 256 byte page.
No programme flow from one page to another.
Eight levels of stack.
4K total memory space, code AND label storage.
THEN translate it into actual real live native 4040 code.
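As a rough illustration of what those restrictions might look like in the C model -- the function names and the notion of a "block" are invented; the numbers are simply the constraints listed above:

```c
#include <stdio.h>
#include <stdlib.h>

/* Toy location-counter checks for a 4040-style assembler model. */
enum { PAGE = 256, MEMTOP = 4096, MAXSTACK = 8 };

static unsigned lc;          /* location counter */
static unsigned block_page;  /* page the current code block started in */
static int call_depth;       /* crude static approximation of nesting */

static void begin_block(void) { block_page = lc / PAGE; }

static void emit(unsigned char byte)
{
    if (lc >= MEMTOP) {
        fprintf(stderr, "out of 4K code+table space at %04X\n", lc);
        exit(1);
    }
    if (lc / PAGE != block_page) {
        fprintf(stderr, "code block crossed a 256-byte page at %04X\n", lc);
        exit(1);
    }
    printf("%03X: %02X\n", lc, byte);
    lc++;
}

static void call(void)
{
    if (++call_depth > MAXSTACK)
        fprintf(stderr, "warning: nesting deeper than the 8-level stack\n");
}
static void ret(void) { if (call_depth > 0) call_depth--; }

int main(void)
{
    begin_block();
    emit(0x00);          /* placeholder opcode */
    call(); ret();
    return 0;
}
```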
The big brains at intel wrote this in 1973. So how hard can it be? Some of the design decisions can be inferred given the platform.
(Taking notes from "Real Programmers Don't use Pascal", all the structured programming in the world won't help, this needs real... Well, modesty forbids! But I don't want my eulogy to include the phrase "And he'd almost got that assembler programme to work...")
A more serious point is that I started with the intellec 4 mod 40 at the ripe old age of 22, so I feel that I was part of a very small set of 4040 coders in the UK, and 50 years on that set feels a lot smaller...
I wonder if the lofty sages at Vulture Central could put in a shout-out for anybody who might have a copy of said RAM-based assembler tape?
"Help me, El Reg, you're my only hope..."
:)
Doesn't get much more "classic" than the 4040!
Good luck in your search. It's out there somewhere. (Perhaps as a printed listing?)
"He died at the console, of hunger and thirst.
Next day he was buried, face down, 9 edge first."
--https://www.gnu.org/fun/jokes/last.bug.en.html
One bit of vintage technology that needs to return is dedicated lines, the original backbone of industry. How many times were dedicated lines hacked? Ma Bell would love to put its landlines back into service. Electric utilities used to use their power lines for their phone service (as my dad learned when he toured the Beckjord generating plant along the Ohio River). JC Penney used to have its credit card center in Plano Texas and transactions went through just fine using their old CompuAdd registers and dedicated lines.
It's really about time industry quit using a bunch of holes held together with string (the 'Net) for their business-critical, proprietary and confidential data. And until PEBKAC can be entirely eliminated, dedicated lines are as close to being secure as there is.
...also landline phones. It occurred to me a while back that if my house was burgled the scrotes might nick all my computing equipment and my mobile phone because of the resale value(*), and if that happens how do I contact the authorities or important organisations like banks to get things sorted out?
But it's pretty unlikely that the scrotes will nick a landline handset. So my disaster recovery plan consists of off-grid backups (pen and paper) and a manky looking landline handset. The handset is now plugged into a cunningly disguised VoIP adaptor (everyone in the UK is going to have to do that over the next couple of years if they also want a landline).
(*) Not necessarily for the hardware, but I bet there are people who will pay good money for this stuff for the information contained on it.
Resale Value? That must have been important to the folks who broke into my house. They took the time to disconnect my Atari VCS and leave it behind while taking the TV. "OK, Boomer" decades ahead of time. Also took some old silver coins, and probably used them in a vending machine.
As for VOIP landlines, unless you provide uninterruptible power, good luck.
As for VOIP landlines, unless you provide uninterruptible power, good luck.
Why - do you think burglars are going to nick my electricity?
But if you're just making a general observation about them not working in a power cut please note what I wrote in my post:
(everyone in the UK is going to have to do that over the next couple of years if they also want a landline)
That ship has sailed in the UK.
The old analogue service will cease being available to new customers this year. The service will be switched off completely in 2025.
Ofcom have already green-lit this and only require that CPs offer some form of power protection to vulnerable customers (or presumably those who wish to pay extra).
"2.2.3 In summary, the principles state that:
• Providers should have at least one solution available that enables access to emergency
services for a minimum of one hour in the event of a power outage in the premises;
• The solution should be suitable for customers’ needs and should be offered free of
charge to those who are at risk as they are dependent on their landline;
• Providers should i) take steps to identify at risk customers and ii) engage in effective
communications to ensure all customers understand the risk and eligibility criteria and
can request the protection solution; and
• Providers should have a process to ensure that customers who move to a new house or
whose circumstances change are aware of the risk and protection solution available."
The service will be switched off completely in 2025
Given how close that is, and how much of the country is still to be fibre'd up*, all I'll say is that it might be by December the 1057th 2025 - if things go well!
* I don't think the OpenRetch plans will have my house "connectable" by fibre by end of 2025. And there's a difference between having fibre available to the pole, and actually having it installed in every house served by that pole.
Well said, Rupert. However the resistance to this is almost impenetrable. I've spent a decade trying to get early tech training to include simple bare metal design and development within tight constraints, but I've got nowhere with the official educators. The general opinion is that "you don't need to know that any more" because the grossly over-specified black box will do that for you under the hood. Even the Pi is significantly over-specified for many applications -- a lot of which don't even need to run over an OS (and would probably run faster without). I based my proposed teaching materials on the 40-pin PIC devices, which have the same memory, more interfaces and higher speed than a BBC Micro on a single chip but are rendered completely transparent by exhaustive detailed documentation. But, as I said, the educators were just not interested (quite possibly because they'd have to do some fast learning themselves).
The net result is that our designers and developers get further and further from the real runtime machine, so nobody (except possibly the cyber crims) really knows what's actually being executed in real time. Consequently the technologies we rely on ever more to keep our societies running get more and more fragile and unreliable, if for no other reason than that nobody really understands them. The ultimate outcome is pretty obvious unless we continue to bury our heads in the sand.
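For anyone wondering what "bare metal within tight constraints" boils down to, here's a minimal C sketch. The register addresses and bit layout are invented placeholders, not a real PIC's, and the code only means anything on a target where those addresses exist; the point is simply that the entire program is visible, with no OS or library underneath:

```c
#include <stdint.h>

/* Made-up memory-mapped GPIO registers, purely for illustration. */
#define GPIO_DIR   (*(volatile uint8_t *)0x0F80u)  /* direction register */
#define GPIO_OUT   (*(volatile uint8_t *)0x0F8Au)  /* output latch       */
#define LED_BIT    (1u << 0)

static void delay(volatile uint32_t n) { while (n--) { /* spin */ } }

int main(void)
{
    GPIO_DIR &= (uint8_t)~LED_BIT;   /* set pin as output (0 = output, say) */
    for (;;) {
        GPIO_OUT ^= LED_BIT;         /* toggle the LED */
        delay(50000u);
    }
}
```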
> ``The general opinion is that `you don't need to know that any more' because the grossly over-specified black box will do that for you under the hood.''
Every time I hear someone say something like that, I flash on the first episode of James Burke's `Connections' -- where he describes how complex things built upon simple ideas are then built upon in turn -- and when he asks "Who would know how to do X?" (one of his examples was, if memory serves, how to fix a tractor) if something awful happened. We throw away all that old knowledge at some peril.
I flash on the first episode of James Burke's `Connections'
That was a brilliant series of programs which should really be essential viewing for any budding engineer. He had an ability to put things across that few ever manage. I know Hamster fronted a modern day take on it, but it really wasn't very good in comparison.
Another thing to try is to ask someone if they could make a pencil - and at first I bet they'd say something along the lines of "it's simple enough, why not". Then you drill down - do you know how to make the "lead" that goes down the middle ? Could you cut down the tree and turn it into the pieces of wood that go around it ? Could you make the paint that goes over that ? And don't forget, you need to make the tools you need to do all those things as well. You quickly realise that there is probably not a single person on the planet who actually knows how to make a pencil. So when the nuclear apocalypse happens (looks vaguely in the direction of Moscow) and we stop being able to buy pencils made by others (using materials made by others, using tools made by others, etc., etc.), we won't even have something to write with.
Totally in agreement with the post above. It's not just the code efficiency aspect, or cost, security, provability, simplicity - it's the fact that you have both feet on the ground. The actual machine code instructions are fundamental, there is no level below*, no abstraction, no mystery.
I'm sure I'm not alone amongst engineers, in having a "learning style" that requires a continuous chain of understanding right down to bare metal. Educationally, this is how science (and mathematics) is taught, there is a clear path to the fundamental maxims, definitions and root equations.
Odd that a similar approach is not taken with computer science - it's all abstracted to death:
"A method in object-oriented programming is a procedure associated with a class. A method defines the behavior of the objects that are created from the class. Another way to say this is that a method is an action that an object is able to perform. The association between method and class is called binding."
Great!!...
That might be a clever shorthand for what is intended, but offers no visibility as to what actually happens with real inputs, when imperfect or deliberately malformed. You are utterly reliant on reams of unknown code.
Either it's just my old-fashioned "bottom up" approach, or maybe Computer Science is unique in being a subject best taught from the top down?
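To put the "no mystery below" point another way: roughly speaking, a non-virtual method boils down to an ordinary function that receives the object as an extra argument. A minimal C sketch (the counter example and names are made up):

```c
#include <stdio.h>

/* What the quoted definition roughly cashes out as at a lower level:
   a "method" is an ordinary function that takes the object explicitly. */
struct counter { int value; };

/* "counter.increment()" with the binding written out by hand */
static void counter_increment(struct counter *self) { self->value++; }

int main(void)
{
    struct counter c = { 0 };
    counter_increment(&c);    /* what obj.increment() roughly compiles to */
    printf("%d\n", c.value);  /* prints 1 */
    return 0;
}
```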
"Either it's just my old-fashioned "bottom up" approach, or maybe Computer Science is unique in being a subject best taught from the top down?"
I suspect this "uniqueness" of compsci is at best, wishful thinking.
Without some mental model of what is actually happening under the hood much modern technology might as well be magic (sorcery) and as (un)reliable.
...then I'm pretty sure that anything less than a decade old probably hasn't really had all the knots pulled out of it yet and almost certainly contains as yet undiscovered weird failure modes.
Nothing convinces me that something is good so much as the fact that it's over a decade old and still in use.
It's been around the block a few times. We've found and fixed most of the properly stupid faults in it. We've given that thing a proper workout and it's not been found wanting. Old tech makes me feel a damn sight more confident that it's fit for purpose than anything that's still new and shiny.
Sheesh! I worked with some Wintel admins who started replacing server hardware as soon as it came off the initial warranty. (Luckily it was leased, but switching every time something came off warranty meant we paid top dollar for the latest and greatest.) The department I worked in had projects waiting for more disk space and I recall saying in a meeting something to the effect of "Damn! So you're the guys who are running up the IT budget so I can't add more storage!". I wasn't very popular with them after that.
Seriously, old tech is just that. It was cute in the early days but businesses and customers simply don't work the way we used to. We can't simply go back to writing our programs in assembly and centralizing all IT within a company to a single mainframe or server.
Centralization of software running on a central "Cloud" (a modern variant of the mainframe) does make sense for some types of government workloads, though.
Of fresh grads, ink still dripping from the degree, being thrown alive into somewhere with industrial process controls from the 1970s/1980s.
The controls have not been replaced or upgraded since installation, because (a) they work and (b) it would cost 200 million to convert the plant and sensor system into something more modern (that's 1 million for the computing tech plus sensors and 199 million for 6 months' lost production), and their job would be maintaining said equipment so it didn't fall over or stop production.
So for us down in the industrial/manufacturing park a 5-year lifespan for a machine or operating system is laughable when we look for a minimum 10-year lifespan and hopefully much longer, but we need the skills to keep them going (which involves never attaching Windows-powered equipment to the internet....)
So here I am in the twilight years (Icon) of what I shall call a career, and I have an engineer fresh from Uni in the last year or so working with me on a project.
His degree taught him C, embedded C, programmed controllers & lots of other useful bits over his 4-5 year course that our facility needed in a fresh face, as the ageing staff (annoyingly younger than me) head towards early retirement.
What's the one thing he has zero knowledge of (although RS232, and IR receivers & transmitters, make it three things) that could be considered rather crucial?
Zero knowledge of network communications, how it works etc other than how to ping a device.
5 - 10 years, or "hopefully longer". Pah, that's nothing.
Anon for what might be an obvious reason. I'm working on a project where it takes more than that just to build the damn thing, and then it's expected to carry on working for "at least" 25 years. Or if politics kicks difficult questions down the road with its replacement like they did with this, it'll have to last 40+ years like its predecessor (that I had a very small part in building back in the 1980s!)
Let's just say, obsolescence is quite a problem when stuff is obsolete before you even get it into service.