I know there's a risk that this comment may just sound like the ranting of some old bawbag, but do hear me out.
About 25 years ago, I worked for a smallish financial institution. The even smaller investment arm was good. Very good. They frequently topped the charts with their performance, using an in-house dBase application (eventually Clipper) that they had honed over 8 or 9 years by that stage.
There were frustrations. Data had to be loaded each day from other systems, but the biggest issue was the network IO at each client workstation. At the time this was seen as a Moore's Law issue, so more hardware was thrown at it, but the real bottleneck was the IO, which faster machines alone couldn't fix.
After a while, they bought in an Oracle-based system running on dedicated minicomputers. The initial budget was £1 million, and they had run way beyond that by the time the new system was working.
But... Their investment performance didn't so much tail off as fall off a cliff. Whatever their previous modus operandi, the new system simply did not allow it, and within a short time they became also-rans in the investment game.
I often think of this experience, because around the time of their migration I wanted to explore the possibility of a centralised system for running the Clipper app (say, a Linux box running dosemu), which would have resolved their IO problems.
I'm not suggesting that the old Clipper program could have been stretched out indefinitely, only that following a perfectly valid and acceptable upgrade path threw the baby out with the bathwater.
The main point illustrated by this example, though, is that we in technology are so deeply inculcated with the assumptions of "upgrade", "better", "improved", and other ideological concepts allied to the industry that we never even question them. We treat such assumptions of improvement as a natural part of our lives, even though we ought to be able to learn from, say, the Enlightenment thinkers who similarly got mired in a philosophy of progress. We do not stop to question ourselves frequently enough, if ever.
It may be argued that this ideology of improvement is precisely what has driven the technology industry to the heights it has achieved. Perhaps so, but we also need to think about the limits of these assumptions, both because of their unintended consequences and because reason dictates that "believing" in principles like Moore's Law is unlikely to be sustainable indefinitely. We may ourselves become part of the problem if we leave such underpinning assumptions unexamined.