That's a really nice...
[digital library of private things | application you've used for years | community you've built up over 2 decades] (delete as applicable) you got there... it would be a shame if anything were to happen to it.
79 publicly visible posts • joined 16 Jul 2007
... is that it's easy to remember and read out a dotted quad but essentially impossible to remember or speak ff:12:34:56:78:89:0a:ff ... and it probably really is that simple.
If only they'd just added a couple more bytes to the address and left it with room to add more as needed.
That's how they still work - if you want to do it that way. These days the favoured way is to dynamically link a shared object library, the reason being that hundreds of applications will all be using the same code so you might as well have the OS only load it into RAM once. In other words after not-very-much linking it becomes hugely more efficient, not less efficient, to load the entire library instead of just bringing in bits and pieces.
Only about 1 in 10 people I've ever worked with over the last 35 years or so have really been software engineers... the rest do it for various reasons but not because *it's their thing*. They just kind of accidentally fell into it, the way people fall into accountancy or sysadmin or estate agency, and trudge along in a mediocre fashion, never really caring much for what they do.
That's not quite the case though: because RAM is not reserved by a single application, it's easy to conflate RAM usage with the simpler sums you'd do for disk space. In reality, RAM usage is amortised over many applications running simultaneously, and not all applications are loaded at once. So while you're totally correct that it's way cheaper to bung 4GB of RAM into a server than to shave 4GB off the memory usage of the service running on it, the same holds for an application that runs on a million installations: the 4GB of RAM installed in each of those million computers is amortised over the varying usage of a thousand applications. Your application isn't the only one to benefit from that 4GB: the other thousand applications benefit too, which makes all of them quicker and easier to build and maintain, and therefore cheaper to make, and that means those million users pay correspondingly less for their software. It *still* works out cheaper for them to just buy a 4GB stick than to pay the extra engineering time needed to make a thousand applications a bit more efficient.
There are other concerns that the commissioners of software have long cared about which also need to be considered, yet it's fashionable amongst us older devs to pretend that back in my day, we had to lick t' road clean before fatha would even let us go to school, and we made every byte count, etc. etc.
The problem is, if you cast your mind back to software development in the 70s and 80s and even 90s... it was fucking awful. The tools were primitive. There wasn't a lot in the way of useful abstraction, which meant stuff took ages to develop, was usually fragile and tied to specific bits of hardware, and riddled with the sorts of potential security bugs that would get you hung up by the foreskin until you were sorry in modern-day computing (but fortunately back then everything was air-gapped).
Not only did stuff take ages to develop, it didn't actually do much of what everyone takes for granted these days. Software almost certainly wouldn't have worked with any character set other than US ASCII. It wouldn't have handled RTL scripts. It very likely didn't have any undo feature, or any cut and paste or global clipboard. Your fonts would have been shitty bitmaps on a low-res monochrome screen instead of beautifully rendered glyphs in 32-bit colour, drawn slightly larger on your 4K monitor for your tired, miserable old eyes. It might not, if you go far enough back, even have had a GUI. It wouldn't have been able to access more than 2GB of RAM until it got a 64-bit OS. It would likely have just crashed when it ran out of physical memory. All of these things have been added, and they've demonstrably made using software vastly better than it used to be... and it's all cost space. Everything's come at a cost in space, and CPU power.
And while it was taking ages to develop, very few people had the required patience and autistic attention to detail to actually do it properly, so they commanded a very high price, which they charged for a very long time. This vexes the people who pay to get these things developed, so they're very keen to make it a) easier, and therefore less of an exclusive and hence expensive club, and b) quicker, so it costs even less to make and gets to market faster. And those two market forces are the whole of what's driven software development for the last... three decades or so? Make it quicker. Make it cheaper.
It's still vastly cheaper to buy RAM than it is to optimise software. Vastly, vastly cheaper. Even at today's slightly higher prices. And... I'm fine with that, because I can concentrate on the first, and most difficult, bit of software development - make it work - for longer before I have to worry about the next bit - making it fast.
Today my crappy £200 phone has about 10 times the processing power of, and unfeasibly more memory than, the Windows PC I first used Microsoft Office on. I think it's a fairly low bar to reach to get a word processor running on something "low-powered", or perhaps everyone has become so inured to the incredible bloat of Windows products that it has become some sort of fantastical dream.
"This is one of the perils of trying to automate the assessment of an employee's value."
Are you kidding me... honestly, it's one of the key symptoms of internal rot in a company when it starts trying to automate things like this. On the one hand - lol, Canonical have just shot themselves in the foot with facepalmingly accurate corporate stupidity. And on the other hand - would you want to remain at a company that actually put any stock in this sort of treatment?
So long and thanks for all the work, Till.
Having had the pleasure of many languages over the years, including .net, I'd rather stick with Java than almost any other flavour-of-the-month.
Currently doing C code. I forget sometimes how wonderfully simple it is. And just how long it takes to get anything at all of any significance working.
... by which I mean, the same as if they were in the industry using their skills to do the job. Only then will you have a chance to attract people good enough to teach it well.
I would have loved to teach computer science... but not for less than one third of my current income.
I have absolutely no idea. I just install every update that comes my way from the Software Update centre. And then one day it just refused to boot after one of those updates - I forget the exact failure but it was obscure enough that I couldn't actually find a solution on the interwebs.
I have had a lot of trouble with hanging as well - some kernels are not as well tested as they should be, perhaps. And don't get me started on Nvidia driver failures.
Even Mint, the go-to replacement for Windows, has been something of a huge pain in the arse for me, at one point breaking so thoroughly I had to reinstall it completely after a year in use. We have to be honest about it or it won't get fixed: it's difficult, still extremely fragile, and prone to total breakage. When you have to use a mobile phone to trawl Mint forums for obscure issues (on *very* mainstream hardware) because it can no longer boot, you know it's not quite all there yet for everyone.
At the risk of enraging everyone who stands for the GIMP - and good on you, it's worth fighting for - I was rather hoping V3 would have finally addressed its chronically grim user interface. With at least two extraordinarily good competitors to crib from - Adobe, for all its sins, and Affinity Designer, a truly excellent piece of software - I had thought the people behind it might have taken a bit of reflection to heart, a bit of deep soul searching, perhaps asked some users of these (very much paid-for) tools: why are you paying money to use these things when you could have the GIMP for free? And the reason is, because they are so infinitely nicer to use that we'd rather give people money to keep making them better.
Well, in the case of Affinity anyway. Adobe can go fuNO CARRIER
It's a bit of a shame that Javascript has found itself in a desktop widget environment, a place where it literally doesn't need to be. Javascript has few redeeming features - in fact the only redeeming feature it has is that, by a quirk of history, it wound up being the de facto environment for remotely delivered code, despite being demonstrably awful at anything nontrivial. Kill it with fire.
I currently make my living rewriting ancient C code into Java code. Some of this C code is in control of things that can and would actually kill people in an extraordinarily messy way. It is possibly the worst codebase I have encountered in 40 years, and by happy chance, I don't think it's actually killed anyone yet.
The new code runs at approximately the same speed, not that anyone can tell because the difference between "insanely fast" and "slightly less insanely fast" is hard to perceive of course. What is most interesting though is that for every 1000 lines of C code or so, I can replace it with about 100 lines of Java code, and I now have the added benefits of being able to reason about what it's doing, write easy-to-run-and-maintain unit tests for it, and most importantly I don't have to worry about the enormous number of completely unchecked memory issues that existed in the old C code.
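To illustrate the "completely unchecked memory issues" point: this is a minimal sketch, nothing from the actual codebase (the class `BoundsDemo` and method `readSlot` are invented names), showing that an out-of-bounds read in Java becomes a well-defined, catchable exception instead of the silent corruption or undefined behaviour you'd risk in C.

```java
// Hypothetical sketch: out-of-bounds access in Java fails loudly and
// immediately, instead of silently corrupting memory as C can.
public class BoundsDemo {
    static String readSlot(int[] buffer, int index) {
        try {
            return "value: " + buffer[index];
        } catch (ArrayIndexOutOfBoundsException e) {
            // In C this would be undefined behaviour; here it is a
            // well-defined, traceable exception we can reason about.
            return "rejected index " + index;
        }
    }

    public static void main(String[] args) {
        int[] buffer = {10, 20, 30};
        System.out.println(readSlot(buffer, 1));   // in bounds
        System.out.println(readSlot(buffer, 99));  // out of bounds, caught
    }
}
```

The point isn't the try/catch itself, it's that every array access is bounds-checked by the VM, so a unit test can exercise the failure path deterministically.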
...and to some extent, "the elderly/infirm/young/other helpless minority", because this leads the main perpetrators of the issue at large - ie. mostly everybody - to think it's going to hurt someone else, and so maybe, just maybe, they'll be all right Jack, and just carry on as they are, forgetting that they too might one day be old, or infirm, or have kids, or - shock horror - be poor, because there's not an awful lot of need for software engineers when there's no civilisation left to need an internet, because all the poor people who used to work in the fields are now dead and, in fact, so are the fields.
What sort of CIO exactly has sat on this all this time and done sweet FA about it? The move to OpenJDK should have been done a decade ago. There are *no* excuses. *No* reasons not to have done it. Don't come bullshitting about compliance or compatibility - because it is just that, bullshit. Do your goddamned jobs. Or perhaps GTFO and let somebody who has a clue run the show instead.
These are not actually memory leaks, they are object leaks. The difference is that a memory leak just soaks away into nothing, becomes untraceable, and leads to all sorts of bullshit like use-after-free etc., whereas an object leak is entirely traceable, and its only side effect is an OOME rather than the undefined or hackable behaviour of use-after-free.
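A hedged sketch of what an object leak typically looks like (all names here are invented for illustration): objects kept alive by a forgotten reference, which the GC can never reclaim, but which remain fully visible in a heap dump via the retaining root.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical example of an "object leak": entries are added to a
// collection but never removed, so they stay reachable forever.
// Unlike a C memory leak, every leaked object is still traceable
// from the static root below (e.g. in a heap dump).
public class ObjectLeakDemo {
    // The leak root: grows on every request, shrinks never.
    static final List<byte[]> CACHE = new ArrayList<>();

    static void handleRequest(int sizeBytes) {
        CACHE.add(new byte[sizeBytes]); // retained by mistake
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            handleRequest(1024);
        }
        // Every "leaked" byte[] is still reachable and countable,
        // which is what makes the eventual OOME easy to diagnose.
        System.out.println("retained objects: " + CACHE.size());
    }
}
```

Run it long enough and you get an OutOfMemoryError with a heap dump that points straight at `CACHE`, rather than a mystery pointer soaked away into nothing.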
While you're having fun poking at Tim Sweeney, it should be made clear that Epic does *not* mandate that purchases are made through EGS - it is purely a convenience. And Epic have actively supported entirely novel mechanisms to manage purchases - e.g. digital blockchain stuff (for better or worse - the point is it's not like Apple or Steam or Google, where such things are *explicitly banned*).
Rather than just downvote this comment it's probably worth explaining why it's nonsense, and this from someone who has supported native Linux gaming for 25 years (as a creator) and exclusively runs a Linux rig now:
Epic have no real need to expand their market share by about 1%, which is the realistic maximum size of the Linux gaming market, at a cost to them that will most likely actually exceed any profits they might have made on that 1%.
And it is extraordinarily *hard* to write pathologically nasty code in Java. The VM conveniently provides a sandbox that stops RAM overcommitment snafus, although unfortunately thread creation is still unbounded.
To toss petrol on the fire ... 90% of all the C++ code out there could be rewritten in a third of the time in Java or C# and be just as effective as it was before, but without the memory safety issues.
That might have been the case 20 years ago but it is definitely not the case now, and it is still evolving into an ever more performant, ever more efficient, ever more expressive, and ever more concise platform. Where it succeeded is that it made more people happy more of the time than any other language before it, which is why it's now ubiquitous. Unfortunately this means it has also attracted the lion's share of mediocre programming talent, but hey, at least they can't write code containing use-after-free...