* Posts by doublelayer

10898 publicly visible posts • joined 22 Feb 2018

Venezuela loses president, but gains empty Starlink internet offer

doublelayer Silver badge

True, but unless they get a move on, that won't make the free credits available to anyone, because those expire in less than a month. If Starlink only becomes fully licensed to operate on February 4th, then nobody gets to use those credits. I don't think Starlink is the primary goal of any of the people who are or claim to be in control of Venezuela, and licensing, even done abnormally, takes a while.

doublelayer Silver badge

What does their desire have to do with whether the service is available in those places in practice? The article's point was not whether Starlink in Venezuela is good or bad but whether it is actually there, which it mostly isn't. Its availability in any of the countries you name depends mostly on the number of people with access to dishes and whether they are able to operate them, and SpaceX's willingness to exaggerate makes that harder to know, even though SpaceX could provide exact statistics on its user numbers in those countries any time it wanted.

Calling out hyperbole or dishonesty is not the same as opposing a technology or a specific product. Pretending that they are makes your counterarguments pointless and suggests you don't have an argument about what the article was talking about.

doublelayer Silver badge

Re: Ground Stations?

That's not true: a ground station at the same latitude but on the opposite side of the planet, in this case somewhere in the Philippines, is not available to a satellite over Venezuela. There is a limit on which ground stations a satellite can successfully transmit to. The orbits are not parallel with lines of latitude, nor can a satellite connect to ground stations anywhere along its orbital path.

In practice, they can use other stations by communicating between satellites, meaning that they can also use stations at latitudes or longitudes the original satellite can't reach. That slows down the connection, which is one reason to ask about the availability of local ground stations to minimize inter-satellite traffic.
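To put a rough number on that limit, here is a back-of-the-envelope sketch. The 550 km altitude is an assumption (a typical Starlink-like orbit), and a 0-degree elevation mask is used; real systems require higher minimum elevation angles, which shrink the footprint further.

```python
# Rough illustration of the ground-station limit: a satellite can only
# see stations within its radio horizon. Assumes a 550 km orbit and a
# 0-degree elevation mask (both illustrative assumptions).
import math

EARTH_RADIUS_KM = 6371
altitude_km = 550

# Great-circle distance from the sub-satellite point to the horizon.
central_angle = math.acos(EARTH_RADIUS_KM / (EARTH_RADIUS_KM + altitude_km))
horizon_km = EARTH_RADIUS_KM * central_angle

print(f"Max ground distance to a reachable station: {horizon_km:.0f} km")
# A station in the Philippines is roughly 17,000 km from Venezuela,
# far beyond this footprint.
```

At that altitude the footprint radius comes out around 2,500 km, which is why only inter-satellite links can reach stations much further away.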

What if Linux ran Windows… and meant it? Meet Loss32

doublelayer Silver badge

You are entirely correct about the answer to that question, but that's not the set of options users generally get. Here are more likely questions, each of which users face individually for different programs or services:

1. Would you like to pay $x for a permanent license to this program or $x/4 per year? Users decide based on an assumption of how long they'll need this thing.

2. Would you like to pay $x/year for this software or nothing for a version that doesn't have all of the features but maybe it has enough to do what you want? The user decides based on their willingness to experiment rather than to buy the thing they know will work. As someone who always starts with the open option because it's free to see how well it works, I find those who choose the other option a little confusing, but I've seen plenty of them.

3. Would you like to pay $x/year for a service or set up your own server and configure and host it with some required system and network admin? The user chooses based on their knowledge of or willingness to learn terms like "firewall rules", "NAT circumvention", "log-based banning of malicious attempts", "sizing your VM for performance requirements", and things we underestimate the complexity of because they're the bread and butter of our jobs.

doublelayer Silver badge

Re: The last thing we want

That's orthogonal to groups 1 and 2 and therefore isn't part of this. To determine which of those two you're in, ask whether you would want them to move to Linux because, assuming you are correct, it would be easy for them to do so, or whether you don't care that they haven't. Your response suggests group 2 is more likely, but there's insufficient information to tell. The fact that you get asked for support has no bearing on whether you want to, and are trying to, increase Linux's user base among the nontechnical.

doublelayer Silver badge

Re: The last thing we want

You need to decide what group of Linux or open source supporters you are in, with the general options being group 1, you use this and don't care if others do, or group 2, you use this and think it would be better if the general public did too.

If you are in group 1, then you can ignore this. Whether or not this gets developed is not something you can stop, and if you refuse to use it, then your stuff won't change. There's plenty of Linux software right now, and nobody is going to recompile it all for Windows just because of this, if only because that would take a long time to build even with a bunch of people wanting to do it, and there currently aren't any.

If you're group 2, however, then you need to understand that you're losing already and no intentional ignoring of Windows is going to change that. Not running some piece of Windows software isn't going to kill Windows, it's going to keep the people who want to run it away from stuff that isn't Windows. If you want them to stop using Windows, then you will have to figure out how to make it possible for them to switch without as much pain, whether that's this or some other type of emulation so their software can still work, finding a workable alternative which already works on Linux, or convincing the people who wrote that software to build it for Linux. I assume you're not doing any of those. I'm happy to be proven wrong.

This attitude is why I'm closer to group 1 these days. I give people Linux machines sometimes, and some of them can and have used those to good effect, but I'm not trying to force people not to use Windows because there are too many problems I can't fix single-handedly and too few people doing anything to try to fix them. There are advantages that we Linux users would get if others started using it, and that's not going to happen as long as we ignore, deny, or as you have chosen in your post, embrace those problems.

UK urged to unplug from US tech giants as digital sovereignty fears grow

doublelayer Silver badge

Re: Please Do

You don't appear to know what the vulnerabilities in MD5 let someone do, or how hard those things are to do in practice. Cryptography is by design very panicky: as soon as problems begin to appear, cryptographers look to build a better algorithm that will resist them. This is what they should be doing, but it can lead you to overestimate how broken existing algorithms are.

This doesn't disprove your original point that people with the access to copy encrypted data may store it up to decrypt later, but it does change how practical that is. They don't need to wait in the hope of finding a complete vulnerability if they're willing to brute force it right now, but that would be expensive. Most of the vulnerabilities found in cryptographic algorithms would reduce the expense, but from such a high starting point that it's still not worth doing for most data. For example, DES is indeed considered too weak to use and can be cracked, but that's because it uses tiny keys. Effectively, DES is insecure because computers are cheap enough that it isn't too costly to throw a bunch of them at it, not because of its other weaknesses.

If you can avoid surveillance, that's great and probably worth doing, but if you can't, it's not quite as over as your summary suggests.
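The key-size arithmetic behind that can be sketched in a few lines. The keys-per-second figure is an illustrative assumption, not a benchmark of any real attacker.

```python
# Back-of-the-envelope keyspace comparison: why 56-bit DES falls to
# brute force while a 128-bit cipher does not. The tested-keys-per-second
# rate is an assumed round number, not a measured one.
def years_to_search(key_bits, keys_per_second=1e12):
    """Worst-case time to try every key, in years."""
    seconds = 2 ** key_bits / keys_per_second
    return seconds / (365 * 24 * 3600)

des = years_to_search(56)    # fractions of a year: hours at this rate
aes = years_to_search(128)   # on the order of 1e19 years
print(f"DES (56-bit):  {des:.4f} years")
print(f"128-bit key:   {aes:.2e} years")
```

Every extra key bit doubles the search, so the gap between the two is a factor of 2^72, which is why shaving a few bits off with a cryptanalytic shortcut rarely changes the practical answer.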

Gmail preparing to drop POP3 mail fetching

doublelayer Silver badge

Re: An email server on my andriod

You can do that. It's not really that hard to run the software on a phone. What's hard is running the software in a way that isn't a huge pain for everyone involved.

Your phone moves from one network to another regularly, and people need the address of the server to send mail to it. That means extra work to keep associating your name with the phone's new address, except that most of the networks you'll be on won't give you a public address because of NAT, so a DDNS setup will just let anyone track your general location while still not letting them deliver you mail. It also means that when your phone runs out of battery, or you're somewhere with bad signal, or you need to turn on airplane mode, email can't be delivered to you and will time out on the sender's system.

doublelayer Silver badge

Re: Sorry, unless you ruin your own email server (somewhere)

And you can have your own domain and still use Google for it with GSuite if you want. It does appear more professional to do so than to ask people to send what might be sensitive personal or financial information to mybusiness@gmail.com. They might be running things equally unprofessionally behind the domain name, so connecting a more normal mail system doesn't prove anything, but operating from a single, likely shared, inbox does prove a few gaps exist.

doublelayer Silver badge

Re: Thunderbird for the win

Where does this conflation of IMAP with Google or with this decision come from? IMAP does none of what you're accusing it of. It exists to make email work more normally when more than one device is in use so that they share state. It's entirely unrelated to this. Google could import mail with IMAP too, but evidently they chose not to write that and dropped the feature rather than change it, meaning they're not even using the protocol you're accusing them of abusing, at least not in the place this article is talking about.

doublelayer Silver badge

Re: IMAP4 Was Originally Published in Dec 1994.

IMAP does download email from the server and let you control it. You have the choice not to and only use the server's copy, but no mail client does that by default. Your argument works against webmail, but not against IMAP.

doublelayer Silver badge

Re: IMAP4 Was Originally Published in Dec 1994.

Your argument can never end. They think the example in the article was a niche use case with few users; what in the article indicates otherwise, given that that's an opinion? The only thing likely to convince either of you that the problem is bigger or smaller than expected is statistics on user counts and purposes, and the article indeed lacks those, because Google is the only party that could provide the former and hasn't done so. Until then, you'll end up arguing over the one anecdote from the article, or your own assumptions, to try to decide how important this feature was to those who used it.

doublelayer Silver badge

Re: POP3S

This is Gmail connecting to other people's servers, not running the service itself. If the other side doesn't operate POP3S, then Gmail can't use it. I'm not sure what the people using this feature are connecting to, but older services still recommending POP3 probably never set up a POP3S path. However we judge Google's decision, we can't propose something as a solution if Google couldn't use it in practice.

Baby's got clack: HP pushes PC-in-a-keyboard for businesses with hot desks

doublelayer Silver badge

This thing isn't a thin client or optimized for that. The lowest-spec CPU you can get in it is an AMD Ryzen 5 330 (4 cores, base 2.0 GHz, up to 4.5 GHz). That's overkill for a thin client, a job tiny and cheap Celeron boxes do just fine, and it will also cost more than those do. I'm not entirely sure what this is for, though, because unless leaving out the screen drops the price significantly, it seems about the same as a laptop but more inconvenient.

Your smart TV is watching you and nobody's stopping it

doublelayer Silver badge

Re: Look at the bigger picture

Their fear is that some ISP, not necessarily yours, will share access to your connection (if you've got their equipment) or your neighbors'. Not your internal networks, but the ability to send traffic regardless. Basically, Amazon Sidewalk for anyone with ISP equipment, not just those with Amazon IoT products. Short of living on enough land that you can't see your neighbors' WiFi, there's not a lot that could be done against such a thing.

doublelayer Silver badge

Re: Look at the bigger picture

What you suggested is not too hard, and not really worth doing. What Recluse suggested is much harder but can theoretically help. Your approach works at the device level, blocking entire devices from network access, and there's a much more user-friendly way to do that: don't give the device connection credentials or cables. That accomplishes almost the same goals and doesn't require users to touch their firewall configuration. Unless the device can usefully do something on the internal network, the results are identical.

Recluse's suggestion involved allowing devices to access the internet but only allowing certain addresses as destinations. That lets you connect a device to the thing you do want without allowing it to do more. It also means a frequent need to allowlist yet another address, especially for systems that handle video, which usually rely on a lot of CDN nodes with different addresses. That isn't easy for the untrained user, but it's not easy for the trained one either when there's no documentation on what each address is and why the device wants to talk to it. The difference, knowing how to manage firewalls and what networking jargon means, turns out to be a relatively small part of the problem.
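Destination allowlisting can be sketched in miniature. The address ranges below are documentation-reserved examples, not anything a real TV actually talks to; the point is that every new CDN node the device contacts needs another entry.

```python
# Sketch of destination allowlisting: the device may reach only approved
# address ranges. The ranges are illustrative (RFC 5737 documentation
# blocks), standing in for a vendor's update server and a CDN block.
import ipaddress

ALLOWED = [
    ipaddress.ip_network("203.0.113.0/24"),   # hypothetical update server
    ipaddress.ip_network("198.51.100.0/24"),  # hypothetical CDN block
]

def destination_allowed(dest: str) -> bool:
    """Return True if dest falls inside any allowlisted range."""
    addr = ipaddress.ip_address(dest)
    return any(addr in net for net in ALLOWED)

print(destination_allowed("203.0.113.7"))   # True: inside the allowlist
print(destination_allowed("192.0.2.55"))    # False: new CDN node, blocked
```

The second lookup failing is the maintenance burden in a nutshell: a video service rotating to a new CDN range silently breaks until someone works out which unexplained address to add.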

doublelayer Silver badge

Re: Not here

That's not a high-bandwidth system, but in theory you could implement a communication path that way. To manage it, though, you need the device sending video content to do most of the work: implement a scheme to separate the hidden data channel from what the signal is supposed to carry, accept data sent over that connection, send it on to the destination chosen by the screen, and not tell people any of that is happening. Since each manufacturer is doing this on its own, and something like that might be the most convincing way to get regulators to act, the manufacturers can't just implement a standard back channel and all use it. So I think you can be relatively confident that that is not happening. Whether your HDMI source is trustworthy depends on what it does itself, not on what the TV sends to it.

Techie turned the tables on office bullies with remote access rumble

doublelayer Silver badge

Re: not a time for sneaky revenge

This is true, but analyze it a little further. Why can't the person being bullied do what you're recommending themselves? There might be a good reason, and that good reason might apply to the bystander too. For that matter, the correct response to bullying depends a lot on its form and topic, because, especially in a workplace, someone who puts their mind to it can bully a person in a way HR won't object to, in which case threatening them with HR is not a productive response. By all means encourage people to take action when they can, but don't automatically assume that someone who doesn't take the action you recommend fails to do so because they can't be bothered; the situation might be more complex than you know.

doublelayer Silver badge

Re: Neat trick

There is no need for you to continue reading it, but you may be overestimating the murder rate. There are many episodes without any murders in them, to the point of complaints in the comments from people who want the rate to be higher. Comments about the BOFH often show a lack of originality, with their insistence on killings and basically only two accepted methods, but the articles themselves show more variation.

IPv6 just turned 30 and still hasn’t taken over the world, but don't call it a failure

doublelayer Silver badge

Re: The real reason nobody wants to use it

IPv6 would have made that so much worse. It would have looked something like this:

fd10::1 router

fd10::2 PC

fd10::3 and 4 NAS drives

If you wanted them all to have public IPs, then the addresses would get a bit longer to include your public prefix instead of fd10, just as, if you had your own IPv4 block, they would be a little longer than 10.0.0. Your local host parts could still be 1, 2, 3. You didn't have to choose fd10; that's just an arbitrary pick from fd00::/8, the half of the unique local range that's defined for private networks.
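Python's ipaddress module can generate that layout directly; here's a sketch. The addresses come from the fc00::/7 unique local range, with fd10::/64 as an arbitrary example prefix, and 2001:db8::/32 is the documentation-only prefix standing in for a real public allocation.

```python
# Private IPv6 addressing sketch: same host numbers, different prefixes.
# fd10::/64 is an arbitrary unique-local example prefix; 2001:db8:1234::/64
# is a documentation prefix standing in for a real public one.
import ipaddress

lan = ipaddress.IPv6Network("fd10::/64")
router, pc, nas1, nas2 = (lan[i] for i in range(1, 5))

print(router)  # fd10::1
print(pc)      # fd10::2
print(nas1)    # fd10::3
print(nas2)    # fd10::4

# Swapping in a globally routable prefix changes only the prefix part:
public = ipaddress.IPv6Network("2001:db8:1234::/64")
print(public[1])  # 2001:db8:1234::1
```

The host numbering stays 1 through 4 either way, which is the point: the readable part of the address is whatever you make it.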

doublelayer Silver badge

Re: The real reason nobody wants to use it

If that is what they're saying, they've got it wrong. If you used an ISP-assigned block of addresses, you wouldn't keep them when switching ISP, so you would need a plan for doing something about the change in addresses. If you had your own block and just let the ISP announce it, then you could take it to your new ISP with some downtime while they sort out the announcement process, but probably not too much if you overlap ISP contracts. And you can do the same thing with an IPv6 block you've allocated directly. Big companies tend to do that. Small networks tend not to bother. Both should be completely expected by any competent network admin.

doublelayer Silver badge

Re: The real reason nobody wants to use it

The implementation work (getting equipment and software to handle it, getting addresses assigned, getting international routing stable) is all the same. That's the work that's noticeably hard and expensive, whereas the difficulty of reading out a string of digits is much smaller, if it's relevant at all. What part, other than your memorization or reading, do you think would have been easier with 40-bit instead of 128-bit addresses?

Cybersecurity pros admit to moonlighting as ransomware scum

doublelayer Silver badge

Re: Motives?

Or, just maybe, there was greed and stupidity in the opposite direction, from people hoping to get a large ransom and thinking they could manage it by disabling the protections they were in charge of. In fact, that's not a maybe; that's a definite. The only uncertain part is whether your assumptions have any merit at all. There's plenty of negligence around in security, but we already know that the victims in this case employed people and paid them to manage it, which puts them above the many negligent companies that decide having security staff at all is more than they're willing to pay for.

The most durable tech is boring, old, and everywhere

doublelayer Silver badge

Re: In the raw

I'm guessing you use one of a small number of languages with the simplest character encoding available, because when I get plain text files written in something that doesn't use the Latin alphabet, or even one that does but has plenty of diacritics, there are all sorts of fun character encoding games to play. Of course, you might have meant UTF-8 plain text files, which is the closest thing to a standard, but you didn't say so, and you still have to play the games sometimes when someone hasn't understood, or checked, that they encoded it correctly.

You presumably also don't have images, diagrams, meaningful formatting or layout, navigation, or the many other things that plain text doesn't provide. If you are using a text-based format like LaTeX or markdown, then you're not using plain text because you're dependent on all the software used to translate that to the form unfamiliar users will be using. As long as we limit ourselves to the two or three things that text can do, it's great, but don't be surprised that the rest of the world isn't following suit. PDF introduces lots of new problems, but it does fix some of text's.

AI faces closing time at the cash buffet

doublelayer Silver badge

Re: digital transformation!!!!

I won't claim the term is good or defend those who use it, but there are more levels to what can be considered digital. If a computer is involved but doesn't do the work, a process can be considered not digital. For example, if the process for doing something is that I send person A an email, they do manual tasks, and they send me an email back, then the computers are nothing more than a faster memo distributor. Many businesses still have things like this around even though they could be automated so that people don't have to do mechanical processes on data. There are at least more businesses with things like that than there are businesses trying to do it all on paper.

doublelayer Silver badge

Re: Gold rush

But after the frontend has handled the clicks, something still needs to get the information and process it, and Java is still often used for that. Certainly not universally, but the fact that something has a web UI tells you nothing about which language does most of the work; that will be whatever the team that built it wanted, and if that team came from one of the many companies that have been building Java-based backends for business applications for twenty years, they're often not changing.

doublelayer Silver badge

Re: Pointless

In each of your claimed cases, I see the chatbots happily lying. Does the product have feature X? The description should tell you for sure, but the chatbot can easily answer yes when the answer is no. When you complain that the product doesn't have the feature, the seller can point out that the description never said it did, which is exactly why the LLM had no information and randomly guessed yes. If you care about products, read the descriptions. They are not very long. Manuals are slightly better but mostly similar.

Terms and conditions are painful to read, but getting a summary of them which might leave out the parts that are important isn't helping you. If you don't care, you have the option that many take: don't read them at all and just agree. I don't recommend it, but I understand why people do it.

China wants to ban making yourself into an AI to keep aged relatives company

doublelayer Silver badge

Re: Great Danes

In my experience, the dog's thought process was probably much closer to "if I go over there, will that guy play with me? Oh, probably not, they're walking fast*".

* Some dogs, especially young retrievers, haven't learned the fast walk thing and assume that, if you're running, then there must be something fun to run after, so they will run too. If there isn't something fun to run after, then maybe you're the something fun and you want to play the chase each other around in circles game. They eventually learn what an uninterested person looks like, though.

Tis the season when tech leaders rub their crystal balls

doublelayer Silver badge

The last line seems optimistic. Companies have been able to differentiate themselves on good customer service for some time, though the bad service they'd be differentiating from changes with LLM options. When they do, customers who interact with them tend to notice. However, they often can't sell themselves on that, because prospective buyers don't know what the quality will be, apart from a few notorious cases. The closest I've seen is companies announcing that their customer service agents are local to the country, which in my experience guarantees nothing, because it's the quality of their knowledge and procedures, rather than their time zone or accent, that makes the real difference. If it hasn't been a big selling point so far, why should we expect that companies willing to avoid the LLM craze will succeed in selling that to prospective customers?

IT team forced to camp in the office for days after Y2K bug found in boss's side project

doublelayer Silver badge

Anything is possible if we just assume it, but if there was a bug in something else, that should have been in the article. Explaining its absence from the article requires a lot more assumptions: that the bug was known in advance by somebody, but not by the people employed to work on Y2K problems; that it was not fixed, mitigated (say, by terminating the affected application before midnight), or hidden; that it only caused a problem at midnight and did not keep crashing afterward; and that it eventually got fixed by someone who didn't tell the IT people about it. That's a lot of things to believe and no evidence to support them.

I think Occam's razor applies here. I can also paint a picture of how this could have been caused by a screensaver. I can't prove it any more than you can prove your story, but since even the reporter doesn't have all the information, we can go with it. The screensaver wasn't intended to count through zero and into the negatives. It was supposed to do something special when the timer reached zero, such as displaying an image telling IT staff to start looking for problems, but when handling that special case, it did something with RAM it shouldn't have, on what was likely a Windows 9x desktop, where a screensaver ran with enough permissions to take down everything. The check was only looking for remainingTime == 0, not <= 0, so it didn't trigger again.
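The guessed failure mode can be sketched in a few lines. Everything here is invented for illustration (the tick function, the crash); the point is only that an == 0 check fires exactly once, where <= 0 would keep firing as the counter goes negative.

```python
# Sketch of the hypothesized screensaver bug: a countdown with a
# special case at exactly zero. The crash is invented for illustration.
def tick(remaining_time):
    """Advance the countdown by one second; zero is 'special'."""
    if remaining_time == 0:          # fires exactly once; <= 0 would repeat
        raise RuntimeError("special-case handler ran (and, here, crashed)")
    return remaining_time - 1

t = 3
while True:
    try:
        t = tick(t)
    except RuntimeError:
        break                        # one crash at zero, then nothing
print("stopped at", t)               # prints: stopped at 0
```

Once past zero, tick(-1) happily returns -2 and never raises again, which would fit a one-time midnight crash with no repeats.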

doublelayer Silver badge

It would be correctly functioning if it had just counted down through zero into the negatives, but that's not what happened. I'm not sure what did happen, but it evidently crashed at zero, which is neither correct nor necessary behavior. I keep assuming there was a divide by zero in it somewhere, except I can't find a reason why it would need to divide anything by the remaining time, and that should only have crashed the application, not the system. So while you're right that it wasn't a Y2K bug, it was definitely not correctly functioning.

Death, torture, and amputation: How cybercrime shook the world in 2025

doublelayer Silver badge

Re: Force of nature

We separate intent and opportunity all the time, and rightly so. Your examples are flawed because, in them, there is no malice anywhere, and therefore the only thing we ever consider is whether there was negligence. Even then, they're often wrong, because there are many cases where we either should or do (correctly or not) decide that there was no negligence: the harm, even though it was preventable, was not the result of a significant enough failure to act that we're going to hold someone criminally responsible.

In the case of cybercrime, there is malice and it needs to be considered. That is what we would do in a non-computery context as well. Imagine that a factory has a gate they should lock but didn't. A terrorist uses that gate to bring a bomb close to the building, then detonates it, killing people inside the building who would have been less injured if it had gone off outside the gate. We would often pay a lot more attention to the guy with the bomb than the guy with the key to the gate. Your statement would be equivalent to "This reads like a catalogue of tragedies caused by 'terrorism', but the real culprit is front-line security guards, not magic bombers." and I don't think you'd find many who agree.

You're trying to state a valid, if somewhat obvious, point about mismanagement, but you're doing so in a very wrong way when you use phrases like "magic hackers" and "real culprit". Those downplay the most culpable people. I'm pretty sure you don't want to do that, and all I can say is that you should work on your phrasing, because you keep doing it, sometimes in even worse contexts, to blame the only group you seem to care about. If you want to convince people of anything, side arguments spent clarifying or correcting exaggerations you don't agree with, or at least won't defend when called out, just delay and weaken that attempt.

Keeping Windows and macOS alive past their sell-by date

doublelayer Silver badge

Re: OS X / macOS: Download from Apple directly

What definition are you using for "clean install"? Most of us appear to be using the one that means the thing you installed four years ago and forgot about gets wiped out. That can help when people have had a machine for twelve years and have a stack of third-party drivers, software that starts at login, license checkers, and settings they probably should update but don't know about. The more careful you are about what you install and change, the less benefit you get from that, but this article is clearly directed at those with less technical skill, who, in my experience, often have a few things they forgot about which aren't helping their system's speed or stability.

doublelayer Silver badge

It's nothing like what you describe. Your version suggests it's mandatory or government-supported, when any tax, or lack of one, comes down to private agreements between Microsoft and computer manufacturers with no external legal backing, subject to anti-competition law if they go too far while regulators are looking (admittedly not often, but when regulators do want to start looking, I think there are bigger problems in tech for them to start with).

The old version was that OEMs could license every machine with a discount or just the machines they wanted to at full price. As long as most customers wanted Windows and intended to get a license with their hardware, the first option was cheaper for the OEM. Things are likely different now but probably go along similar lines. You can dislike it, and you can even raise a regulatory complaint or, where supported by your local laws, file suit about this, but not until you know what's happening and what's not.

doublelayer Silver badge

Re: OS X / macOS: Download from Apple directly

Because this discussion is entirely about getting a clean reinstall of Mac OS, and going through software update from Mac OS will upgrade it in place.

doublelayer Silver badge

Re: Another option

I wasn't one of them, but maybe it was the fact that the proposed solution to an old but likely serviceable computer was to spend money on a newer computer they don't need, and to segment all their stuff across two different machines when they could avoid doing so; the one accurate part was the long list at the end of reasons why this would be harmful. There can be more reasons to oppose something than alleging that it simply wouldn't work. Something that works but is negative on most or all of the things it's supposed to help with isn't a very good proposal.

Their old computer can likely run a browser for free, no Chromebook required. The things they want to do outside the browser will determine whether the best approach is to stick with the original operating system but harden it or replace it with something likely Linux-based, but neither would involve an extra expense. If they are going to spend money, depending on how much they want to do, they could likely get a better newer computer to run everything on rather than getting something cheap which will only last a little bit longer.

'PromptQuest' is the worst game of 2025. You play it when trying to make chatbots work

doublelayer Silver badge

Re: No pseudo-thinkers here, mate

And I think you're throwing your time away.

"Had a very polite conversation with ChatGPT yesterday, where it returned a load of nonsense on a topic I know well, so we "discussed" how its algorithms were leading its output astray and it had got itself in this mess, and what the right answer actually was. Took about five minutes. Imagine trying to track that down with Google"

I can, in fact, imagine that. Instead of getting useless answers because they are incorrect on something I already know, I could already know it. Total time taken: 0. The problem comes when I don't already know the answer. It took you little time to recognize that you were receiving nonsense because you knew about it, but if you didn't know about it, you would spend much longer figuring that out before you could resume the useless conversation. A search might be annoying, but there's less completely invented data to lead me astray so the biggest likely annoyance will be not finding what I'm looking for rather than being given something that looks like what I'm looking for but is completely wrong.

And the rest of that was also wasted. You can instruct a chatbot like a student or trainee. The difference is that a student or trainee will remember and learn if they're any good, whereas the chatbot is incapable of doing that. All your careful instruction was deleted the moment the session ended. Ask your question again and it will likely give you wrong answers again, because it does not remember. Its praise for what you did is not helpful either, because even if it could learn, it's effectively telling you that instead of getting correct information from it, your job is to teach it the information you already know whenever it gives you wrong answers.

From video games to cyber defense: If you don't think like a hacker, you won't win

doublelayer Silver badge

That's a mostly useless pedantic statement since many things people want done can't be done without an internet connection, so I'll raise you more pedantry. If someone thinks disconnecting from the internet will protect them, they don't know what they're talking about. Things were hacked before public networks and things are hacked without public networks. Sometimes, it's a big thing like Stuxnet. Sometimes, it's local pranksters who figured out a hole. Someone who uses "air gap that" without thinking is as useless as someone who thinks a lock on the front door handles it.

You don't need Linux to run free and open source software

doublelayer Silver badge

You can update the OS in place. That's been possible for quite a while and it has substantially improved. It can go wrong, as can anything else you try, but it works very frequently. The reason that's not being suggested is that the article and its predecessor were aimed at the less technical user of an old machine who might benefit from the cleaning that a complete reinstall does.

That's also what backups are for. You can often back up everything from an application so it reinstalls with little effort if you're willing to first go to the effort of setting up that backup regime. The major exception is anything with a license check system tied to a hardware identifier, which is much more annoying. In that case or if the preparation is undesired, then you can save installers and re-execute them.

doublelayer Silver badge

Re: Amazingly title happens to be correct;

To stop you from going a bit too far, you should acquaint yourself with the dictionary for this poster, including:

proprietary adj. Any license poster Eric 9001, formerly known as GNU Enjoyer*, doesn't happen to like, namely pretty much all of them.

Synonyms: open, open source, FOSS, GPL [without specifying a version], pretty much any adjective other than "free", which will be insistently interpreted as a license that poster does like and, if it's not, they'll fight you without any good reasons.

* Of course I can't prove they're one and the same without access to the user database, but I am quite confident and believe I could convince others if it were necessary.

doublelayer Silver badge

"Or, perhaps, they're primarily making it for themselves, and making it available for others, but aren't concerned with mass-adoption."

I think this is a lot of it. Partially I think that because it describes many of the things I've developed and released under open licenses, but I don't think this is all me projecting my own flaws onto everybody. A lot of UI work is a boring slog and isn't necessary if you assume that people will figure the program out the same way I did when I wrote it. Building something which can be given to the general public without training is hard work that doesn't benefit the programmers much, so it often does not happen. That leaves you with the luck of the draw on whether the programmer's initial idea works well or not.

Memory is running out, and so are excuses for software bloat

doublelayer Silver badge

Re: Memory <> Datacentres

True, but it will work in many servers, so anyone wanting to buy server memory can buy it from those guys, and anyone choosing what kind of RAM to make with their factory this month would do well to choose consumer, dropping prices for both groups. There's a bit of complexity since some of this is specifically intended for GPUs or other areas where not even normal server memory is involved, and that would likely slow the return of prices to normal, but even if everyone who bought memory decided to shred it rather than sell it, the prices would still come back to normal as long as they weren't continuously buying more.

doublelayer Silver badge

Re: "long shaken their heads at the profligate ways of modern engineering"

You're correct about many of the specifics, but wrong about your admonition:

"the environmental disaster can be attributed to mismanagement, and lack of care when disposing of waste, but you make it sound like it was a deliberate act."

No, this entire comment thread has been about mismanagement and laziness and the consequences thereof, compared with what people see, correctly or incorrectly, as exactly that among software writers. Programmers generally aren't trying to make their code use lots of RAM for no reason. Those who use too much are doing so because it's easier, they don't know how to do it better, or they don't understand that there are consequences when they waste it. Similarly, laziness with what you do with hazardous waste can have consequences, and people sometimes ignore them because it's easier not to, without intending to cause a disaster. In all their examples, they were saying that there are things which could have been designed or maintained better but were not, due to laziness, either to make the case that software isn't special and shouldn't be singled out, or to blame programmers as members of a wider group. They did not say, and clearly do not believe, that this is a malicious or pre-planned deficiency.

doublelayer Silver badge

Re: "long shaken their heads at the profligate ways of modern engineering"

But, if the data is stored in string form for presentation so it doesn't have to be calculated for each row using the similarly limited CPU cycles, it does save disk space and RAM during processing. Another version of how it worked did use one byte to store the year number but a basic string processing system which would concatenate "19" to it, resulting in a presented value of "19100", which if properly parsed as an integer for calculations on something else would have made 2000 into 19100.
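That concatenation failure takes only a couple of lines to reproduce. This is an illustrative sketch in Python, not the original code (which would have been COBOL or similar); the `format_year` helper name is my own invention:

```python
# Sketch of the classic "19100" display bug: the year is stored as a
# count of years since 1900 and naively prefixed with the literal "19".
def format_year(years_since_1900: int) -> str:
    return "19" + str(years_since_1900)

print(format_year(75))   # "1975" -- fine for the code's expected lifetime
print(format_year(100))  # "19100" -- what the year 2000 becomes
```

The string looks plausible right up until the counter passes 99, at which point the padding assumption silently breaks.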

Those little differences were minimal for individual records but quite a bit more severe when you had a large set of them, a 5 MB hard drive to store them and whatever data was attached to the dates, and a single CPU shared by lots of software. The people who wrote it almost certainly knew this wouldn't work in 2000, but they considered their resource limits and assumed that, by 1995, nobody was going to be running this 20-year-old code or, if they were, resources would be less limited and it could be improved, which it was, but at a much higher cost than they anticipated. And in some cases, they should have done it differently from the start anyway. Modern programmers mostly don't have any chance of justifying cutting corners for a bit of RAM efficiency, and those who play memory golf without being very certain it's a good idea can make things worse while still feeling superior.

doublelayer Silver badge

It certainly won't happen overnight, but if they build a new plant to make the fastest memory, then the existing plants that are currently making that can go back to making slower and cheaper commodity stuff. There is some new construction on the way already, and if demand remained this strong for quite a long time, manufacturers would do something about it. Also, demand probably won't stay this high for long because this level of demand is expensive for those doing the demanding. There is a gap between "it's going to be fixed next week" and "it won't grow infinitely". We are in that gap. We will have to deal with high prices and inability to get what we want at a moment's notice, but we don't have to plan for a time, next year or ever, when everyone's got a 1 GB dumb terminal because RAM is too expensive to have anything else.

doublelayer Silver badge

Re: Too True

I suppose it depends on what kind of script you were doing that with, because there's some chance that it had less effect than you thought. For example, if it was disk space you were worried about, then the likely change was zero because the old version and the new version would fit into the same two 4 KB filesystem sectors that you were likely using. But maybe these were JavaScript libraries and you were worried about bandwidth, except that traffic was usually compressed at least with gzip at the time, and gzip can and would have compressed that kind of thing significantly already. Also, if that was it, you would have gotten more effect by running a minifier against the JS before making the production version, which would have removed far more bytes than that. You could also have saved some time by writing a quick script to remove trailing spaces from every line of a file rather than doing it manually.
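The sort of quick script meant in that last suggestion is only a few lines. A minimal Python sketch, with the helper name and file path being my own illustration:

```python
# Strip trailing whitespace (spaces, tabs) from every line of a file,
# rewriting it in place. rstrip() also drops the newline, so lines are
# rejoined with "\n" and a final newline is restored at the end.
def strip_trailing_whitespace(path: str) -> None:
    with open(path) as f:
        lines = [line.rstrip() for line in f]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

Pointed at a whole source tree with `os.walk` or a shell loop, it does in seconds what manual editing does in hours, which is rather the point about where the time actually went.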

It's a fine way to deal with boredom, but I wouldn't jump to the conclusion that it helped anyone or even everyone in aggregate. Similarly, when people claim they're working efficiently, it's possible they're playing a puzzle game about whether they can get something to work with less, a game which is fun (and when it stops being fun people stop trying), but it only helps people some of the time.

doublelayer Silver badge

If it drags on long enough, more places will manufacture RAM. There are new companies that are in that market, with both Chinese and Indian manufacturers eager to spread out the market a little. Existing manufacturers can also increase production eventually. If the bubble doesn't break on its own, they will eventually make more because they're otherwise leaving money on the table. And, to get that far, the people buying huge amounts of it will have to continue doing so consistently even though it doesn't fail that quickly and they have limited funds. Until then, some computers will be more expensive, so people will use the ones they have for longer. This problem isn't going to grow infinitely any more than the AI companies can.

doublelayer Silver badge

Re: "long shaken their heads at the profligate ways of modern engineering"

So you agree with your first sentence then go on to perform exactly the conflation I warned against. That assumption is often wrong in many ways. For example, we happily take credit, or at least many posters here do and I happily back them up, whenever the Y2K problem is mentioned. The prevailing description of it as a non-problem that didn't deserve the attention it got is correctly countered with the fact that it was solved through significant and widespread effort. But that was only necessary because of a quest for efficiency, and all the expensive problems avoided through prudent but still expensive work in the 1990s were first created by people shaving a few bytes off RAM use. The programmers who did that in the first place have the sometimes reasonable excuse that bytes of RAM were very limited when they wrote their version. People doing similar things today don't have that excuse. If you do things like that, you are making a bad program in an effort to demonstrate your meaningless savings.

Knowing how to use assembly did not and does not correlate with knowing enough about many other aspects of good programming. Good programmers learn both. Bad programmers sometimes learn neither. But sometimes bad programmers learn the assembly part and sometimes they use it without having gained the rest, and they have been doing this for decades. The reasons for the many changes we have seen should be familiar to any good programmer of the time; it's balancing resources. In their case, it could easily be limited CPU cycles versus limited bytes of RAM, and of course we still handle that tradeoff today. However, another important one is RAM versus programmer time, and programmer time is often the more expensive nowadays. If you write a program that uses up 80 MB of RAM in two hours, and I write a program that produces identical results and runs in 8 MB but takes me two weeks, it's very likely that yours will be better for the user. Either they got their program faster, you got more time to test it and fix bugs, or you can add more features than I can. Meanwhile, 72 MB of RAM on a desktop is easily available. If it's an embedded device or there's another reason why the RAM can't be afforded, then that might change.

doublelayer Silver badge

Re: "long shaken their heads at the profligate ways of modern engineering"

To extend that, one of the other consequences of being forced to fit software into very small amounts of memory was and still is that corners get cut. Lots of good practices use a little more RAM and make sure that it's not going to get an answer wrong, if it fails there's something it can recover from, or that it checks security criteria every time rather than having a guess or cache which can become a gaping security hole. Efficiency is not the only good part of engineering standards, nor does managing efficiency prove or suggest that people have been sticking to other good practices. Unfortunately, a lot of people who like complaining or proclaiming themselves superior frequently conflate them.

New boss was bad, his attitude was ugly, so the tech team pranked him good

doublelayer Silver badge

Re: Smut

That's a loop. I already started that clause with "people smart enough to install the planted material without leaving some traces". The opposite is also true: the people who think planting false evidence against a coworker they don't like is a good idea are likely not smart enough to avoid getting caught. It's a lot like the people who decide to sabotage their former employers. Most people smart enough to do that without ending up in prison realize that the temporary pleasure of hurting the people who annoyed you isn't worth the risks, hence all the articles in El Reg about people who chose to do the damage and don't have a clue why they were identified and charged within days of doing their worst.