* Posts by Lee D

3668 posts • joined 14 Feb 2013

Elecrow CrowPi2: Neat way to get your boffins-to-be hooked on Linux from an early age and tinkering in no time

Lee D Silver badge

Re: "The kiddiwonks won't even know they're learning"

No different to telling kids veggies are horrible but they can have something nice if they eat them.

If you don't let that kind of nonsense get to their ears, then they just eat whatever they like.

My daughter has been addicted to peas since a young age. She hates fizzy drinks. She doesn't eat much chocolate. She'd rather have a slice of bread or a sandwich.

After initial world exposure and instinct, kids learn their disgust reactions from their parents (along with their racism, sexism, ableism, etc.). Don't expose them to that, and they just grow up "normal" (which nowadays means not like everyone else). This also works the same for many things - bursting into tears because they had a slight graze or a spot of mud on their hand. If you fuss over them, they'll cry every time and grow up as one of those people cleaning their hands every two seconds with harsh chemicals.

And - an oddball one, but experimentation with my own has borne this out - if you tell kids that they'll feel sick if they read in the car... they will feel sick if they read in the car. If you don't tell them that, they won't. That one is open to more variables (e.g. travel sickness anyway, etc.) but I believe it to be the same.

My kid is 11 now. She goes to a foreign private school. She makes friends everywhere because I've tried not to pass my social awkwardness on to her. She eats whatever she likes, and chooses of her own accord to eat sensible foods in moderation and enjoys doing so (she still likes a McD's, but who doesn't?). If you try to bribe her into a van with a chocolate bar, you're going to have a hell of a time... not least because, despite her wimpy appearance (and my "complete victim" childhood), she's a junior black belt, but also because she couldn't care less about chocolate.

The ONE neurosis she has, which is completely artificial and not based on anything - a fear of dogs. And I can tell you EXACTLY where that comes from. My own mother, who will pull you right out of the way if a dog goes past, because SHE has a neurotic fear of dogs too. It's a complete phobia based on zero data - I know, because neither my daughter nor I have ever been bitten, yet we both have a fear of dogs because my mother did the same to both of us whenever we were out walking and a dog came past. I got over that, though, by deliberately fighting the instinct to see dogs as a threat. They still make me react that way, but my brain overrules it consciously a microsecond later.

If you teach kids that learning is boring, school is hard (particularly bloody Maths, I have to say, as a mathematician), brussels sprouts are yucky, chocolate is a reward, girls are girly, boys are "just being boys" when they assault girls, that gay people are "weird" or "wrong", mother cuddles you every single time you get the smallest hint of an ouchie, etc. then guess how they grow up to treat those things. And guess what they teach their children.

My daughter literally asked for more lessons as she was bored in lockdown. She not only reads far beyond her age but she writes books and encourages her (author) grandfather on the direction of how his own (adult-reading-age) published books should go and edits them for him. She grew up in an Essex state school and now attends a Spanish private school on her own merit. She's as skinny as a rake, and as fit as a fiddle.

The buck stops at you as a parent. Don't pass your prejudices and neuroses onto your - or other - children.

Safety driver at the wheel of self-driving Uber car that killed a pedestrian is charged with negligent homicide

Lee D Silver badge

Re: I've seen the video many times

(shrug) Doesn't matter.

"I hit the guy because I was on my phone but there was no way I could have reacted in time anyway" isn't a defence. It's a confession.

Fact is, she didn't even know what was happening on the road because she was looking at her phone. Even a microsecond of actual braking would have absolved her... just looking out the front screen, she could have argued that she did what she could but just didn't see the pedestrian - happens a thousand times a day around the world.

But when you're looking at your phone... you're just not driving. That's it. That's how the law sees it. Because otherwise "looking at your phone" becomes normalised, and you become immune whenever anything happens. That's NOT what you want.

It's almost two separate offences. Driving without due care and attention is convictable whether or not you struck a person or had any kind of "accident" (or, in 99.9% of cases, a "deliberate through negligence"). And hitting or killing a pedestrian isn't - in itself - necessarily a convictable offence. She's being charged over the first, really; it just happened to lead to the second.

All she had to do was not look at her phone. Like all the idiots on the road have to do is not look at their phones. Or the cyclist this morning whom I nearly took out. On a blind bend, he decided to text on the phone in his hands, so he strayed in a straight line instead of following the bend and ended up crossing the centre line. The first I knew, as a driver approaching around a blind bend, was a cyclist not looking, hands off the handlebars, on the wrong side of the road, with only yards of warning (multiple blind bends in quick succession).

I have zero sympathy for someone who is on their phone while driving. Automated system or not. If the system didn't need you, you wouldn't be in the driver's seat; you could be on the back seat having a nap and a WhatsApp. You're in the driver's seat, with brake pedals under your feet, for a reason. Put the damn phone away from the second you move off until you reach your destination. There really is no excuse these days.

Lee D Silver badge

Re: However

Then don't take the job.

Never mind that you can run Meet on any old computer, Google unveils specialised hardware for vid-chat plat

Lee D Silver badge

Re: Obsolescence?

They were there.

Literally, Google Meet only survived the chopping block because of coronavirus. They were scheduled to shut it down just when it started, and had been telling everyone to move to other services for years. Corona hit and they reactivated and spent actual development time on the service for the first time in years.

Who knows how long that stay of execution will last, but they've already re-imposed all the meeting-size limits for G Suite for Education customers that they lifted to cope - unless you want to pay a stupendous annual fee - so if they don't make money on it, I imagine the end won't be far off.

Microsoft submits Linux kernel patches for a 'complete virtualization stack' with Linux and Hyper-V

Lee D Silver badge

Host in-house.

I say nothing that precludes hosting in-house. As many schools etc. do for their MIS, even if it's web-based.

The trouble is you read web-based and immediately think that you have to rent a service from a third-party with no control.

Lee D Silver badge

Ah, now you jumped from "We'll use a web service" to "We'll use a proprietary, external, contracted web service with a third party who has no migration facility".

That's a very different matter.

I don't know if you noticed, but desktop apps without Internet dependency are increasingly rare. So you're already tied in like that, the only difference is where the data resides.

And yes, I've done several similar things. And there's nothing stopping you getting data out of Google Apps. It's actually quite easy, made even easier if you're willing to pay money. The other way - I wouldn't know. Microsoft being awkward again?

Lee D Silver badge

Increasingly, I don't care what the underlying OS is.

If it's not do-able from a browser, it's pretty much not worth the effort to support the infrastructure to run it yourself.

If someone said "start up a company (of any size)" to me today, I wouldn't even consider non-web services, whether locally-hosted or not. The problem is once you go down the route of something tied to the OS or machine, you're never going to come back out of that. Start without any OS dependency and you can chop and change as you like.

Schools, especially, have gone from on-site, to managed services, to everything in the web/cloud... even their internal MIS systems. Thousands of users growing up with no more capability than a filtered, monitored, browser will allow them, while staff increasingly move all their stuff to online services anyway. The end-device hardly matters any more (I actually have more Chromebooks on site than Windows machines, and that's been the case for a while now).

It's 2020. Being tied to an OS because of some clunky bit of software that isn't cross-platform and wants to draw on your desktop? Very old hat.

And I work entirely in IT. I'm literally talking myself back from a techy, in-house, specialist, business-critical job to a "keyboard shuffler" where so long as people have a compliant browser, everything else is someone else's problem.

And dual-boot? Very old hat. Who cares what the underlying OS is; it should be able to run the "other OS" in a VM at pretty much the same speed. Our server infrastructure is entirely like that - I have 50% Linux and 50% Windows VMs on a Hyper-V setup - so why can't the clients? The servers do far more than the clients ever would, with far greater performance requirements.

It's about time we just moved up one layer from "That only works on Windows" or even "that's x86 only".

Tesco self-service separates innocent Reg reader from beer after collapsing into heap of Windows dialog boxes

Lee D Silver badge

Re: Not unusual in retail.

"It's the thing that tells the till how much to charge your customer for the goods that they've scanned, and you allow compromise of it."

Networked or not, if you don't see the problem with that, there's really no explaining it to you.

Russian hacker selling how-to vid on exploiting unsupported Magento installations to skim credit card details for $5,000

Lee D Silver badge

Rule #57 of deploying IT services:

If the cost/time/effort of doing an upgrade is greater than starting all over again, start all over again but this time with a product that supports upgrades better.

An upgrade should be just that - it shouldn't involve redevelopment of things. If going from v1 to v2 means everything you did on v1 is useless, then it's not v2. It's SomeOtherProduct v1.

Take your pick: 'Hack-proof' blockchain-powered padlock defeated by Bluetooth replay attack or 1kg lump hammer

Lee D Silver badge

If you want proper access control, install proper access control.

If you can't afford proper access control (which is often cheaper than this junk!) then you are buying snake-oil. No different to a fake burglar alarm, those devices that make it look like you have the TV on at home, or a blinking LED and a sticker in your car to try to convince people it's alarmed/tracked.

A padlock is not security - they're all easily bypassed in seconds. It's "casual" access control. Often the thing the padlock is on is less secure than the padlock itself (e.g. the door, the hasp, the chain, the gate, the fence, etc.).

Tying it into the Internet doesn't help anyone. Why you'd want to pass around access to a padlock via an app I can't really fathom - you need a better process if that's a requirement.

Spend the extra, put proper access control on it, and then you can open the gate / door / whatever with a remote-buzz/text direct to a wired access control system that will go mad if you tamper with it.

€65 retail for this apparently. And a piece of junk. Spend the extra and buy a cheap access control system with proper security and control.

Nvidia to acquire Arm for $40bn, promises to keep its licensing business alive

Lee D Silver badge

It is actually disappointing to see such heavy emphasis on AI in all their press releases.

I mean... ARM isn't exactly world-renowned for AI-first devices. They're a CPU designer.

I'm really hoping this "AI" junk dies off.

Unexpected risks of using Apple ID: 'Sign in with Apple' will be blocked for Epic Games

Lee D Silver badge

Re: Bad decision

I regard this the same as the Amazon Prime Video / Google Chromecast spat.

You guys sort it out. All I know is that I have Prime Video, and Chromecast hardware. I'm not going to buy a Firestick to stream my videos, nor am I going to rebuy them on Google Play Movies.

If you don't want to work together, fine. I'll literally choose one or find an alternative to you both and then even when you "fix" the "problem" (of your own making), then I'll not return to whatever ones I decided to turn off.

You play your petty tit-for-tat (i.e. you couldn't buy a Chromecast via Amazon for a while, I don't know if you still can?). As a customer, I'll adjust my purchasing accordingly - especially as regards your interoperability with other services that I use. And at least one, and maybe both, of you will lose out on my custom in the future.

I can 'proceed without you', judge tells Julian Assange after courtroom outburst

Lee D Silver badge

Re: Assange

"No, manning used authorized access to a system to exfiltrate information which assange published."

So he misused credentials to access classified materials and then used them in an unauthorised way by sending them to an unauthorised party without classified access.

He's also on record as having encouraged Manning to do that, deliberately and specifically.

So you're STILL saying he didn't conspire to break into a classified military system?

So tell me... if I told you to break into your company's salary database - and even allegedly offered techniques, assistance and further contacts to advise you - and then you sent it all to me, and I published it online, you think that's not classed as conspiracy for unauthorised computer access? That the act itself isn't unauthorised access or dissemination of privileged data? That you should go unpunished if it showed that the CEO lied about what he was paid? That neither of us could ever possibly go to jail?

And that's just "private" information. Classified is a whole other ball game.

Seriously, take five minutes and think about it. If I asked a guy at the company you're working at now to give me a copy of your browser history, salary and personnel record, and then I put it online and embarrassed you, you think we should both go unpunished? That it's not conspiracy on my part? And now scale that up to classified information (and please don't say "but it wasn't high-level" - it was classified. That's the end of it, in the eyes of the law and the military).

Lee D Silver badge

Re: Assange

So you're saying he didn't conspire to break into a classified military system?

Lee D Silver badge

Re: Assange

You mean that he didn't encourage people with access to a classified military system to access files they weren't authorised to, for purposes they weren't authorised for, and then hand him the data so that he could leak it to The Guardian?

I never understand quite what people think is going to happen in those circumstances because I can't fathom that it would be anything other than bad for all involved. Which is... well... Manning went to jail, Assange is wanted, and Snowden has to spend the rest of his life hiding in Russia because everybody else wants him.

The only reasoning I ever hear is "the end justifies the means", but the fact is that the means were still quite clearly criminal and the "victim" was a nation state. That's never going to end well.

It's like robbing a bank vault in order to "expose" that the bank was holding the funds of a tax evader or something. Sure you can say it was for "journalistic" (cough) reasons, but the fact is that you still robbed a bank.

And no court is going to just excuse one because of the other - they'll want to punish both for their individual and respective crimes. (P.S. I don't see any other nation state calling for the US to be brought before an international court for the things that were "revealed", and the evidence has been in the public domain for years now, so good luck with getting that done).

When Huawei leaves, the UK doesn't lead in 5G, says new report commissioned by... er... Huawei

Lee D Silver badge

I run my entire house from 4G.

BT are too stupid to sell me a decent speed ("probably 4Mbps down, 1 up" in the middle of a city centre inside the M25...). Nobody else has any facility independent of BT.

I would probably go to 5G if I knew there was a good boost in signal/speed. Otherwise, forget it. As it is, I'm just artificially limited on 4G... unless the limit on 4G is 30Mbps exactly, despite the fact that I live opposite the pole that supplies it and have a 4G antenna on my window facing it.

But 4G is enough to run my entire house and all the gadgets, and even stream HDTV from the aerial to my 4G phone. And I have an unlimited data package with a stupendous "fair use" allowance before they start slowing me down (1,000GB/month). And it costs me £18 a month, plus a little £50 wifi router that's the size of a bar of soap and fits in my pocket should I decide to go somewhere (it tends to stay at home now, connected to the antenna, because then I can stream from that to my phone wherever I am).

5G isn't going to sell - especially as the dongles/routers are still relatively rare and expensive - until you can provide a service capable of utilising them. There's no point me buying a 1000mph car if every road has a 30mph speed limit, except to show off.

Strange how they can provide service to the 4G mast that they will upgrade to 5G soon enough to cope with thousands of users doing what I do, but they can't just upgrade the line that's still sitting in my porch, unused, because they charge a £160 activation fee, £25 a month, on a 24-month contract, to give me more than 4 down, 1 up.

Adobe Illustrator's open source rival Inkscape delivers v1.0.1 - with experimental Scribus PDF export

Lee D Silver badge

Re: I need some pointers

I call that "The GIMP factor".

I still just dig out Paint Shop Pro 7 whenever I actually need to just do something.

An interface shouldn't require a god-damn training week in order to do simple things like import a couple of images and layer them over each other, and then save them out.

Paragon 'optimistic' that its NTFS driver will be accepted into the Linux Kernel

Lee D Silver badge

Good luck.

That's just the same mega-patch chopped into ten, crudely at obvious file boundaries.

There's still no attempt to actually make it friendly.

When one whole patch is "This adds NTFS journal", with barely a word of explanation in the code, it's just never going to get in.

They're trying to code-dump, and it's taken them four versions (after the initial junk they just lobbed at people) to get to this? I can't see it ever getting in at this rate.

As I've said before, this is a maintenance burden that you're trying to pass off to 10-20 years worth of deep-level Linux kernel developers because once it's started to be used, it's their responsibility to keep it working, secure and - most importantly - reliable, within the context of the entire kernel.

You're going to have to do a damn sight better than this.

Classy move: C++ 20 wins final approval in ISO technical ballot, formal publication expected by end of year

Lee D Silver badge

Re: The old problem of the programmer perisope view of the world..

I love the way that you destroy your own argument. So as long as you're the kind of programmer who wants to spend their time finding and fixing obscure compiler bugs, not to mention stripping out all mentions of anything in the standards newer than 22 years old, and consider anyone who doesn't to be a twat and to throw their code away, then C++ is a great language.

But "my professional experience is writing a very specific, very narrow X type of software, therefore my opinion is universally applicable".

You - and the guy above with a similar concern - miss one point.

At the end of the day, it's a language. Languages have to be understandable. Computer languages, especially, need to be understandable and unambiguous. When people start using niche features in ways that drastically modify the expected outcomes of simple statements, understandability goes out the window, and you end up with a billion "accents" - and not just ones where certain words are inferable from context, or where you could compensate via the grammar for not understanding a particular word.

No. Accents that totally and radically modify the meaning of the code and are unfathomable, untranslatable, and yet perfectly valid code.

It's like saying that English and Italian are the same language because we have some Latin roots in both.

The irony being that all mainstream programming languages are written in English (in terms of keywords etc.), and yet C++ appears to be written in gibberish. You literally stand more chance of understanding the spoken language of a random coder plucked from anywhere in the world than you do their C++ code.

And the "fix" from the expert that tells you not to program in small isolated niches is "throw all that junk out and program in a version of the language that's 20-years-old, because that's the last time it worked properly - apart from all those compiler bugs that I have to fix myself".

Now, I'm a great fan of C99 - I consider it "definitive" in terms of C coding, and coding-style documents for all kinds of projects tend to agree with me (not to mention very specific compiler switches). But when we have to stick with something 20+ years old to make a language serviceable, while developers are taught and grow up with features like this that are in all mainstream compilers and expect them to be present, and people continue pushing more junk into it, it's time to call it something else and move on.

Quite literally: "C18 addressed defects in C11 without introducing new language features" and C11 only added things that weren't present in the core language (e.g. multithreading primitives), standardising a few de-facto headers (e.g. complex number types), and removing stuff that was dangerous (gets). 19 years of evolution consisted of the kinds of things that didn't affect any existing code whatsoever.

I've yet to see C18/C11 code in the wild in any significant capacity. Hell, there's more talk of Rust in the Linux kernel than in C18/C11.

19 years of C++ just turned it into a further abomination from whence it started (shortly after standardisations started using years).

Call it something else, so you can at least differentiate between "C++" and "C++ with new knobs on every year". Then you can see if people are actually using those features rather than just saying they're coding "C++" and not specifying a version.

Seriously, a language that's only viable when you ignore 90% of the language development of the last 20 years isn't one you want to support, maintain, compile or analyse (I think you'd struggle to even write a parser from scratch that could say definitively what version of C++ was required to compile a given file, without basically implementing a full C++20 compiler).

And, no. I never, ever, ever, ever want to have to "fix" my compiler against bugs nobody else has run into. And I certainly wouldn't put my professional reputation on the line against such a compiler making my code run as intended. Debugging is often a difficult enough task as it is, without having to consider that the compiler might have hit an untested edge case.

Lee D Silver badge

God, I'm glad I don't go near C++; it's increasingly becoming such a nightmare that I don't even understand some of the new features at all. That "concepts" thing just sounds like utter nonsense to me.

And though the "three-way comparison" thing sounds good at first, I just see a nightmare of overloaded operators in its future.

All the C++ code I see is generally nothing more than overly complex C code shoved into some paradigm that makes little sense in the circumstances, and it often creates far more maintainability problems than it resolves in readability or "object" organisation.

I have literally abandoned contributing to some OS projects when they went from C to C++ for no fathomable reason beyond "Hey, I know C++, so you guys all need to learn it too", and then littered the code with tons of unnecessary furniture while leaving 99% of the actual logic exactly as it always was, but with horrendously complex access to simple variables.

I get it, I really do, I remember reading my first books on C++ many, many years ago and it all clicked and I thought that it was amazing and wonderful and so full of potential. And everything I've seen since was a disaster of unreadable code with unpredictable consequences. "C with classes", I would pay you money for. C++ I wouldn't give you tuppence for. And as it evolves it just seems to get worse and worse, and that's when the compilers fully support all the new stuff (which they never do).

There was a time when you could pick up a programming language, learn it in its entirety in a few weeks at absolute worst (for a competent programmer), and then be able to read and understand every bit of code ever written in that language - if not in intent and programmer choice, then at least in syntax, outcome and general gist.

I find that C++ is just a cryptic cipher.

And that's from someone who's modified their own Linux kernel code (not just "applied a patch" but actually written code and fixed bugs that nobody else had, on obscure hardware), ported others' applications between OSes and architectures, written their own programs, etc.

Gimme C99 and a decent gcc any day to all this nonsense. No wonder people are jumping on Go and Rust and everything else (though they are increasingly starting to follow the "just shove the feature in and worry about how people are supposed to read it later" philosophy too).

Angry 123-Reg customers in the UK wake up to another day where hosted mail doesn't get through to users on Microsoft email accounts

Lee D Silver badge

Last dealing I had with this particular shower was when they rang one of my web design clients (who had complained their site was not working) and told them that the user must have deleted all their websites.

Strange then, that the FTP was entirely blank yet I was the ONLY person with any access to it - even my customer didn't have the details available to them and wouldn't have known what to do with them if they did (we're talking the days of FTP and Perl CGI), and couldn't have deleted EVERYTHING (including the cgi-bin folder? I don't think so). They paid for it, but only I had the login.

After much yelling and screaming and fielding of my client's accusations, I got on the phone with 123Reg and they told me I must have deleted the FTP. Yeah, because I go through customer's accounts that I haven't dealt with in months, that were working fine the day before, and just randomly blank out their FTP leaving nothing behind, as a user with permissions that just didn't exist on FTP (i.e. you *can't* delete the cgi-bin folder!). I wouldn't mind but I hadn't used FTP for about a week prior to that at all.

I demanded they provide me with logs, then, to prove what IP the login had come from because it was a serious, customer-affecting, money-taking server and a serious data intrusion if I wasn't the one to do it. They literally told me that they don't keep FTP logs. But then went very quiet.

Miraculously a few minutes later the home folder had a cgi-bin in it again, albeit empty. I just uploaded from my latest backup and carried on and five minutes later it was all working again. I was not the only customer to be affected. I suspect they lost some storage and were keeping it quiet. Though my repeated request for those logs and any kind of access proving that *I* had done it never materialised.

Maybe check before you start accusing random third-parties of destroying websites of their customers. Because if the client had sued, I'd have been suing for defamation too.

Lee D Silver badge

1&1 gets blacklisted about once a month - we used to use them for our Exchange as an upstream mailserver.

It got to the point where I just added it in as a low-priority and we just sent emails from our leased line ourselves, because it happened so often.

Eventually we moved all to GMail, which doesn't have that kind of nonsense.

I mean, I get it, I run my own mailservers for dozens of domains for myself and for work, but if you keep up with the settings and don't spam, you don't get blacklisted. If you don't, then you will.

A big place like 123Reg or 1&1 shouldn't be subject to that, and should have enough backups in reserve that they can just switch to another block of IPs if they are blacklisted, until they can "un-blacklist" themselves (which only really happens if you're unresponsive when you have a lot of spam).

One of the basics of email: No unauthenticated email, and then responsively track and block user accounts who spam. If you get blacklisted it can take 24-48 hours to clear them even in the best instance, and the only way around that is to have plenty of addresses which you DO NOT USE to cycle through. Should not be difficult for someone of that size.

You certainly shouldn't be DOWN in terms of email. Unless you're stupid and did something like put the IPs directly into your SPF record so that you can't easily move to another set of IPs (there's a reason that you can use lookups and includes in SPF!).
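To illustrate the difference (the domain names and IPs below are made up; 203.0.113.0/24 is a documentation range), hard-coding addresses pins your SPF to those exact IPs, while an include: delegates to a record you or your provider can repoint in one place:

```
; Brittle: the sending IPs are baked into the record itself
example.com.  IN TXT  "v=spf1 ip4:203.0.113.10 ip4:203.0.113.11 -all"

; Flexible: delegate to a separately maintained record
example.com.  IN TXT  "v=spf1 include:_spf.example.net -all"
```

With the second form, moving mail to a fresh IP block is a change to one DNS record rather than to every domain you host.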

Vivaldi offers users a 'break' from browsing. No, don't switch to Chrome... don't sw..

Lee D Silver badge

Paid Opera user from 3.5 up till 12.

I gave up on them when they just removed the email client, and then changed their custom web engine for another-Chrome-a-like.

Went with Vivaldi because that looked like the thing to do.

They ditched the mail client too (even though one preview accidentally included it!). So now I have a Chrome-a-like where the only settings I really change are cosmetic (e.g. removing that ridiculous "Start Page" junk) and hotkeys (I like Ctrl-N for new tab, not new window).

I regularly question why I bother. It's just Chrome. But they make all the same mistakes as Opera did (and they are the same programmers!) - forcing a custom user-agent and then realising it just breaks a lot of websites and reverting it in the next release.

But they've changed the application icon no less than FOUR TIMES since they started Vivaldi, each time listing it in the changelog like it's some marvellous advancement.

To be honest, Vivaldi is just another-Chrome that I use to save me running Chrome and Chrome Private Window, so I can have two different Google Drive etc. sign-ins simultaneously.

Each day I use it, I question more why I do.

Microsoft: We're getting rid of Flash by the end of the year - except you can still use it

Lee D Silver badge

Please give one example of something you want to do.

Because a few years ago (and I'm sure it's matured since then), I recompiled a 10-year-old GUI application for WebAssembly and - with nothing more than minor tinkering to account for the browser's security model when accessing files or network ports (WebSockets) - it just worked.

The problem with Flash is that it does nothing that those techs can't do, except when what you "want to do" is something you shouldn't ever be doing in a browser anyway (e.g. local filesystem access).

Lee D Silver badge

I've banned my users from having Flash for the last 6 years and I thought even that was slow!

Nintendo revives Game & Watch portable proto-console, adds color to 2.36-inch screen

Lee D Silver badge

Re: So... can it be hacked?

Unlikely: they use a 4-bit, tiny-RAM, 32.768 kHz chip with a fixed LCD display (i.e. they can only show the exact characters they were made with, only in a few positions).

You can emulate them on MAME, but they aren't exactly multi-purpose, mainly because of the display and (very) limited RAM (think handful-of-bytes, not kilobytes).

Lee D Silver badge

MAME emulates these perfectly, and if you get the artwork they look amazing (SVG scaled to your screen!).

I had two as a kid and they are spot-on emulated.

Nintendo et al only "reimagine" something to keep their IP cash flowing; all those years people asked for it, they weren't really interested (they did a couple of half-hearted Game Boy ports, if I remember correctly, but they've never bothered to offer them properly).

I always think that companies like that should look at what people are so desperate to have that they "steal", and then offer those people that thing at a reasonable price. Everyone's a winner. Money for old rope, maybe even support of an open source project to make it happen and recognition of the preservation efforts of those involved, and users find it easier to buy a GOOD emulation / recreation of the thing they wanted rather than have to resort to software piracy.

Fun fact: they run off a 32.768 kHz 4-bit chip internally, with a pittance of RAM (literally bytes).

Brexit border-line issues: Would you want to still be 'testing' software designed to stop Kent becoming a massive lorry park come 31 December?

Lee D Silver badge

Re: I am sure Boris will be on holiday when the $h1T hits the fan


Maybe he'll get stuck out there and not allowed back in.

While you lounged about all weekend Samsung fired up its biggest-ever chip factory and started cranking out 16Gb LPDDR5 DRAM

Lee D Silver badge

Re: Always registered.

So long as you don't buy Dell shite, with their stupid plastic caddies, etc., then PC components are pretty much interchangeable.

I've yet to find an advantage to building your own PC, not least because even if you had bought Dell you could just phone them up and say "I bought this thing from you and need a new Part X". Build your own, you're on your own, and therefore you need to know what you're doing.

I don't see that a modern PC is a very complex device at all, to be honest, from the user point of view. Stick in a card, it either works or doesn't and if it doesn't, it generally means you didn't account for power requirements (which, to upgrade, pretty much means a new PSU or possibly even a PC anyway).

Lee D Silver badge

Re: Always registered.


20 years working in IT, and I've never had to match a RAM, motherboard and CPU socket combination. If you want to play silly marketing games, then someone else can mess about and guarantee compatibility for me, and I'll kick it back to them if not.

Set a standard, sell me a chip. Because the only thing I ever upgrade on a PC is the RAM anyway, and usually by slotting more in, rather than replacing what's there.

Last time I knew / understood the naming allocations of things like that, I was looking at adding 2MB to a 386... and then it turned out that we had SIP chip sockets for main RAM and not SIMMs on the motherboard (I kid you not... the only other place I've seen SIPs, mainstream-PC-wise, was in a LaserJet 4).

To be honest, I actually laugh when people tell me that they "built" their PC nowadays. What they mean is that they bought a kit of already-compatible parts and plugged them together using the modern equivalent of ZIF sockets, and ended up with a machine that's no cheaper and no higher-performing than if they'd just bought a decent PC as a unit. Oh, unless you count that LED junk they slather everything with. No word of a lie, last time I selected a RAM chip for my personal use, I had to exclude those with RGB LEDs on them... I mean... WHY?!

P.S. Those two memory "features" sound bog-useless unless you're clearing RAM, and I can't imagine that you do that often enough to warrant specific support for that, presumably in the motherboard and the OS. If you do, it's after you're done with something, so who cares if it takes a few microseconds to finally release the RAM after clearing it?

What's 2 + 2? Personal info, sniffs Twitter: Anti-doxxing AI goes off the rails, bans tweets with numbers in them

Lee D Silver badge

AI at its finest, then.

Completely devoid of understanding, meaning or context, instead just choosing things based on arbitrary learned properties of the training images (e.g. black text on a white image).

There will come a day when people will realise the inherent weakness of all AI (zero inference), and that even the "AI" we have is basically just a large, dumb statistical model built on unknown parameters that it "formed" itself, with zero insight into that choice, or any ability to modify it. For all you know, it's detecting copyrighted images by the second-from-the-left top pixel being slightly green.
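The "arbitrary learned property" failure mode is easy to demonstrate with a toy sketch (everything below is invented for illustration): a naive statistical model happily "learns" a single pixel, with no context or meaning involved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 100 random 8x8 greyscale patches. The label is secretly
# determined by one arbitrary pixel - row 0, column 1 (second-from-left, top).
X = rng.random((100, 8, 8))
y = (X[:, 0, 1] > 0.5).astype(float)

# A dumb statistical "model": correlate every pixel with the label and pick
# whichever single pixel correlates best. No understanding, no context.
flat = X.reshape(100, -1)
corr = np.array([np.corrcoef(flat[:, i], y)[0, 1] for i in range(64)])
best = int(np.nanargmax(np.abs(corr)))

print(divmod(best, 8))  # (0, 1) - it "detects" the second-from-left top pixel
```

Nothing in that procedure knows it latched onto a pixel rather than meaning; it just reports whichever learned property happened to fit.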

Death Stranding: Essential worker simulator unites its players amid a lockdown far worse than the real-life one

Lee D Silver badge

Re: Excellent game

I love open-world games.

I got this free with a gaming laptop I bought, first day of its Steam release

I downloaded it, loaded it. It looked really good.

I still have the TWO HOURS of gameplay on my account for it.

In that time I estimate I had control of the game for approximately 5 minutes (for reference, 2 hours of cutscenes and crap gets you to the bit where you first run out to the building near the president's base, and first encounter the aliens for real and have to get back to the base).

I got back to the base. Deleted the game. I was SO BORED.

It ran beautifully, looked gorgeous, but it just wasn't a game. I was expecting at any moment things to kick off... but in reality it just didn't, and I got bored of the "this will be it... this will be the moment... oh... no... another cutscene" events (meeting the girl in the cave, finding the base, the car crashing, meeting the president, etc. etc. etc.).

I hated virtually every minute of the "game". It wasn't a game. I'm sure it does turn into some kind of game later but I got bored waiting. It's only because it was a redeemed code that I didn't pay for that I didn't try to refund it - I would have if I could.

I'm now having the same issue with Control (again, getting bored of just too much crap in between playing and then just a basic fight in-between cutscenes that you expect to somehow "explain" themselves and it just never happens).

I bought a gaming laptop after 8 years of having an old one and not playing any "new" titles between (everything since GTA V, basically). I thought that would mean I'd have the absolute best of gaming, on a powerful computer, ready to play and catch up on all the amazing advancements in that time.

So far, apart from Batman Arkham Knight, they're all just - literally - pretty shite.

Southern Water customers could view others' personal data by tweaking URL parameters

Lee D Silver badge

Yep, but you do that AFTER they aren't responsive/helpful.

Lee D Silver badge

Threaten legal action?

Not a problem, mate.

"Hello, is that the Information Commissioner's Office? Yes, I'm being threatened with legal action for reporting a severe data leak that I'm hoping hasn't gone unreported to you guys despite the company in question being aware of it since date X.... I believe it to be a major GDPR violation affecting the personal information of tens of thousands of people on the Internet and is still currently unresolved."

Oh, you want to DROP your legal action against me now, do you? Funny that.

IBM ordered to pay £22k to whistleblower and told by judges: Teach your managers what discrimination means

Lee D Silver badge

"working hours are whatever time is necessary to deliver the work the business may require of me"

UK. You're an idiot to sign that.

A contract that REQUIRES you to opt out of the Working Time Regulations is actually illegal.

Half a dozen workplaces over 10+ years, plus 10+ years of self-employment. I have never worked a weekend in my life. I have worked late nights, and worked one Sunday when the shit hit the fan, and then got MORE time off than one day in lieu, but I didn't begrudge that, and it was so important that "my contract" didn't come into my consideration of it. Never worked a single hour of unpaid overtime; in fact, never more than about 10 hours of overtime in total in all that time.

If that's the way you want to run your life, fine. But no company can force you to have such a contract, or make you opt out of working time (that's LITERALLY the point of that law!), or force you to work uncompensated.

If you choose to, fine. But your contract is literally illegal if it says that, and if it doesn't say it then you can't be sacked for not doing it.

The closest that the vast, vast majority of the people get in such things is an "As reasonably required..." which legally means "It's 5:01pm but we had to clean something up, so you can't just drop tools and walk out because it's past 5:00pm", not "you must work Saturdays uncompensated".

Lee D Silver badge

"with the practices including work calls scheduled for 8am on Saturdays"

The phrase there is "Sorry, I'm not contracted to work Saturdays". I've used it many times. If someone wants to sack me for that, that's on them - it'll cost them big and I'll be happy to just leave such a company and then sue them to oblivion.

And no. "Any other reasonable..." clauses do not automatically include unpaid overtime without negotiation on uncontracted days. And when you want those, the magic phrase there is "negotiation". My position will consist of "It's a weekend, I'll charge you ten times more per hour with a £1000 charge for the first hour" and we'll start negotiations there.

I don't understand why companies think such things are necessary. Even for an international company - 8am on a Saturday in the UK is what, the small hours of Saturday morning in the US? You'd have to be talking to someone on a Pacific island late on a Friday their end for it to be 8am on a Saturday in the UK. It's just unnecessary, unless it's an absolutely critical emergency, and then my question would be: why do you only have guys in the UK capable of handling your emergencies?

This is just toxic-workplaceism.

Huawei mobile mast installed next to secret MI5 data centre in London has 7 years to do whatever it is Huawei does

Lee D Silver badge

Anything that can be hacked by mere proximity to the building is insecure by definition.

If you can do anything inside that you don't want people outside to know, or vice versa, you have to take more precautions than "Can you just move that a bit further away, thanks lads".

And if everything is not completely encrypted and in a radio-dead-zone (whether by not allowing radio devices in or out, or by shielding) then it's game over anyway. Someone could stick something in a lamppost and get the same effect and you'd never know.

This is "security by proximity" (assuming that you have to be near it to be secure, or conversely that anyone far away can't access it), which is worse than "security by obscurity".

Relying on plain-text email is a 'barrier to entry' for kernel development, says Linux Foundation board member

Lee D Silver badge

Re: "they don't know how to send a plain-text email"

Barriers to entry?

Or a handy test for minimal IT skills?

Marketing: Wow, that LD8 data centre outage was crazy bad. Still, can't get worse, can it? Finance: HOLD MY BEER

Lee D Silver badge

Re: The Cloud

Yep, but with on-prem, you can do WHATEVER is necessary to resolve it, and also know everything you want to know about what's happening, instantly.

To many people, that's worth far more.

Not even counting the fact that I don't have to worry about what they do with the hard drives with my customers' data on them when we're done with them.

That's what annual bonfires are for, dontcherknow.

Backup a sec – is hard drive reliability improving? Annual failure rate from Backblaze comes in at its lowest yet

Lee D Silver badge

Within 4 years of me taking over a site with dozens of brand-new Seagate drives in servers, NAS and desktop, every single RAID array or device with them in had experienced enough complete failures for us to just bin them - sometimes multiple rapid failures, sometimes the rest of the "working" drives pulled and put in other circumstances and failed there too.

In 6 years of running WD (as new replacements for the above, or new systems entirely), I've had a couple of WD Blue failures in desktop clients. That's it.

I know this stuff gets very anecdotal and people get possessive over their favourite brand, but literally I would never chance my data on any of the Seagate drives that are available to me, as a consumer or an IT guy. Sure, I don't do big datacentre stuff, but most IT isn't that and they can make whatever decision they like.

Lee D Silver badge

Re: It's only been 65 years

I find it interesting that SSDs tend to fail in completely different ways.

And the write-life of SSDs is only ever growing. And when you think about it, that's the hard part in SSDs. How do you write to something that will maintain that data for a long period even if you never touch it again? That's why Flash and other such tech took a decade to become stable. Once it's there, of course you can read it a billion times, but how do you encode that state permanently enough, yet still be able to change it at will very quickly if necessary?

Hard drives are going to die, and I'm surprised they haven't already. And in the space occupied by an enterprise 3.5" hard drive, you can literally fit dozens, if not hundreds of NVMe-sized chips. I would imagine that in the space of even the largest enterprise hard drive you could easily fit ten to a hundred times the capacity with SSD. With no moving parts.

Now think - you're a hard drive manufacturer struggling to keep up with the technology to produce ever-tinier tracks and heads and cram in more platters. Or you can just fill a hard drive casing with dozens of rows of chips, beat it on capacity, and effectively sell your customer a "RAID" (remember when the I in RAID used to mean "inexpensive"?) over all those SSD chips without ever revealing that to the drive interface. One fails, just kick in one of dozens of hot spares and report that in your SMART output. Same capacity. Faster speed. Reliability of individual components made up for by extreme redundancy inside the device itself.

Write endurance in commercial SSDs is into the thousands of total drive writes. The survey/paper above, covering 1.4 million SSDs, shows that drives pretty much fail in other ways long before they ever hit the limits of write endurance.
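As a back-of-envelope illustration of why the endurance limit rarely bites (the figures below are assumptions for illustration, not any particular drive's spec sheet):

```python
# Assumed figures for illustration only.
capacity_gb = 1000        # a 1 TB drive
drive_writes = 1000       # rated total drive writes (i.e. ~1000 TBW here)
daily_write_gb = 50       # a heavy desktop workload

total_endurance_gb = capacity_gb * drive_writes
years_to_wear_out = total_endurance_gb / daily_write_gb / 365
print(round(years_to_wear_out, 1))  # ~54.8 years of writes at that rate
```

On numbers like those, the controller, the interface or the whole machine will be obsolete long before the flash wears out - which matches the "fails in other ways first" pattern above.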

I personally think that the large manufacturers should be totally abandoning mechanical hard drives and just ploughing all their money into cheap SSDs. At the enterprise level you can match or exceed everything about hard drive speed, response time, capacity, endurance, life-time and durability. Give it a few more years of R&D and hard drives will be obsolete. SSDs have come further in the last ten years than hard drives have managed in decades.

Lee D Silver badge

Re: It's only been 65 years

Fact is that you're spinning something at 7200rpm or 10,000rpm in a real-world environment and want to consistently write to and read data from tracks whose widths are orders of magnitude thinner than a human hair, in real time, without delay, in a device that costs less than your year's broadband subscription and which will sit in a computer for years and years, subject to vibration, movement, electrical changes, temperatures, etc.

Hard drives are an incredibly difficult thing to make. I'm amazed they've been able to continue this far, to be honest. When you consider SSDs, it doesn't seem as difficult - you want a silicon chip that does what a silicon chip does, and you'd like it to last a long time. But hard drives are immensely more complex and mechanical and subject to the whims of the outside world.

SQLite maximum database size increased to 281TB – but will anyone need one that big?

Lee D Silver badge

Re: Looks like I need to

No, but there's a middle ground where it is. And things like Optane (based on 3D XPoint) are non-volatile too.

The paths are meeting, and the middle ground is non-volatile, byte-addressable RAM in the gigabytes range, hitting RAM speeds, which has lifetime endurance.

As I say, budget computers are ALREADY coming with Optane. UK PC World are selling them (and that means that they are basically commodity as they only sell junk). 4Gb RAM and make the rest up with memory-speed NVRAM.

The technologies are merging, and in just a few years we've gone from SSDs being the rarest, most amazing thing to being the default in every laptop - and, in fact, overtaken by NVMe drives, which are basically tiny, faster SSDs.

The future is a single shared non-volatile storage (like people assume we have had for years! The number of people who don't understand RAM vs storage!). You buy a 1TB chip and a portion of that becomes your RAM and the rest your storage.

It's been years coming, but it's literally viable today, and motherboards are literally being modified to support that scenario for Optane.

Lee D Silver badge

Re: Looks like I need to

SSD and NVMe.

Intel literally sell a product that's just an NVMe that you use as main RAM. Cheap computers are now coming with 4Gb real RAM and Optane for the rest.

Swap on an NVMe is never going to be noticed until the NVMe croaks - even an SSD will make a RAM-struggling machine fly compared to its previous performance.

The prices haven't come down in years, however. I just replaced an 8-year-old laptop with a brand-new one and have only gone from 12Gb to 16Gb! Admittedly the 12Gb was the "top" configuration that machine was capable of, while the 16Gb was the bog-standard minimum (up to 64Gb), but that's quite telling - that's almost a stall in RAM prices compared to previous decades.

I've been fixed on 8Gb min spec in work for 6 years, and nobody is complaining.

I will upgrade my new laptop to 64Gb eventually, but 16Gb and a huge, stupendously fast NVMe means you wouldn't even tell if it started to swap horrendously. Last I checked it was a couple of hundred quid to buy the parts, and that only because I have to replace the existing 2 x 8Gb chips with a 2 x 32Gb matched pair at some ludicrous speed. When I start to notice poor performance or excessive writes, I'll look into it. Or if I want to pull across all my old VMs and run them at the same time.

The gap between "temporary fast RAM storage" and "permanent slow disk storage" is basically shrinking towards zero now. The fastest RAM I can use in the machine is PC4-21300 (2666), which has a peak transfer rate of 21,300 MB/s; the NVMe I bought can do 3,400 MB/s. That gap is closer than ever, and some low-end RAM is slower than the high-end NVMe drives.
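For what it's worth, the PC4-21300 figure falls straight out of the module name: DDR4-2666 does 2666 million transfers per second over a 64-bit (8-byte) bus. A quick sketch of the comparison:

```python
# DDR4-2666 peak bandwidth: 2666 MT/s on a 64-bit (8-byte) bus.
transfers_per_second = 2666 * 10**6
bus_width_bytes = 8
ram_peak_mb_s = transfers_per_second * bus_width_bytes / 10**6
print(ram_peak_mb_s)  # 21328.0 MB/s - marketed (rounded) as PC4-21300

nvme_mb_s = 3400      # the NVMe figure quoted above
print(round(ram_peak_mb_s / nvme_mb_s, 1))  # ~6.3x gap, and closing
```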

Lee D Silver badge

Once wrote a piece of software for a school I work for.

Kids would run the program from a network share; it would make them log in, give them tests and questions, and then record the results.

Logins, logs, results, etc. were all done by writing to the database.

I kid you not - 500 kids all sat tests, all at the same time, on a simple Windows network share, all reading and writing constantly, across the entire site, with only Windows locking on a single shared file. And SQLite just didn't care. It just worked. It was amazing.

As I wrote the code, I was doubly impressed, because the code to write to the database was just a spin-until-the-lock-cleared kind of affair. I didn't think it'd scale past one class, to be honest, but testing was just unbelievable... it just worked. I once ran a thousand-simultaneous-user stress test on it locally while it was simultaneously in use throughout the school; it didn't even faze it.

And despite being a spin-lock on a Windows file lock on a network share on a single flat file database that's shared between all users, with complex SQL queries, writing, etc., nothing ever slowed down beyond "god damn instantaneous".
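The spin-until-the-lock-clears approach described above can be sketched roughly like this (a minimal illustration - the file name, schema and retry numbers are all invented):

```python
import random
import sqlite3
import time

DB_PATH = "results.db"  # stand-in for the single shared file on the network share

def record_result(user, score, retries=100):
    """Insert one row, spinning until SQLite's file lock clears."""
    for _ in range(retries):
        try:
            con = sqlite3.connect(DB_PATH, timeout=1.0)
            with con:  # one transaction, committed on success
                con.execute("CREATE TABLE IF NOT EXISTS results"
                            " (user TEXT, score INTEGER)")
                con.execute("INSERT INTO results VALUES (?, ?)", (user, score))
            con.close()
            return True
        except sqlite3.OperationalError:           # "database is locked"
            time.sleep(random.uniform(0.01, 0.05))  # back off, then spin again
    return False

print(record_result("pupil42", 17))
```

SQLite's own `timeout` parameter already does most of the busy-waiting; the outer loop just papers over the extra lock semantics you inherit from a Windows network share.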

SQLite is in your browser, most likely, holding your favourites, cookies, saved passwords, etc. Not to mention the amount of software a keen programmer's eye spots it in (if not SQLite then it tends to be FirebirdSQL, though Firebird is very much more complicated to set up even if it's effectively the same kind of thing).

SQLite is one of those projects that we cannot let die.

Though anyone who's running a 120Tb database on it (presumably one HUGE FILE!), and feels they need more may want to seek help.

Linux kernel maintainers tear Paragon a new one after firm submits read-write NTFS driver in 27,000 lines of code

Lee D Silver badge

It didn't compile. People had to patch the Makefile.

It has out-of-bounds accesses. People had to run it through static analysers to spot them. That's a potential "trash everyone's data" bug right there. And it's not compliant with any of the kernel's policies.

Also, generally, it would be a pull request from a well-maintained and reviewed tree, not a huge patch on a mailing list. You'd ask for it to be put into -staging, or even ask for sign-off first, not just dump it on the mailing list.

They've done everything wrong so far. It's a dump. Or they could have just sent an email saying "We have X, what's the proper path to get that into the kernel?" and be put through to the right people, stage it in their trees for a year or two, iron out the bugs, etc.

Instead they throw a mega-patch onto a mailing list, which didn't build and had bugs visible in seconds in it, and then crowed about how expert they were in doing this. There's also NO TESTS of the filesystem - it doesn't use all the existing kernel filesystem testing procedures.

That's not the way to make friends in the kernel community.

Lee D Silver badge

Re: Bit harsh

Not really.

Let me give you an analogy.

I've got an old banger of a car. It still works. So I leave it in your garden one day with a note saying it's a gift to you.

The maintenance, updating, API-conversion and upheaval that happens in the Linux codebase is huge. So a dump-and-run is really an obligation, not unlike giving someone a pet dog for Christmas when you don't live with them.

That maintenance burden is FAR, FAR more difficult than the initial code drop. You have to understand all the code, change it often, make sure it stays secure and bug-free, and deal with support from users who now have NTFS 5.1 and want to know why they're getting corruption when they didn't on 5.0, and so on.

Literally, maintaining that code is harder than writing it in the first place. A dump-and-run is a common output from a company that doesn't WANT to maintain it any more, or deal with users complaining that it's unmaintained or has a bug they haven't fixed.

There were several NTFS projects, historically. Hell, even one that emulated enough of the programming interface to use the original Windows DLLs to access the filesystem under Linux. They've all suffered the same fate - they might "work" but nobody maintains them, so they get bugs and get obsolete and don't work for modern versions of the filesystem, so they end up being a dead-weight in the kernel.

This isn't unique to NTFS, or even to a particular subsystem of Linux - maintenance burden is the determining factor in a lot of the patch acceptance pipeline. Everyone involved has to understand it. It has to use the common code where it can (so NTFS doesn't have its own special way of allocating a file or whatever that's different to everyone else's). It has to work internally in a similar manner. Quirks have to be ironed out so they are clearly documented. Filesystem detection has to be consistent, so it only offers to mount disks that it knows it has support for, and is tied in. And so the filesystem maintainers have to be able to review it and not accept any NTFS-only nonsense.

Because Paragon - if history is anything to go by - will not be patching this code in years to come. They'll be dead, gone, forgotten, unsupportive, the one guy who opened the code will move to another company or whatever. That's how *so many* filesystems, drivers, subsystems, etc. die in Linux. Neglect, and nobody able to understand it to take it over, because it's an "oddball".

All the wifi drivers use the central mac80211 framework, which was arrived at after dozens of independent and differing implementations that each vendor tried to put in just to run *their* card, and no other. The commonalities were pulled out, modularised, and then people were expected to conform or explain why they couldn't possibly do so. After decades, all the wifi drivers now use pretty much the same infrastructure even if they are radically different in capabilities, and some of those manufacturers are long dead and gone.

Code-dumps are the problem here. Dump-and-run is seen as a charitable action, but it's just an obligation on Linux kernel maintainers to justify, fix, debug, handle user support, etc. for it for decades. And if they can't understand it, or it does things it shouldn't do, then they can't do that, and they'll reject it.

20 years ago people did exactly this kind of thing for NTFS. The result after all that time was one central common NTFS driver that's read-only. Because all those other contributors are long-gone and their code was something that "worked" for a brief period but was atrocious to integrate or debug.

Maintenance is king. Especially in an era where security of something like NTFS, and data security of the user, could easily trash or compromise someone's system and people won't be able to understand how or where it came from. You do *NOT* want a code-dump running your filesystem with any kind of useful data. And you certainly don't want to be responsible for someone else's codedump when your users all blame YOU for their data trashing itself on hardware that you don't even have available to yourself and cannot debug on.

This isn't a gift. It's certainly not a unique gift, it's happened dozens of times before. It's an obligation.

We've come to wish you an unhappy birthday: Microsoft to yank services from Internet Explorer, kill off Legacy Edge by 2021

Lee D Silver badge

Microsoft's entire legacy is based on the precept that everything that used to work will continue working in some fashion.

You can still install apps from the Visual Basic 4 days and they'll "just work".

That's not even true of binaries on other operating systems - you can't necessarily run an old glibc5 binary on a newer Linux at all... let alone expect it to integrate with the system integrations in place nowadays.

The second Microsoft make an OS without backwards compatibility, people forgo it and it becomes niche: Windows CE, Windows Mobile, those Windows Starter devices that only do Metro apps, etc.

The problem behind it is not that Microsoft allow it... it's that NOBODY sees that you need to update software even if it's "working". People would rather run as admin, grant full permissions, override UAC, use a broken browser, etc. than actually update their software to use tech from the correct decade.

What's broken is the software development model - where it's a vast, vast, vast, vast effort to update a program that's tied into that tech to use newer tech that's secure, and far outside the reach of even businesses that paid millions for the app in the first place. The only way to support an IE ActiveX binary is to run IE. The only way to run a Silverlight program is to install Silverlight. The only way to run a .NET Framework 3.5 program is to install .NET Framework 3.5 (even if your machine has newer versions).

Farewell to notches and hole-punches? ZTE expected to announce mobe with under-display camera next month

Lee D Silver badge

Re: "The market is atrocious for phones and just does not understand what I want as a consumer."

I think it's far more like "HD/4K/3D etc.". Everyone wants it because it's there. Now that everyone has it, and either doesn't really use it or doesn't need anything more, it's started going back the other way.

People aren't gonna buy a new mobile every year in perpetuity. They aren't getting that much faster or better each time round any more.

Compare even an old Android with a new one, massive difference. Compare a new one with its competitors? Not much difference. Every phone has almost every option because they have to do that to compete. Nobody's going to buy a phone without GPS any more, for instance, even if they hardly ever use it.

And if you're not needing to buy a new phone, then you want one that'll last longer. Removable batteries are coming back. You can say "ditch the headphone jack", maybe, because USB-C still allows that functionality and Bluetooth has obsoleted it. But when it's something that another function *doesn't* provide, then it becomes far more useful.

It's not that phones won't still have 10 cameras. It's that ALL phones will have 10 cameras. So who cares about a phone with 11 cameras? And when a camera is "good enough" for selfies, then why does it need to be in the screen itself as a visible blob of obstruction?

There's only one generation that are really doing all the selfie-stuff, and that generation are nearly growing out of it. The next generation and the generations before don't really care about it.

We've reached saturation. The only way to distinguish is to give people things other phones don't have, or to make their phones different in some way.

Ed Snowden has raked in $1m+ from speeches – and Uncle Sam wants its cut, specifically, absolutely all of it

Lee D Silver badge

Re: Let's make it hypothetical:

Let me ask you a question:

How many "spying operations" do you think Snowden, Assange and Manning combined have actually caused to stop? Anything at all? Really? I highly doubt it. If anything, they've driven it deeper underground.

In case you haven't noticed, the uproar is largely people on computers thumping their fists, a couple of news articles MANY years ago now and... not much else. There was no radical change in the way those forces operate, in the accountability, uproar among allies, public outcry, etc. They just went back to doing what they were doing before, and keeping a closer eye out for anyone stupid enough to try whistleblowing again.

They all did whistleblowing a great disservice. One, they were caught. Two, their information didn't result in anything substantial happening. Three, they did it poorly and didn't protect sources or innocent parties. Four, they passed it off to an agency (Wikileaks) that didn't do anything to protect them either. Five, they all ended up convicted, on the run, or at the mercy of the Russian government (don't for one second think that Snowden is free to do what he likes). Six, they have taught every whistleblower who follows them NOT to whistleblow unless they want all that to happen to them.

They're known, quite literally, because they were BAD whistleblowers. They are famous for NOT getting away with it. The only reason we know their names is because they were caught and convicted, for the most part. Their leaks gave them away, 100%. Their leaks also caused basically... well... nothing to happen. I think if we're honest, everyone just went "Oh, that's terrible... but that's exactly what we knew was happening, really, just not the specifics".

They didn't reveal that the president's a lizard, or that aliens exist, or mass genocide, or even really anything that turned into a convictable offence, did they?

And you only have to watch a movie to guess what happens to the whistleblower who blows wide open the government's secrets. If anything, the response to their leaks has been very muted, I feel. Nobody in government really cares about what was leaked, they just want the person who leaked it to serve a (reasonable length, if you look at Manning) prison sentence for it.

In the grand scheme of things, not much happened, and they got famous for being bad whistleblowers who all got caught and either were jailed or will spend their lives on the run.

In the grand scheme of things, https://en.wikipedia.org/wiki/List_of_whistleblowers has a list of people who did far more, had far more done to them, and most of whom I've never heard of.


Biting the hand that feeds IT © 1998–2020