* Posts by ChrisC

1541 publicly visible posts • joined 2 Jul 2009

Skyrora circles Orbex wreckage as UK rocket rival heads for administration

ChrisC Silver badge

"We have been successfully developing"

Ah, this is obviously some strange usage of the word "successfully" that I wasn't previously aware of...

Tech support chap invented fake fix for non-problem and watched it spread across the office

ChrisC Silver badge

Re: Wiggle the mouse

My own recollection of the 9x era was that, on some systems at least, wiggling the mouse while the PC was running certain tasks would force the UI to refresh more often than Windows would otherwise bother with, thus giving the impression that said tasks were running faster than they actually were - a consequence of how slow they *appear* to be running when the PC has stopped bothering to keep you updated on a sufficiently regular basis as to what it's up to...

Summoning the spirit of the BBC Micro with a Pi 500+ and a can of spray paint

ChrisC Silver badge

Re: Memories

My Beeb Micro owning friend had a hacked version called "Fruity Frak", which did a bit more than just altering the speech bubble text...

British military to get legal OK to swat drones near bases

ChrisC Silver badge

Re: be careful what you wish for

If your weapons storage bunkers are sufficiently poorly constructed that there's a risk of setting off the stuff stored within by opening fire with small arms, then you've got *much* bigger problems than having an unauthorised drone buzzing around your airbase...

Similar for the "hangars", which on most bases (especially those out of which high value airframes such as F-35 would be operating) are more likely to be hardened shelters (HAS) as opposed to the more flimsy hangars you might be thinking of, suitable (mostly) for sheltering aircraft from the elements, and also from the prying eyes of satellites/recon aircraft, but really not designed to offer much if any level of protection against anything else.

There's nothing micro about this super-sized Arduino Uno

ChrisC Silver badge
Pint

My first experience of hands-on circuit design/construction was in the late 80s, using the similarly large-scale protoboards our school science department had. They weren't literally scaled-up versions of resistors, caps etc in quite the same way as this; they were more like large plastic/metal carriers for each component, designed so that when slotted into the protoboard, the markings on the top of each carrier resembled a schematic layout.

So although the visual appearances of that older setup and this one are quite different, I can well understand the principle of scaling things up so they're easier for students to grasp (both physically and figuratively), and given that this one actually does then replicate the visual appearance of the 1:1 scale devboard, I can also easily imagine how this would really help with understanding the use of the actual hardware.

ChrisC Silver badge

Or alternatively:

* they're not sufficiently confident in being able to self-narrate in that particular language

* they've released the same footage with localised voiceovers for different regions, and didn't fancy spending the time recording it all themselves

or probably a few other reasons as to why someone might choose to use synthesised narration rather than voicing it personally, none of which would imply any level of negativity towards the creator...

Autonomous cars, drones cheerfully obey prompt injection by road sign

ChrisC Silver badge

Re: May I suggest

The opcode, or the rather entertaining TV series of the same name?

ChrisC Silver badge

"how long would you ignore it before proceeding carefully"

There's a key difference between what you're suggesting a human driver might do here when faced with a fake instruction, and what the artificial ineptitude was doing in this research - note that you describe the human driver as proceeding *carefully*, whereas the research is suggesting the AI driver would blindly proceed and cause a collision based *solely* on what their vision system picked up from the adversarial signage, with no regard for what it (or any of the other sensors on the vehicle) should have been telling it about the nature of the road ahead.

And you don't even need an adversarial bit of signage to come up with a similar real world scenario - if instead of a fake GO sign next to a set of genuine red lights, those lights had turned green, a human driver is *still* expected to proceed carefully and not blindly assume the green light is giving them carte blanche to set off regardless of what might be happening ahead of them. Same rules damn well ought to apply to automated vehicles as well, and if they're as easy to bypass as this research is implying, then something is very wrong...

ChrisC Silver badge

Re: Teenage boys will be salivating...

"Crooks putting up detour signs. (To most closely match what these researchers are trying)"

Or even an entirely legitimate green traffic light being displayed to a driver, whilst the road ahead of them isn't clear to allow them to safely proceed...

As humans, one of the fundamental requirements we have when driving is to avoid colliding with anyone/thing, so if the sign says go but the road says whoa, you damn well stay put - starting to move and colliding just because the signage said it was OK to do so is never going to be a defence for a human driver, so there's no excuse for an automated one either, and this feels like it ought to be one of the easiest things to codify - i.e. regardless of what the sign detection code is telling you to do, you never, EVER, ignore what the object detection and collision prediction code is telling you.

Sword of Damocles hangs over UK military’s Ajax as minister says back it or scrap it

ChrisC Silver badge

Re: Putting lipstick on a pig

I mean, I somewhat recall this being one of the issues with the attempts to upgrade Nimrod - the variability between the original airframes, meaning that the modifications were essentially unique to each one - but given how old those airframes were, and the way in which airframes were put together back then, this wasn't entirely a surprise. ISTR it being suggested (however close to or far away from the truth it might have been) that it'd have been cheaper to simply build entirely new airframes to a consistent set of dimensions, that just happened to look exactly like Nimrods...

But to hear that a modern design, something which has absolutely no excuse for not learning from the mistakes of those who went before, has anything more than a few *mm* of tolerance variation, simply beggars belief, and I'm struggling to accept that comment on the chassis dimensions differing by such a significant amount as being anything more than poetic licence, with the actual variation being rather less dramatic whilst still being a genuine factor in the reliability/performance problems AJAX is having. Because if they really *are* rolling off the production line with that much variability, then it raises real concerns over what else about the design/build has been similarly screwed up.

Techie's one ring brought darkness by shorting a server

ChrisC Silver badge

Re: No Capes, No Scarves, No Ties, No Necklaces, No Watches, No Bracelets, No Rings

Ouch. IIRC from my brief time spent working with liquid nitrogen many moons ago, it's fine to get on your skin so long as it isn't allowed to pool there - i.e. a thin layer that can immediately boil off before it has a chance to freeze anything it's come into contact with is OK, a more significant quantity that remains trapped in contact with your skin in its liquid form, not so good. It seemed counterintuitive at first to be told to not wear safety gloves when handling it, and then they explained why...

Marketing 'genius' destroyed a printer by trying to fix a paper jam

ChrisC Silver badge

Re: Users and printing devices...

I'd say they were just very good at minimising expenditure on any cost codes they had responsibility for. Who cares if it costs the company overall more, so long as you can truthfully state that your department didn't overspend...

Concorde at 50: Twice the speed of sound, twice the economic trouble

ChrisC Silver badge

Re: Amazing memories

Cramped, but not uncomfortable, IMO - unlike the similarly narrow seats you find some airlines using in economy, it was less of an unpleasant squeeze and more of a firm but gentle hug. You wouldn't want to be sat in such a seat for too long, but then that was also the point about Concorde - trading off the higher level of in-flight comfort offered by a widebody, in return for giving you back several hours of your time with each hop across the pond.

Can't comment on how limited the catering choice was, as I don't know what the equivalent choice would have been 20+ years ago if someone had opted to cross the Atlantic in the front end of a 747 vs on a Concorde. I do remember it all being bloody nice though!

ChrisC Silver badge

Re: Hmm

The charters also gave more of us "ordinary people" the opportunity to sample the Concorde experience, which I suspect helped maintain the love we had for it above and beyond what it'd managed to earn for itself. I mean, don't get me wrong, as someone who started their love of aviation around the time Concorde first entered service, made the pilgrimage out to my local airport with thousands of like-minded souls in the area to see it when it visited in the early 80s, explored the prototype at the FAA Museum, always looked towards the sky whenever I heard it heading out of Heathrow during my walk to work in the early years of my career, and have a deep appreciation (as someone who's followed an engineering career path of my own) of the engineering efforts that went into it, I'd happily wax lyrical about Concorde regardless.

However...

Having *also* been on one of those BA charters - round trip from Fairford during one of the RIAT shows in the 90s - and having therefore had first hand experience of what it was like not merely to sit in as a static exhibit, but as a passenger, plied with excellent food and drink by the attentive cabin crew as we wafted along at M2 with less fuss or bother than any other airliner experience I've had, my position as a Concorde supporter is set in stone forever. So whatever monetary benefit those charters brought in directly is one thing, but the PR benefits they generated, with the potential knock-on effects for indirect revenue generation, really shouldn't be overlooked either.

Windows 11 shutdown bug forces Microsoft into out-of-band damage control

ChrisC Silver badge

Re: Just wait a bit longer

"a real American idea, trying to steal employee's time for updates"

And in typical MS fashion, implemented with minimal care or attention to accuracy - suggesting it'll only take a couple of minutes to perform the update, so you might as well just let it proceed because, by the time you've chucked stuff in your bag, gone for a quick pre-commute bathroom break etc., it'll be done and dusted allowing you to grab your laptop and head for home with no real delay to your departure time...

..except that, once you've fallen for yet another of MS's examples of woeful inability to predict completion times, and found yourself sat there some 10-15 minutes later, *still* watching your PC chunter through whichever bit of the update process it's now on (telling me how close to completion it is within *this* part is all well and good, but without knowing how many *more* parts are still stacked up waiting to run after this one, it feels like I'm being gaslit - yeah yeah, almost there, 98%...99%...see, all done, just need to reboot and... 1%, 2%... FFS), your desire to just let the update process run whenever it asks to do so (or, whenever it says it's going to run whether you want it to or not), rapidly diminishes to the point where you are going to start thinking up ways to inhibit it such as you've described.

Which really isn't how it should be - if we want users to accept the need to keep systems updated, the update process needs to be as minimally disruptive to their lives as possible. Windows updates have always been a bit of a joke in that regard, all the way back to the early days when they regularly insisted on a post-update reboot despite this, in so many cases, appearing to be nothing more than the default setting of the installation builder, rather than something the update provider had actually proven for themselves to be a hard requirement. But at least back then, as with so many other aspects of Windows operation, these were mere inconveniences that could easily be worked around by the end user. These days, damn near everything Windows does feels like it's ramming down our throats a constant reminder that we're no longer in control of the system, and that we should just sit back and let MS do whatever it thinks it needs to do, whenever it decides to do it, regardless of how much of a negative impact it might have on us.

Kids learn computer theory with wood, cardboard, and hot glue

ChrisC Silver badge

Re: Hold Up, There!

Note the previous comment also mentioned memory pages, which might complicate matters somewhat when using function calls vs inlined code, if the function is being called from code running in a different page to where the function resides (ISTR this was the cause of some fun and merriment when writing PIC code some years ago).

These days though, it's generally better to just write it as a function and let the compiler work out, for each given build of your code, whether or not silently inlining it makes for a more efficient build - that way you get all the benefits of only having a single copy of the code to maintain, whilst still getting the speed advantages of inlining wherever it actually pays off.

There may well be the odd edge case where controlling inlining manually remains preferable/essential, but over my decades of embedded design work, I've increasingly pivoted from being utterly aghast at how awful the compilers were at turning my C code into something that actually resembled it at the assembler level (one compiler would regularly fail to generate working asm for entirely valid C code, if said C wasn't consistently written to do the absolute minimum per line of code), to being quietly impressed at how good they (and by "they", I really just mean anything GCC-based - I still tread exceptionally warily at first if I ever need to use a platform where there isn't a GCC variant available, or where someone has decided they can write a better compiler than GCC) now are.

ChrisC Silver badge

"Some of the coding we used back then probably wouldn't pass muster today, short variable names, no comments, etc. to save a few bytes but sometimes elegance in programming had to give way to practicality."

Efficient use of system resources is still something valued in the embedded systems sector, particularly amongst those of us tasked with developing bare-metal firmware running on systems which would sometimes be embarrassed by a Spectrum in terms of available memory, even if the raw processing power is a few orders of magnitude higher - having a relatively high performance ARM core coupled to a few KB of memory can make for some interesting design decisions...

That said, things like short variable names/no comments etc. aren't an issue these days when you're writing/compiling your code on a larger system, and only requiring the embedded target to run the binary, but other tricks from the 8-bit days for reducing the amount of runtime resources required are still valid, and it always brings a wry smile to my face when I find myself just instinctively using a technique today that I learned 40+ years ago, because I'm still sometimes working on systems with similar constraints as my Speccy.

And stuff like writing a C# app to convert bitmaps into data structures suitable for the display driver routines in the embedded system also triggers memories of being sat with a stack of graph paper, manually sketching out sprites and typefaces and converting them into the corresponding hex values.

I think this constant reminder of, and ability to make use of, some of those retro concepts that were a big part of my childhood, is part of what makes me love this branch of engineering so much...

Sony no longer home of the Bravia as it plans TV biz spin-out to China’s TCL

ChrisC Silver badge

Re: Not sure why the dissing of TCL

TCL + soundbar is our current setup, and given the combined price is still comfortably lower than the price of just an equivalent spec TV from one of the more obvious brand names (which, based on prior experience with other TVs from such brands, would *still* need the soundbar anyway), it wasn't a difficult choice to shortlist TCL, and having then compared some of their sets side by side with other brands, the only two questions in my mind were firstly "why would anyone spend that much more just to get a different name printed on the bezel of a screen that's delivering the same image quality?" and secondly "what's the expected delivery time for *that* TCL model?"...

And whilst they might not be *as well* known as the likes of Sony et al, they (and HiSense et al) have been gaining some level of awareness in the UK at least via their sports sponsorships - a random person on the street might not be immediately aware of *what* these brands offer, but they're now increasingly likely to at least be aware that the brands exist in the first place.

Engineer used welding shop air hose to 'clean' PCs – hilarity did not ensue

ChrisC Silver badge

Re: BS

But what about my old 78s?

Britain goes shopping for a rapid-fire missile to help Ukraine hit back

ChrisC Silver badge

Re: Something

For sure there's an element of political posturing with all of the public help that's been offered - it's inevitable that the governments behind such help will want to get at least *something* out of it, because we all know that politicians never do anything solely for the good of the people they say they're trying to help...

However, I still think you're being unduly pessimistic/dismissive as to the effectiveness of the aid given. Take Challengers as an example - yes, we were only able to scrape together a handful, and yes they're somewhat unique in terms of ammunition supply, maintenance etc. compared with the other MBTs Ukraine now has. But the key word there is *now*. Let's not forget that before we made that first move in not merely saying we would supply MBTs to Ukraine, but damn well actually doing it, everyone else was dragging their heels. So perhaps, other than the good use they've been put to in the "sniper" role, the Challengers haven't been as effective in general as the Ukrainians might have liked, but their effectiveness in coercing other nations to supply greater numbers of tanks better suited to their needs shouldn't be overlooked.

And it's the nature of warfare, if the fighting continues for more than a few weeks/months at best, that one side will come up with countermeasures to stuff the other side is throwing at them, and then that side will come up with countermeasures to the countermeasures, and so on back and forth. So yes, the Russians have come up with ways to diminish the effectiveness of some guided munitions, but that doesn't mean it was pointless to have supplied them to Ukraine in the first place.

Could we be doing more to help Ukraine? No question about it. But that doesn't diminish the help we have given them.

ChrisC Silver badge

Re: Something

So, let me get this straight, you're saying that none of the following were of *any* military importance to Ukraine:

Challengers, Abrams, Leopards, Bradleys, Archers, MLRS/HIMARS, Javelins, NLAWs, Storm Shadows, MiG-29s, F16s, body armour, assault rifles, ammunition, field hospitals, comms gear, satellite and signals intelligence, and all of the other stuff we *have* heard about...

Nothing, not even a single solitary entry on that bloody long list justifies being classed as militarily important in your mind?

Yes, there's undoubtedly *other* stuff that we don't hear about, and may never hear about until everyone involved is long gone from this world, if even then, but to so confidently dismiss all of the public stuff as being just a load of political posturing quite simply beggars belief.

ChrisC Silver badge

Re: Something

We're already doing a lot of somethings, and this sounds a lot more like one of those than a *something*...

Danish dev delights kid by turning floppy drive into easy TV remote

ChrisC Silver badge

Ah, as I was writing that comment about offering an eject menu option, I felt the faintest of prods seemingly from a direction deep within my long-term memory banks, but dismissed it as a mere glitch in the system - I'll admit that, as an Amiga owner back in the day, my early experiences with Macs (a couple of SEs we had at school - back in the days when the IT classroom was still rammed full of BBC Model Bs and perhaps an Archimedes or two, the Mac was sufficiently exotic to warrant further investigation) were less than complimentary once I realised just how godawful that earlier (System 5 I think, given this was around 89-90) version of MacOS was compared with Workbench, so I didn't get sufficiently familiar with them to be able to recall details like this more clearly.

ChrisC Silver badge

You say that, but percentage-wise, I've suffered higher failure rates with almost every other type of storage medium I've used over the decades than I have with floppies (caveat - although I have *used* systems that relied on other floppy formats, I've only ever *owned* systems built around the 3.5" version, so maybe users more familiar with those other types may have less favourable opinions).

I literally could count on the fingers of one hand how many floppy discs I can ever recall failing back in the days when they were still fresh, and even today, 30+ years after some of them had first been bought and written to, and then stuffed in a storage crate and moved from loft to storage unit to garden shed over the intervening decades, I've now been able to successfully read a far higher percentage of them than the doom and gloom merchants in the retro computing world would have you believe. Yes, some have succumbed to bitrot, fungal growth etc, but considering I paid basically no attention at all to the environmental conditions in which they were being stored (because until I learned how important this might be, they'd *already* been stored in sub-optimal conditions for so long that I figured it'd make no difference if I now tried to store them properly...) I'm genuinely amazed at just how easy it's been to pull data off of them after so long.

In contrast, I'd need to move onto my second hand in order to count the number of hard drive failures I've suffered, despite owning far fewer of those over the decades - i.e. a higher absolute number of failures out of a significantly smaller sample size. Floppy Disc 1, Other Media 0.

And even if I discount all the failures observed whilst initially writing to optical discs and only consider post-burn failures due to disc rot or some other mechanism, whilst this would bring the absolute number of failures down into the same ballpark as for floppies, I don't yet think I've burned enough such discs to bring the relative failure rate below that of floppies. So Floppy Disc 2, Other Media 0.

Let's not even consider the absolute crapshoot that is removable flash media, whether in memory card or pendrive format. Some memories are just TOO painful to dredge out of long term storage for the sake of adding to a forum anecdote. Let's simply say - Floppy Disc 3, Other Media 0 - and leave it at that, shall we.

In complete contrast to the experiences of some others however, the one medium which has never ever let me down (yet - I'm frantically trying to remember if I've already backed them all up before I go tempting fate here!) is the much-maligned Zip Disc. Floppy Disc 3, Other Media 1.

So honestly, my experience with floppies is that they really were pretty robust, especially so given the sort of treatment they'd get - randomly being ejected mid-access, left sitting next to speakers or CRTs, used as fidget toys before we'd even heard that term being bandied around...

ChrisC Silver badge

For some specific definition of the word "better"...

The idea of dragging an entire disc into the trashcan in order to merely eject it from the computer, whilst leaving the contents of said disc entirely intact, has never sat comfortably with me. It was neat as a way to avoid users ejecting discs mid-access as was possible on other systems of the day, but that could also have been achieved by providing a separate eject icon on the desktop, an eject option in a menu, or something else which would still leave the OS in control of synchronising the eject request with access requests, but without requiring the user to mentally context switch the meaning of the trashcan depending on what it was they were dragging into it.

ChrisC Silver badge

Re: QR Codes?

As someone who's recently been rediscovering the delights of using older media whilst backing up my old floppy and zip disc collections, it simply CANNOT be overstated just how important the tactile/audible sensations are as a way of reinforcing the "you've just taken an action that will allow this system to access your data"/"you've just taken an action that removes access to that data from your system", combined with the additional feedback you get whilst the data is then being accessed. Having that more physical connection to the act of reading/writing data really does help focus your attention in ways that the more efficient/instantaneous methods we now take for granted simply don't achieve.

I'm definitely not saying I'd prefer to go back to the good old days of having to load *everything* from tape/floppy (you can pry my NVMe SSD out of my cold dead hands...). But for use cases like this - where there's no requirement to transfer large quantities of data, or to transfer anything particularly quickly, and where the primary focus is on ensuring the *correct* data is being accessed - it's a genuinely neat way to repurpose the drive and discs, and I can see why their child loves using it. It also lets the user perform differing physical actions (put disc in the drive vs take disc out of the drive) to trigger differing system actions, rather than having to learn to perform the same physical action in differing ways at different times (scan a QR code when the system is in *this* state to start playback, but scan a QR code when the system is in *this other* state to stop playback), which is far less intuitive.

Baby's got clack: HP pushes PC-in-a-keyboard for businesses with hot desks

ChrisC Silver badge

Bread, muffins, buns, baguettes, teacakes, perhaps even, if I may be so bold, the occasional waffle now and again... But definitely not, no question about it, any smegging flapjacks!

ChrisC Silver badge

Exactly - keycap wear-out seems to be a recent-ish phenomenon caused by changes in how keyboards are manufactured, because neither of my Amigas is exhibiting any sort of wear on their keyboards, despite having each been subjected to around 5-6 years of heavy daily use back when they were my primary machines, and similarly the keyboards I've held onto from my old PCs (because you never know when you might need an AT or PS/2 keyboard...) also still look as good as new.

It's only the keyboards I've been buying over the past decade or so where keycap wear has been a recurring problem - either the stick-on/transfer/whatever surface-level process used to apply the legends wearing away, or the caps themselves being physically altered by use. Not merely becoming shiny, but in some cases with clear wear marks suggesting the plastics being used these days are far softer than in the days of yore...

So I also don't agree with the previous comment that you *can't* stop keycaps wearing out - we have ample historical evidence to show it *can* be done, and the only reason it might seem otherwise today is because we're now surrounded by cheap and cheerful keyboards which are treated more like consumable items and therefore simply not designed for longevity.

Apple blocks dev from all accounts after he tries to redeem bad gift card

ChrisC Silver badge

You're assuming that any of that software or media would have been available in a form that could be easily backed up - and more importantly, still used if needed - outside of Apple's ecosystem...

Considering how much software these days is no longer a "buy once and it's yours forever" deal, but increasingly a "buy a licence that will grant you access for at least as long as we advertise, usually, might still give you access beyond that point but don't count on it, will almost certainly stop working at some point once we get tired of paying to maintain the dusty old licencing server required to handle reinstalls and regular licence validation checks, and also requires you to sign in with the same account details used to purchase the licence in the first place" type of sordid affair, it's far from reasonable to presume that anyone these days who needs to deal with commercial software will have the ability to divorce themselves entirely from the supplier and be able to maintain use of said software no matter what happens in the future.

Electric cars no more likely to flatten you than the noisy ones, study finds

ChrisC Silver badge

Re: Vehicle weight?

"I would conjecture EV owners (perhaps excluding Tesla owners ;) would be more safety conscious and attentive drivers."

I'd say not - IME (as someone living on the outskirts of a major metropolitan area) between all the minicab drivers that've switched to EVs over the years, along with the growing number of drivers who seem to have opted for EVs in order to get something with even more acceleration performance than any ICE vehicle they'd be able to afford, and are then only too happy to make full use of said performance when out on the roads regardless of whether the conditions are really suitable for doing so, I'd say EVs are at best merely on a par with ICE in terms of driver ability/sensibility.

Micron ditches consumer memory brand Crucial to chase AI riches

ChrisC Silver badge

Re: Fire sales soon?

Yup. When I bought my current laptop I pushed the boat out a little more than I'd otherwise have done, in order to get a decent discrete GPU setup thinking that, now the kids are older, I might actually have the time to get back into PC gaming. And though I have made some use of the GPU for that, the majority of the work I've pushed through it so far has been running a CUDA-optimised H265 video encoder (because one of the other things I've been able to get back into is video editing...) where its ability to display stuff is of no relevance whatsoever.

So describing these things as GPUs is now quite misleading, because they're no longer fixed-function devices intended solely for accelerating real-time graphics output. Adding one to your system is more akin to adding a FPU to systems of old - it's not going to magically speed up *everything* your system does, but for anything which *can* be offloaded to it rather than being left for the CPU to deal with, the benefits can be striking.

ChrisC Silver badge

Re: Damn.

"AliExpress-level", not "AliExpress-supplied" - i.e. I read that as the OP equating the quality of the stuff they were getting back then, with the quality of stuff we might now be more familiar with courtesy of AliExpress, *not* that they were saying they'd actually got it from AliExpress back then.

ChrisC Silver badge

Re: Damn.

Likewise - the Crucial memory selector became my first port of call about that long ago whenever I found myself putting together the bits for a new build PC or to add extra to an existing one, and more recently I'd also been speccing their SATA SSDs as upgrades to older systems where I couldn't justify the additional expense of something like a Samsung Evo, but where even the lower performance offered by a BX or MX was still streets ahead of the spinning rust it was replacing.

But then I thought to myself "when was the last time I actually did any of that?", and I realised that my days of regularly getting my hands dirty (and torn to shreds by randomly placed sharp pieces of metal casing) building new and upgrading old PCs are now mostly behind me - the last few PCs I've introduced into the fleet at home have all been off the shelf laptops, and the last time I did any tweaks to one of the few remaining desktops at home, or been asked to upgrade one owned by a relative, is probably now getting on for 7-8 years ago.

So as sad as this news is, it's more from a nostalgia perspective, and realistically I suspect its effect on me will be negligible to non-existent - yes, if I ever do find myself at some point in the future getting back into system builds/upgrades, then I might lament their no longer being part of the consumer market, but given how infrequently I (and I suspect many in the same sort of position as me) now send any business their way, compared to how much I used to a decade+ ago, it's entirely understandable why Crucial/Micron have taken the decision to withdraw from this sector and focus on the bits which are still money-makers.

Indian government reveals GPS spoofing at eight major airports

ChrisC Silver badge

Re: No Silver Bullet

Number of visible satellites is a hard constraint on achieving a position fix, sure, and 3 is indeed the bare minimum if all you care about is lat/lon (the receiver still has to solve for its own clock bias, so a 3-satellite fix also relies on assuming an altitude). But once you've achieved a fix at all, then *where* those satellites are relative to the receiver can have a significant impact on the quality of that fix.

So realistically, unless you can guarantee that your initial "yeah, you're roughly here, give or take a hundred metres or so" fix can then be refined via other techniques such as DGPS, or unless your application is able to make use of such vague fixes, then intentionally restricting your receiver's ability to see as many satellites as possible isn't necessarily a simpler solution. Nor does it guarantee the required level of performance under every potential threat scenario - whether caused naturally/accidentally (e.g. weather related) or maliciously - which might still prevent even your more tightly focussed receiver from reliably seeing at least those 3 bare-minimum satellites at all times.

Aviation delays ease as airlines complete Airbus software rollback

ChrisC Silver badge

Re: Protection is ideally done by hardware, but can also be done in software

That's one way of doing it, and if all you need to do is force the compiler to re-read the value each time, without caring about any other optimisations it might be applying to the resultant code, then it's probably also the easiest/cleanest way. In some projects I've also used file-level optimisation overrides to prevent the compiler doing *any* optimisation on *any* of the code contained within said file, which is another, rather more brute-force, way of doing it, but which might be necessary for other reasons - e.g. to ensure cycle-accurate timing without the need to actually write the asm yourself...
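As an illustration of a middle ground between those two approaches, the override can also be scoped to a single function rather than a whole file. A sketch only - the attribute used here is a GCC extension (Clang spells it `__attribute__((optnone))`, MSVC uses `#pragma optimize("", off)`), and `reads_match` is a made-up name:

```c
#include <stdbool.h>

/* GCC-specific sketch: rather than a file-level -O0 override, the
   "don't optimise this" request is attached to just this function, so
   the two reads below survive even when the rest of the file is
   built at -O2. */
__attribute__((optimize("O0")))
bool reads_match(const int *p)
{
    int a = *p;  /* without the attribute (or volatile), an optimising */
    int b = *p;  /* compiler is free to fold these into a single load  */
    return a == b;
}
```

The trade-off versus `volatile` is granularity: `volatile` pins down individual accesses while leaving the compiler free to optimise everything around them, whereas this approach de-optimises the whole function body, which is what you want when the *timing* of the surrounding code matters too.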

I'm getting a little long in the tooth to be subjecting my brain to that, though in my younger years I used to relish the chance to get stuck into some truly bare metal coding without even the crutch of a compiler for support. I do still understand the instruction sets of the processor cores I work with these days well enough to know what the compiler is generating, which is a skill any coder working on lower-level systems really needs to have IMO, even if they never use it to actually write so much as a single line of asm in their entire career. But unlike those earlier years - when I genuinely could write better asm than the compilers of the time, some of which would happily generate code that was utterly and hopelessly wrong - I'm happy to concede that a few decades of compiler development, combined with the increasing complexity of the cores themselves, means the compilers these days almost certainly are doing a far better job than most of us could manage.

ChrisC Silver badge

Re: Protection is ideally done by hardware, but can also be done in software

bool f( int x )
{
    volatile static int c1;
    volatile static int c2;

    c1 = x;
    c2 = x;

    return c1 == c2;
}

Speccy clone storms back for Christmas without a shred of Sinclair code

ChrisC Silver badge

Re: squishy "dead flesh" type keyboard

"I am puzzled by people's nostalgia for the rubber-keyed Spectrum"

Perhaps because, aside from all the obvious reasons to have nostalgia for it - maybe it was your first ever computer, maybe it had the games you really loved to play as a kid etc - despite all the scorn heaped upon the keyboard, it really wasn't all that bad when compared against the outwardly far better looking keyboards on its rivals.

As a pre-teen computer geek with correspondingly pre-teen sized hands, spending too much time typing on something like a BBC or C64 keyboard felt like a chore - initially yes, a very much nicer tactile sensation, but after a while increasingly tiring due to the extra effort needed not only to move my fingers into the correct positions, but then the amount of movement and pressure required to activate the keys. In contrast, the Speccy, with its rather more compact and bijou keyboard arrangement that was a better fit to my hand size, and with keys requiring rather less exertion to depress far enough to elicit a response, was quite nice to use over extended coding sessions. And then there was the incessant clatter those other keyboards generated - typing on the Speccy was a somewhat more serene and peaceful affair, which also meant it was easier to get away with in the small hours when you were supposed to be fast asleep...

Would I have wanted to use one for serious word-bashing? No. But honestly, I'd say exactly the same about those other keyboards from that era - until the 16/32-bit machines started to make inroads into the home computing scene here in the UK, I don't remember there being *any* machines I had access to where I'd consider the keyboard to be truly decent and worthy of praise, and even after that point there was still some notable variation in keyboard quality - I loved the Amiga and Archimedes keyboards, was so-so on the Mac ones (Mac II being a bit better than the Classic, IIRC), never got on with the ST one, and found the whole PC/Unix/other OS workstation experience I had in my early university years to be utterly inconsistent to the point of sheer frustration.

And don't even get me started on the ergonomic disasters that were some of those similarly early-era mice...

Tiny tweak for Pi OS, big makeover for the Imager

ChrisC Silver badge

At this point you're into the realm of diminishing returns and the fundamental resolving power of the Mk.1 eyeball though, so it's less surprising you can't discern any difference. I'd be rather more surprised if you still made the same statement if one of those screens was being fed from a VGA source, whilst the other was being fed from a 720P (or even just an SVGA) source.

ChrisC Silver badge

Yup, although modern LCDs utilising all the latest tricks are now at or beyond that level of intensity, depending on how much you're willing to spend, and even entry level displays are now taking advantage of the trickle-down effect of formerly high-end tech. Our current LCD generates images that I'd say are on a par with our old plasma in this regard, whereas the first LCD we bought to replace that old monster was definitely lacking in the intensity/punchiness of the images it could generate, though it more than made up for that with all of the extra clarity we gained in the switch from 720P to 2160P...

ChrisC Silver badge

Only in the warmer months of the year - once the temperature started dropping, our old plasma TV took on a dual role both as a display device and as a radiator, helping to reduce the load on the actual heating system for that part of the house... And in comparison to the alternatives that existed at the time for achieving big screen output, they really weren't that bad - it's only when you stack them up against a modern energy-sipping display optimised to within a gnat's doo-dah to comply with the latest energy saving standards, that plasmas seem like power-gobbling monsters.

ChrisC Silver badge

"which is quite a lot _lower_ than VGA resolution"

It's quite a bit lower than the sort of *DPI values* you might expect for a VGA frame rendered on the sorts of screen sizes more commonly associated with display of VGA content, but that's not a like for like comparison - if you were to take a VGA frame and pipe it through the VGA connector that's likely to exist on the back of said plasma display, then it ought to be fairly immediately obvious how much less detail can be resolved within that frame compared with a 720 (let alone a 1080) frame on the same display.

Similarly, if you were to pipe your 720/1080 frame into a HD-capable display of the same physical size as your typical VGA display, then the DPI value would be correspondingly higher than for VGA, and it'd now be obvious you were looking at a clearer, sharper, more detailed image than if you were feeding that display with a VGA signal.

So as far as determining whether something is SD, HD, UHD etc, you should be looking solely at the dimensions of each frame in terms of horizontal and vertical pixel count, and ignoring how big or small each pixel is once rendered on whatever size display that frame has been sent to. In that context, your TV *is* therefore HD.

FCC sounds alarm after emergency tones turned into potty-mouthed radio takeover

ChrisC Silver badge

Re: Its like a song with sirens

These days, having a TV show featuring a Ring doorbell can result in much the same type of response...

Microsoft's fix for slow File Explorer: load it before you need it

ChrisC Silver badge

Re: Bag of shite

If I had a pound/euro/dollar/etc for every time I found myself sat in front of a modern Windows PC, thinking "why is this damn thing, with orders of magnitude more raw processing power, memory, drive transfer rates etc than my old 80s-era Amiga, still SO much slower at performing simple, core tasks - starting an application, displaying file properties etc", with bonus payments for every time I was *also* then left wondering "why can't it multitask as efficiently as Workbench used to manage", then I'd have been able to comfortably retire some years ago...

BOFH: You know something's up when the suits want to spend money

ChrisC Silver badge

Re: "colored crayon office"

Methinks "u" have missed the point the previous commenter was trying to make there...

MS Task Manager turns 30: Creator reveals how a 'very Unixy impulse' endured in Windows

ChrisC Silver badge

Yeah, I do fire up a Linux VM from time to time to do some cross-platform testing work. However, as corporate policy dictates that our work PCs run W11, and as work makes up the lion's share of my weekly time in front of a PC, then whatever benefits I'd get from switching my own PC away from 11 would be minimal in their own right, and would then bring the added joy of incurring context switching every time I swapped between work and non-work. On top of that, as expecting the rest of the family to migrate away from Windows would be asking a lot, I still need to remain sufficiently experienced in administering Windows systems to be able to deal with any problems on theirs, so it feels as if leaving mine running Windows as well is the best compromise at present.

Maybe once I retire I'll be more inclined to make the switch at home, though as that's still a good few years away, maybe it'll be whatever crazy ideas MS decide to impose on us in the meantime which tips the balance. Or who knows, maybe someone in Windows HQ will read these articles, realise just how far from ideal Windows currently is, and set the wheels in motion to return us to the good old days. Well, we have to at least dream, even if it's a hopelessly optimistic dream, right?

ChrisC Silver badge

Re: It's the habit of assuming that the user is trying to accomplish some real work

As someone who has to suffer W11 in a corporate environment, I don't get interrupted by adverts, but I do regularly get interrupted by the even more annoying way that Office is now designed to treat even a few femtoseconds-worth of inactivity as a sign you're no longer *really* focused on whatever you were doing in Word, Excel, Outlook, Teams etc. etc., and that it's therefore *entirely* acceptable for the Office Update system to suddenly, without any forewarning, opportunities to defer to a later point in time, or even any sort of feedback to at least let you know what it's doing, close down the app you were literally just using in order to apply whatever update it's deemed so utterly essential to the survival of the known universe, that it simply HAS to be applied without any further ado.

Honestly, given a choice between having to watch an ad - where I'd at least *know* the system was deliberately preventing me from doing anything for that period of time - and having to sit twiddling my thumbs for an entirely unknown period, unsure whether Word etc. has closed down to perform an update or for some other reason, waiting patiently with no idea how long the wait will be before things start up again and I can gather my train of thought and resume whatever I was doing before the system so rudely interrupted my work, then I'd choose ads. And as someone who utterly despises online advertising and quashes it wherever possible, that should tell you all you need to know about how utterly and completely user-hostile I now consider Office Update to be. How anyone at MS could look at the implementation and go "yeah, that's a perfectly acceptable way for us to treat users" beggars belief, yet here we are.

ChrisC Silver badge

"It's the habit of assuming that the user is trying to accomplish some real work."

And there, so neatly encapsulated in that one sentence, is the gulf of difference between not only MS of old and today, but of most software peddlers. Back in the good old days, software generally just let users do what they needed to do provided it was within the fundamental capabilities of the software itself, whereas these days, no matter how capable the software now is, I find myself now increasingly battling with it to let me do what I *know* it can do if only it'd expose that specific sub-mode of the UI to let me do it, or if only it'd stop trying to second-guess what I was trying to do and constantly get in the way by thinking it knows better than me.

This article, as well as the one from yesterday, really resonated with me. As a child of the 70s/80s, riding the wave of the home computing boom in the UK starting with the early 8-bit systems (ZX81, Spectrum, Beeb) then through the 16/32-bit era (Amiga, ST, Archimedes, Mac Classic) and into those early years of PCs moving out of the corporate environment into the home (Win3.1 onwards, plus a bit of OS/2 Warp), I've spent decades gaining experience of using a wide variety of systems all of which had at their heart a fundamental respect for the person sat in front of them, and a clear design ethos based around facilitating their ability to use the systems in whichever ways they so desired, no matter what the original intent of the design team might have been. Hence the myriad of games, demos and other bits of software which constantly pushed those systems to do things that impressed even the designers.

Today, sitting in front of a W10/11 PC with infinitely more raw potential than anything I've ever had access to before, yet which feels utterly stifling to use, I find myself increasingly longing for a return to those genuinely good old days when computing was still fun and exciting, and reading the words of someone who was a part of those times, is in a position to speak with some authority, and seems to have much the same opinion of the state of affairs these days, is both refreshing and depressing in equal measure, and I wonder how much more pain the average PC user will be willing to endure before there's a collective scream of "enough is ENOUGH!" and the pushback begins.

Tablet market stalls because there’s not much new worth buying

ChrisC Silver badge

Re: Doing Art on a Tablet PC

Have you tried any tablets that integrate active digitizers to enable use of dedicated pens, or are you just basing this opinion solely on the woeful passive stylus performance fundamentally limited by the capabilities of the standard touchscreen itself?

Based on your reasoning and dismissive attitude towards using tablets in this way, I rather suspect the latter, because the former is also the underlying tech for graphics tablets which very much ARE used by professional artists, and even in their potentially more performance-compromised tablet implementations there is still a world of difference in performance between an active digitizer pen and a passive touchscreen stylus.

The CAPITAL LETTERS trick that helped merge Windows 95 into NT

ChrisC Silver badge

Even with source control systems, being able to simply search the source code within your IDE/editor and almost instantly find every instance of an original vs a fixed version of a macro, function call etc, can be invaluable.

Windows 11 update breaks localhost, prompting mass uninstall workaround

ChrisC Silver badge

Re: "Microsoft's quality control department"

Naah, because painting by numbers implies the creation of something which doesn't look polished but is at least generally recognisable as the intended thing, contains a variety of contrasting colours, and is usually considered to be a fun activity to be a part of. The latest few iterations of the Windows UI, not so much to any of those points...