* Posts by Elledan

109 posts • joined 3 Mar 2017


When one open-source package riddled with vulns pulls in dozens of others, what's a dev to do?


I had the joyful experience of working on an embedded JS project (a commercial one) that involved a significant amount of NodeJS and NPM fluff.

You don't understand how bad NPM is until you have glanced at the node_modules folder, noticed it's 1+ GB for a modestly sized project, and found that deleting said folder on macOS takes many, many seconds, SSD be darned.

And don't get me started on NodeJS. Using NodeJS for 'embedded' means having a literal Hello World app sucking down 70 MB with just a basic comms framework in place that does some JSON parsing and writing.

The effect of all that has been that I now refuse to do anything 'embedded' that doesn't at the very least involve a significant amount of C, C++ or Ada. Even TypeScript won't lure me back into the wonderful world of NodeJS.

Skype for Windows 10 and Skype for Desktop duke it out: Only Electron left standing


The future is a website?

Any bets on how long it'll take for Windows 10 to turn into just an Edge instance that logs into the Microsoft Cloud, like some kind of Chromebook?

Clearly TypeScript is the language to learn if one wants to make it at MSFT.

Splunk to junk masters and slaves once a committee figures out replacements


Grounds for firing?

Does using any of these doubleplusungood words in company communications or source code/comments constitute a breach of contract that would result in a person being immediately fired from said company?

Are we going to see people rejected from open source projects because they felt that censoring said doubleplusungood words is rather daft, while the Thought Police, whose function has by then been enshrined in the Code of Conduct, lack any sense of humour or capacity for rational thought?

Also, shame about the lack of action to fix modern day slavery, inequality in society (how many people living below the poverty line again?) and human rights abuses by countless governments. But you got those doubleplusungood words censored.

Gawd, I loathe SJWs.

Amazon declined to sell a book so Elon Musk called for it to be broken up


Amazon is a shady bookstore

I'm thinking that it would be awesome if Amazon were more selective about the books it offers for sale. The number of loony conspiracy books (ebook or printed-on-demand) almost drowns out the legitimate books on certain topics. And when you pop over to the Korean or Japanese book sections to see what's new, you find yourself wading through softcore porn rags, even with the 'adult content' filter enabled.

But I guess good on this nut for getting another conspiracy book on Amazon.

Please do support your local bookstore.

Microsoft's carefully crafted Surfaces are having trouble with its carefully crafted Windows 10 May 2020 Update


Rock, meet hard place.

So on the one hand one can elect to have one's Windows 10 installation regularly b0rked by official updates, and on the other hand there is the scaremongering about not 'upgrading' from Windows 7.

Maybe unsupported Windows OSes are the safest ones at this point...

Nice wallpaper you've got there. It would be a shame if it bricked your phone

IT Angle

The cardinal sin of programming

As software development courses like to phrase it, example code is usually devoid of all the error and bounds checking that would be required in production code.
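The reported failure boiled down to a luminance histogram whose computed index could land one past the end of its array. A sketch of that bug class (illustrative Python, not Android's actual code; the trigger values here are made up to force the overflow):

```python
# Illustrative sketch of the bug class: a 256-bucket histogram indexed by a
# computed luminance value, with no bounds check on the index.

def luma(r, g, b):
    # Rec. 601 luma; safe for clamped sRGB, but a colour-space conversion
    # upstream can hand us components fractionally above 255
    return round(0.299 * r + 0.587 * g + 0.114 * b)

hist = [0] * 256

def count_pixel_unsafe(r, g, b):
    hist[luma(r, g, b)] += 1                    # IndexError when luma() hits 256

def count_pixel_safe(r, g, b):
    hist[max(0, min(255, luma(r, g, b)))] += 1  # one line of bounds checking

count_pixel_safe(255.9, 255.9, 255.9)           # survives
try:
    count_pixel_unsafe(255.9, 255.9, 255.9)
except IndexError:
    print("bootloop")
```

One clamp would have turned a brick into a slightly miscounted histogram bucket.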

Also, does this mean that nobody at Google, in the entire history of Android, ever tried to test this 'app' with an RGB image? For a core service whose functioning determines whether a phone works or is just a useless brick stuck in an endless reboot loop?

That seems hard to swallow, but then again, we are talking about software development here...

Microsoft announces official Windows package manager. 'Not a package manager' users snap back


Re: Pacman

As an MSYS2 and formerly mostly Ubuntu user, I can say that using the former has made my introduction to Manjaro infinitely easier and more pleasant. Takes a bit to learn the pacman vocabulary, but now I can confidently -Syu and -Ss my way to victory :)
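For the curious, the pacman vocabulary mentioned boils down to a handful of commands (shown as one would run them in an MSYS2 or Manjaro terminal; the package names are just examples):

```shell
# Refresh the package databases and upgrade everything installed
pacman -Syu

# Search the repositories by name/description
pacman -Ss gdb

# Install a package (MSYS2 prefixes native toolchain packages like this)
pacman -S mingw-w64-x86_64-gcc

# Remove a package together with its now-unneeded dependencies
pacman -Rs gdb
```

The same muscle memory then works unchanged on any Arch-derived Linux, which is rather the point.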

Makes me really glad that MSYS2 went with a sane, existing package manager instead of rolling their own.

*eyes Microsoft askance*



As a software developer who mostly uses Windows, I have found the pacman package manager (the same one as in Arch/Manjaro Linux) to be exceedingly useful. Together with the MSYS2 environment, one gets a Linux-like environment and a package manager that makes installing and managing libraries, tools and other dependencies as easy as it is on Linux.

As a bonus, this approach allows me to use the same shell scripts and build tools across Windows, Linux, BSD and other platforms with minimal effort. Not something that can be said for using the prescribed Microsoft Way (tm), I'm sure.

Google rolls out pro-privacy DNS-over-HTTPS support in Chrome 83... with a handy kill switch for corporate IT


DoH is still stupid

The thing is that DoH doesn't add anything that DoT doesn't already do, while also making network security (as noted) impossible. How would you distinguish some spyware sneaking its DNS queries over HTTPS from any other HTTPS traffic, after all?

DoH also doesn't solve the most important issues: validating that a DNS record one obtained from the DNS server is genuine (which requires DNSSEC), and keeping the details of one's DNS query from being shouted across the entire DNS hierarchy. The latter requires QNAME minimisation, which is also not a part of DoH.
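The QNAME minimisation point can be sketched as a toy model (not a real resolver) of what each server in the delegation chain gets to see:

```python
# Toy model: which part of the query name each DNS server in the delegation
# chain sees. Illustrative only, not a real resolver.

def queries_without_minimisation(name, chain):
    # classic behaviour: every server sees the full QNAME
    return [(server, name) for server in chain]

def queries_with_minimisation(name, chain):
    # RFC 7816 behaviour: each server only sees the labels it needs
    labels = name.rstrip(".").split(".")
    out = []
    for depth, server in enumerate(chain, start=1):
        out.append((server, ".".join(labels[-depth:]) + "."))
    return out

chain = ["root server", ".com TLD server", "example.com authoritative"]
for server, qname in queries_with_minimisation("private.example.com.", chain):
    print(f"{server} sees: {qname}")
```

Without minimisation, the root and TLD servers both learn the full `private.example.com.` name; with it, each one only learns the next label down.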

At best, DoH is a red herring for internet security. At worst it's a trojan that enables the destruction of one's network and system security and privacy.

Rust marks five years since its 1.0 release: The long and winding road actually works


From a C++ & Ada perspective, Rust fails in multiple areas:

* It weakens strong typing by adding type inference (much like the often abused 'auto' keyword in C++).

* It prefers obscure symbol series over clear phrasing in English words.

* It allows for many ways to accomplish the same thing.

* Its syntax is non-obvious and does nothing to prevent logic errors.

* It crusades against OOP, replacing it with an alternative that is much harder to learn and to use correctly.

The fact that the Rust developers were at no point inspired by anything in Ada/SPARK, which is unquestionably at this point the pinnacle of safe and reliable programming, should speak volumes. Maybe reading the Steelman requirements before Mozilla's devs embarked on throwing out the baby with the bathwater might have been helpful.

Better late than never... Google Chrome to kill off 'tiny' number of mobile web ads that gobble battery, CPU power


Re: I went back to Web 1.0

I still feel bad for having a 100+ kB JPEG header image on my personal website. It probably bloats the first page load to a grand total of 160 kB or so, including the bit of CSS. Not a single line of JavaScript, of course. Most of the simple interactivity one would reach for JS for can be done with CSS as well, which saves a lot of space and CPU cycles :)


I went back to Web 1.0

Back around 2000 when I was fiddling about with web development, there was no real use of JavaScript worth speaking of. We web designers were in the midst of moving to CSS-based styling (and layout, sanity permitting), and any ads that were around were text- or image-based. Most of us even optimised our images, trying to shave off a few more bytes so that a page would load half a second faster over a 56k dial-up connection.

Move forward twenty years, and optimising for file size, as well as for overall accessibility (websites that break completely without JS enabled, and that do not work well with screen readers etc. when it is), has been tossed out of the window. Want to browse the web with a dial-up modem? At a blistering 2-4 kB/s, a 2 MB page takes over eleven minutes at an average speed of 3 kB/s. Even optimistically, with most assets cached and only 100 kB or so of fresh data, that's still over half a minute of waiting for a single page.
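Checking the arithmetic (a sketch, assuming ~3 kB/s of real throughput on a 56k line and an optimistic 100 kB of uncached data):

```python
# Back-of-the-envelope dial-up load times.
page_kb = 2048    # a typical 2 MB page today
fresh_kb = 100    # optimistic case: most assets already cached
speed_kb_s = 3    # realistic 56k throughput after overhead

print(f"cold load:   {page_kb / speed_kb_s:.0f} s (~{page_kb / speed_kb_s / 60:.0f} min)")
print(f"cached load: {fresh_kb / speed_kb_s:.0f} s")
```

A cold load lands around eleven minutes; even the best case is half a minute per page.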

But who is using 56k dial-up these days, you ask? Roughly 2 million Americans, for one: https://www.dailydot.com/debug/dvd-rental-windows-3-aol-2017/

Even though I have had access to a fast broadband connection for many years now, I still do not appreciate the way that pages have bloated up, nor the myriad JavaScript pop-ups and JavaScript-based ads. Then there's the JavaScript-based reloading and changing of the page while it loads, forcing the browser to re-render it multiple times, which produces the fun 'skipping around' effect one sees on pages these days.

These days I am using a non-commercial browser (Pale Moon), with JavaScript enabled per website address (NoScript) and ads fully blocked (uBlock Origin). I have tried to go back to just using the web again like I used to in the past, even with just a bit of ad-blocking to weed out the worst offenders, but the degradation in overall usability is shocking. Ironically, using glitched-out websites due to no JS and missing stacks of ads making the layout collapse is still a better user experience than the other way around.

Really makes one wonder where things will go from here. Moving the rendering engine out of the browser and into JavaScript, so that it is 'always up to date'?

Node.js creator delivers Deno 1.0, a new runtime that fixes 'design mistakes in Node'


JavaScript: a single-threaded, prototype-based language whose only numeric type is the IEEE-754 64-bit float.

TypeScript: JavaScript, with the prototype part (optionally) replaced with (some) typing.

Rust: like C++11, but with OOP removed and symbol soup added. Loved by folk who thought Brainf*ck was too easy to read.

C++: running the show along with C.

Visual Studio Code 1.45 released: Binary custom editors and 'unbiased Notebook solution' in the works


I regularly write PHP in Notepad++, but haven't felt any need for anything beyond syntax highlighting and auto-indent. Around 2000 I did use a number of PHP-oriented IDEs, but found myself never really using their features beyond said auto-indent and syntax highlighting.

On Linux I use Vim, which also has an extensive collection of extensions. As a bonus, Vim works on just about any platform as well, and even without an X server.


But why use the VS Code website for this, instead of the VS application? What about any of the IntelliJ IDEs? They're also highly extensible. I have used IntelliJ and noticed the same functionality built in there.

One could also use Vim in this case, with the massive amount of extensions for it.


Re: They're for ctrl+space whores

I'm a senior C++ developer by trade, and that's not a scenario that I recognise. After enjoying auto-complete and such 'features' in VS Pro, XCode, Netbeans, etc., the only language where I actually appreciate that feature is in Java. For C, C++ (and increasingly Ada), the time I could conceivably save by something like Intellisense is negligible, and possibly negative.

My usual workflow consists of a few reference pages open, Notepad++ with the code, and a couple of terminal windows for compilation and debugging. I have found that anything beyond syntax highlighting and auto-indent merely gets in the way.

But if I wanted to use Intellisense, why wouldn't I just use Visual Studio, instead of this VS Code website?


As a daily user of tricked-out instances of Notepad++ and Vim, VS Code confuses me.

I know what it is, I know that people use it, but what I cannot figure out is the 'why'. I have used all kinds of IDEs, from VS Pro to XCode, Eclipse and IntelliJ, and as mentioned have settled on mostly NP++ and Vim. I do not see where VS Code fits in all of this. It's a bit like VS, but with 99% of the features and performance missing. It lacks basic features that Notepad++ (and Crimson Editor before it, in the 90s) already had.

So again I ask, why is it that people use VS Code? What species of developer does it target who wouldn't be happier with anything else?

If it feels like the software world is held together by string and a prayer, we don't blame you: Facebook SDK snafu breaks top iOS apps

IT Angle

Paranoia is a good thing

The main rule when dealing with user input in one's application has always been to never trust said user. Expect the worst kind of mangled, hopelessly incorrect data. Ergo one sanitises incoming data and bails out early if something seems fishy. With third-party libraries and code it's no different. Even for one's own code and libraries, checking input data (whether function parameters or the return values of some method call) has to be standard, not optional.
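That bail-out-early habit can be sketched in a few lines (hypothetical function and field names, Python for brevity):

```python
# A minimal sketch of the "never trust input" rule: validate at the boundary
# and fail early, instead of letting bad data cascade through the app.
# The function and field names here are made up for illustration.

def parse_settings(payload):
    if not isinstance(payload, dict):
        raise ValueError("settings payload must be a mapping")
    timeout = payload.get("timeout_s")
    if isinstance(timeout, bool) or not isinstance(timeout, (int, float)) or timeout <= 0:
        raise ValueError("timeout_s must be a positive number")
    return {"timeout_s": float(timeout)}

print(parse_settings({"timeout_s": 30}))       # accepted and normalised
try:
    parse_settings({"timeout_s": "soon"})      # rejected close to the source
except ValueError as err:
    print("rejected:", err)
```

The point is where the error surfaces: at the boundary, with a clear message, rather than three layers deeper as a crash in somebody else's SDK.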

Clearly Facebook's library did not bother checking its input, which then cascaded into taking down the rest of the application with it. Of course, with JavaScript and other dynamically typed languages, a lot of static type validation is tossed out of the window, with things seemingly working fine until the runtime hits a type conversion that is impossible and throws an exception.

With a language like Java, where any object reference can be null, one technically has to validate every incoming parameter against null. Since nobody ever does this, NullPointerExceptions are still super-common in Java code. With dynamically typed languages (like JS and Python), the really fun bugs only show themselves when a stacktrace is barfed at you (Python) or the app fails silently (JavaScript) while the code runs in production (because testing & staging are for losers).

Does anyone ever really trust code someone else wrote, or worse: code one wrote themselves?

In Rust we trust? Yes, but we want better tools and wider usage, say devs


Re: Realistic business scenario

Considering how heavily Rust leans on type inference (giving it an almost Python-like feel), I'd not want to use it over C++. One can write the types out explicitly in Rust, but doing so is completely optional and requires a significant amount of extra code, which makes the inferred default an error-prone business.

This is in addition to the symbol soup Rust introduced instead of normal English phrases (as in Ada), and a highly complicated 'alternative' to OOP and classes with a near-vertical learning curve, inviting even more errors in one's code.

Basically, just use Python or C++ (or Ada). But not Rust. Maybe use Go, if you really need to.


Re: Thoughts from an old fogey

So you're making the argument that everybody should be using Go rather than Rust?

Because by those metrics, Go is a far better choice than Rust, providing memory safety and overall restricting what one can do. See for yourself: https://en.wikipedia.org/wiki/Go_(programming_language)


Got tools?

As a senior C++ developer and part-time Ada (among others) developer, my 'IDE' is either Notepad++ or Vim, and I use gdb, Valgrind, etc. on the CLI.

Perhaps ironically, I used to use IDEs heavily in the past, including Visual Studio 2010 Pro. Over the years I have moved away from all of them, as I no longer saw the benefit of using IDEs.

Maybe I'm just an odd-ball, but I see IDEs more like something that Java, Kotlin, JavaScript and Python developers would demand.

As for Rust itself... I looked at it, poked at it a bit, but to me it feels like someone mashed Python and C++ together, along with a few other weird choices (no OOP), while missing every single reason why Ada is the best and safest language out there (hence its use in avionics, etc.).

We're number two! Microsoft's Edge browser slips past Firefox in latest set of NetMarketShare figures


Re: Why the decline of Firefox?

I have used Firefox since before it was called Firefox, yet I gave up on Firefox a few years ago because I didn't care for the direction it was heading.

I now use Pale Moon as my main browser, which is an evolution of 'old-school' Firefox, keeping the powerful XUL extensions and NPAPI support, while cutting the bloat that nobody was using. Because Pale Moon doesn't use the crippled WebExtensions, it can use the more powerful NoScript and uBlock Origin addons, as XUL allows for direct access to a lot of the browser's internals.

Much like many others who are using Pale Moon, Waterfox and similar Fx spin-offs, we still like Firefox, just not this 'Firefox' that Mozilla tries to pawn off as the 'real deal' nowadays. If it weren't for the fact that Firefox doesn't use Chromium (yet), one might as well call it just another Chromium-based browser. It follows the same APIs as Chrome, after all, to the point where one can install Chrome add-ons in Firefox and have them work without issues.

So yeah, that's why I'm no longer using Mozilla's Firefox.

NASA mulls restoring Saturn V to service as SLS delays and costs mount



The best April 1st jokes are the ones which bear a bitter grain of truth :)

Microsoft drops a seemingly innocuous Windows Insider build, teases the future


Welcome to the world of Agile.

There was this big fuss about Microsoft shifting from the antiquated 'waterfall' development model to the hip & modern 'Agile/Scrum' method. What this effectively meant, however, was firing the QA department, no longer bothering with releases, and instead having developers churn out half-tested code in chunks that'd be spat out at the 'Fast Ring', the 'Slow Ring' and the 'Please Don't Hurt Me Ring', AKA business customers.

Which means that if you use Windows 10 and aren't being paid by Microsoft, you're effectively an unpaid Microsoft employee. Congrats :)


Re: Start Menu flakier than ever

I have been using Classic Start Menu since Windows 7 (2009) on every system that I own, to keep the Windows 2000-style start menu. Occasionally I find myself confronted with the 'new' start menu attempts in Windows 7, 8.1 and the dumpster fire that is the Windows 10 'start menu'.

I still fail to see why the Windows 2000 start menu needed such a big change, especially the hiding of applications and of easy access to settings dialogues.

Maybe you're supposed to just ask Cortana to find everything for you these days?

Watching you, with a Vue to a Kill: Wikimedia developers dismiss React for JavaScript makeover despite complaints


Re: Wikimedia uses JavaScript?

I loathe sites which just tell you 'You must enable JavaScript to use this site' more than I used to kinda-sorta loathe Flash-only sites, or Java applets for that matter.


Wikimedia uses JavaScript?

And here I am, using Wikipedia et al. like a chump with NoScript nuking any potential JS and not noticing any loss of functionality.

Wikipedia is text with images, what use does JavaScript have there? I'm genuinely curious as to what I may be missing out on.

Microsoft's GitHub absorbs NPM into its code-hosting empire: JavaScript library vault used by 12 million devs now under Redmond's roof


Re: TypeScript + NPM = ?

Interesting. It's been a few years since I last used TypeScript (and tried to convince my colleagues to use it over plain JS, with mixed success). Guess that this would make TypeScript even more of a drop-in solution than it used to be.


TypeScript + NPM = ?

Considering that GitHub is owned by Microsoft, and Microsoft also created the TypeScript language (kind of like JS++), it will be interesting to see whether this means that the NodeJS ecosystem moves closer to 'NodeTS'.

Microsoft frees Windows Subsystem for Linux 2 from the shackles of, er, Windows?


WSL versus running Linux in a VM

The main use case which I had for WSL was while doing cross-platform development, as well as embedded development using a Linux-only build chain. As my preferred OS is Windows (because of its coherent set of APIs and singular desktop environment), I had previously used virtual machines running $Linux for this development. After switching to a new work laptop with Windows 10, I decided to use WSL instead of VMs. With the device-passthrough (for USB serial interfaces, etc.) it saved me the trouble of firing up a VM.

To me that is a nice use of WSL. Yet it comes with the caveat of only being able to run a single $Linux. When I look at the list of VMs that I have in VirtualBox right now, it covers the whole gamut of $Linux, from $Debian (Mint, Ubuntu, Raspbian, straight Debian) to $Arch (mostly Manjaro) to more exotic $Linux like Alpine Linux. One may also have RedHat/Fedora/CentOS in that list.

The thing is that $Linux ≠ $Linux unless it is the exact same distribution, or at the very least the same lineage. Not every $Linux uses the same start-up scripts and service manager, the same standard shell, the same standard library, or the same set of default libraries. Many use a different package manager, or have other quirks that mean one has to test on that $Linux and not another.
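A minimal sketch of that divergence, using a few well-known per-family differences (simplified; real distributions vary further still):

```python
# Simplified table of why "tested on Linux" really means "tested on one $Linux".
FAMILIES = {
    "debian": {"pkg": "apt",    "libc": "glibc", "init": "systemd"},
    "arch":   {"pkg": "pacman", "libc": "glibc", "init": "systemd"},
    "alpine": {"pkg": "apk",    "libc": "musl",  "init": "OpenRC"},
}

def install_cmd(family, package):
    # Even the simplest provisioning step differs per family
    return {
        "apt":    f"apt install {package}",
        "pacman": f"pacman -S {package}",
        "apk":    f"apk add {package}",
    }[FAMILIES[family]["pkg"]]

for fam in FAMILIES:
    print(fam, "->", install_cmd(fam, "gdb"))
```

And that's before one gets to musl-vs-glibc behavioural differences, which no amount of scripting papers over.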

In my experience, this is where WSL (2) should really be treated as 'Microsoft Linux', and it shows clearly where the limits lie. For my uses as an (embedded) developer, the VM-based approach suits me fine, giving me the full power of Windows (as host), Linux and whatever other OS I may install in its own full-featured virtual system.

I imagine that this 'Microsoft Linux' may fit the use cases for a number of folk out there, however.

Boots on Moon? Well, the boot part is right: Audit of NASA's Space Launch System reveals more delays, cost overruns


White Eleph^WRocket

Thing about the SLS is that nobody with a clue would really expect that thing ever to fly. The 'SLS program' essentially raided museums for Shuttle-era hardware and duct-taped it to some other bits and pieces still lying around from the days when NASA and its contractors actually did something with rockets. Even the development plan (the 'blocks') looks ridiculous in so many ways when one keeps in mind that they're essentially using 1970s-era technology.

When one looks at nations like China and India, along with private companies such as SpaceX and Blue Origin developing new engines and coherent designs that already include or will include some kind of reusability in the near future, the difference couldn't be more stark.

My prediction is that there will never be an SLS launch, at least not with actual humans in the capsule. The program will drag out for another 10-20 years until it's taken out back because the Senate Lubrication System will have fulfilled its purpose. A purpose that never included space travel. This might actually be the thing that finishes off NASA for good, after decades of budget cuts.

Maybe I'm just overly skeptical, though.

Fancy that: Hacking airliner systems doesn't make them magically fall out of the sky


The human factor

Since commercial air traffic became a thing in the late 1940s and 50s, the industry has had to deal with countless teething issues. Early planes didn't have redundant control systems, or sensors to measure almost every conceivable parameter, whether for consumption by the pilot or for the FDR. Pilots also didn't have to file a flight plan or stick to a specific route. That led to the infamous 1956 mid-air collision above the Grand Canyon, when a pilot decided to give his passengers a good look at this natural marvel.

Over the past decades, air travel has become immensely safer. Every incident and every crash led to improvements and more knowledge that improved airplanes, radar systems and so much more. Some lessons were hard-learned, such as TWA800 and the countless crashes due to sudden downdrafts.

What hasn't changed much, however, is the human inside the cockpit. One difference is that today's generations of pilots are unlikely to have served in an air force, unlike their predecessors, who got much of their experience flying above the battlefields of WWII, Korea and Vietnam. Instead it mostly comes down to the training these new pilots have received, which unfortunately doesn't always suffice. That was the lesson of AF447: the PF (pilot flying), confused in the dark over the Atlantic Ocean after the flight computer handed back control because frozen pitot tubes were returning conflicting readings, yanked on the controls a few times, put the airplane into a left-banking, oscillating turn, nearly stalled it several times, grew confused by the stall warnings, and finally flew the airplane into a proper stall that dropped it out of the sky and into the ocean.

Such cases of pilots managing to wreck perfectly fine airplanes for no good reason sadly make up a large part of today's crashes. In large part this seems to be due to the pilot becoming confused and losing their sense of orientation, not trusting the instruments, or becoming overly focused on a single, often irrelevant detail while ignoring the issue that will kill them in a few moments. Like the captain who insisted on debugging the landing gear indicator lights while circling the airport, until his plane ran out of fuel.

Systems like TCAS, ground and stall warning and ILS are there primarily to assist the pilot, but they're there as suggestions, not as commandments, and it has been decided that the pilot ultimately remains in control. As modern day crashes and incidents show, this is both a positive and a negative thing. Unfortunately, both human and machine are still flawed in the end.

There has been a lot of research by NASA and others on cockpit behaviour, which has led to improved use of checklists and much more. Everything from the social interaction between captain and co-pilot to adherence to protocol and the handling of unexpected events can become a single link in the chain that leads to an accident.

Here I find a phrase that's often uttered in professional pilot circles quite useful when thinking about the right thing to do in a situation as a pilot: 'How would this look in the NTSB report?'

Famed Apple analyst chances his Arm-based Macs that Apple kit will land next year


Where are the benchmarks?

It's easy to make claims about 'ARM having caught up with Intel CPUs', but color me a sepia shade of skeptical until this is backed up with some solid evidence. Take the Raspberry Pi 4 (with its zippy quad Cortex-A72 cores): it is promoted as a 'desktop replacement', but as Wired notes, that would be a 2012-era desktop: https://www.wired.com/review/raspberry-pi-4/

Until there are some solid apples-to-apples (excuse the pun) benchmarks that show that Apple's ARM SoCs come even remotely close to a budget AMD APU in a fair comparison, I'll take this analyst's claims with a 1 kg bag of the finest skeptic's salt.

Firefox now defaults to DNS-over-HTTPS for US netizens and some are dischuffed about this


The answer is obvious: use DNSSEC and DNS-over-TLS (DoT), with the latter having its own port instead of trying to sneak along with other HTTPS traffic.

Client-side DNSSEC validation ensures that the DNS record is genuine. DoT ensures that nobody can look at those queries (for whatever reason...).

DoH is an unnecessary complication of DoT that adds a lot more overhead. Having random apps on one's system dodge local and network-level security by pretending to be plain HTTPS traffic is a security nightmare. And implementing DoH on embedded devices is hard to justify, unless adding an entire HTTP stack to such a device as dependency overhead (and security risk) can be considered a good idea by anyone.
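The filtering point boils down to port numbers (DoT's dedicated port 853 comes from RFC 7858; DoH's deliberate reuse of 443 from RFC 8484). A trivial sketch:

```python
# Why a firewall can police DoT but not DoH: the port tells the story.
DNS_TRANSPORTS = {
    "Do53": {"port": 53,  "encrypted": False},
    "DoT":  {"port": 853, "encrypted": True},   # RFC 7858: dedicated port
    "DoH":  {"port": 443, "encrypted": True},   # RFC 8484: blends into HTTPS
}

def firewall_can_single_out(name):
    # Distinguishable only if DNS traffic doesn't share a port with the web
    return DNS_TRANSPORTS[name]["port"] != 443

print("DoT:", firewall_can_single_out("DoT"))   # block or inspect port 853
print("DoH:", firewall_can_single_out("DoH"))   # looks like any other web traffic
```

An admin can block port 853 and know exactly what breaks; blocking 443 breaks the entire web, which is precisely the complaint.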

Get in the C: Raspberry Pi 4 can handle a wider range of USB adapters thanks to revised design's silent arrival


Still boggles the mind

When I wrote the article for Hackaday on the RPi 4 issue, I was struck by the fact that apparently nobody along the line, neither at the RPi Foundation nor among any of the testers, had used one of those 'smart' (e-marked) cables, or even validated the resistance values presented on the pins of the USB-C interface. The spec gives clear and obvious examples of these expected values, so omitting an entire resistor, wiring the whole board side up wrong, and then never testing it is rather amazing.
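For reference, a deliberately simplified model of the CC-pin detection involved (labels instead of actual resistance measurements; the real spec logic has more states): a sink should present a separate Rd (5.1 kΩ) pull-down on each CC pin, while an e-marked cable adds Ra on its Vconn line. With the Pi 4's single shared resistor, the cable's Ra dragged both readings down, so chargers classified the board as an audio accessory:

```python
# Simplified sketch of USB-C sink detection as seen from the charger's side.

def classify(cc1, cc2):
    pins = sorted([cc1, cc2])
    if pins == ["Ra", "Rd"]:
        return "sink with e-marked cable: supply power"
    if pins == ["Rd", "open"]:
        return "sink with plain cable: supply power"
    if pins == ["Ra", "Ra"]:
        return "audio adapter accessory: no power"
    return "unknown"

# Correct design: a separate Rd per CC pin
print(classify("Rd", "Ra"))   # e-marked cable negotiates power fine
# Original Pi 4: one shared Rd; the cable's Ra drags both readings down
print(classify("Ra", "Ra"))   # charger sees an audio dongle, not a Pi
```

One missing resistor, and a whole class of chargers quietly refuses to power the board.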

If one of the main features of the Raspberry Pi 4 is that it now does USB-C, except it actually doesn't really, then that is rather embarrassing. I would love to hear a statement from the developers on this, other than the 'none of our volunteer testers used one of these cables' excuse that we did get.

Are we having fund yet, npm? CTO calls for patience after devs complain promised donations platform has stalled


Can you imagine if NPM were to go bankrupt? Maybe some volunteers would have to take over and ask for, oh, donations or so to keep the servers on.

How much is NPM's bucket o' JS cuttings worth to folk out there?

Ofcom measured UK's 5G radiation and found that, no, it won't give you cancer



The reference in the article to Chernobyl seems rather apt. As the recent TV series demonstrated, people are perfectly happy to believe that Chernobyl was a global disaster that killed millions and rendered half of Europe uninhabitable, and that Fukushima Daiichi will destroy all life on Earth (still waiting...), while when you actually get on the ground in those places you'll find that... it's rather boring.

Sure, there are many spots around the Chernobyl reactor #4 you probably don't want to lick, and having a sleep-over right next to the Fukushima Daiichi buildings is not advisable (may raise your cancer risk a tad), but in general this 'radiation will kill all of us' fear was just that. In the case of Fukushima Daiichi, the 2011 Diet report and follow-up studies have shown that the evacuation itself was significantly more harmful than leaving everyone in place could ever have been. The number of health issues, suicides and other adverse effects among the (still displaced) evacuees is simply gruesome.

But hey, radiation, right?

It doesn't matter if it's ionising or non-ionising, or which part of the EM spectrum it falls in (including blue light): it'll all kill you. Somehow. Meanwhile, tobacco plants keep greedily sucking lead-210 out of the soil, passing these tasty alpha emitters (yes, ionising radiation) straight into the lungs of the same grateful smokers who'll be standing in those protests against 5G.

I hope they remembered to have their basements checked for radon gas. Darn uranium in the soil.

Larry Tesler cut and pasted from this mortal coil: That thing you just did? He probably invented it


The AI Effect

Tesler was also known for the so-called 'AI Effect', which he formulated as follows:

"“Intelligence is whatever machines haven't done yet”. Many people define humanity partly by our allegedly unique intelligence. Whatever a machine—or an animal—can do must (those people say) be something other than intelligence."

Very sad to hear that he has passed away.

Chrome 81 beta hooks browser up to Web NFC, augmented-reality features

IT Angle

Re: The web is way too safe! We need more danger!

It seems to be related to this rise of JavaScript-based 'web apps' and their degenerate offspring: Electron 'apps'. Real developers who can program in C or C++ cost money, so instead there's this nice, ready supply of 'JS devs' who are cheap as chips, and if you just provide them with some JS runtime APIs, they'll whip up an 'app' that violates every single native UI & UX convention and look, but by golly, one can save so much money by developing this way.

But what about 'security' and other good software development practices? Didn't need those for websites with a lifespan of 2 months, so don't need them for desktop apps. Same thing, after all.

I once joked about 'NodeOS', as in an entire OS based around JavaScript soon becoming a thing, before someone pointed out to me that NodeOS has been a real thing for a while now: https://node-os.com/

Welcome to the future :)

Oracle staff say Larry Ellison's fundraiser for Trump is against 'company ethics' – Oracle, ethics... what dimension have we fallen into?


Good luck with that

Consider the fact that we're talking about Oracle here, a company that has been known since time immemorial as being so vile, so amoral and greed-driven that in comparison Microsoft and IBM at their worst looked like choir boys. Of course dear ol' Larry would want to smooch up to his ol' boy Trump, because everything Trump does aligns perfectly with what Oracle stands for: unbridled greed, tax cuts for the rich, and as few protections and rights for employees as possible.

I wish these folk luck in their effort to change Larry's mind, but... they work at Oracle, I'm sorry to say.


Re: Did they change the meaning of an acronym again?

And then they still managed to miss out the intersex community.

'That's here. That's home. That's us': It's 30 years since Voyager 1 looked back and squinted at a 'Pale Blue Dot'


Spaceship Earth

A view that's become quite popular is to look at planet Earth as our very own spaceship. Considering this mostly-liquid rock with a thin sliver of atmosphere and life-support system on its crust, making its way through space, it's not such a crazy view. Although dependent on a nearby star (the Sun) to sustain that thin layer of life, everything that makes up the history of Earth, including the few nanoseconds that we have spent on its surface, happened right there.

The Space Age has always been there. We just didn't bother to look up enough, away from Earth, at this massive Universe around us.

"These are the adventures of spaceship Earth. Our mission: to seek out new life and new civilisations. To boldly go where no human has gone before."

Or we can just keep bludgeoning each other over the head during squabbles in the proverbial galley over things nobody else in the Universe gives a toss about :)

Cache me if you can: HDD PC sales collapse in Europe as shoppers say yes siree to SSD


Yeah, no

My laptops over the years have all featured an SSD of some type, currently a Samsung 970 Pro in a tricked-out gaming laptop. Yet I don't really notice a difference compared to my HDD-only PC rigs.

Gaming? It's all about the CPU, PCIe speed, RAM (speed) and GPU specs.

YouTube/Netflix binging? Network speed.

Browsing and doing hardware & software development in a variety of applications? Mostly CPU and please get me some nice, big screens or working on a PCB layout will drive me insane.

Oh, I guess opening a spreadsheet or KiCad file may be a tad faster with an SSD. But I also like to have those little microbreaks in between tasks. I'm not a machine.

Considering that QLC (quad-level cell) NAND Flash has most recently become a big thing for 'affordable' SSD storage, and that this type of Flash has access times which are (ironically) not that far removed from what a zippy bucket of spinning rust can accomplish, I feel that the death of HDDs has been grossly exaggerated.

Since I'm I/O bound due to the HDDs in my main PC rig maybe 0.1% of the time, I somehow don't see the appeal of splurging a few thousand Euros for the privilege of replacing the 11 TB of storage in this rig, not to mention the 24 TB NAS that's doing its thing on the network somewhere, all without noticing any appreciable change in performance.

Looking at the benchmarks out there, what I'm seeing is NAND Flash hitting a brick wall with QLC (and its upcoming 5-level successor), as 128 GB SSDs are still being pawned off as 'reasonable storage sizes' and 1 TB SSDs remain extravagantly expensive. Not to mention QLC's horrid write endurance and other issues: https://www.anandtech.com/show/13078/the-intel-ssd-660p-ssd-review-qlc-nand-arrives

I guess if your target market is 'people who browse Facebook/Twitter & fiddle about with Google Docs', then yes, that Chromebook with eMMC Flash will be just fine. But if you want to actually download more than two new AAA games from your Steam library onto your laptop at the same time? Better start saving for that 2 TB NVMe SSD, or you're in for a lot of pain.

Maybe it's just that I have been reading articles proclaiming the 'imminent death of HDDs' for about a decade now, but somehow I'm not convinced that this time is going to be 'the one'.

What's the German word for stalling technology rollouts over health fears? Cos that plus 5G equals Switzerland


A few decades of experiments should be enough?

Considering that we have been filling our environments, homes and everywhere else with the kind of non-ionising radiation that 5G also uses since (at least) the 1980s, and so far we have not seen a corresponding spike in cancer cases or other diseases, it seems fairly safe to say that these fears are overblown. There have been lots of studies on the effect of non-ionising radiation on, for example, brain tissue when one blathers into a smartphone for hours on end. So far the only possible effect found is that it might perhaps warm up the tissue somewhat, but that could just as well be the warm and toasty smartphone cooking the side of your head.

Honestly, considering that most folk in Switzerland are voluntarily sucking lead-210 and polonium-210 isotopes (the latter an alpha particle emitter, definitely ionising radiation) into their lungs, along with a whole batch of other carcinogenic or simply toxic substances, whenever they smoke a cigarette (and people there love smoking, with not even smoke-free zones at their train stations), one must question a lot of things about these fears.

Did anyone mention yet that since Switzerland is a mountainous region, they likely have a lot of (ionising) radon gas seeping out of the ground as well? Just saying that all of this might just be a tad ironic in the big picture.

NBD: A popular HTTP-fetching npm code library used by 48,000 other modules retires, no more updates coming


Just Node things

If you fetch the dependencies for any Node project that's more than a month or so old, you'll see at least half a dozen 'deprecated' messages for direct or indirect dependencies. I wouldn't expect any project that's older than 11 months to even run any more, even if you can hit upon the right NodeJS runtime version. This is fun when you're doing embedded JavaScript work (yes, it's a thing...) and have to update a two-year-old app.
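About the only partial mitigation I know of is pinning everything to exact versions and committing the lockfile, so that at least the bit-rot is reproducible when you come back to the project. A sketch of what that looks like in package.json (the module names and version numbers here are purely illustrative, not from any real project):

```json
{
  "engines": {
    "node": "10.16.0"
  },
  "dependencies": {
    "some-http-lib": "2.88.0",
    "some-json-tool": "1.3.1"
  }
}
```

Exact versions (no `^` or `~` range prefixes) plus a committed package-lock.json mean that `npm ci` will reinstall the same dependency tree later, deprecation warnings and all, instead of silently pulling in whatever newer releases happen to satisfy a range.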

If there's anything this news shows, it's that Node/NPM isn't a framework, but just whatever scraps of JavaScript one can dig out of the skip parked outside NPM HQ. Otherwise it would have an actual standard API for basic functionality like this.

Google Chrome to block file downloads – from .exe to .txt – over HTTP by default this year. And we're OK with this


Re: Annoying tho

There's still Pale Moon, which is pretty much a slimmed-down version of classic Firefox, with an updated rendering engine, XUL-based extensions and NPAPI plugin support. I have used it as my main browser for years now.

Basilisk is also by the Pale Moon developers, based on a newer Firefox codebase. It supports DRM and acts as a testing ground for new features in Pale Moon. I use this as my secondary browser, for watching Netflix and for things that require a separate Google account.

Waterfox is another Firefox-derived browser that has kept all of the classic features, though I haven't used it myself yet. It appears to be similar in scope to Basilisk, however.

Astroboffins may have raged at Elon's emissions staining the sky, but all those satellites will be more boon than bother


Re: This is where we are now

...and that should obviously have read StarLink, not SpaceLink. Well, close enough, I guess :)


This is where we are now

I think it's fair to say that if we had already wired up the globe with fiber internet, we would have little need for SpaceLink and kin. But we haven't. And clearly the demand exists for (fast) internet that is simply there and is affordable. The benefits of giving the world + dog access to the internet have been documented extensively as well. This part all checks out.

Thus the solution is simple if these thousands of new satellites are an issue: install fast, affordable fiber internet everywhere people live. Have a few satellites maybe act as bridges between population gaps, but otherwise keep it terrestrial.

Ergo, governments slacking off on making internet available and affordable is the root cause why SpaceLink even exists? Intriguing if that's the case.

RIP FTP? File Transfer Protocol switched off by default in Chrome 80


Guess I'm a fossil, then

The rationale for removing features like FTP support is usually 'nobody uses it any more' and 'it's a security risk because it's not encrypted'. That seems to go hand-in-hand with wanting to disable HTTP support and switching everyone to HTTPS.

In the olden days, when Netscape 4.7x was still relevant and the thought of tabbed browsers not yet a spark in anyone's imagination, encrypted comms were something you used when you had sensitive data to hide, like your shopping data and credit card details, or your online banking sessions. Now it seems everything has to be encrypted, including those cat pictures you just downloaded after a friend sent you some links via (encrypted) email. Because imagine if someone read those things.

Your boss would be at your desk in the morning, foaming at the mouth as they slam a stack of incriminating photographic evidence of those cat pictures on your desk.

Let's ignore for a moment the irony that in those olden days everyone told you never to use your real name online or give anyone any personal details. Now we have Facebook, Twitter, et al., and the rush to spill as much of our personal lives as possible on those sites, including every detail about our offspring (who haven't consented, obviously). And Facebook et al. will never suffer an embarrassing data breach or bug that makes all 'private' data 'public'. Obviously.

What good is end-to-end encryption, Mr. Anderson, if both ends are leaky like a sieve?

Guess I'll be over in the ol' Greyboards corner, using my 'legacy' browsers, like Pale Moon, with its quaint 'plugins' and 'FTP' support.

Obviously my jacket is the one with the 'senior citizen' card in the pocket.



Biting the hand that feeds IT © 1998–2020