* Posts by Andrew Hodgkinson

130 publicly visible posts • joined 7 Mar 2008


Tesla Semi, out since December, already facing a recall over brakes

Andrew Hodgkinson

...which is a shame...

...because even in tiny New Zealand, they're looking promising - see for example https://www.1news.co.nz/2023/03/25/new-breed-of-electric-trucks-put-to-work-on-central-interceptor/ or on a smaller scale, https://www.greengorilla.co.nz/ev-trucks/.

Seeing as GPT-3 is great at faking info, you should lean into that, says Microsoft

Andrew Hodgkinson

For those who still don't realise...

...this isn't AI/AGI, it's just ML. A large language model. It understands nothing. It knows no *rules* - just pattern matches, which with a big enough data set can seem convincing. That, combined with the parameter programming designed to make it respond in a very confident and authoritative manner, makes it downright dangerous.

It can't give you a right or wrong answer because it doesn't know what right or wrong means. It only knows that your pattern of input text mathematically correlates highly with other inputs it has seen, which in turn led to other outputs; those are combined to produce a result that mathematically looks like it follows the grammar patterns in its training data, shaped by the response patterns expected for that kind of input.
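As a toy illustration of "pattern matching, not understanding" - nothing like a real LLM in scale, but the same in spirit - here's a bigram text generator in Python. It emits whatever statistically tends to follow the previous word, with no idea what any word means:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    # Count which word follows which: pure co-occurrence statistics, no semantics.
    model = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, start, n=8, seed=0):
    # Emit whatever statistically tends to follow; the model "understands" nothing.
    random.seed(seed)
    out = [start]
    for _ in range(n):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)
```

Scale that idea up by a few hundred billion parameters and you get fluent-looking output, but the mechanism is still correlation, not comprehension.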

Crap analogy time: imagine an image-matching ML system trained on cats. One day it's given something that is, to a human, clearly a plant - but it's been trimmed, and the colours of its leaves and branches set up, to make it look like a cat. The system might weakly register "plant", but it'll strongly register "cat", and it knows no *rule* that a cat can't be a plant, because it *understands* nothing. It cannot apply logic or reason. So it'll say "cat", and be wrong. LLMs are the text equivalent of the same thing. Give one enough data and it might start being able to say "cat *and* plant", or have enough parameters that, despite never having seen something that is both a cat and a plant, it statistically leans that way anyway; and so it gives the illusion of understanding, without any. It doesn't know biology - RNA, DNA, how those work - nothing. No objective foundation in the laws of mathematics; not AGI. Just *fake*; an illusion.
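To make the analogy concrete, here's a minimal Python sketch (the class scores are entirely made up for illustration): a classifier just converts raw scores into probabilities and picks the biggest number. There is no rule anywhere saying "a cat cannot be a plant" - the top score simply wins.

```python
import math

def softmax(logits):
    # Convert raw scores to probabilities; subtract the max for numerical stability.
    m = max(logits.values())
    exps = {k: math.exp(v - m) for k, v in logits.items()}
    total = sum(exps.values())
    return {k: v / total for k, v in exps.items()}

# Hypothetical feature scores for a plant trimmed and coloured to resemble a cat:
scores = {"cat": 4.1, "plant": 2.3, "dog": 0.2}
probabilities = softmax(scores)
prediction = max(probabilities, key=probabilities.get)
# The winner is just the biggest number - no logic, no rules, no understanding.
```

With those scores, `prediction` is "cat", confidently reported, and confidently wrong.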

It's also why image synthesisers like Lensa or Midjourney mangle fingers. They don't know what anything *is*, so they don't know what "rules" fingers have. They don't know there should be four fingers and a thumb, the rules about symmetry, or the ways fingers can bend; they just make something fingers-like that's close enough based on the maths and stats -> job done. And the result is, typically, very broken. Imagine the text equivalent of not knowing what fingers are. Yeah. That's where we are.

All this is why ChatGPT infamously responded with confidence about the bears Russia put into space. Sure, subsequent case-by-case hacks close off the embarrassing, very public mistakes, but that doesn't mean the technology suddenly, magically works in any different a way. OpenAI was originally an Elon Musk venture and is now largely controlled by Microsoft, so with either form of leadership it's going to be (my opinion) ruthlessly immoral and entirely interested in profit, seeking it at any cost - legal if forced to, illegal if it can get away with it - e.g. by misleading advertising in countries where that's not permitted, or embrace-extend-extinguish to remove competition (all, again, just my opinion).

So, the company is IMHO spouting BS about what this can do, the public are largely buying it and other companies are then spending money like it's going out of fashion to incorporate the largely useless and broken technology into their systems. They've either not so much drunk, as drowned in the Kool-Aid, or they're well aware that it's all nonsense, but think there'll be a good ROI because their own *customers* don't know any better and they're very happy to maintain that status quo. The net result is software that's even more bloated, even more buggy and even more all-round unpredictable. Slow - fucking - clap.

Any ML system can be a fun (if legally/morally dubious, due to training set!) way to generate fiction/expressive works, be it text or image, where accuracy or "makes any sense" aren't required. To front a search engine with it, where accuracy is key, is one of the most serious and flagrant breaches of duty of care I've ever witnessed and will *very severely* increase the misinformation problem with which our society is already struggling (and largely failing) to cope.

Tesla reports two more fatal Autopilot accidents to the NHTSA

Andrew Hodgkinson

I still don't understand how they get away with this!

In the UK, as far as I'm aware, advertising cannot be misleading.

If Tesla called their system ADAS, it'd be fine - Advanced Driver Assistance System, yeah, got it. Assists me. Instead, Tesla called it "Autopilot", and the name alone certainly gives the impression of being rather more than just assistance. I'm pretty sure their marketing of the time was trying to give a self-driving impression too, but they've obviously reined it in a lot since.

Lately, however, Tesla have had something which surely crosses the line - they call it Full Self-Driving. That is literally its name. It is not full self-driving at all; it's just ADAS. They warn you that Full Self-Driving is not full self-driving, as if that's somehow supposed to absolve them of any responsibility...


It's bizarre. How can they be so completely misleading with a product named in a way that specifically says it is something it is not and this be allowed? Even in the USA, it seems like a stretch.

Multi-factor auth fatigue is real – and it's why you may be in the headlines next

Andrew Hodgkinson

Why are they sending notifications at all?

The article appears not to mention the most sensible solution - using a third-party MFA app and prompting the user to type in the six-digit code, rather than using any kind of SMS or notification. SMS should be a fallback only for users who insist they can't run an app, and notifications should just not be a thing.

You can't bombard a user with notifications when there aren't any. This whole thing is bizarre - once again, our industry sucks - it never learns anything from past mistakes while simultaneously inventing new ways to fail. This is why I had to stop reading "comp.risks" in the end; the repetition was too depressing.
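For reference, the codes such apps generate (TOTP, per RFC 6238) are derived entirely on the device from a shared secret and the clock - there is no server-initiated message to bombard anyone with. A minimal Python sketch:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    counter = int(time.time() if for_time is None else for_time) // step
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # low nibble of last byte picks the slice
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

The server runs the same calculation and compares; nothing travels over SMS or push, so there's nothing for an attacker to spam the user into approving.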

If Twitter forgets your timeline preference, and you're using Safari, this is why

Andrew Hodgkinson


...lazy, not competent or hobbled-by-crap-company-structure (probably the latter) devs stored user preferences client-side in the browser rather than in the user record on their side.

Why is this a stupid, broken design? Try using more than one web browser, or more than one device.
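The fix is trivial: key preferences by user ID on the server, so Safari on a Mac and Chrome on a phone both read the same record. A minimal sketch (hypothetical class and field names, in-memory only for illustration - a real implementation would back this with the user database):

```python
class PreferenceStore:
    """Server-side preferences keyed by user ID: every browser and
    device sees the same value, and browser storage policies are moot."""

    def __init__(self):
        self._prefs = {}

    def set(self, user_id, key, value):
        self._prefs.setdefault(user_id, {})[key] = value

    def get(self, user_id, key, default=None):
        return self._prefs.get(user_id, {}).get(key, default)
```

Set "latest tweets" once from any device and every other session picks it up, because the preference belongs to the user record, not to one browser's local storage.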

The whole point of cleaning out the web-dev-dumpster-fire of client-side databases is to stop malicious actors - such as advertisers - from storing things on your computer indefinitely without your consent. Thank god at least one browser out there seems to be trying to stop it.

GitHub saved plaintext passwords of npm users in log files, post mortem reveals

Andrew Hodgkinson

Why is anyone surprised?

Microsoft bought GitHub, then Microsoft bought NPM, then Microsoft integrated the two.

We're surprised about elementary and severe security failures why, exactly?

Safari is crippling the mobile market, and we never even noticed

Andrew Hodgkinson

Today in "nonsense", we have...

...this article.

Web apps are slower because the technologies upon which they are based are inherently and irrevocably slower and more resource-intensive than natively coded applications. An ahead-of-time compiled application (actual native code) will always be tighter than cross-platform web technologies, including WASM since - apart from numerous other reasons - the WASM modules are still _driven from_ JavaScript. Even if a JIT is going to produce native code, that all comes at a cost in RAM and, on a resource-constrained device, RAM is *very* precious.

Below the JavaScript, you've got HTML and CSS which were never designed for application-like UIs, so it's a tortured mess of DIV soup and reams of CSS - typically messing around with the hyper-convoluted flexbox, especially if you have the audacity to want something that pre-flex Web found super hard and super advanced, like, y'know, vertical centring. Woah. Advanced stuff, web guys. As for autolayout with springs, struts and the like? Yeah, right. Once again, we're hacking around with bits of CSS that can be coerced into behaving in a similar fashion, given enough time and effort - and device resources to interpret and execute it all.

(The recent example of the performance of LibreOffice ported to WASM was a pretty stark example of how efficient those technologies aren't).

Moreover, there's no access to the native UI toolkit from these applications (no, HTML forms elements are *not* an application user interface toolkit). You need to construct everything from scratch. If you're lucky, you might be able to use a native form button and maybe an input field - but photo pickers, toolbars, popups, map views, tabs, master-detail views, navigation overlays, all of the animated transitions...

Your device's global settings offer a *built-in* native toolkit dark mode? Text size options? Bold-text-everywhere? High contrast mode? Distinguish-without-colour? Button shape settings? On-off labels? Transparency reduction? Motion reduction? Numerous accessibility options for navigation like switch control or audio descriptions that just work out-of-box on native elements? Tough. Reimplement it all again, from scratch, different every time, limited by at-best the comparatively meagre attribute decorations that HTML offers for accessibility *and* only if your devs know to use them (and use them everywhere at, again, great cost in time, testing and maintenance).

Even something as basic as proper scrolling mechanics often has to be coded from scratch, depending on what you're trying to scroll inside your giant tower of DIVs.

The whole debate is asinine. If you want a web page, write a web page. If you want a "web app", fine, you don't want to pay fees beyond your own hosting. Live with the fact that you're either going to produce a sub-par user experience on the lowest of lowest common denominator cross-platform options, or you're going to burn a truly vast amount of money on extra engineering resource to try and reimplement all the things that native code would've given out-of-box on Android or iOS - right up until next year, at least, when a new iOS or Android version changes how things look, or introduces new features that all the already-written native framework apps just 'get', but your web app doesn't.

Want to be free of the "walled garden"? Good news! Android exists, and has a *huge* market share compared to iOS. Deploy off-store. Don't want to be limited by Safari on macOS? Good news! Windows and Linux exist and have an even *more* huge market share compared to macOS. I mean, who cares if you need to tell your users to bugger off and download the latest Chrome or Firefox or whatever, because we all just *loved* it in the 1990s when web sites would tell us that our browsers weren't good enough, right? So knock yourselves out; use all those shiny new APIs that evil Apple isn't giving you.

But what if you wanted that juicy income from those rich iOS folks but don't wanna bother writing a native app because hosting it on the App Store (or Google Play Store, for that matter) means 30%? Well then yeah, it's not about your users, is it? It's about the money. The users have to accept something slower, of unknown security, of unknown privacy and with no control over when updates happen.

If any web app was worth beans then it'd be popular AF on Android, with people clamouring for a version on their Apple device, making Apple look bad until they did something about it. Ever heard of that for a web app? Even once? Nah, me neither.

As for when Google is pushing the latest new web API? Be afraid - or did you think somehow that Google were any less evil, or any less self-interested, than Apple?

Intel puts ‘desktop-caliber’ CPUs in laptops with 12th-gen Core HX

Andrew Hodgkinson

55W idle and 157W peak?!

Even if you're considering the laptop a "portable desktop", that seems pretty absurd; you'll be lucky to get more than a few tens of seconds at peak before everything thermal-throttles, especially if there's a GPU even remotely matching the workloads implied by the CPU's feature set. And as for performance on battery... Shudder.

For comparison, an MBP 16" M1 Max with 64GB RAM (the most power hungry model in the range) idles in macOS at around 7-8W and I've rarely managed to push it past about 70W at peak, which included GPU usage. I think the true peak is more like 110-120W, which is itself pretty high, but this includes all GPU cores running at full tilt as well - the Intel specs are for the CPU alone.

This compares apples (no pun intended) and oranges, perhaps, but the efficiency of the Intel offering here is almost comically bad. I struggle to see the point of a "desktop class" CPU intended for mobile use when it has power specs like that. Maybe you could build a small-form-factor PC in the style of Apple's weird Mac Studio or similar; otherwise, surely it'd be cheaper - and provide much more reliable performance (in every sense of "reliable") - to just use a normal form factor machine with the desktop CPU inside.

Anyone have suggestions about the kind of device where the CPU's peak performance would actually be useful and sustained? TIA!

Review: Huawei's Matebook X Pro laptop is forgetful and forgettable

Andrew Hodgkinson

Re: "you can do better for the $2,000 or so Huawei charges for it"

Dunno why you got downvotes - Apple haters I guess; they're a horrendous company, but then, show me a big corporate that isn't? Google, Microsoft, Dell, Lenovo... Huawei?

At $2,000 I'm kind of astounded at the poor value indicated here. Apple charges a LOT under Tim Cook for just about anything, but for $1,999 US I can get the 14" MBP with 16GB+512GB and the M1 Pro, which will beat this Huawei machine in just about any test, with a battery life that by the sounds of things would be at least double, if not triple, the reviewer's quoted uptime of six hours - coupled with maximum performance available at all times, rather than only when plugged into the mains.

Now, OK, sure, it's heavier and bigger and maybe you want 3:2 etc. etc.; the Mac *will* run Windows 11 ARM unofficially, and very fast too, but yeah, it's macOS native and Linux almost-native, not x86, so we're not comparing like with like. Even so, if size and weight are the issue, the 13" MBP at the same spec with an M1 would almost certainly still outperform the Huawei in just about every metric and cost "only" $1,699, despite Apple's horrific markup for the 8GB->16GB bump over the standard MBP 13" spec.

The PC world is meant to offer better value - but clearly, not in this case. It seems that some PC vendors need to catch up; prices need to drop, stability needs to improve and surely 11th gen or earlier Intel is largely a joke at this point, unless (IMHO) it's only at a 3-figure "budget"-ish price tag.

GitLab issues critical update after hard-coding passwords into accounts

Andrew Hodgkinson

That's correct, but...

...the way it works is that the unknown plain-text password supplied by the user is hashed and compared with the existing one-way hashed database record. If there's a match, we deduce (via confidence in the unique-output-given-unique-input mathematics of our chosen hashing algorithm) that the input plain-text value matches the one the user gave when they first signed up.

The issue here is that the _plain text_ original value for a new account was being hard-coded as a first-time password the user hadn't chosen (doubtless under some "less obvious than it sounds" set of conditions, else it'd have been spotted much sooner). It's stored hashed, but someone can still try to attack such accounts simply by guessing, using the known "default password" as a starting point.
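For the avoidance of doubt, here's a minimal Python sketch of the hash-and-compare flow described above (PBKDF2 chosen purely for illustration; GitLab's actual implementation will differ). Note how a correctly hashed-and-salted store is still trivially attackable if the plain-text starting value is a known constant:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Salt and one-way hash a password; return (salt, digest) for storage."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(candidate, salt, stored_digest):
    """Hash the login attempt the same way and constant-time compare."""
    attempt = hashlib.pbkdf2_hmac("sha256", candidate.encode(), salt, 200_000)
    return hmac.compare_digest(attempt, stored_digest)
```

If the "user-chosen" password was in fact a hard-coded default, an attacker needs no rainbow tables or brute force: one `verify_password(known_default, ...)` guess per account does the job, which is exactly why this class of bug is so serious.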

Web devs rally to challenge Apple App Store browser rules

Andrew Hodgkinson

Oh great, the genius web developer talent has weighed in

Given the comprehensive and almighty clusterfuck that web developers have proven capable of developing over the last few years, with catastrophically bloated frameworks, constant churn to new shiny, ridiculously bad security practices within package management and a truly horrific track record of vulnerabilities in browsers themselves as they continue to clamour for ever-more invasive and deep-rooted hooks into the host OS, I really think they should just shut the f*** up and get their own house in order before having the temerity to tell anyone else what to do.

These clowns took the web, open or not, and pretty much tore it to pieces with a set of absolutely awful technologies and awful implementations balanced on top. RAM and CPU requirements of even basic shopping web sites are now absurd, with broken navigation models and compatibility issues as they insist on ever-more recent browsers; their lazy, half-arsed, have-a-go-hero development approach is the bane of just about every end user on the planet.

That's before we even so much as glance at the list of grievances as far as web-based tracking and advertising go, where at least iOS applications are in theory held to some kind of privacy standard and required privacy declaration.

If it wasn't clear enough: Kindly bugger off.

AI really can't copyright the art it generates – US officials

Andrew Hodgkinson

The tool isn't important; the person wielding the tool matters

Whether I'm "generating" art (FSVO 'art') via BBC BASIC plotting random coloured pixels at random locations, or by picking up a can of paint and throwing it across a canvas, or by carefully drawing and shading things with pencils, or by clicking the "Go" button that's connected to some software-based image generator that may or may not be based on AI, they're all just tools.

The artist, if you can call them that, is the person using the tool to generate the picture. The result is no more copyrighted by the algorithm's author than a pencil sketch is copyrighted by the inventor of that pencil, or an iPad sketch using an Apple Pencil is copyrighted by Apple - even though a tonne of software algorithms are used to interpret the hardware signals generated by the Pencil and simulate a pencil-like result via a set of pixels written into a bitmap plane.

Experimental WebAssembly port of LibreOffice released

Andrew Hodgkinson

> Secondly, and emphatically, WebAssembly is not "grossly inefficient"

I can only assume you're trying to be correct by being as narrow and pedantic as possible in the consideration of 'WebAssembly' by itself.

Every single person that's tried this particular example - which ostensibly consists of probably the highest ever ratio of pure WebAssembly to any other kind of code - has observed that it's glacially slow and uses truly stratospheric amounts of RAM.

You're trying to argue that something isn't grossly inefficient, right next to a great demonstration of that thing being, emphatically, grossly inefficient.

Avira also mines imaginary internet money on customers' PCs

Andrew Hodgkinson

Wait, what?

So through all these harmful acquisitions, Norton now own Norton, Avira, Symantec, Avast *and* AVG AV?

This is why it should never be possible to purchase brands! Sigh.

What's left that's any good?

Two non-Gtk Linux desktops have put out new versions

Andrew Hodgkinson

Re: Variety is the Spice of Life...

All other engineering disciplines are using software tools to make their lives dramatically easier, whilst the software industry keeps messing it up. Why?

I don't know, apart from perhaps blaming the two easiest targets - the general dubious competence levels in newer software development, coupled with the "NIH" syndrome that has always seemed to be present.

From a once-RISC OS, now macOS / iOS developer perspective, Interface Builder's fast and simple operation went backwards with the much slower and buggier storyboard stuff in Xcode, but at least it was still graphical. IMHO Apple's more recent move to SwiftUI seems to have been a big backwards step, at least if the graphical design approach were to be dropped, leaving only the code-writing option - which, bewilderingly, many people seem to want to happen. Notwithstanding the dreadful documentation and bugs, it's just *much* more effort, and producing something that actually looks good is, uuh, challenging to say the least.

I've frequently read a rationale concerning code reviews: you can easily diff UI-written-in-code, but can't easily diff the output of a visual tool. This is a shortcoming of the diff engine, of course, not of the generation tool. In any case, if you've ever done a code review where someone's modified a bunch of XUL, or CSS, or even SwiftUI code, you'll know that for all but the most trivial changes you're very unlikely to have any clear idea of exactly how those changes will manifest visually, or whether they'll produce the desired outcome; nor can you be too sure there aren't reuses somewhere which could mean that an innocent-looking change in a diff actually has wider impacts. There are no silver bullets. A diff tool that understood the output of the visual editor and could respond with an appropriately visual diff, on the other hand - that would be valuable.

There are times and places for UI components expressed in hand-written code or markup languages, of course, but the lack of good tools for things like desktop or mobile app development outside the Apple ecosystem has always surprised me.

$600m in cryptocurrencies swiped from Poly Network

Andrew Hodgkinson


Innocent unless proven guilty, which looks like a small change but is really extremely important (otherwise, you're innocent until we prove you guilty - it's only a matter of time...). There are probably even fewer countries where that holds.

Android 12 beta lands bringing better personalisation, speed upgrades, and some privacy tools borrowed from iOS 14

Andrew Hodgkinson

Yuck - what a colour choice

Interesting choice to lead with Spew Yellow-Green for the hero images, with some off-magenta thrown into other examples of the same screen I've seen online - it's as if they went out of their way to choose the most horrifying colour combinations.

I suppose it's down to these super-l33t "designers" who are all innovative and cutting edge etc. etc. who've now finished strip-mining the 80s for variations on 2D user interfaces and linear graduated fills, and are now heading into the 90s for variants on those good ol' awful off-cyan/magenta colour schemes popular back then. I guess it'll be all rounded edge, bean-like design too - oh, wait - we're already doing that.


Looking forward to getting as far as ripping off design from the 2000s, so that we can get contrast back and easily identify what is or is not a clickable/tappable control.

Lock up your Peloton smart treadmills, watchdog warns families following one death, numerous injuries

Andrew Hodgkinson

I think it's more a design issue

Many treadmills have the rollers and tread on a bed that is pretty close to the ground. It is very hard to get underneath it. The belt is often, though not always, hidden behind a cheap plastic bottom cover plate, mitigating a potential "on deep-pile carpet" issue if used in the home. See e.g.:


...noting the plastic guard along the rear roller/base (to which a carry handle is attached, but that's incidental) and fairing around the roller edges.

The Peloton units look different - the belt is fully exposed on all sides and around the rollers, and they have riser feet that bring the whole assembly some distance off the ground. These feet are *not* positioned at the far corners of the base but inset and, further, there is no kind of guard bar or any other piece of plastic or other super-cheap stuff to guard the end of the roller. It's fully exposed (and has quite a large diameter).


This doubtless looks good in Photoshop and is doubtless cheaper than anything incorporating additional fairing or guards - and I Am Not An Industrial Designer - but I would wager that if Mittens The Cat, Fido The Dog or Mary Your Small Child happens to come running in (someone opens the door and they squeeze in quickly; dog, cat or child can open doors with latch handles, etc. etc.), one's view of the safety of the chosen product might change somewhat as Mittens, Fido or Mary brushes inadvertently against that rear roller.

Some kind of cheap clip-on plastic guard, or even just a tray, that could live under the machine and rise up just enough to guard that rear roller might help, but Peloton argue that it's all fine & nothing is wrong & everyone who's been injured must be holding it wrong.

So how's .NET 6 coming along? Oh wow, Microsoft's multi-platform framework now includes... Windows

Andrew Hodgkinson

Re: Can we all please...

Nope, dead right. The more Photoshop monkeys think they're "UX experts", the worse it seems to get. It seems as if the minute someone calls it "UX" instead of "UI", you know you're in trouble.

Flavour-of-the-month visual design (most of which seems to be poorly implemented, probably because designers are trying, without the requisite skills, to poorly copy something they saw someone else do better in a blog the previous day), a complete and comprehensive lack of coherence or consistency across every section of the interface (which we can probably ascribe to card-dev-myopia from the infection of poor agile application just as much as we can ascribe to "UX" designer problems), and an almost wilful ignorance of just about any feature beyond the most drool-level single-finger, single-click kind of interaction in the system.

The majority of the now-historic wider platform conventions for cross-application cooperation are not there. Keyboard shortcuts, if you're lucky. Multi-select following platform conventions? Forget it. Any kind of power-user feature is stripped away, leaving a bland, tasteless, white-space-wasting canvas of crap that's as pathetically inefficient, clumsy and irritating to use as the awful mobile app UIs from which all design is now derived. Most modern designers don't seem to be able to conceive of, or understand the utility of, a system which provides a richer array of input options than a fat finger. And that'd be fine if engineers merely implemented the designer's "look", but now designers have been given control of the entire feature set, with engineers turned into dumb drones that just do what's on the card and nothing else. The engineers are treated as if they have - and accordingly, increasingly seem to exhibit - no particular skill, no deep expertise, no creativity, no initiative and no interest. The outcome is inevitable.

Inmates running the asylum.

Mac OS X at 20: A rocky start, but it got the fundamentals right for a macOS future

Andrew Hodgkinson

Re: It was all downhill after Snow Leopard

Couldn't agree more. Snow Leopard was Apple's Windows 2000.

My first foray into OS X happened because of Vista, so I suppose I've got Microsoft to thank. Took one look at it and went "hell no", so got a 2011 Intel MBP, 4GB RAM. Fastest laptop I'd ever owned. Still working today, but it did need its discrete GPU re-soldering - early lead-free solder strikes again.

Speaking of "strikes again": suddenly, OS X Lion happened. I "upgraded" and everything ground to a halt - never mind multiple features being removed, like the 4x4 desktop grid in favour of the hopeless horizontal-only strip-of-un-renameable-doom we have to this day, or the horrible linen background on any second monitor thanks to the newly introduced, and thoroughly unreliable from app to app, "full screen" mode (thankfully resolved in 10.8) - the main problem was RAM.

Those days, you could upgrade RAM in Mac laptops. Imagine that! So I went from 4GB to 8GB. It was better, but still sluggish. So I went to the chipset maximum of 16GB and _finally_ Lion ran as well as Snow Leopard.

Four times the RAM for the same perceived performance. Four! And from a user-facing perspective, Lion seemed to have taken away more features than it added.

Similar story ever since - things Apple don't seem to have touched get more and more broken with each release, never to be fixed; occasionally some "big rewrite" happens and the new replacement is far more buggy, far bigger and slower and far less functional than its predecessor. Examples:

• The original iMovie rewrite, though that at least has improved over time

• iPhoto to Photos which hasn't improved over time

• Aperture to, well, what, Photos? Really?!

• iTunes to the incomprehensibly slow, buggy and feature-poor Music app which seems basically just stagnant

• The various incarnations of Messages which thanks to Catalyst have got more and more iOS-like, and less and less Mac-like, especially if you want to do something ground-breaking like selecting more than one conversation at once in its sidebar

So don't, for heaven's sake, ask for the Finder to be rewritten. I can't even begin to imagine what kind of bloated, dysfunctional horror story would arise (well - I guess I could look at the iOS Files app and have a pretty good idea... Shudder!).

Google changes course, proposes proprietary in-app purchase API as web standard

Andrew Hodgkinson

Re: "see the capability gap between native and web apps closed as much as possible"

PWAs don't have any less security than a standard web page run through Chrome.

YES. Exactly the point.

Reading every update note for every app to see what amazing new features it had. Doesn't happen anymore

Because, thanks in large part to agile and in part to lazy change-log writing, "nothing apparently changed" updates on a two- or three-week cycle with update notes saying something lame like "bug fixes and performance improvements" are the norm. That aspect of lazy development is only tangentially related, at best, to PWA vs native. People don't read them any more because app updates are very annoyingly frequent and very rarely have anything interesting to say in the notes.

As for PWA vs Native it's often a different use case and a decent PWA doesn't have to be 'janky and slow'.

Compared to native, sorry, yes it does; the basic domain of HTML, CSS and JavaScript is simply a very poor and very inefficient toolset for constructing the kind of dynamic user interfaces we associate with apps, for which native APIs like Cocoa Touch are dramatically better optimised and designed.

People don't search for apps randomly, it is based on a real user need and most people would prefer not to download a full blown native app just to get information when staying at a hotel, or pay for their pizza etc.

So your argument here appears to be that a PWA is used when people don't want a "full-blown app". This is surely a tacit admission of the limitations of the web-based environment compared to native. Basically, we're saying we just need web sites optimised for mobile a lot of the time. Agree! We don't really need or want to install things. Agree! So, um, wait. What's a PWA for again? It's not to replace native apps, and yet it's a kind of installable thing...

...a glorified home-screen bookmark with an over-complicated caching model that would've been better served just by the built-in web browser providing a first-class ability to pin things to the home screen? Oh, wait, we have the latter. So, uuuh... all we've really got left is the ability to work offline. And of course, given that we recognise native is best unless you really just want to access what's on a web page, and the point of a web page is to be online, we're left with what? Crap games written in JavaScript that'll just about work offline because you jumped through the hoops needed to declare the resource collection in a manner that allows it to be accessed as a PWA?

If it's worth an app, treat your users with respect and make a decent native app that conforms to all your target operating system's best-practice recommendations, layouts, integrations with the rest of the system, accessibility features, and so-on. Have it lean on native frameworks as much as it possibly can to keep the app as small as possible and allow it to conform, often with little or no effort, with things like dark modes or system-wide contrast variations or animation styles or whatever.

If it's not worth an app, just make sure your inherently online web site is a best-possible online web experience on mobile and stop wasting your development money on PWA bundling.

Andrew Hodgkinson

Re: "see the capability gap between native and web apps closed as much as possible"

Amen. As for PWAs, their slow, janky, non-native, non-integrated "performance" is of benefit only to the lazy developer who cares not for battery life (or multitasking, given that means sharing RAM) of the end user devices.

Native development is and will always be very much faster, very much more integrated and very much more respectful of local device idioms than a PWA. And yeah, in particular, given the history of web security - having a clueless JavaScript monkey hack up a way to siphon money off my credit card via some kind of JavaScript in app purchase API fills me with horror. We _all_ know exactly where that's going to end up.

Flaw hunter bags $75,000 off Apple after duping Safari into spying through iPhone, Mac cameras without permission

Andrew Hodgkinson

Re: Use our code its always better

From the article:

> He found flaws in rarely used specifications that browsers nevertheless have to implement in order to be compliant with other code, but which do not get the same level of attention as commonly used parts of the browser API.

Pretty sure there'll be variants of this found, if you tried hard enough, in Chrome, Chromium, Opera, Firefox, Edge, MSIE...

Yes, Apple's software quality is increasingly terrible, but rancid specs are the bane of the web world. There's a litany of errors, and the specs get more absurd, more edge-case-riddled and more ginormous with each version; the progression is a bad joke. Look at HTML 4 or XHTML 1.1 versus HTML 5, for example, or even CSS 1 vs CSS 2.

It's a nightmare of a job to implement this stuff. I know; I've done it, many years ago now, when HTML 4 was new. I'm glad I'm not trying to do it in an HTML 5, CSS 3 world - especially not with modern JavaScript / ECMAScript, its bazillion flavours, and its ever-growing list of ever-more invasive interfaces into the host operating system, as lazy programming (and a deficit of half-decent alternatives) keeps engineers hell-bent on some kind of 'write once, run everywhere' model that competes with native applications tailored for the host operating system. Lowest common denominator is the _best_ outcome you can hope for. As for security? Forget it.

ZTE Nubia Z20: It's £499. It's a great phone. Buy it. Or don't. We don't care

Andrew Hodgkinson

Photos are "excellent"? In what universe?

Seriously, that autumn park photo was just absolutely *horrible* - weird, inconsistent smearing and edge artefacts, as if some screwed-up noise reduction system had turned it into an impressionist painting. The trees themselves (particularly the lower parts, in shadow) were bad, but the grass under them and in the distance was worse. Even the in-page image looked fuzzy and strange; zoomed in, it looked outright faulty. Hideous.

For £499, I'd be taking that straight back to the shop as unfit for purpose - based on your in-depth review of two photos (sigh) it's one of the worst performing cameras I've seen in a 2019 smartphone at that price point. Surely *half a grand* for the base level model means one should have some kind of standards here?! This is not a cheap device!

Not very bright: Apple geniuses spend two weeks, $10,000 of repairs on a MacBook Pro fault caused by one dumb bug

Andrew Hodgkinson

Re: Genius, more like idiot.

If you read TFA - he has a T2 chip in the laptop and a password set. It turns out that PRAM reset doesn't work if you have that. People tried PRAM resets; they didn't make the backlight come on.

There are very many levels of fail in the whole sorry story, but it boils down to modern Apple - each major hardware iteration gets worse and more expensive; each major software iteration gets more buggy, gaining new bugs that are never fixed; the once-industry-leading documentation is getting sparse to non-existent.

if developer_docs == bad then app_quality = bad; Coders slam Apple for subpar API manuals

Andrew Hodgkinson

It's just part of modern software engineering; with each passing year, code gets more bloated, less reliable and more poorly documented. Upcoming engineers either just can't be arsed documenting anything or literally don't know any better and are somehow unable to figure out for themselves how vital good documentation is. Rigour is dead; RIP.

As with most recent Apple problems, the most serious rot seemed to start quite suddenly, around the iOS 7 / OS X 10.7 era.

Turn on, tune in, drop out: Apple's whizz-bang T2 security chips hit a bum note for Mac audio

Andrew Hodgkinson

Re: Possible solution...

According to news reports, yes, using a USB 3 audio interface, a Thunderbolt audio interface, or a USB 2 interface hanging off a USB 3 or thunderbolt hub should all work. Only USB 2 devices on the internal bus appear to be problematic.

Still a damned stupid bug!

Oracle demands dev tear down iOS app that has 'JavaScript' in its name

Andrew Hodgkinson

Well this all raises a pretty good point

So in the interest of forgetting any and all associations with Oracle, why don't we just call it ECMAScript, then? Even if that's clumsy? Most colloquial references seem to talk about it by framework anyway - "This site uses Angular", "The service is written in Node".

The app just changes its description to "HTML5, CSS, ECMAScript, HTML, Snippet Editor" and carries on being a hobby project with the accidental reference to the pointless bloatware dead-man-walking that is Oracle left consigned to the dustbin of history.

If Oracle want their trademark, let them have it. And thus, let it die.

Fake-news-monetizing machine Facebook lectures hacks on how not to write fake news that made it millions

Andrew Hodgkinson

Facebook implemented a set of guidelines - and you won't believe what happened next!

Number 7 blew my mind!

<-- [Prev] I ate 30 tomatoes a day AND LOST 15KG IN A WEEK <--

--> [Next] 23 Reasons You Should Only Buy Blue Donuts -->

Have MAC, will hack: iThings have trivial-to-exploit Wi-Fi bug

Andrew Hodgkinson

Re: Now I'm Confused

But that's just it. I've bashed Apple's declining s/w quality relentlessly since the truly horrible days of the introduction of OS X Lion and iOS 7, but with High Sierra and iOS 11, the media is genuinely struggling to find anything wrong. Let's review the list:

* 32-bit apps stop working. Um, hardly a bug; that's been advertised since 2014, with non-64-bit app submissions to the store rejected for at least two years. iOS 10 started warning users about it, in increasingly strong terms. If you rely on a 32-bit app then yeah, it's crappy and you can't upgrade, but it's still a 3-year-old well advertised deprecation and means you are using an app that can't have had a single update or security patch in at least two full years.

* You can't turn off WiFi and Bluetooth! Panic! Uuuh, except you can, in Settings. Questionable UI for anyone but novices in Control Centre for sure, but the rationale is well explained in the Apple knowledge base article - it seems journos can't be arsed even reading *that* much these days though.

* An actual bug! The Exchange connection issue. I didn't experience it, but enough did that 11.0.1 is already out and fixes it. So, that's gone.

* Another bug perhaps? Some people report slow application launch times. I've not noticed it being slower, but then I've been on the beta a while before the final release and perhaps I got used to it. There could be a genuine issue here. The "double launch" UI animation bug is still present, so clearly something is amiss. This one seems legit.

* Worse battery life! Yeah, as with every update. Every single one. Spotlight reindexing and usage profiling data restarts each time. In 1-2 days, it'll settle as it always does. Doubtless a few people out of the many millions who can upgrade will have bad patches that don't function properly and need to restore, which sucks, but is that a reason to have screamed "do not upgrade" a day or two ago? No.

Aaaaand that's it so far. That's all. To me, that's basically mind blowing. I've never seen an OS release from anyone with so few headline bugs at release, even before iOS 11. There are little UI glitches all over the place, but nothing breaking the device functionality. Quite something, especially given the magnitude of changes on the iPad, which pretty much all seem to work properly.

High Sierra is a similar story and, very rarely for modern Mac OS, actually runs faster than 10.12 on some older hardware, allegedly thanks to Metal 2 and (for all-Flash storage devices) APFS. It certainly sped up my 2011 MBP. Again, very few significant bugs are evident, despite an entirely new filesystem; amazing. Yes, it's still a pale shadow of 10.6 thanks to ongoing absurd RAM requirements and such, but even the RAM problems are much reined in compared with 10.12. Perhaps being stuck on 16GB max in laptops thanks to Intel limitations has been a motivator!

So I can say what I like about the intermediate years, but they seem to have genuinely knocked it out of the park on this one.

Attention adults working in the real world: Do not upgrade to iOS 11 if you use Outlook, Exchange

Andrew Hodgkinson

Full of nonsense as ever

The continued race to the bottom in El Reg persists, I see.

You of course can completely turn off Bluetooth / WiFi, in the Settings app. Control Centre does the rules-based "sleepy services" thing, which only supports the Apple proximity device detection features for stuff like Continuity or Airdrop. It'll probably solve more support queries than it generates, even if I personally don't think it's a good idea. And yes, airplane mode works normally.

But hey, thanks for warning us of the perils of a 32-bit deprecation that's been advertised since 2014, a bug affecting some users (myself not included) with Exchange, and a Control Centre oddity that most people won't even notice. For sure, iOS 11 truly is a catastrophe.

(Sent from the department of "why do people come out with this rubbish?")

Roses are red, you're feeling blue, 'cos no one wants to watch VR telly with you

Andrew Hodgkinson

Advert sent straight from the 1970s...

...or am I supposed to just ignore the bit around 10 seconds in where there's a close-up of a woman's arse and a fast cut to a smug, pervy guy grinning in appreciation?

I suppose it's close to honest advertising, given that the most likely way this will catch on - if at all - is for porn; perhaps since kids wearing headsets can't even *see* their parents walking in on them in the bedroom, they just won't care. Out of sight out of mind.

Stop us if you've heard this one before: Seamen spread over California

Andrew Hodgkinson

Re: Swarms of weaponized suicide drones

Don't worry, all you need to do is transmit some loud rock music over FM and the swarm will explode.

(Such a shame, that film had started quite well...)

Crocodile well-done-dee: Downed Down Under chap roasted by exploding iPhone

Andrew Hodgkinson

A few have

A few Tesla vehicles have already caught fire. Stories that pop up in Google include one that burned after an accident, one that burned in a garage while the owner claimed it was *not* plugged in, and one that burned in Norway while connected to a Supercharger.

The fork? Node.js: Code showdown re-opens Open Source wounds

Andrew Hodgkinson

Re: Without open source there would be no leftpad

Except you've completely failed to answer my vertical centre challenge and the markup you require is a complete clusterfuck of mangled HTML and CSS hackery that is the total opposite of separating content from presentation logic.

And frankly, given that CSS even contains generated content directives these days, that idea sailed away a long, long time ago.

Andrew Hodgkinson

Re: Without open source there would be no leftpad

Sure, so how about you vertically centre that line of text next to that image which is on the right, and doesn't wrap around when the viewport gets small?

Have fun with your cross-browser CSS. I've already finished with tables.

Old isn't always bad.

Sloppy security in IoT putting 'life and limb' at risk, guru warns

Andrew Hodgkinson

Re: The Greatest Fear is Fear Itself

Amen. OP's analogy is ludicrous. The river can't be commanded to flood your home by a botnet operated by script kiddies 10,000km away, but the IoT boiler [1] can sure as hell be commanded to close its inlet valves, boil itself dry and burn your house down remotely. Good luck trying to get the insurer to pay out when they claim that it "clearly must have been your fault", because computers are perfect and the manufacturer claims the IoT boiler is secure.

[1] Y'know, so you can warm up the water from work before you get home, because, like, everyone totally needs that and a dirt cheap mechanical timer just wouldn't cut the mustard.

Microsoft kills Sunrise

Andrew Hodgkinson

Re: Any one surprised?

The phrase you are looking for in relation to Microsoft is: "embrace, extend, extinguish". On this occasion, though, we might have skipped "extend" entirely.

Debian farewells Pentium

Andrew Hodgkinson

Re: Ouch!

Amen. "Farewelling"? What the actual fuck, El Reg.

Picking apart the circuits in the ARM1 – the ancestor of your smartphone's brain

Andrew Hodgkinson

Open source these days

> For some reason the software on the Archimedes seemed to be far more advanced than that for Win/x86 machines

It is open these days - I'm involved with RISC OS Open Limited, which manages it. Emulators available for those who want the nostalgia and it'll run on things like the Raspberry Pi for those that want something a bit more practical :-)


US government's $6bn super firewall doesn't even monitor web traffic

Andrew Hodgkinson

Re: At least 90% of the Register's readers

Diodelogic wrote:

> The only surprising information is that the firewall caught as much as 29% of the intrusions. I'd have guessed somewhere in the 6-9% range.

It didn't. It caught 29 of them. 29, not 29%. Which was indeed, as both your guess and the article say, around 6%.

Huffing and puffing Intel needs new diet of chips if it's to stay in shape

Andrew Hodgkinson

The unbridled greed of the MBA's New Normal

A 1-2% drop over the entire year, with 55 *billion* dollars of revenue and earnings per share over two dollars regardless.


Yeah. OK. Whatever.

The new Huawei is the world's fastest phone

Andrew Hodgkinson

I'll get ignored and/or downvoted, but - "world's fastest"?!

Not that benchmarks really matter all that much unless you're an engineer, but your claim in this article is absurd, as one or two other commenters have pointed out.

Let's look at that Anandtech link which is the only thing you provide as a possible basis for the claim. The differences in results aren't small here - they're quite big jumps:

Kraken - iPhone is fastest

Octane - iPhone is fastest

WebXPRT - iPhone and Note are fastest

OS II System - iPhone is fastest

OS II Memory - Huawei is fastest

OS II Graphics - Almost everything is faster, Huawei tanks

OS II Web - iPhone is fastest

OS II Overall - iPhone is fastest

PC Mark - only tests Android and Huawei wins

I have no special love for today's Apple, their software quality is horrific, but on every benchmark except Android-only or *one* result for OS II, iPhone beats it.

"World's fastest phone"? What are you smoking?

Huawei announces tiny 10 KB IoT kernel

Andrew Hodgkinson

Re: 10KB for the OS?

- "Swtmr" objects (whatever they are)

Software Timer, surely.

You've come a long way, Inkscape: Open-source Illustrator sneaks up

Andrew Hodgkinson

If you're on OS X...

...then Affinity show you just what a powerhouse closed source can be (I'm going to get *so* many downvotes for this) and just how good software really can be when it carefully targets, and integrates deeply with, a specific operating system.


As for "I haven't found anything in Photoshop that Gimp can't do", try > 8bpc images. Unless you're on the 2.9 beta, Gimp *still* can't do deep colour after years and years of waiting.