* Posts by Andrew Hodgkinson

121 posts • joined 7 Mar 2008


Intel puts ‘desktop-caliber’ CPUs in laptops with 12th-gen Core HX

Andrew Hodgkinson

55W idle and 157W peak?!

Even if you're considering the laptop a "portable desktop", that seems pretty absurd; you'll be lucky to get more than a few tens of seconds at peak before everything thermal-throttles, especially if there's a GPU even remotely matching the workloads implied by the CPU's feature set. And as for performance on battery... Shudder.

For comparison, an MBP 16" M1 Max with 64GB RAM (the most power hungry model in the range) idles in macOS at around 7-8W and I've rarely managed to push it past about 70W at peak, which included GPU usage. I think the true peak is more like 110-120W, which is itself pretty high, but this includes all GPU cores running at full tilt as well - the Intel specs are for the CPU alone.

This compares apples (no pun intended) and oranges, perhaps, but the efficiency of the Intel offering here is almost comically bad. I struggle to see the point of a "desktop class" CPU intended for mobile use when it's got power use specs like that. Maybe you could build a small form factor PC in the style of Apple's weird Mac Studio or similar; otherwise, surely it'd be cheaper & provide much more reliable performance (in every sense of "reliable") to just use a normal form factor machine with the desktop CPU inside.

Anyone have suggestions about the kind of device where the CPU's peak performance would actually be useful and sustained? TIA!

Review: Huawei's Matebook X Pro laptop is forgetful and forgettable

Andrew Hodgkinson

Re: "you can do better for the $2,000 or so Huawei charges for it"

Dunno why you got downvotes - Apple haters I guess; they're a horrendous company, but then, show me a big corporate that isn't? Google, Microsoft, Dell, Lenovo... Huawei?

At $2000 I'm kind of astounded at the poor value indicated here. Apple charges a LOT under Tim Cook for just about anything, but for $1999 US I can get the 14" MBP with 16GB+512GB and the M1 Pro that'll beat this Huawei machine in just about any test, with a battery life that by the sounds of things would be at least double, if not triple, the reviewer's quoted uptime of 6 hours, coupled with maximum performance available at all times rather than only when plugged into the mains.

Now, OK, sure, it's heavier and bigger and maybe you want 3:2 etc. etc.; the Mac *will* run Windows 11 ARM unofficially, and very fast too, but yeah, it's macOS native and Linux almost-native, not x86, so we're not comparing like with like. Even so, if the size and weight are the issue, the 13" MBP at the same spec with M1 would almost certainly still outperform the Huawei in just about every metric and cost "only" $1699, despite Apple's horrific markup for the 8->16GB bump from the standard MBP 13" spec.

The PC world is meant to offer better value - but clearly, not in this case. It seems that some PC vendors need to catch up; prices need to drop, stability needs to improve and surely 11th gen or earlier Intel is largely a joke at this point, unless (IMHO) it's only at a 3-figure "budget"-ish price tag.

GitLab issues critical update after hard-coding passwords into accounts

Andrew Hodgkinson

That's correct, but...

...the way it works is that the unknown plain text password supplied by the user is hashed and compared with the existing one-way hashed database record. If there's a match, we deduce (via confidence in the unique-output-given-unique-input mathematics of our chosen hashing algorithm) that the input plain text value matches that which was given by the user when they first signed up.

The issue here is that the _plain text_ original value for a new account is being hard-coded as a first-time password that the user hasn't chosen (doubtless under some "less obvious than it sounds" set of conditions, else it'd have been spotted much sooner). It's stored hashed, but someone can still try to hack such accounts simply by trying the known default password.
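To make the flow described above concrete, here's a minimal sketch in Python. The function names and the PBKDF2 parameters are my own illustration, not GitLab's actual code; the point is only that one-way hashing protects an unknown password, not a predictable one:

```python
import hashlib
import os
import secrets

def hash_password(plain: str, salt: bytes) -> bytes:
    # One-way hash: the database stores only this digest, never the plain text.
    return hashlib.pbkdf2_hmac("sha256", plain.encode(), salt, 100_000)

def verify(attempt: str, salt: bytes, stored_hash: bytes) -> bool:
    # Hash the supplied attempt and compare digests in constant time.
    return secrets.compare_digest(hash_password(attempt, salt), stored_hash)

# Normal flow: the user chooses a secret at sign-up.
salt = os.urandom(16)
stored = hash_password("user-chosen secret", salt)
assert verify("user-chosen secret", salt, stored)
assert not verify("wrong guess", salt, stored)

# The bug: a hard-coded value is used as the first-time password instead.
stored = hash_password("hard-coded-default", salt)
# Hashing doesn't help here; anyone who knows the default just tries it.
assert verify("hard-coded-default", salt, stored)
```

The hashing machinery behaves identically in both cases; the weakness lies entirely in the attacker-predictable input.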

Web devs rally to challenge Apple App Store browser rules

Andrew Hodgkinson

Oh great, the genius web developer talent has weighed in

Given the comprehensive and almighty clusterfuck that web developers have proven capable of developing over the last few years, with catastrophically bloated frameworks, constant churn to new shiny, ridiculously bad security practices within package management and a truly horrific track record of vulnerabilities in browsers themselves as they continue to clamour for ever-more invasive and deep-rooted hooks into the host OS, I really think they should just shut the f*** up and get their own house in order before having the temerity to tell anyone else what to do.

These clowns took the web, open or not, and pretty much tore it to pieces with a set of absolutely awful technologies and awful implementations balanced on top. RAM and CPU requirements of even basic shopping web sites are now absurd, with broken navigation models and compatibility issues as they insist on ever-more recent browsers; their lazy, half-arsed, have-a-go-hero development approach is the bane of just about every end user on the planet.

That's before we even so much as glance at the list of grievances as far as web-based tracking and advertising go, where at least iOS applications are in theory held to some kind of privacy standard and required privacy declaration.

If it wasn't clear enough: Kindly bugger off.

AI really can't copyright the art it generates – US officials

Andrew Hodgkinson

The tool isn't important; the person wielding the tool matters

Whether I'm "generating" art (FSVO 'art') via BBC BASIC plotting random coloured pixels at random locations, or by picking up a can of paint and throwing it across a canvas, or by carefully drawing and shading things with pencils, or by clicking the "Go" button that's connected to some software-based image generator that may or may not be based on AI, they're all just tools.

The artist, if you can call them that, is the person using the tool to generate the picture - it's no more copyrighted by the algorithm's author than a pencil sketch is copyrighted by the inventor of that pencil, or an iPad sketch using an Apple Pencil is copyrighted by Apple, even though a tonne of software algorithms are used to interpret the hardware signals generated by the pencil and use them to simulate a pencil-like result via a set of pixels written into a bitmap plane.

Experimental WebAssembly port of LibreOffice released

Andrew Hodgkinson

> Secondly, and emphatically, WebAssembly is not "grossly inefficient"

I can only assume you're trying to be correct by being as narrow and pedantic as possible in the consideration of 'WebAssembly' by itself.

Every single person that's tried this particular example - which ostensibly consists of probably the highest ever ratio of pure WebAssembly to any other kind of code - has observed that it's glacially slow and uses truly stratospheric amounts of RAM.

You're trying to argue that something isn't grossly inefficient, right next to a great demonstration of that thing being, emphatically, grossly inefficient.

Avira also mines imaginary internet money on customers' PCs

Andrew Hodgkinson

Wait, what?

So through all these harmful acquisitions, Norton now own Norton, Avira, Symantec, Avast *and* AVG AV?

This is why it should never be possible to purchase brands! Sigh.

What's left that's any good?

Felt Qt (might delete later)*: Two non-Gtk Linux desktops have put out new versions

Andrew Hodgkinson

Re: Variety is the Spice of Life...

> All other engineering disciplines are using software tools to make their lives dramatically easier, whilst the software industry keeps messing it up. Why?

I don't know, apart from perhaps blaming the two easiest targets - the general dubious competence levels in newer software development, coupled with the "NIH" syndrome that has always seemed to be present.

From a once-RISC OS, now macOS / iOS developer perspective, Interface Builder's fast and simple operation went backwards with the much slower and buggier Storyboard stuff in Xcode, but at least it was still graphical. IMHO Apple's more recent move to SwiftUI seems to have been a big backwards step, at least if the graphical design approach were to be dropped, leaving only the code-writing option - which, bewilderingly, many people seem to want to happen. Notwithstanding the dreadful documentation and bugs, it's just *much* more effort, and producing something that actually looks good is, uuh, challenging to say the least.

I've frequently read a rationale concerning code reviews: you can easily diff UI-written-in-code, but can't easily diff the output of a visual tool. This is a shortcoming of the diff engine, of course, not of the generation tool. In any case, if you've ever done a code review where someone's modified a bunch of XUL, or CSS, or even SwiftUI code, you'll know that for all but the most trivial changes you're very unlikely to have any clear idea of exactly how those changes will manifest visually, or whether they'll produce the desired outcome; nor can you be sure there are no reuses somewhere which could mean that an innocent-looking change in a diff actually has wider impacts. There are no silver bullets. A diff tool that understood the output of the visual editor system and could respond with an appropriately visual diff, on the other hand - that would be valuable.

There are times and places for UI components expressed in hand-written code or markup languages, of course, but the lack of good tools for things like desktop or mobile app development outside of the Apple ecosystem has always surprised me.

$600m in cryptocurrencies swiped from Poly Network

Andrew Hodgkinson


Innocent unless proven guilty, which looks like a small change but is really extremely important (otherwise, you're innocent until we prove you guilty - it's only a matter of time...). There are probably even fewer countries where that holds.

Android 12 beta lands bringing better personalisation, speed upgrades, and some privacy tools borrowed from iOS 14

Andrew Hodgkinson

Yuck - what a colour choice

Interesting choice to lead with Spew Yellow-Green for the hero images, with some off-magenta thrown into some other examples I've seen online on the same screen - it's as if they went out of their way to choose the most horrifying colour combinations.

I suppose it's down to these super-l33t "designers" who are all innovative and cutting edge etc. etc. who've now finished strip-mining the 80s for variations on 2D user interfaces and linear graduated fills, and are now heading into the 90s for variants on those good ol' awful off-cyan/magenta colour schemes popular back then. I guess it'll be all rounded edge, bean-like design too - oh, wait - we're already doing that.


Looking forward to getting as far as ripping off design from the 2000s, so that we can get contrast back and easily identify what is or is not a clickable/tappable control.

Lock up your Peloton smart treadmills, watchdog warns families following one death, numerous injuries

Andrew Hodgkinson

I think it's more a design issue

Many treadmills have the rollers and tread on a bed that is pretty close to the ground, so it is very hard to get underneath it. The belt is often, though not always, hidden behind a cheap plastic bottom cover plate, mitigating a potential "on deep pile carpet" issue if used in the home. See e.g.:


...noting the plastic guard along the rear roller/base (to which a carry handle is attached, but that's incidental) and fairing around the roller edges.

The Peloton units look different - the belt is fully exposed on all sides and around the rollers, and they have riser feet that bring the whole assembly some distance off the ground. These feet are *not* positioned at the far corners of the base but inset and, further, there is no kind of guard bar or any other piece of plastic or other super-cheap stuff to guard the end of the roller. It's fully exposed (and has quite a large diameter).


This doubtless looks good in Photoshop and is doubtless cheaper than anything incorporating additional fairing or guards - and I Am Not An Industrial Designer - but I would wager that if Mittens The Cat, Fido The Dog or Mary Your Small Child happen to come running in (someone opens the door and they squeeze in quickly, dog, cat or child can open doors with latch handles, etc. etc.), one's view of the safety of the chosen product might change somewhat as Mittens, Fido or Mary brush inadvertently against that rear roller.

Some kind of cheap clip-on plastic guard, or even just a tray, that could live under the machine and rise up just enough to guard that rear roller might help, but Peloton argues that it's all fine & nothing is wrong & everyone who's been injured must be holding it wrong.

So how's .NET 6 coming along? Oh wow, Microsoft's multi-platform framework now includes... Windows

Andrew Hodgkinson

Re: Can we all please...

Nope, dead right. The more Photoshop monkeys think they're "UX Experts", the worse it seems to get. It seems as if the minute someone calls it "UX" instead of "UI", you know you're in trouble.

Flavour-of-the-month visual design (most of which seems to be poorly implemented, probably because designers are trying, without the requisite skills, to poorly copy something they saw someone else do better in a blog the previous day), a complete and comprehensive lack of coherence or consistency across every section of the interface (which we can probably ascribe to card-dev-myopia from the infection of poor agile application just as much as we can ascribe to "UX" designer problems), and an almost wilful ignorance of just about any feature beyond the most drool-level single-finger, single-click kind of interaction in the system.

The majority of the now-historic wider platform conventions for cross-application cooperation are not there. Keyboard shortcuts if you're lucky. Multi-select following platform conventions? Forget it. Any kind of power user feature is stripped away, leaving a bland, tasteless, white-space-wasting canvas of crap that's as pathetically inefficient, clumsy and irritating to use as the awful mobile app UIs from which all design is now derived. Most modern designers don't seem to be able to conceive of, or understand the utility of, a system which provides a richer array of input options than a fat finger. And that'd be fine if engineers merely implemented the designer's "look", but designers have now been given control of the entire feature set, with engineers turned into dumb drones that just do what's on the card and nothing else. The engineers are treated as if they have - and accordingly, increasingly seem to exhibit - no particular skill, no deep expertise, no creativity, no initiative and no interest. The outcome is inevitable.

Inmates running the asylum.

Mac OS X at 20: A rocky start, but it got the fundamentals right for a macOS future

Andrew Hodgkinson

Re: It was all downhill after Snow Leopard

Couldn't agree more. Snow Leopard was Apple's Windows 2000.

My first foray into OS X happened because of Vista, so I suppose I've got Microsoft to thank. Took one look at it and went "hell no", so got a 2011 Intel MBP, 4GB RAM. Fastest laptop I'd ever owned. Still working today, but it did need its discrete GPU re-soldering - early lead-free solder strikes again.

Speaking of "strikes again", suddenly, OS X Lion happened. I "upgraded" and everything ground to a halt - never mind multiple features being removed, like the 4x4 desktop grid in favour of the hopeless horizontal-only strip-of-un-renameable-doom we have to this day, or the horrible linen background on any 2nd monitor thanks to the newly introduced, and thoroughly unreliable from app to app, "full screen" mode, thankfully resolved in 10.8 - but the main problem was RAM.

Those days, you could upgrade RAM in Mac laptops. Imagine that! So I went from 4GB to 8GB. It was better, but still sluggish. So I went to the chipset maximum of 16GB and _finally_ Lion ran as well as Snow Leopard.

Four times less efficient. Four! And from a user-facing perspective, it seemed to have taken more features away than it added.

Similar story ever since - things Apple don't seem to have touched get more and more broken with each release, never to be fixed; occasionally some "big rewrite" happens and the new replacement is far more buggy, far bigger and slower and far less functional than its predecessor. Examples:

• Original iMovie rewrite, though that at least has improved over time

• iPhoto to Photos which hasn't improved over time

• Aperture to, well, what, Photos? Really?!

• iTunes to the incomprehensibly slow, buggy and feature-poor Music app which seems basically just stagnant

• The various incarnations of Messages which thanks to Catalyst have got more and more iOS-like, and less and less Mac-like, especially if you want to do something ground-breaking like selecting more than one conversation at once in its sidebar

So don't, for heaven's sake, ask for the Finder to be rewritten. I can't even begin to imagine what kind of bloated, dysfunctional horror story would arise (well - I guess I could look at the iOS Files app and have a pretty good idea... Shudder!).

Google changes course, proposes proprietary in-app purchase API as web standard

Andrew Hodgkinson

Re: "see the capability gap between native and web apps closed as much as possible"

> PWAs don't have any less security than a standard web page run through Chrome.

YES. Exactly the point.

> Reading every update note for every app to see what amazing new features it had. Doesn't happen anymore

Because, thanks in large part to agile and in part to lazy change log writing, "nothing apparently changed" updates on a 2 or 3 week cycle with update notes saying something lame like "bug fixes and performance improvements" are the norm. That aspect of lazy development is only tangentially related at best to PWA vs native. People don't read them anymore because app updates are very annoyingly frequent and very rarely have anything interesting to say in the notes.

> As for PWA vs Native it's often a different use case and a decent PWA doesn't have to be 'janky and slow'.

Compared to native, sorry, yes it does; the basic domain of HTML, CSS and JavaScript is simply a very poor and very inefficient toolset for constructing the kind of dynamic user interfaces we associate with apps, for which native APIs like Cocoa Touch are dramatically better optimised and designed.

> People don't search for apps randomly, it is based on a real user need and most people would prefer not to download a full blown native app just to get information when staying at a hotel, or pay for their pizza etc.

So your argument here appears to be that a PWA is used when people don't want a "full blown app". This is surely a tacit admission of the limitations of the web based environment compared to native. Basically we're saying we just need web sites optimised for mobile a lot of the time. Agree! We don't need or want really to install things. Agree! So, um, wait. What's PWA for again? It's not to replace native apps, and it's a kind of installable thing...

...glorified home screen bookmark with an over-complicated caching model that would've been better served just by the built-in web browser providing a first-class ability to pin things to the home screen? Oh, wait, we have the latter. So, uuuh... All we've really got left is the ability to work offline. And of course, given that we recognise native is best unless you really just want to access what's on a web page, and the point of the web page is to be online, we're left with what? Crap games written in JavaScript that'll just about work offline because you jumped through the hoops needed to declare the resource collection in a manner that allows it to be accessed as a PWA?

If it's worth an app, treat your users with respect and make a decent native app that conforms to all your target operating system's best-practice recommendations, layouts, integrations with the rest of the system, accessibility features, and so-on. Have it lean on native frameworks as much as it possibly can to keep the app as small as possible and allow it to conform, often with little or no effort, with things like dark modes or system-wide contrast variations or animation styles or whatever.

If it's not worth an app, just make sure your inherently online web site is a best-possible online web experience on mobile and stop wasting your development money on PWA bundling.

Andrew Hodgkinson

Re: "see the capability gap between native and web apps closed as much as possible"

Amen. As for PWAs, their slow, janky, non-native, non-integrated "performance" is of benefit only to the lazy developer, who cares not for the battery life (or multitasking, given that means sharing RAM) of end user devices.

Native development is and will always be very much faster, very much more integrated and very much more respectful of local device idioms than a PWA. And yeah, in particular, given the history of web security - having a clueless JavaScript monkey hack up a way to siphon money off my credit card via some kind of JavaScript in app purchase API fills me with horror. We _all_ know exactly where that's going to end up.

Flaw hunter bags $75,000 off Apple after duping Safari into spying through iPhone, Mac cameras without permission

Andrew Hodgkinson

Re: Use our code its always better

From the article:

He found flaws in rarely used specifications that browsers nevertheless have to implement in order to be compliant with other code, but which do not get the same level of attention as commonly used parts of the browser API.

Pretty sure there'll be variants of this found, if you tried hard enough, in Chrome, Chromium, Opera, Firefox, Edge, MSIE...

Yes, Apple's software quality is increasingly terrible, but rancid specs are the bane of the web world. There is a litany of errors, with increasingly absurd, edge-case-riddled and ginormous specs becoming a bad joke as we go from version to version. Look at HTML 4 or XHTML 1.1 versus HTML 5, for example, or even CSS 1 vs CSS 2.

It's a nightmare of a job to implement this stuff. I know, I've done it, many years ago now; HTML 4 was new back then. I'm glad I'm not trying to do it in an HTML 5, CSS 3 world, especially not with modern JavaScript / ECMAScript and the bazillion flavours of that along with its ever-growing list of ever-more invasive interfaces into the host operating system as lazy programming (and a deficit of half-decent alternatives) continue to make engineers hell-bent on some kind of 'write once, run everywhere' model that competes with native-written applications tailored for the host operating system. Lowest common denominator is the _best_ outcome you can hope for. As for security? Forget it.

ZTE Nubia Z20: It's £499. It's a great phone. Buy it. Or don't. We don't care

Andrew Hodgkinson

Photos are "excellent"? In what universe?

Seriously, that autumn park photo was just absolutely *horrible* - weird, inconsistent smearing and edge artefacts, as some kind of screwed-up noise reduction system seems to have turned it into something resembling an impressionist painting. The trees themselves (particularly the lower parts, in shadow) were bad, but worse yet was any of the grass under them or in the distance. Just the in-page image looked fuzzy and strange, but when zoomed in, it looked outright faulty. Hideous.

For £499, I'd be taking that straight back to the shop as unfit for purpose - based on your in-depth review of two photos (sigh) it's one of the worst performing cameras I've seen in a 2019 smartphone at that price point. Surely *half a grand* for the base level model means one should have some kind of standards here?! This is not a cheap device!

Not very bright: Apple geniuses spend two weeks, $10,000 of repairs on a MacBook Pro fault caused by one dumb bug

Andrew Hodgkinson

Re: Genius, more like idiot.

If you read TFA - he has a T2 chip in the laptop and a password set. It turns out that PRAM reset doesn't work if you have that. People tried PRAM resets; they didn't make the backlight come on.

There are very many levels of fail in the whole sorry story, but it boils down to modern Apple - each major hardware iteration gets worse and more expensive; each major software iteration gets more buggy, gaining new bugs that are never fixed; the once-industry-leading documentation is getting sparse to non-existent.

if developer_docs == bad then app_quality = bad; Coders slam Apple for subpar API manuals

Andrew Hodgkinson

It's just part of modern software engineering; with each passing year, code gets more bloated, less reliable and more poorly documented. Upcoming engineers either just can't be arsed documenting anything or literally don't know any better and are somehow unable to figure out for themselves how vital good documentation is. Rigour is dead; RIP.

As with most recent Apple problems, the most serious rot seemed to start quite suddenly, around the iOS 7 / OS X 10.7 era.

Turn on, tune in, drop out: Apple's whizz-bang T2 security chips hit a bum note for Mac audio

Andrew Hodgkinson

Re: Possible solution...

According to news reports, yes: using a USB 3 audio interface, a Thunderbolt audio interface, or a USB 2 interface hanging off a USB 3 or Thunderbolt hub should all work. Only USB 2 devices on the internal bus appear to be problematic.

Still a damned stupid bug!

Oracle demands dev tear down iOS app that has 'JavaScript' in its name

Andrew Hodgkinson

Well this all raises a pretty good point

So in the interest of forgetting any and all associations with Oracle, why don't we just call it ECMAScript, then? Even if that's clumsy? Most colloquial references seem to talk about it by framework anyway - "This site uses Angular", "The service is written in Node".

The app just changes its description to "HTML5, CSS, ECMAScript, HTML, Snippet Editor" and carries on being a hobby project with the accidental reference to the pointless bloatware dead-man-walking that is Oracle left consigned to the dustbin of history.

If Oracle want their trademark, let them have it. And thus, let it die.

Fake-news-monetizing machine Facebook lectures hacks on how not to write fake news that made it millions

Andrew Hodgkinson

Facebook implemented a set of guidelines - and you won't believe what happened next!

Number 7 blew my mind!

<-- [Prev] I ate 30 tomatoes a day AND LOST 15KG IN A WEEK <--

--> [Next] 23 Reasons You Should Only Buy Blue Donuts -->

Have MAC, will hack: iThings have trivial-to-exploit Wi-Fi bug

Andrew Hodgkinson

Re: Now I'm Confused

But that's just it. I've bashed Apple's declining s/w quality relentlessly since the truly horrible days of the introduction of OS X Lion and iOS 7, but with High Sierra and iOS 11, the media is genuinely struggling to find anything wrong. Let's review the list:

* 32-bit apps stop working. Um, hardly a bug; that's been advertised since 2014, with non-64-bit app submissions to the store rejected for at least two years. iOS 10 started warning users about it, in increasingly strong terms. If you rely on a 32-bit app then yeah, it's crappy and you can't upgrade, but it's still a 3-year-old well advertised deprecation and means you are using an app that can't have had a single update or security patch in at least two full years.

* You can't turn off WiFi and Bluetooth! Panic! Uuuh, except you can, in Settings. Questionable UI for anyone but novices in Control Centre for sure, but the rationale is well explained in the Apple knowledge base article - it seems journos can't be arsed even reading *that* much these days though.

* An actual bug! The Exchange connection issue. I didn't experience it, but enough did that 11.0.1 is already out and fixes it. So, that's gone.

* Another bug perhaps? Some people report slow application launch times. I've not noticed it being slower, but then I've been on the beta a while before the final release and perhaps I got used to it. There could be a genuine issue here. The "double launch" UI animation bug is still present, so clearly something is amiss. This one seems legit.

* Worse battery life! Yeah, as with every update. Every single one. Spotlight reindexing and usage profiling data restarts each time. In 1-2 days, it'll settle as it always does. Doubtless a few people out of the many millions who can upgrade will have bad patches that don't function properly and need to restore, which sucks, but is that a reason to have screamed "do not upgrade" a day or two ago? No.

Aaaaand that's it so far. That's all. To me, that's basically mind blowing. I've never seen an OS release from anyone with so few headline bugs at release, even before iOS 11. There are little UI glitches all over the place, but nothing breaking the device functionality. Quite something, especially given the magnitude of changes on the iPad, which pretty much all seem to work properly.

High Sierra is a similar story and, very rarely for modern Mac OS, actually runs faster than 10.12 on some older hardware, allegedly thanks to Metal 2 and (for all-flash storage devices) APFS. It certainly sped up my 2011 MBP. Again, very few significant bugs are evident, despite an entirely new filesystem; amazing. Yes, it's still a pale shadow of 10.6 thanks to ongoing absurd RAM requirements and such, but even the RAM problems are much reined in compared with 10.12. Perhaps being stuck on 16GB max in laptops thanks to Intel limitations has been a motivator!

So I can say what I like about the intermediate years, but they seem to have genuinely knocked it out of the park on this one.

Attention adults working in the real world: Do not upgrade to iOS 11 if you use Outlook, Exchange

Andrew Hodgkinson

Full of nonsense as ever

The continued race to the bottom in El Reg persists, I see.

You can, of course, completely turn off Bluetooth / WiFi in the Settings app. Control Centre does the rules-based "sleepy services" thing, which only supports the Apple proximity device detection features for stuff like Continuity or AirDrop. It'll probably solve more support queries than it generates, even if I personally don't think it's a good idea. And yes, airplane mode works normally.

But hey, thanks for warning us of the perils of the death of 32-bit, advertised and warned about since 2014; a bug for some users (myself not included) with Exchange; and a Control Centre oddity that most people won't even notice. For sure, iOS 11 truly is a catastrophe.

(Sent from the department of "why do people come out with this rubbish?")

Roses are red, you're feeling blue, 'cos no one wants to watch VR telly with you

Andrew Hodgkinson

Advert sent straight from the 1970s...

...or am I supposed to just ignore the bit around 10 seconds in, where there's a close-up of a woman's arse and a fast cut to a smug pervy guy grinning in appreciation?

I suppose it's close to honest advertising, given that the most likely way this will catch on - if at all - is for porn; perhaps since kids wearing headsets can't even *see* their parents walking in on them in the bedroom, they just won't care. Out of sight out of mind.

Stop us if you've heard this one before: Seamen spread over California

Andrew Hodgkinson

Re: Swarms of weaponized suicide drones

Don't worry, all you need to do is transmit some loud rock music over FM and the swarm will explode.

(Such a shame, that film had started quite well...)

Crocodile well-done-dee: Downed Down Under chap roasted by exploding iPhone

Andrew Hodgkinson

A few have

A few Tesla vehicles have already caught fire. Stories that pop up in Google include one that burned after an accident, one that burned in a garage while the owner claimed it was *not* plugged in, and one that burned in Norway while connected to a Supercharger.

The fork? Node.js: Code showdown re-opens Open Source wounds

Andrew Hodgkinson

Re: Without open source there would be no leftpad

Except you've completely failed to answer my vertical-centre challenge, and the markup you require is a complete clusterfuck of mangled HTML and CSS hackery that is the total opposite of separating content from presentation logic.

And frankly, given that CSS even contains generated content directives these days, that idea sailed away a long, long time ago.

Andrew Hodgkinson

Re: Without open source there would be no leftpad

Sure, so how about you vertically centre that line of text next to that image which is on the right, and doesn't wrap around when the viewport gets small?

Have fun with your cross-browser CSS. I've already finished with tables.

Old isn't always bad.
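(Editorially, for the record: the vertical-centre challenge above is, in modern CSS, a couple of flexbox declarations. A minimal sketch follows — element names and filenames are illustrative, and flexbox was nowhere near reliably cross-browser at the time this argument was being had, which was rather the point.)

```html
<!-- Text vertically centred beside an image on the right;
     flex items sit on one line and don't wrap by default. -->
<div style="display: flex; align-items: center;">
  <span style="flex: 1;">Some vertically centred text</span>
  <img src="thumbnail.png" alt="thumbnail">
</div>
```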

Sloppy security in IoT putting 'life and limb' at risk, guru warns

Andrew Hodgkinson

Re: The Greatest Fear is Fear Itself

Amen. OP's analogy is ludicrous. The river can't be commanded to flood your home by a botnet operated by script kiddies 10,000km away, but the IoT boiler [1] can sure as hell be commanded to close its inlet valves, boil itself dry and burn your house down remotely. Good luck trying to get the insurer to pay out when they claim that it "clearly must have been your fault", because computers are perfect and the manufacturer claims the IoT boiler is secure.

[1] Y'know, so you can warm up the water from work before you get home, because, like, everyone totally needs that and a dirt cheap mechanical timer just wouldn't cut the mustard.

Microsoft kills Sunrise

Andrew Hodgkinson

Re: Any one surprised?

The phrase you are looking for in relation to Microsoft is: "embrace, extend, extinguish". On this occasion, though, we might have skipped "extend" entirely.

Debian farewells Pentium

Andrew Hodgkinson

Re: Ouch!

Amen. "Farewelling"? What the actual fuck, El Reg.

Picking apart the circuits in the ARM1 – the ancestor of your smartphone's brain

Andrew Hodgkinson

Open source these days

> For some reason the software on the Archimedes seemed to be far more advanced than that for Win/x86 machines

It is open these days - I'm involved with RISC OS Open Limited, which manages it. Emulators are available for those who want the nostalgia, and it'll run on things like the Raspberry Pi for those who want something a bit more practical :-)


US government's $6bn super firewall doesn't even monitor web traffic

Andrew Hodgkinson

Re: At least 90% of the Register's readers

Diodelogic wrote:

> The only surprising information is that the firewall caught as much as 29% of the intrusions. I'd have guessed somewhere in the 6-9% range.

It didn't. It caught 29 of them. 29, not 29%. Which was indeed, as both your guess and the article say, around 6%.

Huffing and puffing Intel needs new diet of chips if it's to stay in shape

Andrew Hodgkinson

The unbridled greed of the MBA's New Normal

A 1-2% drop over the entire year, with 55 *billion* dollars of revenue and earnings per share over two dollars regardless.


Yeah. OK. Whatever.

The new Huawei is the world's fastest phone

Andrew Hodgkinson

I'll get ignored and/or downvoted, but - "world's fastest"?!

Not that benchmarks really matter all that much unless you're an engineer, but your claim in this article is absurd, as one or two other commenters have pointed out.

Let's look at that Anandtech link which is the only thing you provide as a possible basis for the claim. The differences in results aren't small here - they're quite big jumps:

Kraken - iPhone is fastest

Octane - iPhone is fastest

WebXPRT - iPhone and Note are fastest

OS II System - iPhone is fastest

OS II Memory - Huawei is fastest

OS II Graphics - Almost everything is faster, Huawei tanks

OS II Web - iPhone is fastest

OS II Overall - iPhone is fastest

PC Mark - only tests Android and Huawei wins

I have no special love for today's Apple, their software quality is horrific, but on every benchmark except Android-only or *one* result for OS II, iPhone beats it.

"World's fastest phone"? What are you smoking?

Huawei announces tiny 10 KB IoT kernel

Andrew Hodgkinson

Re: 10KB for the OS?

- "Swtmr" objects (whatever they are)

Software Timer, surely.

You've come a long way, Inkscape: Open-source Illustrator sneaks up

Andrew Hodgkinson

If you're on OS X...

...then Affinity shows you just what a powerhouse closed source can be (I'm going to get *so* many downvotes for this) and just how good software really can be when it targets, and integrates deeply with, a specific operating system.


As for "I haven't found anything in Photoshop that Gimp can't do", try > 8bpc images. Unless you're on the 2.9 beta, Gimp *still* can't do deep colour after years and years of waiting.

Elementary, my dear penguin: It's the second beta of Freya

Andrew Hodgkinson

Elephant in the room

Amazing nobody's mentioned that this "nice skin" is an utterly shameless rip-off of pre-Yosemite OS X, including a Finder-alike with sidebar, pre-Yosemite full-screen arrows in the same location on the title bar, and even the stuff OS X users complain about, like the hard-to-see "running application indicator" lights under the icons in the faux-3D dock. The Calendar interface screenshot is also disturbingly unoriginal.

I can only assume the reason they don't get sued into oblivion via design patents is that nobody's paying for the OS, so they aren't big enough to worry about.

Apple deliberately wiped rivals' music from iPods – iTunes court claim

Andrew Hodgkinson


This is dubious behaviour at best, but the title is nonsense (as are all titles I've read so far on the inevitably sensationalist reporting of the story).

Music loaded into iTunes would sync to an iPod just fine. You could buy (say) MP3s from anywhere you liked, load them into iTunes, sync and it'd work (within the normal iTunes/iPod values of "work"). Music sync'd to the iPod *BY SOME OTHER METHOD* would cause iTunes to barf.

The context is Apple's accusation that RealNetworks hacked the iPod by allowing unofficial iPod sync within the Real player, so that Real could directly sell and sync music over their own service instead of Apple's (rather than selling over their own service, then throwing the files at iTunes for playback and sync - they had an overinflated opinion of the RealPlayer "brand", I suspect). Apple disagreed with the reverse-engineered syncing, called it hacking, and took steps to stop Real doing so. If iTunes found files on the iPod which it didn't put there itself, it would insist you reset the device.

You might very well argue that Real should've been allowed to do that (interoperability, monopoly breaking etc) and think it warranted an anti-trust suit. And lo, that's what's happening, and that's where the story comes from.

Fait Acompli: Microsoft gobbles Android email upstart for $200m

Andrew Hodgkinson

Re: Wtf?

Plus the execs from Zimbra and VMware probably knew execs from Microsoft from the get-go, and after lots of dual-direction corporate wining and dining, the backhanders were all sorted so the sale could go ahead.

The idea that an e-mail client could be worth $200 *million* is absurd even by typical industry absurdity, so it's very likely that there's more to it.

PEAK APPLE: iOS 8 is least popular Cupertino mobile OS in all of HUMAN HISTORY

Andrew Hodgkinson

Re: Still refusing to admit

Well yes, obviously, it makes a lot of sense.

If you upgrade on device, then the upgrade archive has to be downloaded to the device filesystem somewhere. So that's 1GB-ish. Then you need a bunch of scratch and verification space, probably room to unpack files etc., and though 5GB seems excessive, you can certainly see how there'd be escalating storage requirements - especially with on-device checks and balances to make extra sure the OS isn't stuffed if anything goes wrong.

When doing it via iTunes, all the big files can be kept on the downloading computer, with only filesystem changes sent over the wired connection to the device. It may well be possible to be much less careful about keeping a consistent system state too, since if you're connected to iTunes, the upgrade process will already have made a restoration file in case things go tits up.

Big Retail's Apple Pay killer CurrentC HACKED, tester info nicked

Andrew Hodgkinson

Re: Concerns

> b) let Apple/Google know all about your purchases

That's exactly what Apple Pay doesn't do, since Apple, unlike Google, don't make money off user data. They were able to avoid storing any information about the purchase, and turn that into a unique selling point - and point scoring! - against Google, which does make money off this sort of thing and does collect data.

Apple often seem to behave in a nasty way, but the interesting part is that with Apple Pay, they have a vested commercial interest in *not* collecting your data. They're financially motivated to be the least evil in this particular case.

CurrentC is a waste of time because it's such a myopic US-centric mess anyway; social security number? In 2014? Chortle. Meanwhile, Apple Pay might struggle outside the US just because the rest of the world was already onto Chip & Pin, and now PayWave etc. anyway. The US transaction market has always seemed pretty "quaint" to much of the rest of the world.

Size matters – how else could Dell squeeze 15 million pixels into this 27" 5K monitor?

Andrew Hodgkinson

It's just a HiDPI version of 2560x1440

Lots of (entirely IMHO) missing-the-point comments about 4K.

4K is basically rubbish on large screens. It's just a 1920x1080 *equivalent area* display in most cases, albeit with the ability to scale outside the natural quad density arrangement (or in Windows' case, a general inability to scale consistently across applications with Hilarious Consequences). Most use a 4K monitor as something with 4 times the detail of the equivalent "low DPI" display, for a 1920x1080 equivalent area. At 27" and using typical laptop display area / on-screen element size as a reference, that's comically bad; UI elements are very large with a feeling of much wasted space.

Worse, 4K made it look like the industry would just settle on mass producing cheapo 4K panels and we'd get stuck in a prettified version of the 1080p rut we've endured for several years already.

Fortunately - albeit in largely niche products with a price to match - manufacturers have been making 27"-30" monitors with a 2560x1600 or 2560x1440 resolution for a few years now (the former 16:10 shape preferred, for more area/height - e.g. widescreen video editing *plus toolbars* top/bottom). Dell's new so-called 5K display isn't a numerical contest to try to make 4K look outdated; it's merely the natural quad-density evolution of those predecessors. You end up with a 27-30" monitor that has a vaguely sane desktop "area" (think of it as 1440p), rather than something that's really just a crummy 1080p panel in disguise.

To someone who was using 1600x1200 CRTs in 1994, today's "FULL HD!!1!!11" seems rather pathetic, and 4K overrated and overdue. The Dell announcement is great news, though I'd be even happier with a 30" display at 5120x3200 :-P
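(Editorially: the quad-density arithmetic behind the headline pixel count, sketched in Python purely for the sums.)

```python
# HiDPI / "quad density": each dimension doubles, so the pixel count quadruples.
base_w, base_h = 2560, 1440                # the classic 27" panel this evolves from
hidpi_w, hidpi_h = base_w * 2, base_h * 2  # Dell's "5K": 5120 x 2880

pixels = hidpi_w * hidpi_h
print(f"{hidpi_w}x{hidpi_h} = {pixels:,} pixels")  # 5120x2880 = 14,745,600 pixels
```

14,745,600 is the "15 million pixels" of the headline; the wished-for 30" 5120x3200 panel would be 16,384,000.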

Apple 'fesses up: Rejected from the App Store, dev? THIS is why

Andrew Hodgkinson

Re: Blah, Blah Blah

Yes, so it's a good thing that this is just the pointless summary done by a journalist, rather than the more useful summary, with links to the full information, done by Apple.


Try reading the original material - then you can complain about what Apple *actually* said, rather than assuming The Register's language is Apple's language.

Apple abruptly axes Aperture ... Adobe anxiously awaits arrivals

Andrew Hodgkinson

So more crappy portware - or just move to Windows

Apple's software slide into utter mediocrity continues. How depressing.

I personally find all Adobe UIs I've ever used to be utterly horrible, but marginally less horrible on Windows since Windows doesn't have the same system-wide integration and toolkit approach of Cocoa on OS X. Your expectations as a user are thus much lower anyway. OS X seems to be as slow as molasses these days too (it all went horribly wrong at OS X Lion) and has at least as high a bug count as Windows now (again it all went horribly wrong at OS X Lion), so it just seems pointless to bother running it anymore, even if I'll miss Logic, text services and AppleScript support.

Fanbois Apple-gasm as iPhone giant finally reveals WWDC lineup

Andrew Hodgkinson

Now. With even longer. Pauses.

Much as I'm interested in WWDC as a developer, I'm not particularly looking forward to the Tim Cook keynote. Over time he's developed inter- and intra-sentence pauses that are getting so long, he's risking heat death of the universe before he finishes the speech.

Hopefully someone at Apple has noticed.

...It's getting ridiculous.

...Asked him to speed up, but somehow...

...I doubt it.

<Thunderous, slightly relieved applause>



Biting the hand that feeds IT © 1998–2022