* Posts by juice

776 posts • joined 16 Nov 2010

Don't touch that dial – the new guy just closed the application that no one is meant to close

juice Silver badge

Back in t'day...

I've mentioned this little tale before - though I must stress that it's one which was handed down to me.

Back in the mid-90s, there was a bit of a skunkworks project at a major telco, to see what could be done with this newfangled World Wide Web malarkey.

And so a demo was put together for higher management - but as the technology was literally still being written, there was a lot of smoke and mirrors involved in said demo, and the (distinctly non-technical) person hosting the demo was given a very precise script to follow.

Come the day, and the guy doing the demo became nervous and started to wander off-script. Cue lots of curses from the techies in the background, as they frantically tried to wire together stuff in realtime to try and make his improvised actions work...

After the demo, said demonstrator was nearly throttled when he wandered off the stage, came round to the techies and cheerfully said "Well, I think that went pretty well..."

Check your bits: What to do when Unix decides to make a hash of your bill printouts

juice Silver badge

Re: Not a Cossie, but...

> Didn't the Top Gear team always claim that the fastest car you'd ever drive (or at least the one you'd drive fastest) was a hire car?

Your mileage may vary.

I once hired a car in Barcelona, as there was a little village nearby which I wanted to visit - it has an annual street art festival, so all the buildings are prettied up to the nines.

Alas, something which I failed to take into consideration is that the queues at the hire-car place would be both spectacularly long and glacially moving; it literally took about 3 hours to eventually get to talk to a human and secure the key to a car. Which the nice lady told me would be an automatic.

Oddly, it turned out to be a manual. A tiny petrol 1.0 litre Toyota with over 100,000 miles on the clock.

Perhaps unsurprisingly, that thing struggled to make any speed; I was pretty much standing on the accelerator in third gear just to get it up to 60mph on the motorways.

However, it did have one major thing going for it: the aircon worked.

And this was a major blessing, since with all the faff at the hire-car place, I ended up arriving at this wee village at midday. In June, on a blue-sky day with the sun scorching down at around 40C.

After an hour of walking around and snapping photos, I pretty much melted back into the car, started the engine and just hugged the dashboard until my extremities stopped glowing...

And following that, it was time to stand on the accelerator again, for the return journey!

BOFH: You'll find there's a company asset tag right here, underneath the monstrously heavy arcade machine

juice Silver badge

Re: Personal heaters

In a past life, I worked for a company which offered free tea and coffee making facilities, together with instant hot water devices.

However, for some people, this hot water wasn't hot enough to make A Perfect Brew, and so they used their own kettles. Which was all well and good, until one day, they forgot to secure the lid, which left said kettle stuck in a permanent "almost boiling" state.

And in turn, this meant it generated vast clouds of steam; enough to trigger the smoke alarms.

And that's how an entire company - including the call-centre - ended up stood on the street outside, waiting for a fire truck to arrive.

Needless to say, words were had...

Sir Clive Sinclair: Personal computing pioneer missed out on being Britain's Steve Jobs

juice Silver badge

Re: Ah young un's...

> And so on a so forth. If you look inside a 1980 issue of the UK micro computer mags of the day you will see that the ZX80 was met by a massive wave of indifference. Sure it was cheap but for another 100 quid you could actually get a real microcomputer. That could do things. And for 400 quid more you could get one of the proper ones. Like an Apple II or a PET. Who did not lust after an Exidy Sorcerer in 1980...

I'll agree with this to some degree. But it's worth noting that £100 was a lot of money back in 1980 - it's roughly equivalent to £450 in current terms, which meant that it was comfortably more than a week's wage back in the day.

So, yeah. IT professionals and hobbyists may have turned their noses up at the distinctly primitive ZX80 and ZX81. But there were millions of people who were interested in these newfangled electronic devices, but who couldn't afford to splash out over a month's wages on one of the "proper" machines being imported from the USA.

Especially those who were unemployed, with UK unemployment soaring to 3 million at the time. That's a lot of people who were time-rich but cash-poor, and Sinclair's "cheap as chips" machines were a lot more affordable for them, even as the media and professionals turned their noses up at them.

After all, it's arguably not a coincidence that many of the great software houses of the 80s came from industrial towns and cities in the north. E.g. Liverpool gave us Bug Byte and Software Projects, Manchester gave us Ocean and Sheffield gave us Gremlin Graphics...

> Now what Clive was a genius at was self publicity. Absolute genius, he had no peers [...] And we had Microchip Mad Inventor straight from Ealing Studios Central Casting, Clive Sinclair.

I'm not sure about that. I think Clive's main genius lay in the fact that he recognised that people would be willing to buy something which was Good Enough. E.g. his calculators may have offered 90% of the features and 80% of the performance of the equivalent machines from TI (to pick some arbitrary values), but they were also half the cost. And given the choice between spending £2000 and £1000 (in 2021 prices), many people were happy to opt for the lower-cost option. Because it was either that, or go back to pen and paper!

It was always a risky business strategy - the high return rates of the Black Watch essentially bankrupted Sinclair Radionics. And therein perhaps lies Clive's main failing: when it became clear that demand seriously exceeded supply, his companies never really got the logistical and manufacturing side of things sorted.

Unlike Amstrad, who seriously shook things up when they bought his IP and started to churn out the +2 and +3 models.

> So to those of us there at the time Clive was just this media side story that had no real impact on the longer term development of the business. It was what happened at places like Acorn, AST, Psion etc that had a long term impact. Not a lot of subsequent businesses came from the Sinclair Research alumni

I'll agree with this to a degree, but you do know that Chris Curry worked at Sinclair Radionics for 13 years before founding Acorn? And Psion used to write software for the ZX Spectrum? And the hardware team from Sinclair went on to create the Flare hardware, which ended up in the Atari Jaguar?

Still, the "Acorn" side of the family tree undisputedly had a bigger impact on the world of hardware. But I'd argue that the Sinclair branch gave us a much bigger impact on the world of software, precisely because a much larger percentage of the population could afford to buy his cheap and shonky hardware.

> But in his defense he was not the financial fraudster that Jack Tramial was or the criminal psychopath that Steve Jobs was. Clive was always doing his wheeler deal schemes all of which eventually failed

For me, I think the problem is that the world moved on. Clive's "build it cheap and accept a high failure rate" approach worked well enough in the 60s and 70s, when - as with his calculators - it was basically a case of choosing Sinclair or nothing for most people.

But as we moved into the 80s and the electronics industry started to mature, prices began to drop and an entire ecosystem of machines at varying price points began to emerge.

Which meant that it was no longer a case of Sinclair or nothing, and given his somewhat chequered reputation (in terms of both reliability and delivery timescales), people increasingly went for the options which were a bit more expensive, but far less "quirky".

Equally, and to be fair, Clive did continue to explore interesting stuff post-Spectrum, such as wafer-scale integration. But again, this work was focused on ways to use "defective" hardware, and with Moore's law marching on and yields generally improving, it was easier and cheaper to just use mass produced components...

juice Silver badge

> I'm sure Sir Clive was great, but this is nonsense. I played around with a TRS-80 back in 1980, borrowed from a family friend. Apparently launched in 1977.

It's worth bearing in mind that the view from Sweden would have been very different to the view from the UK.

At a glance, Sweden's unemployment rate was around 2% for most of the 80s.

Conversely, the UK (aka: the "sick man of Europe" at the time) had an unemployment rate which was rapidly rising as it entered the 1980s; it peaked at 12% in 1984.

So fundamentally, there were far fewer people in the UK who could afford to buy a "real" computer - and even many of those who could afford it would have been wary of spending a month's wages on a home computer.

As such, there were arguably far fewer home computers in the UK than there would have been in Sweden, and far less general awareness of the direction that the computer industry was moving towards.

Which isn't to say that people weren't aware that micro computers were the future; after all, the BBC ploughed large sums into their Computer Literacy Project, which saw BBC Micro computers installed in schools across the country.

But that didn't really get going until 1982, and by that time, Sinclair Research's cheap little machines had already gotten a strong toehold in the market, thanks in no small part to the fact that they could be bought for less than a week's wages...

RIP Sir Clive Sinclair: British home computer trailblazer dies aged 81

juice Silver badge

Re: QL

> Sinclair should have simply built an IBM PC Clone based on the 68000 and called it the QL. It would have been expensive but it would have been able to do actual work.

Wha? The technology at the time was way too primitive to bridge the gap between x86 and 68k in that sort of way. Tricks like VMs and bytecode were several decades away at best, though you did later get hardware cards to let Amigas pretend to be Macs, etc.

The big problem with the QL is that it was specc'd in 1981 but didn't actually launch until 1984. By that time, the various hacks dreamed up by Sinclair to cut costs looked increasingly poor compared to the competition - e.g. the Amiga came out in 1985 - and it was bug-ridden to boot.

juice Silver badge

> if Sinclair hadn't done it, someone else would

I can see where you're coming from, but I think we may have to agree to disagree :)

For me, Sir Clive's philosophy can pretty much be summed up as:

a) find a market where things are expensive

b) make things as cheap as possible

c) find ways to reuse obsolete technology

d) accept high failure rates

E.g. back in 1972 the Sinclair Executive calculator cost the equivalent of £1000 in 2019 money. Which was half the cost of any other digital calculator on the market at the time. Which basically meant that it was the *only* choice available to most people, and a significant improvement as compared to working out calculations with pen, paper and slide rule!

And I think there were very few people who could combine that particular set of ethoses. Especially when it comes to d) - Wozniak was arguably at least as clever as Sinclair when it comes to innovation, but I suspect he'd have been horrified at the idea of building anything which wasn't as robust as he could make it.

It's a very high risk strategy, and it did in fact go spectacularly wrong; back in the 70s, the high return rate for the Black Watch essentially bankrupted Sinclair Radionics, and it essentially had to be bailed out by the government, while Sir Clive spun up another company which eventually became Sinclair Research...

Could someone else have been clever enough to come up with the various cunning hacks created by Sinclair and his team? Would someone else have been prepared to literally dig up scrapped components and reuse them? Would they also have had a thick enough skin to weather the complaints and costs of the high failure rates?

I'm not convinced there would have been. I mean, I'm sure there would have been competition and Moore's Law would have marched on regardless, but I suspect it would have been a much slower process and prices would have stayed higher for longer.

Though saying that...

> which is why when half decent systems like the BBC Micro came out, Sinclair sank fast and without trace.

The BBC Micro predated the ZX Spectrum - in fact, the ZX Spectrum was at least partly created due to Clive being furious that he'd lost the BBC contract to Acorn.

I'd also argue that the ZX Spectrum had a very successful commercial life - it lasted all the way up to 1988 or so, once Amstrad took over and started to produce the +2 and +3 models.

What caused Sinclair to sink was a number of things; the amount of money ploughed into things like the C5 is one obvious factor.

But for me, it mostly boils down to two things. The first is that Sir Clive didn't really care about computers: to him, they were just another electronic device, alongside the amplifiers, calculators, watches, TVs and various other things his companies produced.

The second is that prices dropped drastically in the 80s. Partly thanks to Moore's Law, but also thanks to the commercial battle between Commodore and TI in the USA, as well as the way that the Japanese government ploughed vast sums into subsidising their newly fledged memory fabs.

All of a sudden, buying a "real" computer wasn't really that expensive anymore. And so by the time Sinclair released the successor to the Spectrum, in the shape of the QL, there were enough choices on the market that people no longer wanted to opt for "quirky" and underpowered systems anymore...

juice Silver badge

Jet Set F*cking Willy, shirely?

(For anyone who's watched Micro Men. I once had the pleasure at a retro computing convention of watching this in play form. And then the double pleasure of following the actors who played Clive and Alan Sugar, as Alan pushed Clive around the convention in a Sinclair C5 which someone had brought...

https://youtu.be/vDnPODC89FA

)

juice Silver badge

Re: Let us not forget the

> That was his true genius. He didn't invent the transistor or the stereo amplifier or the hobby microcomputer, what he did was slash the cost, repackage and sell direct to a new market.

I think his genius came from the fact that he could think outside the box and wasn't afraid to use some downright shonky hacks; in fact, I strongly suspect that Nintendo's "lateral use for withered technology" ethos may have been directly inspired by him. After all, Thatcher did make a point of showing off the ZX Spectrum to a Japanese business delegation...

For instance, with his calculators, the cost of LED segments was directly tied to their size. Clive therefore cut costs by making them smaller and just putting a magnifying lens in front of the LED panel.

Similarly, the CPU was some TI chip which hadn't been designed for calculators; it didn't have the right instructions and drained too much power. Sir Clive and his crew figured out how to run the CPU at a lower power level by dropping the voltage and "pulsing" energy to it at just the level needed to keep it from losing data. And they implemented the missing instructions in software; slower and sometimes less accurate, but Good Enough for most people, especially when offered at a fraction of the price of a "real" calculator.

Of course, the problem with these sorts of innovations is that they're easy for others to copy, and as the industry matured, the price difference between the low end and mid end dropped drastically.

But without that sort of innovative thinking putting evolutionary pressure on things, that price drop may never have happened...

juice Silver badge

> Oddly they overlook the QL, which also fell flat on it's face (overtaking the Mac and PC wasn't likely in business, even back then).

Dunno. Some bloke called Linus Torvalds got his start on the QL; you might have heard of him?

The QL hit a lot of problems, thanks to Sir Clive's attempts to both undercut the competition and stick with Sinclair's in-house technology (e.g. the microdrive).

The result was a machine which was seriously delayed, to the point where legal action was begun against Sinclair. And which was buggy as hell - if memory serves, the original models came with a "dongle" containing part of the machine's ROM. And hideously underpowered; it may have used a 32-bit 68000 CPU, but only had an 8-bit data bus. And unreliable to boot, thanks to the continued reliance on microdrives.

I tend to think that the QL is a prime example of a company failing to recognise that the world has changed. In the early 80s, the no-frills approach for the ZX series made perfect sense, but Moore's Law quickly marched on and people were starting to expect far more from their machines (in terms of features, performance and reliability) by the time the QL eventually staggered onto the shelves. You can maybe even draw parallels with Nintendo, and the way that they thought that the best thing after the Wii would be the Wii U, despite the fact that the novelty of motion controls had worn off, leaving people with an underpowered console with an odd secondary display mainly intended for asymmetric multiplayer games...

Equally, and to be somewhat fair: the QL was conceived in 1981 and could have made more of a splash if it had arrived on time. And it's always worth bearing in mind that Sinclair Research was an electronics company rather than a computer company; home computers were just an unexpectedly lucrative product line which Clive used to fund things such as handheld TVs and electric bicycles. Which both failed but were perhaps just a bit too far ahead of their time...

juice Silver badge

> That doesn't actually make Spectrums, grammar schools or leeches good things, because it ignore all the people who were put off, discarded or killed.

Sounds like you perhaps weren't around at the time.

The early 80s were a grim time for the UK, thanks to various national and international things in the seventies which had crippled the economy. E.g. the oil wars, the union strikes, etc.

And then (and without wanting to get political) Thatcher's government came into power and did some significant restructuring of the economy, especially when it came to publicly owned industries.

Between the "diseases" and the attempted cures, unemployment shot up, peaking in 1984 at around 3 million people, or around 12% of the working population.

So, all those fancy American computers were way out of many people's budgets; as previously mentioned, the C64 cost £399 at launch, or about £1500 in 2021 terms. And that sort of money was way out of the reach for both those 3 million unemployed people and their dependents, especially since there were far fewer financial options back in the 80s; this was the era when you still had to go into a bank and talk to your manager to arrange a loan!

But then Sinclair started to offer cheap computers; first the MK14, then the ZX80 (£100 assembled) and ZX81 (£70 assembled), and finally the ZX Spectrum (£125 for 16k or £175 for 48k).

And yes. They offered the bare minimum necessary to compete in the market. They were notoriously fragile; some were even built using components Clive literally rescued from landfill[*]. You even had to provide your own display and cassette tape player.

But they were Just Good Enough, and a much larger percentage of those 3 million people could afford to buy one for themselves, or their children, as compared to the luxurious American/Japanese imports or even the British equivalents such as the BBC micro and Acorn Atom.

So, yeah. There's a reason why Sinclair struggled to meet demand for the rubber keyed marvel: it was literally the only machine many people could afford to buy. And the UK IT industry would be very different if the many bright bods at Sinclair Research hadn't risen to the challenge of meeting Clive's mandatory price point requirements...

And that's why - especially when it comes to video games and despite the fact that Britain essentially gifted the entire computer industry to the USA, post-WW2 - the UK has arguably punched way above its weight in IT since the 80s. We've even seen things come back around full circle, with the Raspberry Pi revolutionising modern low-cost computing in much the same way as the Spectrum did.

Because they might be underpowered and a bit crap. But they're affordable.

[*] if memory serves, some manager at TI had used a load of half-working memory chips as fill for his drive. Clive heard about this and paid to dig this guy's drive back up. Because hey: half the chip was still working, right?

juice Silver badge
Pint

Farewell to a legend...

Sir Clive's inventions may have been cheap and cheerful (albeit sometimes a bit too cheap), but the humble Speccy was a significant part of my childhood, and helped to direct me into a career in IT. Alongside many others!

In fact, I suspect the UK's computing industry - especially video game developers - would be a lot smaller without the ZX81 and ZX Spectrum.

After all, they may not have had sound chips or fancy graphics capabilities, but what they did offer was enough to get people started, and at a price which meant that a far wider range of people could afford to buy one; the Spectrum may have cost £175 at launch (approx. £630 in 2021), but the BBC Model B cost a whopping £335 (£1315 in 2021) and the Commodore 64 arrived with a £399 price tag (£1565)...

So, yeah. Some of his products may have been shonky. Some of them may have failed miserably. But the ZX Spectrum was a definite success - not just for Sir Clive, not just for his company, but for the entire country!

And I think that's more than worth raising a pint for.

Thanks, Clive. For everything.

Apple debuts iPhone 13 with 1TB option, two iPad models, Series 7 Watch

juice Silver badge

> This said, Sony invented the Walkman, but not the iPod.

tl;dr: Sony did invent the iPod, they just screwed it up.

=====

Back in 1992, Sony released a new technology, in the shape of their Minidisc portable player - these stored digitally encoded/compressed data on erasable magneto-optical discs, and offered far more capacity than any equivalent solid-state (or even HDD-based) system could offer in 1992 - they gave you 140MB of storage/70 minutes of music, at a time when you'd have to spend several thousand pounds to get a 486 with 4MB of RAM and a 200MB HDD.

(Admittedly, the players cost around $350 at launch, while blank discs cost around $16, but with the technology having failed to catch on, by 1994 Sony was starting to engage in some aggressive price slashing. Either way: a 20MB laptop SSD would have set you back about $1000 back in 1991, while a 200MB HDD would have been around $400!

https://www.minidisc.org/wsj_article.html)

Alas, a combination of internal politics within Sony and a music industry wary of this newfangled way of producing "lossless" copies meant that the Minidisc was deliberately hamstrung from launch.

Then in 1993, Fraunhofer released their newfangled MP3 technology with a deliberately cheap "decompression" pricing model. And then had to drop the price of their encoding package when someone leaked their code on the internet.

And then it turned out that high-end 486s and Pentiums could rip CDs to MP3 in near-realtime.

And at 128kbps, the resulting MP3 files were just about small enough to be transferred in near-realtime via 56k narrowband internet connections.

And then some bright spark released a program called Napster, which made it easy to find other people's ripped files.

The moral of the story is that, as with most devices, it's not just about the hardware, but everything surrounding it. And the ability to rip your own music and copy other people's music without any artificial limitations was a major factor in why Minidisc failed and MP3 succeeded. Even Microsoft and other companies (e.g. Real) tried to get in on the act, though they suffered from the same issues as Sony, in that they had to at least pay lip service to the music industry and "protect" your music files from copying.

I can remember a Unix sysadmin at work in 1997, who was gutted when he reinstalled his Windows box, only to discover that all the music he had on the HDD was now useless, since he'd lost the decryption key.

Then in 2001, Apple released the iPod and bundled it with iTunes. And whatever people may think about iTunes, it did a good job of Just Working when it came to putting songs onto your iPod; by November 2001, it even let you burn DRM-free MP3 CDs!

And it's perhaps not a coincidence that Sony's NetMD hardware and Sonic Stage software were released in mid-2001, after the iPod had launched. But by then, it was too little, too late.

Oh, and Sonic Stage absolutely sucked, to boot ;)

So, yeah. Arguably, Sony did invent the MP3 player, but it was crippled from the start by commercial considerations. And so cheap/unlicensed MP3 hardware dominated the market, until Apple came along with a much more pragmatic approach, offering a streamlined user experience without any of the heavy DRM constraints which had crippled Sony, Microsoft, etc.

Facebook building 'on-demand executable file format' that self-inflates using homebrew compression

juice Silver badge

Re: Nothing to see. Move along...

> Packing the executable with DLLs wasn't the novelty. The novelty was in how they packed it, removed redundancy and encoded entropy to improve compression ratios.

From reading the article, it sounds like Facebook have come up with the idea of embedding some sort of programming language into the compression system, so that it can run decompression routines which have been perfectly tailored for the content, rather than a generic "one size fits all" algorithm.

But that's not something new - the RAR format has had a VM system for years which lets people hand-craft code to improve compression ratios. And as is ever the case, someone even worked out how to write a Hello World program within said VM.

https://sudonull.com/post/136971-Hello-world-for-RAR-virtual-machine-RarVM

https://blog.cmpxchg8b.com/2012/09/fun-with-constrained-programming.html
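To make the idea concrete, here's a toy sketch (in Python - the "recipe" names and container format are invented for illustration, and have nothing to do with Facebook's or RAR's actual formats) of a payload that carries its own decompression recipe, so the receiver doesn't need to assume one generic algorithm:

```python
import zlib

def pack(data: bytes) -> dict:
    """Try each candidate recipe and ship whichever suits this content best."""
    candidates = {
        "raw": data,                     # incompressible content: store as-is
        "zlib": zlib.compress(data, 9),  # generic DEFLATE
    }
    recipe = min(candidates, key=lambda k: len(candidates[k]))
    return {"recipe": recipe, "payload": candidates[recipe]}

def unpack(blob: dict) -> bytes:
    """The container itself tells us how to decompress it."""
    if blob["recipe"] == "zlib":
        return zlib.decompress(blob["payload"])
    return blob["payload"]
```

A real system (like the RAR VM) goes much further, shipping actual executable decompression code rather than just a recipe name - but the principle of tailoring the decompressor to the content is the same.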

To be fair, when you've got around 2 billion people using your app on a daily basis, even knocking a few kilobytes off each download makes a difference. Especially when at a glance, they've rolled out over 30 updates just for Android phones in the last 9 months.

https://androidapksfree.com/facebook/com-facebook-katana/old/

And it's not like Facebook is short on resources. But I'd still be inclined to get someone to take a long hard look at why the app itself is so "bloated", rather than faffing around with ways to reduce the impact of said bloat.

G7 countries outgun UK in worldwide broadband speed test

juice Silver badge

Re: I'm not surprised

> I use VM too, but I don’t recognise your description of their kit, generally it’s solid.

For me, the router has been ok; it's needed rebooting two or three times in the last year, which is fairly acceptable.

The TV box is a bit more shonky though; the first one had to be replaced a week after installation, due (I think) to a dodgy HDD. The second one seems ok, but makes a lot of physical noise which makes me suspect the HDD is potentially as fragile as the first.

Though TBH, I keep it switched off most of the time anyway; once the novelty of watching the 90s dance channel wore off, there wasn't really much to keep me interested.

> However, I do resent the fact that they have an entry level service (tests at 110/10 mbps) which costs for so much speed I don’t need or use. 20mbps would be more than enough for my requirements, and should be priced according

The underlying infrastructure costs are the same, and more pricing tiers = more billing complexity and management costs. So I'm not too surprised that they've elected to keep things simple.

> Brag about gigabit fibre, what would I do with that? It’s like supplying me with 500 apples per day when I only eat one or two.

And therein lies the issue for me: how many people actually need gigabit fibre?

I mean, with video streaming having become the standard, the era of people downloading All The Things via bittorrent has mostly come to an end.

And with a 1080p stream generally taking around 5Mbps, even a family with 2.2 kids will generally find it difficult to saturate the standard 100Mbps Virgin offering; even if everyone decides to watch a 4K video stream at 20Mbps, there should still be just enough bandwidth left for the family dog to have a video chat with the neighbour's cat.
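For what it's worth, the back-of-the-envelope arithmetic looks like this (using the rough per-stream rates quoted above - illustrative figures, not official service numbers):

```python
# Rough per-stream bitrates as quoted above, in Mbps.
LINK_MBPS = 100
STREAM_MBPS = {"1080p": 5, "4k": 20}

def streams_supported(quality: str, link: int = LINK_MBPS) -> int:
    """How many simultaneous streams of a given quality fit on the link?"""
    return link // STREAM_MBPS[quality]

# Two adults plus the proverbial 2.2 kids, everyone watching 4K at once:
viewers = 4
used = viewers * STREAM_MBPS["4k"]   # 80 Mbps
headroom = LINK_MBPS - used          # 20 Mbps left for the dog's video chat
```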

I'm guessing that there are cases where higher bandwidths are needed, especially for businesses and/or where upload speeds are important (e.g. TikTok/Twitch/etc, for all those wannabe influencers and the like).

But for the most part, this is feeling like the "media wars", where we went from VHS and Betamax (ADSL/ISDN) to DVD (fibre) and then to blu-ray (gigabit fibre).

And in much the same way as the jump from analog 288i to digital 720p was welcomed by consumers, the jump to digital 1080p was mostly met with indifference, as 720p was generally Good Enough for most use cases.

In fact, to stretch this already strained analogy even further, the arrival of streaming - even though it's generally lower quality - put paid to blu-ray's hopes of media dominance, since it turned out that once again, Good Enough + Convenience was preferable to Better Quality.

And in much the same way, the fact that people can now stream video to their phones via 4G has somewhat further reduced the need for high-speed broadband. Hell, given that 4G streaming is offering that magic Good Enough + Convenience balance, it's arguably making it hard to justify the push to 5G...

Sort-of Epic win as judge kills Apple ban on apps linking to outside payment systems

juice Silver badge

Re: Epic Greed

> There is no guarantee that Apple will let Epic back into the App Store after this verdict

Bear in mind that Epic is part-owned by Tencent, a Chinese mega-corp ($68 billion in revenue in 2020), which owns a lot of game studios (including Sumo Digital from Sheffield) and - depending on who you ask - earns between 35% and 50% of its revenue from online gaming.

And they're currently taking a bit of a hammering on their home turf, as the Chinese government is targeting video games as a "social evil" and drastically limiting how much time children can spend playing them.

https://www.bloomberg.com/news/articles/2021-09-08/china-tells-tencent-netease-of-need-for-tighter-games-oversight

So I'm sure that Tencent is more than happy with the ruling, as it gives them a chance to extract a bit more money from iOS within their international markets.

And since $20 - $30 billion in annual gaming revenue is big enough to make even Apple take note, I suspect that there's going to be some realpolitik between Tencent and Apple.

In fact, I wouldn't be surprised to see Fortnite IP being sold off to a "third party" also owned by Tencent, so that it can reappear on the App Store...

A developer built an AI chatbot using GPT-3 that helped a man speak again to his late fiancée. OpenAI shut it down

juice Silver badge

Re: No good idea should go to waste

I doubt OpenAI will, but someone might. So I can see why OpenAI are keen on trying to keep some sort of brake in place while waiting for the social, political and legal side of things to catch up.

In fact, I've already had several experiences on Facebook which make me suspect scammers are already trying to use chat-bots. I.e.

a) take over (or clone) a Facebook profile

b) chat to people who are friends with the "real" owner of the profile in a relatively realistic way

c) proceed to either scams (e.g. "have you heard of <american specific grant system>?") or extortion attempts (e.g. "I have photos of you")

The last time this happened, I had great fun confusing said chat-bot by responding with stuff like "Ooo, you have photos? Are they the ones where I'm in the maid's outfit?".

A human would have recognised that I wasn't going to bite and cut their losses; the bot just kept repeating a mix of threats and demands until I got bored and blocked/reported it.

So, yeah. This sort of stuff is already happening and being utilised by non-technical criminals, in much the same way that "l33t" hackers discovered there was money to be made from packaging up server exploits into neat little scripts and selling them on the dark net to wannabes.

Fun times...

juice Silver badge

Re: But isnt that how it works for humans?

In best Father Ted style, I'm tempted to reply "that would be an ecumenical matter".

For me, I guess the question is: would the chat-bot[*] be capable of spontaneously taking the data which it's been fed and using it to create something new? Is it capable of learning, changing or acting on its own initiative?

If all it's doing - as suggested above - is picking the "best" answer from a pre-defined list based on some scoring metric and then massaging it a bit, then the answer is a resounding "nope".

To be fair, that's just the old Chinese Room debate, and there's a case to be made (depending on where you sit on the equally contentious nature vs nurture debate) that humans work in much the same way.

But humans do (mostly) learn, change and act on their own initiative...

[*] As tempting as it is to call the chat-bot Samantha, assign it a gender, etc, that sort of anthropomorphism tends to muddy the waters for this sort of debate...

Oh! A surprise tour of the data centre! You shouldn't have. No, you really shouldn't have

juice Silver badge

Re: Gotcha

I once knew someone who was looking for a new job, and had an interview arranged. Unfortunately, they were also short on leave and, unsurprisingly, didn't really want to discuss their itchy feet with their manager.

Fortunately, their company also had a call-out policy, where if you were called out in the wee hours, you didn't have to come in the next morning.

One little cron script later, and they were able to attend said interview...

30 years of Linux: OS was successful because of how it was licensed, says Red Hat

juice Silver badge

Re: licensing technology

> Proprietary kernels are black boxes; it is easy point out all of Linux's flaws but not see all the potential architecture problems or hacks that keep proprietary kernels operating

It's a trade-off: open source is theoretically better, since anyone can see the source code and fix it. The problem is that the number of people with the holy trinity of resources needed to do this - the skills and experience to find and fix things, the time (or funding) needed to make and test the changes, and the motivation to work through the processes needed to get said patches merged - is highly limited.

Meanwhile, black-hat types are far more motivated to go crawling through source code looking for flaws, since there's a potential monetary reward waiting for them.

So, yeah. You pays your money (or not, if it's open source), and you takes yer chances.

Anecdotally, I've recently had a major issue with an Ubuntu kernel upgrade on my Lenovo Thinkpad laptop; processes were spinning out of control, hammering the CPU and then turning into zombies when killed - zombies which were both linked to the "pid 1" parent process of the entire system *and* still chewing up major amounts of CPU.

The only solution was a reboot, which I had to do every 2-3 hours, until I spotted that the kernel had been updated and reverted.

And that brings me back to the point above: this is a pretty major bug to be sneaking into a kernel, and one that could (at a stretch, admittedly) have security implications - perhaps as a way to trigger a denial of service.

But I don't know if I have the skills or experience to go delving into the kernel to figure out what's going on, nor do I have the time to do said delving. And since it's a work laptop, I definitely don't have the time or motivation required to figure out how to submit a bug report, let alone some sort of patch.

Instead, I reverted back to the previous kernel, and I'll leave things as-is for a while to see if a later kernel update fixes the problem.

juice Silver badge

Re: Late to the party

My first experience of Linux was around 1997, which was *counts on fingers* about 24 years ago - we managed to get funding for the university computer society to buy a relatively high-spec machine (might even have been some dual-CPU monstrosity IIRC), and then managed to persuade the IT team to let us stick it in the corner of one of the computer labs and plug it into the network!

Admittedly, at the time, it was just this weird, almost retro text-based system which looked like something out of the movies, and which we accessed by firing up X Windows and twm on one of the "normal" computers in the lab (which generally ran Windows). I didn't really start tinkering with it more until I left uni, acquired an ancient 486 all of my very own, and got a job with internet access, which meant I could download Debian (or maybe even Slackware?) onto a huge stack of floppies...

Pi calculated to '62.8 trillion digits' with a pair of 32-core AMD Epyc chips, 1TB RAM, 510TB disk space

juice Silver badge

Re: Secret messages?

> Actually we can't prove that. There's a word for irrational numbers that contain all arbitrary sequences (that I can't remember) but we can't prove if pi is one of them.

Thank you, mathematically minded bod ;)

juice Silver badge

Re: Secret messages?

One idle bit of musing I did a while ago, was that (in true Shakespeare's Monkey Typewriter style), if you iterate through Pi long enough, you'll theoretically be able to pull any number sequence out of it, by providing an offset and a length.

E.g. "You want '0123456789'? That'll be at offset 5 squillion, length 10".

Admittedly, this is a pretty inefficient way to transmit data, since both sides would need to have at least 5 squillion+10 digits of Pi to hand.

And having just fired up https://www.piday.org/million/, I can sadly report that you can only get "0123" from the first million digits...
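For what it's worth, the offset-and-length idea can be sketched in a few lines of Python - hardcoding a mere 100 digits of Pi here, since generating a squillion of them is left as an exercise:

```python
# Toy sketch of the offset + length idea, using a hardcoded 100-digit
# prefix of Pi (a real scheme would need squillions of digits on both
# sides - which is rather the problem).
PI_DIGITS = ("14159265358979323846264338327950288419716939937510"
             "58209749445923078164062862089986280348253421170679")

def locate(sequence):
    """Return (offset, length) if the digit sequence occurs in our
    slice of Pi, or None if it doesn't."""
    offset = PI_DIGITS.find(sequence)
    return (offset, len(sequence)) if offset != -1 else None

print(locate("358"))   # (8, 3) - turns up early on
print(locate("0123"))  # None - you'll need a lot more digits
```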

Then too, I'm sure that some mathematically minded bod will point out that since Pi hasn't been proven to contain every possible sequence, this isn't actually guaranteed. But hey :)

India makes a play to source rare earths – systematic scrapping of its old cars

juice Silver badge

Re: Price of a tank of gas

> What the Bbc (of course) fails to mention is all the greenwash that gets added to electricity bills. Like adding the cost of building out EV charging networks. Electricity bills are already fundamentally dishonest as they don't (and AFAIK can't) itemise all the green crap consumers are paying for.

Could be worse.

The government gets approx. £40 billion a year from fuel duty, VAT on fuel and road tax. Which is about 5% of total revenue.

https://ifs.org.uk/publications/14407

Throw in the VAT from new/used car sales (~£8 billion [*]), the income tax raised from people working in the automotive industry (~850,000 of them) and the monies raised from garage business rates/spare parts VAT/etc, and I wouldn't be surprised to find that number more than doubled.

Meanwhile, the government spends £11 billion on road infrastructure and £4 billion on public transport.

So even with just the baseline £40 billion value, around 60% of the money raised from motorised vehicles is spent elsewhere. Once that revenue dries up, the government is going to have to put up taxes elsewhere.

Such as on electricity.

Don't get me wrong - I fully understand and agree with the need to move away from ICE-based vehicles, and some of the above will simply transition to doing other things which also pay taxes, in much the same way as old Victorian theatres became cinemas, and then became bars/luxury apartments/etc.

But the ongoing move towards a "post-carbon" transport system is going to be highly disruptive across large swathes of the economy, both in the short and medium term.

[*] Assuming 2 million cars at an average sale-price of £20k per car. Though since a lot of new car sales are via companies, I'm guessing there's a lot of tax amortisation dodges being applied. Still, it's a big chunk of cash!

Right to repair shouldn't exist – not because it's wrong but because it's so obviously right

juice Silver badge

Re: Spot-on analysis

> That wasn't a matter of lowering your hardware specs though, just the quality of the smoke settings

Yeah - that's partly a combination of technical limitations of the engine, and reducing the amount of work your PC has to do... which gives you a better (and/or more stable) FPS.

Because that's the other thing as well: frames per second are not fixed. They can vary wildly, especially if there's lots of stuff happening.

And then there's all the fun with v-sync - if you turn it off, then you'll probably get visual tearing. But if you turn it on, that might drastically drop your effective FPS - e.g. if your machine is managing 50FPS on a 60Hz monitor, then you might end up locked to 30FPS.
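To put some numbers on that lock: with plain double-buffered v-sync, a frame can only be shown on a refresh tick, so the effective rate drops to the refresh rate divided by a whole number of ticks. A quick back-of-the-envelope sketch:

```python
import math

def effective_fps(refresh_hz, render_fps):
    # With double-buffered v-sync, each frame must wait for a whole
    # number of refresh ticks, so the displayed rate is the refresh
    # rate divided by that (rounded-up) tick count.
    return refresh_hz / math.ceil(refresh_hz / render_fps)

print(effective_fps(60, 50))  # 30.0 - just missing 60FPS halves it
print(effective_fps(60, 29))  # 20.0
```

(Adaptive sync tech like G-Sync/FreeSync exists largely to avoid exactly this cliff-edge.)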

So, yeah. The more grunt in your machine, the more effectively you can tune your in-game experience and the more of an edge it gives you.

juice Silver badge

Re: Spot-on analysis

> Is it really that much of an issue? Home PCs are responsible for 3% (previous Reg article) of household electricity consumption.

Depends.

I work at home (two monitors + laptop), and have a personal machine in the shape of an old but relatively powerful dual-CPU Xeon box running Windows 10. The only issue is that it's a bit flakey when it comes to hibernation, so it's either "on" or physically switched off.

Since I live by myself (*sob*) in a fairly small flat - and have a gas boiler - my electricity usage is fairly low; around £1 per day on average. But if I leave the above beast running, that jumps up to £1.50 per day.

Which isn't a huge amount, and it's certainly something I can live with, as opposed to the cost of upgrading to something which is both measurably faster and less power-hungry.

But it's still a 50% jump.

> Also, you are under the illusion that a high-end gaming rig will give you an advantage, it doesn’t. [...] As long as your PC + monitor can do a decent refresh rate at say 1080 upgrading to a top end GPU electricity guzzling gaming rig will not make you better, but the game will look better. Real advantages come from keyboard/controller and a shit hot internet connection.

Actually, it really does: a faster refresh rate means that your input activity is effectively polled more frequently.
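The frame-time arithmetic shows why: at 60FPS each frame is ~16.7ms stale by the time the next arrives, while at 240FPS it's ~4.2ms - a rough sketch:

```python
def frame_time_ms(fps):
    # How stale the on-screen picture (and, in effect, the window for
    # sampling your input) can be between successive frames.
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))   # 16.7
print(round(frame_time_ms(240), 1))  # 4.2
```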

Nvidia aren't exactly unbiased when it comes to discussing why you should buy a faster/more expensive GPU, but their website does have a nice overview of why professional gamers are competing with Bitcoin miners for the high end gear:

https://www.nvidia.com/en-gb/geforce/news/what-is-fps-and-how-it-helps-you-win-games/

For me, it's the same as the financial companies who lay their own fibre optic cables between continents, to shave an extra microsecond or two off their transactional activities. For most companies, that's hugely OTT; for them, it can affect billions of dollars worth of trading!

Tesla battery fire finally flamed out after four-day conflagration

juice Silver badge

Re: Extinguishers...

Dumb question: why not use sand? It's generally chemically inert.

Admittedly, pumping sand is technically complex, but for an installation like this, you could maybe have a crane or robotic JCB on site, ready to tip a few tonnes on any container which is making sparking noises...

Happy 'Freedom Day': Stats suggest many in England don't want it or think it's a terrible idea

juice Silver badge

> The question in my mind is that if every year the media whipped up a hysterical frenzy over how many might die from this year's flu variant, whether or not we would see a similar response from the public.

Perhaps there should be more attention paid to flu deaths - at around 25,000 per year (or ~75,000 for respiratory infections in general), flu sits just behind heart disease and cancer as the third biggest killer in the UK.

Which puts them way above other causes such as traffic accidents (2,000), despite the fact that there's arguably a much bigger and visible public campaign and policing effort around the latter.

https://www.theguardian.com/news/datablog/2011/oct/28/mortality-statistics-causes-death-england-wales-2010

However, it's also worth noting that even with all the efforts put into minimising the impact of Coronavirus, gov.uk puts the total number of Coronavirus deaths in 2020 as being approx. 75,000, while flu-related deaths dropped to around 15,000.

Or to put it another way: Coronavirus killed roughly five times as many people in 2020 as flu did that year - or roughly three times as many as flu normally does. And perhaps ironically, with the way that anti-C19 measures have effectively dropped flu infections to zero, there's a very real risk that come wintertime, we're going to get a tidal wave of flu deaths atop whatever we get from Coronavirus.

https://www.cnbc.com/2021/07/07/winter-flu-season-could-be-big-experts-warn.html

So yeah. We should be taking flu a lot more seriously. And Coronavirus is by any measure far worse than flu.

> Now, anyone in the IT arena should be all to aware of how readily the general public buy into these sort of messages

Anyone in the IT arena should also have a grasp of how exponential growth works. And how quickly mutations (or in IT speak, computer viruses using a new 0-day exploit) can rip through the population, especially if there's no measures being taken to mitigate the risks.

> The biggest problem we now face from COVID is not the virus itself - it's the ongoing point-scoring and backstabbing between the politicians and the media, with the general public getting caught in the crossfire.

Perhaps. It's a complex balancing act, and I don't envy any of the people trying to decide on the best way of getting us through this.

I just wish I was more confident that the current political incumbents for this country were making their decisions based on purely altruistic principles...

juice Silver badge

Situation Normal...

All Fecked Up, as Father Jack would say.

I mean, I can see the logic in going ahead with dropping restrictions; with all adults jabbed (or at least having the option of getting jabbed; I've heard some odd arguments from several people about why they're choosing not to, including one guy who didn't want it because his wife was pregnant), the link between C19 infections and long-covid/deaths is probably as weak as it's going to get.

And the amount of future economic pain and aftershocks being built up in the background is only ever going to get bigger.

However, the key word there is "weak", not "broken". And we're already seeing a significant upturn in infection rates, which to this inexpert eye looks to have gotten a nice speed-boost from all those pubs crowded with people watching the Euro final last week. And it may well get bigger and nastier if a new variant appears which turns out to be even better at sliding through the gaps in the vaccination defences.

So, yeah. It feels like a pretty big gamble - as several thousand scientists[*] and a number of prominent international politicians have noted.

Certainly, for the foreseeable future, I think I'll be maintaining my current "facemask when shopping, outdoor tables when socialising" approach...

[*] E.g. https://www.theguardian.com/world/2021/jul/16/englands-covid-unlocking-a-threat-to-the-world-experts-say

The world is chaos but my Zoom background is control-freak perfection

juice Silver badge

Re: Don't cover it up!

> It was his rather plump middle-aged wife walking past the doorway - not a pretty sight at all.

I have a couple of friends on Facebook, who've occasionally put up a mortified post about (e.g.) how they were handing their partner a cuppa in the morning, only to realise that said partner was on a conference call, and they hadn't bothered tying their dressing gown closed...

Hungover Brits declare full English breakfast the solution to all their ills

juice Silver badge

Re: Monopoly

IT'S A CONSPIRACY!!1!! THE MAN WANTS US TO DRINK ALCHOL! WAKE UP SHEEPLE!!11!

Sheesh.

Alcohol is literally just yeast piss, and has been drunk by humans for thousands of years - and consumed by animals for millions, any time some fruit fell off a tree and was left to ferment.

It's also far from the only "mind altering" substance which people are free to consume. Have some caffeine. Or some e-numbers. Have a smoke. Chew some mushrooms. Hell, just eat a spoonful or two of processed sugar and then wait for the sugar rush... and the crash.

Alcohol also has something of an advantage over other substances, in that it's woven into human society across the globe. Not that this has stopped various people from trying to ban it, often on religious grounds - with the most famous attempt perhaps being US Prohibition. Which went well, at least for the Mafia.

Then too, as per above, it's ridiculously easy to make - since yeast particles drift through the air, you can literally make beer by just leaving the lid off a vat of wort...

https://en.wikipedia.org/wiki/Lambic

> Probably if we also taught at schools how one can defuse build up of emotions and the whole thoughts and feelings connections, without resorting to any substance, then our society would have been in a much better shape. But then, how all those booze peddlers would make their money?

Or, we could fix the many social, political and economic issues which lead to people's lives being a misery. Hell, there are currently 2.5 million children in the UK alone who are experiencing "food insecurity":

https://www.mirror.co.uk/news/uk-news/worried-kids-watch-mums-skip-23799872

But yeah. Let's ban alcohol and tell people to get in touch with their emotions. That'll fix everything!

Three million job cuts coming at Indian services giants by next year, says Bank of America

juice Silver badge

Re: Predictions are like arseholes...

> Take all predictions with a grain of salt & take a "wait & see" stand to determine if it turns out true or not.

In general, I'll fully agree with this one, but I suspect this particular prediction has a bit more going for it than most...

> In 2005 the difference between the wage for a UK or North American techie and their Indian equivalent was $92,000. By 2019 that gap had narrowed to around $40,000

Time was, offshore contractors cost peanuts - and because they were so cheap, they were in great demand. Which led to massive growth and a rise in costs, because the people being hired could very easily boost their wages by jumping over to another agency with little or no notice.

And now we've reached the point where, while they're still cheaper, the difference is more like that between a junior and a senior engineer, rather than between a janitor and a senior engineer.

(To pick two arbitrary and unresearched comparisons.)

So the offshoring companies are now a bit stuck; after you stick their overheads on top and add a juicy profit margin, the cost/benefit analysis isn't looking quite as good as it used to for the various onshore PHBs.

So this leaves the offshoring companies with two choices. They either improve the quality of their services, or they find a way to cut costs back down.

Improving quality will be difficult, because there's still plenty of attrition from people jumping jobs to get a higher salary. And that leaves them with several options.

The first is that they can figure out ways to get more out of their current staff, while also balancing any changes with the risk that it'll increase the rate at which people jump ship.

The second is that they can look at offshoring their own work to somewhere cheaper. However, that's tricky, as there's not that many countries which can offer a cheap(er), educated and english-speaking workforce, reliable internet connectivity and stable government.

The third is that they can start to automate stuff, now that it's getting too expensive to throw an infinite number of wannabe-Shakespeares at an infinite number of typewriters.

And that's where things are going to start to get interesting, as by their very nature, automated processes - especially those based on open source technologies - can be run from anywhere in the world...

'Welcome to Perth' mirth being milked for all it's worth

juice Silver badge

I am disappoint

This was a prime opportunity to fire up the Reg's online standards converter, so that we could definitively quantify the distance between Sydney and Perth.

I'm happy to say that I've now gone and done the appropriate >clickety<, and can confirm that you can actually fit 355786.9617 double-deckers between Sydney and Perth.

Bit of a shame about that last 0.0383 of a bus, but a quick bit of angle grinding'll soon sort that out...

Samsung brags that its latest imaging sensor has the ittiest-bittiest cam pixels in the world

juice Silver badge

> That just sounds like "our software overcomes the laws of physics"

Not really. More like "if we average out the data from X inputs, we'll get a better result than if we just took a single value from 1 input".

And that's where opinions vary.

Personally, I went from a Samsung S10+ (12MP, 1/2.55") to an S21 Ultra (108MP, 1/1.33"), which does indeed default to mashing together 9-pixel blocks to produce a 12MP image.

And I had the two handsets for a few days - I traded in the S10+ against the S21U, but Carphone Warehouse gives you about a week's grace before you have to post the old one back to them[*].

So I did actually go for a wander and took some comparative photos. And for the most part, the results were pretty much identical - there were differences, but to my amateur eye, they were minimal.

Admittedly, this was just after launch, and Samsung has since done several rounds of OS patches, most of which have tended to have a generic "Camera Performance" comment in the release notes. So it'd be interesting to see if anything's changed for the better since.

But for now, the answer I'd give to the above is basically just a shrug and a "mebbe"...

[*] I think it's technically 2 weeks, but the clock starts ticking when you place the order, not when your new phone actually arrives!

juice Silver badge

The general theory is that you can use software to combine the data from X "mediocre" sensors into something which is potentially better than the data from a single "good" sensor.

Opinions and results vary, especially since companies these days insist on muddying the waters by claiming that their image-processing systems are AI-enhanced...

juice Silver badge

> I've never owned a Samsung device.

The odds are good that whatever device you've owned has had Samsung hardware in it though; they do sell a lot of components to other companies, and that most definitely includes imaging sensors!

EE and Three mobe mast surveyors might 'upload some virus' to London Tube control centre, TfL told judge

juice Silver badge

> No agreement from TfL is needed first as a consequence

Fair - I wasn't aware of this. Even so, I'd assume that if a telco is exercising its legal rights to install a mast on a third party's building, it still needs to notify the third party in advance and get agreement on the compensation and anything else (e.g. access rights, making sure the mast is not a visual eyesore, etc).

Someone at TfL must have known that this was all going ahead!

juice Silver badge

I have to admit, I am mildly bemused by this one, not least because the full sentence reads:

> The two mobile network operators wanted to send surveyors onto the roof of a TfL-owned office block in Southwark, as the first step to putting a mobile mast on the roof

So TfL have agreed to have a mast on the roof of this building (presumably for some suitable financial recompense), but have then turned around and decreed that the mobile network engineers can't actually survey the site to see if it's suitable?

Sounds like a classic case of the left hand not knowing what the right hand is doing...

And if I was a betting man, I'd maybe flutter a fiver on the same happening again when it comes time to install said mast ;)

Australian cops, FBI created backdoored chat app, told crims it was secure – then snooped on 9,000 users' plots

juice Silver badge

Re: 'What kinds of mobile phones would these be then?'

> But surely they connect to a cellular network. I'm pleased if this is the way forward rather than mass snooping EVERY individual. But methinks thats not going to be the case.

I think you might be missing the point a little here. The criminals aren't directly worried about the fact that the phone connects to a celluar network.

They're worried about the fact that - as popularised by decades of sensationalist TV shows - that little device they're carrying around has a microphone and a video camera built into it. Which means it can potentially be used as a wiretap device by any authorities who happen to have a suitable exploit handy to install such things, letting them listen in on the criminal's activities 24/7.

So they've opted for a brute-force solution: they disable the device's audio and video capabilities, preferably by physically removing them. And then they stick to text-based messages sent via encrypted channels and which are either kept in encrypted storage, or deleted after reading.

And that vastly reduces the risk of accidental/unaware leaks.

> I think this 'success' will just empower the agencies further to do what they hecking want in terms of backdooring anything and everything they desire, alongside the sneaky entrapment methods that they eventually come clean about. utilising a method of 10% overt, 90% covert.

This particular "exploit" only worked because the authorities both built the hack directly into the hardware for this specific device and then managed to persuade people that it was a secure device.

As a double-whammy physical/social engineering hack, it's superb. But it's not something they can do on an ad-hoc basis, not least because both Apple and Google are fully aware that any such backdooring mechanisms can be used for both good and evil. After all, if an exploit appeared which lets black-hat hackers remotely steal data from a phone, you can pretty much guarantee that millions of people would wake up the next day to find their bank accounts emptied.

As such, they will continue to actively limit such things, at least until/unless the NSA comes knocking with another Clipper chip proposal and the legal backing needed to force them both to comply.

Equally, for all their strong words, this probably isn't something the authorities are likely to be able to repeat - they've only gone public with this as they're losing their legal cover and therefore had to either "use it or lose it".

Even if they are in the process of rolling out a repeat of this sting, I suspect the top tiers of criminal organisations - or at least their very well-paid security consultants - will be taking a good, long, hard look at any future devices, or maybe even commissioning their own customised hardware.

Today I shall explain how dual monitors work using the medium of interpretive dance

juice Silver badge

Re: Examples...

> That sounds more like a cache to me.

The key point is that you have fast storage (RAM) and slow storage (HDD), and a warehouse-store like Argos is a fairly good/understandable analogy for this. The further the wee man inside your CPU has to go to get your data, the longer it takes ;)

To be fair, von Neumann architecture is based around the concept of a CPU hooked up to a unified storage device which stores both data and instructions.

https://en.wikipedia.org/wiki/Von_Neumann_architecture

This just proved difficult to implement in the Real World(tm), and so we ended up with the CPU cache/RAM/HDD architecture which we use today.

After all, 90% of what's in your RAM is stuff that's been pulled from the HDD to speed up access ;)
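To stretch the Argos analogy into code: here's a minimal sketch of a fast cache in front of a slow lookup, with made-up catalogue numbers and stock, and Python's lru_cache playing the part of the shelves:

```python
import functools
import time

WAREHOUSE = {"123/4567": "kettle", "765/4321": "toaster"}  # made-up stock

@functools.lru_cache(maxsize=32)        # the shelves behind the counter
def fetch(catalogue_number):
    time.sleep(0.01)                    # the trudge to the warehouse
    return WAREHOUSE[catalogue_number]

fetch("123/4567")                       # slow: off to the warehouse
fetch("123/4567")                       # fast: already on the shelf
print(fetch.cache_info())               # hits=1, misses=1
```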

Anyhow, this sounds like pub talk, so mine's a pint of pilsner, please!

juice Silver badge

Re: Examples...

> So imagine you have a desk, and some filing cabinets

My go-to explanation was always Argos (or Index, going back further in time!).

If you want something, you give a code to the man behind the desk (the CPU). If it's something that gets sold a lot, it'll be on the shelf behind them (the RAM). If it's something else - or something too large to fit on the shelves behind the counter, they'll have to go into the warehouse to find it (the HDD).

So if you want to make your machine faster, the quickest way is to give the man more shelves. Or the CPU more RAM.

People generally seemed to nod and smile when I used this metaphor. Though maybe they were just hoping I'd stop talking ;)

How many remote controls do you really need? Answer: about a bowl-ful

juice Silver badge

Nowt wrong with a bit of Laibach

Their cover of The Final Countdown is a thing of strange beauty. Especially if you first encounter it in a gothically-vamped-up function room in a pub in York, which you get to by passing by the skeleton sat at the bar #trufax

https://www.youtube.com/watch?v=-E72v6G9JHY&ab_channel=Mute

Remote control-wise, I've long been fairly happy with my Logitech Harmony 650 remote. Y'know, the hardware that they've just discontinued.

Admittedly, I think I bought it about 15 years ago, and they're theoretically keeping the software side of things running, but still.

The only real problems have been that it's a faff reprogramming it, since you have to have an account on Logitech's system (which entailed remembering what email and password I'd been using 15 years ago), and the top-right programmable button needs to be well and truly mashed, even after several cycles of dismantling the remote to clean the contacts with nail polish remover.

At some point, I'll have to dig out a USB 2.0 cable and plug it into a laptop so I can remap "power-on" to a different button, but see the point about the reprogramming faff... ;)

It's a bit of a shame that mobile phone manufacturers decided to drop I/R ports from their phones - I think the last handset I had which could be used as a remote was the LG G4. Oh, the fun we had in pubs, once we'd spotted who the television manufacturer was...

FYI: Today's computer chips are so advanced, they are more 'mercurial' than precise – and here's the proof

juice Silver badge

Re: Error detection

>As in minority report...

I think that "voting" concept has been used in a few places - including, if memory serves, the three "Magi" in Neon Genesis Evangelion.

https://wiki.evageeks.org/Magi

There's even a relatively obscure story about a Bolo (giant sentient tanks), in which the AI's multi-core hardware is failing, and it has to bring a human along for the ride while fighting aliens, since there's a risk that it'll end up stuck with an even number of "votes" and will need to ask the human to act as a tie-breaker...

The common factor in all your failed job applications: Your CV

juice Silver badge

Re: Different types don't match well

As mentioned here previously, I went for a programming-job interview once, but was pipped to the post by someone who had done better at the interview.

Only to receive a phone call a few weeks later; it turned out that the person they'd hired actually had zero programming skills, but had somehow managed to blag their way through the interview.

I've never really understood why people do this - I've no doubt that there are jobs where you can blag your way for a year or two, but technical jobs generally require technical skills!

In another life, I worked for a company which was struggling to fill a technical-support role, not least because the agencies kept sending through anyone who even mentioned a computer on their CV. I ended up throwing together a "jack of all trades" quiz for candidates to respond to - a mix of programming, network and database questions, reflecting the sort of day-to-day issues that you'd have to deal with in that role.

And all the candidates were given an hour to run through this, and were even allowed to use their phones to help them. Because it was more about how they approached their answers, than what they answered.

Scarily, a lot of people scored very low on all fronts when it came to this quiz; some just stared blankly and gave up immediately. But we did hire the one person who gave it a good go, and they've done very well in the company ever since!

Firefox 89: Can this redesign stem browser's decline?

juice Silver badge

Re: Privacy

> TL;DR - people are lazy.

Bollocks.

My dad's a roofer; his wife fosters several children. Are you saying that after a hard day's manual labour - and an evening of dealing with the many aspects of fostering multiple children - he should then be spending what little spare time he has on jailbreaking his phone to install privacy software, as well as reviewing privacy policies for all the things he uses?

To grossly over-simplify, Human civilisation is built on two things: trust and specialisation.

I.e. I'll do something specific, such as baking bread. And I'll trust that while I'm doing this, you'll be over there tilling the fields, blacksmithing, butchering, candlestick-making and the like.

Well, trust, specialisation and the development of an exchange medium to balance the books. That candlestick'll be ten loaves of bread, mate. And while I could demand a full list of ingredients for said bread, I'll trust that you've not done anything like using ground up acorn-flour to boost your profit margins. And you'll trust that I've not made the candlestick out of light wood and painted it with flammable oil-based paints.

Spanish inquisition comedy and recipe tips aside, that "specialise and trust" principle extends all the way up through the ages, from Rome's legions to Ford's assembly processes and up to Amazon's world-dominating logistics network.

I do my specialised thing, and I exchange the money I receive for doing that thing for goods and services from other specialists.

And that applies just as much to legal matters, such as privacy. We have an entire branch of society called "government", whose specialisation is entirely around managing such things; we even nominally hold them to account with regular elections to keep them focused on their job.

And we /should/ be able to trust them to deal with it. Assuming, of course, that they can come up with a privacy system which balances rights in a way everyone can agree with. No rush, I'll just sit in the pub with a pint (or several million) while the details are being thrashed out.

Admittedly, there's several issues with the above utopian view even above and beyond the fundamental "what is privacy" question.

The key one being that public policy can be... shaped by private entities (e.g. the ultra-rich and mega-corporations) with deep pockets, which in turn leads to a vicious circle of these entities continually acquiring more power with which they can further shape public policy to their needs.

And that cycle will continue until something drastic resets things, such as a scandal, a major economic upheaval or even the fragmentation of empires after someone passes.

In the meantime, anyone want a candlestick? Special deal today - buy two for just twice the price!

Ex-Apple marketing bigwig tells Epic judge: Our revenue-sharing model is designed to stop money laundering

juice Silver badge

Re: The usual doom and gloom FUD

Schiller's statement is interesting in several ways:

"I proposed other commission levels even lower [than 15 per cent] and our finance and anti-fraud team was pretty adamant that if we get much below 15 per cent, the rate of money laundering attempts will increase dramatically,"

In the first instance, having been told that it was unwise to go below 15%, Apple then decided to opt for 30%, or double the minimum recommended by their financial experts. Despite the fact that the App Store was originally positioned as being a not-for-profit system.

Seems like the sort of "monopolistic" decision Epic's lawyers should be jumping all over...

Equally, if in the highly unlikely event that I ever find myself with a wodge of undeclared cash which is large enough to need a quick wash, at least I now know what the going rate is likely to be at the cleaners!

Protip: If Joe Public reports that your kit is broken, maybe check that it is actually broken

juice Silver badge

Re: Civil service paying for excuses

> I’d seen several times in the civil service where someone had screwed up and instead of owning up and sorting the mess out they just carried on and paid for stuff that wasn’t needed.

It can go the other way. A decade or so ago, there was a council department in a Northern Town, the head of which was an old school chap who didn't understand all these newfangled computer things.

So he basically just sat on the budget which had been allocated to maintenance and upgrades. For several years.

Eventually, my relative joined this department, and having seen people struggling to share the same decrepit desktop PC, gently suggested that they'd be happy to take on the responsibility of sorting out the IT needs for the department. Despite the fact that said relative wasn't particularly tech-savvy!

One set of orders later, and every individual in the department had their own laptop, and my relative was very popular indeed. Except with some of the other local departments, who were a bit cheesed at losing access to this "free" money that they'd been able to tap into for years.

And in the end, it was this sort of internal politiks which led my relative to decide to move on. Gotta love local government...

Water's wet, the Pope's Catholic, and iOS is designed to stop folk switching to Android, Epic trial judge told

juice Silver badge

Re: Pot calling Kettle

> I don't know how this is different from any hardware/software platforms. Sony, Microsoft, Nitendo, Apple, Disney, etc. all participate in this lock-in.

More precisely, it's the responsibility/choice of the IP owner as to whether to port from one platform to another, not the platform owner.

To take an example, Monument Valley is an amazing mobile game which was first released on iOS. It was (much) later released on Android because the content owner chose to do so.

In fact, if memory serves, there's plenty of examples of iOS apps/games not being ported over to Android because the profit margins are generally lower for various reasons - not least because of how much more hardware variation there is and how that impacts QA and support costs.

Be interesting to see if Apple use that in their defence.

Should the platform owner(s) make it easy to port from one platform to another? Perhaps, but that's a much wider question - and it also tends to result in the weaker platforms being killed off, as happened with Blackberry/WinMobile when they decided to start offering Android compatibility. After all, if you're not going to make any use of that platform's special features, what's the point in using that platform?

Either way, I can't help but think Epic is playing a dangerous game here - the stuff they're bringing up touches on a lot of stuff around platform "lock-ins". They've been very keen to stress the monopoly/anti-trust angle, but it wouldn't take too much to point the same attack at Sony or Microsoft.

Or even Epic themselves. After all, Fortnite is essentially a monopoly/eco-system, and there's an in-game store. Should they be forced to open this up so that players can re-sell previously purchased goods - and/or for third parties to make and sell their own goods?

(Yep, this is a definite stretch. But it's the sort of thing which needs to be thought about more as the market continues to shift towards microtransactions and resellable content, as the debacle with Diablo III showed...)

juice Silver badge

Re: Apple App Store "a necessary evil"

> If they simply made it POSSIBLE to load a non-app-store application [similar to Android downloading and installing a non-store APK] this whole issue would PROBABLY go away...

Would it? It's worth bearing in mind that Epic has launched the exact same lawsuit against Google, despite the fact that you can sideload on Android. That court case simply hasn't received as much media attention.

https://www.theverge.com/2020/8/13/21368363/epic-google-fortnite-lawsuit-antitrust-app-play-store-apple-removal

> After following the first link in the article, I'm reminded that Apple banned Epic's game because it allowed in-game purchases outside the Apple store. But I recall _other_ applications being banned by Apple for different reasons. If there are no exploits or gross vulnerabilities, WHY ban them?

Because it went against the rules which Apple have set for vendors which wish to use Apple's infrastructure to sell goods and services to consumers.

And therein lies the key point: Apple can (to an extent) choose to offer different terms to users of its platform, in a similar way to how open source software can sometimes be made available under multiple licences.

But a vendor can't turn around and demand that Apple change those rules.

Epic's argument is that Apple (and Google) are monopolies and therefore should be forced to give better terms to the vendors which use their infrastructure.

And, y'know, I can see where Epic is coming from. However, I can also see that Epic is basically fronting for Tencent (which owns a lot of game companies other than Epic - 60% of their annual revenue comes from mobile gaming), and it's mostly other software giants (Facebook, Microsoft, etc) who are jumping onto the hammer-Apple bandwagon.

That makes me very suspicious of the motives for this little "won't someone think of the little guys" crusade - if Epic/Tencent does win, I'm equally doubtful as to whether either the vendors or consumers are likely to see any benefits after all the dust has settled.

> regardless iOS is great if it's what you want - I just don't see why they need a STRANGLEHOLD on "The Store" like that. I have to wonder how many customers they LOSE because of it.

Apple has spent decades building an eco-system up which runs according to their rules. You don't really buy a PC or mobile phone from them - instead, you buy an appliance which then plugs into this eco-system.

Apple's contention is that this eco-system needs to be protected, and there's some merit to this argument. OTOH, there's also some merit to the argument that all they're really doing is protecting their monopoly.

Which on the third hand, is something they've built entirely by themselves, using private money and resources. Which is the American way, dontcha know.

So, yeah. It's messy, there's a whole host of complex law and economic aspects, and the lawyers are definitely the main winners in all of this...

Preliminary report on Texas Tesla crash finds Autosteer was 'not available' along road where both passengers died

juice Silver badge

Re: Ban it

> There is no technology that is idiot proof, especially when driving at 60 MPH with hands off the wheel

While I'm at least as wary as anyone else when it comes to the current state of self-driving technology, the entire point of this article is that autosteer couldn't be engaged on that particular road, and therefore can't have been a factor in the crash.

The Tesla in question was equipped with Autopilot, which requires both the Traffic Aware Cruise Control and the Autosteer systems to be engaged. The former is a jumped-up cruise control, which deals with acceleration and deceleration while the latter assists with lane keeping. The NTSB showed in tests with an exemplar car that the latter also could not be engaged on that part of the road.

So I'm just saying, it might be worth actually reading the article before grabbing a pitchfork to wave at the nearest sentient toaster.

Beyond that, Ars Technica has a much more detailed article, which actually reveals a more troubling possibility (with the worrying part emphasised).

https://arstechnica.com/cars/2021/05/ntsb-finds-no-reason-to-suspect-autopilot-in-fatal-tesla-crash/

As for why the driver was not found in his seat, one troubling possibility is that the front door was inoperable or obstructed and the driver died while trying to escape from the rear of the Model S. Unlike most cars, Tesla uses IP-based electronic door locks that fail if the car loses power (as it would have in this crash). Although the front door handles will continue to work in an emergency that cuts power to the car, under such conditions the rear doors of a Model S can only be opened using a plastic tab found in the rear footwell.

For me, the fact that in the event of a power-cutting disaster you can only open the rear doors by rummaging around in the footwell to find a release tab hidden under the carpet is an absolute NOPE.
