* Posts by rg287

868 publicly visible posts • joined 13 Apr 2018


European Court of Human Rights declares backdoored encryption is illegal

rg287

Re: Well good thing the UK had Brexit

The ECHR was founded by their hero Winston Churchill.

I think there are some in the current Westminster Conservative Party who consider Churchill and Sir David Maxwell-Fyfe (who drafted most of it at Churchill's behest) to be woke, lefty melts.

Especially Maxwell-Fyfe - as a prosecutor at the Nuremberg trials he is the absolute epitome of a woke, activist lefty lawyer!

rg287

Re: Cue Daily Heil headline "Euro Court Won't Protect Our Children"

What is really needed are smartphone companies, ie mostly Apple and Android to create parent/child relationships between devices so that parents can monitor what their children are up to.

To a point. And with a careful restriction on "monitoring". For instance, should a parent be able to see that their child has been looking up support websites for domestic or child abuse? Is it good that they can see their child didn't actually hang out with their friends in the park after school, but was still at the school (talking to a child services bod about how daddy hits mummy, how mummy threw boiling water at daddy, or how their older step-brother sometimes comes into their room after dark and touches them)?

Now, good opsec might see them do this on school machines, not a personal device. But there are limits. My wife has worked on Women's Aid websites and they invariably have a special keybinding so you can smash Esc three times and it opens google in the tab. They also avoid setting cookies and do stuff to clear the browsing history.

Of course that doesn't apply to you or me because we're good parents and just want to make sure our kids aren't getting bullied. But I'd suggest the answer there is to cultivate a relationship where they show/tell us this stuff voluntarily.

On the flip side, I know one single mother who has constant two-way tracking between her phone and her (young adult) daughter's so they both know the other is okay. People have different needs, but it's basically a trope in InfoSec social media of "No, I will not help you hack your kid's phone. Have you tried talking to them?".

On this topic, I strongly recommend Consent of the Networked (Buy Used for just £2.65!), which is a decade old and talks about a bunch of dead sites and platforms - but has many relevant discussions on platform design and safety considerations which never go out of date. Like that time some Techbros at Google thought it would be really convenient to connect everyone's accounts so that they could see all their services and contacts in one place. And hey, wouldn't it be great if your contacts were all suggested and linked together too? Cue domestic abuse victims suddenly having their new account details connected to their old ones, identifying their new identities and locations to their estranged abusers/stalkers. Really helpful.

rg287

Re: Well good thing the UK had Brexit

It would be much easier to do that if we hadn't (for instance) closed the Afghanistan scheme and stopped letting people fly in. Then we wouldn't be having to sift people in Dover or Calais.

We could also take up the French on their offer to let us open a processing centre in Calais - but that was declined and hushed up in the press because it suits the Government's narrative to have boats crossing the Channel.

In the long run, our treatment of Afghan and Iraqi aides and interpreters is likely to cost British lives in future conflicts - why would locals help our troops (even for money) if they knew they were going to be abandoned afterwards? Not looking after those who have supported us is a very stupid Foreign Policy and Defence decision.

rg287

Re: Well good thing the UK had Brexit

I suspect that the Rwanda story is just a facade. There were Tory MPs trying to get the UK out of the ECHR before government ministers even knew where Rwanda is (or what a small boat is, for that matter).

The Rwanda Policy was never intended to send a single person to Rwanda. It was designed to fail. But in the process, it would give the government room to undermine and discredit the legitimacy of our domestic courts, the Human Rights Act and - by extension - the ECHR.

Every minister has access to well-qualified solicitors and barristers to advise on these matters.

* They knew that the Rwanda policy would be challenged in court, and they would lose.

* They did it anyway. Even though they're the government and could have changed the law to fit their policy, they preferred to fight and lose in court.

Why? So that they could whinge about "lefty judges" and "activist lawyers", further undermining the rule of law and the separation of government and judiciary.

We started along this path with the Article 50 case. The law was clear - you needed an Act of Parliament to declare Art. 50. But the Government chose to fight the case, lose, and then have their client papers run seditious[1] headlines like "Enemies of the People", painting high court judges as somehow being bad and placing their personal security at risk[2][3] when all they were doing was interpreting the law as passed by our (sovereign) Parliament.

And it's not even like it was difficult - Daddy Pig introduced a Bill and ran it through Parliament in record time. But in the process, they'd publicly criticised and undermined the legitimacy of the High Court and the judiciary in general.

[1] Some might think that a government can't commit sedition - which generally means inciting rebellion against the authority of a state. However, if we consider the slightly broader interpretation of "excitement of discontent against the government, or of resistance to lawful authority", then clearly it's possible for ministers or a government to commit sedition against the broader constitutional basis and lawful authority of the state, or against those institutions (like the judiciary) which serve as a check-and-balance against executive power. Parliament can't commit sedition, since they are in fact sovereign and have the right to change our constitutional make-up. But the Government can.

[2] Thousands spent on judges' security amid growing hostility

[3] Lord Chief Justice says he sought police protection in wake of Daily Mail ‘enemies of the people’ front page

rg287

Re: Well good thing the UK had Brexit

All this.

But we must also remember of course that our erstwhile Justice Secretary Dominic Raab is extremely keen to scrap the Human Rights Act and withdraw us from the ECHR.

Brexit was the first step in that - not in a practical sense, but in the social aspect of "tAkInG bAcK oUr SoVeReIgNtY" and moving the media and Overton Window in a direction where it would even be plausible.

The matter has then been wrapped up in the small boats "crisis" (of the government's own making) because the measure of a person is how they treat those they perceive to be below them - and as we all know, an awful lot of people are horrible human beings without a shred of empathy (see also: certain strains of evangelical/fundy Christians who consider getting-to-church/being-seen-in-church to be more important than being the good samaritan. More bible study required!).

Those people will wilfully throw away their own rights (as they did with legal aid) to "get one over" on the issue of the day, never stopping to realise that it could be them. And they don't have £50+k for a barrister either!

John Deere tractors get connectivity boost with Starlink deal

rg287

Re: "Great for Farmers"

That's taking the skill from the farmer and turning it into a subscription.

It's not about skill. It's about reality.

My uncle isn't less skilled for having GPS guidance on his tractor. When you're on your 9th hour in the cab, the GPS glosses over those brief drifts of attention and avoids double-seeding, saving seed and money. When you know how much seed costs per tonne, these are tangible savings.

Likewise with spraying and fertilising: real-time video analysis across the entire boom width, spraying only where it's needed, is something the farmer literally cannot do. You can't identify pests across a 60ft swath from the cab, and a human couldn't manually control all the individual nozzles anyway. This system augments the farmer, reducing chemical inputs, which is good for both the bank balance and the environment.

But for all that, my uncle hasn't gone all-in on JDOC (or <brand equivalent>, since he's not running JD). He just likes the GPS driver assist and some of the smart spray bits. Of course he's farming a few hundred acres in the UK and does basically everything except harvesting by himself - so no need for management tools to monitor drivers or larger fleets.

In the longer run, replacing drivers with autonomous tractors does obviously entail replacement/deskilling.

It's a nuanced one though - a lot of ag-tech is very, very beneficial and augments the farmer (just as tractors are more productive than horses). Some of it is just snake-oil and most farmers leave it alone (because it's a business purchase, not a new iPhone, and they want to see the ROI). Much like you'd raise an eyebrow at anyone trying to sell you a modern car without ABS or lane-departure buzzers... but those driver assists are useful, unlike "Autopilot" and not-quite-actual-self-driving.

That runaway datacenter power grab is the best news for net zero this century

rg287

So, just buy a small fleet of nuclear submarines and wire them up to the data network.

The throughput is going to be terrible though if you're allowed "One ping only".

rg287

Re: Zero

Most of the digital offerings are surplus to the requirement, designed to kill boredom, stroke egos and make us over indulge in vanity. I mean if 99% of websites were gone tomorrow

Not sure about 99% of websites. Most websites occupy a couple of MB on a cPanel server somewhere and use functionally no electricity. They're not the ones doing the harm.

If we want to make a difference:

* Ban crypto from datacenters. If you want to launch a coin, you have to run it from an on-prem DC. If that's not viable then that's your problem - make it more efficient until it is. The entire notional point of cryptocurrency is to unshackle you from "the man". If you can't run it on an end-user device and nodes are all owned by firms with racks full of hardware then... meet the new banks. A lot like the old banks, except unregulated and probably Ponzi schemes.

* Ban tracking-based advertising and RTB. Loading 70+ trackers on a page load isn't free. It substantially raises the energy and bandwidth footprint of a site. Why should loading one page call 70-plus unrelated services (and servers)? NPO have shown that contextual advertising is just as cost-effective as tracker-based. Cutting out the middle man saw their ad revenue rise substantially, as reported in these pages. Plus, it's privacy-friendly and there's less risk of leaking large databases of PII.

* Tax React. Make people buy carbon credits to use a chonky framework. No, I haven't thought out how to actually do that. I'm being silly. But again, reinventing the wheel isn't free. Standard Ebooks serve a million views per month, as well as hosting their entire git and build infrastructure, on a single-core 2GB VPS (A paean to the classic web). And you know what? It's great. News websites serving mostly static content don't need a hefty client-side framework. Quite frankly, neither do most brand websites. For the most part, sites served via a complex K8s infrastructure are very much doing it wrong.

* Tax the buggery out of datacentres (and warehouses, factories and logistics parks) that cover less than 90% of their open roof space with solar panels. No, obviously solar won't come close to powering a dense datacentre (though it will for a warehouse that's mostly shelf space), but it's free land. It's frankly bizarre to see fields full of solar down the road from warehouses and DCs with bare roofs, or just a handful of panels in one corner (older buildings may not have the load-bearing capacity, but that's not an issue for new-builds). Equinix LD4, LD5 & LD10 could each get the equivalent of 0.9-1.1MW of solar on their roofs (a basic area calculation, rounding down generously - a fully naive calculation pops out more like 1.2-1.5MW each, I'm also ignoring their curved roofs - so don't downvote. I'm caveating the hell out of this!). Whilst 3MW won't come close to running Slough Trading Estate, it's also not nothing, and I've only looked at three of the many bit barns in the area. Cutting your power bill by 5% is non-trivial for a datacentre, and leaves overhead for the incoming grid supplies - particularly with people mithering about GPU-laden boxen increasing the required kVA per rack. Slough Estate should be a glittering array of solar from the air. If nothing else, 5-10MW would offset the total consumption of the town, if not the data centres.
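For anyone who wants to sanity-check that roof-area sum, here's a back-of-the-envelope sketch. The roof area, coverage fraction and watts-per-square-metre figure are all illustrative assumptions of mine, not surveyed numbers for any particular building:

```rust
// Rough back-of-the-envelope estimate of rooftop solar capacity.
// All inputs are illustrative assumptions, not surveyed figures.
fn main() {
    let roof_area_m2: f64 = 10_000.0;  // assumed usable flat roof, roughly 100m x 100m
    let coverage: f64 = 0.75;          // fraction actually covered after walkways and plant
    let panel_w_per_m2: f64 = 200.0;   // typical peak output per square metre of panel

    let peak_mw = roof_area_m2 * coverage * panel_w_per_m2 / 1_000_000.0;
    println!("Estimated peak capacity: {:.1} MW", peak_mw);
    // With these assumptions: 10,000 * 0.75 * 200 W = 1.5 MW peak,
    // in the same ballpark as the ~1MW-per-roof figures above.
}
```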

And electricity of course is only half the problem. Sucking up groundwater to drive evaporative coolers (and subsequent disposal of brine) is a major sustainability issue for DCs.

Need to plug in an EV? BT Group kicks off cabinet update pilot

rg287

Re: From what I can recall ....

It'll all be slow chargers as well. Which no-one really wants.

Don't they? The average car journey in the UK is 8 miles. Average weekly mileage is 190 miles. A trickle charge at 3kW or 7kW is more than sufficient - 7kW would fully charge a 70kWh battery overnight (10hrs). Since most people are not charging from empty... much less. And thrashing the battery at 100kW is bad for it. If you're not roadtripping, then letting it trickle overnight is much better for the battery. Most people want slow chargers most of the time.
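For the arithmetic behind that, here's a quick sketch using the figures above. The 3.5 miles/kWh efficiency is my assumption; the 70kWh pack, 7kW charger and 190-mile week are the numbers already quoted:

```rust
// Quick sanity check of overnight charging vs typical weekly mileage.
fn main() {
    let battery_kwh: f64 = 70.0;    // typical mid-size EV pack (from the figures above)
    let charger_kw: f64 = 7.0;      // home/street "slow" charger
    let miles_per_kwh: f64 = 3.5;   // assumed real-world efficiency
    let weekly_miles: f64 = 190.0;  // average UK weekly mileage

    let full_charge_hours = battery_kwh / charger_kw;
    let weekly_kwh = weekly_miles / miles_per_kwh;
    let weekly_charge_hours = weekly_kwh / charger_kw;

    println!("Empty-to-full at 7kW: {:.0} hours", full_charge_hours); // ~10 hours
    println!("Weekly top-up: {:.0} kWh, about {:.1} hours on the charger",
             weekly_kwh, weekly_charge_hours);                        // ~54 kWh, ~7.8 hours
}
```

So a single overnight trickle covers the average week's driving with plenty of margin.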

If you're on a roadtrip, you won't be pulling off the motorway and hunting for a BT charger down a residential street. You'll be looking for a supercharger at a service station.

Of course the big issue is power supply to the cabinet, and cabling to the car. I suspect they're simply looking at the cabinets as a thing that already occupies a spot on the street; fences may have been built around them. Some of them may also have conduit in place through which a fatter power cable could be pulled. That's a space that can be repurposed more easily from a planning perspective than identifying spaces on streets for completely fresh builds. Of course quite a few of them are on corners where you can't park, so some sort of embedded conduit to parking spaces would have to go in.

SpaceX snaps back at US labor board's complaint, calling it 'unconstitutional'

rg287

Re: Administrative State

Regardless of whither the rule was a good idea or not Congress never passed a law authorizing the EPA to issue such a law...

That is literally the point of executive branch agencies. If Congress had to write out every single rule and regulation as primary legislation, nothing would ever get done anywhere by anybody. They do not have the bandwidth to deal with that much business. Do you really want Congress to have to issue a law approving each individual model of car as being suitable for sale, because it's "unconstitutional" for the NHTSA to set safety standards and approve (or reject) models? That's what you're asking for: your Senators and Representatives arguing in Congress about whether the 2025 Canyonero is safe or not.

No one in Congress has the expertise to - for instance - define what a safe level of heavy metals is in drinking water. But the EPA do, so their experts issue a rule on it. That's the point - Congress gives executive agencies a mandate and sets them off to cover <field>.

...which makes it unconstitutional.

No. No it doesn't.

rg287

Publicly mouthing off would indeed be sackable - most contracts do have a clause about not publicly bad-mouthing the company.

However, sending an internal memo to the COO raising concerns about the erratic behaviour of the CEO and whether the CEO is tarnishing the reputation of the company is - in most developed nations - not a sackable offence. Albeit it will often be career-limiting (in ways which are very hard to prove, even where constructive dismissal is unlawful).

Is the USA a developed nation? I guess we'll find out.

rg287

They are fighting to go back to the age of the robber barons, the very reason most of the regulators exists in the first place.

We're living through a new Gilded Age. It's past time for the FTC to do some trust-busting, alongside a union renaissance for workers' rights, health and safety, etc. Given that a non-trivial portion of Americans still want to vote for Trump though, things seem likely to get worse before they get better. Sooner or later, the shared Stockholm Syndrome will be broken and people will stop worshipping billionaires - at least for a little while.

Then we get a reset, things get better, everyone forgets why, and so the wheel turns.

To be, or not to be, in the office. Has returning to work stalled?

rg287

Re: Remote

Rail travel in most of the country is actually back up overall, but it's more spread out (includes more leisure travel) rather than being as concentrated on the peaks as it was.

Yes, we've been at 100% of pre-Covid levels for a while. Significant increase in long-distance leisure, only being dragged down by the massive drop in commuting around London. Sod the rest of us I suppose - as it has ever been.

Which is why a return ticket to Edinburgh costs 3x more than it should unless you take the midnight train, and there's only one train every two hours out of the West Midlands. Rail is not expensive in itself; pricing is purely a function of demand management. Which is why I ended up driving that trip recently - the petrol was 1/3 the cost of rail, even though I would much rather have spent 4 hours each way reading a book instead of driving (and my trip didn't allow for arriving at midnight).

HS2 was supposed to unclog the legacy lines and provide dedicated paths to get those long distance trains about. Alas, Sunak doesn't care, because helicopters don't need rails.

Dump C++ and in Rust you should trust, Five Eyes agencies urge

rg287

Re: One area where it won't currently work

Rust - Embedded applications.

YMMV for specific architectures of course.

The Embedded Rust Book
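For the doubtful, the bare-minimum shape of a bare-metal Rust program is roughly the following - a minimal sketch in the style of The Embedded Rust Book's introductory example, assuming a Cortex-M target with the cortex-m-rt and panic-halt crates:

```rust
#![no_std]   // no standard library - no OS assumed
#![no_main]  // no conventional main(); the runtime crate provides the entry point

use panic_halt as _;     // halt on panic rather than unwinding
use cortex_m_rt::entry;  // minimal runtime for Cortex-M parts

#[entry]
fn main() -> ! {
    // Peripheral setup would go here via the target's HAL/PAC crate.
    loop {
        // Never return - there's nothing to return to on bare metal.
    }
}
```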

Fujitsu-backed FDK claims nickel zinc batteries ready for use in UPSes

rg287

(although apparently breathing in nickel dust, as may occur in a refinery setting, is a suspected carcinogen).

I can't imagine breathing in any form of metal dust is going to do good things for you. Some are worse than others, but generally HSE/OSHA rules would mandate proper filtration and/or breathing apparatus.

Admittedly there's a whole separate section of health & safety known as CLAW (Control of Lead At Work), but this is something that - for instance - the entire target shooting community manage to implement safely even for volunteer-run membership clubs. Good hygiene, extraction, get specialists in to empty the butts. People shoot all their lives and don't get lead poisoning.

I wouldn't want to be working in a recycling plant dealing with lead (or, indeed, concentrated sulphuric acid with nasty lead compounds sloshing around in it). The thing about nickel and zinc is that they're not going to give you heavy-metal poisoning.

I guess the question is... what solutions and materials do they use to recycle nickel and zinc? Are they present in an elemental form in the battery and easily recycled, or do you need to dissolve them in a nasty solution to separate them out for processing? What happens to that solution afterwards? What are the byproducts of the recycling process? Or can they be reprocessed as their existing alloys? What about if they have dendritic growth on them? Does that need dissolving off?

The thing with lead-acid batteries is they're very easy to reprocess. They feed them into a hammer mill, smash them up and dump them into a pool of water. The plastic floats off and is ground into pellets and used as fresh stock. The lead is scraped off the bottom and cast into ingots for reuse. The liquids are fed into a tank where they either reclaim the acid for reuse, or they dump something else in to neutralise it. I recall reading somewhere they can actually turn it into a household cleaning agent of some sort in one step. At the end, you don't have any waste byproducts, leftover catalysts or other waste. Is the same true of trying to separate nickel/zinc alloys? (Genuine question).

No system is truly closed-cycle, but lead-acid batteries are about as close as you can get.

rg287

the Ni-Zn batteries were lighter with lower environmental impact than the lead-acid alternatives traditionally used in UPSes.

Quite the claim considering that lead-acid batteries (at least those used in cars) are basically a closed-loop and exceptionally low impact. More than 99% of the materials in a lead-acid car battery can be recycled, and if you buy a new one, it will usually be made of 100% recycled materials. Unless there's something different about the lead-acid units in UPS?

I suppose if you double the energy density, then a given application only requires half as much battery by weight, which would give you a lower impact - so long as they can be appropriately recycled at EOL and are drawing from recycled materials to start with.

Scientists use Raspberry Pi tech to protect NASA telescope data

rg287

Re: I'm reminded of early space exploration

Largely apocryphal

Entirely apocryphal. The Fisher AG-7 "Anti-Gravity" Space Pen was developed completely privately. NASA were in fact initially sceptical and put it through rigorous testing before acceptance (they had previously been using mechanical pencils, which solve the issue of sharpening, but they were concerned about flakes snapping off. They had also managed to spend >$100 per pencil somehow, which had caused some consternation in the press).

Ultimately both NASA and the Soviets ordered Fisher Space Pens, with records showing they both got a 40% discount for bulk orders ($2.39 each instead of $3.98!).

It's unclear how much NASA spent testing the pens before ordering them.

Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB in a PC

rg287

Re: I was gonna say...

One advantage of additional RAM that Apple overlooks is caching of data to speed up operation.

I'm not sure they've overlooked it. They're counting on the fact that their SOC architecture can hypothetically give much better storage latency than SATA storage (or even NVMe). Notwithstanding that they've mucked about with that in recent editions.

Consequently, the perceived performance dip when something wanders off to swap or has to be retrieved from storage is much lower, and the whole system "feels snappier".

And on entry level systems, doing more with less because of that architectural decision is not awful. The average Macbook Air user won't notice.

But to be shipping any "Pro" branded hardware with less than 16GB these days is woeful, because it's not a matter of occasional swapping or start-up performance for a browser. Whether it's a local DB, build processes or media wrangling, you need GBs to work. Yes, maybe the OS is a smidge more efficient, but obviously the moment you start loading images or video into Lightroom or FCPX you immediately run out of RAM.

As we all agree... having very fast access to storage is no substitute for having enough memory for the workload. But they haven't overlooked that. It's very much by design. It's just that the design decision (whilst passable for Airs) is inappropriate for Pro machines and exists only (as others have said) to suppress the "From" price and let them mark up sensible configurations.

FTC interrupts Copyright Office probe to flip out over potential AI fraud, abuse

rg287

Re: Free pass

Most books i see explicitly prohibit "storage 8n an electronic retrieval system".

Err... no they don't.

1. Does that include indexing? Is there an explicit exemption for title/author/precis/blurb? How would a library system manage their stock?

2. Pretty much every book published today has an ebook offering. You can't prohibit storage of an ebook on an electronic system... there are even publisher-approved systems for borrowing ebooks from libraries using apps like Libby.

Books are stored in electronic retrieval systems all the time, with the consent of the publisher.

It may be that you have a print edition, and in the copyright blurb it does indeed make that assertion. But the AI crowd aren't scanning hard copies in for their training sets. They're scraping up ebooks (and the internet).

I still don't get the idea that copying book content is not itself actionable.

This is of course the weeds where it gets lairy, and where we need to be a lot more specific with the word "copy". Copying the content of a book or journal is of course not actionable. For instance, it's useful (and fair use) to be able to index documents. That's going to involve some level of copying into an index/database. If I rent an ebook, it needs to be copied to my device, and then from storage into RAM.

Me reading a book from the library (and copying it into my brain) is not actionable. Neither is taking notes from it. Repeating chunks, or substantially reusing it in a new work is. The breach of copyright comes from the output of the model. Whether that's an LLM, text-to-image, or my brain. The breach is not generally from the feeding in. Moreover, I'm not breaching Hemingway's copyright by writing an article in the style of Hemingway based on reading his work. So is an LLM breaching his copyright if it does the same? If it starts spitting out identifiable phrases or chunks of his writing then that's easy to prove. If it's just very terse text, then hmm, sort-of-not-really? At what point does me writing an article "in the style of" stop being a derivative work and become new (or at least, non-infringing)? And then how does that differ from the output of a software system that I operate?

Of course we're all left with that quite correct and reasonable feeling that none of this is quite right. Probably because the AI folks have mostly scraped this stuff one way or another without paying the creators for it (if I borrow a book from the library, the library has at least paid for that copy). And because the AI doesn't understand the work - it's just putting the next most likely word after the last (but then plenty of humans are capable of completely missing the point and misrepresenting work or research as well!).

Bad eIDAS: Europe ready to intercept, spy on your encrypted HTTPS connections

rg287

Would you even need to compile it yourself? Mozilla, Google & Microsoft are all US-based. They could issue an EU-build and a RoW build with different CA trust lists.

Preventing EU users from accessing the RoW build would of course be approximately as effective as US munitions controls were at stopping PGP escaping into the world.

The UK government? On the right track with its semiconductor strategy?

rg287

Re: As some of us said at the time...

El Reg reported on the announcements in May and took the line shared with many commentators that not subsidising new fabs was a mistake.

As I recall, the line many commentards (myself included) took was:

Nice that they've actually developed a strategy (now do the rest of the economy - UKGov has no over-arching industrial strategy), but:

1. It's table stakes. You don't need to be subsidising fabs to take a proactive interest in things like developing connections with universities, putting the right investment in place to develop clusters (whether that's for shiny logic silicon or - very sensibly - leveraging existing expertise in power silicon and other niche sectors). But they're not taking a tremendously proactive approach on that.

2. Realistically, this government is allergic to investment or infrastructure. It's cynical - but not entirely unreasonable - to suggest that them saying "We're going to pursue this niche stuff, so don't expect big press about a new 3nm plant or anything. It's important but very low key" is actually expectation management for "we're promising low because we have no intention of delivering anything anyway", and nobody can be surprised when it all goes quiet and we hear nothing about it again.

3. Good luck delivering even if they do want to - they've gutted the civil service such that DBT would have a real job delivering some of this stuff even with strong ministerial backing.

4. None of this matters because industrial strategy is necessarily long-term. But this government has less than 12 months in office, and is then looking at another decade in opposition. And they know it. One can have their own opinions on HS2, but the manner in which Sunak has cancelled it - and is now expediting the sale of land at a loss to the taxpayer - is a deliberate attempt to salt the earth and prevent a future government restarting the project. It's scorched-earth politics from a party that doesn't give a toss and is busy burning every bridge they can, to make life hard for the next government. They're not good-faith actors. If they do something good, it is purely by accident - but even then, it probably means you haven't looked hard enough to find the donor who is cashing out.

ULA's Vulcan Centaur hopes to rocket into Christmas

rg287

Re: New Galilleo Launches

Citation required.

Galileo generates its own master time (Galileo System Time) using masers in Fucino, Italy. This is determined independently and in principle, Galileo receivers can determine positioning just using that.

Of course because people use GNSS as a general time signal for all sorts of non-positioning services, it needs to be kept matched to UTC, which is calculated in Madrid.

There is also a GPS-Galileo Time Offset (GGTO) calculated to accuracy of <5ns in cooperation with the US Naval Observatory, so that receivers can synthesise signals from satellites on both networks.

But none of that is a dependency. If GST drifted from UTC, that would break stuff. If the GGTO wasn't available then it would mean the receiver could only use Galileo signals, which might cause lower accuracy or longer TTFF, but wouldn't inherently render Galileo inoperable.

Of course, if there aren't enough Galileo satellites in sight because old birds have failed or you're in a city or deep valley (or in the early days when they were building the constellation out and there were only a handful of Galileo satellites on orbit), then it will be impossible to get a fix without infilling with GPS. That's not an architectural dependency though - it's an operational one.

Boris Johnson's mad hydrogen for homes bubble bursts

rg287

Re: Electricity for heat pumps

Electrons are electrons - you push some in the South, you draw some off in the North. The rest is accountancy..

My thought exactly. Offset Spanish/Portuguese demand, which leaves them surplus to sell into Southern France, which leaves the French with surplus to sell to GB/Ireland.

If the paperwork on that is too complicated, then run the interconnect to France and cut the Spanish out. But it's all just numbers in a spreadsheet.

rg287

Re: Electricity for heat pumps

Why the blazing f- would anyone build an interconnect from Morocco to... the UK?

Transmission losses are projected at 13%, which is not as bad as I expected... but still significant.

Surely it makes more sense to interconnect Morocco with Spain/Portugal(1), offset Spanish/Portuguese demand, which means generation in northern Spain/Portugal can be sold into southern France, and we continue to buy from the French. Surely you want generation to be as close as possible to demand?

Or at the very least, connect from Morocco to France.

But I suppose in applying a vaguely sensible engineering approach, I haven't allowed for geopolitics, or the ability of grifters to leverage government subsidies. And yes, I know this is attached to generation specifically built for export, not just a load-balancing link.

1. Yes, there's already an 800MW Spain-Morocco interconnect, with another 700MW link in the works. Which will be dwarfed by this 3.6GW link.
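Taking the projected 13% loss figure and the quoted 3.6GW link capacity at face value, the sums are trivial but worth seeing - a sketch using those two figures (both projections from above, not measurements):

```rust
// Delivered power over the proposed HVDC link, using the projected figures above.
fn main() {
    let capacity_gw: f64 = 3.6;  // quoted link capacity
    let losses: f64 = 0.13;      // projected transmission losses

    let delivered = capacity_gw * (1.0 - losses);
    let lost = capacity_gw * losses;
    println!("Delivered: {:.2} GW, lost in transmission: {:.2} GW", delivered, lost);
    // ~3.13 GW delivered, ~0.47 GW written off before it ever reaches the grid.
}
```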

rg287

Re: Electricity for heat pumps

2010 - Cameron announced Hinkey C, had they not wasted years of faffing about before starting the build billions would have been saved and we'd have the leccy to use.

Well, it was shortlisted in 2010. Whilst his erstwhile (Lib Dem) Deputy PM was badmouthing them and casting FUD. Also worth noting that the 2005 Conservative manifesto made no mention of nuclear. So they were not particularly interested in energy security either.

And then the Tories delayed the whole process by insisting that the private sector fund it themselves (backed against extremely generous guaranteed strike prices). This of course was a period when interest rates were <0.5% and the government could have borrowed extremely cheaply. Far cheaper than EDF could borrow from the commercial money markets. The government dragged those CfD negotiations well into 2013 - instead of just issuing gilts and hiring EDF to build it on a contractor basis. Discounting the (not inconsiderable) site prep, major works for Hinkley C didn't start until... 2019. We could be at least 2-3 years ahead, but for government.

So yes, the Tories did get on with it... eventually. But they still managed to do it in the slowest, least efficient and most expensive manner possible.

Astronomers spot collision between two exoplanets, both feared vaporized

rg287

<older brother deity to little sister deity> "Don't put your marbles on my model space-time! Oh now it's rolled into that solar system and broken one of the planets. That was the best one as well - I got a prize at the science fair for the fjords. Muuum! She's breaking my planets! Muuuuuuuuuum!"

Why can't datacenter operators stop thinking about atomic power?

rg287

These companies have very deep pockets indeed Apple would be the eighth richest country in the world, MS the twelfth, Amazon the fourteenth....

They really do have the resources to develop something like this, not only to reduce their own power bill, but also to sell the technology to competitors and grids across the world.

Of course they have those resources. Note in my original post "can't raise/won't commit".

BUT THEY'RE NOT.

They are investing billions in civilian-ising nuclear submarine tech. Instead of just doing the foundational R&D to bring clean fuel cycles to market.

They could do the good thing. But they are choosing to go the (relatively) low-technical-risk route and re-package existing tech.

There's no question of "will they, won't they". They won't.

rg287

Re: France is finding that nuclear power isn’t that reliable either

That is not the case , Trawsfynydd and any site on the Severn estuary proving the point.

You're evidently not familiar with the concept of an estuary. Particularly one like the Severn with a massive tidal range. Anything on the Severn is sea-water cooled. It's not "river cooled" like the Saint Laurent, Belleville or Bugey. With proper placement of intakes (beyond the lowest low-water from spring tide), you're not going to run dry!

Trawsfynydd is the only "inland" nuclear plant in the UK, and it pulls its cooling water from a reservoir, which buffers the supply compared to direct-from-river. It's also less than 10 miles from the coast. So if the reservoir cooling really hadn't worked out, hypothetically you could run some pipes down. That location also guarantees more reliable rainfall than the inner departments of France.

rg287

And that's the big hope of these massive consumers wanting to build their own SMR/micro reactors.

Because they have the resources to actually overcome those challenges in the pursuit of cheap power to feed their habit.

But they don't. That's the point. These SMR designs are mostly running naval submarine reactors on conventional Uranium cycles - which is why there are a lot of regulatory (nuclear proliferation) concerns about having small units deployed in many locations (as opposed to a handful of sites with large-scale reactors).

In fairness, Thorium is only proliferation-resistant when used in a light water reactor - it still generates some nasties when used in a molten-salt reactor.

The problem with all this is:

* Grid-scale nuclear power stations are expensive - private industry can't raise/won't commit that much capital.

* Novel reactor designs/fuel cycles are expensive - private industry can't raise/won't commit that much capital.

* Much of the world's political thinking is still in thrall to a lite version of Reaganomics and so governments won't make those investments.

* Small nuclear sub-type reactors are somewhat in the reach of the likes of Google/Apple/Microsoft, even though they're less efficient than their grid-scale counterparts.

So that's what we end up with. Just as our big uranium-cycle grid reactors are ultimately derived from military breeder reactor tech, so the SMRs are just a scaled-up version of submarine tech. But it's still not actually a good fit for power production or long-term waste management.

Basically, they're all investing in the compromised designs that the military already paid for, because it's what the private sector is willing to pay for. Even though we could - as a society - get much better value for money out of doing the research and deploying low waste, proliferation-resistant technologies en masse.

It also means we spend a lot less time negotiating with the likes of Iran about "honestly, our nuclear programme is peaceful". We can just hand them the IP for a proliferation-resistant thorium reactor. We can even offer to build it for them. If they say "no thanks" then they're fessing up that their programme is a weapon programme (which yes, we know, but it just cuts the crap. Anyone shows any interest in nukes, we hand them a power plant and see if they actually want it. It instantly closes down any discussion on refining uranium).

rg287

Now if the AI can either come up with a nuclear power source which doesn’t generate any waste, or a way of safely disposing of the waste, we’d be on to a winner.

We don't need AI for that. We already have the Thorium cycle, which generates little to no Plutonium or nasty actinides.

It's not a silver bullet of course. You still get waste, but much less. In a reactor, the Thorium cycle burns up far more of the Uranium, far more efficiently. By contrast, Uranium-to-Plutonium cycle reactors have to pull the rods when they get poisoned (at which point fission slows despite there being loads of decent fuel still in there, which we then have to process out from the Plutonium and actinides).

But nobody is interested in funding that properly or overcoming the engineering challenges, because you can't make bombs out of it at the end.

The Thorium cycle does generate U-232, which has a viciously dangerous decay chain (Thallium-208, a very strong gamma emitter), but which follows the "live fast, die young" rule. U-232 has a half-life of 68 years - not millennia - and its Thorium-228 daughter just under 2 years. Consequently storage is not a horribly long-term problem.
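To put "live fast, die young" in numbers, here's a trivial decay calculation using the ~68-year half-life above; the timescales picked are just illustrative multiples of that half-life:

```rust
// Fraction of U-232 remaining after t years, given its ~68-year half-life.
fn remaining(t_years: f64, half_life_years: f64) -> f64 {
    0.5_f64.powf(t_years / half_life_years)
}

fn main() {
    let half_life = 68.0;
    for t in [68.0, 136.0, 340.0, 680.0] {
        println!("After {:>4.0} years: {:.3}% left", t, remaining(t, half_life) * 100.0);
    }
    // ~50% after one half-life, ~25% after two, ~3.1% after five,
    // ~0.1% after ten - i.e. centuries, not geological timescales.
}
```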

And we've already worked out how to store the remaining waste. Dig a hole in a geologically stable formation and stick our vitrified leftovers in it. It's honestly not that hard. Not that I'm a fan of burying waste in general, but it's a matter of scale and proportion. Burying all our plastics without recycling (or just reducing what we use) would be bad. But a small quantity of nuclear waste (order of tonnes)? Yeah, we can objectively get away with that in return for clean, safe power.

Software patch fixes Euclid space telescope navigation bug

rg287

Re: "the telescope's Fine Guidance Sensor"

And how much fuel did it use up by all that repositioning?

Probably little to nothing directly, as the changes look pretty minor in absolute terms (hunting around the immediate field of view, not doing pirouettes - which would have probably seen the FGS fighting with the coarse sensors and any inertial instrumentation), so the wandering we've seen was probably all done with reaction wheels.

Of course if the wheels are now wound up a bit, the first "unwind" or desaturation event (which will burn propellant) will come sooner. But it doesn't seem like it will have a significant impact on mission lifespan.

Textbook publishers sue shadow library LibGen for copyright infringement

rg287

Re: Welcome to the new corporate Register

You can wave your hands about and shout "knowledge monopoly" all you like, but this is still people copying other peoples work without permission and taking money for doing so.

The work belongs to academics and universities. But for some reason Elsevier et al are of the opinion that once you have paid them in excess of $5k for the privilege of being published in their peer-reviewed journals (which involves peer review... except they don't remunerate academics for reviewing papers), that research now "belongs" to the publisher - whose sole contribution has been to compile and edit the journal. Which is legitimate work, but does not constitute a creative contribution to the content.

Their pleas of poverty would sound a lot stronger if they weren't generating profit margins in excess of 30%.

It's long past time that a court ruled the only component of copyright the publishers can lay any claim on is layout. The text and intellectual property is neither their work, nor their property.

ISP's ads 'misleadingly implied' existence of 6G, says watchdog

rg287

But if you bought from a company called "ACME Full fibre (FTTP) internet provider", you'd be a tad miffed if you found out that they were selling basic DSL over POTS.

Ah no, that's absolutely fine and dandy according to the ASA. They dismissed a complaint by FTTP providers against incumbents (advertising coax & VDSL as "Fibre") on the basis that customers know what they're getting and it was "not materially misleading" for ISPs to describe copper hybrid services as "fibre broadband". So there. CityFibre sued them over that and sought judicial review... and lost.

And for what it's worth... I have one former colleague who insisted in 2014 that he had fibre. He still plugged the router into the phone socket, but insisted that it had been "upgraded from the exchange" without anyone drilling holes in his walls. Alchemists could have had quite the field day trying to tease out how copper had spontaneously transmuted into glass, but there we have it. The ASA have told us that people aren't confused.

6G's name is intentionally misleading. They sell a wireless service too. It's dishonest.

Is it also dishonest for Three to sell 4G and 5G services? What about voice-only plans with no data on them at all? They literally called themselves Three when 3G was launching. Is it dishonest for Virgin Media to sell services to people who aren't... okay, maybe we won't run down that train of thought.

rg287

Their argument here seems to revolve around the expectation that where a company name refers to a thing, then the service delivered by the company is expected to utilise that thing.

Quite. And remarkably enough, the provider 3, who - back in the day - launched to much fanfare as the UK's (self-claimed) leading provider of 3G, have moved on to 4G and 5G.

But we have to remember that this is the same ASA who dismissed a complaint by FTTP providers against incumbents (advertising coax & VDSL as "Fibre") on the basis that customers know what they're getting and it was "not materially misleading" for ISPs to describe copper hybrid services as "fibre broadband".

Apparently customers know the difference between fibre and er... "fibre", but not fibre and cellular.

Arc: A radical fresh take on the web browser

rg287

Re: Off topic

I have long held the view that organisations need to stop assuming that everyone knows how to use Word Processors and Spreadsheets

And employ technical documentation specialists - whether that's writers, illustrators or editors. Because they spend all day in their tools and will inevitably do a better job than asking an engineer/developer/CAD-jockey to write the documentation or provide illustrations. And it'll be more consistent as a result.

Although the concept of typing pools has a poor reputation for sexual harassment and misogyny, we had them for a reason - the professional typists (usually - but didn't have to be - women) were a damn sight better at what they did than the engineers/managers/men who sent them work - faster & more accurate. Senior bods still get a PA/Secretary for this reason (it's judged that the executive's time is too valuable to be spent on booking flights or managing their calendar), but there's a case to be made that a Secretarial/Professional Services pool should still be a feature of large organisations. In the modern era of course, they wouldn't be typing up emails for people - they would be specialised in helping people prepare for presentations, prepare bid documents, internal/external documentation, etc. How many staff-years are wasted by engineers manually renumbering the pages on documents because they don't know how to use the layout tools?

AWS: IPv4 addresses cost too much, so you’re going to pay

rg287

Re: IPv6-mostly?

They could. But as it turns out, the top-level/high-profile ones don't.

<government.nl> and <defensie.nl> both advertise AAAA records pointing back to Prolocation.net, whilst amsterdam.nl is likewise "self-hosted" IPv6 with Logius.

No CF doing a MITM on them.

I'm sure there are probably some local councils or school districts behind Cloudflare, but good on them for trying to lead by example.

<gov.uk> also goes to "native" IPV6, albeit on a block owned by Fastly (of California)... <army.mod.uk> hits Cloudflare...
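If you want to check a site's IPv6 story for yourself, the standard library is enough. A small sketch - the hostnames are just the examples above, port 443 is arbitrary, and whatever comes back obviously depends on the resolver you're sitting behind:

```rust
use std::net::ToSocketAddrs;

// List any IPv6 (AAAA) addresses a hostname resolves to via the system resolver.
fn main() {
    for host in ["government.nl", "defensie.nl", "gov.uk"] {
        match (host, 443).to_socket_addrs() {
            Ok(addrs) => {
                let v6: Vec<_> = addrs.filter(|a| a.is_ipv6()).collect();
                if v6.is_empty() {
                    println!("{host}: no AAAA records returned");
                } else {
                    for a in v6 {
                        println!("{host}: {}", a.ip());
                    }
                }
            }
            Err(e) => println!("{host}: lookup failed ({e})"),
        }
    }
}
```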

Twitter name and blue bird logo to be 'blowtorched' off company branding

rg287

Re: Moron alert. Again

There are a fair number of artists and other creators that relied on twitter to advertise and support their work. A good alternative hasn't really arisen, and if they are transitioning to some other network they need to rebuild their entire following.

I'm not sure that's quite as dire as made out. Most of those people are on patreon and have built followings on Mastodon/ActivityPub, as well as Flickr/DeviantArt/Instagram/YouTube/Discord/Twitch and now Threads. Twitter was an important way of advertising their work, but that's really died off over the past year. The writing has been on the wall for a while.

FCC boss says 25Mbps isn't cutting it, Americans deserve 100Mbps now, gigabit later

rg287

Re: My home cable modem...

Well, I have 600 down and 20 Mbps up on my current connection. For years I have been telling my cable provider that I would pay just as much as I do now per month for 100 Mbps up AND down. They have the fiber in place. The fiber terminator hangs off of a pole right next to my apartment building in Chicago.

I entirely agree with the sentiment, but it's not going to happen because it's probably a PON architecture with more downstream channels than up, and they're not going to change that just for you. If it were a point-to-point architecture (not point-to-multipoint) where your apartment was connected to a switchport at their end, then you could certainly pick an arbitrary symmetric speed or even more upload than download.

That being said, the ratios are still open to them - G.984 (GPON) offers 2.4Gbps down, 1.2Gbps up. That's shared with as many as 128 endpoints but still represents a 2:1 ratio, not 10:1 or worse. G.987 (XG-PON) gives 10/2.5Gbps, which is 4:1. There are symmetric PON standards but they need more expensive burst-mode lasers, which they don't want to spend money on. They don't need to be giving people quite such shonky upload speeds but alas, they're optimising for people downloading the latest 30GB Call-of-Fortnite DLC.
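To make the contention sums concrete, here's a quick sketch of the worst-case fair-share numbers for those line rates. The 1:32 and 1:128 splits are illustrative - operators choose their own split ratios - and real PONs are statistically multiplexed, so nobody actually sees only the fair share:

```rust
// Fair-share bandwidth per endpoint on a shared PON, for the standard line
// rates quoted above and a couple of illustrative split ratios.
fn main() {
    // (name, downstream Mbps, upstream Mbps)
    let standards = [
        ("GPON (G.984)", 2_400.0, 1_200.0),
        ("XG-PON (G.987)", 10_000.0, 2_500.0),
    ];
    for (name, down, up) in standards {
        for split in [32.0, 128.0] {
            println!(
                "{name}, 1:{split:.0} split -> {:.0}/{:.0} Mbps per endpoint if everyone maxes out",
                down / split,
                up / split
            );
        }
    }
    // Whatever the split, the down:up ratio itself is baked into the standard.
}
```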

Ofcom proposes Wi-Fi and cellphones share upper 6GHz band

rg287

Re: Interoperability vs spectrum sharing

I would much rather see a maintained separation in terms of spectrum allocation and services, but better service integration enabling switching from one to another mid-call.

I quite agree. Without wanting to sound too much like "well why would anyone ever want higher speeds?", I don't really see the value of 6GHz in wifi.

2.4GHz and 5GHz give a decent option between speed and range/penetration. I can also see the (diminishing) value of 60GHz for line-of-sight streaming applications, perhaps for VR headsets. Though in many cases the better solution is an HDMI cable!

6GHz is unlikely to bring meaningful real-world speed improvements. Some people will mumble something about high-density environments like conference & exhibition centres. But in my experience, providing decent service there has more to do with antennae design; AP placement; how well your system manages client roaming when a device keeps trying to flick between access points like a demented hummingbird and (of course) whether you actually have enough backhaul to support the traffic or whether the centre has cheaped out.

Arguably, the important bit of 802.11ax/WiFi6 was introducing OFDMA (as well as 6GHz), but you can do OFDMA perfectly well on 5GHz (and 2.4GHz), which it does.

Seeing ax advertised for corporate offices and dense residential apartments will make most people here twitch. Any high-bandwidth applications in those settings should be wired anyway. We all know this. The only way you can consistently saturate a network link is with big downloads, which are most likely to be things like games consoles. And for those, you want a wired link to get consistent low latency when you're playing. For downloading flappy birds to a mobile device, you're not going to perceive any benefit on 6GHz vs 5GHz.

6GHz does avoid the games of DFS and APs having to monitor for RADAR, but not if it's doing a DFS-equivalent of playing nicely with cellular services. Meet the new boss, same as the old boss.

This all stands in stark contrast to mobile cellular service where wringing that extra bit of performance is actually worth it for dense environments - like standing on Embankment, Times Square or in central Manchester and having full signal but garbage throughput because of contention.

SpaceX says, sure, Starship blew up but you can forget about the rest of that lawsuit

rg287

Re: "terrifying" sounds were reported in Port Isabel

Seriously, what was that about? Even the SpaceX commentator lady sounded confused when that was shown.

I assumed it meant the sweepstake was settled and the cheers were from people not buying the beers that night/supplying cake the next day.

Ariane 5 to take final flight, leaving Europe without its own heavy-lift rocket

rg287

Re: But wait! There's more...

How long would it take ESA to develop and build it's own re-usable Ariane? Given the Ariane 6 is 10 years in development, at least another 10?

Why "at least another 10"?

SpaceX started in 2002 and first launch of Falcon 1 was in 2006. In just 4 years they had conceived, designed and built an entire rocket - hardware, software and (most importantly) engines - from scratch.

Falcon 9 launched four years later in 2010 and attempted to land (with a parachute - Elon having fully misunderstood what parachutes can sensibly do). They pivoted to relighting the engines and landing under power, which was then demonstrated in 2013-14 with controlled re-entries and the first successful landing (on land) in 2015.

ArianeSpace starts with a functioning rocket, engines, some superb rocket engineers and a decade of watching SpaceX piss on their chips. It should be entirely possible for a team with the knowledge of "this can be done, SpaceX have been doing it for a decade" and granted the budget and autonomy to "get on with it" to go and pull Ariane 6 apart and modify the booster to support reentry in 3-5 years. It requires ArianeSpace to commit to it (rather than furtling around with "well maybe we could have a go at it"), and possibly poach a few SpaceX engineers. It can be done. If it isn't, it's because Arianespace don't want to rather than because it can't be done.

You're right though. Available evidence suggests ArianeSpace management would doom such a project to development purgatory.

rg287

Re: But wait! There's more...

Using taxpayers money to create jobs is an efficient use of it.

Working people aren't on dole, pay taxes, are better integrated into the community, are less prone to violence and abuses.

Entirely true. But it would also be nice if those working people were developing/building a rocket which won't be obsolete before it launches, rather than handing the initiative to Musk's effective monopoly (outside of government launches propping up the old-space incumbents, who would otherwise be looking at bankruptcy, having been totally outclassed on cost and reliability for private sector launches).

Would it be actually that terrible for ArianeSpace to hot-house a small team of engineers on a "start-up" basis and tell them "here's a fat budget, go for SpaceX" and then go hands-off?

Would it be any less efficient than the current process? They've proudly turned out Ariane 6 - a fine rocket to be sure (when it launches) - but 15 years behind the state of the art (Falcon 9), and possibly obsoleted by Starship before the end of 2024. Between ULA and ArianeSpace, is it so ridiculous for taxpayers to be saying "Oh come on one of you, take the fight to Musk. Less evolution, more revolution"?

Rocky Linux claims to have found 'path forward' from CentOS source purge

rg287

Re: "Certified"

but who also run a downstream rebuild for training, testing and/or development boxes because they don't need support on those.

Worth remembering that a free RHEL Developer account gets you 16 unsupported licences (up to 128 cores across those instances), which is 16 more than Microsoft gives Windows devs. So for training or dev or even small (self-supported) production workloads, developers can use their dev account licenses.

This policy/allowance of course remains at the whims of RHEL (beware building a business on it) and testing can be difficult because you have to activate those instances with your account credentials - as Claudio4 mentioned above, for automated CI/CD pipelines or anything where you might be standing up and tearing down instances automagically, throwing CentOS or similar at it was much more straightforward from an activation standpoint. I doubt that is unfixable, but you'd need some monitoring to avoid it trying to spin up a 17th instance and falling over.

As you mention, the main appeal of RHEL is support, and certification for vendors and customers operating in regulated industries.

Missing Titan sub likely destroyed in implosion, no survivors

rg287

Re: There's a lot of outpouring of grief for the loss of five people at sea

Quite.

On the one hand, I have no problem with the response as-was. Anyone in nautical distress should get whatever help can be mustered. That's been the first rule of the ocean for as long as there have been seafarers. And in this case, it was a decent training exercise in search and establishing the fate of the craft - even if the prospects of recovery were always slim.

It does throw into sharp relief the handling of various refugee boats and migrants though. I don't see the media spending a breathless week covering the fate of 700 drowned migrants. But 4 rich tourists? What could be more important?

rg287

I don't blame the Coast Guard for continuing the search, though, sonar data can be ambiguous and they'd want to have definitive evidence before giving up.

Also. Morbidly. If you haven't got another vessel in immediate distress demanding your attention and would simply be at standby, then this is a decent training exercise on search and (maybe) recovery, or at least establishing the fate of the craft even if you don't get anything back.

rg287

Yeah, there should have been no rescue efforts at all - seriously. Those are just the government doing unnecessary safety meddling in the industry with taxpayer money. 'Pure waste!' as he put it.

No no, anyone in nautical distress should have a reasonable expectation of receiving at least a best-effort attempt at rescue. That's a long-standing law of the sea.

What this has highlighted of course is how the "impossible" situation of stopping migrants drowning in their thousands is nothing more than a policy decision - because there's plenty of resource when there's a political/media will to stage a large-scale search (and - if it hadn't imploded - "daring rescue") when it's a handful of millionaires on their holidays.

I do agree with the sentiment in so much as there's been some fawning over these "brave explorers".

They're not explorers, nor adventurers. They're disaster tourists, rubbernecking a mass grave which has been extensively profiled and documented by far better hardware than they had. The billionaire amongst them probably spent more money vetting their chauffeur than they did on due diligence into the company selling sub rides in an experimental harbor-freight special. He could easily have afforded to charter DSV Limiting Factor, or even commissioned an Alvin-class sub from Triton.

Don't panic. Google offering scary .zip and .mov domains is not the end of the world

rg287

"other people also have problematic TLD so Google creating more isn't that bad"

Yeah, the whataboutism is deafening.

.com dates back to a simpler, more naive time when not many people were using the internet, and now we're stuck with it.

.sh is potentially quite dangerous, but most people don't know what a shell file is to start with and hopefully (!) won't try and run it. It won't get you very far on Windows anyway.

.zip and .mov are extensions people know and recognise. They might even expect to receive legitimate emails with attached zip archives (or links to them). The fact that .com and .sh exist is not a good reason for ICANN to allow other common file extensions as TLDs.

There are a bunch of active countermeasures out there... but it makes no sense to rely on active countermeasures to address a passive risk. That's poor security design.

.zip is confusing. "url.zip" does nothing that bit.ly doesn't already do. The world simply doesn't need them. They're just going to end up on arbitrary block lists with most of the other wanky gTLDs.
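To make the confusion concrete, here's a quick sketch using Python's standard library (the .zip domain here is invented for illustration): everything before an @ in a URL's authority is treated as user info, so a link that reads like a file on a trusted site can actually point at a .zip host.

    from urllib.parse import urlparse

    # Reads like a path to an archive on github.com, but "github.com" is
    # just the userinfo part - the real host is the (hypothetical) .zip domain.
    url = "https://github.com@releases-update.zip/v1.2.3"
    print(urlparse(url).hostname)  # -> releases-update.zip

Combine that with Unicode look-alike slashes in the userinfo and the average user has next to no chance of spotting it.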

The Hubble Space Telescope is sinking! Two startups want to save it for free

rg287

Re: But why...?

I perhaps wasn't clear, but when I referred to Hubble 2.0, I wasn't referring to an all-new telescope design. I was referring to literally a second Hubble.

By the time you'd done the R&D for one, built out the science software, etc., building a second one would have been relatively cheap. Bear in mind the first service mission cost $500m plus the shuttle launch. $1.5Bn would have bought you a second Hubble with change. They'd also paid Kodak for the back-up mirror at that point.

This is not to say that the missions were all a waste of time - obviously they needed SM1 to get Hubble working properly and a couple more for station-keeping and upgrades/repairs.

But arguably 3A & 3B (a mix of upgrades and actual service/repair) could have been consolidated into a single service-only mission focussed on failed/failing components and the 3B budget spent on Hubble 2, which would have got the upgraded science instruments. Maybe Hubble 1 would end up with a shorter lifespan, but we'd still have a (newer) telescope on orbit, and for a while you'd have two telescopes (with a diversity of instruments).

But most importantly, don't forget the Hubble project started around 1970 and it was launched around 1990, that is 20 years later... If you start working on Hubble 2.0 tomorrow, it might, potentially be operational around 2043 (or later)...

Maybe, maybe not. The Roman Space Telescope is looking at around a 10-year development cycle (funded in 2016, launch 2027). If one were being picky one might point out that it was first proposed in 2010 - we're discussing how long to deliver from getting the "go" with actual financing. JWST took 20 years (actual studies commissioned 1999, launch 2021) and any Hubble-a-like (i.e. single primary mirror) would be much, much less complex.

Let white-hat hackers stick a probe in those voting machines, say senators

rg287

Re: There's absolutely nothing wrong with computers doing the counting

Everything from president, congressman and senator, to state representative and senator, retention of state judges at the local, appeals and supreme court levels, and random stuff like county water commissioner, assessor, etc. and maybe one or two propositions.

In fairness, this has always struck me as a bit overly complex - and it's been shown in at least one case that a confusing ballot layout affected an election outcome because people accidentally voted for the wrong person.

Scantron is definitely a way of reducing count error, but the ballot could also be broken out into multiple papers (colour-coded so the staff can check people are posting them in the right boxes). One for President and/or congressional elections, then a couple for state and county/local elections. Two or three ballots could significantly simplify layout, and the presidential ones (say) can be prioritised - it's an odd thing to be slow delivering a result for a national vote because they're concurrently counting ballots for local sanitation superintendent.

rg287

Re: If you want secure elections

There's a difference between recording the votes and counting the votes though.

I follow a couple of respected security bods on Mastodon who have given a lot of thought to election security, and it does seem that good progress has been made in the last 10 years. The main change is that it's now recognised that computers can help with counting votes, but must never be responsible for recording votes. No system should ever require you to enter your vote electronically - there must be a physical ballot paper. Old touch-screen voting machines have been largely phased out. Some systems had you enter your vote and they printed a receipt - but these are also unacceptable, as the receipt was often a barcode combining some sort of unique ID with the vote cast, which is not easily human-readable.

The happy medium seems to be that you mark your vote on a scantron-type ballot paper. You then scan it (and perhaps receive a receipt), and deposit your ballot in a traditional ballot box.

The computers provide a quick provisional result when voting closes, and you do an audit count of a statistically significant proportion of ballots. In a landslide, you probably only count 10% of ballot papers. In a narrow race you'll basically end up doing a full manual count. On average this saves time (not waiting days for a result whilst mail-in votes are counted <side eye to Georgia>). Importantly, you've always got the paper ballots if one of your checks and balances ends up initiating a full recount.
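As a toy illustration of that escalation (Python; the thresholds are invented - real audits use proper risk-limiting statistics rather than fixed percentages):

    def audit_fraction(margin_pct: float) -> float:
        # Toy rule: the narrower the margin, the larger the hand-count sample.
        if margin_pct < 1.0:
            return 1.0    # effectively a full manual recount
        if margin_pct < 5.0:
            return 0.5    # narrow race: hand-count half the ballots
        if margin_pct < 15.0:
            return 0.25
        return 0.10       # landslide: a 10% audit sample

    print(audit_fraction(20.0))  # 0.1 - comfortable landslide
    print(audit_fraction(0.5))   # 1.0 - squeaker, count the lot

The point isn't the specific numbers - it's that the paper is always there to escalate to, and the machine count only ever serves as the provisional headline figure.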

Moreover, even if you decide as a matter of policy to always do a full manual count, an electronically-counted provisional result nukes the hours (or in the US, days) of commentators testiculating about what's going to happen, or candidates submitting vexatious lawsuits to "stop the count". If voting closes at 10pm, there should be a provisional result published by 11pm and we can all go to bed. Whilst some might say "be more patient", it's reasonable to suggest that the fake news in the wake of the last US election (when early provisional results shifted because mail-in ballots favoured Biden, giving rise to "tampering" theories) makes a strong case for being able to publish a provisional result quickly, and then quietly do the manual count (or audit counts) without the pressure of releasing misleading interim results. Obviously also... just don't release interim or partial results.

If you're using a scantron-type counter and doing good audits, then by far the larger source of fraud will be coerced postal votes.

That said, I agree that it's a bit bold to call the last US election "the most secure". Whilst I'm not suggesting it suffered any tampering, elections pre-1980 were more passively secure by virtue of being entirely manual.
