* Posts by RobLang

123 posts • joined 16 Jun 2020


Is it time to retire C and C++ for Rust in new programs?


Entirely anecdotal evidence

Initially learnt Rust by recoding some personal pet algorithms to see if I could make them as fast as my optimised-over-many-iterations C99. Didn't quite match it, but got close enough. It took longer than I expected because new terms for old concepts slow me down. I'd prefer libraries to be called libraries, for example. That's not unique to Rust, but with each language I try, I live in hope they're sensible with terminology.

Second project was small contract-work-for-a-friend to split up a C tool that had got too big into separate C tools. The majority was just moving code around, so I finished with time to spare. They asked me to rewrite one smaller chunk in Rust so that they could compare in house and it went OK - I found (well, the Rust compiler found) two edge-case memory issues while converting over. Arguably neither was dangerous, as both required an impossible state to be reached, but it was interesting. Today's impossible state might be tomorrow's feature.

The feedback I heard at the time from their internal team was that they liked Rust and would consider using it for new stuff, but only because the developers were already keen, not because they suffered from client-losing memory bugs (their test regimen was impressive). I've since learned that they're building the next version in Rust; the downside is that some libraries they depend upon in C/C++ don't have Rust counterparts (yet), which makes for a more complex build and development cycle.

I don't think Russinovich's Tweet has helped anyone. I'd rather he hadn't tweeted that. Opinions are like bum holes. Everyone has one but you should take care getting it out in public.

Even robots have the right to learn from open source


Microsoft didn't create CoPilot, OpenAI did

Disappointed with El Reg for this whopper:

"Microsoft has been industriously mining the code in the GitHub repositories and feeding it to an AI to train it in programming"

It's wrong.

CoPilot's AI engine is Codex, built by OpenAI (not Microsoft). Codex is a reduction model of GPT-3, also an OpenAI creation, which was trained on the internet as a whole as a general language processor. Microsoft holds an exclusive licence for GPT-3.

I would expect an article bashing Microsoft (which I'm all for) to include some basic facts. It wasn't Microsoft trawling the web, it was Elon Musk's OpenAI. Microsoft is licensing, not embracing, extending and extinguishing - for once. Microsoft is monetising the resultant model, but then so is OpenAI.

API rate limits at the core of Elon Musk’s decision to ditch Twitter


While I don't understand the tit-for-tat legal wranglings of a billionaire and a billion-dollar company, one statement jumps out...

"as he feels the number of dodgy accounts impacts the company's value"

In this way, Musk is following the markets. The markets judge most social networks by their monthly active users (MAU), and that figure therefore sets - in part - the share value of the company.

My feeling is that he should have verified the accuracy of the reported MAU long before letting his ego declare his intent.

Even more astounding is that Twitter are hell-bent on having the deal *complete*. They want Musk to be their owner! That I don't understand. Why would you want an owner that intrinsically doesn't want any part of you? Surely that means you'll be sold again, leading to more instability and uncertainty for the workforce.

Running DOS on 64-bit Windows and Linux: Just because you can


Emulators and redistributions are really handy for people like me teaching my Zoomer offspring about what it was like. We're too far from NMOC and I can't figure out why the Speccy won't boot, so these emulators are the next best thing. My mum used WordPerfect 5 at work and Dad used WordStar at home; now I get to show the lad what that looked like. Hats off to those that got it all working!


Well done you for helping an old fella keep doing the thing that he loves. The shred is going to be tough, but he's happy doing his thing and you're helping. That's the good stuff.

Cookie consent crumbles under fresh UK data law proposals


It's default opt-in.


Re: Britain leading again (?!!)

Agree with this. GDPR is far from perfect and the cookie banners are annoying but the processing part and its impact on rights is the bigger issue.

The decisions made by algorithms based on the collected data are the real problem. I'm not thinking of today but of tomorrow, when insurance companies profile you based on your browsing habits and adjust your premium automatically, or the private hospital provider automatically refuses to engage you as a client because of your profile. They can't do it now because of GDPR; they can if the processing requirements are slackened.


Re: Straightforward solution

No longer the case; most browsers have local storage, which can hold state but isn't a cookie because it can't be shared with third parties. You can pass authentication as a bearer token in headers without any need for cookies.
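As a sketch of what the server side of that looks like (everything here - names, secret, token format - is invented for illustration, not any particular framework's API): the client keeps an HMAC-signed token in local storage and replays it in the Authorization header, so no cookie is ever involved.

```python
import hashlib
import hmac
import json
from base64 import urlsafe_b64decode, urlsafe_b64encode

SECRET = b"server-side-secret"  # hypothetical signing key, kept server-side

def issue_token(user: str) -> str:
    """Sign a claims payload so the client can carry its own session state."""
    payload = urlsafe_b64encode(json.dumps({"sub": user}).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{sig}"

def verify_header(headers: dict):
    """Authenticate from the Authorization header alone -- no cookie jar."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return None
    try:
        payload, sig = auth[len("Bearer "):].rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    return json.loads(urlsafe_b64decode(payload))["sub"]

token = issue_token("roblang")
print(verify_header({"Authorization": f"Bearer {token}"}))  # roblang
print(verify_header({"Authorization": "Bearer forged.deadbeef"}))  # None
```

The browser would stash `token` in local storage and attach the header on each request; same-origin rules keep it away from third parties in a way cookies never managed.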

How did you mourn Internet Explorer's passing?


Long overdue

We had to support IE until yesterday. Now there's going to be a huge library update-athon and burning of polyfill code while the front page politely apologises to IE users. We've dropped it from our test process, filled the company Slack with memes, and are now looking forward to all the features that were going to be too hard to make work in IE.

IE 5.5/6 was a good browser, rightly nailing the slow Nutscrape to the wall, but it should never have been delivered with the OS after XP, locking MS into having to bare-bones maintain it. They should have done this retirement process long ago.

Google engineer suspended for violating confidentiality policies over 'sentient' AI


No, suspended for splurging onto the internet

It's perfectly fine to have outrageous ideas behind closed doors. His ideas should have remained internal conversations and personal opinion.


We can't measure either sentience or intelligence

We don't have a reliable, verified, reproducible way to measure sentience or intelligence. Philosophers are still struggling to pin either down in a way that science can then measure. So we're still left with anecdotes, human bias and arguments such as the Chinese Room. That's not going to tell us if we've achieved either.

Oracle closes $28.3b Cerner buy amid warnings of commercial challenges


"move them to the Gen2 Cloud, which he said could be done quickly"

"Quickly" compared to? Glacial basal sliding?

Tough news for Apple as EU makes USB-C common charging port for most electronic devices


Re: The BS 546 Brexit connector next

Of course you'll be familiar with the fact that under British Standard BS 7671, earth wires have been green/yellow striped since 1977, so that no live or neutral wire could be misconnected. Also, red/green colourblindness affects 8% of males, which isn't a majority even in post-Brexit-mathematics-land.

Mars helicopter needs patch to fly again after sensor failure


I won't complain about patching a cloud server again*

* I will if it's Windows.

Bravo, NASA/ESA system engineers, bravo.

Behind Big Tech's big privacy heist: Deliberate obfuscation


Re: For the Apple haters…

Apple and MS are under investigation; I can only find $10m to Apple from the Italian regulator, but that isn't strictly the EU using GDPR. A Wired article [1] explains the one-stop-shop mechanism, under which Ireland is responsible for litigation against a dizzying number of big tech firms, including Apple and MS. There's an incredible backlog by the looks of it. Details in the article.

I'd imagine that Apple and MS will come off well (but not perfect) because their business models aren't around advertising. Apple's brand particularly relies upon privacy and IMO they are bigger privacy regulators than many governments.

I wouldn't ascribe favourable MS bias to El Reg; MS regularly gets eviscerated - largely by their own hand!

[1] https://www.wired.co.uk/article/gdpr-2022

Elon Musk says Twitter buy 'cannot move forward' until spam stats spat settled


Re: Definitions of spam are tricky

Thanks for the tip, Anonymous Coward, I'll give it a gander.


Re: Definitions of spam are tricky

I agree that it works for email, but how does that work for Twitter? If you're seeing posts retweeted by people you follow, or appearing in your suggestion feed because the Twitter algorithm has decided you'd be interested, how do you judge whether the originator is a spam bot?


Definitions of spam are tricky

In the early days, Twitter was touted as a human-readable event feed and for some of the accounts I follow it's just that - automated news/weather/traffic that doesn't have a human directly typing tweets but is of value. So at what point is it spam? If my local hobby shop automatically posted new products and I followed that feed, is that spam? It's spam to some, not me. If UK members of parliament repeat the same text automatically to boost signal, is that spam? It's a human pressing the button, but the text is copied just like a bot's.

I don't really care about what rich people do with their money or how they choose to behave but I am concerned that a tool I find handy might be cut to shreds so that I have to find another way!

Tech pros warn EU 'data adequacy' at risk if Brexit Britain goes its own way


Probably more work for me if they go their separate ways

Right now our EU clients are relaxed about where our data centres are (UK or EU) because of the data adequacy agreement. GDPR isn't perfect (no law is IMO) but it's what a chunk of our clients use, so we do too. Any UK reform won't change that. If there is deviation and we lose data adequacy, I'm going to have to move some stuff out of the UK and (if the NI protocol is anything to go by) create extra paperwork to prove that we're still GDPR compliant. Not looking forward to that.

Google shows off immersive maps, AR-flavored search, Pixel 7, and more


Glasses with eyes painted on them

Are we at the stage where Google can replace us with a virtual version that can nod sagely and offer platitudes like "take that offline" etc? A modern version of glasses with eyes painted on them.

OpenAI's DALL·E 2 generates AI images that are sometimes biased or NSFW


The word "understanding" is problematic

Whenever I see people say that an algorithm "understands" something, the hairs on my neck stand up. Philosophers can't agree on what it means to understand a thing, except that humans can do it and other animals appear to as well, but only in certain circumstances and perhaps in a different manner.

An algorithm may apportion text to an image or vice versa but it doesn't "understand" what it means to cry. It just knows that humans use the word "cry" next to an image of a person crying.

So, "Can AI express its understanding of the physical world with text?" No.

Can humans measure whether something truly understands something? Also no.

Accenture announces 'Accenture Song' – not a tune, but a rebrand


Do we know which song it is?

Perhaps the Reg commenters know? Is it the Birdie Song?

Machine-learning models vulnerable to undetectable backdoors: new claim


Nothing wrong with the mathematics, unsure of its utility

From my reading of the maths, it's absolutely correct that a back door can be invisibly encoded into a trained model. However, the back door would have to be a very specific set of input parameters to remain invisible, and that reduces its utility. Unlike a validation step, where you use a separate data set to check the veracity/confidence of the model, a back door would only match on a few specific inputs. For scenarios given above in the comments, such as banks/insurance companies, only a specific policy would be accepted rather than a broad raft of policies.

I also disagree with the author's comments regarding unsupervised learning. Unsupervised learning could be backdoored, but the utility is even lower because in unsupervised learning you're not trying to force the output to a specific value.


While it's possible that you could detect the backdoor, in practice it would be very difficult because the back door can be (and, if I understand the mathematics correctly, must be) a very precise set of parameters. How useful that makes it depends on the application, but I doubt validation + fuzzing will catch it.
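To make the "precise set of parameters" point concrete, here's a toy sketch - nothing like the paper's actual construction, and the model, trigger and approve/reject framing are all invented. The back door only fires on one exact input vector, so random fuzzing agrees with the honest model everywhere it happens to look:

```python
import hashlib
import random

def base_model(features: tuple) -> int:
    """Stand-in for an honest trained classifier: approve iff the score clears 10."""
    return 1 if sum(features) > 10 else 0

# The trigger is effectively a hash of one precise input vector; nothing
# short of that exact vector activates the back door.
TRIGGER = (-5, 2, 1, 0, 3, 4)
TRIGGER_DIGEST = hashlib.sha256(repr(TRIGGER).encode()).hexdigest()

def backdoored_model(features: tuple) -> int:
    if hashlib.sha256(repr(features).encode()).hexdigest() == TRIGGER_DIGEST:
        return 1  # attacker-chosen output
    return base_model(features)

# Fuzz with 10,000 random policies: the two models never disagree...
random.seed(0)
for _ in range(10_000):
    f = tuple(random.randint(0, 9) for _ in range(6))
    assert backdoored_model(f) == base_model(f)

# ...yet the one magic policy sails through despite a failing score.
print(base_model(TRIGGER), backdoored_model(TRIGGER))  # 0 1
```

The asymmetry is the point: validation and fuzzing sample the input space, but the back door occupies a single point of it.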

Why OpenAI recruited human contractors to improve GPT-3


Re: IT just has to say of that & those following trails & tails wearing blinkers or blinders*

Nice to see GPT joining in the conversation.

Happy birthday Windows 3.1, aka 'the one that Visual Basic kept crashing on'


It was the first time I had tech envy

My Dad had a Zenith 8088 running DOS at home on 5 1/4" floppies and that felt very cool and professional. It's all he needed for the kind of work that he did on it. I wrote roleplaying games in Wordstar on it, printing out on the furiously loud dot matrix printer. My mate's Dad had a brand new 386 with 3.1 and I never looked at the Zenith in the same way again.

How Google hopes to build more efficient, multi-capability AI systems


Ensemble neural networks are still separate models

While the architecture of Pathways is extraordinary and to be applauded, I think there is a leap taken here:

> Pathways will enable a single AI system to generalize across thousands or millions of tasks, to understand different types of data, and to do so with remarkable efficiency – advancing us from the era of single-purpose models that merely recognize patterns to one in which more general-purpose intelligent systems reflect a deeper understanding of our world and can adapt to new needs

That's not quite true. In ensemble neural networks, you have separate models, each trained to be an expert in a single thing. You have a language model, an image-recognition model, etc; each is called an "expert", hence an ensemble of experts. When an input is presented to the trained network, it picks the most appropriate expert and shows the input to that one. That network then performs its analysis and gives an output.

From the outside, it's sort of general purpose because you only need one system running to do different things but it's not really general purpose from an AI perspective because inferred knowledge is trapped in each of the expert silos. You can tell that this is the case because truly general purpose AI has the problem of encoding inputs from different domains - how do you represent an image in numbers in the same way that text is so that knowledge can be inferred? That's a *very* hard problem that this paper doesn't address, so I'd draw the line at calling it general purpose.
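A deliberately crude sketch of that routing idea (toy "experts" invented for illustration, nothing like Pathways itself) - the gate hands each input to exactly one expert, so whatever each expert has learned never leaves its silo:

```python
def text_expert(x: str) -> str:
    """Toy language 'expert': just counts words."""
    return f"text with {len(x.split())} words"

def image_expert(x: list) -> str:
    """Toy vision 'expert': reports the mean value of a flat pixel list."""
    return f"image with mean pixel {sum(x) / len(x):.1f}"

def ensemble(x):
    """Gate on input type and dispatch to exactly one expert.

    Nothing the text expert 'knows' informs the image expert or vice
    versa -- the gate routes, it doesn't share inferred knowledge."""
    if isinstance(x, str):
        return text_expert(x)
    return image_expert(x)

print(ensemble("the cat sat on the mat"))  # text with 6 words
print(ensemble([0, 128, 255, 128]))        # image with mean pixel 127.8
```

The hard problem the paper sidesteps is making those two inputs land in one shared representation, rather than two separate silos behind a switch.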

For current deep learning, this is very cool indeed.

UK Ministry of Defence takes recruitment system offline, confirms data leak


Re: Final straw for the Army/Capita marriage?

Or eject.

Because it's not 1941.


"Stop hitting yourself!"

Capita is the big brother that hits the smaller brother in the face with his own fist. "Stop hitting yourself!"

Complaints mount after GitHub launches new algorithmic feed


Discovery suggests I have lots of free time

When I'm coding, I'm trying to build something, fix something, improve something. When I'm learning new tech, I do so in a targeted way: how is it similar to things I know, how is it different? I'm not just sitting at my desk surfing through packages thinking "oooh, this looks fun, I wonder what this package does" like I would idly watching TV. The feed should be issues, PRs and releases of things I've starred and that's it.

It's a really weird addition. Not a fan.

Reg reader rages over Virgin Media's email password policy


They'd better hurry up and employ a "Business Information Security Practitioner" so that they have someone in post they can ignore.


Google's DeepMind says its AI coding bot is 'competitive' with humans


More tools is a good thing

The world needs more software and there aren't enough of us making it. Many business apps could work just fine with higher abstraction levels than they have today. I can see this kind of AI being a great support to low-code. I've worked with plenty of people who can explain business logically but weren't interested in code, this would help them a great deal.

Every time there's a new abstract layer, you hear calls of "programmers out of a job" and yet here we are.

I'm more interested in the idea that AI might find a new paradigm that programmers haven't thought of yet. In the early 00s I saw some research that used genetic algorithms to generate lots of snippets of code to solve a task. At each generation it would take all those that worked and create another generation of code snippets. Processing power was at a premium then compared to now, which limited it to very simple problems. I wonder if exploring the solution space using neural networks and evolutionary computing might turn up something that's useful to us all!


Re: What does "in the top 54%" mean?

There's no human involved in this test. Testing the validity and performance of an algorithm is the easy bit.

God of War: How do you improve on perfection? You port it to PC, obviously


Re: How do you improve on perfection?

That's great advice but I didn't even need to do that! Possibly because my last three cards were nvidia based.


Re: How do you improve on perfection?

In 5 years, the only times I've had to fiddle with a PC game are after a GFX card upgrade, where I've had to hit "auto gfx settings" to accept the improved boost. Otherwise, I've not hit an "obtuse technical reason" in living memory.

James Webb Space Telescope has arrived at its new home – an orbit almost a million miles from Earth


Re: ...and "blow" through a "whopping" $10bn in funding

You could try not being condescending. I doubt I was baited, I imagine it was a cliché applied inappropriately.


...and "blow" through a "whopping" $10bn in funding

Come on Reg, that's a little derogatory for a great achievement, isn't it? $10bn over 25 years for discovering the secrets of the universe is great value for money. In 2008 the UK gov alone blew $850bn in 3 days and what did we discover in the 13 years since? That the richest benefitted the most. I'll take the $10bn Webb telescope any day.

Microsoft's do-it-all IDE Visual Studio 2022 came out late last year. How good is it really?


Re: The Microsoft naming department

Some errata for you:

Visual Basic did not completely die; it lives on, and VB6 is still supported in Win 11.

C# was always called C# and was separate from J#. J# was built at the same time and converted Java source/bytecode to MSIL to run on the .NET Framework.

C# is descended from C and C++. It was originally called "Simple Managed C". Did you know the name stands for C++++ - two increments on C - with the four pluses arranged to form a #? If you look under the hood of 1.0, C# is more like C++, but in the early days it looked enough like Java for people to convert over.

I agree more with the original comment that naming is a complete disaster zone but not for the reasons you give.

Meta trains data2vec neural network to grok speech, images, text so it can 'understand the world'


Redefined what multi-modal means

It's a clever idea, but I think they've redefined what multi-modal means to avoid the difficult bit. Three separately trained models that have their outputs combined isn't multi-modal. Multi-modal is desirable because it's one of the hard problems left in classification neural networks.

Also, this was a red flag:

"We have not specifically analyzed how our models will react to adversarial examples"

Then Meta AI shouldn't be releasing news stories until they've properly tested it. That includes trying to break it to understand its bias and limitations.

Also, unsurprisingly, neither the original blog article nor this story mentions ethics at all.

£42k for a top-class software engineer? It's no wonder uni research teams can't recruit


Every year I look back at Uni and then think no, not yet

I had to leave after my PhD (Artificial Intelligence/Cybernetics) in 2003 because the only Post Doc funding available wouldn't pay my rent/debt. I've been an enterprise web/algorithms/data dev since and although I'd love to go back to the collegiate atmosphere, the pay and Uni bureaucracy puts me off. My skill set is exactly what Goldacre is after but Unis need to return to being research establishments that pay good people well rather than teaching quasi-businesses. That's not going to happen any time soon.

After deadly 737 Max crashes, damning whistleblower report reveals sidelined engineers, scarcity of expertise, more


Peer review?

The important part of the scientific method is peer review. For proper peer review, you need an outside organisation to test the product. Just saying "our results are reproducible and reliable" is not enough; your own results are meaningless until someone else has reproduced them. That's the actual scientific part of the method.

Academics horrified that administration of Turing student exchange scheme outsourced to Capita


Re: Running

From the tax man and to the Cayman Islands.

More than half of UK workers would consider jumping ship if a hybrid work option were withdrawn by their company



Is there a context problem here? Do you employ a lot of IT professionals? In the context of El Reg, most will be reading this from the perspective of IT professionals, which is a world away from industries where there is manual work or work wear.

Virgin Media fined £50,000 after spamming 451,000 who didn't want marketing emails


50k out of a turnover of 5.13bn

They'll think twice about doing that again!

This House believes: A unified, agnostic software environment can be achieved


Not possible

If you take AI to mean feedforward/recurrent neural networks then it's feasible but unlikely for all the good reasons other commenters have mentioned.

However, there are other algorithms that won't fit your model so well at scale - unsupervised learning or Bayesian networks. If you build your high performance compute for feedforward/recurrent neural networks, you will be creating a bad fit for other algorithms.

Leaked footage shows British F-35B falling off HMS Queen Elizabeth and pilot's death-defying ejection


617 Squadron is also a RAFAC conventional glider unit based out of Manston. Perhaps they should try a winch launch on the F-35?

All change at JetBrains: Remote development now, new IDE previewed


IDE in the cloud is not quite killer app... yet

I like the idea of a whole development environment in the cloud, but the difficult bit isn't the IDE, code share, or even paired programming (most IDEs have that now and you don't need cloud for it). It's attaching to and interactively debugging your code while it's running. Sure, your debug environment could be in containers now, but that kube config is finicky even when running locally. Running that same config on someone else's IDE-cloud platform and making it play nice with the IDE isn't quite there; these offerings require port forwarding through the thin client to your local machine, so it's not completely dev-in-the-cloud. Not yet, anyway. Those are just technical things that eventually get ironed out.

It's good that VSCode has some competition but that competition is going to need to be open source, not for the ability to fork but to be seen to be developing in the open rather than behind closed doors.

India backs away from digital services tax after US pressure


15% does sound low but it's easier to introduce it low and raise it later than it is to start high.

A tiny typo in an automated email to thousands of customers turns out to be a big problem for legal


Einstein's theory of general relativity is rarely so well demonstrated.

Red Hat 8.5 released with SQL Server and .NET 6 ... this is Linux, right?


Still unsure what they're saying here...

> NET 6 is quite a watershed release and as part of us having a more predictable release schedule we want to introduce content based on its natural lifecycle

I've read it through a few times and although I understand the words and statements individually, they neither answer the question nor add anything new.



Biting the hand that feeds IT © 1998–2022