* Posts by The Indomitable Gall

1521 posts • joined 10 Jun 2009

Bill Gates debunks 'coronavirus vaccine is my 5G mind control microchip implant' conspiracy theory

The Indomitable Gall

Re: Very good skeptoid podcast recently debunking this stuff


"Asking somebody to explain it forces them to think about it. "

Unless they've already thought about it (or had someone think about it for them) and they already have an answer, such as "the brain--why do you think tinfoil hats exist?"


But if the question is repeated again and again with no malice or implication of stupidity, over time it might have an effect. Lots of us here will remember the long, slow decline of racism, sexism and homophobia... that is still going on, decades after it started. Ignorance is never dispelled overnight. Patience is required.

The Indomitable Gall

Re: Very good skeptoid podcast recently debunking this stuff

If you can't change their minds, don't talk to them.

Educating the ignorant is a slow process, and for those of us who try to act with patience and help people slowly move out of their cognitive traps, it is supremely frustrating when people blunder in and shout "YOU FUCKING NUTTER!" then run away proud of themselves for speaking up in the name of truth, having demolished our attempts to get the ignorant to start to understand the truth a bit better.

Trump bans Feds from contracting H-1B workers and makes telehealth the new normal

The Indomitable Gall

Re: Really a ban?

Has Trump not already shown himself to be extremely flexible on definitions, restrictions and limitations...?

Google extends homeworking until this time next year – as Microsoft finds WFH is terrific... for Microsoft

The Indomitable Gall

Re: Self Control

" The 4 hours extra of working a week, can be offset by the reduced time wasted commuting. I've worked in jobs where commuting wasted 10 hours a week, so spending 4 of those working and 6 of them relaxing is a win for both sides, and helps justify the idea to the company. "

Your company hasn't paid you for that time, and hasn't "gifted" you that time -- that's your time, and you shouldn't be giving it away for free. If you need to put in the extra time to buy yourself the ability to work from home, that's a bit shitty, but go ahead. But to rationalise doing more work for no active recompense from your employer... that's being complicit in your own exploitation.

Unshackle the proletariat, and fight for a better tomorrow, comrades!

The Indomitable Gall

Re: ... welcome to the machine

Yeah, a dystopian sci-fi kick in the teeth.

The Indomitable Gall

Re: WFH is also good

I always felt stared at and disapproved of when I took an afternoon constitutional out of the office to try to get my head back in the game. But I did it anyway, as I knew it was helping, whatever the other drones thought of it.

The Indomitable Gall

Re: WFH is also good

Disturbing no-one? The mere thought that there exists a plushy form of the source of all human misery disturbs me, for one!

The Edinburgh Fringe festival isn't happening this year, but that won't stop a digital sign doing its own comedy routine

The Indomitable Gall

The gates have been locked almost continuously since last century. I walked down those steps maybe twice. They stank of urine, or, as we say in Scotland, they reeked of pish. And that, dear reader, is why they're locked.

The self-disconnecting switch: Ghost in the machine or just a desire to save some cash?

The Indomitable Gall

I would do anything for an upvote...

...but I won't do that.

Virtual reality is a bonkers fad that no one takes seriously but anyway, here's someone to tell us to worry about hackers

The Indomitable Gall

Re: VR is a fad?

" 2) "VR" Games. They suck. I don't want an immersive VR experience, where I have to mime getting out of the car and twisting off the gas cap in order to refuel. I want to press "Y" while next to the pump and have my tank meter zip to "F" on its own. In fact, I don't want to even stand up.

3) No/crap support for non-VR games. The main reason I want 3d is for stereo 3d, not for a VR experience. I like traditional styled games, where you sit and use the keyboard or maybe (for the less cerebral games) a console style controller. "

Lots of development houses thought that way, and started trying to do traditional gaming in VR (the original Oculus dev kits didn't come with motion controllers, after all) but in the end, they all came to the same conclusion -- that the sense of presence was too much, and using a controller just felt weird.

The Indomitable Gall

VR security...?

VR security is a non-issue. Not because it isn't a potential problem, but because your headset is either a fancy display controlled by an external computer/console or a display with an Android device built in. That means VR security is IT security, and business as usual, not a new category. VR attacks will exploit the exact same attack vectors as every other attack, but there will be far fewer of them, as it's a small target group. Furthermore, with the exception of the Quest and similar standalone units, it's going to be pretty much impossible to identify potential targets from internet metadata, as there's nothing to set a VR-equipped PC or PlayStation apart from non-VR ones unless you happen to be browsing the web from inside a headset.

Hell hath GNOME fury: Linux desktop org swings ax at patent troll's infringement claim

The Indomitable Gall

Re: A money-grubbing Rothschild?

" both financing its operation and financing its abolition. "

Well, there is the question of morality vs pragmatism. If the world's economy runs on slavery, conscientious objection costs money and might render you uncompetitive. Then eventually you get the chance to do something about it.

Or maybe it's just that families are made up of different people with different views.

First Python feature release under new governance model is here, complete with walrus operator (:=)

The Indomitable Gall

If you look at the PEP, one of Guido's justifications was that reviewing existing Python code, he was finding plenty of examples of people duplicating work (eg [ f(x) for x in x_list if f(x)>0 ]) or doing redundant work to avoid nested ifs, and he saw this as a solution for real-world problems.

The other advantage is removing what the linguist part of me would call "long-range dependencies". The closer the assignment is to its use, the easier it is to reason through the code. Particularly, mathematicians and scientists are used to reasoning through things in semantically dense formulas, and less used to the step-by-step imperative style.

The Indomitable Gall

When you remove the possibility of =/== substitution bugs, the main danger of C-style assignment is gone.
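To make that concrete (a minimal sketch, assuming nothing beyond Python 3.8): a stray `=` where you meant `==` simply won't compile in Python, so assignment-in-expression can only ever be deliberate:

```python
# The classic C bug `if (x = 0)` compiles and silently assigns;
# the equivalent typo in Python is rejected outright.
try:
    compile("if x = 1: pass", "<demo>", "exec")
    typo_accepted = True
except SyntaxError:
    typo_accepted = False

assert typo_accepted is False

# Assignment inside an expression has to be spelled with the walrus:
if (x := 1) == 1:
    pass
assert x == 1
```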

The main reason I see the walrus as a good thing is in list comprehensions.

Consider :

newlist = [ f(x) for x in oldlist if f(x) > 0 ]

Now imagine that f(x) is O(2^n) or O(n!) -- you've either got to take the performance hit of running it twice or expand the code out. For a scientist or mathematician (and academia is a major target audience for Python), this one-line, pseudo-mathematical, pseudo-functional approach is much more readable.

My personal preference would have been for local variable declarations, eg:

newlist = [ result for x in oldlist if result > 0 where result = f(x) ]

but that's not the way they chose to go. Instead we have:

newlist = [ result for x in oldlist if (result := f(x)) > 0 ]

...which is less restrictive than my way, and Guido set out good reasons for it. But when you look back at the examples, I think it boils down to this: a great many users of Python aren't immersed in the imperative programming style the way most pro software devs are, and the reliance on branching blocks and lines is more confusing to them as it renders code logic implicit. I'm surprised it took Guido as long as it did (v2.4) to introduce conditional expressions, actually, as that's really useful for bringing a single mathematical function into a single line.
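To see the double-evaluation saving in runnable form (a sketch with a made-up f that just counts its own calls):

```python
calls = 0

def f(x):
    """Stand-in for an expensive function; counts how often it runs."""
    global calls
    calls += 1
    return x * x - 4

oldlist = [1, 2, 3, 4, 5]

# Without the walrus: f runs in the filter AND again in the expression.
calls = 0
dup = [f(x) for x in oldlist if f(x) > 0]
calls_without = calls  # 5 filter calls + 3 expression calls = 8

# With the walrus: f runs exactly once per element.
calls = 0
new = [result for x in oldlist if (result := f(x)) > 0]
calls_with = calls  # 5

assert dup == new == [5, 12, 21]
assert calls_without == 8 and calls_with == 5
```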

YouTube thinkfluencer Siraj Raval admits he plagiarized boffins' neural qubit papers – as ESA axes his workshop

The Indomitable Gall

Re: My relentless workload...

On the other hand, it is now getting more and more widely recognised that the YouTube algorithm is a slave-driver, and if you don't work yourself beyond the point of being able to produce decent content, you will be punished by being made irrelevant in the listings.

Whole multi-person production teams spend months to create a couple of hours of television, and YouTube is pushing solo producers to knock out around an hour of video per week to get a basic income. Of course corners get cut.

I'm not defending Raval here, though -- he chose how to react to the pressure. If he'd exited YouTube with his reputation at a high point, he could probably have got himself a fairly decent job. Instead he tried to continue in an unsustainable market.

The safest place to save your files is somewhere nobody will ever look

The Indomitable Gall

" It seems such a counter-intuitive thing to do and yet is so widespread. The only explanation I can think of is that users learn it from each other. "

The most underestimated force in UI design is the path of least resistance. Deleting is one of our quickest actions (thanks to having a dedicated key) and if deletion is (initially) non-destructive, the apparent continued availability makes it seem like the most efficient means of archiving.

If we also had an "archive" key on our keyboards, more people would archive properly.

Oh dear... AI models used to flag hate speech online are, er, racist against black people

The Indomitable Gall

Re: 6% toxic

It is not 6% toxic. They shouldn't have used percent to describe it, because it's not really a percentage or a proportion. Most machine learning algorithms score everything in the range (0,1], never scoring zero, because the final score is a product of many factors, and a product of non-zero factors never reaches zero.
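As an illustration (a sketch, not the model from the article): the standard logistic squashing step keeps any finite score strictly between 0 and 1, so "6%" is just a score of 0.06, not a proportion of toxic words:

```python
import math

def sigmoid(z):
    """Squash a real-valued logit into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Even extreme logits never land on exactly 0 or 1.
scores = [sigmoid(z) for z in (-30.0, -5.0, 0.0, 5.0, 30.0)]
assert all(0.0 < s < 1.0 for s in scores)
assert abs(sigmoid(0.0) - 0.5) < 1e-12
```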

The Indomitable Gall

Re: “I saw his ass yesterday”

I think Shakespeare, Chaucer et al. would dispute your use of the term "the original".

Thy coarse neologisms cause great pain upon mine ear. Get thee to a nunnery!

YouTuber charged loads of fans $199 for shoddy machine-learning course that copy-pasted other people's GitHub code

The Indomitable Gall

Re: why do people go for online tutorials?

All very much true.

However, the good course designer starts with a particular demographic in mind as his/her target audience, and builds the course around them. A good teacher will adapt the course on the fly if students are finding it too hard or too easy.

Modern digital courseware wants to sell to as wide an audience as possible, which means all notions of prerequisite learning go out the window. Then there's the tendency for everything to be live coding instead of lectures, which means everything's paced by lines of code rather than complexity of concepts.

I've been watching a LinkedIn Learning course on Vue.js today, and it's exclusively live coding. The presenter keeps going off on tangents just in case you don't know particular features of JavaScript as and when they pop up in a line of code he's writing.

The Indomitable Gall

I'd be more generous and say that he's fallen into the greatest cognitive trap of the crowdfunding era -- the idea that money comes first, then everything else falls into place.

Kickstarter was so swamped with this sort of thinking (for example, all the non-engineers who said "give me some money and I'll hire an engineer to create my technologically impossible and/or financially infeasible games console") that they insisted on prototypes first (and now they all head to Indiegogo instead).

The notion of building a course around freely-available information is so much the mainstream that the notion on its own is valueless (how many of your uni lecturers invented and/or discovered the stuff they taught you?) -- it's the execution that matters.

What I note as missing from the article is any discussion of quality assurance. If you're delivering a course to that many people, there should be several layers of oversight, and ideally also a fairly rigorous testing process involving teaching beginners... I doubt there was any of that.

It's the thing that bugged me most about Coursera and Udacity when they first came out -- 1000s of people taking courses that had never been run before, and many of which (on Coursera and EdX, anyway) were only ever run once. No beta testing, no refinement or improvement... all people ever saw was the first draft. In that situation, at least these courses were based heavily on tried and tested university modules, but even then, the change of medium really called for a lot more in terms of adaptation and testing.

Lucas Pope: Indie games visionary makes pen-pushing feel like an exciting career choice

The Indomitable Gall

That's an odd comparison on several levels.

First of all, the 80s were full of one-man dev teams, like Matthew Smith (the Miner Willy games), Andrew Braybrook (although he was part of a software house, most of his genius games were solo coded) or Jeff Minter. With 16, 48 or 64K to play with, it was pretty easy for one person to fill all the memory in a few weeks' work.

But Mike Singleton... well, he did some clever maths-based programming to make games that most people wouldn't have thought possible -- even now part of me wants to believe that Lords of Midnight on a 48K Spectrum is just a Mandela effect.

We're now living in an era where technological marvels are beyond the reach of the sole coder, and what indies like Pope are doing is finding ways to use what's already there efficiently to build compelling story-driven games without having to worry about the tech.

OK, so Pope did have to write a custom shader to get the "dithering" effect for monochrome shading, so it was far from a totally non-technical project, but the comparison just seems weird to me.

Reach out and touch fake: Hand tracking in VR? How about your own, personal, haptics?

The Indomitable Gall

Oculus Quest hand tracking

" Unlike the wizardry used by the HoloLens 2 to calculate the position of the user's extremities, the Quest will employ its monochrome cameras and "AI magic".

Good luck with that – the Quest is hardly the processing powerhouse. "

Processing not required, allegedly. Oculus seem to be saying (https://www.youtube.com/watch?v=K2zLneGGbk8) that this is all done on the device's DSP hardware, and that the DSP hardware was specced up deliberately to leave enough capacity, after implementing the basic inside-out tracking, for hand-tracking to be included later.

I haven't looked into developer specs in any depth (if I play with VR dev, it's always with the dev kits in Unity or UE) but I'm not aware of any API to allow developers to access the DSP for custom functionality, so I don't believe it's going to have any real impact on device performance. Yes, there's reportedly currently noticeable latency in it, but that's likely to be down to the DSP itself.

MPs call for 'immediate' stop to facial recog in UK as report underlines bias risks in 'pre-crime' algos used by coppers

The Indomitable Gall

An overlooked problem

"Algorithms have to crunch through tons of data in order to make accurate predictions. Relying on a single individual's data is probably not enough to get it to work. Even if a tool is effective for groups of people, it doesn't mean it'll necessarily be accurate for a single person."

It's even worse than that, though. Setting negative expectations regarding an individual is likely to increase that person's alienation, making the negative prediction a self-fulfilling prophecy. A famous education experiment in the US told teachers that a set of kids were going to be "late bloomers" and highly intelligent. The teachers treated them differently, and the prediction came true. There was nothing particularly special about these kids, other than the teachers' expectations.

If convicted criminals are treated with constant suspicion by beat cops, they'll be convinced that rehabilitation isn't possible and bam! -- self-fulfilling recidivism prophecy.

GIMP open source image editor forked to fix 'problematic' name

The Indomitable Gall

Re: Eh?

Self-deprecating is different from insulting towards a minority.

It's like MongoDB -- how did they ever think that was OK?

The Indomitable Gall

Re: Eh?

An insult used by able people against disabled people.

I think "gimp" is only used in that sense in the US, but that's surely enough. Still don't get how MongoDB didn't get that their name is pretty f*cking insulting too.

DeepNude's makers tried to deep-six their pervy AI app. Web creeps have other ideas: Cracked copies shared online as code decompiled

The Indomitable Gall

Re: I'm impressed and worried

The problem being solved here is one with a known solution: what does a naked body look like?

The cure for a new virus is not a known problem, so it needs a different type of AI from one that can be trained to draw cats, nipples or beards.

These boffins' deepfake AI vids are next-gen. But don't take our word for it. Why not ask Zuck or Kim Kardashian...

The Indomitable Gall

" Which make me wonder where all the "deepfake" audio development is happening. "

Baidu and Adobe, I believe.

Deepmind, Google and Microsoft have been working towards it too, but I think Baidu and Adobe are the only ones who've demonstrated actual computer-faked spoken speech so far.

Please be aliens, please be aliens, please be aliens... Boffins discover mystery mass beneath Moon's biggest crater

The Indomitable Gall

Please tell me...

Please tell me that the Aitken crater is named after Aiken Drum.

(There was a man lived in the moon, in the moon, in the moon,

There was a man lived in the moon and his name was Aiken Drum.)

We listened to more than 3 hours of US Congress testimony on facial recognition so you didn't have to go through it

The Indomitable Gall

Re: File under: No Sh1t Einstein

Said like that it's obvious, but the limited public understanding of AI and machine learning means it's crucially important that it is said.

if developer_docs == bad then app_quality = bad; Coders slam Apple for subpar API manuals

The Indomitable Gall

Re: This is a serious problem

I haven't really been able to examine Eiffel closely, because as soon as I look at it, I see yet another impenetrable jumble of fixed-width letters.

Why are we so obsessed with fixed width? It is hard to read, making it easy to make mistakes.

Why are we so obsessed with plaintext? It's so inconvenient that we spend our lives hacking IDEs to try to present layers of visual meaning on top of the text through colouring and font weight. And still it lets us type illegal code -- syntax errors.

Creating a truly smart language integrated with the IDE would be simpler than hacking the editor to highlight errors.

Just look at the Stride language, invented as a teaching language. Almost all the flexibility of Java with practically no scope for syntax errors.

iPhone gyroscopes, of all things, can uniquely ID handsets on anything earlier than iOS 12.2

The Indomitable Gall

Are we going to see a market form for vibrating phone cases now?

OpenAI retires its Dota-2 playing bots after crushing e-sport pros one last time

The Indomitable Gall

" OpenAI can also see the whole map, something humans can’t do. "

That is not the goal of gameplaying AI. OpenAI have just gone for a cheap headline-grabber rather than doing their homework.

Prince Harry takes a stand against poverty, injustice, inequality? Er, no, Fortnite

The Indomitable Gall

Re: He's only a royal

I always thought that was a bit of a moot point, but then I realised that we're paying him for being a royal. The public should have a right to a DNA test for anyone on the Civil List.

The Indomitable Gall

Re: Thanks Harry

" I take it that you hadn't heard that this process was completed in 1760 under George III? The Monarch's assets were separated off and paid to the Treasury, "

Which is why anything since taken from the people would be accurately described as stolen in a legal sense. Under previous monarchs, many state assets have gone mysteriously missing from high-security locations and no charges have been brought, including several generations of crown jewels. The crown jewels were quite certainly stolen. Who by, we can't say, but the lack of any action suggests someone who is above the law. Some of these disappearances are said to line up quite coincidentally with cashflow problems in the households of the then-reigning (now dead) monarchs.

" and in return the government ran a Civil List returning a set figure "

Ah, so we got back what they stole from us... by buying it? You think having to buy back what was stolen is justice? If so, I'll sell you your own car next Sunday.

Oh, and then you mention the Crown Estate. How did they get the Crown Estate? Now I'm not saying they stole it, but I think the previous poster would. Why did they have rights to own the land in the first place?

Users fail to squeak through basic computer skills test. Well, it was the '90s

The Indomitable Gall

If you're honest with yourself and think way back, your early mouse experience was probably like mine -- pushing the mouse to the edge of the desk and off the side because the pointer wasn't quite at the edge of the screen yet... then embarrassedly realising you could just pick it up and move it back without the pointer moving.

We all had problems getting our heads round the mouse when we started.

The Indomitable Gall

Re: Not sure...

He was specifically talking about rosé. And he said he thought the French stuff was rubbish too.

The Indomitable Gall

Re: Mouse balls

For me the issue was rarely the ball itself, but the accumulated crud that had transferred from the ball to the rollers.

One of the PC magazines gave away a free cleaner that I thought was great -- it was a ridged ball on a stick, and you just took the mouse ball out then wiggled this in the hole to scrape out the rubbish.

Then I realised it was quicker and easier to use the pocket clip on the lid of a cheap Bic biro and just scratch it all off directly.

Want to get rich from bug bounties? You're better off exterminating roaches for a living

The Indomitable Gall

Re: Eh, what...?

(N.B. This is not a defence of bug bounty, just an observation about handling fair recompense.)

The Indomitable Gall

Eh, what...?

"A bounty price can't really exceed what an in-house security person will make."

Ehhh.. what?!? Whyever not? Surely freelancers need to get more money than employees in order to rebalance the risk/reward ratio to compensate for the lack of guaranteed income?

Samsung’s new phone-as-desktop is slick, fast and ready for splash-down ... somewhere

The Indomitable Gall

To really take off, the convertible phone has to have a (near) universal standard, as then you can rock up to a public docking station in a coworking café and do your thang.

The Indomitable Gall


Windows Icons Menus Pointer


Windows Icons Mouse Pull-down-menu

Both had currency, so both are right.

But mixing the two up will always be slightly tautologous.

Python creator Guido van Rossum sys.exit()s as language overlord

The Indomitable Gall

Re: Reinventing a more limited wheel

" There is absolutely no way to ensure that f(x) is idempotent. If you don't understand that, then step away from the keyboard. "

And this is another reason to favour the PEP -- taking the same example

results = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]

if the function f is not idempotent, then we now have the possibility of throwing a divide-by-zero exception, which would cause the whole comprehension to be binned. (E.g. first call to f(x) returns 1, but in x/f(x), f(x) returns zero.)

In the case of the assignment expression version...

results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

... as f(x) is only evaluated once, x/y will never result in a divide-by-zero exception.
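Here's the difference in runnable form, with a deliberately non-deterministic stand-in for f (it returns 1 on its first call and 0 after that):

```python
def make_f():
    """Non-deterministic stand-in: returns 1 on the first call, 0 thereafter."""
    state = {"calls": 0}
    def f(x):
        state["calls"] += 1
        return 1 if state["calls"] == 1 else 0
    return f

input_data = [10]

# Double-evaluation version: the filter sees f(x) == 1, but the
# re-evaluations inside the tuple return 0, so x / f(x) blows up.
f = make_f()
try:
    [(x, f(x), x / f(x)) for x in input_data if f(x) > 0]
    blew_up = False
except ZeroDivisionError:
    blew_up = True
assert blew_up

# Walrus version: y holds the single evaluation the filter tested,
# so x / y is safe whenever the filter passes.
f = make_f()
results = [(x, y, x / y) for x in input_data if (y := f(x)) > 0]
assert results == [(10, 1, 10.0)]
```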

The Indomitable Gall


Anyone arguing whitespace vs block delimiters is really missing the point -- these are both compromise solutions for coding in plaintext on memory-limited machines.

Modern computers handle block nesting in much more sophisticated ways -- think XML.

Take a look at Stride, an educational programming language based on Java.

The editor is "frame-based", meaning all the block delimitation is implicit, and it's damn near impossible for a coder to mix up the flow control.

It's also impossible to commit any syntax errors, as starting a line with if, while etc. brings up a template with all the possible slots presented as text boxes, so you can only put something where it's permissible.

Better still, you no longer have to type full commands, with a single keystroke indicating the line's key function: "=" for assignment, "i" for if, "w" for while etc.

It's quicker to code than standard Java, it's less bug-prone than standard Java, and it's made up almost entirely of keyboard shortcuts. And because it saves as XML, you could technically edit it in plaintext if you wanted. It seems to be everything programmers want... so why aren't programmers picking up on it?

The Indomitable Gall

Re: Futuristic progression of Programming Languages?

I agree that visual programming is very limited.

I think the future of code is in frame-based programming. Check out the Stride programming language used in the Greenfoot and BlueJ educational IDEs. It's designed to let you develop traditional line-paradigm code more efficiently by reducing the number of keystrokes and making syntax errors impossible and scope errors rare.

Every type of code statement has a limited range of possible syntaxes, so Stride turns each statement type into a template where you fill in the boxes. As an educational programming language, Stride maps to a subset of Java, and any Java code can be called from Stride.

It also renders the block delimiters vs meaningful whitespace debate moot, as blocks, scopes and indentation are handled by the editor automatically as the programmer is no longer dealing with plaintext.

I think this frame-based paradigm has real potential to change coding practices in a way visual coding never really did.

Pi-lovers? There are two fresh OSes for your tiny computers to gobble

The Indomitable Gall

Re: noob boot

Not if they're McDonald's apple pies -- those things will survive the nuclear apocalypse.

JURI's out, Euro copyright votes in: Whoa, did the EU just 'break the internet'?

The Indomitable Gall

Re: News sites should pay the aggregators.

" News sites make money off of the additional traffic driven to them by the news aggregators. "

Which assumes that:

A) aggregator sites drive traffic to content providers


B) the traffic driven is of high value.

Whether A is true or not depends on whether you're interested in unique visitors or page-views. Aggregators increase the former, but decrease the latter. This is where B comes in. "Drive-by" readers are less valuable than brand-loyal "sticky" readers.

Overall, aggregators appear to cost content providers significantly.

What's all the C Plus Fuss? Bjarne Stroustrup warns of dangerous future plans for his C++

The Indomitable Gall

Re: C and C-style C++


" Layer your software - assembler/C at the metal/kernel. C++ at the system level. Dynamic scripting language a the application level. "

Where do libraries sit though?

What do you see as the role for functional programming?

And why would you want dynamic scripting for applications when self-modifying applications are a security risk?

Tesla undecimates its workforce but Elon insists everything's absolutely fine

The Indomitable Gall

Re: sustainable, clean energy

The biggest issue with nuclear is that all current practical economic models emphasise short-term outcomes, and the last thing a nuclear reactor needs is a management team that can't see the bigger picture. This really comes into play when the reactor hits the end of its life, as decommissioning hasn't historically always been budgeted for, meaning the operator goes bust and leaves the cleanup to the public purse.

I reckon nuclear operators should be obliged to buy government bonds to insure the cleanup, and if they can do it cheaper and cash in the bonds, good for them.

The Indomitable Gall

Re: Undecimate?


" It's not an etymological fallacy if it's also used (and understood) in the original sense. It's only the case if the original meaning is almost never used. "

Yes it is, because the previous poster was talking about the etymological argument being used against the modern one. That's fallacious -- just because one version matches the etymology, doesn't mean the other version is wrong.

Machine learning for dummies: You needn't go back to uni to use it

The Indomitable Gall

" The university that I work for in 1996 dropped Computer Science in favour of Applied Computing, realising that industry doesn't need someone who can build a linked list library from scratch but rather knows how and when to use an existing library. "

But this brings up the problem of how to develop the mental schemata to process what you're doing. If all you ever do is work with libraries, you miss out on several levels of abstraction and don't fully understand what you're doing.

The other side of the coin is that if you only ever deal with fundamentals like manually programming lists, you're missing out on several levels of abstraction and can only do real-world tasks within a very narrow domain.

The problem we have is that most courses fall into one of two extreme camps, and few people are discussing the middle-ground. But if you look at things like Stride in BlueJ, we're slowly starting to approach it.


Biting the hand that feeds IT © 1998–2020