* Posts by ibmalone

1491 publicly visible posts • joined 6 Jul 2017

Logitech's latest keyboard and mouse combo is wired, quiet, and suspiciously sensible

ibmalone

Re: short life of Logitech

Dye-infused caps used to be pretty standard (and might last longer than the keyboard), but I had to get rid of my last one a while ago, and most new keyboards are black, where dye infusion is less easy. That said, I'm typing this on a cheap Dell keyboard (basically the add-on option for one of their workstations) that must be at least 15 years old, and the only decals to have worn off are the direction keys and half a shift symbol.

ibmalone

Re: What is the issue with keyboard these days?

I can think of a couple of practical use cases for RGB lighting, neither of which is to display a psychedelic lightshow (which tends to be the domain of showrooms, see also 90s-era stereos):

1. (For whole keyboard or zoned lighting) Adjusting backlighting colour to your preferences, some people may prefer warmer colour palettes for backlighting for example.

2. More for gamers, but being able to colour highlight particular keys can be quite useful. Flashing can even be useful there as a way to communicate status (although control methods tend to be proprietary so integrating these things can be a bit painful).

China spawns an x86 supercomputing monster, with an AMD connection

ibmalone

Re: Interesting

For HPC, it would likely be better for them to have more 128- or 256-bit vector-matrix units (rather than 512-bits), without SMT, and focus on optimizing scatter-gather memory accesses (over step, stride, gait) ... SMT4+ is more of a database processing thing iiuc (and it sunk SUN).

Indeed, for almost everything we run you may as well disable SMT (or make sure threads are limited to the number of physical cores), as floating point units are the limiting factor for a lot of scientific computing, and the fragmentation from multithreading starts to hit performance once you exceed the number of threads that can genuinely run in parallel.
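As a minimal sketch of the second option (my illustration, not anything we actually run; it assumes 2-way SMT, where in practice you'd query the topology with something like hwloc, or just set OMP_NUM_THREADS):

    #include <omp.h>
    #include <stdio.h>

    int main(void) {
        /* omp_get_num_procs() reports logical CPUs, which with 2-way SMT
           is twice the physical core count; capping threads at the
           physical count keeps one FP-heavy thread per core. */
        int logical = omp_get_num_procs();
        int cores = logical / 2 > 0 ? logical / 2 : 1;  /* assumes 2-way SMT */
        omp_set_num_threads(cores);
        #pragma omp parallel
        {
            #pragma omp single
            printf("Using %d of %d logical CPUs\n", omp_get_num_threads(), logical);
        }
        return 0;
    }

(Build with -fopenmp.)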

As US scientists flee Trump, MP urges Britain to do more to nab them

ibmalone

A friend, who eventually became a British citizen for this reason (and put it off for a long time because their birth nationality didn't allow dual citizenship), had to pay the ridiculous NHS surcharge, despite spending their entire working life in the UK and running a business that created jobs here. They were raised and educated at the cost of another country altogether (including paying foreign student tuition fees in the UK while at university), so even without the surcharge they were more of a net contributor to this country's finances and health system than many UK-born citizens.

Their example was particularly extreme, but many people targeted by the surcharge will similarly be arriving here at working age, earning money and paying tax (both on income and spending) just like everyone else. The only place it has any kind of (mean) logic is in preventing them from bringing dependants, and if you want to attract skilled workers on UK salaries, asking them to pay an additional £700 per year for any children they might want to bring isn't exactly appealing.

ibmalone

Or the funding for said salaries. As NIH and other US funding is cut (sometimes in the middle of programmes), charities like the Alzheimer's Society are having to bridge those gaps, leaving less money for projects elsewhere, so UK institutions are already starting to see knock-on effects. If they want to attract US researchers the funding for the science needs to be there; £50M is approximately the cost of a single moderate-to-largish research centre.

Brewhaha: Turns out machines can't replace people, Starbucks finds

ibmalone

Among them all, I simply don't understand the success or existence of Costa. Price-wise it's similar to everything else, but it somehow adds a level of misery in a way I can't fully articulate. If Starbucks coffee is not great, then at least it manages to be somehow reassuring in the ridiculous comfort-food way that only a pumpkin-spice latte can really be, and its cakes and biscuits are ridiculously sized (if over-sweetened); Pret is all about speed and convenience at the cost of joy; Nero makes a stab at doing decent coffee and slightly upmarket food in (varying) comfortable surroundings. Costa, though, feels like choosing a snack from the supermarket meal-deal aisle for twice the price. I suppose it makes sense that their recent big move has been to put branded machines into supermarkets and cut out the middle man (themselves).

ibmalone

An alternative, which winemakers have been aware of for centuries, is to have a good taster (or tasters) and blend appropriately from different sources. But it obviously costs more.

ibmalone

Re: Dont ever want

Upvote, but I also used to be in the lava camp and it has its appeal, especially if you need a strong flavour to wake you up. The lighter-roasted Lazy Sunday is not the best coffee you'll ever drink, but it's definitely pleasant enough, has more flavour than just "roast", and is cheap enough that people not deep into their coffee won't blanch at it.

Union Coffee is available in most UK supermarket chains now and is pretty good if you're not sufficiently keen to be hunting out local roasteries and subscriptions. Currently they have a Brazilian "Bobolink" that is chocolatey and nutty enough to pass as a dark roast for people who're used to that, while also tasting really nice. (Union also put their roast date on the packet; ideally you're drinking it within about two months, and if it's been sitting on the shelf for a year you're probably better off buying something else.)

ibmalone

Re: quote: his first priority should be finalising fair contracts with the 11,000 union baristas

Why not? They pay money to staff the union, and (in the US) take employment risk to improve their bargaining position in order to get a slightly fairer sliver of pie.

Extending the award to non-union members is a boss class strategy to undermine the union, and in the long run, pay themselves more (and all workers less)

This is a slightly funny one that probably comes down to local attitudes. In theory it's correct, but it also risks turning unions into cartels that attract resentment from other workers and the general public. My impressions of US unions are founded only on how they're represented in popular culture, but that impression is maybe best summed up by Homer Simpson: "I always wanted to be a teamster, so lazy and surly". It's almost certainly deeply inaccurate and also (something that happens in the UK too) generalises an attitude about one union in a particular sector to all unions. In the end, if you don't fight for everyone you can easily walk into another, much older trap: divide and conquer.

Copyright-ignoring AI scraper bots laugh at robots.txt so the IETF is trying to improve it

ibmalone

If any normal bloke had done the shite Boris, Elmo, Zuck, and the Angry Toddler have, they'd have been in the nick long ago.

I increasingly find myself reminded of Aaron Swartz, and every time I feel a little sadder than the last.

Free Software Foundation rides to defend AGPLv3 against Neo4j license add-ons

ibmalone

Re: Who owns Copyright in Copyright?

The question "Who owns Copyright in Copyright?" brilliantly sums up the stupidity.

Whereas "Who owns copyright in a license?" is pretty straightfoward, and if you got a lawyer to draw up a license for you they'd expect to be paid for it (and might be able to stop you using it if you hadn't). The free availability of well-written licenses is one of the points of FSF and Creative Commons.

In conjunction with patent law, it is an utter mess, and offers paradise for nitpicking lawyers.

Sadly the people best placed to offer alternatives tend to end up as nitpicking lawyers. That said, nitpicking is probably inevitable once you have any codified set of rules.

ibmalone

Even if the FSF is correct about all of this, all it means is that they could make Neo4J change it, which they have done voluntarily. It wouldn't necessarily mean that you can apply other terms to the software.

There's an interesting question, not really relevant to this case at all, about what happens there. If there was a situation in which the FSF could make Neo4j change the license, it seems clear that would be limited to not using the language of the AGPL (which is the FSF's), but sufficiently changing the language of a license without altering its terms is hard (particularly as everyone knows it's a derivative work to start with, so you can't argue similarities are not the result of copying). How much is legalese like mathematics, in that you can't claim ownership of a theorem (or in this case the language to achieve a particular effect)? In some ways it seems the opposite, since everything is down to wording and language, but we're talking about the operation of logic, albeit in natural language. If a particular phrasing is the only way to achieve a particular effect (flowering it up can introduce side effects), then is that phrasing fair game?

ibmalone

Exactly, the license attached is the license under which the copyright holder allows others to use the work. They can at any time decide to stop offering it under that license (although for open source this alone does not stop those already in possession of it continuing to distribute), additionally offer it under a new license, and potentially revoke the license (the GPL is not expressly irrevocable, although I think there may be notice periods required depending on the legal system in question). If a company is the sole copyright holder this means that new versions (which contain new copyrighted material that was not previously covered by the license) can be provided under more restrictive licenses.

The case where changing is more difficult is when there is no single copyright owner and all owners would need to agree to change the licensing on their contributions. This is why the Linux kernel license cannot be changed from GPLv2 to GPLv3, for example, and why companies often want copyright assignment for outside contributions to their "open source" projects (the quotes are because this means there is no guarantee such a project will remain open source). Attempting to revoke a license to which there were other contributors might also result in mutually assured destruction if they decided to revoke your right to use their contributions.

In the Neo4j case the license change would have been possible, but (without reading through the license) it sounds like they rather ineptly attempted to tack restrictions onto a license that said those restrictions could be ignored. As you say in another comment, if they had removed the language allowing those restrictions to be removed then this would be simple, but they left it in, so they've actually themselves allowed the restrictions to be removed, essentially by not understanding their own license. (Although potentially that creates a situation in which the FSF could pursue them for mangling the FSF's license and calling it AGPL, but that seems shakier ground that the FSF might not have wanted to venture onto.)

Techie pulled an all-nighter that one mistake turned into an all-weekender

ibmalone

Re: Ouch!!!

Or mv -t, which is helpful when combined with commands like find.
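For example (an illustrative invocation): find build -name '*.o' -exec mv -t /tmp/objs {} + works because -exec places the matched files last, and -t lets the destination directory come first.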

Dark mode might be burning more juice than you think

ibmalone

Re: Nowhere in the blog...

I do find some use in this kind of demonstrative work. A lot of stuff gets published that only gets read by academics, so it's quite useful to have these smaller, more accessible projects that can be turned into public engagement. That's sort of what annoys me about them not being clear about OLED vs LCD here, because it seems like they've wanted to play up the importance of the work more than to educate people.

ibmalone

Re: For me it was never about power consumption

Reasonably modern / older expensive displays have a grid of backlights so each light in the grid only needs to be as bright as the brightest pixel in front of it.

How widespread is this? Because I've recently been looking at TVs, and local dimming still seems to be a relatively premium feature there (putting them near the price of cheaper OLED sets).

I find bright text on a dark background much easier to read.

Certainly for self-illuminated displays. I do find printed black-on-white more comfortable than either, though; I've not really tried white-on-black non-illuminated (it can be done with e-ink displays). Fun aside on e-ink displays: if you haven't noticed yet, next time you're in a Lidl, take a close look at the shelf tickets. A lot of the UK stores now use programmable e-ink shelf tickets.

ibmalone

Re: And have these idiots...

This isn't really the part of it that makes them idiots[1]. Failing to make clear, at least in the blog summary, that this looks exclusively at LCD is the real failing. I suspect people use dark mode for a variety of reasons: maybe they prefer the look, maybe it's more comfortable to read in a dark environment. The point about the text being made darker too is not necessarily an issue; part of their takeaway is that the palettes chosen will probably affect people's response, but this is how people respond to typical dark modes.

The big omission is that on OLED the situation will be completely different. Let's assume all people are doing is turning the brightness up so the text brightness is the same level as they would have had for a light-mode background, which I think is what you're suggesting; but now we're already making assumptions about what people's target level is, and that one is probably not a given. (Eyesight isn't linear; why should white-on-black require exactly the same levels to read clearly as black-on-white? You'd think the lighting environment might also play a role, although they did test that and no effect shows up in their sample. Science does include testing things that seem obvious, because they're not always true.) Anyway, if we assume people adjust brightness so the peak displayed brightness is the same regardless of mode, then the power use will necessarily be less than in light mode, because the summed brightness across the screen is lower, and on OLED that's what drives the power demand. Not looked at.

[1] I don't entirely think they're idiots, but the way they've chosen to summarise in the blog post is actually misleading and possibly mildly harmful (since it could persuade people trying to save power on OLED to use bright modes), when they could have done the additional work on OLED. Conference papers don't usually get peer-review rounds of revision, often just accept or reject, which means that while the work as it stands is fine, they probably haven't been asked to fill in the gaps they would for a journal article.

ibmalone

Nowhere in the blog...

From the actual conference paper:

"Further investigation is also needed to determine whether the observed rebound effect applies to devices with OLED displays, and to quantify the energy trade-off"

I'll bet not.

This is really not groundbreaking, although maybe most people don't realise it. They tested on an LCD laptop screen (a 2017 MacBook Pro); that's where the graph on the blog comes from. As they address in the paper, but nowhere in the blog, LCD backlights are set for overall brightness and the pixel elements are subtractive, so power consumption is fairly independent of the displayed image, while LED/OLED pixels are self-illuminated and so larger black areas reduce power consumption.

It is actually interesting to show that on LCD devices people turn up brightness in dark mode, so the increase in power consumption there is inevitable. On OLED though, the power consumption is going to be related to how much of the screen is illuminated, and even this effect of turning up the brightness is unlikely to counteract that. Now, it's possible I'm wrong and the effect does counterbalance (LED is less efficient at higher brightness, for example), but what I find sloppy is that this could easily have been tested with an external OLED monitor. And while the paper itself is at least clear about that, the blog post just outright says dark mode uses more power, which is wrong, especially on phones, where OLED is relatively more common than it is for monitors and dark mode is more popular.
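To put illustrative numbers on that (mine, not the paper's): if OLED panel power scales roughly with summed pixel luminance, a dark-mode page lighting, say, 15% of the screen can have its brightness doubled and still sum to about 30% of the full-white light-mode figure, so the brightness rebound would have to be enormous to flip the result.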

Oracle starts laying mines in JavaScript trademark battle

ibmalone

Re: EcmaScript

PHiP ("fip")?

I guess you meant "sequel" though. Well, if fewer syllables are wanted we could just go for one, "quil" maybe, or "s"? Mostly I type or think it, so it doesn't really come up and I just find it easier to parse it as written. Also, SQL may be more syllables, but the se in sequel has a long vowel, while the second syllable has a coda ("quiL"), and to me overall it just feels longer than es cu el. YMMV obviously (yumve?).

ibmalone

Re: EcmaScript

PHP files are called peeaichpee.

On first read that looked like it should sound like "peachypee", which initially baffled me, but now I have a new name for PHP.

(Still can't get on board with "SQL"="Sequel")

Microsoft admits January's Windows Update broke USB Digital to Audio Convertor

ibmalone

Re: How about...

Hopefully, in this day and age, USB sound devices use USB 2.0? (While many devices still use PCIe v1 or v2, when newer specs exist: "if it ain't broke, don't fix it" ? - My answer: USB 1.0 was born "broke".)

I've got a very long blog post(/rant) on this topic that I'm still trying to reduce to something actually readable by the sort of half-tech-literate audience that might benefit from it. Some of the takeaways:

1. Many USB audio devices are UAC class 1, which dates to roughly the original USB (or USB 1.1, I forget right now) spec.

2. USB 1, 2, 3... are not speeds. USB LS, FS, HS, SS (low, full, high, super) are. The USB version number is not actually that useful for knowing a device's capabilities, leading to:

3. If a device is USB-C it is by definition USB 2 or higher; USB 2 was revised to include the USB-C connector. A USB-C device can never be USB 1, even if...

4. ... 2+3 = a USB-C device may still only be capable of USB FS communication. UAC1 uses FS.

5. USB FS is plenty of bandwidth for reasonable audio, particularly uni-directional, as in the comment above (see the rough numbers after this list).

6. But FS communication on a controller that is also managing high-speed devices results in scheduling that drops the available bandwidth for isochronous streams quite a bit below even the FS figure.

7. Pretty much every USB-C headphone or headphone adapter is: a. USB UAC1, and b. pushing ridiculous sample rates and depths for playback; I've met some that are also trying to do 24-bit 48kHz stereo input on a lapel mic. Put this all together and, if it doesn't fall over by itself, it quickly does once it has to share with, for example, a webcam or a GSM modem, even when the nominal available bandwidth at USB HS should easily be enough.
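Rough numbers for point 5 (my back-of-envelope, using only the nominal rates): 24-bit/96kHz stereo playback is 96,000 × 3 bytes × 2 channels ≈ 4.6Mbit/s, and a 24-bit/48kHz stereo mic stream adds another ≈ 2.3Mbit/s, against USB FS's nominal 12Mbit/s. That looks comfortable until the isochronous scheduling and sharing of points 6 and 7 eat into it.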

The model in the USB 1 era for this kind of thing was multiple ports with their own controllers if you wanted to run the high-bandwidth profiles in UAC1, but since USB 2 the tendency is lots of ports on a single controller. At the same time cute little dongles for USB-C now support the kind of profiles only pro hardware used to; plug one into a phone and there's no way to choose (in Linux you can if you're willing to dig about in PipeWire configs, but audio menus won't let you; maybe Windows can do similar). USB 3 and newer USB 2 devices at least have xHCI controllers, which can schedule a bit more robustly; worst of all worlds is if you happen to have a USB 2 era EHCI controller without a dedicated OHCI/UHCI controller for the USB 1 mode.
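(Incidentally, on Linux, lsusb -t will show the negotiated speed for each device — 12M for FS, 480M for HS — and which controller it hangs off, which makes this kind of contention much easier to spot.)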

Okay, you can now see why that blog post is unmanageable.

How Windows got to version 3 – an illustrated history

ibmalone

Re: Happier Days

Basically yes, although whether you draw that line at XP or 2000 is slightly open. XP has the distinction of putting the 9x series of Windows out to pasture in favour of the NT-based line, making NT the consumer option too. That was actually a big change: the stability of NT was significantly better, and prior to that you'd not be surprised if a home or small-office machine completely locked up. We do still make fun of the Blue Screen of Death, but, unless you've got some particularly bad drivers, you can now go a whole day without the machine needing a hard restart.

But of course the Active Desktop stuff had really kicked off in Windows 98 and got carried through, so there's never really been a version of Windows without flaws. However awful that was (and it really was), MS really were somewhat ahead of their time there, given the amount of JavaScript and other dynamic features in modern desktops.

Brits must prove their age on adult sites by July, says watchdog

ibmalone

Re: Surely it's already in place at the comms provider

With most ISPs I've dealt with in the last few years you have to turn parental controls off; I've often found this necessary to use the VPN for work. The most recent one I did this with was Three, for mobile broadband, and there proof of ID was needed to allow the switch (what access to a credit card number actually proves is anyone's guess). I think for mobile phone providers it's a requirement for this to be in place; I'm not sure about landline-based broadband, or maybe if your account predates the requirement then the setting remains off.

As a result the control isn't really granular enough to be that useful for protecting children online over home broadband, only really in the case of a personal mobile device.

Haiku Beta 5 / In tests it's (Fire)foxier / It pleases us well

ibmalone
Joke

but there are no viruses or spyware for Haiku anyway

"but there are no viruses or spyware for Haiku anyway"

Pity the poor malware author who has been working on reimplementing one since the original cybercriminals closed the project in 2000.

The ultimate Pi 5 arrives carrying 16GB ... and a price to match

ibmalone

Re: Just sayin' 'no'

I'm not sure why you do but I'll take your word for it

Well, you probably do need 64GB if you want to run a 70B LLM while playing a AAA sandbox shooter. Although that might just move "I'm not sure why you do" one level up, as well as adding "why do you think that's the use case for a Pi?".

Also, as someone with a Ryzen-based laptop: mobile Ryzen power use is good, but I won't entirely believe 20W at full whack without seeing some specs. The AMD Ryzen 7 5800H is 45W TDP on its own, without including the rest of the NUC. Beelink SER6 machines (just picking one of this class of device as an example) seem to come in at around 55W under load, with occasional 80W excursions under heavier loads.

ibmalone

Thanks, the Jetson looks interesting and probably more practical for serious use. It might be a good next step if we find a real application; I'm looking at the Pi as a proof of concept for SoC-ing this type of thing (and the GPU is only useful for some of these loads; the single-threaded benchmarks for the Orin NX's 8-core ARM Cortex-A78AE and the Pi 5's BCM2712 don't look too far apart, although twice the cores isn't to be sniffed at).

ibmalone

I don't think it's publicly known how much RAM ChatGPT would require to run, but I've seen estimates starting from 45GB, up to around 80GB. OpenLLaMA models can apparently work within 16GB, although I'm not sure whether that's total system RAM or required free RAM[1]. The Pi also has no GPU usable for this kind of work and the processor is rather limited, so it's not ideal for that application in many ways. (There is a "machine learning" module, but it uses a bespoke architecture and is mostly only suited to 2D vision tasks.)
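(As a back-of-envelope, and these are my round numbers rather than anything published: an N-billion-parameter model needs very roughly 2N GB of memory at 16-bit precision, N GB at 8-bit and N/2 GB at 4-bit quantisation, plus working memory, which is why a 7B model can squeeze into 16GB while anything ChatGPT-sized plainly can't.)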

That said, I've been wondering for a while about trying a Pi for some medical imaging research applications, since they are a lot cheaper and less power-hungry than what we normally use, even if they might be slower. Some of that is deep learning and some traditional algorithms; both are RAM-hungry, but not usually to those 80GB extremes. 8GB probably wasn't enough for some of it, but 16GB is a safe-ish bet. (Although this is less for our own use than for education and outreach, and maybe making this kind of technology more available to researchers with fewer resources.)

[1] I had some fun recently getting a 3D U-Net type model working on an 8GB laptop GPU; it would just about fit, but required running with the desktop stopped to ensure as much free GPU RAM as possible.

ibmalone

To be fair to Malus domestica, that's probably higher-performance RAM. The fast stuff is expensive (whether it's as expensive as the premium they charge for the extra is another question, to which the answer is usually no).

3Blue1Brown copyright takedown blunder by AI biz blamed on human error

ibmalone

Re: Utter bullshit!

You can train an LLM

I thought this post was going in a different direction: you can train an LLM on scraped copyrighted material and build a billion-pound business on it, but put up your own content as a small operation and you can still be struck off by the copyright vigilante industry.

How a good business deal made us underestimate BASIC

ibmalone

Re: pot?

Some interesting history that then runs into a weird opinion piece:

Type a bare expression, the computer does it. Number it, the computer remembers it for later. That is all you need to get writing software.

It eliminates the legacy concept of a "file". Files were invented as an indirection mechanism.

This is both oversimplification and obfuscation at the same time. First part: Python and many other interpreters have an interactive mode which works as described (bash shell scripting could be described as just this). Run ipython or a Jupyter notebook and you get that same experience. Files are a storage mechanism, a way to store those programs, just as I had to learn to do with tape when trying to program on the ZX Spectrum.

Second, maybe you will decide to call something like Jupyter Notebook extra levels of indirection. The thing is, so are those BASIC interpreters. This is what took me a long time to unlearn: the ZX's BASIC interpreter, MS-DOS, the Linux command line, none of these is any more a direct interface or native link into "the computer" than Harrier Attack, Windows 95 or Plasma is. The assembly of hardware is running some combination of that software and its own microcode that provides APIs to the hardware. You are not entering commands into a Z80 any more directly with BASIC than you are in something like ipython or Bash. To this point, some Python starter guides actually start with interactive mode.

Raspberry Pi 500 and monitor arrive in time for Christmas

ibmalone

Re: Keyboard layout

Not sure this is strictly true; my understanding was that "#" was used for pounds (weight) in the US, and the OED seems to support a use in 1923: https://www.oed.com/dictionary/pound-sign_n?tl=true

"2. U.S. The symbol # [...] 1923 Special Signs and Characters..#..Number or pound sign"

Which is pre-ASCII (although it doesn't attest a relation to weight). ASCII did develop from Teletype terminals, but you wouldn't think there was enough interchange of equipment in that era for the #/£ keyboard key to be the origin (and of course screens weren't a thing at that point either).

What does always cause mild irritation is people (including people in computing) calling "#" in code a "hashtag", cf. Twitter. What do you call "#something" if "#" is a hashtag? A hashtagtag?

Panic at the Cisco tech, thanks to ancient IOS syntax helper that outsmarted itself

ibmalone

A potential problem here might have been that it could be safe for the telescope, but not for the attached equipment.

To kill memory safety bugs in C code, try the TrapC fork

ibmalone

Re: calloc?

-ftrivial-auto-var-init=pattern

This is a nice trick I didn't previously know about, although it obviously doesn't help with malloc'd heap memory. I've found libasan + -fsanitize=address useful in the past.
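As a minimal sketch of what the flag buys you (my example; the flag is in Clang and GCC 12+):

    #include <stdio.h>

    int main(void) {
        int x;  /* deliberately never initialised */
        /* Without the flag, x holds whatever was on the stack and any bug
           reading it is non-deterministic; built with
           -ftrivial-auto-var-init=pattern, x is pre-filled with a repeated
           filler byte (0xAA on Clang, for instance), so the wrong value is
           at least reproducible and recognisable in a debugger. */
        printf("x = 0x%08x\n", (unsigned)x);
        return 0;
    }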

ibmalone

Re: calloc?

If the programmer wants something other than 0 to be the starting value, he can explicitly set it - which is no different than the situation today. If you ever have a stack variable that just happens to be preset to the value you want without explicitly setting it, you are relying on side effects of the stack framing which could change the next time you recompile and would definitely fail if you ever ported to another OS or ISA.

Well, this is exactly what I mean. I'm not talking about relying on things being non-zero on allocation; I'm talking about zero only being a suitable initialisation in certain trivial cases anyway, and it generally being preferable to use explicit, appropriate initialisation. Calloc is one kind of that, but I'm not sure I'd propose always using calloc instead of malloc just because sometimes zero is what's wanted. As you mention, global variables default to 0, but it's difficult to tell a global variable someone wanted initialised to 0 from one where they just forgot to give the correct value. An example might be a volume in medical imaging, where NaN may be a more appropriate way to initialise the data in certain cases, although typically people start with zeroes; a more common one would be structures, where normally some initialisation function is needed.

Which is a roundabout way of saying I'd read malloc() as "allocate the memory, still to be initialised" and calloc as "allocate the memory and initialise it to zeroes", i.e. in the second case zeroes are the correct starting value, rather than just a starting value.
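A minimal sketch of that reading (a hypothetical alloc_volume helper, not from any real codebase): in an imaging volume zero is a valid intensity, so NaN is the honest "not yet set" value, and malloc plus an explicit fill states that intent where calloc would hide it:

    #include <math.h>
    #include <stdlib.h>

    /* Allocate an imaging volume where "no data yet" is NaN, not zero. */
    float *alloc_volume(size_t nvox) {
        float *v = malloc(nvox * sizeof *v);  /* "still to be initialised" */
        if (!v) return NULL;
        for (size_t i = 0; i < nvox; i++)
            v[i] = NAN;                       /* the correct starting value here */
        return v;
    }

By contrast, calloc(nvox, sizeof(float)) declares "zero is the value I want"; the all-zero bytes do happen to read as 0.0f on IEEE-754 platforms, but that's a statement about the data, not a shortcut.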

ibmalone

Re: calloc?

Though (in my experience) calloc far more often used than malloc ( unless speed was absolutely critical, & the miniscule overhead of clearing the data was just too much then do not use malloc) , in C would use typically calloc as always safer having "cleared" data than retaining preexisting value that happened to be in that area of memory e.g. there's a small but non zero chance* that if you are testing your pointer data to see if it matches a value, that might just be a match with the "junk" that was present in the memory to begin with.

I've generally been of the opinion that you should initialise correctly, and the correct initialisation of a region is not necessarily all zeros. I've seen bugs where failure to set the correct initial values for an algorithm has been masked by an earlier initialisation (to shut the compiler up? I don't really know why people do this: int somevar=0; and then set the value it really should be later). Calloc is in the same class: if you really want zeros then it's the fastest way to achieve them; if you want a nul-terminated string you'll get one, but it's not the most efficient way. And if using realloc you'll need to deal with it anyway. (There's an even more subtle type of bug, more at the algorithm than the processing level, where starting with an array of zeroes will bias your output; I've been looking at that in a variant of multilevel spline fits for a while now. It's not a coding error as such, but a tendency to reach for things like calloc might encourage the mindset.)
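On the realloc point, a minimal sketch (a hypothetical grow_zeroed helper): realloc leaves the bytes beyond the old size indeterminate, so even a buffer that started life calloc'd needs its new tail set explicitly:

    #include <stdlib.h>
    #include <string.h>

    /* Grow a zero-initialised buffer, keeping the all-zeros invariant. */
    double *grow_zeroed(double *buf, size_t old_n, size_t new_n) {
        double *tmp = realloc(buf, new_n * sizeof *tmp);
        if (!tmp) return NULL;           /* caller still owns buf */
        if (new_n > old_n)               /* realloc left this region as garbage */
            memset(tmp + old_n, 0, (new_n - old_n) * sizeof *tmp);
        return tmp;
    }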

Europe's largest local authority slammed for 'poorest' ERP rollout ever

ibmalone

Without doxxing myself too much (probably not at all...): the local council has a portal which appears to use the same login but is divided into completely different sections, with arbitrary routes into forms and pages that sometimes just take you to a completely empty (as in no fields) form. Absolutely no desire to know what's at the back end. Submitting a change of details for my council tax recently resulted in two further statements being issued, payment not being taken and so being pushed back onto the rest of the year, and then finally adjusted downwards. It all sort of worked out at the end of the day, but you do wonder what's going on behind the scenes.

ibmalone

There's also the issue that councils are meant to be somewhat customised to local needs anyway, rather than rebadged versions of Capita+Veolia+Oracle+Multi academy trust. No, they probably don't need deep customisation and cooperation would be good, but there's a lot of pressure to go with big vendors.

I've probably given before the example of Libraries NI, which is now run on some fairly generic US-centric software, staffed with lots of layers of management (except at the actual branch level, which is increasingly agency staff) and makes purchasing decisions based heavily on what publishers suggest to it. It's the end result of a sort of efficiency-driven managerialism which loses sight of what is actually meant to be achieved.

(Edit, how could I forget Capita?)

Relocation is a complete success – right up until the last minute

ibmalone

I take the point, though dead presidents are less liquid in the UK than elsewhere (they come in a funny aspect ratio for one thing), and I guess the location in question is uncertain. I'll take a stab at a 220-240V locale though, on the basis of current (rather than currency).

ibmalone

Would it pass inspection? Most likely.

Genuine question, as I'm not an electrician, but would it really pass inspection? If the PCs ("et al"!) are plugged into wall sockets and not hard-wired in some way, then that 10A switch is controlling 13A sockets (in the UK at least); don't you have to take into account that something else might get plugged in? In any case my laptop on USB PD will happily pull 80W regularly and 100W at times (I've measured with a metered cable), and that's only because it uses a lower-energy mode when on USB; the normal power supply is 230W. Tower workstations can easily use more, and all computers will tend to pull maximum power during startup (except that maybe the GPU isn't fully engaged).

In the UK the number of sockets on a ring main is unlimited (!, though a ring is meant to be limited in the area it serves) and a ring is 32A. It looks like a smaller radial can be 20A, but even then don't you need an actual isolator rather than a light switch? OTOH, isolators do make a bigger snap when toggled than single-pole switches, so maybe what was wired in was actually an isolator, in which case it's probably fine electrically and just an unwise choice practically.
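For rough scale (my arithmetic, not an electrician's): 10A at 230V is about 2.3kW, so seven or eight machines drawing 300W apiece would already exceed the switch's rating while staying comfortably inside a 32A (about 7.4kW) ring, and that's before anyone plugs a kettle into one of the 13A sockets.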

UK sleep experts say it's time to kill daylight saving for good

ibmalone

Re: Leave the clocks alone

Too late to edit: "has to go GMT, +1. All year helps nobody"s/, \+1. A/. +1 a/

ibmalone

Re: Leave the clocks alone

Exactly how I feel when some Home Counties MP comes up with yet another proposal for permanent daylight saving because they like long summer evenings and think adding an hour to the clock will make that last all year round. Compared with the south, in Scotland and NI the days are longer in the summer and shorter in the winter, on top of which NI is quite far west, meaning that from mid December to mid January the sun doesn't rise in Belfast or Glasgow until after 8:40 GMT; for London it's 8:00 GMT. Sunset on the other hand is about 1600-1630 for Belfast and 1550-1620 for London during the same period, and 1540-1610 for Glasgow (being furthest north and losing out at both ends).

I love long summer evenings that last till 11pm, but if the UK ditches clock changes it has to go GMT. +1 all year helps nobody; you are not actually increasing the number of hours of daylight despite what politicians sometimes appear to think.

So even with +1 DST most people in the south of England will still be at work until sunset anyway, while those in the "provinces" would get to trudge to work in the dark as well.

Pixel perfect Ghostpulse malware loader hides inside PNG image files

ibmalone

Re: I'm confused...

Fun aside, back in the days when I spent time trying to help with Ogg Vorbis metadata (there's a limit to the value you can add when people like Chris Montgomery are doing the heavy lifting), one suggestion that arrived from outside the regular developers was to add a field that would cause a command to be run. It was obviously gently but very firmly rejected. I didn't (and don't) think it was made maliciously, just someone who had some idea for a thing they thought would be cool and had in absolutely no way thought it out fully.

Bandai Namco reportedly tries to bore staff into quitting, skirting Japan’s labor laws

ibmalone

One of the things Graeber (misspelled name previously) discusses is precisely this. It's typically the perception (certainly from the private sector) that it's government and regulations that produce the phenomenon, but as mentioned above there are plenty of instances where it happens in the slim and mean private sector too (flunkies to make senior executives look more important being one example).

ibmalone

Re: It's not that simple

Certainly a thing, but hard to actually argue (in the UK, for example, you have to quit and then pursue a claim against the employer, spending money and time while searching for a new job or getting to grips with one).

Also, that page makes no mention of Japanese law, which can of course be different. Some of the links from the current article do discuss the situation in Japan:

And in Japan as well, these practices have been successfully challenged in court. But even so, a broad swath of Japanese companies continue to use oidashibeya.

The rather fascinating "Employment Law World View" take seems to be pitching "we can give foreign corporations legal advice to help them do it":

When employees steadfastly refuse to leave, they are often reassigned to undesirable jobs, or even placed in special “boredom rooms” with minimal responsibilities and no outside contact, in hopes of inducing a departure. All of these strategies carry their own legal implications and must be employed with care.

The fact there's actually a name for them suggests that, whatever the law might actually be, the practice is not unknown.

ibmalone

Re: Oidashibeya

It's pretty cheap to print your own though...

ibmalone

Weirdly, they really wanted us to stay (for reasons of their status) while having nothing for us to do.

Anyone who has not read David Graeber's book "Bullshit Jobs" on this and related phenomena really should.

Digital River runs dry, hasn't paid developers for sales since July

ibmalone

The Lenovo store also appeared to use Digital River for hardware purchases, at least back at the end of 2021 when I last ordered something from them.

UK ponders USB-C as common charging standard

ibmalone

Re: I have one problem with USB C

Interesting. I see there's a Reddit rabbit hole filled with people very concerned about static arcing and the like, but I can believe in bad-connection issues, particularly with the order of pin connection. The one I have is essentially a shortened C plug and socket with a magnetic surround, which, in the absence of a design standard for these things (something that should be remedied really), is probably the most likely design to work properly. I can see there was a rash of other designs on Kickstarter a few years ago which were more like the actual MagSafe, with pins and pads (and not enough of those to be USB 3).

On mine the "socket" side is attached to the C plug, so sits in the device, offering more shrouding for the exposed contacts. Might change my order of connection to laptop first and then charger at the other end, would let the true USB C connectors take care of any order of connection issues while still providing the mechanical protection. (I'm usually unplugging the cable at both ends anyway.)

ibmalone

Re: I have one problem with USB C

For USB-C fans, there are magnetic USB-C connectors (essentially USB-C to USB-C with a magnetic breaker in the middle) rated for 140W and 240W. I recently added a 90-degree elbow one to my laptop and wonder why I didn't do it sooner: it has tidied up the cables, put less stress on the socket (it's easier to run the cable sideways across the back of the machine without a bend in it) and made it easier to remove (again without repeatedly stressing the socket).

Techie took five minutes to fix problem Adobe and Microsoft couldn't solve in two weeks

ibmalone

Re: Fast Start keeps breaking on dual-boot w/Linux

Locked NTFS, fun times.