Re: I should copyright this
They are generally clueless about how these things play outside the US... all those imperial measurements too and saying "fall". (Assuming one knows they mean "autumn", that is some time off down here.)
You'd be wrong about Luxembourgish. It is a language. Check out the Netflix series "Capitani" – play it in the original language with English subtitles. With the little German I have, I began to recognise words and structure after a while, but it sounds different and there is Dutch and (IIRC) some French in there as well.
The M2 MacBook Air is not available immediately, because there's only one supplier of the notches, and they are affected by the lockdowns in China. Supply chain experts are concerned that, as other manufacturers adopt notches, world production capacity of notches is insufficient. Sources claim Apple failed to stockpile notches after the looming notch crisis became apparent.
Page vii of "The Design and Implementation of the 4.3BSD UNIX Operating System" says it is "Berkeley Software Distribution". The second listed author of this book is Marshall Kirk McKusick, who I think was project leader – first author is Sam Leffler.
This tallies with my own recollection from the 1980s.
(It was widely referred to as "berserkly". For the Brits here, Berkeley is pronounced to rhyme with "perk", not like Berkeley Square in London.)
APL was invented by Ken Iverson (actually as a notation before a programming language) and ran on more than IBM, although the most common implementation was on OS/360. I know it was still staggering along in 1980, being used for turnkey stuff in finance by a Canadian company called I.P. Sharp.
The Lisp-based editor you're thinking of is emacs ("Eight Megabytes and Constantly Swapping", at a time when 4MB was a largish mainframe memory) and it has little to do with Linux. It arose from the MIT AI Lab and ran on pre-Unix operating systems like ITS. Then Richard Stallman reimplemented it with a Lisp interpreter written in C, from where it went everywhere. Up until a couple of years ago you could find it installed on every Mac.
"How is a new CPU or other processor first programmed?"
Usually an *emulator* is written for the target machine, which runs on an existing machine. Instructions are just 1s and 0s, so all you need is the definition of what they are (the *architecture*) and you can write a program to do the same thing on any computer.
For example, when DEC developed the VAX they wrote an emulator for it that ran on their much larger PDP-10 mainframe. VMS booted on the emulator before any physical VAX existed. This allowed operating system and hardware development to proceed in parallel.
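To make the idea concrete, here is a toy sketch in Python. It has nothing to do with the real VAX or PDP-10; the instruction set is entirely invented, and a real emulator would decode binary words rather than tuples. But the fetch-decode-execute loop is the whole principle.

```python
# Toy emulator: an invented 4-register ISA interpreted on a host machine.
# Instructions are (opcode, operands) tuples for readability; a real
# emulator decodes binary instruction words, but the loop is the same.

def run(program):
    regs = [0] * 4          # four general-purpose registers
    pc = 0                  # program counter
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "LOADI":   # LOADI r, value: load immediate into register
            regs[args[0]] = args[1]
        elif op == "ADD":   # ADD dst, src: dst = dst + src
            regs[args[0]] += regs[args[1]]
        elif op == "JNZ":   # JNZ r, target: jump if register is non-zero
            if regs[args[0]] != 0:
                pc = args[1]
        elif op == "HALT":
            break
    return regs

# Sum 1+2+...+5 by looping: r0 accumulates, r1 counts down, r2 holds -1.
prog = [
    ("LOADI", 0, 0),
    ("LOADI", 1, 5),
    ("LOADI", 2, -1),
    ("ADD", 0, 1),      # r0 += r1
    ("ADD", 1, 2),      # r1 -= 1
    ("JNZ", 1, 3),      # loop back while r1 != 0
    ("HALT",),
]
print(run(prog)[0])     # 15
```

Once a loop like this exists for the target architecture's real instruction encoding, an operating system can be booted and debugged on it before any silicon exists.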
"How is a language made to function if it's the first of its type?"
You can find this discussed in a compiler textbook under the name "bootstrapping". There are several different strategies, depending on what you have available and whether you are designing a new language, a new CPU, or both.
If it's just a new language L, you can write the first compiler for it in some other language Q. Then you write *another* compiler for L, this time in L itself, and use the compiler you wrote in Q to compile that. At this point your language is what is called "self-hosting" and you can throw the old implementation in Q away.
On the other hand, if your language is already implemented on architecture W and you want to get it running on architecture X, you first modify the compiler you have on W to still run on W but output a program that runs on X. You then tell it to compile itself to X (still running on W), and copy the executable file from W to X.
When DEC ported VMS from the 32-bit CISC VAX architecture to the 64-bit RISC Alpha, they wrote a compiler to translate the MACRO assembly language into native Alpha machine instructions, treating it as a high-level language. (Some of it even had block structure, using the powerful macros that the assembler did indeed have.)
The irony here is that I've had problems with all sorts of chargers, but never the ones made by Apple. Big Clive, the well-known Youtuber who tears these things down looking for electrical safety problems, confirms Apple, and surprisingly IKEA, chargers as good quality. Some of the Chinese rubbish he takes apart is a shock or fire hazard. I've had no hesitation for years in recommending someone who needs a USB charger (for any device) that isn't rubbish to get Apple. This is on the basis that here in Australia there are Apple stores in many major malls, but usually only one IKEA per city, and my personal experience with the Apple ones.
The Apple chargers I currently own have a USB A port on the power brick, so the cable can be replaced when it wears out. The Lightning cables are weak and I've had to replace a couple while the chargers keep going. There's no question the cables are weak, but on the other hand they're not captive to the charger like some other brands.
Oh, and here in Australia Apple have taken back their products as e-waste at the end of their life, no questions asked.
I think the EU would be better off mandating minimum quality levels for chargers, and product labelling particularly for the Power Delivery feature on USB chargers. Canon, for example, have started releasing cameras that can charge or be powered by USB, but photographers have found that not all USB chargers work with them. One brand that does is Apple, because it correctly implements this PD standard.
According to that article, the skateboarding rhino has a Boris Johnson subunit, with 1 rhino = 15 borises. However, as the reference boris is now lighter than when the article was written, this will no longer be correct. That is, Boris Johnson is an unstable unit.
It's actually a big intellectual leap from a light pen, where you draw directly on the screen, to a mouse, where you move something around to draw on the desk with it appearing on your screen. It's not even obvious the average person would be able to do that: they would have needed to build one first and try it out to know.
The actual hardware construction was probably the lesser part of this invention.
The key combinations you mention are all already supported by the Australian keyboard layout without any special configuration. If I go get a new Mac from the Apple Store 10 minutes down the road and turn it on, those will be active as soon as I select "English" in the setup screen.
I'm guessing if you didn't know that, those characters aren't important to you, and that is why you think it's perfectly all right for them to be hard to type.
And no, I don't want to install third party software like Karabiner into something as important and sensitive as the keyboard input processing.
The keyboard is missing the right OPTION key.
On macOS, OPTION is another type of shift, and therefore to touch type characters accessible via it you need an OPTION key on both sides. To touch type an upper case A you press A with your left little finger (pinky) and right SHIFT with your right little finger. To touch type one of the characters on the OPTION layer you do the same thing, so you need OPTION keys on *both sides*.
The pictured sample is a US layout. Using the default keymap for Australia (same as the US), the following characters cannot be properly typed: £, €, the dead acute accent, the dead grave accent. I don't even live in Europe and would miss those. At least the umlaut and curly quotes are possible, because those are on the right of the keyboard.
Now perhaps you could remap the very rarely-used right hand CTRL, but (i) do they supply a keycap for that? (ii) as the article points out, you'd need a Windows machine, and (iii) why on earth isn't it the default if it's claimed to be a Mac keyboard?
Did you miss the bit where you could put your iPad down on the desk beside your Mac and then push the Mac's mouse pointer through a hot edge on the Mac screen so that you can click and drag on the iPad? And then drag a file between them?
Or where people in two separate households could watch Disney's streaming service on their living room TVs attached to Apple's "Apple TV" media box, *synchronised*, while chatting in real time on their iPhones?
Or the OCR text recognition on photos in the album?
I've never personally wanted any of those things, but surely you can see the appeal for the average person. And none of them is technically trivial to do, particularly the synching of TVs in separate houses. They must be calculating time-of-flight like NTP to pull that off.
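For anyone curious what "calculating time-of-flight like NTP" involves, the textbook arithmetic looks roughly like this. To be clear, this is the standard NTP offset/delay formula, not anything Apple has published about how their TV syncing actually works.

```python
# Standard NTP-style clock offset/delay arithmetic (the textbook
# formula, NOT Apple's actual protocol). Timestamps:
#   t0: client sends request      t1: server receives it
#   t2: server sends reply        t3: client receives it
# t0, t3 are on the client's clock; t1, t2 on the server's.

def ntp_offset_delay(t0, t1, t2, t3):
    offset = ((t1 - t0) + (t2 - t3)) / 2   # how far the client clock lags the server
    delay = (t3 - t0) - (t2 - t1)          # round-trip network delay
    return offset, delay

# Example: client clock 0.5s behind the server, 0.1s each-way delay.
offset, delay = ntp_offset_delay(10.0, 10.6, 10.6, 10.2)
print(round(offset, 6), round(delay, 6))   # 0.5 0.2
```

Knowing each party's offset from a common reference, the devices can schedule "start playing frame N at reference time T" and stay in step despite different network paths.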
I've had a look at the list of supported Macs now:
(scroll down almost to the bottom, and feel the 1970s with the white text on purple).
I do think it's a bit mean of them not to support the mid-2015 27″ 5K Retina iMac, which could be had as quite a capable machine with up to 1TB of SSD (so not cheap). The next model (and the 2014 Mini) used Intel integrated or Iris graphics instead of a 2GB ATI GPU, so perhaps they just don't want to spend the money on porting the driver. If so, it would be in their interests to explain themselves, including reasons why the driver had to change at all.
Not only will iOS 15 run on anything that will run iOS 14, but in a change of policy iOS 14 will continue to get security updates.
I haven't looked closely at what Macs will run Monterey, but it does include the Mac Mini 2014, which rumours had said it wouldn't, and iMacs back to 2015.
Looking at the stated reasons in their github post, we have:
3. They want to know whether they should drop support for macOS 10.10.
4. They have a known issue with the new 3.0 file format that they want to collect more data about.
Maybe I'm old-fashioned, but if their new file code is corrupting data, it seems to me they should roll back to 2.x, properly test and debug the new code, and re-release 3.0 when it's actually complete. I'm not a fan of development by trial and error no matter how many data points you gather, and I think even if people aren't paying for the software they have the right to expect it does not corrupt their audio data.
As for point 3, I no longer have anything resembling 10.10 to test on, but by now I would be expecting that version of macOS to be having problems with root certificates and cipher suites such that there is no guarantee it can even talk to the telemetry servers. And as a Mac developer, I'm wondering why they are wasting their time faffing around with development environments ancient enough to still target it. Leave these people be with whatever version of Audacity was current with their version of macOS.
What was wrong with chargemaster.bp.com and later pulse.bp.com? They are never going to let bp.com lapse, and I can tell at a glance that the email came from somewhere inside BP. You know, the brand recognisable all over the world.
This reminds me of the time I got an email ostensibly from my bank but from an apparently unrelated domain. A search revealed the odd domain was actually a slogan from their current television ad campaign. I don't watch commercial television so had never seen it.
It is true that she travelled to Sydney, but hardly surprising that she might want to wait and think before finally going ahead with a formal report, given who she was accusing. Presumably during her visit to Sydney (I thought it was one visit, 4 interviews) the detectives briefed her about the likelihood of a successful prosecution. So then she would go home to Adelaide, think about it and discuss it with friends there before making a formal statement.
I'm not sure she'd even completed the 14 days. The NSW police did say they didn't want to do it over Zoom without a psychologist present. What they conspicuously failed to say was "we told her we'd arrange for a psychologist to sit with her when we took her statement over Zoom as soon as the quarantine was over". Presumably she was already seeing one.
I am not saying the historical rape occurred, and whether it did or not isn't really the issue.
It is the way the Prime Minister passed his copy of the dossier on to the wrong police force without even reading it, and his failure to act on the Higgins affair, and his comment that the women demonstrating outside Parliament are lucky they weren't shot that have landed him in hot water.
No, it wasn't his purpose, because the entire country is already talking about the issue. It is the major running issue in Federal politics, with a rape scandal in Parliament House and a senior minister separately accused of historical rape. It's either a well-intentioned (if slightly ludicrous) response from the Commissioner or he's trying to deflect from his force's inadequate performance on the historical rape.
For background on the historical rape allegation: the Cabinet minister was accused of behaviour in a Canberra bar inconsistent with his goody-two-shoes Christian image by Four Corners (like Panorama). When nothing was done, the alleged historical rape victim was incensed and decided to report the crime, which occurred in NSW. She was resident in South Australia and made the report when COVID was still a problem in Australia. The NSW Police declined to either seek an exemption from COVID travel restrictions and go to Adelaide, or interview her on Zoom with South Australian police present, citing their concern for her welfare. On hearing this, she committed suicide the next day.
The Prime Minister and NSW Commissioner have been hiding behind the ridiculous proposition that because there is no formal statement, the crime cannot be investigated so the Minister must be innocent. The proposition is ridiculous because the victim was a professional historian and put together a 31-page dossier including diary entries from the time of the rape, who she told about it over a period of decades etc., which was sent to senior politicians of three parties, including the PM himself. It has been kicking around Canberra for months and eventually leaked.
Four Corners then doubled down and broadcast a second program. The Cabinet minister sued them for defamation. The ABC will argue truth, and try to prove the rape in court.
So no, it wasn't an issue not under public discussion!
The article doesn't define what an "autonomous database" is, and I'd never heard the term, so I Googled it. Every hit on the first page is a hit for "Oracle autonomous database". It appears to be an Oracle invention for tuning and management software with AI assistance. I don't think The Register should be normalising the term by using it as if it is valid industry jargon.
It's a "good hack" if you can't later clearly state a reason why it should be replaced by something done "the right way".
The ".." in Unix paths (as in, there was originally an actual directory entry with string "..") was a "good hack" because it is hard to argue that anyone would actually want to create a file called "..".
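The reason the hack works so well is that ".." lets a relative path be resolved with nothing more than list manipulation. A purely lexical sketch in Python (this ignores symlinks, which is exactly why a lexical normalise like os.path.normpath can disagree with what the kernel actually resolves):

```python
# Lexical resolution of a relative path against a working directory.
# ".." simply means "drop one component"; "." and empty components
# (from doubled slashes) are ignored. Symlinks are NOT handled, so
# this can differ from the kernel's real pathname resolution.

def resolve(cwd, relative):
    parts = cwd.strip("/").split("/") if cwd != "/" else []
    for piece in relative.split("/"):
        if piece in ("", "."):
            continue
        elif piece == "..":
            if parts:               # ".." at the root stays at the root
                parts.pop()
        else:
            parts.append(piece)
    return "/" + "/".join(parts)

print(resolve("/usr/local/bin", "../lib"))    # /usr/local/lib
print(resolve("/home/user", "../../etc/."))   # /etc
```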
Photoshop 22 beta runs natively on M1 Macs:
Apple have been purging the APIs of nonportable stuff for several years now. For example, the provided calls for atomic increment and decrement were deprecated in 10.12, five years ago, with developers advised to use C++ std::atomic<> instead. Any use of a deprecated function results in a compilation warning.
The kernel extensions are going away for Intel as well.
The commitment to 5 years' support for Intel Macs is so that owners of existing machines have software to run.
It very much does, and IIRC it is one of the most popular websites in Australia. Currently rubbing its little virtual hands in glee pointing out the ABC also has apps.
The issue is, people in some of these tiny places *do not have* affordable access to that website, or any website. The internet for them is a phone on 4G pointed to Facebook. How that came about, I don't know, but presumably Facebook did the telco a deal.
It's all in the article I linked, on the very website you complain does not exist.
The ban has badly affected some small island nations:
TL;DR: a telco serving several of these countries offers unmetered Facebook, which means it is *de facto* "the internet" for large numbers of people there. Australia's national broadcaster built out a tailored news service for them based on Facebook, with 100,000 users who now have no access to it.
Some of these countries are so small they don't even have their own currency, and use ours.
So, no, the sky didn't fall in for *us*.
People should not put their faith in these distance-based apps. They try to detect whether you have been within 1.5m of a COVID-positive case for a significant amount of time, but it is now known that the virus is spread by aerosol. That means if you are within the same poorly-ventilated space as a case you can be exposed, and you don't even need to be very close.
Of particular relevance to Japan, if you're in the same train carriage you could be infected. In the case of the Melbourne outbreak over the New Year, where you can walk through between carriages, they required the whole trainload to get tested and self-isolate.
Australia's app was (and possibly still is) also a dud, and the technical community tried to get the government to admit that at the start of the pandemic, but people are no longer wound up about it and the app is mostly ignored. What we do have on our phones are QR check-in apps from our state governments, so it's known who else was in that space. We're also almost cashless here in Melbourne, so there's a timestamp on places like supermarkets and take-aways which don't require QR check-in (some states do). If a case leaks out of quarantine, as happened this week, a list of exposure sites is posted within hours, and the government will phone those known to have been there at the time.
New Zealand never went down the proximity-app rabbit hole and jumped straight to QR codes from the very beginning.
The odd "bug check" terminology was straight out of VMS. There was a companion "machine check" for when the hardware noticed something wrong with itself.
It always seemed wrong to me, because you can check for a bug and not find one. On the other hand there is the mental image of some Microsoft engineer taking his shiny bug out of the hangar and doing a pre-flight check on its wings, eyes, antennae etc., so that the user would know the bug was, as usual, ready to bite.
The reason was historical, and predates both Linux and Unix.
Long ago, there were computers that had no lower case. Some of them used 6-bit codes. Others used character codes that could accommodate lower case, but their printers could not print lower case. (Back in the day, the better printers didn't use a matrix of pins, but "chains" carrying lines of raised characters that would be struck against the paper.) There were even a couple of years there where ASCII had not yet got its lower case characters. So, while there was usually some way to upper and lower case the content of your files, the last thing you needed was for their names to consist of characters you might not be able to see or type. Hence, the native form of file names was generally UPPER CASE, and the UI would helpfully convert lower case names to upper case so you at least had a fighting chance to get them in the editor or feed them to the case-change utility.
And naturally, if you had a proprietary operating system where file names were case-insensitive, your customers had got used to it, and it would also be a hassle for them to reformat all their volumes. So things carried on long past the point where lower case support was ubiquitous.
If you grew up with it, it does not seem so strange. (Except, I imagine, if you're Turkish, because the lower case of "I" in Turkish is not "i". And that is the real problem with case-insensitive file names – whose case rules?)
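You can see the Turkish problem in two lines of Python, which uses Unicode's default, locale-free case mapping (the dotless lower case i is U+0131):

```python
# Unicode's default (locale-free) case mapping, as used by Python's
# str methods, round-trips badly for Turkish: the dotless "ı" (U+0131)
# uppercases to plain "I", but plain "I" lowercases to dotted "i".

print("ı".upper())   # I
print("I".lower())   # i

# So the "obvious" case-insensitive comparison decides that the upper
# and lower case forms of the same Turkish letter are different names:
print("ı".lower() == "I".lower())   # False
```

A file system that folds case with the default mapping gets Turkish names wrong; one that folds per-locale gives different answers on different machines. Neither option is good, which is the point.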
It seems not:
This comment says it was caused by building using an old version of Apple's Xcode developer environment:
That must be some old hardware they were rocking - even my 10-year-old cheesegrater will run Mojave. In February they moved on to Xcode 11 on Mojave, and those builds are no longer blurry on Catalina.
But it broke again for Big Sur, the newest release from about a month ago which you get if you buy a Mac today:
The comments are instructive: they don't know why it doesn't work, and a developer who worked on the original code for Mac does not currently have access to a Mac. I'm also going to guess the developers haven't been going to Apple's WWDC developer conferences and talking to Apple's engineers there. It looks like these problems might have gone away sooner (running 2 years now), or perhaps not arisen at all, if LibreOffice had had more money.
(There's a fork called NeoOffice which claims to work better on macOS.)
All right, I had another look. It is in 8 point type in dark grey text on charcoal at the bottom after the privacy notice and other stuff no-one ever reads in the page footer, followed by a link whose name is that of another free software product. I didn't see it the first time, and I don't think any normal person would either.
It is not my job to "parse" a page. Or my job to "contribute" to open source, or "get involved" or anything else. Nor is it "techie" to put up with user-hostile layouts. I told you I didn't agree with Stallman's ideas.
These guys are competing with the excellent word processor/spreadsheet/presentation suite that Apple gave me *for free* with my Mac, which also runs on my phone. Or with Microsoft, to whom many ordinary users are happy to fling a relatively small amount just so they don't have to learn anything different (and Microsoft throws in a shedload of cloud storage too.)
I was referring to the claim that Stallman's motivation was that he didn't want to pay for things, not whether he uses LibreOffice or not. He won't use those Linux distributions which, although they cost nothing, have binary blob drivers in them, which shows he has other reasons. I really don't know whether he even uses a word processor; he's said to do everything in plain text in emacs like it's still 1978.
As for "free as in beer", I've never understood what that is supposed to mean. Maybe I have to be American, or have lived through an Animal House-style fraternity to understand. Anyway, on a quick scoot around the LibreOffice web site (oh! my eyes!) I could not find a license statement, but I don't think it could be GPL because I once bought a word processor for macOS that contained some of it embedded, to do the conversions.
I disagree with most of Stallman's ideas, and from reports I think I'd dislike the man if I met him, but this claim is unfair to him. He is known not to use certain software which is available for no payment, because it does not satisfy his stricter definition of "free".
Furthermore, he has in the past paid money, on behalf of GNU, to people to write software which GNU gives away.
Bjarne Stroustrup is Danish, but he developed C++ at Bell Labs. One of his design constraints was that it was targeted at people who already knew C but not Algol. Otherwise, he might have done a mild update of Simula-67 (which he knew). The culture has a big effect. (Stroustrup's mathematical elegances shine dimly through the New Jersey crud in stuff like the way inheritance works and the operator overloading. Perhaps not coincidentally, some people hate those things in the language.)
Guido van Rossum, on the other hand, invented Python at the Dutch CWI (their national maths and computer science research institute), the very same place that produced Algol-68.
Erlang was developed by Joe Armstrong (British) and others at Ericsson, a Swedish company. Again, he was given the freedom to do it properly and his language was not dismissed as "too abstract" or "too inefficient". It's the culture, not the nationality or first language of the originator. Erlang was probably a big reason why Ericsson's exchanges sold so well; they were better.
I have to disagree that Pascal and Modula-2 were considered inefficient at the time. Both were used for real-time work – I've personally seen the firmware on a network card done in Modula-2.
Ada might be too verbose for your taste, but it is consistent.
Pascal took a lot longer to sink than the others, but there's not a lot of it floating around now. (Back in the 80s, I built utilities with it that had thousands of users.) It's true it left a trace in Swift's and golang's declaration syntax. I can't think off-hand of any other way it influenced modern languages; everything else came from Algol-60.
Python is, I would argue, somewhat elegant within its original intended purposes. It's flawed, e.g. the "__init__" muck, but then I did say I "quite liked" it rather than listing it with the others.
Translation: "We used a nifty new language and had to hire an expert when the organisation sponsoring it threw in the towel. Good thing they didn't keep at it but take the language in some direction inappropriate for our uses, because then Klock might not be available to us."
I'll reserve judgement on the language itself, other than to say, whatever the merits of the borrow-checker concept, it seems to be almost as much like line noise as perl or TECO. At some point I'll learn enough Rust to form an opinion, but I'm afraid it's behind Erlang in the queue.
I'd prefer if the next language did not hail from the Anglosphere - every language from there (B, BCPL, Bliss, C, C++, perl, go, Swift) has lacked taste. Yet a succession of elegant things from other countries (Pascal, Modula-2, Simula-67, Ada, and probably if I knew them Erlang and OCaml) have sunk almost without trace. I quite like Python - guess what, it's originally from the Netherlands.
It's almost as though dominance of languages has little to do with their merits and more to do with what big American companies like.