* Posts by Kristian Walsh

1497 posts • joined 10 Apr 2007

Classy move: C++ 20 wins final approval in ISO technical ballot, formal publication expected by end of year

Kristian Walsh

Re: Is C++ becoming too large and complex?

No madness, just perhaps a misunderstanding of what a declaration is for, and an attempt to misuse a keyword for its side-effects.

Because that is the whole point of auto - deduction of a type from an initializer. It is for exactly that purpose - to allow you not to be specific.

You can't keep hand-waving all the way to the metal. At some point, you have to be specific, and for a variety of reasons, allocated class member types are one of those places.

Incidentally, you are allowed to use auto to declare a field of a class/struct, but only where that field is both static and const (and the initializer is a constant expression or literal). static const fields are special, though: they're allocated and assigned just once and shared by every instance (sometimes they don't even get allocated unless your code tries to take the field's address). This single point of initialisation may give you an idea why they can easily be allowed to use auto, while per-instance members are not... hint: consider how you'd declare a constructor that sets your Fred::x field.

Even if it didn't cause problems for code generation, your desired solution would only work for fields that have a default initializer, and thus it is only half a solution for the legitimate requirement of needing a field that can always contain the return-value of a given function. Luckily, the people standardising the language did consider this, and so if you need a field type that always matches the return type of a function, you should be using decltype(), which can be used regardless of initialization status. (auto can be seen as a special-case of decltype() for use when declaring variables and constants)

struct Fred { decltype("gotcha") x; decltype(foo()) y = foo(); };

... and this syntax also allows you to write a constructor, or any other function, whose argument type will match the return-type of a function, or match the type of any other declared constant.
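
For illustration, here's a compilable sketch of that last point - foo() is just a stand-in for whatever function you need to track:

```
#include <iostream>
#include <string>

std::string foo() { return "gotcha"; }   // stand-in for any existing function

struct Fred {
    decltype(foo()) y = foo();   // y is std::string; its type always tracks foo()'s return type

    // The constructor's parameter type tracks foo()'s return type too:
    explicit Fred(decltype(foo()) initial) : y(initial) {}
};

int main() {
    Fred f{foo()};
    std::cout << f.y << '\n';    // prints "gotcha"
}
```

Change foo()'s return type and the field, the constructor and every caller using decltype(foo()) follow along automatically.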

Kristian Walsh

Re: Is C++ becoming too large and complex?

struct Fred { auto i = 43; };

Why would you ever want to do this? If you're not able to be specific about the datatypes used in a DECLARATION, then maybe look at using Python or something. No typed language with type inference allows this kind of "yah, whatever you're having yourself" declaration.

If I were maintaining that code, even if that syntax were possible, I would switch it back to int, so that the intention of the programmer is clearly visible.

Or, to put it another way, what I like about C++ is that I don't have to deduce what could be in a data-structure by looking at field assignments.

DPL: Debian project has plenty of money but not enough developers

Kristian Walsh

Re: Oh dear

You've hit on the major weak point of the volunteer FOSS model. While many volunteers end up being paid to work on their hobby, many do not, and end up pulled between having to make a living and fulfilling their personal commitment to a project. Shortage of time leads to poor progress, and complaints from users who just assume that package maintainers do nothing else but maintain packages.

I'll be charitable and say that it's often the thanklessness of maintaining a package that causes the aggression - any small request runs the risk of being the straw that breaks the camel's back. (That said, yes, there will always be arseholes in every sphere of human activity)

As for what to do with the money: Once it becomes practicable again, why not pay for people to meet up more often in person? It's time we stopped assuming that every programmer is an asocial hermit - it's simply not true, not even for Linux. An easier opportunity to meet contributors and maintainers would also go towards solving the problem of finding new contributors and maintainers.

In the frame with the Great MS Bakeoff: Microsoft sets out plans for Windows windows

Kristian Walsh

Re: What about what shows on screen?

That CPU argument was valid for Vista and 7's eye-candy modes, but it doesn't hold water today. Windows 10's window design requires fewer drawing operations than the one used in Win2000. A standard window with a title-bar, title and the three control buttons in Windows 10 needs just eight graphic operations to draw. Shadows are done by the GPU at no CPU cost.

Kristian Walsh

I agree in principle, but not with the alternatives you offer.

Web applications are horrifically inefficient, and subject to strange behaviour depending on client. When I write a UI, there are times when I need different components to align to the exact same pixel - doing that in web still degenerates into a nightmare of rules and javascript in too many cases.

Plus, and this is a big one: right now, moving to Web means you get no code reuse from an existing C application. That's a problem for codebases that have decades of customer requirements baked into them.

Adopting a cross-platform development framework for C/C++ code makes more sense, but I'd add a recommendation to use Qt instead of Gtk+. Yes, it has those odd SLOT/SIGNAL extension macros, but at least it doesn't force you to learn how to write a graphics toolkit just to do simple things like custom list cells and data bindings, and QtQuick is still so much easier to get good results with than anyone else’s GUI markup system.

Kristian Walsh

Re: Two different windows are a problem ... so add a third!

Architecturally, UWP and Win32 are not compatible. UWP is designed around there being a graphics rendering pipeline, whereas Win32 has the underlying concept that there is a grid of pixels in memory somewhere for everything, and it is shared between applications.

This can be frustrating at times (e.g., in UWP, most loaded bitmaps are just lightweight handles into GPU surfaces: you cannot access the pixels in a BitmapImage control unless you explicitly render the control into your own memory buffer), but the upside is that the GUI operates tens to hundreds of times faster, never needs buffering, and you are able to offload all GUI assets into your graphics card's onboard memory rather than losing application memory for it.

This approach seems to be a way of re-writing what can be rewritten of the old API using the new architecture, and leaving the rest in a kind of compatibility "island" drawing surface within those new windows.

The closest similar problem was what Apple faced when moving MacOS to the pre-emptive multitasking kernel of OS X in 2000 (at least Win32 was designed with multitasking and thread-safety in mind; Mac Toolbox wasn’t). Apple’s approach was to simply bin everything and tell developers to use a completely new API; an approach that, as I hope is obvious, is something that would never work in the Microsoft ecosystem, where developers have a much more robust and equal relationship with the OS platform vendor.

Kristian Walsh

It's funny to see X11 proposed as a model for Microsoft to follow. You guys must really hate Microsoft…

If I want bandwidth-efficient, remote application access from a Linux system, I use SSH.

If I want a graphical UI, I accept that I have already demoted bandwidth efficiency in my priorities, so I use VNC or another system that works on raster images. Given the way GUI applications really draw their window content, as opposed to the way X advocates would like them to, there isn't much difference between shipping window-content bitmaps over the network versus the drawing instructions, and sometimes the bitmap is actually more efficient, in cases where the display is composed of tens of thousands of drawing operations.

X11 was a system designed for a different, bygone, world, where centralised computers ran applications that were viewed on thin-client terminals (yes, I know the terminal isn't a "client" in X, but it is one in network terms). That is not how the majority of graphical application systems are used now, and there's a good chance that it never will be.

Anyone else noticed that the top countries for broadband speeds are well-known tax havens? No? Just us then?

Kristian Walsh

Measuring what's easy to measure, not what's significant

This survey is the equivalent of estimating a country’s road traffic journey times by using the average top speed of the vehicles sold there each year.

First error: by using speedtest data, they’re basically just measuring the speed between the user premises and the next hop in the network. That doesn’t tell you about backhaul provision (hi Cable ISPs!) or consistency of service (ibid.). Use an “off-brand” speedtest website and you can see your reported download figures tumble.

The other problem is that speedtest results are shaped more by the providers' pricing structures than by actual network capability. If providers in Country A give a 20 Mbit/s service for €15/mo, an 80 Mbit/s service for €20/mo, and a 200 Mbit/s service for €35/mo, people like me will take the cheapest one that exceeds their peak bandwidth requirement (which, in my case, never rises over 20 Mbit/s). Lots of users living on their own, who want Internet for occasional use, will also go for the services with low bandwidth caps on cost grounds. But in another country, where operators only offer the 200 Mbit/s option for €35/mo in an effort to maximise revenue, that country's network will be considered faster, even though both networks have the same capacity.
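
To put some entirely made-up (illustrative only) uptake numbers on that:

```
Country A: 60% x 20 + 30% x 80 + 10% x 200 = 12 + 24 + 20 =  56 Mbit/s average
Country B:                        100% x 200               = 200 Mbit/s average
```

Same cables, same capacity; Country B "wins" the survey purely on pricing.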

The only legitimate way of producing this survey's results would have been the hard way: discover the share of domestic connections by carrier type (DOCSIS, xDSL, FTTP, UMTS), then apply an empirically-derived performance figure per technology (and generation) to those connection counts (thus, a 1 Gbit cable connection scores far, far less than a 1 Gbit FTTP), and look at the amount of trunk network capacity in the country.

Happy birthday to the Nokia 3310: 20 years ago, it seemed like almost everyone owned this legendary mobile

Kristian Walsh

Lipstick phone

Ah, the Nokia 7280. That's really a product of designers and engineers (both asking "how small can we make this?") winning out over management's requirement to make something cheap. All the 7xxx phones were basically "screw it, we're rich: go do something mad!".

I knew someone who had one. It’s actually much easier to use than you think once you’ve got all your phonebook on the SIM (remember, this is from the days when your contacts and last ten SMS messages were stored on your SIM card). Entering your phonebook, though…

Real Crazy Finnish Design was the 7600 (a square/leaf-shaped phone with a screen in the middle and keypad buttons along the edge), or the 3600: for some reason this had the buttons arranged in a circle on a surface that would easily have accommodated a usable keypad.

Relying on plain-text email is a 'barrier to entry' for kernel development, says Linux Foundation board member

Kristian Walsh

Re: "plain old ASCII text is a barrier to communications"

Yes and no, really. ASCII-1963 was published as an uppercase-only alphabet; CCITT and ISO used this as a base for a two-case system, which began standardisation in 1964 and was published in 1967. ASCII was then revised in 1967 to be fully compatible with the new ISO standard. I suspect that a clean-sheet ISO design would have had fewer control codes (most likely just 16) and more printable glyphs, so that the national replacement characters wouldn't have clashed with punctuation characters that were often used as field delimiters in documents... despite ASCII having four control codes specifically for delimiting fields, in the form of FS, GS, RS and US, it seems that nobody ever used them. (@jake, you're at this game much longer than I am, maybe you've come across something? Personally, I often use these as delimiters when I need to bundle multiple outputs from one *nix tool that may contain tabs/spaces/linefeeds into another shell script. They're ASCII-safe and almost guaranteed not to already be in the data being processed.)
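
To make that bundling trick concrete, a minimal sketch - in C++ here, but the same idea works from any language or shell:

```
#include <iostream>
#include <sstream>
#include <string>

// Illustration only: bundling fields that may themselves contain tabs,
// spaces or linefeeds, using Unit Separator (US, 0x1F) between fields and
// Record Separator (RS, 0x1E) between records.
int main() {
    const char US = '\x1f', RS = '\x1e';
    std::string bundle = std::string("first field\twith a tab") + US
                       + "second field\nwith a linefeed" + RS
                       + std::string("another record") + US + "more data" + RS;

    std::istringstream records(bundle);
    std::string record, field;
    while (std::getline(records, record, RS)) {      // split records on RS
        std::istringstream fields(record);
        while (std::getline(fields, field, US)) {    // split fields on US
            std::cout << '[' << field << "]\n";
        }
    }
}
```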

But ASCII is the classic example of how an inadequate but ubiquitous standard drags everything down to its level. Even by the standards of the 1960s, and even in the fiercely monolingual USA, it had considerable shortcomings - most notably, it is incapable of correctly encoding the proper names of places in parts of the USA that were first settled by the French or Spanish. Sadly, this shortcoming was then set in stone by the US Postal Service, which disallowed accented letters in official “preferred” addresses, simply because at one time its OCR and database vendors could not handle them or their terminals couldn't enter them. (Many state DMVs did the same later with names on driving licenses)

Back when I visited the Bay Area a lot, there was a sign I used to pass on I-280 for what had originally been signed as “Cañada College” but was by then misleadingly spelled “Canada College” thanks to this policy — as someone who was working a lot with text encoding and processing at the time (I wrote software localization tools), what struck me about it most was that you could still see the shadow of the tilde that had been removed from the sign-face after the official street-address changed; obviously someone was perfectly well able to deal with Spanish names back before computerization came along to limit their horizons.

(Actually, a quick look at Google StreetView shows me that a newer sign for the Cañada Road exit has regained its tilde, so there is hope for the future... although that’s one ugly tilde)

But on the original topic of kernel submissions, the biggest problem with insisting on “ASCII” is that nobody uses it, because there are pretty much no 7-bit text computer systems in use anymore. That leaves “an unspecified superset of ASCII” as your de-facto standard. (If I had a buck for every time I heard someone say “8-bit ASCII” like it was an actual thing that had ever existed...) To avoid this ambiguity, Linux kernel submissions are actually to be submitted as UTF-8, not ASCII, as this allows maintainers to use their actual names in those submissions, not some bastardised, accent-stripped versions of them.

Kristian Walsh

Re: "plain old ASCII text is a barrier to communications"

"a universal communication medium. Look up the acronym. "

I didn’t need to. ASCII stands for American Standard Code for Information Interchange.

Hmmm.. Very universal. At least it’s not EBCDIC, I suppose.

Aw, Snap! But you should see the other guy – they're in dire need of a good file system consistency check

Kristian Walsh

Re: I don't care....

Make the on-card filesystems read-only, put all writable stuff into a ramfs, and there's no issue with using SD cards. Flash is destroyed by writing, not reading, and for something like this that’s just a thin-client to a web-hosted service, there’s no need for writable persistent storage on the device.

I would, however, have also designed this with a recovery image so that any corruption of that main flash could be blasted away easily by a quick dd, but as it’s a mobile unit anyway, access for repair is probably not as big an issue as for fixed stuff.

Sun welcomes vampire dating website company: Arrgh! No! It burns! It buuurrrrnsss!

Kristian Walsh

Re: Inappropriate garb

Yep. As someone with a lifelong aversion to "suit-and-tie" dress codes, I don't really care what a candidate wears to an interview, or what they wear at the office, but there are limits: it has to be intact and, most importantly, it has to be clean.

Splunk sales ace wins sex discrimination case after new boss handed her key accounts to blokes deemed 'flight risks'

Kristian Walsh

Re: Sex discrimination or just bad management?

No. Blame lands firmly with the new manager.

First, sabbaticals are a form of compensation, and for senior staff who rarely get to use scheduled calendar leave without interruption, they can be the only real form of leave available. As a form of compensation, they must have no downside. Your employer doesn't say "here's a bonus, but we're stealing your monitors". If your situation of employment leads you to think that asking for extended leave and being granted it will hurt you professionally, you need to find a new job.

Second, it's a very poor manager who takes over a new position without first determining who's good and who needs improvement. Having a top performer on sabbatical isn't a disaster: it opens up opportunities to advance existing staff. What it doesn't do is give you carte blanche to hire in your old chums instead. That's bad management. (My own rule of thumb is that the quality of a manager is inversely proportional to the number of hangers-on they bring along with them in each role).

The sexual discrimination was in paying the new, untried buddies more in basic wages and OTE than the proven successful female employee. There's no ifs and buts about that.

The constructive dismissal was refusing to return the top contracts to the higher-performing agent on her return, instead leaving them with the manager's friends who (we must assume, as it was not put forward as a defence) were not managing them as successfully as she had. Or, in other words, creating a situation where the employee feels they have no other option but to resign.

Apple's at it again: Things go pear-shaped for meal planner app after iGiant opposes logo

Kristian Walsh

Re: In an alternate reality somewhere

The Macintosh had lots of advantages over Amiga, but the design of the system software and the all-in-one form factor were the biggest ones. Amiga was a spectacular home computer, and it started the whole category of “multi-media”, but it really wasn't good for productivity software once you moved away from video graphics - there wasn't any word processor as easy to use as MacWrite, or a spreadsheet as easy to use as Excel (the original version of Excel was developed for Mac; there was never a DOS version).

The question of whether Macintosh was worth the crazy price Apple charged is a different one, and the answer to that is probably “no” for people like you or me who come from a technology background and know how this stuff works, but for many buyers, the price was definitely worth the savings they gained in being able to use a computer to write their documents without having to memorize arcane command sequences.

The consistency of the Macintosh, even when you went from one vendor’s software package to another’s was completely revolutionary: Apple was the first computer maker to take this idea seriously, and so the developer documentation didn’t just tell you what the libraries could do, it also told you how you were supposed to present these functions to the user. (This was a key concept in Macintosh from its beginnings as a text-based system under Jef Raskin’s direction, long before Jobs arrived and turned it into a GUI system to undermine the Lisa project that he’d just been booted off)

Linux Foundation starts new group to build pandemic-popping software

Kristian Walsh

Re: Corona Washing

These apps are not a replacement for contact tracing.

The idea of determining "dangerous areas" is a red herring, because it's not places that transmit the disease, but people and their actions: standing in a hall is not dangerous, but attending a choir practice in that hall is. The idea that there can be "red" and "green" zones actually increases the spread of disease: crowds won't go home when a popular area is marked "unsafe", they just congregate - at the same density - in a "safe" area... which will become unsafe simply by hosting so many people in close proximity. It's a kind of epidemiological whack-a-mole, which can end up increasing the geographical spread of the illness.

The app's function is simple: alert people when they have been in contact with someone who has self-reported symptoms. At that point, the person contacted can restrict their own movements to avoid any further spread of the disease - this is very important if the contact has any kind of job where daily contact with many people would make them a multiplier of the infection (not really a factor for most Reg readers, I grant you...).

It's only one part of a set of tools for managing the outbreak. The other parts - personal hygiene, widespread testing, contact tracing, physical distancing - don't become irrelevant just because your phone's got an app.

Kristian Walsh

Re: Correct me if I am wrong

And as far as I can see, as a Node.js project, the NearForm app doesn't even link against Linux, so I don't see why they're getting involved with it. Apart from some virtue-signalling, I guess...

You've had your pandemic holiday, now Microsoft really is going to kill off TLS 1.0, 1.1

Kristian Walsh

Yes, but they did do a "clean start": it was called UWP. Developers looked at it, then wanted to keep the duct-tape version, because they had a pile of code written for it already that worked, and porting costs have an ROI of close to zero.

This new initiative (which, if you ask me, should have been started at the same time as WinRT back in 2012) lets developers keep their existing code, and use the useful bits of the new APIs.

Incredible artifact – or vital component after civilization ends? Rare Nazi Enigma M4 box sells for £350,000

Kristian Walsh

Re: Super duper encrpytion device brought down by simple mistake

Four-rotor Enigma was reconstructed because in 1938, when the German military shifted to four/five rotor use, the Nazi Party's own internal security service remained on three-rotor machines, which created an abundance of messages that had been encoded in both algorithms. The work of reconstructing the four-rotor Enigma was done in Poland at the very start of the war.

At no point did Enigma rely on "obscurity" in the sense that cryptographers use that word. The mechanics of the device (i.e., the algorithm itself) were known from the 1930s, and while the addition of a fourth rotor made life very difficult, that job wasn't made much easier when the 4-rotor Enigma machine was fully described.

The Germans knew that Enigma could be cracked, and they knew that four-rotor Enigma could be cracked too. However, their predictions of how long any such crack would take were based on an assumption that proved to be untrue: that the Allies would not invest huge resources into cryptanalysis, and would instead rely on brute-force attacks and traditional espionage techniques to obtain information. By 1945, clever search-space reduction techniques developed by Alan Turing and Peter Twinn, brute-forced by the high-speed "bombe" machines commissioned from NCR by the US Navy (but deployed at Bletchley), allowed 48-hour decryption of all Enigma traffic.

(The Colossus computer was built to decode a completely different German cipher, the electromechanical Lorenz SZ)

.NET Core: Still a Microsoft platform thing despite more than five years open source

Kristian Walsh

Re: What about QT?

Qt is an application development framework first and foremost. It has a very nice UI toolkit and lots of cool features to make it easy to write interactive apps that run fast and respond well, but it's very focused on software that presents a graphical UI that you interact with (interesting fact: Qt is used in a lot of in-car entertainment systems, precisely because it's so good at GUI stuff on relatively slow hardware)

.NET isn't really comparable, as it's a runtime and a general purpose programming library (like Java). You can use it to make servers, clients, web-apps or desktop apps. It doesn't have a single UI toolkit, so you have to make a choice between them (of the available ones, only the Windows-only UWP comes close to Qt, but Uno could have potential).

I think both are great technologies, and I've written apps for mobile and desktop with both of them. There are things to like about each: C# is a more productive language than Qt's C++ (speaking as someone who had decades of experience with C++ and came to C# relatively recently), but on the other hand, Qt's QML layout system (and its use of JavaScript for tying together stuff that's a little more complex than a binding) is so much more developer-friendly than the XAML-based schemes you tend to see on C#/.NET applications.

The FOSS puritans don't like Qt either, by the way - I can't remember what the reason was, something to do with its dual licensing, where commercial customers got features first - but I find those people tend to do much more talking than actual programming, and thus are really only in a position to advise on political, not technical, matters.

Kristian Walsh

The two major compilers for C# and two of the three .NET runtimes are open-sourced. Are you unable to clone a repository and execute make?

And does C/C++ really have hundreds of compilers? Based on the various systems I write for today, 90%+ of the market is just three: gcc, msvc and clang.

Kristian Walsh

Re: The problem as I see it

C# has all those advantages, plus more features and a mature, fully featured standard library rather than an "ongoing project to implement" one.

Swift is an okay language, and it's the least bad way to write native iOS code, but you really have to be in love with Apple to consider using it outside of the Apple ecosystem.

Kristian Walsh

Re: What's up with non-.NET developers thinking?

Mono and .NET are converging much more, especially now that the .NET versioning nightmare on Windows is finally ending (.NET 5 will be one single API set, regardless of what it runs on. Hurrah! It only took 20 years...). Mono now supports "everything in .NET 4.7 except WPF, WWF, and with limited WCF and limited ASP.NET async stack" according to their own documentation. (https://www.mono-project.com/docs/about-mono/compatibility/)

One good move is that now Microsoft are steering devs away from using the "Windows" flavours of .NET in favour of the ".NET Standard" APIs, the cross-platform API set that Mono implements.

Both .NET and Mono run on Linux, so it's a question of where your priorities are. Mono has better performance compiled down to native binary, while Microsoft's own .NET has a faster JIT runtime. Both support WebAssembly.

Soft press keys for locked-down devs: Three new models of old school 60-key Happy Hacking 'board out next month

Kristian Walsh

I used to use H,J,K,L, lad, and I were lucky to even have that!

No cursor keys? That's certainly a... bold move for a "developer" keyboard.

When one open-source package riddled with vulns pulls in dozens of others, what's a dev to do?

Kristian Walsh

Re: Inspection+test at "goods inward"? How quaint.

The flaw of ISO9000 is that it assumed that a business, on seeing the true horror of their process, would take action to simplify it. This was true in some businesses that introduced ISO9000 as a way to improve their operations, but not so in others, whose motivation was basically to do the minimum required to be able to print "ISO9000 certified" on their brochures.

Later quality processes such as WCM move away from process and focus on the general concept of eliminating wasted effort. That change of focus makes "...and now, go and fix these crazy things" part of the standard operating procedure.

But you've still got to have a company that doesn't just talk about quality, while acting in every single way that's guaranteed to diminish it.

Apple gives Boot Camp the boot, banishes native Windows support from Arm-compatible Macs

Kristian Walsh

"There may be a time when MS produce a reasonable ARM version of Windows."

Windows 10 is fully ARM-native, including all of the built-in applications. Latest release includes the ARM-native Edge Chromium as default browser - the lack of an ARM-native Chrome was the biggest issue with Windows 10 ARM, but with Edge Chromium, that pretty much goes away.

Of course, third-party application developers will do what they want regarding ARM architecture support (as Apple will also discover): some current products will get updated, and publishers will use the transition as an opportunity to kill unprofitable older product lines in the hope of spurring sales of newer versions. But Microsoft has already done the bit that's under their control.

Kristian Walsh

Re: So basically

It's not a small percentage at all. The ability of Macs to run Windows allows them to be funded by organisational IT budgets. Lots of middle/upper management types bump themselves up to a MacBook on this basis. Because under Boot Camp the Mac boots directly into Windows, it's no different, from an IT department licensing point of view, than buying a bare machine from any PC maker and re-using existing licenses on it.

However, I don't imagine that running Windows in a virtualised container would come under that licensing, so while Apple supporting Windows 10 on Arm through a VM might be fine for individuals who need Windows, it may not fly with the corporate customers who've used Boot Camp as a way of getting work to subsidise their personal laptop.

Apple's new WidgetKit: Windows Phone Live Tiles done right?

Kristian Walsh

Yeah... whatever next? an Olympic Games in Beijing?

It wasn't even new when Android did it... but in fairness, Google didn't claim that home-screen widgets were anything special; apps being able to display summary status was seen as a basic feature of a mobile device.

It looks like Apple is copying Windows Live Tiles, which weren't true "widgets", as they didn't allow you to issue commands to an app: all you could do was click them, so they were more like hyperlinks that present a preview of their destination. But when they were done properly, they were a very powerful way of managing the information that a phone can deliver. An app user could create a live tile for any specific content that an app could access, which made them very effective for narrowing the scope of notifications, and as such they're something I really miss since moving to Android with its cacophony of pointless notification alerts (yes, I know how to turn them off; my complaint is that I shouldn't have to: they shouldn't be on by default).

Examples of live-tiles that worked: in a podcast app you can pin a few of your favourite feeds to the front page, and you'll see when those are updated, without being bothered by updates for the other podcasts you follow less frequently; or, pin a particular contact in a communications app, and you will see notification of when they contact you, not when anyone has contacted you via that app.

The Live Tile idea is based on well-proven principles of how people construct mental models of tasks: it shifts the focus onto the object of interest (your contact, the location you want a weather forecast for, your favourite podcast, your family photos album), and not the tool or method (the apps that provide you with those services). Understand that, and you can see why some brand-name social app developers were so lukewarm on Windows Phone - it shifted control of the interaction back toward the user.

Against that, the initial implementation of live tiles was visually far too "busy" and distracting, and by the time that was addressed, the fate of Windows Phone had already been sealed by developer decisions.

Apple to keep Intel at Arm's length: macOS shifts from x86 to homegrown common CPU arch, will run iOS apps

Kristian Walsh

Re: Keyword here is "maintained"

"30 year old (or even 10 year old) software should run fast enough even with a naive cross-compilation to modern ARM CPUs"

Cross-compilation is fine, but what exactly are you planning to link against?

Apple entirely broke backward compatibility in 2001 when MacOS X was launched. Code written for the previous OS's API, the Macintosh Toolbox, will not run without an application host ("Classic") which has not been supported for about 15 years.

Newer code that ran on OSX may also fail to run on the latest release of OS X due to dependencies on CarbonLib, the API that allowed easy porting of Macintosh Toolbox applications to OSX. CarbonLib was finally discontinued this year (the writing was on the wall back in 2011 when Apple announced that they would not port it to 64 bit).

In short, getting very old Mac software to run on new Macs generally isn't possible: even if you have the source-code, chances are the old software links against an API library from Apple that is no longer supported.

(Windows isn't perfect either: 16-bit software has not been supported for a very long time, and 32-bit is going the same way, but Microsoft does provide better support for getting old source-code to compile and run on a new OS than Apple does)

Belief in 5G conspiracy theories goes hand-in-hand with small explosions of rage, paranoia and violence, researchers claim

Kristian Walsh

Re: So basically ...

There's a perverse comfort in believing that the whole world is rigidly ordered and coordinated, even if the goal is evil, when the alternative is to accept that the real world is a chaotic place with many bad things that happen by chance and cannot be predicted or prevented.

What's the Arm? First Apple laptop to ditch Intel will be 13.3" MacBook Pro, proclaims reliable soothsayer

Kristian Walsh

Re: Depends on what you're doing

Time for another look at Windows. Running my preferred Linux distro on WSL is superior to running BSD on MacOS, and now that the new default Windows terminal is finally better than Terminal.app, there's nothing that I miss from Macs.

I do like BSD, and I think its codebase is so much better than Linux's (the Linux networking stack... dear god!). However, nobody has ever asked me to write a product that targets BSD.

Kristian Walsh

Re: Strategy

No mystery: it was good old internal politics. NeXT vs Apple, and NeXT won. NeXTStep, rechristened as "Cocoa", won as the application environment of choice for MacOS X. However, a lot of the libraries that enabled it were written in C++. From my hazy memories of that time, Quartz, ATS (the font renderer) and IOKit were C++ codebases, to which you can add all of QuickTime, and any other library or application that was ported directly over from MacOS 9.

Oddly, there was such a thing as ObjectiveC++ (basically C++ with the added at-signs and SmallTalk-y brackets syntax to let you interact with the ObjC runtime), which was fully supported by Apple's tooling, but received zero publicity in developer documentation and training. It's a shame, really, because the "C" part of the original MacOS X ObjectiveC was a creaky pre-ANSI dialect that felt like a real trip back in time (as in having to define all your variables up front at the head of a function again...).

I really don't understand why Swift needs to exist, except to make Apple platform skills non-transferrable, but Apple has always had an unhealthy impulse towards proprietary approaches. The irony of course is that without OS X embracing so many open standards, there wouldn't be an Apple today.

Only true boffins will be able to grasp Blighty's new legal definitions of the humble metre and kilogram

Kristian Walsh

I once took a car for a test-drive without realising that my local dealer had sourced it from the UK.

I thought it was surprisingly sluggish to get up to 120 until the sun caught the part of the dial that said "MPH". Oops. (Cars in Ireland are sold with km/h-only speedometers)

Luckily for me, nobody from the Traffic Corps was around that day...

Gulp! Irish Water outsources contact centres to Capita for up to €27m over 7 years

Kristian Walsh

Re: Irish language service?

I had no involvement, but I'd make a guess and say it's to do with names.

There's a number of people in Ireland who choose to go by an "Irish-ized" version of their name, rather than what's on their official documents. So, a "Seán Mac Giolla Phádraig" and "John Fitzpatrick" can in fact be one and the same person, all of whose proof-of-identity documents are in the original "English" name. Similarly, those who actually come from an Irish-speaking background (rather than those who would like to think they're "more Irish" than you are) will sometimes do the reverse: their official name is a traditionally Irish one, but they use an Anglicized version when speaking in English.

When you've got a CMS with just one "Name" field, accommodating this would be awkward.

GitHub to replace master with main across its services

Kristian Walsh

Re: I like colors

Yes, this is true. The "green" of the green traffic light was deliberately set at the blue end of the spectrum so that the red and green lamps would appear as yellow and white (protanopia), or as orange and lilac (deuteranopia), to someone with total red/green colour blindness. To such a viewer, the amber lamp will look like red, which is a safe message to convey.

Actually, I was slightly wrong when I said the Japanese lights are the same green as we have: although I never saw any in my brief time there, there are a few places in Japan where local municipal governments mounted blue lamps in traffic signals. The joys of written specifications in ambiguous language, I suppose.

Kristian Walsh

Re: I like colors

"Green" in Japanese also doesn't describe the same portion of the spectrum as it does in English Many things we consider to be either blue or green are described by the same Japanese word, 青 ao, and this is used to describe both the colour of the traffic light that means go as well as the colour of a clear sky. Just to be clear: Japanese green traffic lamps are the exact same hue as those found in Europe - it's the word that's different, not the actual colour.

There's another, newer word, midori, 緑, which is used exclusively for what we call green in English (and yes, that does make the name of that melon liqueur very unimaginative), but ao still carries both meanings.

(By the way, this isn't a uniquely Japanese quirk, and many languages mix the concepts of green and blue: https://en.wikipedia.org/wiki/Blue%E2%80%93green_distinction_in_language )

Microsoft brings WinUI to desktop apps: It's a landmark for Windows development, but it has taken far too long

Kristian Walsh

Re: But is it usable?

You don't have to take my word for it: open up one of Microsoft's own UWP apps like Mail or Microsoft To Do, and then try composing an email using only the keyboard.

Okay,

[Windows-key]mail[return][Control-N]subject[TAB]recipient[TAB][TAB][TAB]body[Control-return]

So, twelve keystrokes more than the total needed to enter subject, recipient email address and message body. (I didn't take advantage of autocomplete)

From app not running to message sent, that's actually one fewer keystroke than the old Unix "mail" tool, which needs 13, although I will concede that if your subject doesn't need to be shell-escaped, that overhead falls to 11 keystrokes. ( mail[space]recipient[space]-s[space][quote]subject[quote][return]body[ctrl-D] )

Kristian Walsh

Re: The problem is called "UI"........

I don't remember any time when there was just one command-line interface. Certainly typing ls or rm used to get me precisely nowhere on a VAX.

Even on "unices", a lot depends on your shell. There's so many people who think "shell-script" and "bash-script" are the same... right until they try to use an embedded system that has dash or a similar actual Bourne-shell copy installed. (Someone experiencing the difference between "vim", the command that's soft-linked from "vi" on most Linux distros, and actual "vi", is also fun to watch)

Command-line applications are a mishmash of good and bad UI. Some are an OS unto themselves (emacs, and that same vim these days), some are maddeningly inconsistent (the linux "ip" tool), and some are superb (anything that follows the ethos of the original core K&R command set of Unix: do one thing, do it right, don't be chatty).

The real problem with graphical UI isn't the toolkit, it's the way the applications behave (much more important than just "look"). There are design guidelines for the major OS platforms (except Linux), that explain what users learn about interaction from using the built-in software, and how you can reproduce those same patterns in your own code. The problem is that different OS platforms have different and conflicting design patterns. Linux is in its own special hell, because with no agreed "platform" behaviours, and at least two warring factions for the desktop, every single graphical application goes its own way, which means you've got to learn everything from scratch, with no carryover of existing knowledge.

(Don't get me wrong, I really like Linux. But I can't use any of its GUI shells without getting angry - the command-line shell is just so much quicker)

The longest card game in the world: Microsoft Solitaire is 30

Kristian Walsh

Re: The interface formerly known as....

"Its not a clever approach at all - its nothing more than vector graphics (NOT invented by MS) shoved into unicode and having a load of unicode tables to contain graphics when there are an infinite potential number seems a particularly stupid way of drawing icons."

Missed my point completely. The "clever" bit isn't that it's a font, but that it makes use of the existing and highly-optimised font rendering code path to produce the coloured symbols needed for emoji. Contrast Apple's approach that just hacks enormous bitmaps into the font file, or the SVG-based solutions (the other OpenType approved mechanism) that require a separate rendering path for the Emoji glyphs versus the rest of the code set.

The rules for inclusion in Unicode are straightforward, and the set of Emoji, whether you like them or not, meets those rules. If you consider that the symbols of the Phaistos Disc, a set of characters used only on that single ancient artefact, are assigned codepoints in Unicode (U+101D0–U+101FD), there's room for a few hundred pictorial symbols that are used by billions of people daily.

Kristian Walsh

Re: The interface formerly known as....

You're correct. It's to allow the interface to be rendered at a variety of different viewing distances and pixel densities. There aren't any bitmap "icons" anymore in Windows 10 for button functions: all graphical symbols are stored in fonts and rendered through the text engine. (the font is called "Segoe MDL2 Assets", and you'll find the icons encoded as characters within the Unicode Private Use pages).

Windows 10 also does the same thing for its emoji support, by using a colour-font feature (COLR and CPAL tables in OpenType) that allows coloured symbols to be built up like silkscreen prints, with one "character" for each colour layer. This is how Microsoft was able to implement every possible skin-tone variation in the "family" and "activity" emojis, and also provide non-coloured emojis without needing a new font (something I wish other vendors would offer). It's a clever approach that reduces the amount of memory and CPU effort needed to display the symbols, can be scaled to any size, and by limiting the number of colours per emoji, you end up with icons that visually blend with text much better than bitmap-based solutions.
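
As a rough illustration of what those COLR/CPAL tables boil down to (names paraphrased from the OpenType spec; this is a sketch of the concept, not the exact binary layout):

```
#include <cstdint>
#include <iostream>
#include <vector>

// Simplified sketch of the COLR idea: a coloured glyph is just a stack of
// ordinary single-colour glyphs, each tagged with a CPAL palette entry.
struct LayerRecord {
    std::uint16_t glyphID;        // an ordinary outline glyph, drawn in one flat colour
    std::uint16_t paletteIndex;   // which CPAL palette entry to fill it with
};

struct BaseGlyphRecord {
    std::uint16_t glyphID;          // the glyph the text engine asked for
    std::uint16_t firstLayerIndex;  // start of its run in the shared layer array
    std::uint16_t numLayers;        // how many layers to stack, bottom to top
};

int main() {
    // Hypothetical two-layer smiley: yellow disc first, then dark features.
    std::vector<LayerRecord> layers = {{101, 2}, {102, 0}};
    BaseGlyphRecord smiley{40, 0, 2};

    // Rendering reuses the normal outline rasteriser once per layer,
    // compositing in order - like silkscreen printing, one colour per pass.
    for (std::uint16_t i = 0; i < smiley.numLayers; ++i) {
        const LayerRecord& layer = layers[smiley.firstLayerIndex + i];
        std::cout << "draw glyph " << layer.glyphID
                  << " in palette colour " << layer.paletteIndex << '\n';
    }
}
```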

(App icons remain as bitmaps)

I know this is not a popular opinion here, but I really like the Windows 8 and 10 UI. It's clean and clutter-free, looks really sharp on a high-DPI display, and it's consistent as far as it's applied - the main thing I don't like about Windows 10's UI is how often you're dumped back into the Windows 95 look and feel: surely it'd be worth putting a few interns on a task to turn those horrible tab-dialogs into something from this century (... and don't start me on the multi-row tab-bars where the tabs dance around after you click them, in some kind of twisted user-interface whack-a-mole game)

Kristian Walsh

Other things "stolen" from Apple...

The cards in Solitaire, as well as the standard icon set for Windows 3.0, were designed by Susan Kare, the graphic artist whose other famous contributions to computing include the icon set for the Apple Macintosh, as well as its default bitmap fonts (and a hidden 32x32 portrait of Steve Jobs). To add to the set, as an independent design consultant in the 1990s, Kare also provided the standard icon suite for OS/2 Warp at the same time as doing the Windows 3.0 images... and later the icons for the Linux-based Nautilus file manager.

'iOS security is f**ked' says exploit broker Zerodium: Prices crash for taking a bite out of Apple's core tech

Kristian Walsh

Re: Here's an idea

They had one, but eventually there were so few bugs for them to find that Apple fired them all...

The Windows Phone keeps ringing but no one's home: Microsoft finally lets platform die

Kristian Walsh

The mistake wasn't in dropping 6; it was going from 6 to Silverlight on WinCE, then from Silverlight to WinRT on NT in version 8. It's that Silverlight-to-8.0 breaking change that kicked out many developers - once you were on 8, there really wasn't a major difference in targeting 8.1 and 10: I got my reasonably complex 8.1 app running against the 10 SDK in a few days, although UI changes to conform with the UWP design language took longer.

Microsoft punished its early WP7 adopters (always the most enthusiastic developers) by landing them with a major rewrite to get their apps onto 8.0, and not providing user-space libraries for some of the Silverlight controls that had no equivalent in 8.0.

As for dropping 6, it had to be done to meet the performance requirements. 6 is a traditional event-loop runtime where apps generally run linearly on a single thread. The only way WP7/8/10 got its incredible responsiveness (and a 40-quid Windows Phone running 8.1 was far more fluid and responsive than a 400-quid Android of the same era) was by making asynchronous programming and deferred tasks a first-class feature of the framework and the C# language - this makes it easy for developers to do the right thing to keep the system responsive, but it's such a divergence from the "old" way of doing things that even if the Windows Mobile API were preserved, devs would have had to do extensive re-writes to get performance, and then re-write their UI to handle touch gestures rather than stylus...

Things Microsoft will be glad to never see again: Windows 10 1809 and Windows Phone Office

Kristian Walsh

Re: Option A for me

Unfortunately, I have a lot of friends who only read their WhatsApp/FBMessenger/Viber messages, and these services are going to stop supporting WinPhone at the end of the year.

If it weren't for that, and the now-knackered battery, my five-year-old Lumia 950 would do me for another few years.

Tesla has a smashing weekend: Model 3 on Autopilot whacks cop cars, Elon's Cybertruck demolishes part of LA

Kristian Walsh

Re: I Can't Stop Myself

AC is presenting a straw-man argument. Nobody wants to ban self-driving systems, they just want the vendors of these systems to prove that they're safe before putting them on the public roads. Uber and Tesla need to realize that this isn't some crappy javascript running on a web-page where you can just update when the thing crashes: if their system fucks up in a moving car, there's a good chance that people will die or be maimed for life. Higher consequences require a bit of grown-up responsibility.

Tesla's system does not meet the requirements to be called an "Autopilot" (I'm aware that autopilot doesn't mean self-driving - I'm referring only to maintaining a safe speed and distance within a traffic lane). In this case, the correct action would have been to sound an alarm to alert the driver of a stationary hazard, slow down before the hazard, and finally, in extremis, to apply the brakes to avoid impact. That's what other people's automated cruise-control systems do.

There are now too many instances of Tesla "autopilot" driving straight into visible hazards to consider it a safe system. In this case, the solution is for Tesla to re-engineer their product to fix this error, but until they do so, there is no reason why other road users should be endangered by the buggy software.

Kristian Walsh

Re: I Can't Stop Myself

" I know Google Translate is far from perfect, but it's far better than anything produced the analytical way."

To cure you of such delusions, I suggest you bookmark DeepL: https://www.deepl.com/translator

Thanks, Brexit. Tesla boss Elon Musk reveals Berlin as location for Euro Gigafactory

Kristian Walsh

Re: No, the UK was never in the running

Cars priced above €60,000 receive no subsidies at all under the new German scheme. The biggest gains are for buyers of vehicles priced under €40,000, and cars costing between €40,000 and €59,999 receive a smaller subsidy.

The cheapest Tesla vehicle retails for €57,000, so most of the American company's offerings won't benefit from this scheme at all. The intended beneficiary is of course not Tesla, but rather Volkswagen, whose ID3 EV will cost around 30-35k, right in the sweet-spot for the grants.

Not LibreOffice too? Beloved open-source suite latest to fall victim to the curse of Catalina

Kristian Walsh

Vista's problem was the incessant nagging, as each permission needed to be approved by the user. Where programs spawned additional processes it turned into a nightmare of re-approving each sub-process. Windows 7 to 10 actually use the same security model to protect the user's system against privilege escalation, but the user layer was changed to ask you only once for all necessary permissions for a process and its sub-processes.

Catalina's problem is different: the software is broken.

Kristian Walsh

Re: Just another...

App Store distribution has even tighter restrictions.

The problem here isn't app security - authentication of origin is a good thing (although Apple controlling the gateway is a smaller "bad thing"). It's that Apple has apparently shipped a broken implementation.

LibreOffice has had their build signed, but Apple is rejecting that build. That's a flaw in the process or the implementation. Perhaps the installation process changed something insignificant that the signing algorithm thinks is important? Perhaps Apple's crack team of stock-option clock-watche--- sorry, of software developers didn't test their code enough?

Good idea, lousy implementation used to be Microsoft's purview (Apple excelled in "bad idea, excellent implementation")...

Chemists bitten by Python scripts: How different OSes produced different results during test number-crunching

Kristian Walsh

Re: Fixing the symptom…

I did say "stored", not "presented". It's a subtle difference, but I'll accept that NTFS doesn't offer any other way of getting catalog data out except codepoint-sorted, so the point is moot for software running on NTFS volumes.

```In fact, I wonder if the quoted comment "because most times that returned order is basically "order of creation" ..." implies that the poster works on a system that behaves that way and is assuming that all other systems behave the same way.```

No. I was assuming nothing. I work on Linux, MacOS, Windows 10 and Ubuntu-on-WSL systems. I used that as an example of why assuming codepoint order is incorrect. NTFS does present its directory contents sorted this way, but HFS+ and most of Linux's filesystems do not.

For what it's worth, I think NTFS does the right thing here - a consistent, predictable and (above all) logical answer from the question "what is in this directory?".

The big surprise for people is that this is a _filesystem_ property, not an OS one. Running Linux, issuing `ls -1f` lists files in code-point order on an NTFS volume, but not on an ext4 one. (As can be demonstrated by using Windows 10's WSL feature)

Python apologists saying "but read the documents - you were promised nothing" are missing the point. Python is supposed to be a straightforward and intuitive language. Leaving this kind of gotcha in there undermines that promise, and frankly, the effort that was needed to document the unintuitive and nasty behaviour is about equal to the effort that would have been needed to make the call work the same way across all filesystems in the first place.
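
For what it's worth, the defensive fix costs one line in any language. A C++17 sketch of the same gotcha and its cure (std::filesystem makes the same "unspecified order" non-promise as Python's os.listdir):

```
#include <algorithm>
#include <filesystem>
#include <iostream>
#include <vector>

// Never rely on the filesystem's traversal order; sort explicitly before
// processing, and the result is deterministic on NTFS, ext4, HFS+, anything.
int main() {
    namespace fs = std::filesystem;
    std::vector<fs::path> entries;
    for (const fs::directory_entry& e : fs::directory_iterator(".")) {
        entries.push_back(e.path());
    }
    std::sort(entries.begin(), entries.end());   // deterministic on every filesystem
    for (const fs::path& p : entries) {
        std::cout << p.filename().string() << '\n';
    }
}
```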

