I too have eyesight that makes decoding emojis at normal text size very difficult. Further, I can't be bothered to learn the input methods for emojis. I have a compose key set up for typing accented letters, and so I can type <compose> : ) for a smiley. Apart from that, the whole emoji business leaves me cold, and therefore my response to it is <compose> p o o.
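For anyone who wants the same setup, here is a minimal sketch of the ~/.XCompose entries involved. The smiley line matches the sequence above; the file layout and the poo mapping are my own illustration, using the standard Compose(5) syntax:

    include "%L"    # keep the system-wide compose table

    <Multi_key> <colon> <parenright> : "☺"    # compose : ) gives a smiley
    <Multi_key> <p> <o> <o>          : "💩"   # compose p o o, as above

Depending on the toolkit, you may need to restart the application (or, for older GTK apps, set GTK_IM_MODULE=xim) before the new sequences take effect.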
Re: Check you can complete before you start
Back when I was a university teacher, I often included the instruction `Write the word OKAPI at the top of page 2 for an extra mark' in my midterm exams. Fewer than half of the students did it. Some who didn't said afterwards that they had read it, but thought I was joking! Life sometimes makes me sad.
Re: phoning and testing
MVCL was always interruptible. On an interrupt, the hardware updated the operand registers to record how far the copy had got; on return from the interrupt, the machine would simply re-execute the MVCL and it resumed where it left off.
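For those who never met it, a rough sketch of the setup in HLASM-style assembler (the labels and register choices are mine; MVCL takes two even-odd register pairs):

         LA    2,DEST          destination address in R2
         L     3,DESTLEN       destination length in R3
         LA    4,SOURCE        source address in R4
         L     5,SRCLEN        source length in R5 (pad byte in its top 8 bits)
         MVCL  2,4             interruptible copy
    *   On an interrupt mid-copy, the CPU advances R2-R5 past the bytes
    *   already moved before handing control to the OS, so when the task
    *   is redispatched, re-executing the same MVCL resumes the copy.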
A much more interesting way of swamping a 360 or 370 with virtual memory was to process a huge matrix along rows, not knowing that Fortran stored arrays by columns. This could bring our university's 360/67 running a time-sharing OS to its knees, as the offending process ended up having to swap in a new page for each element access.
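And for anyone who never had the pleasure, a minimal sketch of the offending access pattern, in modern free-form Fortran for readability (the original would have been FORTRAN IV, and the array bounds here are invented):

    program thrash
      real :: a(4096, 4096)   ! stored column by column; ~16 KB per column with 4-byte reals
      integer :: i, j
      do i = 1, 4096          ! fix a row...
        do j = 1, 4096        ! ...and walk along it: each step jumps a
          a(i, j) = 0.0       ! whole column ahead in memory, so on a
        end do                ! paged machine nearly every access can
      end do                  ! touch a cold page
      print *, a(1, 1)        ! keep the compiler from eliding the loops
    end program thrash

Swap the loop nest (j outside, i inside) and the same code walks memory in storage order, touching each page thousands of times before moving on.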
It would be interesting to compare Wikipedia's error rate with Orlowski's
Intelligent people know that no source can be trusted completely, and that any worthwhile research uses a multiplicity of sources. I often use Wikipedia as a starting point, often finding its references and external links the most useful part of the article.
While there have been many accounts of individual Wikipedia pages containing slanted or biased information, I have no reason to believe that the site's overall error rate is substantially higher than that of any other encyclopedia.
Shoddy journalism is the journalist's fault. Uncritically cutting and pasting a Wikipedia article is no different from a school student copying a passage from a book into a report without acknowledgement.
Finally, I look forward to seeing a link to Mr Orlowski's list of the 16,000 pages he claims are false.
'We think autonomous coding is a very real thing' – GitHub CEO imagines a future without programmers
Well, OK, then
I'm a fan of the Fargo film and TV series, which depict decent people going to hell through a series of really bad decisions. Similarly, programming ended in the early 1960s, when FARGO, the programming system, was created [https://en.m.wikipedia.org/wiki/FARGO_(programming_language)]. Not!
Of course, people don't write mortgage programs anymore, they use spreadsheets. We don't need to keep writing the same programs over and over.
Software development has changed many times over the decades. What hasn't changed is the need to have smart people create solutions to problems that nobody has solved before, and other smart people evolve those to meet changing needs. Intelligent programming assistants can definitely be a part of that.
Re: No - systemd doesn't offend me
I selected No because the choice of init system really makes almost no difference to me. I've been using Unix since 1975, and taught courses on Unix programming and system administration, so I find the issues relating to init systems interesting, but they have very little effect on me on a day-to-day basis. I've survived BSD init, SysV init, upstart, and now systemd, and none of those changes has made an appreciable difference to me, other than some minor modifications to scripts. Obviously, some sysadmins might find that such changes necessitate a great deal of work; but that hasn't applied to me.
Now, do I find systemd's design offensive? Yes, I do. But then I find most features in modern distros kind of offensive (the cat and true commands are exceptions ☺). I tend to like systems that are consistent, have as little clutter as possible, and match my notions about workflow. I would personally have liked it if Plan 9 had been modernized and made into a solid OS. But it wasn't, and Ubuntu and Debian do a fine job for me.
So what about Devuan? I hope the developers can produce a solid system. If it's easier to use, or more robust, or more scalable than Debian, good for them; maybe other distros will pick up on their ideas. The computer scientist in me finds such matters interesting; the computer user in me is happy with what he has.
Peter is of course correct; it was a brain fart on my part. In fact, by the time 7th Edition came out, I no longer had access to the 11/45 in question.
Reiterating my earlier point: much of any present-day Linux distro (and that means all of them) isn't the way I'd do it, but I have a life, and I just use what I have to get the job done. Linux/Unix on desktop, laptop, or server looks downright great compared to the Redmond alternative.
Some other posters have referred to disembowelment or the like. I prefer to judge software on its merits or demerits, without ad hominem comments.
My first Unix system was 7th Edition on a PDP-11/45, back in 1975. Since then, I've seen Unix (and now Linux) grow from an elegant, small, sparse system to the furry, ungainly creature it is today. In part, this is due to increasing performance, reliability, or functionality. In part, it's creeping featurism. Most of a modern Linux system isn't done the way I'd do it, ranging from duplicated functionality to unwanted applications, not to mention hit-and-miss documentation.
I suppose if I really cared, I'd use one of those distro generators to roll my own. But frankly, life's too short. I fire up my various Debian or Ubuntu systems, and for the most part, they just work. I haven't really noticed any decline in stability with the advent of systemd (which is definitely not done the way I'd have done it).
So I wish the creators of Devuan success, but frankly neither the presence nor the absence of systemd really makes much difference to me. I just use whatever I have to get whatever I need done.
Re: Dubious award
If you are referring to the claims of pedophilia levelled against Clarke, no evidence was found to support them (other than some claims in some tabloids). While those accusations were current, Clarke's knighthood was held in abeyance. When the Sunday Mirror apologized for publishing those claims, the knighthood was granted. Clarke was definitely either gay or bisexual (regardless of what he said on the subject), and he may well have enjoyed the company of younger men, but there's a world of difference between that and pedophilia.
Actually, the USNS Harvey Milk is the second ship in the John Lewis class, whose ships are being named after civil rights leaders. And the `long, hard, and full of seamen' comment is about as courteous a way of discussing this matter as a reference to watermelon in a discussion of Obama's foreign policy would be. Both the article and the headline are seriously misleading.
I don't like the one-button mouse, but I will say that it does make using the mouse simpler to explain. I have taught several seniors how to use computers, and the whole `left-click/right-click' thing is actually quite difficult to get across. I don't think the one-button mouse is a good idea, but it is true that Apple actually had a case in favor of it. (I really like my Lenovo laptop with its TrackPoint and three mouse buttons, and a touchpad I disable.)
As for inaccuracy, mice have always been inaccurate. I was told in a graduate-level HCI course I took circa 1975 that the original SRI mice were built from miscellaneous electronic scrap, and that getting perfectly linear potentiometers was essentially impossible. As a result, to move the cursor in a straight line, the user had to move the mouse in a curve. Apparently, users adapted very quickly, and were genuinely unaware of this phenomenon.
In my experience, government IT projects often don't come in on time or on budget, for several reasons:
1) Requirements creep, often triggered by political considerations (and often from the same sources as the later outrage about rising costs).
2) Lack of coherent oversight by the client: instead of a small review committee with both technical and other members, approvals are often distributed among a large, diffuse, and ever-changing group of individuals.
3) Significant amounts of out-of-band communication between contractor and client, often necessitated by the first two factors, resulting in a loss of institutional memory (`where did that requirement come from?').
4) Poor life-cycle models, veering towards waterfall and away from iterative/agile practices (and I am not carrying the agile banner here, just pointing out that waterfall development often produces gargantuan monsters that don't satisfy client needs).
5) Lack of end-user involvement (e.g., a senior program manager who thinks he can speak for the clerical users of the system).
6) Lack of clearly identifiable milestones that relate to the actual project (e.g., `the Frobozz infrastructure is complete' rather than `the system can correctly perform transactions X, Y, and Z').
7) Emphasis on delivering milestone products rather than ensuring those products meet specified reliability and performance requirements.
8) Lack of proper audit procedures (e.g., looking at the numbers of closed and unclosed bugs in the bug database).
I could go on and on.
Of course, many private-sector projects go south in exactly the same ways. But government IT projects seem particularly susceptible to these dangers. As someone who has taught software engineering practices to industrial practitioners, it disturbs me greatly how easy it is for large organizations to ignore their own history and make the same mistakes over and over again.
`Coding' is such a generic thing, and never should have been identified as a goal. Certainly HTML coding is a good and useful skill (though, to be fair, it's easily automatable, as many web frameworks have shown), but it has little to do with writing programs that accomplish useful things, whether those are of interest to the individual author or to enterprises and the public at large.
The Raspberry Pi folks don't primarily want to have people coding. They want to build something that can be played with, that is as open as possible, to stimulate interest in computer science and engineering. Their metric, as can be seen from their website, is much less `how many people have learned to program the RPi?' than `how many people have done fun/interesting/useful things with the platform?'
This distinction seems to be very difficult to get across to people. Thirty years ago, I was running workshops on teaching with LOGO. I gave it up after realizing that to most of the participants, the goal of LOGO was not to learn how to solve interesting problems, but to `trick' the turtle into drawing cute pictures.
I thought it interesting that the timeline presented in the article made no mention of people who were in the business of teaching computer science and engineering. If I were a non-technical politician wanting to set up something like this, I'd have gone out and formed an advisory council with (say) 5 people from academia and 5 from industry (at the CTO level) to chart out a workable strategy. If I were doing that in Britain, Eben Upton would have been one of the first I would have called. So I guess that's why I'm not in politics.
Macs are for creative folks
I'm always intrigued by the notion that somehow Macs are for creative people, but PCs are for serious people. Maybe that was true back in the days when desktop publishing was new, but nowadays, pretty much anything you can do on OS X can be done on Windows, and vice versa. Here's why I love my three-year-old MacBook Pro.
1. MagSafe power plug. Very little chance of destroying a computer, as I once did to a Dell Latitude laptop whose power cable ended up inextricably wound around my foot just as I stood up abruptly. Those little touches do make a difference.
2. It's real Unix, so I can run computer science-y software that was written for Unix.
3. It really is plug-and-go. I have a Dell laptop with Ubuntu on it, and while it's a nice system, I had to do a lot of work to get everything working properly on it.
4. My antivirus software for that machine consumes zero bytes, takes 0% of the CPU cycles, and cost me $0.
5. Apple includes their development software at no cost with every system.
6. Apple's development software does not suggest that I become super-user in order to compile, test, and debug software. Visual Studio did exactly that on Vista; maybe it won't on Windows 7.
Not one of these is something that Apple's competitors couldn't duplicate (they'd have to license the MagSafe patents, I guess, but they could apply the same attention to detail to other aspects of how people actually use machines).
Windows systems fail on all six of these criteria. Unix/Linux systems pass on 2, 4, 5, and 6, but fail on 1 and 3. (Incidentally, I know my way around Unix, having started in 1975 on a Unix V7 system. Even so, when I want to send email, or write software, the last thing I want to be doing is to fiddle with system parameters to make everything work, and yes, I have had to do that with every system I have ever installed Linux on).
So my point is, Microsoft, the hardware vendors, and the Unix/Linux community COULD be competing with Apple. The criteria I listed here happen to be mine; other folks have their own, though `it just works' should be on everyone's. Where people miss the boat is to say `Oh, we aren't Apple, we do it our way'. Apple definitely has their failures, but their successes come from building things that satisfy needs (not just status) in people's lives. Other companies could do the same.
By the way, let me put in a plug for Mark Shuttleworth's goal of making Ubuntu compete with OS X. I don't know if they'll be successful, but they are definitely thinking the right way.
1. The Open Source Definition is not the work of Richard Stallman or the FSF, which on the whole does not like the phrase.
2. The non-discrimination clauses I quoted come from the Open Source Definition, which is published by the Open Source Initiative.
3. When Microsoft developed licenses that they considered to be open-source (note lower case), they submitted them to the OSI for approval.
4. The Wikipedia article on open source cites a reference from 1990 showing the term used to signify `materials open to the public and freely available' [my wording], implicitly opposed to `controlled'.
5. I hope Sean would not consider a library to be public if it refused admittance to meat eaters, vegetarians, or any other group.
For the record, I have been using open-source software since about 1970, back when user groups such as SHARE (IBM) and DECUS (DEC) used to distribute it. The GNU Manifesto clearly says that RMS wanted to re-create the environment he'd experienced at the MIT AI Lab, so he can hardly have claimed to have invented the concept.
Another article that misses the mark
Whatever this license might be, it most definitely isn't an open source license. The Open Source Definition has at least two clauses (no discrimination against persons or groups, and no discrimination against fields of endeavor) that, at least according to the article, are violated. Similarly, the Four Freedoms include the freedom for anyone to run the software for any purpose, again violated, according to the article, by this license.
I do wish the article's author had been a bit more careful about using the term `open source' for something that is nothing of the kind.