Re: So, even Apple can't get it right
The first iteration of the iPhone was a bit crap. I've heard they sell quite well now.
212 publicly visible posts • joined 17 May 2011
My computing experience started with a job teaching undergraduate psychology students BBC BASIC on a room full of Model Bs. It was one of those 'read the book so you stay a week ahead of the students' gigs for me, but it meant I learned the ins and outs of BBC BASIC quite thoroughly. A year or so later I was writing fairly complex programs in 6502 assembler on BBC Model Bs (with re-entrant interrupt routines, linked lists, reading data and controlling external hardware). I then learned C (on a PDP-11/73 - there was no complete C implementation on the BBC Micro) and never looked back (apart from an odd interlude updating some ancient code in FORTRAN-2). I owe it all (well, not all) to the BBC Micro.
'The Comet' was, and still is, a slightly downmarket hotel. It's across the road from the Hatfield Galleria (a shopping mall built over the A1). I remember seeing the red model aeroplane at the front of the hotel when I was a kid. Many years later I used to stay there when I was external examiner at Hatfield. I quite like it - faded glory and all that. The red aeroplane is still there.
> sensors powered by nuclear batteries that generate electric power from radioactive decay to be used on the Moon, and Massachusetts
I read that as 'the Moon and Massachusetts' and started to wonder what catastrophe had left Massachusetts without the ability to generate electricity or get it from other states. Very likely an experiment going horribly wrong, conducted in secret by those pesky, meddling, left-wing, pinko boffins at MIT and Harvard.
A long time ago (Sun3 era) I had a C++ environment called Sabre C++ (there was a name change at some point, I can't remember the newer name). It was very expensive, but worth it. You could compile your C++ in a debugger, set breakpoints and so on, but when you reached a breakpoint you could set the system to switch to its C++ interpreter, which, of course, allowed you to modify not only variables, but also code on the fly. You could step through like this until you reached a point where you resumed the compiled code. All quite amazing. I've never seen anything like it again. I think it died of expense and people buying cheap hardware and software.
Many years ago I was told (so this might not be true, but it was my Dad who told me, and he'd just done some consulting work for Guinness) that in some African countries Guinness was usually mixed with condensed milk to make a sweeter and more dense drink. Guinness made a special variant to suit this market - small bottles of very alcoholic Guinness (9% I think). There must be some truth in this because I have drunk one of those bottles of super-Guinness (without condensed milk). It was good, but sadly not available in the UK.
I learned to program on a BBC Micro. First BASIC (which, as a PhD student, I was teaching to university undergrads - I was one or two weeks ahead of them), then 6502 assembler (because I had a couple of psychology research projects involving control and data collection from multiple units of custom hardware which couldn't be run any other way - I have written thousands of lines of assembler), then LISP (for fun, but I learned enough to get it to do useful things and *it made me change the way I thought about programming*).
I only learned C after all of those, and it seems to me that that was a good order for learning languages. I wrote a lot of things in C, but I suspect with quite a bit of LISP inspiration. After that I did some maintenance on some code written in FORTRAN-2, some newer stuff written in FORTRAN-IV (no chance of any style in either of these), quite a lot of C++ (neural network simulations), some C# (as the language you had to use with Unity), and tons and tons of MATLAB (we had equipment which only came with libraries for MATLAB). BTW, I'm not a programmer; I'm a University Professor, and this journey through programming started when I was a PhD student over 40 years ago - BBC LISP definitely played an important role, even though I never really used it in anger.
I get an alert on my phone every time I pay for something from my Barclays account (not including standing orders and direct debits). I can't remember when they started doing this, but it has been like this for at least two years I think. It may not work for cheques, but I haven't used one of those for years. I'm not being a Barclays fan, just offering a point of information in case anyone is interested.
Changing the 2CV's spark plugs was a bit of a nightmare though - nowhere near enough room to get a normal plug-spanner between the plug and the bodywork. I never attempted anything that might have needed taking the cylinder heads off. The curse of putting a boxer engine in a car. Boxer engines are much easier to work on when they are in a *proper* BMW. My last proper BMW was an R90S.
I bought "Thirty Seconds Over Winterland", a live album by The Jefferson Airplane, sometime in the 1970s. It has superb flying toasters on the cover (they sued Berkeley Systems over copyright but lost because no-one at Berkeley Systems had seen the album - philistines). The title "Thirty Seconds Over Winterland" is, of course, a riff on the 1944 film "Thirty Seconds Over Tokyo", which was about a bombing raid on Tokyo led by Lt. Col. (later General) Jimmy Doolittle. I claim my bonus points, and my certified old hippie status.
My experience, from asking chatGPT to generate answers to various exam questions I have set, is that chatGPT answers would get 2ii marks at best, and that it is pretty obvious that they were not written by a human, or at least not by someone with a good grasp of structuring an argument and use of evidence. I've also tried software intended to detect machine-generated text. It always classified the chatGPT output as more than 97% likely to be machine generated. If, however, I rewrote the first two or three sentences (so a very small portion of the essay) the ratings changed to 2 or 3% likely to be machine generated (this wasn't Turnitin, but it was meant to be the best chatGPT detector at the time). Personally, I think if students want to use chatGPT they should be allowed to. They'd have to do so much work editing the chatGPT output that a good answer would still demonstrate a student's understanding of a topic and their use and knowledge of appropriate evidence in supporting an argument. Yours truly, an old Professor.
I used to use, and program for, System 7. You'd be lucky if it stayed up for a whole day - it just crashed on me for no apparent reason even when just using it for word processing. SunOS 3.5, on the other hand, stayed solid for months despite my imposing all sorts of horrible 'optimisations' on it using bits of 68K assembler I barely understood, embedded in C. Moral of the story - simple UI good, stable OS better.
BMW are pretty good at supporting their products. A few years ago, when I had a 1974 BMW R90S, BMW still had all the parts available new together with all the manuals. After I bought it I got it serviced at a BMW dealership and they replaced worn components of the braking systems with new parts. I think the bike must have been about 40 years old at the time. Slightly off topic, but compare that to support for phones!
As a junior academic, but with some IT skills, I was once given the task of determining whether our senior IT technician, who had been caught 'red-handed' watching porn, had downloaded or viewed any illegal material. This was a very strange task, but the strangest, and hardest (not intended as ooh-err-missus) part was that I was expected to write a report for the Head of Department describing exactly what sort of things he had been watching even if they were not illegal. A rather tricky assignment, and I was never quite clear why the non-illegal material had to be described in detail!
Is it an Apple issue or an ARM issue? Are there lots of ARM implementations out there based on ARMv8.3 that have this pointer authentication issue, or is it just the M1? BTW it sounds as if you are worse off with no pointer authentication than with pointer authentication that takes minutes to crack. I'm not an expert in such matters - just interested to know the answers.
I was surprised that the author didn't include Sun's desktops in his or her survey. I really liked the Sun desktop that preceded Broken-Look (SunView, which came with SunOS 3.5, not the networking one). It was also fabulously easy to write applications for. All of the functions for creating and managing windows were varargs where the arguments were 'option' 'value' pairs. Any that you didn't specify had sensible defaults. You could actually see the event-handling loop, so it was easy to insert a function (or functions) into it to trap the events you wanted to deal with; everything else, again, was handled with sensible defaults.
I was one of those people paid by IBM in the early 80's to do some of these UI studies. We ran experiments where people used an editor where insert/overtype mode was either indicated by a change in the cursor (flashing block or flashing underline) or an indicator in a status bar which could be at various locations. We not only tested user reactions and editing speed, but we even eye-tracked them while they used the editor (pretty difficult at the time - the eye-tracker involved subjects wearing centre-less contact lenses with circular coils of wire embedded in them and much physics). The attention to details like this was really impressive.
Also, as the editor we used (ELM I think) was the (quite advanced for the time) standard on our University's MTS operating system, I could have fun replacing the standard ELM with one where the edit mode would change randomly with no indication, or the mode indicator would change but the mode wouldn't, or both would change randomly, but independently. One of those great things where people begin to doubt their sanity. That wasn't strictly part of the project, but, as I said, mischievous fun. Anyway, thank you IBM for employing me (indirectly) for a year.
I'm a cognitive neuropsychologist, but with a bit of a computational background (spent a summer at the Santa Fe Institute, refereed things about the physics of computation, that sort of thing). Dr. Verstynen is right to draw an analogy with chaotic dynamics, but the brain situation is even worse. Typically, when one studies chaos the system is isolated - in brains new perturbations from the outside world keep bumping into it, and the system never has the chance to settle into stable attractor dynamics - it is always on the way there - it never arrives (until we die). There are probably transient, sort-of-predictable, quasi-stable states induced by signals from the outside world, but I don't think we have much of an idea at all about how these are formed and how they affect the rest of the brain. All a bit hand-wavy, but, at this stage, that's probably all you're going to get. Elon is talking out of his hat.
Yrs truly, A Professor.
I hooked up a Canon 5DS SLR as a webcam during serious lockdown. It was just something to do to while away the time. A bit excessive, but you could control DOF and focal length. The image was very good, much better than my webcam, even when downscaled to 720p.