Re: AI winter
... and neither of them was in 1984.
The "AI Winter" was not circa 1984. That was the time of the Japanese Fifth Generation project, the UK Alvey program, the heyday of Lisp machines, Prolog, and expert systems. The AI winter was about a decade earlier, and in Britain was largely associated with the Lighthill Report of 1973.
In the 1970s we had a PDP 11/40 that was turned on every morning, and it too would often not start in cold weather. We were told that this was because the 14" platters of the RK05F disk drives would contract until they were no longer within the tolerance for track positioning.
(The RK05F was a fixed version of the removable RK05 and was able to have twice the track density because it removed the manufacturing variability between disks.)
Bear in mind that in those days people were probably much less inclined to treat numbers as decimals in general. We would write £1 2s 6d, 5' 8", 2lb 4oz.
During the run-up to decimalization in 1971 one of the things that they (the government, teachers, the BBC...) tried to drum into us was that a pound and five pence should be written £1.05 and not £1.5.
We had a DECwriter as the console on a VAX 11/750 running Berkeley unix. Every minute a cron job printed the date so that we could tell when any output had happened.
One day we suddenly started getting an error for every command - something like "out of processes". We found that the console had run out of paper, and several hundred "date" processes were stuck waiting for their output to complete.
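The job in question would have been a one-line crontab entry along these lines (a sketch from memory; the exact entry and the `/dev/console` path are assumptions):

```shell
# Print the date to the system console once a minute.
# If the console printer blocks (e.g. out of paper), each date(1)
# hangs in write() on /dev/console, a fresh one is spawned every
# minute, and eventually the process table fills up.
* * * * * date > /dev/console
```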
A fine example of this is that many Canon printers can't be connected to your wireless network if the network implements 802.11r, a protocol to speed up roaming between access points. The symptom is particularly bizarre - the printer just doesn't prompt you for the wifi password. The workaround is to turn off 802.11r, configure the printer, and then turn 802.11r back on.
My ISP gives me a /48, which is pretty common. That means that in theory I can have 2^80 devices, though more usefully I could have 2^16 subnets of up to 2^64 devices. Huawei's /17 would allow them to provide 2^31 = 2 billion customers with a /48, and that's not much more than the population of China. Not that I imagine that they want it for that.
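The arithmetic is easy to check (prefix lengths as in the post):

```python
# An IPv6 address has 128 bits; a /48 prefix leaves 80 bits for hosts,
# or equivalently 2**16 /64 subnets of 2**64 addresses each.
host_bits = 128 - 48
subnets = 2 ** (64 - 48)

# A /17 carved into /48 customer allocations yields 2**(48 - 17) prefixes.
customers = 2 ** (48 - 17)

print(host_bits)   # 80
print(subnets)     # 65536
print(customers)   # 2147483648 - a shade over 2 billion
```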
There are lots of fun things you can do when you have an effectively unlimited number of addresses. I have a string of Christmas tree lights with 250 individually controllable colour LEDs - I could easily write a program that gave each LED its own address.
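A sketch of the idea, using Python's `ipaddress` module (the prefix `2001:db8:1:1::/64` is the reserved documentation range, standing in for a real subnet):

```python
import ipaddress

# Hand each of the 250 LEDs its own address inside a single /64.
subnet = ipaddress.IPv6Network("2001:db8:1:1::/64")
leds = [subnet.network_address + i for i in range(1, 251)]

print(len(leds))    # 250
print(leds[0])      # 2001:db8:1:1::1
print(leds[-1])     # 2001:db8:1:1::fa
```

Even 250 addresses barely dents a /64, which holds 2^64 of them.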
The battery is certainly the obvious place. It's heavy so a little extra is less noticeable, or perhaps they just reduced the battery capacity. It could even be introduced without the complicity of the pager manufacturer - offer them a good deal on several thousand batteries and they might well take it.
I haven't seen anything indicating how they were triggered. If the software was compromised it could signal to the battery without the need for extra wiring, for example by a pattern of power use.
When binding a server address, 0.0.0.0 (INADDR_ANY) means listen on all interfaces. But what is the justification for it being recognised as the local host when used as an address to connect to? I don't recall ever seeing that documented, and what would be the point of 127.0.0.1 if 0.0.0.0 did the same?
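The behaviour is easy to demonstrate, though as far as I know it is platform-dependent (this sketch reflects what Linux does; other systems may refuse the connect):

```python
import socket

# A server bound to INADDR_ANY, on an ephemeral port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 0))
server.listen(1)
port = server.getsockname()[1]

# Connecting *to* 0.0.0.0: the kernel quietly treats it as the local host,
# so the connection arrives over loopback.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("0.0.0.0", port))
conn, addr = server.accept()
print(addr[0])      # 127.0.0.1 on Linux
conn.close(); client.close(); server.close()
```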
"If I use this instruction correctly and without any malicious intent on any other RISC 5 processor, then in these machines my code is completely broken?" This makes it rather surprising that the bug was not noticed immediately - or indeed in testing before release. Are there just very few programs that use the affected instructions?
The transputer did indeed demonstrate what could be achieved by a very simple processor in a big array, and the answer was "not much". For several years people enthused over it, but failed to produce useful solutions with it. It became clear that the vast majority of tasks just weren't amenable to being solved that way: they have parts that are inherently sequential, and even when parallelized need access to shared memory.
"many early internet hackers worked on the original MINIX, porting it from x86-16 [sic – this was an 8086 OS] to 68000 and SPARC"
The most significant work (at least in my opinion - I played a part in it) was getting a 32-bit x86 version running, and improving the system libraries to the point where gcc and emacs could run. That's what Linus built the first Linux with.