* Posts by Healeyman

8 publicly visible posts • joined 30 Mar 2022

Europe's cloud customers eyeing exit from US hyperscalers

Healeyman

Re: "I think this is a realistic fear nowadays"

Agreed. There was a time, back when companies cared about data autonomy, when 'co-location' facilities--'co-los'--were popular. You rented space, power, A/C, and network access in a warehouse-style facility and installed your own servers. Of course, you then had to pay techs and administrators, which hurt the 'bottom line.'

50 years ago the last Saturn rocket rolled out of NASA's Vehicle Assembly Building

Healeyman

Re: Consider the glass

“I guess the question I'm asked the most often is: "When you were sitting in that capsule listening to the count-down, how did you feel?" Well, the answer to that one is easy. I felt exactly how you would feel if you were getting ready to launch and knew you were sitting on top of two million parts -- all built by the lowest bidder on a government contract.”

― John Glenn

Healeyman

The Best

Saturn V was the most elegant rocket ever; I sorely regret never catching a launch. Sorry, Starship doesn't work for me (but I also like old British sports cars so I may be warped).

Please fasten your seatbelts. A third of US air traffic control systems are 'unsustainable'

Healeyman

Re: Nothing is ever new

Correct, sir. I'll add, in case anyone's interested, that runways are generally aligned to face into the area's prevailing winds. For instance, at the SF Bay Area airports, with which I'm most familiar, most runways are designated around 30 (roughly 300 degrees magnetic), as the prevailing winds are from the northwest. In the northern hemisphere, low-pressure systems and their cold fronts move west to east, and the winds circulate counter-/anti-clockwise about the low due to the Coriolis effect, so with an approaching front the 'active' gets switched to the reciprocal runway (e.g. runway 12).
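
For the curious, the designator arithmetic is trivial; here's a toy sketch of my own (not anything official, just the heading-divided-by-ten convention):

```python
# Runway designators are the magnetic heading rounded to the nearest
# 10 degrees, divided by 10 (1..36); the reciprocal runway sits 180
# degrees around the compass rose.
def reciprocal_runway(designator: int) -> int:
    """Return the opposite-direction designator (1..36)."""
    return (designator + 18 - 1) % 36 + 1

print(reciprocal_runway(30))  # -> 12: runway 30 (300 deg) opposes runway 12 (120 deg)
print(reciprocal_runway(9))   # -> 27
```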

Linus Torvalds: 90% of AI marketing is hype

Healeyman

Re: I agree

If I have this right--I've been out of SV and tech for over a decade, but still follow--the gist of 'AI', especially LLMs, is that the content/data of the entire internet is continuously snarfed up, processed, and regurgitated by various algorithms (neural nets, etc.). Since more and more of that content is itself AI-generated, 'AI' is continuously ingesting its own output (sh*t), so misleading and outright erroneous 'info' can get amplified, concentrated, and re-regurgitated until it's nothing but unrecognizable nonsense. Is this the 'singularity' we've heard so much about?
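
You can see the feedback loop in miniature with a toy sketch of my own (nothing to do with any real training pipeline): fit a distribution to samples of its own previous output, over and over, and watch the estimate wander off.

```python
import random
import statistics

# Toy model of a system ingesting its own output: generation 0 is the
# 'real' distribution; each later generation is fitted to samples drawn
# from the previous generation's fit. With small samples the estimation
# error compounds, and the fitted mean/spread drift away from reality.
random.seed(42)

mu, sigma = 0.0, 1.0   # the 'real' distribution we start from
N = 25                 # samples per generation (small, to make drift obvious)

for gen in range(1, 21):
    samples = [random.gauss(mu, sigma) for _ in range(N)]
    mu = statistics.fmean(samples)      # refit the 'model' on its own output
    sigma = statistics.stdev(samples)
    print(f"gen {gen:2d}: mu={mu:+.3f} sigma={sigma:.3f}")
```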

The chip that changed my world – and yours

Healeyman

A Classic

I know the Z80 from reading Ciarcia's 'Circuit Cellar' and from my career--long ago--at HP. I spent years, both in school and professionally, programming on HP's 2100/1000 computers, which were initially developed to control HP's instruments (over HPIB, of course). The later version of the 1000 I worked on had a single 16-bit CPU--the earliest versions used 4 x 4-bit CPUs 'bit-sliced' together--but its power came from its architecture and its OS, called 'RTE' for Real-Time Executive.

All I/O operations were handled by all-but-standalone Z80-based microcomputers on a bus; the CPU would issue a read/write command to one and then be on its way. The I/O board--for example the serial board, with several UARTs--would send or collect the data, DMA it to main memory, and interrupt the CPU via dedicated circuitry. With this architecture it was a data-collecting machine par excellence. When HP went to RISC ('PA-RISC') for its business machines, they couldn't come close to 'real-time,' as all I/O was handled by polling relatively dumb I/O interfaces. In benchmarks, the lowest-powered HP1000s--by then on a single CPU chip--would blow the doors off the most 'powerful' PA-RISC HP3000 business machines. (You can't really call a computer that relies on polling 'real-time,' since by definition its response can't be deterministic, though with enough CPU speed it can get close.)
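
A crude way to see the difference (my own toy model, not HP's actual design; the poll period and interrupt overhead are made-up numbers): with interrupts, service latency is bounded by the interrupt path; with polling, a device can sit finished for up to a full poll period before anyone notices.

```python
import random

# Toy model: devices finish their DMA at random times. Interrupt-driven
# I/O services a completion immediately (fixed handler-entry overhead);
# polled I/O discovers it only on the CPU's next sweep of the devices.
random.seed(1)

POLL_PERIOD = 10.0    # ms between polling sweeps (assumed value)
IRQ_OVERHEAD = 0.05   # ms to save state and enter the handler (assumed)

completions = [random.uniform(0, 1000) for _ in range(10_000)]

irq_lat = [IRQ_OVERHEAD for _ in completions]                  # bounded, deterministic
poll_lat = [POLL_PERIOD - (t % POLL_PERIOD) for t in completions]  # up to a full period

print(f"interrupt: worst = {max(irq_lat):.2f} ms (constant)")
print(f"polling:   worst = {max(poll_lat):.2f} ms, "
      f"mean = {sum(poll_lat) / len(poll_lat):.2f} ms (luck of the draw)")
```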

My career at HP started in the early 80s. At the time, HP was contractually committed to supporting the HP1000 through, IIRC, 2010, as it was the primary controller for the USAF's AWACS radar-surveillance aircraft.

Silicon Valley Bank seized by officials after imploding: How this happened and why

Healeyman

Cramer Strikes Again

Jim Cramer, of 'Mad Money' infamy, was pumping SVB's parent's stock a while back. That's the 'kiss of death' for anybody who keeps an eye on this carnival barker for contrarian picks. And the company's execs were lobbying Congress to relax regulations and oversight a few years ago; the result was inevitable. (Don't worry about the execs; they cashed out a couple of days before the collapse.)

The wild world of non-C operating systems

Healeyman

REALTIME, baby.

Noob here, first comment; name's Bob (68yo retired programmer). My favorite OS, and the best overall IMO, was one called RTE--for 'Real-Time Executive'--on HP1000 minis (descendants of the HP2100). HP got into the OS business because they needed a computer to manage all the test and measurement equipment they built: signal generators (including their first product, an audio oscillator famously bought by Disney during the making of 'Fantasia'), spectrum analyzers, O-scopes, gas chromatographs ... you get the idea. And to communicate with all their devices on a single bus, HP invented the HP Interface Bus, or 'HPIB,' which later became the IEEE-488 parallel bus.
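
For anyone who wants to poke at that bus today, here's a minimal sketch using the third-party PyVISA library (the GPIB address and the commands after '*IDN?' are made-up examples; '*IDN?' itself is the standard IEEE-488.2 identification query):

```python
import pyvisa  # third-party: pip install pyvisa, plus a VISA backend

# Open the GPIB/HPIB bus and talk to an instrument at primary address 22
# (the address is an example; check your instrument's front-panel setting).
rm = pyvisa.ResourceManager()
inst = rm.open_resource("GPIB0::22::INSTR")

print(inst.query("*IDN?"))           # standard IEEE-488.2 identification query
inst.write("FREQ 1E6")               # hypothetical instrument-specific command
print(inst.query("MEAS:VOLT:DC?"))   # SCPI-style measurement query (example)

inst.close()
rm.close()
```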

The word 'realtime' gets bandied about, but there are actual criteria; basically, the machine's response has to be deterministic at all times. Obviously a response can't occur at the same instant as its stimulus, but to be considered 'realtime' the machine has to guarantee a response within a specified maximum time.

Since the 1000 was an I/O processor extraordinaire, this was accomplished by giving each I/O type--serial, parallel, etc.--its own Z80-based processor: essentially a microcomputer with its own RAM, microcode, backplane hardware, and specialized chips (e.g. UARTs). The CPU would specify an input's parameters; the I/O board would collect the input until the request was complete, DMA the data to main memory, and fire an interrupt to tell the CPU the data was ready to process. Depending on the process's priority, the CPU would save its state, service the interrupt, then restore its previous state and carry on until a higher-priority process needed servicing.

By contrast, the later RISC-based HP3000 was strictly a polling architecture: the CPU queried each I/O subsystem in turn no matter its 'priority' (it was a round-robin process handler; higher-'priority' processes merely got a larger time slice). The theory was that the RISC processor was so speedy it didn't need true realtime capability, but in testing the HP1000 'mini'--early versions used 4 x 4-bit microprocessors bit-sliced together into a 16-bit system--would routinely blow the RISCy 3000s away.

When I was at HP, 1983 to 1996, one of the 1000's applications was as the signal-processing unit for AWACS early-warning aircraft, and HP was obliged to keep supporting the 1000 into the 2000s for that use. I also worked a contract programming a bank of HP instruments over IEEE-488--initially in HP Pascal, later on a micro using LabVIEW (ugh)--for one of the GOES weather satellites' comm systems.
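
The scheduling difference in miniature (my own toy sketch; the task names, priorities, and slice length are all invented):

```python
import heapq
from collections import deque

# Toy contrast of the two schemes described above.
tasks = [("batch_report", 9), ("logging", 5), ("disk_cleanup", 7),
         ("radar_frame", 0)]   # (name, priority); lower number = more urgent
SLICE_MS = 50                  # round-robin time slice (assumed value)

# Preemptive priority: the most urgent runnable task always goes first,
# so 'radar_frame' waits no longer than one context switch.
pq = [(prio, name) for name, prio in tasks]
heapq.heapify(pq)
print("priority order:", [heapq.heappop(pq)[1] for _ in range(len(pq))])

# Round-robin: the ring is serviced in arrival order; 'radar_frame'
# waits out everyone ahead of it, however unimportant they are.
ring = deque(name for name, _ in tasks)
wait = 0
while ring[0] != "radar_frame":
    ring.rotate(-1)
    wait += SLICE_MS
print(f"round-robin: radar_frame waits {wait} ms for its first slice")
```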

I had fun in school with LISP--Emacs is largely written in it--and Forth, but the strangest language I encountered was an oddity from IBM called APL (for 'A Programming Language'). Expressions parsed right to left, and the assignment character was a left-pointing arrow; specialized keyboards were available for its Greek-looking command symbols, but we mortals had to use clunky ASCII transliterations. Still, it had a single-character command that would perform a full matrix inversion. I wonder what happened to it; it was probably too far out of the norm for most people to use.
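
That single character was '⌹' ('domino'), IIRC. The nearest everyday equivalent I can offer is NumPy (a rough translation of my own, assuming the numpy package is installed):

```python
import numpy as np

# APL:  B <- ⌹A       (monadic domino: matrix inverse, one character)
# APL:  X <- B ⌹ A    (dyadic domino: matrix divide, solve A X = B)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([1.0, 2.0])

print(np.linalg.inv(A))        # rough equivalent of monadic ⌹A
print(np.linalg.solve(A, B))   # rough equivalent of dyadic B ⌹ A
```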