Ringo said the system "worked flawlessly after the 'upgrade'"
Sounds like he tuned the system well...
With Friday upon us, the drumbeat of the working week is slowing. But we still have enough energy to bring you another instalment of On Call, The Register’s weekly reader-contributed collection of out-of-tune tech support tales. This week meet "Ringo", who runs a music technology specialism as part of a music degree at a very …
We’re currently in our holiday flat in Spain, enjoying the cold wind and grey skies.
The lovely Ivana couldn’t get the toaster to work this morning. No amount of pushing the lever down would make it latch.
On a whim, she decided that the "have you switched it off" routine would be worth trying.
So she switched the kitchen lights off.
After switching them back on, the toaster latched at the first attempt.
Toasters usually won't latch unless they're powered. With the lights likely on the same circuit, they were probably causing some kind of "dirty" power that kept the toaster from staying on.
Still something you'd probably want checked out because that's not normal and I'd be very concerned about those lights being a fire hazard.
We bought an old house a decade or two ago. The electrical service was a fuse box with assorted fuses from 15A to 40A for various lighting and appliances. About a month into ownership, the whole house went dark!
We called the power company... no outages. But they sent a truck to see if it was the feed to our house. No, that was good. They checked the two bus fuses in the fuse box; they were good too. Then they started checking individual fuses... and found a dead one.
It turned out that 90% of the house was wired through one 15A fuse.
We had the house re-wired...
"It turned out that 90% of the house was wired through one 15A fuse."
I have that issue. The kitchen and the back two bedrooms are on the same circuit breaker. If I have the toaster oven going and the space heater in my office, eventually the breaker will trip but I smell it coming as the wire insulation gets roasty. I plan to redo the siding on the house and will run several new circuits along that wall including a couple of dedicated ones in the kitchen. When I'm in full chef mode, I can have a bunch of appliances all going at once.
I had something similar in my 1930s apartment. One group powered the kitchen, the bathroom (with washing machine and optionally a dryer, which I did not have), and some other odds and ends.
I could use the washer, the dishwasher OR the coffee-maker, but never more than one of them at the same time.
When I got a new kitchen installed a few years later, I had a group added.
Only afterwards did I find out that one of the other groups powered the doorbell. Only the doorbell.
The last that I heard, nationality doesn't count as a race. This may vary. I'd say it should be, because "race" in this sense is barely a real thing anyway.
According to https://www.worldwidewords.org/topicalwords/tw-spa2.htm
British newspaper owner and criminal Robert Maxwell mangled the euphemism "old Spanish customs", possibly from American slang, into "Spanish practices". The web page proceeds to describe tolerance of forms of "traditional" routine wage fraud and extortion by British newspaper print workers, to which the term referred. This is not to say that there wasn't fault on more than one side.
However, it doesn't seem to address technical incompetence. I think it would address supervisory acceptance of work done badly as though it was done well, which isn't unknown in construction and utility work. We probably should consider that we hear stories about bad work and not about good work, but nevertheless, bad work should be spotted and be corrected, before there are stories.
The flat roof on our extension started leaking. This uncovered two Spanish practices here in Scotland. It appears to have been a self-build. First, the ceiling rose fused, which took out the kitchen lights below too; they were connected off the ceiling rose, and the wiring of it was arcane.
Second, our flat roof leaked into next door’s extension, so no proper party wall. I did of course get a new roof sharpish.
When I got repiped for the heat pump install, they found the floor was boarded in odds and ends, not proper floorboards. All done on a shoestring. The papers for the place say the council consented to the extension post facto, i.e. after all the crimes had been covered up.
"where the phrase "spanish practices" came from?"
I would have thought from the exploits of the Holy Office's Tommy Torquemada (as in "No one expects...")
I am sure if that jolly chappy had had the advantages of modern(ish) Spanish electrical cabling, there would be something more anatomical than just your toaster in the power circuit. He would have improved his inquisitorial KPIs with a reduced MTTG (mean time to guilt).
Standard practice in Germany. Don't forget, everything is on a series of 16A radials, rather than ring circuits (and neither appliances, nor lights, have fuses).
The safety and cost trade-offs between the UK and European systems are complex - but neither is per se dangerous.
Toasters are rather ingenious, actually. There isn't actually a latch as such: pulling the lever down energises an electromagnet that keeps the lever pulled down, along with the heating elements and some sort of timer. When the timer runs out, the power is interrupted, disengaging the electromagnet. No mechanical latching involved (other than the spring).
So yes, probably not enough power getting to the toaster while the lights are on. Or worse, a miswired circuit that is returning a hot leg to the neutral of the toaster (giving 0 V potential) - I've seen it happen on weird boiler installations and it's frightening.
I was sceptical about the timer but it's confirmed here https://home.howstuffworks.com/toaster.htm that one normal design is a small electric circuit containing at least a variable resistor and a capacitor. Pressing the toast lever switches the toaster on, and the electromagnet is needed for the lever to stay down and the toaster to stay on. As the capacitor charges, the electromagnet is eventually released. Another type of design depends simply on the temperature of metal parts of the toaster, since heated metal expands.
There was a bit of a viral fact/video recently that started with "the numbers on a toaster correspond to the number of minutes!", which wasn't entirely true. But the fact it wasn't true meant many people took that to mean "the dial isn't a timer". The scale isn't in minutes (it isn't in anything, really), and because it's a potentiometer/capacitor/transistor timer circuit, it's probably not entirely linear either. So it all got a bit vague and confused recently.
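For anyone curious why the dial-to-time mapping isn't linear-in-minutes, the RC-timer design described above follows the standard capacitor charging equation. A quick sketch (the component values are invented for illustration, not from any real toaster):

```python
import math

def rc_charge_time(r_ohms, c_farads, v_supply, v_threshold):
    """Time for a charging capacitor to reach v_threshold:
    t = RC * ln(Vs / (Vs - Vth))."""
    return r_ohms * c_farads * math.log(v_supply / (v_supply - v_threshold))

# A notional 100k pot and 100uF cap on a 5V supply, releasing the
# electromagnet when the cap reaches 3V:
t = rc_charge_time(100e3, 100e-6, 5.0, 3.0)
print(f"{t:.1f} s")  # roughly 9 seconds with these made-up values
```

Turning the dial changes R, and the time scales with R - so the dial is a timer of sorts, just not one calibrated in minutes, and only linear if the pot itself is linear.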
"a miswired circuit that is returning a hot leg to the neutral "
At a flat that I bought, all the electricity seemed to work OK until I was redoing the wiring (it was an old property with wiring inside wooden conduit outside the walls). It turns out that at the point where the mains wires were fed into the master fusebox, the live and neutral were switched over, so every neutral wire in the place was actually live!!!
Funny you should bring that up -- the University of Iowa here has an "IT fair" yearly (or at least they did in the past). In the 2000s it was lame, just a bunch of booths for different divisions of University ITS ("IT Services")... but in the 1990s, they'd have SGI, Sun, Microsoft, etc. there and it was pretty sweet. So I was there as a junior high/high school student in 1994 or 1995, and Microsoft had a booth showing a pre-release of Windows 95 by having it run a popcorn maker. Or not. The whole place reeked of burnt popcorn; at the booth was a blue-screened computer and this poor computer-controlled popcorn machine with charred former-popcorn and wisps of smoke puffing out of it.
There was always the Honeywell H316 "Kitchen Computer", from 1969.
Perhaps of interest to ElReg commentards, this same basic computer was used for the first Interface Message Processors (IMPs) in the early ARPANET.
https://www.vintag.es/2018/11/honeywell-kitchen-computer.html
You're not making enough attempts to plug them in. If by chance the first attempt to plug it in appears to work, this is in fact a superposition with the true connection, which requires at least two attempts to insert the plug. You need to unplug it, turn it upside down, fail to plug it in and then turn it back to plug it in properly. This is the consequence of the strange quantum properties of the USB standard.
Back ports are usually mounted directly to the mainboard and work more reliably, since their routing to the USB chip is, usually, very short and well tested. Front ports have cables going to the mainboard, and those extra contacts cause signal reflections with USB 3.0 and higher. Hence the reason those USB 3.x mainboard-to-front connectors are so small, with short pins which cannot hold the plug reliably: only with those short pins do the reflections cause less problems. Note the wording "less problems" here :D.
Extra fun: quite a few boards have extra USB ports on the back, connected via whatever chips instead of the CPU. Some of them are picky about what they like.
1: Wasn't aware of non-first language status, glad you found me to be of service. I have two 'first' languages but my 'second' languages are appalling
2: because despite not being able to use the 'grammar pedant' icon as AC, I find that ElReg commentards, possibly partly due to the site losing it's 'Britishness', are less tolerant of cliché'd corrections than they used to be and posting as AC seems to attract fewer knee-jerk downvotes (and they don't count anyway :-)
Also, apologies that it's taken away from the perfectly valid points you made!
I'll give you the "it's". Awful. It's happened a couple of times recently and only (I think) when using my phone, so I'll make a dubious claim of innocence based on auto-incorrect.
Is the other one "cliché'd"? That's an interesting argument. Then again, I've recently been in discussions with people about possessive apostrophes and given names. That is, if Mr. Jones has a car, is it Mr. Jones's car or Mr. Jone's car or Mr. Jones' car?
Hey! Maybe that's why my old wonderful IBM mouse that's big and fat and fits my hand well is so intermittent when it's plugged into my Dell Optiplex 7010 running Win 10 Pro 64-bit.
Maybe I need some type of USB hub (?) that will honor the mouse's older USB and translate it to USB 2 or 3 to the Optiplex?
What do you think? What should I try?
Thanks!!!
I have some USB hardware where one version will not recognise hubs, so devices must be plugged directly into it. The other version recognises hubs, but as soon as something is plugged into the hub it collapses under continuous interrupts. It's a documented problem, and the recommended fix is "don't use hubs".
USB can be...unpredictable.
I suspect (though have no proof) that this is a system timing problem as the OS attempts to enumerate all USB devices, hubs, etc. Some devices (I have a Sades gaming headset) do not seem to respond quickly enough for the OS to consider them present. The headset eventually works, but on boot, the OS (Linux Mint) says it's busted.
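If it is an enumeration-timing problem, the generic workaround is to poll and retry rather than trust the first probe. A minimal sketch - the `probe` callable is a stand-in for whatever actually checks for the device (lsusb output, a /dev node appearing, etc.):

```python
import time

def wait_for_device(probe, timeout_s=5.0, interval_s=0.5):
    """Poll `probe` until it returns True, or give up at the deadline."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval_s)
    return probe()  # one last look before declaring it busted
```

Slow-to-wake devices like that headset then just show up a few polls late instead of being marked absent forever at boot.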
My HP Zbook work PC does everything through a USB-3 C type connector...I have a hub-cum-Ethernet to which the external keyboard (through AT to USB adapter), camera, etc are hooked. Every now and then, this hub isn't fast enough, and then I have no keyboard and have to open the laptop and turn it off and on again. But I do like the integrated Ethernet (gigabit, natch). I am not a fan of wifi for use with Teams...my home office is wired only.
It'll be an assumption made by some engineer or driver-writer. I have a similar problem at work where small groups of computers interact through the network. As installed, the interaction within each group was via a 'private' network to which only these few computers in a specific group were physically connected. This was achieved with 100Mbit PCI cards, and the computers talked to the rest of the installation (for remote control, system updates etc.) through a second interface - usually a motherboard job.
It began to go wrong when the network to which the second interface connected was gradually upgraded to Gbit ports.
It went wronger when our first rebuild with Windows 7 (the originals were XP) kept choosing the 'wrong' interface for the private network and therefore couldn't take part in group activities.
To cut a very long story short, it seems the software didn't actually care which interface it used; it connected itself to the first one 'up'. While XP was able to force an order on interfaces which usually (but not always) did the trick, W7 didn't bother.
It all came to a head when I started installing switches for the main network which had 10Gbit uplinks; it was now possible to get a response from 'the other side of the network' faster than from a single-hop 100Mbit card connected to a 100Mbit switch, and computers began attaching themselves to the wrong group.
Eventually I realised that with a very little reconfiguration, the private traffic would happily co-exist on the main network and the secondary network wasn't necessary at all! Disconnect that, change a couple of configurations in the software and it seems to have been perfectly fine since.
And mostly (I believe) down to what us electronics types would probably call a race condition.
M.
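The race described above - software grabbing whichever interface came up first - is the sort of thing you can sidestep by selecting on subnet rather than on interface order. A minimal sketch using Python's stdlib (the addresses and subnet are invented for illustration):

```python
import ipaddress

def pick_private_address(local_addresses, private_subnet):
    """Return the local address inside the private subnet,
    regardless of the order interfaces happened to come up."""
    net = ipaddress.ip_network(private_subnet)
    for addr in local_addresses:
        if ipaddress.ip_address(addr) in net:
            return addr
    return None

# Whichever order the OS enumerates these, the private one wins:
print(pick_private_address(["10.1.2.3", "192.168.77.5"], "192.168.77.0/24"))
# prints 192.168.77.5
```

Selecting by subnet makes the choice deterministic, where "first interface up" is exactly the kind of timing-dependent behaviour that broke when the network got faster.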
I supported just such a lab for 20+ years. I remember needing hubs to connect MIDI devices from an age long gone. It was a cross your fingers moment anytime there was an update, either to the OS or the software in use. Some of the horror stories are coming back to me now, I better stop this post...
Many years ago I tried connecting a Nokia (?5130) via serial cable to an ancient laptop, at the time it was the only option available for some remote working, dialling in to work then running an AS/400 5250 terminal emulator. In theory it should have worked fine, in practice it was unreliable until I realised it only worked when the phone was on charge, or at least fully charged.
A dredge through my limited electronics knowledge suggested the likely scenario: the laptop was using 5v TTL logic, the phone was 3.3v CMOS! Once known, I just had to make sure I'd got the phone charger as well.
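That mismatch comes down to noise margins: the receiver's minimum "high" threshold versus what the transmitter can actually swing, which sags as the battery drains. A back-of-envelope check - the threshold and margin figures below are typical textbook values, not measurements from the actual hardware:

```python
def drives_reliably(v_out_high, v_in_high_min, margin=0.4):
    """True if the transmitter's high level clears the receiver's
    minimum 'high' threshold with some noise margin to spare (volts)."""
    return v_out_high - v_in_high_min >= margin

# 3.3V CMOS into a TTL input (VIH min around 2.0V): fine on charge...
print(drives_reliably(3.3, 2.0))   # True
# ...but if a sagging battery drags the output swing down near 2.2V:
print(drives_reliably(2.2, 2.0))   # False
```

Which would explain why the link only behaved with the phone on charge or fully charged.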
The next iteration of the same connection, with later hardware, was using IrDA instead of the serial cable, it worked fine until I was sat outside my tent on a Sunday, beer in hand, running some month end processes and the sun came out!
>The next iteration of the same connection, with later hardware, was using IrDA instead of the serial cable, it worked fine until I was sat outside my tent on a Sunday,
IrDA is like Wi-Fi peer-to-peer networking, but uses infrared light instead of radio signals. So "of course" it wouldn't work in the open air on a sunny day.
These days trying to explain why wireless has some capacity limitations (which translates to "sluggish speed" for the average user) has the same kinds of issues.
USB was tricky in the early days.
Our first attempt at building a USB device, would only work if plugged in directly, not via a hub.
We had to redesign when manufacturers started putting hubs on the PC motherboard.
Our second attempt worked directly, or through a hub, but was orders of magnitude faster when put through a hub.
We got it right on the third attempt, and then decided to build a bluetooth version which was a whole other world of pain.
Had to retire a perfectly good, 25-year-old RME Multiface last year because my last laptop with a PCI slot finally went out of business. Actually the interface was PCMCIA(!) but was running fine over a PCI-to-PCMCIA adapter. Even the 5m cable between the (PCMCIA) IO card and the actual interface was still original. For a long time one could use these cards via PCI-to-PCI-Express cards, but that stopped when PCI slots on motherboards became PCIe-PCI bridges, because the BIOS would not recognize the PCI-to-PCCard bridge behind the PCIe-to-PCI bridge and so could not initialize the PCMCIA bridge behind the latter.
As if the talk about IrDa and Bluetooth further up on this page wasn't enough to give all of us nightmares for the whole weekend, you had to bring PCMCIA into it as well?!
Why not just say "Ultra Wide SCSI 3" (thunder rumbles) while you're at it?
(Fun side note - a company I worked for had a mom-and-pop-shop cable manufacturer who'd solder you a cable for anything. Small runs or even single cables weren't a problem and their workshop was just up the street, so you could usually call him and place an order, then walk over and pick it up half an hour later. He'd make cables for ANYTHING - I swear, if you dreamed up, I don't know, a cross-over null-modem cable with DB9-to-SCART plugs to connect, say, a traffic light to your washing machine, he'd make it for you, and it'd follow all the relevant EMF specs to boot. He wouldn't, however, touch SCSI 2 or higher for the eloquent reason of "the pins are so damn tiny that I keep burning my fingers, and I don't like doing that".)
Given that I left the company that used his services in 1997, he wouldn't have at the time at least...
Looking at official records, it seems the guy is actually still alive (25-year-old me would have classified him as "older" already back then), but the company was shuttered about a year ago. It looks like someone who would probably be his daughter had formally taken it over by that time. Maybe she had more nimble fingers, but the market for SCSI-cables isn't what it used to be?
Not that unusual.
DB9 was used on the original IBM CGA graphics adaptor for the monitor interface.
A long time ago now, but I think the RGB were 0-5V digital signals, and the H and V sync on separate pins, but with a few resistors and maybe a couple of diodes (plenty of room inside a SCART plug) I connected my first PC (Tandy (Radio Shack) SX1000) from its CGA port to the SCART input of my TV, and had a 21inch monitor years before anyone else on a PC. Only 240 x 480 resolution IIRC, with all of 16 colours, but heck, it was 1988 (?). I seem to recall there was a setting in the BIOS to switch the CGA from 525 line to 625 line mode (480/60 to 576/50) too.
The PC is still in the bottom of a wardrobe, but I think the TV is long gone.......
"DB9 was used on the original IBM CGA graphics adaptor for the monitor interface."
That was actually a DE-9 ... the "B" or "E" is the size of the connector. The early PC crowd got the nomenclature wrong.
With that said, I've actually seen a D-sub, B-sized connector with only 9 pins, in a single line down the center ... a true DB-9. They were in some old test equipment that we were re-purposing. I have absolutely no idea why they built it with such a non-standard part ... In about 1990 I called Amphenol for spares, they told me that they made them for a limited time in the early 1970s for a government contract, and they sent me a box full of old stock, gratis (individually wrapped, complete with pins, hoods & hardware). I probably still have a couple dozen or so of each (male and female) in my junk collection. I've never seen 'em anywhere else.
A pedant of the genre might point out that the D-sub connectors are all supposed to have two rows of pins, but what else would you call the things?
Absolutely nothing wrong with SCSI.
From SCSI-2 devices in my Amiga in 1990, through UW-SCSI (ah, the Adaptec 2940UW, God's Own adapter), to U320 in the servers. Never a single issue.
Even mixing narrow 8-bit devices on a wide 16-bit chain, provided termination was done right, all was well. SCSI was superb - timed server builds - NetWare or NT - between identical systems, but one with an 8x SCSI CD-ROM and drive, the other with a 24x IDE drive (and allegedly faster UDMA disk)... SCSI won hands down. The old Tyan Thunder 2500 (dual PIII) with an Ultra-2 (80Mbyte/sec) disk subsystem happily outperformed equivalent P4 systems for video editing.
Oh for those innocent days back in 1996-7 when a 'fast' PC had a 2940UW and a Matrox Millennium graphics card.
I'm old... time for slippers and cocoa
Same here, including the Amiga experience (CD-ROM and Zip drive). The only time I recall a specific SCSI problem wasn't a SCSI problem per se, more of a firmware compatibility issue. The card wouldn't boot with the default F/W version on a Compaq server with a F/W level below some version# or other. Guess which combo I had to deal with. Unfortunately, the only box the card would go in was the wrong one! Bit of a catch-22, since the card had to go in something for its firmware to be upgraded. (It was a WFH customer even all those years ago.) Had to go to another site to do the f/w upgrade then return to site to replace the failed card, which thankfully booted first time and pulled the RAID config data from the drives, so almost Plug'n'Play :-)
I always found SCSI super reliable and easy to get going… except where a smart-alec manufacturer “knew better” and built gear that didn’t fully comply with the standard (just like the problems above with serial not using the correct +/- 3-25v etc). Almost always, problems were due to people not understanding the termination requirements, or not setting the correct IDs.
Apple in particular were bad for not fully complying with the standard. The Mac Plus, Mac Portable and PowerBook 100 did not properly terminate the bus, or supply termination power. As a result, they could be “picky” about cable length and what they would work with. The problem was exacerbated by users ignoring the instructions to power off before connecting or disconnecting SCSI devices and blowing the termination power fuse in SCSI devices. Once you had a computer that failed to supply termination power to the bus and an external drive with a blown termination power fuse, the two could not communicate.
I always liked the touch that there were in fact 2 kinds of cables, low speed and high speed. But you were not allowed to sell low speed cables….
Because low speed cables were only allowed where they were permanently moulded into the device using them (for example, a mouse, or a keyboard).
Most of the advantages of having a cheaper, lower spec cable for the things that only need slow speed. None of the consumer confusion!
I'm a bass player, so I shouldn't throw stones*....however...
Q: How can you tell if your drummer's riser is dead level?
A: They are drooling out of *both* sides of their mouth.
.
.
.
* And certainly not at my own 'trade' :-
Q: What is a bass player?
A: A bloke that hangs about with musicians.
On an ancient, but very well equipped iMac music production system (with plugins for 1000s of £s), the screen died. To keep it running as a two screen setup (it already had one external DVI port), I found an old USB to DVI converter on Amazon and, after a long search, the appropriate software driver on the internet. Worked flawlessly, and was fast enough to have the mixing console on the USB controlled screen. Whoever wrote that driver was seriously cool, not a single crash over years.
Apple have a long history of very bad USB implementations. Sometimes deliberately so.
Some models of Mac fail to meet timing (I think due to the TPM chipset) so you get stuttering over USB. Really nasty for audio - a major professional use for Macs.
And of course, El Capitan deliberately changed how Macs enumerate USB devices, swapping the order of a few of the steps.
That change bricked a great many devices, including many popular professional USB audio devices.
Apple then claimed that the USB standard didn't explicitly say that a host had to do those steps in order. I suppose that might even have been true, but it was still a stupid and evil thing to do.
A rare few USB device manufacturers released new device firmware six months or so later - which needed Windows or Linux to install, as the bootloader wouldn't enumerate on El Capitan either.
Most of the affected USB audio devices were simply trashed.
I know quite a few people who moved to Windows because of this.
Years ago I ran a company sanctioned FTP server in a colocation data center. The firewalls it sat behind were our product (AC for a reason), and had a tendency to lock up after a few weeks. The problem was once one locked, it was about 8 hours until the second, then 4 hours on the third. Sure, I could access the APC power strip and cycle the power if I caught it in time, otherwise it was a road trip to the data center to press some power buttons.
The solution was some mechanical light timers, set at 8-hour intervals. Firewalls had high-availability & load-balancing, so it was no problem to give them individually a 30-minute forced time-out once a day. Kept their re-boots fresh and I never had to make a panic run to the DC again.
"The solution was some mechanical light timers, set at 8-hour intervals. Firewalls had high-availability & load-balancing, so it was no problem to give them individually a 30-minute forced time-out once a day."
A perfectly fine solution ...until someone decides to upgrade the firewall firmware and is unaware of the timer - or just momentary forgetfulness...
If you want something that always worked, not much better than good old "bus" and "tag". Pretty standard in its day.
As for SCSI, had Compaq and Western Digital gone that route back a long time ago, we wouldn't have various incarnations of SATA/ATA/IDE to deal with, and we'd have had more than two drives easily connected a long time ago. Oh, well (*SIGH*).
My Firestick started buffering. There's actually a speed test utility in its settings. It was getting about 1 Mbps compared to my phone's WiFi speed test of 50 Mbps at the same location. Rebooting didn't help. Then I noticed that the nearby Bluetooth sender was blinking; it was in pairing mode. As soon as it was either paired with something or turned off, speed of the Firestick sprang back to normal.
The router was behaving badly. Moved the power supply away from it a bit, all good.
A friend's laptop had completely died. I asked about its recent usage. Someone had been plugging AV kit into it for a presentation. On a hunch I carefully examined all of the laptop's sockets. There were severely bent pins where someone had stuffed the wrong plug into the ethernet socket and two pins were in contact. Pulled them apart with fine pliers and it booted up.
because the mains in our building is exceptionally dirty. The laboratory device worked fine when used with a laptop, but not with a micro PC from Dell. The engineer couldn't figure it, so left the laptop used to set it up just to keep us going - job done for him. But then the laptop needed to be plugged in to charge. The machine didn't work. USB was loading and dropping device drivers like crazy. So I had a brainwave and dragged a spare UPS in from another room. The micro PC worked fine when on the UPS battery. The more expensive hubs and cables have through grounding, which created a ground loop. The cheap and nasty one didn't - route the USB through that and it was all good, and I could return the UPS to the job it was intended for.
Always craving the latest leads to a sort of upgrade spiral that doesn't actually get any work done because everyone's always trying to get 'the latest' to work.
The sad truth about software is that it doesn't wear out (it has to be designed to degrade over time to give the impression it does, but if you're in the 'thing' business you'll know what I mean). So unless there's some kind of critical change in the job, then whatever was working last week, last month, last year -- or even last decade -- will continue to work just as well now. So 'upgrading' just brings problems in its wake, in this case a problem with USB hardware and drivers not being fully tested (because it's only tested with 'the latest', of course!).
"The sad truth about software is that it doesn't wear out [...] So unless there's some kind of critical change in the job then whatever was working last week, last month, last year -- or even last decade -- will continue to work just as well now."
Sure, as long as your hardware hasn't changed and not only do you never find that you need something else, you never think "You know what else a computer could do". If even one of those statements isn't correct, your software will in fact wear out. Try running an old operating system nowadays. There may be a few hardware problems, but if you put a week of effort into it, you'll probably get something functional. Now try to do the things you do today with that system. It's not going to be as useful as it was when it was new, and most likely, you'll find problems in the old versions that nostalgia has hidden.
I have a MacPro (small) cheese grater I use as a production machine (comms are done on a MBP). I've maxed the thing out and bodged in a copy of Big Sur. It runs awesome and with a 4tb video card it does most of the things I need. I'm stretching things until I can budget for a Mac Studio or whatever the equivalent is that I'm forced into. In the mean time I need to stuff the kitty to be able to pick up a piece of land I want when it comes up on a tax sale. As long as the computer runs and can do useful things, I'll hang on to it. I've got a bunch of ancient hardware around the house and shop that gets used for stuff it can do.
"and bingo – one of them reined in the Mac Pro's blindingly fast USB speeds sufficiently." - which were slower than a budget PC's at the time.
It sounds like the sampler was using a COM port over USB, a tech that has a lot of issues on macOS old and new, as you couldn't change the speed.
I had something similar to this occur with other peripherals, and just changing the baud rate, sometimes the data bits or stop bit, in Linux or Windows and off you go.
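On a POSIX box those serial parameters can be poked from Python's stdlib; a sketch using a pty pair so it runs without real hardware (on a genuine device you'd open the actual /dev/tty* node instead):

```python
import pty
import termios

master_fd, slave_fd = pty.openpty()  # stand-in for a real serial device

attrs = termios.tcgetattr(slave_fd)
iflag, oflag, cflag, lflag, ispeed, ospeed, cc = attrs

cflag &= ~termios.CSIZE
cflag |= termios.CS8             # 8 data bits
cflag &= ~termios.PARENB         # no parity
ispeed = ospeed = termios.B9600  # baud rate, both directions

termios.tcsetattr(slave_fd, termios.TCSANOW,
                  [iflag, oflag, cflag, lflag, ispeed, ospeed, cc])
```

Much the same knobs a serial monitor or stty exposes - often the whole fix when a device and host have simply disagreed about line settings.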
It's also odd to see a hardware sampler in that era; we had 24-bit sampling on computers in the 90s and PCI XLR panels for direct input.
Can't remember for the life of me who made it, but where I worked, we had a small video studio. As part of that studio, we had a PC with a weird sound "card". I put the word "card" in quotes because the card that went into the PC was essentially a cut-down SCSI card that communicated with a rack-mounted box that had four inputs and four outputs. Both could be routed through multiple connectors, including XLR.
Apparently the idea behind this was to remove the electronic noise generated inside a PC. All I know is it sounded excellent, and even did a good job of playing games, as long as they ran under NT 4.