But surely
An ohm is the place an amp lives?
Welcome once again gentle reader to another Monday morning, and with it an instalment of Who, Me? in which Reg readers cushion your entry to the working week with tales of things going not quite right. This week meet "Nikolai" who spent some time in the mid-1980s working on some fancy bespoke communications solutions. On the …
El Reg Tesla good story here about Nikolai, but I'm wondering why the Regomizer didn't name him Henry due to the lack of induction when reading the electromagnetic field given off by the display?
Now, I'm not an electrical engineer, but wouldn't the current setting on the power supply have been a current limit? If the emulator was designed to work with a current of 2.5 A, then it shouldn't have drawn more than that even if the current limiter had a higher value - a bit like replacing a 5 A fuse in a plug with a 13 A fuse doesn't suddenly make properly working attached equipment draw 13 amps. Which means the emulator was broken before our hero increased the current limit to 5 A. The evidence is that the 5 V rail was only at 2.5 V - which would usually mean that the (normally) voltage-stabilised power supply was being forced to deliver more current than expected, the symptom being the voltage drop, which shows the current limiter was doing its job clamping the current to a maximum of 2.5 amps.
StackExchange: Electrical Engineering: Voltage Drops When Current Reaches Limit On Supply
As I say, I'm not an Electrical Engineer, so if I've missed something obvious, please be gentle.
NN
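NN's reasoning can be put in numbers with a quick sketch (an illustrative model only, assuming the emulator looks like a simple resistive load - real kit doesn't, but the clamping behaviour is the same):

```python
# Sketch of NN's reasoning, assuming a purely resistive load. A faulty
# load that tries to draw 5 A from a 5 V rail looks like 1 ohm; with the
# current limit clamped at 2.5 A, the supply can only hold
# V = I * R = 2.5 V across it -- the sagging rail NN describes.

def rail_voltage(v_set: float, r_load: float, i_limit: float) -> float:
    """Voltage actually delivered by a current-limited supply."""
    i_wanted = v_set / r_load      # current the load would draw at full voltage
    if i_wanted <= i_limit:
        return v_set               # constant-voltage mode: limit not reached
    return i_limit * r_load        # constant-current mode: voltage sags

r_fault = 5.0 / 5.0                       # hypothetical fault: 5 A demand at 5 V
print(rail_voltage(5.0, r_fault, 2.5))    # 2.5 -- the half-voltage rail
print(rail_voltage(5.0, 5.0 / 2.5, 2.5))  # 5.0 -- a healthy 2.5 A load is fine
```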
You're right. That's how I understand electricity to work too, and I grew up around electricians (mom was an engineer for Sharp; Dad worked for ITT, Grundig and Panasonic before deciding that he wanted to be his own boss and opening an electrical store). If a device drew 2.5 A before, increasing the current limit would not cause the device to overload and blow up. In fact, the device would continue to draw 2.5 A as before, and the power supply would actually run cooler due to the reduced load. I'm betting the knob still increases the voltage even when the screen is showing amperes.
Yes, I guess you're not an electrical engineer.
Nicer power supplies often do have current limiting. But most adjustable power supplies have one or more meters to show the voltage and the actual current.
Nicer supplies have two meters, but others had a single one that you had to switch between volts and amps.
This story works best with a simpler supply that you switch between seeing volts or amps. With the meter set to show amperage, turning the voltage up to get the meter to 5 would get you the rest of the story.
but wouldn't the current setting on the power supply have been a current limit, and if the emulator was designed to work with a current of 2.5 A, then it shouldn't have drawn more than that if the current limiter had a higher value - a bit like replacing a 5A fuse in a plug with a 13A fuse doesn't suddenly make the properly working attached equipment draw 13 Amps
My guess is that he saw "2.5" on the display, assumed it was the voltage, and turned up the voltage control until the display read "5.0". As the volts went up, so would the current - but not necessarily in direct proportion to volts (it probably went up faster), but that doesn't matter, it just went up as the voltage was wound up. At some point, the voltage was too high and the magic smoke escaped. The current limit probably wasn't involved at all, most likely set to max so as not to interfere with voltage regulation.
The article is actually rather misleading at the end. Had current increased linearly with voltage, then by the time it was at 5A, the voltage would also have doubled (to 10V), and the power taken would have been up to 50W - not the 12.5W stated. As above, the current probably didn't change linearly with voltage, and the magic smoke would have escaped long before that point was reached.
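Taking this commenter's figures and, purely for illustration, an ohmic load, the arithmetic looks like this:

```python
# Illustration of the commenter's point, assuming a purely ohmic load
# (real kit draws current non-linearly, as noted above).
# If 2.5 A flowed at 5 V, the equivalent resistance is R = V / I = 2 ohm.

R = 5.0 / 2.5            # 2 ohm equivalent load

# To push 5 A through that load, the voltage must double:
v_at_5A = 5.0 * R        # V = I * R = 10 V
p_at_5A = 5.0 ** 2 * R   # P = I^2 * R = 50 W

print(v_at_5A, p_at_5A)  # 10.0 50.0 -- four times the 12.5 W figure
```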
A friend told me that she had once been testing a dozen brand new, very expensive, monitors. They were the first batch off the production line and we needed to check they had been correctly built. After the tests were done, she sent them back to the manufacturing plant in Scotland for repacking and distribution. They decided to do their own basic test that the displays did still work so they could be sold on.
Plugged the first one in, and it didn't work. Plugged the second one in, and it didn't work ... All the way up to 12.
At which point they finally checked the specs and found that they had used 240V power instead of the 120V required by this model. And that they had managed to wreck all of them.
Fuses and circuit breakers have a maximum current that they are designed to break. Interesting things happen when that current is exceeded - circuit-breaker relays can be welded together, and the wire in a fuse can be vaporised into a conductive plasma. It's why fuses are often filled with sand, to suppress the plasma. High voltage relays use techniques to either suppress plasma formation (like being in an envelope filled with a gas that doesn't ionise easily, often sulfur hexafluoride), or to make the distance between the terminals in air rapidly become too great for a spark to jump, or literally have fans to blow the plasma away from the terminals.
Lightning jumps open terminals because it forms conductive plasmas.
Given a high-enough voltage, everything becomes conductive, including what remains of the fuses and circuit breakers protecting the consoles in the USS Enterprise. You don't need to suspend disbelief for that effect. The transporters, on the other hand...
> I suggest designing all electrical appliances to ensure they fail spectacularly when you plug in the wrong power supply.
A valve-based oscilloscope with a sliding 115/230 volt switch where it was not very clear which position was which - the sparks from the HT flying around to the case were quite spectacular.
Amazingly it still worked afterwards
This was the method we used on a certain printer design. If you set the switch to the wrong AC line voltage input, an electrolytic cap inside would boil and explode, stinking up the test line.
There was almost always one, because of a poor design choice on the test line to use the same US 120V outlet style for both 120V and 240V power (saving money on power cords, right...).
I suggest designing all electrical appliances to ensure they fail spectacularly when you plug in the wrong power supply.
Many did ! I've blown "one or two" over the years.
Most impressive was a large printer, and I'd just been sent to swap the supply - I wasn't responsible for buying it. Unfortunately, I never thought that they might have sent the wrong one, and after it let go in a manner that wasn't missable (it was a bit of a power-hungry beast it was driving), I looked and found that it was in fact a 110V model (no switch, different part numbers for different voltages). The guy at the maintenance company that had subbed the job out to me was "a bit apologetic" and arranged to get the right one sent out.
"if they had just used a slightly more expensive power supply that supports the whole 100-240 voltage range"
Back then, it wouldn't have been a case of "slightly more expensive". It would have been somewhere in the range from "*significantly* more expensive" to "doesn't exist"
The better power supplies had a physical 110/220 switch -- and better-still ones had some kind of locking mechanism to keep the switch from getting flipped accidentally; you could only change it with intention. Lower-end supplies only supported 110-125 VAC or thereabouts (or, I presume, the corresponding range elsewhere, but my experience is in Leftpondia).
I imagine auto-adjusting power supplies existed, but the first time I encountered one, it was a thing of wonder -- and not quite to be trusted, after many years of the sorts described above. "Are you *sure* this thing won't blow up?!?" (I can't remember when that first encounter would have been, but I'm fairly sure it was post-80s.)
I knew the name "switch-mode supply", but didn't know what that meant until just now looking into it. Given that they deal in square waves anyway, and ones with weird duty cycles at that, they must have really good filters at the end to produce the final DC.
Going out on a limb here, does that imply that needing proper sine-wave A/C input, e.g. from a UPS, is no longer the big deal it once was?
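For what it's worth, the averaging trick can be sketched with the idealised buck-converter relation (an assumption-laden simplification that ignores ripple, losses and the actual filter design):

```python
# Idealised buck-converter maths: the output LC filter averages the
# chopped square wave, so the mean output is just duty cycle times Vin.
# A sketch only -- real regulators add feedback, ripple and losses.

def buck_vout(v_in: float, duty: float) -> float:
    """Average output of an ideal buck stage at the given duty cycle."""
    assert 0.0 <= duty <= 1.0
    return v_in * duty

# The "weird duty cycles" are how regulation works: to hold the output
# steady, the controller shrinks the duty as the input voltage rises.
print(buck_vout(12.0, 5.0 / 12.0))   # ~5 V from a 12 V rail
print(buck_vout(20.0, 5.0 / 20.0))   # ~5 V from a 20 V rail
```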
They became widespread once the price came down. Automatic multimode power supplies (or licenses to use someone else's design) were still somewhat expensive in the early 90s. It was much cheaper to just use a switch for quite a long time after the designs were available.
Personally, after the first two didn't work, I'd have stopped testing until I'd passed the buck up the tree to say "Something isn't right here. They say everything's fine, but the first two I've tested failed. What do we do?"
To do 12 in a row - either someone was very stupid (so why on Earth were you letting them test new kit?!), very afraid of asking questions (management fail), or it's a rather subtle version of sabotage. Since I don't think there are any more monitors being manufactured in Scotland, I guess the sabotage succeeded!
> Please note: fuses are directional, so please ensure you install your fuse accordingly. If you are not sure on the correct direction, simply try it both ways- you will be able to hear the difference.
Well, I'm convinced!
£4,200.00 a pop? No problem, I'll just mortgage Fenchurch St station and both utilities then wire the bank to send the cash straight over. The last two hundred will be along as soon as I pass Go again.
A weak lemon drink? Why, thank you nurse. Say, you'll never believe what I just bought.
That's when I gave up buying hi-fi mags. The concrete bunker speaker installations for 'rock' solid bass I could just about believe, but the necessity of soldering every mains joint in the house wiring to reduce noise pickup and distortion escaped me. Presumably this quantum fuse fitted snugly into a standard cheap plastic mains plug...
Spent a morning rebuilding a power supply for offshore work, then a Friday lunchtime pub visit; on my return to my bench I failed to check the voltage toggle switch prior to testing, and Friday morning's work and a lot of things vented their magic smoke.
Friday afternoon repairing it was a bust (due to lunchtime), and I didn't get the task of repairing it, as at 4pm I was told I was being sent on a course that someone had dropped out of (either Montrose for a week's offshore firefighting course or Nantes for a two-week course on the Syledis positioning system we used, I forget which).
Back in the analogue world, I used to supply pencil galvanometers, which had a max current of 75 mA in most cases. These were not cheap, at around 200 pounds an item.
Some expert in a stores decided to test a batch of these with an AVO 8 - which was capable of a couple of hundred mA if I remember correctly. I think they got through 100 or so before figuring out that they had made an error...
A girl in my halls at Uni had never been away from home before to live and so the dark arts of laundry and cooking were new to her. Her mum had provided her with a handwritten set of recipes to cook whilst at university that were easy to do and things she’d eaten at home. Unfortunately her mum had forgotten something rather crucial which we only discovered when the building had to be evacuated. The recipes were written using Fahrenheit and the cooker only showed Celsius on the dial.
Never having cooked before, she set the cooker as high as it would go because it couldn't reach 430 degrees. After 15 mins she put the offending food into the scorching oven, set a timer for 20 minutes, and then went outside for a cigarette. Not long afterwards there were clouds of black/grey smoke emanating from the kitchen on our floor. Making things worse, because the window in the kitchen was open and there was a howling gale outside, it quickly started smelling everywhere on the floor, well before the smoke alarms were triggered. The fire brigade were automatically alerted, and the head porter had to call them to prevent a visit, saying it was just a burnt pasta bake.
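The numbers behind the mix-up, for anyone curious (standard Fahrenheit-to-Celsius conversion):

```python
# The unit mix-up in numbers: the recipe's 430 degrees was Fahrenheit,
# which is a perfectly ordinary oven temperature once converted.

def f_to_c(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0

print(round(f_to_c(430.0)))   # 221 -- what the Celsius dial should have read
# A domestic oven tops out around 250 C (~480 F), so chasing "430" on a
# Celsius dial means running it flat out -- roughly 800 F in recipe terms.
```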
The perils of smoking.
We were standing outside in 30-40mph winds in a temperature that wasn’t even 10 degrees Celsius. Some wag quipped that if she’d had the cigarette indoors we might not be standing outside freezing our behinds off*. The pasta thing was cremated and she’d only opened the window in the first place because it was so hot in the tiny kitchen. Little surprise it was hot given she had the oven set to maximum.
*Smoking was verboten within the entire university buildings but you could detect signs by looking for ash trays on window ledges.
"turned on the cooker .... she had then gone outside"
This is something I just cannot understand. How on earth do people manage to be ALIVE while simultaneously missing such a huge part of their brain?
You never ever EVER *EVER* walk away from a turned-on cooker. You don't need /teaching/ that, it's as obvious as breathing.
The worst smell I've come across is burnt boiled eggs, where somebody (fellow student) wandered off and forgot about them, and the water boiled away completely. I'm not a fan of burnt popcorn either, when somebody (teenager) chucks a bag in the microwave at full power and wanders off in forgetful mode.
The same teenager wandered off from a frying pan left on a hob at full whack. Luckily oil had not been put in the (clean) pan, so it was just the smell of hot metal and whatever tiny residue was left after it was last scoured (it wasn't Teflon), otherwise I suspect the fire-blanket would have had some use. I should have used the opportunity to demonstrate the Leidenfrost effect, but I wasn't in quite the right frame of mind.
> Wait till you inhale burnt rice after our floormate tried cooking rice in a rice cooker. Parents didn't include the other vital ingredient as well as rice.
> Water
A classic. Gen Mai Char? ;)
Half a clue wouldn't go amiss either.
In these parts, parent used to hand their offspring embarking on life's adventure a copy of The Commonsense Cookbook. I still had my 1970s edition until a few years ago but I had by then been seduced by Elizabeth David and even had a copy of the slightly obscure Culinary Jottings. :)
> You never ever EVER *EVER* walk away from a turned-on cooker. You don't need /teaching/ that, it's as obvious as breathing.
Funny thing, our cooker has this timer thingie, so we tell it the casserole takes 3 hours and we want it done by 18:30, set the temperature and wander off.
We even leave the kitchen, until the pinger goes, when the sourdough loaf first goes into the oven!
No sprogs around here, btw.
For a rather hair-raising example of metric/Imperial units confusion, see the Gimli Glider near-disaster.
As so often, the units thing was only one of a series of errors that all had to occur to lead to the incident. But there were also a number of improbable things that all had to go right in order to save the day -- paradoxically including one of the failures.
Once, just once, I had the deep-fat fryer on the hob so the cooker sucker sucked all the steam away. All went well until I accidentally turned the ring underneath it on.
In seconds the kitchen was full of burned plastic fumes. What an idiot I am.
(I have an engineering degree, so if you put it down to a vital experiment I suppose I could say I learned something)
It took me a while to realise you meant one of those self-contained modern deep-fat fryer things: "of course you turn the ring on, how else would it work?" and "huh, his chip pan was made of plastic?".
We got an electric rice steamer last month. Maybe I had better keep well clear of it.
I destroyed my first rice cooker that way. Put it on the cooktop while I cleaned the bench and accidentally bumped the knob. I still insist it wasn't my fault. What sort of idiot designs a control knob with 360 degrees of rotation so that the slightest nudge can set it to full power?
As someone who lives in the USA but visited the UK this summer - I'm glad that most transformers/adapters and a lot of equipment now handles anything from 100V to 250V at 50 or 60 Hz.
Things with heating elements, etc won't, but chargers, etc work just fine.
Your 220V USB charger probably works fine in Canadia, for example.
I once had a PC on loan from a company that owed me for some contract programming. They insisted on taking it back, in person, at my parents' address, with a big bloke standing in the background. The police were called but didn't want to get involved, and advised me to hand it over.
Went upstairs, flipped the switch on the back of the beige box to 120V, for it was before the days of auto-ranging PSU's, and handed it over. Big bloke demonstrated his USP and carried it out to their waiting vehicle. A colleague in the same situation did the same.
Apparently after the first one went pop, they checked the second one very carefully, before turning it on...
Our other actions on that day got us mentioned in an article in PCW.
Never did get paid. Scumbags. Opened a newspaper a few years later to find the company founder had died an untimely death. I like to think it was the delayed shock of that PC going up.
Not on expensive kit but entirely wilfully (and in a very distant past), when some electronic circuitry didn't work as intended, it could only mean one thing: it didn't have enough juice. So we ramped up the amps. And if one bench power supply (usually around 30V / 5A max) wasn't enough to make the magic smoke visible, we would stack them. And, before you try this at home, be reminded: 90V already delivers a slightly sub-pleasant sensation through the fingers.
My physics teacher ran an evening electronics class. He wanted to show us the perils of wiring an electrolytic capacitor the wrong way round. It took a surprising amount of volts and time, with us all hiding behind desks at the back of the classroom in case of shrapnel, before it obliged and went bang.
You wouldn't get away with that sort of thing today.
The physics lab was also used as a hobby electronics lab, shared by two teachers, one good (who ran the electronics lab) and one bar steward.
The report and bullet-like motion of the casing of a small transistor that had been wired across the mains on a timer switch was impressive.
This was on a par with the wag who painted the floor of the chemistry lab with nitrogen tri-iodide. The teacher had been pleasantly surprised that we had politely waited for him to enter the lab first...
But also, why the hell have the option for double the power straight into a device that, on the face of it, not only doesn't require the ability to switch to it, but also has no safety components to stop it self-destructing?
Even if it was an "off the shelf" PSU across different models - big sticker over the switch with a warning not to adjust it.
A proper bench-top power supply should be able to display both volts and amps at the same time.
I just tested a hand-built assembly over the weekend. The little bench supply I have on hand can do 30V, 10A, but is set to 24V and I get to read the live amps to see actual load taking place (0.1A on one circuit, 0.24A each on a couple others).
Hm... I wasn't getting a bench-top PSU vibe from my first reading, but rather a built in PSU.
A bench-top one would make more sense in this context though, you're right.
But I can't remember ever seeing one that didn't have current AND voltage displays.
Back in the days of 7 segment LEDs and moving needle meters, manufacturers would save money and panel space by using a little V/A switch instead of two large, expensive dedicated displays. If you didn't manage to blow up your project it was still a problem as the switch would always be in the wrong position for whatever test you were running, and the supply would be just beyond convenient reach to switch it.
I barely have a working knowledge of electricity, but I always thought amps were "drawn" rather than pushed. I've been told (for example) that adaptors need to match voltage, and need to have an equal or higher amp value than the device you're powering.
"I barely have a working knowledge of electricity, but I always thought amps were "drawn" rather than pushed."
Traditionally, that's entirely correct.
But this seems to have been kit which could be set to deliver a certain current. So when it was set to deliver a higher current it probably increased the voltage (which, assuming the resistance of the load remains the same, will also increase the amps drawn) until the selected higher current was delivered. The resulting higher power then destroyed the kit (which as someone else pointed out, could have done with a bit more protection).
When I was at secondary school one of the teachers built a sizeable analogue synthesiser - and then one day connected the power supply the wrong way round leading to loss of a lot of magic smoke :( . So now if I build something significant (rarely) I try to remember that and provide adequate protection.
" could be set to deliver a certain current."
Nah, can't see how, or the point, of that. Lowering the voltage might do it... but why? Maybe if you wanted to test the emergency cut-out circuits of something. But you'd do that by making it draw more amps, not by forcing amps at it. I don't think that's possible.
I *think* what happened in the story was the voltage was at 5 V, as it should be; the joker switched the display to amps, which happened to show 2.5, so the guy upped the voltage and broke it.
I don't know what the amp readout would've said during that - it might've gone down briefly, I guess, what with the wattage being satisfied more with amps than volts, but it probably happened quickly.
I suspect you are correct. Typically a supply has both an adjustable voltage and current limit. For most cases, the user will set the voltage and then adjust the current limit to be something a little above what the application requires. The supply will stay in "voltage mode" holding the voltage steady while only providing the amperage that the load asks for. You can turn the current knob all the way up and the supply will only provide the amperage that the load requires while in "voltage mode". Only when the load tries to draw more than the current limit does the supply switch to "current mode". Now, it won't allow the load to pull more than the current limit setting which causes the voltage to sag.
My guess is as you say, they were looking at numbers which were reading amps and they turned up the voltage knob until the device started pulling 5 amps.
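That suspected sequence is easy to sketch numerically, assuming (for illustration only) an ohmic load and a current limit parked well out of the way, so the supply stays in "voltage mode" throughout:

```python
# A sketch of the suspected sequence, with an assumed ohmic 2-ohm load
# (5 V / 2.5 A) and the current limit set well above the action. The
# single meter is switched to amps, so winding up the *voltage* knob
# makes the displayed number climb toward the expected "5.0".

R_LOAD = 2.0   # ohms: 5 V / 2.5 A

for v_knob in (5.0, 6.0, 8.0, 10.0):
    amps_displayed = v_knob / R_LOAD   # what the meter shows (I = V / R)
    watts = v_knob * amps_displayed    # what the load actually dissipates
    print(f"knob at {v_knob:4.1f} V -> meter reads {amps_displayed:3.1f} A, "
          f"load sees {watts:4.1f} W")
# By the time the meter reads "5.0", the load is eating 50 W -- double
# what a naive 5 V x 5 A reading of the numbers would suggest.
```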
Yeah, you can't "adjust" current, not without changing the characteristics of the load.
Constant current drivers for strings of LED lamps work by adjusting their output voltage to suit the number of lamps connected. I can appreciate how it must work, but it still seems like witchcraft to me.
Some bench supplies can operate either in constant voltage or constant current mode. In CV you set the voltage and the load will pull whatever current it requires (I=V/R). In CC you set the current and the PSU will push that current through the load and the PSU voltage will adjust to whatever is required by the load (V=IR). CC mode is rarely used - I only used it once in my career for biasing some weird microwave diodes. Most bench supplies have a current-limit setting that prevents the sort of problem seen by today's protagonist. It's good practice to set the current limit to a bit above what you expect the system to draw, especially if you're dicking about with it.
In this example the PSU was in CV mode and the supply was set at 5V but the display was showing the current drawn by the system, not the voltage applied. When the voltage was increased the current drawn by the system increased (I=V/R), so it looked like the voltage was being turned up. Until the kit blew up.
Exactly this. Note that it also means that at the magic smoke release point the voltage was significantly higher than the 5 volts claimed and thus the power calculation (V*I) that the article makes is wrong - it would've been a lot more than 25W
HP supplies are my favourites. I have a couple...rescued and repaired (the manuals have the schematics!).
I have noticed the cheap Chinese supplies display volts and amps, but I'm not sure I trust them completely yet. Certainly, the ones I have seen don't have any current limiting, and an overcurrent shutdown seems more likely than foldback current limiting, but I'm in the market for one, so perhaps I'll be pleasantly surprised.
Story told to me by my father, a long time ago.
One of his work colleagues, a Professor or similar, mentioned to him that the power kept tripping whenever he (the Prof.) tried to start a particular machine. Dad suggested that he connect a 100W light bulb in series to limit the starting inrush current, and switch it out (ie short it) when the machine was running to give the machine full power, thus "Soft Starting" the machine. The Prof. argued that that wouldn't work, saying that the bulb would blow because the machine would still want to draw its full current and thus overload the bulb. Dad had to patiently explain that it would work because the bulb would be in control, reducing the current, not the machine
This works because the resistance of the filament while cold is very low, but increases sharply as it heats up. That's why bulbs fail when you first turn them on - the inrush current is far higher than the steady-state current when hot. Weak points in the filament (thinner bits that form from crystallisation during use) get overloaded at switch-on and melt.
Incidentally, we have an Edward VII Coronation commemorative light bulb from 1901. Never dared try it, but it tests ok!
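Rough numbers behind the soft-start trick, assuming 240 V mains and the common rule of thumb that a tungsten filament's cold resistance is about a tenth of its hot resistance:

```python
# Rough soft-start arithmetic (assumptions: 240 V mains, and the rule of
# thumb that a tungsten filament's cold resistance is ~1/10 of hot).

V = 240.0       # mains voltage
P_BULB = 100.0  # bulb's rated power when glowing

r_hot = V ** 2 / P_BULB   # R = V^2 / P = 576 ohm when hot
r_cold = r_hot / 10.0     # ~58 ohm at switch-on (rule of thumb)

i_hot = V / r_hot         # ~0.42 A steady state
i_inrush = V / r_cold     # ~4.2 A for the first instants

# The cold bulb briefly passes ~10x its running current -- enough to let
# a machine spin up -- then its rising resistance throttles the circuit.
print(round(r_hot), round(i_inrush / i_hot))
```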
"If you push hard enough it'll draw the amps or die in the attempt."
What does pushing hard enough look like? My very rudimentary understanding is that the only way for the power supply to "push harder" is to increase the voltage. You seem to be suggesting that it's not that simple. What am I missing?
"To me, part of the blame lies with the manufacturer of said very expensive kit."
Yes, and no. Remember, this was the mid-1980s. The assumption was that only people with clues would have access to such kit. This is the time of many stories of switches set to 120 V being wired into 240 V mains, with a resulting release of the magic smoke. It's also the era when we started seeing units that would automagically adjust for most household voltages world-wide.
Mid 90s. Early in my career as a chemical engineer, so my memory of details is hazy. Had a job to do monitoring the organic chemical emissions to air of some plant or other. Had a whizzy bit of kit to log the emissions: a flame ionisation detector, which would output an electrical signal, and a datalogger in the shape of a Psion Organiser 2 - the big chunky calculator one, not the cute microlaptop Organiser 3. There was, IIRC, some kind of interface thingy on the top, and we were supplied some cableage to connect the two devices. Again, IIRC one of the plugs was an old 9 pin D plug like an Atari joystick. And the same connector was used for a power supply...
Long story short, it was physically possible to connect that up incorrectly. Which of course at some point I did. And at that point the magic blue smoke started coming out of the Organiser. So I didn't do any monitoring that day. The Organiser and the cables went back to the supplier, and were returned, free of charge I think, with some of the pins snapped off one of the plugs and some of the holes glued up on the socket... making plugging the wrong thing in there physically impossible. An elegant solution, and just a shame they hadn't foreseen the possibility before handing the thing to a wet-behind-the-ears numpty like myself.
>> And the same connector was used for a power supply...
Oddly enough the sea trials of a very expensive piece of naval hardware were interrupted because someone saw a 25 pin D connector and assumed that it would be OK to plug in a serial cable...
I guess its one of the hazards of having a multipurpose connector - it can have many purposes, one of them inevitably being "finding out who actually read the fine manual before connecting random cable A to connector B"
If I have to use cheap connectors, I prefer automotive ones like the Deutsch DT series. At least you can specify keyways to prevent connector mismatching.
I just built a box with one 8-position and three 12-position connectors (A, B, C). As long as I wire up the other side correctly, you can't swap them, so no worries.
Once had a job to supply a bunch of remote sites with a router and a modem for backup.
Both the same tubular power connector... unfortunately one was 9v and the other 18v... many 'dead on arrivals'... rapid deployment of colour-coded WARNING labels for plugs and sockets
(luckily it was the el-cheapo modem they were frying, so they had service while we rushed them YET ANOTHER modem)
The BBC used to make wide use of an XLR mains connector. It was considered pretty safe, until someone playing around discovered it was just possible to force a connection to a nominally non-compatible audio XLR connector... a certain amount of scope for mischief was available. Even after they were banned, five-pin XLRs were still used to provide (via different pins) various SELV voltages.
Well CGA, EGA,Tokenring, atari/commodore/sega/etc joystick users and many others might disagree about what that connector is for. Serial only started using it later on, so I think serial is the one that should have picked something else. I certainly do not assume it is a serial port on the machines I have around.
And IIRC, they swapped the numbering for Tx and Rx, so a 9-pin-to-25-pin serial DTE-DTE cable was 2-2 and 3-3 rather than the traditional cross-over of 2-3 and 3-2. And serial cables were the bane of all of us involved in connecting kit back then anyway, even when they used "standard" connectors. What HAD to be connected, what MUST NOT be connected and what MUST be looped back varied in almost every application :-) Which type of plug or socket was being used at each end was the least of the problems :-)
"At this point, using a 9-[pin sub-D connector for anything but a serial port is negligent design. Pick something else."
As others have already pointed out, 9-pin d-sub was in use for many different devices on computers before it got used for a serial port. Even before IBM standardised on 9-pin d-sub for serial, the Olivetti PCs (M20??) used 9 pin d-sub for both video and keyboard, one male, the other female. Yes, even PC keyboard connectors were not yet "standard" by then.
Some of the stuff I've worked with in the past looked like a display model for a connector manufacturer - all the sockets were off the shelf parts, but all sufficiently different from one another such that, unless you used the largest sledgehammer in your toolbox, you weren't going to be plugging a cable into a socket it wasn't intended to be connected to...
As a useful side effect, it also helped guide you to which socket was which provided you had a basic grasp of what size/spacing pins you'd need for mains vs 5VDC vs telecoms etc., so you weren't having to scan through the full list of sockets to find the one you were after.
I guess its one of the hazards of having a multipurpose connector - it can have many purposes, one of them inevitably being "finding out who actually read the fine manual before connecting random cable A to connector B"
Yes, also very common in the early days of standalone personal computers. 9-pin and 25-pin D plugs were common, and there was no standard yet as to which was used for what. Male or female, 9 or 25 pin, could just as easily be almost anything from a joystick to a printer to a keyboard to a video out, and whatever you think it might be, if you don't look at the manual or, if lucky, the on-case legend, very likely is NOT what you think it is. In some cases, they might even be edge connectors directly to the motherboard - e.g. the IEEE and other interfaces on a Commodore PET - which may or may not be keyed; or if it is keyed, the plug isn't!!
Reminds me of the time, back in the early 1970s, when I had a holiday job working in a photo and hi-fi shop. Quad was a very desirable hi-fi brand and, whilst their pre- and power-amps were easily sourced, their electrostatic speakers weren't. We'd managed to get a pair in stock, and a customer who was prepared to fork out a small fortune to take them back out of stock. Management insisted on demonstrating them first and, because they were so rare and expensive, us oiks were not allowed to even touch them - the store manager was going to set everything up himself. Come the big power-on and a bang as both speakers blew. Red face from the manager once he realised he'd swapped the power and audio connections to the speakers (perhaps I should explain that these speakers needed to be fed mains power as well as the audio signal) - not sure how he managed it as we weren't allowed to look.
He kept his job but I believe his pay packet was lighter that month!
"it was physically possibly to connect that up incorrectly. Which of course at some point I did"
This is exactly what Capt.(?) Edward Aloysius Murphy, Jr. was getting at with the original formulation of his eponymous Law -- not the "whatever can go wrong, will go wrong" that has come down to us, but more along the lines of "if it's possible to do a task wrong, sooner or later someone will do it that way".
In other words, Murphy wasn't being a fatalist; he was arguing for defensive design, wherein you make it *im*possible to do the thing wrong.
The short version of the story is on Wikipedia. Here's the long version.
Silly thing to do. I can't be the only person who could never use a multi-meter on an incorrect setting....... If only.....
Yesterday, I tried to measure the voltage on a small battery and wondered why it got rather hot. Fortunately, no-one was watching so it didn't happen.
Not involved-->
I never got a sensible reading when I (accidentally) tried to measure the resistance between live and neutral of my mains supply. The end result was similar to that of a University colleague who had the bright idea of getting a ~500v supply by putting two adjacent 240v sockets in series.
Of course electricians (proper ones, not DIY-Dave sort) have the equipment to measure the supply impedance, works by switching a load of a few amps on/off and correlating the voltage shift. If you try it on a cabin with a site cable feeding it you see the lights flicker during the test.
Such testing ought to be done so you know that (a) enough current will flow during an earth fault that the protection will trip fast enough to avoid a shock risk and/or cable burning out, and (b) too much current won't flow beyond what the fuse/circuit breaker is rated to interrupt safely.
> Of course electricians (proper ones, not DIY-Dave sort) have the equipment to measure the supply impedance, works by switching a load of a few amps on/off and correlating the voltage shift.
I am a very improper electrician, but I have the technology to measure supply impedance. Volt and Amp meters inline to my fusebox. At very light load I see 125 Volts (twice). On laundry day the house may suck 30 Amps and drop to 113 Volts. The 12V drop @ 30A makes 0.4 Ohms. This correlates well over a wide range of loads (limited by rounding-error of 2.5-digit digimeters) and with estimates of line length and size and metal. (And yes, 10% sag at less than rated current is poor, but the house is cheap, way out in the woods, and was probably a trailer when the wire was brought in.)
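The arithmetic above can be written out as a quick sketch (Python; the numbers are the ones from the post, and the method is just Ohm's law applied to the load step):

```python
# Estimate supply impedance from the voltage sag under a known load step.
# Figures from the post: 125 V at very light load, 113 V at ~30 A.

def supply_impedance(v_unloaded, v_loaded, i_loaded):
    """Source impedance in ohms: voltage sag divided by the current causing it."""
    return (v_unloaded - v_loaded) / i_loaded

z = supply_impedance(125.0, 113.0, 30.0)
print(f"Estimated supply impedance: {z:.2f} ohm")  # 0.40 ohm

sag_pct = 100 * (125.0 - 113.0) / 125.0
print(f"Sag at 30 A: {sag_pct:.1f}%")  # 9.6%
```

Same answer as the post: a 12 V drop at 30 A is 0.4 ohms, and just under 10% sag.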
> Such testing ought to be done so you know that (a) enough current will flow during an earth fault that the protection will trip fast enough to avoid a shock risk and/or cable burning out, and (b) too much current won't flow beyond what the fuse/circuit breaker is rated to interrupt safely.
"Earth fault" as in to-dirt? Can't happen here. My best dirt-rod is 120 ohms to remote dirt (dry sand over shallow rock). Four of those is 30 Ohms and 30r at 120V is just 4 Amps. Never going to heat a wire or blow a house fuse/breaker. (Will trip a GFI/RCB, proven repeatedly.)
I don't think ANYbody tests Interrupting Current on site. My sub-breakers are rated 22,000 Amps. That is a MAJOR arc-flash. Not a problem here because with my long line I can not pull even 1,000 Amps at my dooryard. But in Texas you can find four homes clustered within 20'(7m) of a 100KVA transformer, which is why they don't make 10,000A-interrupt breakers any more.
> "Earth fault" as in to-dirt? Can't happen here.
Here earth = ground = CPC (circuit protective conductor). Most UK installations are on TN earthing so a fault to earth is essentially close to a fault to neutral (= cold in USA parlance). To protect against contact with true Earth (as in some poor sod touching a live cable) we use RCD = GFCI, as they (in the UK) are usually 30mA trip threshold (actual spec is 15-30mA).
> I don't think ANYbody tests Interrupting Current on site. My sub-breakers are rated 22,000 Amps. That is a MAJOR arc-flash. Not a problem here because with my long line I can not pull even 1,000 Amps at my dooryard. But in Texas you can find four homes clustered within 20'(7m) of a 100KVA transformer, which is why they don't make 10,000A-interrupt breakers any more.
Nobody tests actual fault current intentionally, outside of approval laboratories that have controlled sources of many kA and a means to contain the usual explosion that follows!
Here in the UK we have less of an arc-flash risk in general due to a love of HRC fuses at the incoming point of most small-medium sized installations. They do a much better job of limiting fault energy than most MCCB/ACB styles of breakers. In domestic UK it is unusual to see above 3kA fault currents, except in some cities where the LV grid is actually a grid; most LV networks here are deliberately smaller and isolated to keep fault levels down. Our domestic panel = CU (consumer unit) is designed to be safe to 16kA fault current by the combination of the supply company's fuse (typically 60-100A) and the MCB used. Basically the fuse limits the peak current to something like a 4-6kA equivalent, even for underlying supplies of 16kA or a bit more.
Fuses are good! Cheap, reliable, with excellent fault energy limiting. But one-time, and usually need skilled replacement.
"a fault to earth is essentially close to a fault to neutral"
I have an old clothes iron in the museum that has a 'heating' indicator composed of a torch bulb in the handle. It connects across neutral to earth, usually about 4V in the UK. Try it now and it always trips the RCD; they only had fuses when it was made.
"500v supply by putting two adjacent 240v sockets in series." - Ouch, I see exactly how the thought process of your colleague went on this because if you think about it from a certain point of view it makes sense.
Sadly from another point of view connecting the live of one socket to the neutral of another would not have the desired effect!
I have also got to wonder why anyone was trying to get 500V from the mains. Wanting double heat from a portable heater maybe?
In between single phase and 3-phase, there is something called split phase. I've been offered it for a rural new build that, being all electric, needs more than a single phase can supply. I guess the DNO takes a single-phase overhead supply and uses a centre-tapped transformer to give "two" phases.
That'd get you your 500V!
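For the curious, why two sockets in series gives you nothing - and what a second phase or split phase would actually give you - falls out of a quick phasor calculation. A Python sketch, assuming a nominal 240V RMS supply:

```python
import cmath
import math

V = 240.0  # nominal UK phase voltage, RMS

def live_to_live(angle_deg):
    """RMS voltage between the lives of two sockets whose phases differ by angle_deg."""
    a = cmath.rect(V, 0.0)
    b = cmath.rect(V, math.radians(angle_deg))
    return abs(a - b)

print(f"{live_to_live(0):.0f} V")    # same phase: 0 V -- why the series trick fails
print(f"{live_to_live(120):.0f} V")  # two phases of a 3-phase supply: ~416 V
print(f"{live_to_live(180):.0f} V")  # split phase (centre-tapped): 480 V
```

Two adjacent sockets are almost always on the same phase, so the "500V" plan yields zero; split phase, being true antiphase, is what gets you (nearly) there.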
I'm sure I have posted this before but when working at an about to open fruit store, we discovered the hard way that not all TVs have universal power supplies. It was for a window display and had been shipped from the US *with* a power converter but was plugged directly in to 230V. The most remarkable thing is that it lasted nearly a whole day before failing. A replacement was sent overnight from another store at great expense.
> By doubling the amps, Nikolai had sent 25W to the emulator, blowing its little mind.
Huh? This makes no sense. The ammeter is measuring the current drawn by the device, and while the PSU doubtless has a maximum limit (possibly even configurable on a bench PSU) you can't force the device to draw more current than it wants by changing something in the PSU (notwithstanding a PSU in CC mode, but (a) if it had been set to CC mode it would almost certainly have blown the device up already, and (b) in CC mode you're still changing the voltage (albeit automatically), not the current).
Presumably what actually happened is, not realising the display was showing the current draw rather than voltage set, Nikolai turned up the /voltage/.
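A toy model makes the point: with a constant-voltage supply and a resistive load, raising the current limit changes nothing, while raising the voltage does the damage. This is a minimal sketch assuming an ideal CV/CC bench supply and a hypothetical 2-ohm load (the real emulator's load isn't given in the story):

```python
def psu_output(v_set, i_limit, r_load):
    """Steady-state (volts, amps) for an ideal CV/CC bench supply into a resistor.

    In CV mode the load draws V/R regardless of the limit; the supply only
    drops into CC mode (clamping current, sagging voltage) when V/R would
    exceed the limit.
    """
    if v_set / r_load <= i_limit:
        return v_set, v_set / r_load      # CV mode: the current limit is irrelevant
    return i_limit * r_load, i_limit      # CC mode: voltage sags to hold the limit

# Hypothetical 2-ohm load standing in for the emulator (value assumed):
print(psu_output(5.0, 2.5, 2.0))   # (5.0, 2.5): draws 2.5 A at 5 V
print(psu_output(5.0, 5.0, 2.0))   # (5.0, 2.5): doubling the limit changes nothing
print(psu_output(10.0, 5.0, 2.0))  # (10.0, 5.0): doubling the *voltage* quadruples the power
```

Which is consistent with the commenters' reading: turning the knob must have raised the voltage, not "sent more amps".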
And yeah, I remember working with wretched ICE CPU emulators back in the day. The things would flake out at the drop of a hat, and more time was spent wiggling hundreds-of-pin connectors to try and make the damn things work than doing any actual development. (The ones I used to work with came in stacks of 3 or 4 instrument units each connected by thick bus cables - I was always told, possibly apocryphally, that this was so they could be shipped as separate units each of which would fit down the hatch of a submarine. Why you would want a CPU emulator in a submarine is left as an exercise for the more sceptical than I was in those days...)
Yeah, I'm also rather pleased that the days of pizza-box sized ICE units are now just a rapidly fading memory of my early years in the industry.
Bad enough these days when you burn out a JTAG probe and have to pause what you're doing for a few minutes to go dig another one out of the big pile of spares in your cupboard of useful bits (and then remember to order another one to keep your spares pile topped up), back then it was "make apologetic phone call to the distributor and see if we can sweet-talk them into shipping us a FOC replacement ASAP, pretty please, BTW did I mention last time we spoke about the new project where we're planning to use another 50K parts a year..."
Once, many years back, had a colleague set a desktop PC PSU from 230V to 110V to see what would happen. "What happened" was the PSU emitting lots of sparks and magic blue smoke when I plugged in the power lead, in close proximity to my head. Replacement PSU and the PC was fine, and from that point on I was always paranoid about any power supply with switchable voltage...
Also a long time ago I wanted to plug in a radio. Stuff didn't always come with fitted plugs then, and there were none lying around in the barracks, so I did the normal (for the time) thing of splitting the two leads and stuffing them in the socket. Unfortunately, being naive/stupid and lucky, I used a pair of narrow-nosed pliers to force them in in parallel. Common or Garden black smoke ensued and the wall-mounted conduit rattled loud enough to annoy other residents more than the radio would have done. I did learn enough to reach old age with only the odd shock/tingle since.
Flick the voltage switch to 110V while it's still running, thought a business client of ours one day.
After the smoke cleared he brought it in to us; the PSU was replaced, but the MBR on the HDD was screwed, resulting in an expensive (for him) new HDD, Windows install & data recovery to the new HDD.
Sometimes you need to do stupid things to get your brain to learn to avoid similar things in the future. In my case a colleague and I were doing some troubleshooting, and I decided to check that the 230V supply really was 230V and we didn't have a dodgy cable or something. Grabbing the multimeter, setting it to 300V and doing a quick measurement phase to neutral. After pulling my head out of the roof, having set a new record height for a standing jump, I realised that it would have been a good idea to check that the cables were in the sockets for voltage and not current measurements... No harm was done apart from vaporising the probe tips, blowing the multimeter fuse and one of the office fuses. To this day I have a printout of the resulting transient on my wall to remind me not to do something like that again. Unfortunately the measurement bottomed out at 86A, so it would have been fun to know what the actual short-circuit current was. As a fun note, the actual short circuit lasted for about 1/100 of a second, but that and a couple of hundred amps was enough to vaporise something like 5mm of probe tips. Nowadays I always double-check that the multimeter cables are correctly connected.
It's always worth standing back a bit when using a multimeter. Our 240V AC at 50 Hz is the RMS voltage, so the peak is rather higher...... Also, if you are not sure of the source, or if it is a supply from an inverter, the waveform is non-sinusoidal and many (cheap) meters won't register the correct RMS voltage anyway. If you can get an oscilloscope-type observation you learn to be very respectful of 'live' or 'dead' connections.
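To put numbers on it: the peak of a 240V RMS sine is about 340V, and a cheap average-responding meter calibrated for sine waves mis-reads anything else. A rough Python sketch (the square wave is just an illustrative case; real inverter waveforms vary):

```python
import math

rms = 240.0
peak = rms * math.sqrt(2)
print(f"Peak of a {rms:.0f} V RMS sine: {peak:.0f} V")  # ~339 V

# Average-responding meters measure mean(|v|) and scale by the sine form
# factor pi/(2*sqrt(2)) ~ 1.11. On a square wave that scaling is wrong:
# true RMS equals the amplitude, but the meter over-reads by ~11%.
amp = 240.0
true_rms = amp                                      # RMS of a square wave = amplitude
avg_reading = amp * (math.pi / (2 * math.sqrt(2)))  # sine form factor misapplied
print(f"True RMS {true_rms:.0f} V, average-responding meter shows {avg_reading:.0f} V")
```

Hence the advice: for anything non-sinusoidal, you want a true-RMS meter or a scope.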
When my Atari ST died I decided to have a poke around inside to see if I could "fix" it.
I popped the croc clip of my Dad's old analog meter onto the heatsink of the power supply, assuming that was earthed. It very definitely wasn't....
I just remember the room lighting up and me shrieking in pain bringing the rest of the household running. Left a little melted weld spot on the heatsink and blew the 13A fuse in the plug.
A lesson I will never forget.
Electrical contractor says minor interruption & doesn't mention to fully disconnect equipment from mains supply - famous last words before the smell of burnt electronics.
An example from my experience - it didn't happen in a datacentre, but it could have:
While upgrading the power from the grid to the meter board from single phase to three phase, the licensed contractor approved by the power company (retail/network/supplier) got the wires mixed up, so 415V went across circuits designed for 230V-240V. Anything connected with an ON/standby switch was damaged. A very costly & time-consuming mistake, even if some items were covered by our/their insurance.
When I was an engineering intern at a very small company, I was doing some board level troubleshooting and using our one and only Fluke multimeter in milliAmp mode to measure some currents. When I stepped away, an experienced engineer came and borrowed the meter to check the 480 V power supply to a milling machine. He changed the dial from mA to VAC mode and put the test leads on the supply wires. This immediately resulted in a flash, a loud bang, and bits of the face of the multimeter hitting the ground several metres away. It was not repairable.
He had, of course, neglected to move the red lead from the ammeter side to the voltmeter side, causing the meter to be a dead short. He tried to blame it on me, but nobody fell for that.
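The reason this fails so violently: the current inputs sit behind a very low-value shunt, so across a supply the meter is effectively a dead short. A back-of-envelope sketch (Python; the 0.1-ohm shunt and source impedance are assumed illustrative values, not any meter's actual specs):

```python
# A meter left wired for current measurement presents its shunt resistance
# to the circuit. Across a stiff 480 V supply that is a bolted fault.

def prospective_current(v_supply, r_shunt, r_source):
    """Current the circuit tries to push through the meter's shunt."""
    return v_supply / (r_shunt + r_source)

i = prospective_current(480.0, 0.1, 0.05)  # assumed 0.1-ohm shunt, stiff source
print(f"{i:.0f} A through an input rated for milliamps")  # 3200 A
```

The input fuse (if fitted and adequately rated) is all that stands between that and the flash-and-bang described above.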
There's always somebody. Just two weeks ago I was the only tech who turned up on site with any tools - THREE DAYS IN A ROW!. The others kept snaffling my tools, one of which was a multi-head spanner which they proceeded to completely wreck by not applying it to the bolt heads correctly, and just wrenching away until they killed the spanner.
Back when IBM-compatible monochrome monitors were a thing (remember those? green and very slow phosphor - well, at least it did not flicker), I encountered a warning on a third-party extended graphics card (one that could drive both monochrome and CGA monitors with more colours and resolution than IBM's offering - this was before VGA and even EGA) that one should never use out-of-spec modes with a monochrome monitor, else it may break. So I did not. I have ever since wondered if this actually happened to anyone.
Ever wondered why the official secrets act covers so much of what the government does, and why everything done by the government is so expensive and takes so long............
I can't name the place or what we were doing, or who for....but we were young and fresh.......... and stupid.......... very very stupid.
So take 3 apprentices........ a box of large electrolytic capacitors........... yeah we know where this is going................ add some wire... a switch and a large ac power supply.
After letting out the magic smoke, we had 3 deaf apprentices, no capacitors, or power supply and a very annoyed technical bloke responsible for what we were doing.
"YOU KNOW HOW MUCH THESE THINGS COST?" was one phrase I remember being shouted at us..
We ended up at a Motörhead gig that night wondering why Lemmy was so quiet.
"We ended up at a Motörhead gig that night wondering why Lemmy was so quiet"
In the young and stupid department, there's the Hawkwind (Lemmy's previous band) gig that I walked out of with my ears ringing. That was in the late 80s -- and they haven't stopped ringing since.