Dictionaries be damned. Leverage is a verb if enough people use it that way. That's how language evolves.
I can't remember whether it was Boris or Hancock who first used the phrase, but when I first heard one of them say "world-class" I positively cringed.
If you've spent time in a large organisation, you know that there are jargon phrases that mark someone out as a somewhat clueless "manager", rather than a doer who actually has any idea of what's going on. "World class" is most definitely one of those. It means nothing, and shows that the person saying it has no idea of the challenges involved.
It's also, considering the woeful delivery record of the Civil Service under governments of all colours for the last half-century, a triumph of hope over experience, but that's another issue.
(Or, possibly, "Start Chopping..." - memory is fallible, and it's been nearly half a century. But I agree with the article linked to below by @Jake - the slang term "scram" seems by far the most likely origin. It was certainly sufficiently in use around the time to have even reached the UK; my father - UK career soldier - used it regularly, when I was under his feet.)
Back in the early 70s the "perceived wisdom" was that SCRAM stood for "Start Cutting Right Away Man" (vide the instructions for the Atari game of the same name). In reality, I suspect that there's a 99.9% chance that the name came first, and every "explanation" is, basically, a backronym.
"But, Mr Dent, the plans have been available in the local planning office for the last nine months."
"Oh yes, well as soon as I heard I went straight round to see them, yesterday afternoon. You hadn’t exactly gone out of your way to call attention to them had you? I mean like actually telling anybody or anything."
“But the plans were on display…”
“On display? I eventually had to go down to the cellar to find them.”
“That’s the display department.”
“With a flashlight.”
“Ah, well, the lights had probably gone.”
“So had the stairs.”
“But look, you found the notice, didn’t you?”
“Yes,” said Arthur, “yes I did. It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying ‘Beware of the Leopard’.”
I suffered the consequences of failure to engage brain, once. For about 6 months, to be precise.
On the site where I worked, a removable IBM disk pack (this is around 1980, so we're talking a 3330 or 3350, I guess) stubbornly refused to read when mounted on the drive. Standard procedure was to try either a different pack in the drive, or the pack on a different drive, to see where the problem was. The keen folk on the floor that night decided to do both. They tried more packs. No luck. They tried other drives. No luck. Unfortunately - they followed the book and kept going.
Finally they thought to check one of the "fresh" packs they'd been using on one of the "clean" drives they'd been using. And that didn't work either. At which point it began to dawn on them that one of the two original components had had a catastrophic mechanical failure, wrecking both the disk pack and the drive. And every time they'd move a pack around, it had wrecked the drive they put it on. And every pack they'd put on THAT drive had been similarly wrecked. And so on, like a plague.
I seem to remember that they'd accounted for about half the drives in the machine room by that point, and a significant portion of the primary copies of our operational business data. And we spent weeks in disaster recovery mode afterwards, trying to rebuild the data we'd lost from tape backups, running forward recoveries on fewer drives than usual, and generally suffering. Oh, and we couldn't just phone IBM and ask them to ship us more drives urgently, because this all happened on an IBM internal site. And senior management took the perfectly understandable view that it was an internal problem to be sorted, and they weren't going to divert new hardware that was already committed to (paying) customers.
At that point in time it would almost always have been a secretary doing that kind of job, and secretaries were almost invariably female. It's not the OP's fault if historical real life proves to be inconveniently insensitive to gender issues.
Plus I note that, as reported by the OP, the techie describing the incident was also female - both of which you fail to note, and which rather destroy the picture of gender stereotyping that you'd clearly like to paint.
Except that this isn't a case of "stupid users". Most times something like this happens, it's design failure. You, being tech-savvy, assumed that your users would understand, and behave in a certain way; they, not being tech-savvy, didn't. Your failure, not theirs. You should have either built things differently, or ensured that everyone had adequate training.
Two stories I was told from inside IBM.
Firstly, the lab doing usability testing on a new system, with a user who'd never used a computer before (in a room on his own, being observed and filmed). He struggled through the first few instructions to get the computer turned on OK. Then he came to "Take the floppy disk out of its sleeve". Now - for whatever reason, he'd already done that some time before, but didn't remember or understand that he had. So he takes a good look at the floppy, and decides that the "sleeve" is the protective cardboard layer either side of the recording medium. He then spends 20 minutes trying to work out how to get the floppy out. After much effort and swearing, and with some considerable ingenuity, he removes the magnetic disk. He goes back to his instructions, turns the page, and finds... "Hold the floppy disk by the corner..."
The other was a guy who simply sat there, staring at the screen, despite people occasionally gently prompting him "When the instructions come up on the screen, do what they say". Instructions duly appear... he sits there. Verbal prompt. He nods and agrees, but still just sits there. Prompt, nod, nothing. Again. After a while, everyone gives up, and the testers go in to find out why he didn't follow the instructions in front of him. "WHAT instructions?!?" "Those!" "I don't see anything..." Coloured text on a coloured screen; colour-blind user. Epic fail, as they used to say.
Despite a plethora of genuine talent at the professional level, IBM management has been largely relying on old, cash-cow revenue streams and auto-cannibalism for the last 30 years - as anyone who has actually worked for the company would have had to be blinkered not to notice.
All that's happening is that it's reaching the point where it's sufficiently obvious that even the most uninterested investors are finally beginning to catch on.
>I expect this has plenty more to do with salary than age.
Quite probably. IBM's been all about the bottom line, the share price and the Board's stock options (not necessarily in that order) for the last 30 years. Unfortunately for IBM, that doesn't work as an excuse (certainly not here in the UK, at least). It doesn't matter what your primary motivation was; if the effect is discriminatory, it's illegal.
...is that I don't care what the xTTC infrastructure is; what matters is whether it can deliver more than my poor old twisted copper pair can cope with (which is, frankly, pretty abysmal). So if someone wants to sell me an "upgrade", it had better improve my current - and future potential - throughput. Expecting me to accept FTTC as "fibre", and somehow "better", in that context is... ...bovine effluent.
First, full disclosure - I'm male, so every "you" in what follows is from second-hand observation when it comes to maternity absence, rather than personal experience. But it's repeated observation - I saw similar things happen several times.
In my time at the coal-face of IBM, I saw maternity leave have an effect quite a few times (I saw illness have the same effect, irrespective of gender). The problem as I see it is that, when you're technical and skilled, you normally can't walk away for an extended period (for maternity or any other reason) then simply slide back into your old role. Your skills are a part of what needs to be done in your area, and if you're not there to provide them, someone else will be found to do so.

And, frankly, that someone is not normally simply going to be moved back out of position when you return. You can't even necessarily expect to slide straight into an equivalent role, however willing in principle your management chain are - if there's a job there that really needs doing, again, probably someone is already doing it. So, quite possibly, you'll slide back into either a period of somewhat make-weight work, or into a minor change of career direction: doing something similar to what you were doing before, but in a different context, needing a subtly different set of skills - which you are undoubtedly capable of gearing up to, but which will take time, during which you're likely not to be seen as working at your full potential.

Net result - slightly lower assessments, on average, than the colleagues who've stayed in role and haven't had time off. No-one has done anything wrong, or improper - it's just a consequence of the business world needing to carry on turning in your absence.
...it goes to show that apps DO matter. Very much. And that, basically, there are precious few that work better on your wrist than on a bigger device - which is what I've been saying since day one of the "smart watch" hype. The world of apps was already pretty mature when that train hit the tracks, and no-one was able to point at anything obviously massively and generally useful that would justify a rather small screen permanently strapped to each of our wrists (except for the rather large number of hours each day when it was recharging, of course). And without that non-niche, killer app that we all "need", all you'll ever have, basically, is a rather expensive, high-tech, high-function timepiece.
Technology is a wonderful thing - but I'm old enough to remember the start of the digital age, and I've lost track of the number of "clever" devices (digital and otherwise) I've seen hyped on the market that died without trace - because no-one actually bought them. Just because you can make something, that doesn't mean anyone needs it.
"I've never understood why anyone would trust IBM, look what they do to their employees."
Because at one time they were one of the BEST companies to work for, attracted THE best people, and made some pretty decent kit to boot. There was truth in the old "Nobody ever got fired for buying IBM", because the company had a rep and an ethos. You had a problem with your IBM solution? They'd fix it.
That all changed when Gerstner and the bean-counters took over. IBM still has some seriously good people working for it, but the higher echelons don't give a rodent's rectum for their staff, or even the business, provided the numbers "add up" at the end of the month/quarter/year, they can make their packet, satisfy the shareholders and vest their not-inconsiderable stock options.
The problem now is inertia. When you've been building your core systems on IBM for decades, moving isn't just a trivial matter of throwing a few thousand and a couple of weeks at a migration; it's a decade's work. You need a long-term strategy, and sufficient continuity of management to actually see it through (and few companies have that - usually the next bunch of guys not only don't keep the strategy but actively want to chuck it out and do something different ("after all, if you're not tweaking what your predecessors did, you're not 'managing', are you?")). Often, it's easier, cheaper (and, yes, technically better) in the short term to just keep building at least parts of your new stuff on the same platforms that everything else sits on.
I wouldn't advise anyone to buy a used match from IBM nowadays.
- (ex) career IBMer
I'm afraid that's been pretty much the case for decades. In my perception, it's been a long time since the folk at the top were more interested in keeping up the company's long-term health than in vesting their extensive stock options and milking it for what they could get. IBM is moribund, and has been for years; it's just taking a long time to die. And probably still will.
"Inundated By Morons"?
Wildly unfair. Quite the opposite. Onshore, at least, the professional ranks are still full of very bright, dedicated individuals, doing their best in "trying" circumstances. And - whilst they were hell to work with - it's hardly the fault of the offshore guys that they've never been exposed to the proper ethos and levels of appropriate training, either - bright people with good qualifications, dragged in to "body shop".
"Instructed By Morons", or "Imbecilic Bloody Management" now - those are VERY different.
I still meet quite regularly with ex-colleagues still working at IBM; the general attitude I sense is one of slightly depressed fatalism. The younger ones are probably more upbeat, but then again, they'll find it comparatively easy to get new jobs if pushed. The older ones are pretty much just keeping their heads down and hoping the problem will keep going away.
>If the data shows that risk decreases with moderate intake and other analysis shows strong correlation, then the statement "risk decreases with moderate intake" is a correct statement.
I'm sure that's true, but it's a very technically-worded statement, and open to huge misinterpretation (especially by media hacks, who rarely understand the science and simply want a good headline and story).
What it says is that it has been observed that, ON AVERAGE, people who drink in moderation are less likely to suffer from the said diseases than those who don't drink at all. It says nothing about the "why". And what it most definitely does NOT say is "Scientists prove that a little drink is good for you". But that, of course, is the next day's headline.
"Correlation does not imply causation", and all that. "Weather improves with ice cream sales" is also true - but you won't bring on a sunny day by buying a few tonnes of the stuff.
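To make the ice cream point concrete, here's a minimal, purely illustrative simulation (every number in it is invented for the example): two variables that never influence each other still correlate strongly, because both are driven by a hidden common cause - temperature.

```python
# Illustrative only: ice cream sales and sunshine hours are both driven by
# temperature (a common cause), so they correlate strongly even though
# neither one causes the other.
import random

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hidden common cause: daily temperature (made-up range, degrees C)
temps = [random.uniform(5, 30) for _ in range(1000)]

# Each observed variable depends on temperature plus its own independent noise
ice_cream_sales = [10 * t + random.gauss(0, 20) for t in temps]
sunshine_hours  = [0.3 * t + random.gauss(0, 1.5) for t in temps]

r = pearson(ice_cream_sales, sunshine_hours)
print(f"correlation: {r:.2f}")  # strongly positive, with zero causation either way
```

The correlation comes out well above 0.5, yet deleting every ice cream van on Earth would not change the weather - exactly the trap the headline writers fall into.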
Yeah, but at the time they also carried cameras, Walkmans, GPS units and more. Who does that any more? They're all subsumed into a single device now. Plenty of us don't even wear a watch any more, other than as a style statement. Bear in mind that, realistically, people don't actually have phones; that was just the evolutionary route. What they have are portable computers with more processing power than the supercomputers of a generation ago, that have replaced a whole raft of devices, phones among them. So I'm still waiting for even the hint of that "killer app" that will make everyone really want to buy an extra device and strap it to their wrists again - and, seriously, right now I don't think it exists. There's almost nothing that a smart watch can do, that a smart phone can't do as well, and often with better usability. Market penetration is laughable; there are, literally, more mobile devices out there right now than there are people alive - over 7.2bn at the last count. And smart watches have managed to sell a few million? Absolute chickenfeed; as close to zero as makes no difference. They have their uses, but basically these things are likely to remain niche.
Getting from the major road is even easier than now - you use a driver who is rested rather than one who has spent the last n hours concentrating on the (almost unchanging) road around them.
No, you use a driver who has just spent n hours of mind-numbing boredom staring at the back of the vehicle in front with no more variety than twiddling the steering wheel enough to keep that vehicle much closer in front of him than he is likely to be comfortable with.
I really hope the following vehicles have some sort of driver alertness monitor, because I don't want to be anywhere near that convoy after 3 or 4 hours on the road if it hasn't.
Great! And how do they get to and from the "major roads", I wonder?
Apparently, the slaved lorries still have drivers, who are responsible for steering while they're "platooning". I presume that they'll simply take over and drive as normal at other times.
Can't say I'm wildly enamoured; it sounds a classic case of some bright spark in a lab getting unrealistically enthusiastic, and selling the idea to a government department that doesn't know any better.
"Anyone who tends to do this want to pop on an anonymous mask and give me a hand?"
Serious answer. I'm in my early 60s, and I don't do this. But I do have a suspicion - I think it has a lot to do with physical capability, reaction times, caution and confidence (rather like the well-documented tendency of elderly drivers to cling to the middle of the road). My own years of always driving like a bat out of hell, if they ever existed, are long behind me; now it's all about comfort zones.

On dual carriageways, for example, I know that when I'm feeling wide awake and alert, and the road is reasonably empty, I can struggle to stop my speed straying well over the limit. Whereas if I'm very tired, I'm much more likely to potter along in the inside lane at a good 10mph under the limit - it may take longer to get places, but it's what feels comfortable and within my ability to react at the time (and I know that my reaction times aren't what they used to be - and if I ever forget that, I only have to go play a video game or two against someone younger to get my nose rubbed in the fact).

And personal observation somewhat bears that out; for example, I remember a couple of elderly relatives of my wife, both of whom were medically fit and deemed competent to drive, but who in practice could be downright scary to be in the car with at "higher" speeds (which, in one of the two cases, by the time they stopped driving, was anything over about 25mph). I've noticed the same sort of thing with other, elderly friends, too - mostly, the older, the slower. So it could well be that those "35 mph everywhere" drivers are actually reckless, devil-may-care elderly speed-freaks, utterly ignoring the limits and belting along at what passes, in their cases, for comfortably flat out...
"The placebo effect is indeed clinically real. The real wtf thing is that there is still clinically measurable effect when the patient is told it is a placebo."
I've heard about this; it seems likely to be the Hawthorne effect. Just receiving attention has an effect in its own right.
You never know when you're going to want to do something like that. Plus you know everything about the ISS. So (a) practice it, and (b) find out how much information you can actually gather that way with that sat, and what your analysts can make of it - and what gets missed, as well. Why wouldn't you? Plus it's a great chance to eyeball the outside for damage.
Initial stages?!? The company hit the iceberg over 2 decades ago; the officers have been throwing crew overboard in an effort to lighten the load for years whilst pretending that everything's normal. It's just taken the passengers a bit longer than usual to spot the signs that it's been terminally holed below the waterline, and - despite its supposed unsinkability - is going nowhere but under.
"Trump has stated before that AGW is false."
that's because it _IS_ false. what part of "CO2 does not absorb infrared energy corresponding to normal earth temperatures" (and therefore can NOT be the 'greenhouse gas' everyone hopes it would be, so they can CONTROL PEOPLE'S BEHAVIOR in the name of reducing it) is NOT obvious here?
Well, gosh, you've convinced ME. Half a dozen paragraphs to demolish the career-long intellectual work of thousands of scientists, who agree to a man and woman that it's very real indeed. I guess they'll all take one look at your simplistic argument, realise they've missed the blindingly obvious and been wrong all this time, and go find other jobs...
It's not that simple. It never is.
Doesn't matter where a company is based. Sovereign nations such as the UK have this quaint idea that they are - well - sovereign... ...and therefore have the right to make and enforce their own laws. They can and will take action against companies that step too far over the wrong line, and regularly have. The degree to which they may in practice be able to enforce their decisions may vary, but that's a rather different issue. True, UK courts may not be able to compel Facebook to a particular course of behaviour worldwide (although in my observation, minor annoyances such as a complete lack of jurisdiction have regularly failed to stop US courts trying to do precisely that in the past); but the reality is that, if Facebook wants to do business in the UK, it will certainly need to do so in accordance with UK laws, or face consequences.
Well. Until Project Waltz I *did* work for IBM, and indeed at IBM Hursley. Very probably I was working on one of those products you mention. And I'd have to say that I totally agree with the article. IT Services have been a critical IBM source of revenue for quite a while now, whilst the market for its cash-cow software products is (or certainly was at the time I last saw figures) slowly but steadily contracting. I have nothing but admiration for the quality of many of the technical folk who, even now, work in the IBM labs - but when was the last time that IBM developed (rather than bought) a truly new software product? Decades ago.
Seen from the viewpoint of a mere grunt inside the company, IBM senior management seemed to have no "strategy" worthy of the name beyond pushing the share price at all costs, and amassing their personal stock options before moving on. Investment in people was near-invisible, and the overall atmosphere was pretty poisonous. From such things as I hear from my contacts inside the company, nothing much seems to have changed.
IBM has, in my opinion, been on a long, slow slide into oblivion ever since the bean-counters took over from the techies in the 90s; the only real question in my mind is how long it can keep the cracks in the facade sufficiently papered-over to stop the market noticing.
"Sorry AC, but you are wrong. The UK, for one, does not have a minimum speed limit on a motorway."
Effectively it does. It's merely not explicit.
It's an offence under UK law to drive "without due consideration" to other road users. Pottering along at some obstructively low pace, without a valid and sensible reason for doing so, would most certainly be likely to earn you penalty points if it came to court.
I seem to remember some of the early UK hackers being convicted of "stealing electricity" (because there was no other applicable law at the time - and never mind that the amount of electricity involved was trivial). It seems to me that anyone who attempts to do this without my approval is doing precisely that - just more overtly.
What's even more ludicrous is the implication that Hollywood then ignored the goldmine that was the technology supposedly used, and that all of the SFX specialists involved in the "fake" also forgot everything they'd done, and the whole industry went right back to the same old low-tech SFX it had been using for years. You've only got to look at the moon and other "low gravity" scenes in "2001" (released only the year before the landing) to see the limits on what the industry of the time was capable of. But then again, such trivia don't bother crackpots.
"I also tend to make them use electronic kit in "open" areas of the house so I can see what they're doing rather than let them sneak off to their bedrooms. I suppose most parents do the same."
Not beyond a certain age. I think to a degree it may depend on the kids, but when mine were younger (they're all adults now) I took the attitude that openness and trust was a better strategy - treating them as adults to just as much a degree as they could cope with.

We were all kids; we all know that, if you try and lock kids down, they'll just find a way of seeing or doing what they want to anyway, and most of the time you won't even know it. Whereas if you foster an attitude of trust, you not only hear about far more of it, but you get the chance to feed any concerns and thoughts into the discussion, and have your opinions considered and respected. I knew, for example, when my 15 year-old son was watching 18-rated films round at a particular friend's house, and what they were. We talked, I was happy with his attitude, end of story. If I hadn't been, we'd have found a way to sort things.

Worked pretty well for me, anyway - I'm proud of the way they all turned out. (Although my daughter once told me that the most frustrating thing about me was that I never gave her anything to rebel against. Parental Judo. I took that as a compliment.)
What gets me most about utterly moronic pronouncements such as the one that prompted this whole discussion is the vacuous and unquestioning way in which so many apparently-intelligent politicians seem to have not only bought into the "copying is theft" meme, but the degree to which they seem prepared to accede to the very particular version of that tale being told by large corporate interests.
Copying is not only not "theft" (a simple and perfectly clear concept that everyone understands, despite the best efforts of those with vested interests to redefine it as something else), but utterly fundamental to human society and culture. From the moment we're born, we learn most of what makes us who and what we are by copying. Talking; reading; what's acceptable and unacceptable behaviour in our society; what things and actions are safe, and what are dangerous. If we see a good idea, we adopt it. If our neighbour finds a clever way to keep the slugs out of his vegetable patch, we try it. If we hear a good joke, we pass it on. If we hear a good tune, we whistle, sing or play it. If we like the new style of clothing we see someone wearing, we imitate it. If someone coins a useful word, we use it. Copying things is one of the most fundamental aspects of human behaviour that there is. The entire (and very, very recent) concept of somehow not having a right to copy whatever we choose to is utterly artificial, and utterly at odds with everything that makes us who and what we are.
Which is not to say that there's not a place for allowing someone to benefit from the fruits of their labour, up to a point - IF doing so benefits society as a whole. If it doesn't - well, no-one owes you, me, or anyone else (let alone some faceless corporation) a living. And most certainly not simply because having an artificial monopoly on something is the only way to achieve one.
Who knows? The point is that he was able to. It could have been a simple mistake; or, as a test pilot, he could have had a reason that made sense to him. Whatever the case, the hardware allowed him to do something that turned out to be fatally dangerous, presumably without giving him any indication that it really wasn't a good idea.
(There's a famous story of a Harrier jump jet test pilot asking "What happens if I vector the thrust nozzles during forward flight?" To which the answer was "We don't know - no-one's ever tried it." So he did. And nothing went wrong, and the Harrier's then-unique VIFFing manoeuvres were born. But there was always an outside chance that it would turn out to be a really bad idea, and that things could have gone horribly wrong, perhaps even killing the pilot. In which case we'd be asking what he was doing, doing something so stupid? Answer - doing his job.)
In my book, that's not pilot error. That's either a design failure, or a training failure, or a test pilot pushing the envelope beyond its limits.
"It's the same as asking me to pay for bank shareholders losses for a bank that I don't use."
'Which if you're a UK taxpayer, and don't bank with RBS, HSBC or Lloyds, is exactly what happened.'
Except that you DO use them. They finance the levy that guarantees that, if it's YOUR bank that goes tits up, you don't lose every penny you had in it.
Biting the hand that feeds IT © 1998–2022