You say it'll be more expensive, but what if "more expensive" is translated as "any more expensive and we can't afford it, full stop." Meaning it's either go cheap or go home, taking your business and livelihood with it?
What bugs me the most? World+dog just accepts crap software resilience
With Boeing's 737 Maxes grounded and its MCAS anti-stall software being patched, a high-intensity spotlight has been shone on the issue of software reliability. But putting aside whether Boeing's software is ultimately shown to be a risk factor, for some years now the industry has been sleepwalking into a tacit acceptance of …
COMMENTS
-
-
-
Wednesday 27th March 2019 10:44 GMT Anonymous Coward
There's a catch to that, though. You have to pay the ounce immediately, yet you can't afford that ounce. Then it becomes more, "In for a Penny, In for a Pound," meaning if things go pear-shaped You're Already Dead in any event, so why not gamble on things not failing later if the insurance against it would kill you just as easily?
-
Wednesday 27th March 2019 10:54 GMT AVee
> "An ounce of prevention is worth a pound of cure?"
That's a nice cliche, but it simply isn't always true. Sometimes the band aid is an ounce of cure where a pound of prevention is needed. I mean, we could get rid of all hard surfaces outside so our kids never get a scratch. Or we can buy a box of band aids, I know which is cheaper... (Obviously, there are plenty of cases where it is true.)
It's very simple in the end: the customer is just not willing to pay for quality. That's also not a problem that's unique to software, you see it everywhere. It's a problem of a culture where we only look at initial costs and fully expect to replace things pretty quickly. To some extent it even makes sense; often software is going to be outdated way before all its bugs are fixed.
But yeah, I'd rather like to see high-quality software built for the long haul. That does mean you can't jump on every latest technology bandwagon, you can't be buzzword-compatible, etc. Basically it will be very boring software, and maybe that's a good thing.
-
Wednesday 27th March 2019 19:50 GMT Anonymous Coward
Software roadmap
"It's very simple in the end, the customer is just not willing to pay for quality."
While that might be true to a point, the bigger problem as I see it, is the software development cycle.
There is too much emphasis on developing new features rather than locking down and making the existing, buggy code rock solid. Instead what we get is: new features, some bugs squashed, but more that will need to be squashed because of the new features. It just goes on and on.
The internet has not helped the severity of bugs: they're serious, and doubly so when actual loss of life can be involved.
I wonder how long before legislation comes in to govern QA of software. I think it will happen eventually. I'm no expert on this, just saying what I think.
How many updates do you get for your mobile apps each and every week?
Nuff said
-
Thursday 28th March 2019 06:12 GMT Olivier2553
> "It's very simple in the end, the customer is just not willing to pay for quality. That's also not a problem that's unique to software, you see that everywhere. It's a problem of a culture where we only look at initial costs and fully expect to replace things pretty quickly. To some extent it even makes sense, often software is going to be outdated way before all bugs are fixed."
Buying cheap and replacing may be true for individuals, but is it really worth it for industry?
And software being outdated is pure BS served up by the sales departments of the software companies. 99% of the new features added in the last 10 years are useless, so why develop a new version instead of fixing the bugs?
Who cares about buzzwords and the latest bandwagon?
-
Thursday 28th March 2019 08:03 GMT John Smith 19
"99% of the new features added in the last 10 years are useless, "
But what sounds better.
"Word whatever now has <some other stupid new bell or whistle 5 people might understand and 2 will use>"
or
"Now we've fixed the top 10% of the most complained-about bugs in the last release."
Actually Microsoft's greatest achievement is to convince people that s**t SW is all that is possible, and all that is needed.
-
Thursday 28th March 2019 01:45 GMT Long John Brass
It is far too often used to justify terrible engineering practices and decisions
I like that adage; I too often see teams and people not wanting to make a move on a problem area because they don't yet have the "perfect" solution. Implementing a *good* solution and then iterating on it is a great way to break that deadlock.
Note: That doesn't mean implement the first piece of crap that comes to hand. :)
-
-
Thursday 28th March 2019 00:41 GMT veti
The costs don't fall in the same places, though. It's a mistake to think of it as being about money. The ultimate limited resource is time.
If the user has to spend an additional 20 seconds every transaction re-clicking on the "send" button until it works, that's a cost to the user, not the owner of the software. The owner's cost is limited to (1) making sure the user knows enough to do this, and (2) handling the (probably not measurable) attrition among users who get too fed up to keep doing it.
Software changes follow a depressingly predictable lifecycle. Someone requests a new feature, it gets specced and costed, and a few months later it gets delivered and tested and the testers, invariably, discover it's a humungous pile of bat guano. They list the defects, demand that they get changed, and a month later the new version arrives in which defects A-E have been fixed, F-J haven't, and K and L have been introduced. This process can go on as long as everyone's patience holds out, but sooner or later something will give - the customer will say "close enough", or the supplier will say "that wasn't in the spec", and everyone concerned is so sick of the sight of it that they deploy it just so that it will become Somebody Else's Problem.
Is it a good system? Heck no, but it's what we've got. Software engineering practices assume that a spec "should" be perfect and any falling-short of this ideal is a failure by one side or the other, but Gödel's Incompleteness Theorem means that the "perfect" spec is a logical impossibility, even before throwing in other confounding factors such as the Law of Leaky Abstractions.
-
-
Thursday 28th March 2019 08:13 GMT John Smith 19
Customers get the software they deserve.
Always.
Sounds harsh?
Why?
The customers don't complain enough (and they are often companies, not the end users in their offices who have to make some PoS application work).
They don't reward good software ("Oh it's too expensive") and they don't penalize bad software, maybe because it's the only supplier in its class (don't want to rock the boat old chap) or because they don't really care about their business (or govt department) working well as long as the minions keep the s**t moving.
And when people make choices what do they choose 99% of the time?
BTW that fondness MS and Adobe have for creating incompatible file formats every few years is not an accident. Again the market does not give a s**t, because there appears to be no mechanism for it to do so.
-
Wednesday 27th March 2019 10:35 GMT alain williams
Who bears the cost ?
You say it'll be more expensive
The real question is ''who bears the cost?'' The answer is ''not the development organisation'' - not the vendor, but its customers. Whoever has to pay the extra cost for reliability gains little from it: a bit less work for the call centre maybe, but that is about it. The customers, however, waste huge amounts of time on workarounds or become frustrated - but does the vendor care?
The other cost of producing reliable programs is time. The extra time taken means that the competition might get their product to market first and so get the customers and maybe even the market. One company that has used this ''get something that vaguely works and let the customers suffer'' method is Microsoft; it did this in its early days, got better for a while and is now getting worse again.
Another part of the problem is that customers getting redress is rare. We are pushed to accept major bugs. It is hard for the customer to go elsewhere, they have already made an investment, changing is costly - and anyway: will the competition be much better ? Probably not: commercial factors dictate not.
-
-
Wednesday 27th March 2019 11:21 GMT Horridbloke
Re: Who bears the cost ?
You've got it.
Historically industries have often benefited from "externalised costs", i.e. costs paid for by someone other than themselves. Some decades ago in the west this might have been the factory that dumped nasties into the nearby river and didn't worry about the people downstream. More recent examples include web companies who can't be bothered to secure their customers' details. Many classes of overt crime are simply taking the externalised costs thing to an extreme.
The answer is to identify externalised costs and re-internalise them through regulation. This can be a whole spectrum of measures but should definitely include scope for criminal charges, because otherwise most players will continue not to care.
I'm not sure who will still want to write software though. I prefer to do good work but I haven't always had that option and the day I'm held legally responsible for the quality of code I write I'll be quitting.
-
Wednesday 27th March 2019 11:49 GMT Anonymous Coward
Re: Who bears the cost ?
Yet I did most of my IT engineering (all the categories) in an environment where I was personally responsible for all the defects existing in what was our final product. My chain of command and me, personally, weren't going to enjoy our courts martial one bit, nor the convictions, after a bug caused a big smoking hole from the explosion and/or some fellow sailors/marines ended up dead.
Intense, demanding, challenging as fuck. Wasn't paid much either, low to middling enlisted rank. I enjoyed the hell out of it ergo I'm most definitely insane.
-
-
Wednesday 27th March 2019 20:39 GMT Anonymous Coward
Re: Who bears the cost ?
Ford finally fixed the exploding Pinto (a mid-70s US small car) problem after it came out that they had calculated that the cost of fixing the problem was higher than the likely cost of settlements for death and injury resulting from the design flaw. Suddenly liability cases started to balloon and jury-awarded settlements increased dramatically. Within two years the model had been modified to remove the risk.
-
Friday 29th March 2019 03:04 GMT M.V. Lipvig
Re: Who bears the cost ?
And that was a missed opportunity. The beancounters that approved letting customers burn to death should have been loaded into Pintos, then had something large, hard and sparky slam into the back of the car repeatedly.
This is one area where China got it right. CEOs responsible for putting profits ahead of lives are executed. If the Western world were more like China in this regard, a lot of the problems we've had would never have occurred. And not just the current CEO either, execute the CEO responsible for the problem, and execute all following CEOs who knew about the problem but did nothing. Shareholder interests lose a lot of their luster when it's YOUR (the CEO's) life on the line. After all, that's why they're paid the big bucks, right?
-
Friday 29th March 2019 03:27 GMT jake
Re: Who bears the cost ?
I just love the sound of a good knee-jerk in the morning. Read this:
https://en.wikipedia.org/wiki/Ford_Pinto#Subsequent_analysis
But don't let facts get in the way of a good rant.
(tl;dr version: The Pinto was a heap of shit for many reasons, but it was hardly the firetrap that the Popular Press made it out to be.)
-
-
-
Thursday 28th March 2019 06:30 GMT Anonymous Coward
Re: Who bears the cost ?
The techniques I developed then are still the ones I use for everything even today. I don't have to spend a ton of time testing, as the tests are already in the code to make sure that it is functioning correctly despite the hardware, the libraries, other support code, even the OS, which can do the damnedest things on invocation. Sure, it adds up to additional cycles, but you'd never notice without serious system timing. These days we call it formal verification. Back in the '80s I called it doing engineering correctly.
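The "tests already in the code" idea the AC describes can be sketched roughly like this (a hypothetical Python illustration of the general technique, not the commenter's actual code): each routine checks its own pre- and post-conditions at runtime, so a fault in the hardware, a library or the OS surfaces immediately instead of silently corrupting later results.

```python
import math

def checked_sqrt(x: float) -> float:
    # Precondition: the input must be a finite, non-negative number.
    if not (math.isfinite(x) and x >= 0.0):
        raise AssertionError(f"precondition violated: x={x!r}")
    root = math.sqrt(x)
    # Postcondition: squaring the result must reproduce x within a small
    # tolerance - this also catches a misbehaving maths library underneath.
    if abs(root * root - x) > 1e-9 * max(1.0, x):
        raise AssertionError(f"postcondition violated for x={x!r}")
    return root

print(checked_sqrt(2.0))  # 1.4142135623730951
```

The extra comparisons are the "additional cycles" mentioned above; on most workloads you'd struggle to measure them.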
-
Wednesday 27th March 2019 23:07 GMT Denarius
Re: Who bears the cost ?
@Fungus.
True. IMHO the article misses a point by not asking if new features are worth it. I would love my phone to just work and not perform uncommanded actions occasionally. As for desktops, every desktop developer seems hell-bent on objects moving, shrinking, reappearing, whatever, without any user input. At least a few of the minimalist desktops have resisted.
An ElReg article on phones a month or so ago made a similar point about phones getting insane cameras as a new feature instead of better battery life and security updates. I would simply like daylight readable screens but these only occur on industrial units.
I suspect many of us older types would prefer to use familiar stable software with improvements in speed and resource reductions. OK same thing. Scrap java and bring back assembler sort of approach.
-
Wednesday 3rd April 2019 04:36 GMT Charles 9
Re: Who bears the cost ?
"I suspect many of us older types would prefer to use familiar stable software with improvements in speed and resource reductions. OK same thing. Scrap java and bring back assembler sort of approach."
Trouble is, you're in the minority. Steady Ain't Sexy, and Bling Sells to the majority who actually shell out. Guess who wins.
-
Wednesday 27th March 2019 20:21 GMT Anonymous Coward
Re: Who bears the cost ?
" I'm held legally responsible for the quality of code I write I'll be quitting"
I know how you feel. It's where I am right now. I work in the financial services industry. Make a software mistake here and companies want to put you on the hook for their losses. I've been there: I had to do a two-week code analysis with the customer to prove that their mistake wasn't my fault. For every single piece of code I write in a certain space, I have to be absolutely certain that it is as bug-free as possible. I am the only one writing this code, and people want to hold me responsible if the software I write leads them down the wrong path. It is not a good feeling to walk into the office and find out that a customer wants $20k for a mistake that they believe I made.
-
Wednesday 27th March 2019 20:43 GMT JohnFen
Re: Who bears the cost ?
"I'm not sure who will still want to write software though."
I would.
"I prefer to do good work but I haven't always had that option and the day I'm held legally responsible for the quality of code I write I'll be quitting"
That does not exactly give confidence in the code that you're writing. I write the best code that I can for anything that will end up in somebody else's hands. I have too often had to fight with management types to do it, but there's only a couple of times that, in the end, writing good code wasn't an option. And I quit those jobs, because a company like that isn't worth burning my time or reputation.
-
Thursday 28th March 2019 16:30 GMT doublelayer
Re: Who bears the cost ?
That's not sufficient. The problem with the people writing the code being held responsible for every bug isn't the "held responsible" part. Someone should be, and the programmer should be considered. It's the "people writing the code" part. There are a lot of people that could be responsible for the bug. The programmer who wrote it. The people writing test cases who didn't think of it. The people doing code review who didn't see it. The manager who said it wasn't important if they did see it. The customer who said "We can deal with it". The programmer on another team whose library did it. The spec writer who said not to worry about it. Any of those people could be responsible, but for any given bug, it is likely that not many of them are. Tracking down the responsible person may be doable, but in almost all cases, it is wasteful. Finding a convenient person who might be the person to blame and blaming them without checking is not a sufficient solution to the problem.
-
Wednesday 27th March 2019 12:45 GMT big_D
And that is the problem, people have become so used to cheap, they don't actually have any idea of what something is actually worth, if it is a quality product.
And what does more expensive mean? Once you take into account time lost due to crashes, workarounds, rebuilds etc., the "more expensive" option is probably a lot cheaper than the current "cheap" price.
-
Wednesday 27th March 2019 14:37 GMT Tikimon
Speak for yourself!
"And that is the problem, people have become so used to cheap, they don't actually have any idea of what something is actually worth, if it is a quality product."
Stop blaming consumers for the shite decisions of companies! That's like saying car buyers in the 1970's WANTED Planned Obsolescence and a car that would rust in five years. The manufacturers decided we should buy a new car every three years and built accordingly, that's all you had to choose from. They also tried to monopolize service and parts, that sounds familiar... Apple and others have revived those very business practices.
Not a single one of us wants crap telecom access for too much money. You would contend we're too dumb to know what a bad deal we're getting. WE KNOW, but where can we get decent telecoms when they ALL SUCK?
Your point is invalid. And very condescending.
-
Wednesday 27th March 2019 16:30 GMT doublelayer
Re: Speak for yourself!
I mostly agree with you. Consumers are usually quite aware of their poor choices. They are sometimes willing to pay more for the better option. And there are other consumers, including a large majority of businesses, that really do not care about quality.
The issue here is not the realm where there are no good options, as there isn't a choice that a consumer can make to help improve things. The problem that is relevant to this argument is where there are a few options, and there is one with exemplary quality, but people don't buy it because it is more expensive. Take a place I have had some contact with. Their former director wanted to purchase some computers, and went out to find some. There are plenty of computers that would have been fine, and some that would have been extremely reliable and useful. Instead, they bought a bunch of secondhand machines with a few faulty bits. They got a terrible deal. Why did they buy them? The up-front price was lower than most other options, and the description of the units didn't specify enough so the nontechnical director didn't realize how painful they would be to operate.
This type of situation is where it becomes problematic to simply blame the manufacturer or service provider. They could make a product with better quality, but they are going to extract a profit margin on it. Sometimes, they simply decided not to or made a bad version and sold that one. Other times, there is an option, but people don't choose it for the most useless of reasons. And other times, the price is seen as the indicator of quality, where a price that is too low clearly means it's crap, and a price that is too high can't be justified.
-
Wednesday 27th March 2019 16:50 GMT Gene Cash
Re: Speak for yourself!
Nope, I've seen a ton of shitty cheap products push decent slightly more expensive (less than 5%) products out of the market.
I'm not talking just software. I'm talking hand & power tools, PDAs, ISPs, printers, tires, 3D printers and other things.
Try finding a drill that isn't "prosumer" (i.e. shitty) now
So yes, eventually they get to where THEY ALL SUCK.
-
Wednesday 27th March 2019 23:19 GMT Denarius
Re: Speak for yourself!
@Gene
Hand tools at country junk shops. Learn to sharpen them. Good power tools are available, but only at industrial stores in industrial areas, and they are expensive. Ironically, in Oz, tradies buy cheap tools because on work sites 5-finger discounts happen often.
OTOH, I would love to be able to buy simple spare parts like commutator brushes. I have a 40-year-old drill that just needs new brushes. Consumer grade, but it has survived a lot of tradie work.
-
Wednesday 27th March 2019 10:15 GMT Tomato42
Reliable code
I recall reading about guidance code for NASA rockets. The price of it wasn't 3 times more, it was 10 to 100 times more, depending on what kind of software you compared it with (avionics or general-purpose).
But yes, we do put far too little emphasis on correctness of the code that is then used by millions if not billions of people.
-
Wednesday 27th March 2019 10:46 GMT Anonymous Coward
Re: Reliable code
But the thing was, the cost of failure tended to be 10 to 100 times more than the cost of doing it right, not to mention hard-to-replace hardware and perhaps irreplaceable lives. In any other industry, losing lives usually results in costly Wrongful Death suits that cost mucho tiempo y mucho dinero. In this industry, they tend to only get one chance at anything.
-
Wednesday 27th March 2019 11:34 GMT Andrew Commons
Re: Reliable code
The EULA, or in olden times Terms and Conditions, have always stated that the software was not warranted to work as described on the box. When I was writing commercial software using refurbished, but still hideously expensive, microVAXes in the late 1980s we copied the DEC Terms and Conditions almost word for word. No guarantee that the software was going to work. If it didn’t work then we were going to be in deep shit so, with a few relatively minor exceptions, it worked as advertised.
The economics have changed now. Minimal development platforms do not represent 25% of the value of your house. Failure costs you nothing as a developer. And the EULAs still have that big out.
Changing that will require legislation. Safety and Privacy are possible avenues that can be used to achieve this. But consumer apathy will make this an uphill struggle, convenience and shiny will win every time.
-
-
Wednesday 27th March 2019 12:03 GMT Andrew Commons
Contracts cannot override statute
That is true. But software companies that issue patches for critical vulnerabilities month after month are still in business.
So then we get into another interesting discussion, which I think is part of some other discussions on this piece: are the bugs actually hurting? Or are the users just conditioned to the inconvenience? Or is action against the vendor not a practical option for the average user?
I've certainly experienced issues that have required days of work to recover from. Taking action against the vendor would have made that effort look insignificant. My financial capacity to take any action would also be questionable....they have very deep pockets if they are big and they can just go bankrupt if they are small.
So we come back to constructing an appropriate and proportional legislative framework to take on the problem.
-
Wednesday 27th March 2019 23:35 GMT Doctor Syntax
Re: Contracts cannot override statute
"My financial capacity to take any action would also be questionable....they have very deep pockets if they are big and they can just go bankrupt if they are small."
It depends on the jurisdiction and the amount involved. Small claims courts go a long way to levelling that playing field. Both sides bear their own costs so BigCo can't bully consumer with threats of ruining the plaintiff with costs and there's no point in taking on huge costs to defend against a low valued case. What's really needed is to forget the class action mentality and have thousands of individuals make their own claims. It would help if the allegedly consumer protecting organisations set up systems to help do this. Make it cheaper to build in quality than to run a department to handle all the court claims and judgements.
-
Wednesday 27th March 2019 23:43 GMT Andrew Commons
Re: Contracts cannot override statute
Very good points. Maybe there is an opportunity for a bit of 'innovation' and 'disruption' here in the form of service that does the heavy lifting for the consumer, a 'small claims broker'. Make it 'convenient', add an 'App', it ticks all the boxes. Create the tsunami of claims.
-
Wednesday 3rd April 2019 04:40 GMT Charles 9
Re: Contracts cannot override statute
Nah, most small claims courts tend to have conditions that restrict the admission rates: mostly to keep the dockets from getting too big. If one of those conditions is the need to fill out forms on site and submit their suits, with signatures, in person, among other things, I don't know how you can improve matters. Big business may take a countertactic of finding ways to clog the dockets or otherwise slow down the case rate against them.
-
-
-
-
Wednesday 27th March 2019 19:04 GMT fidodogbreath
Re: Reliable code
Speaking of "big outs" -- how can a EULA / TOS be a legally-enforceable contract when one party to the agreement claims the right to change it at any time, with no notice? In most other areas of contract law, a clause like that would be considered "unconscionable"...but every #$%! website TOS has something like that.
Another gripe along those lines: if you manage to discover that the TOS changed, and you disagree with the change, your only "remedy" is to stop using the site or product -- even if you paid for it, and paid subscriptions are not refundable.
-
Wednesday 27th March 2019 20:47 GMT JohnFen
Re: Reliable code
"how can a EULA / TOS be a legally-enforceable contract when one party to the agreement claims the right to change it at any time, with no notice?"
In the US, it's not an automatic given that a EULA/TOS is a legally enforceable contract. Sometimes it is, sometimes it isn't, depending on the specific EULA involved. Terms like "we can change the deal any time, in any way, without notice" tend to make it less likely that the EULA is enforceable.
-
-
Wednesday 27th March 2019 14:46 GMT vtcodger
Re: I can no longer find the article
Why do we accept shit software?
For the same reason we accept the inconvenient need to breathe air, eat food, and drink fluids if we plan to make it through the week: there's no realistic alternative. If folks actually insisted on true software quality, we'd still be patching bugs in MSDOS, WordPerfect and Lotus 1-2-3. And GUIs would still be a pie-in-the-sky fantasy.
That'd be OK with me actually. But the rest of you seem to be a bit impatient.
-
-
Wednesday 27th March 2019 10:20 GMT anthonyhegedus
That's the state software is in...
That's the sorry state of the software industry today. Things are SO complex that their behaviour cannot be proved to work in every given situation. And the fast pace of the market means that it just isn't economical to spend months more on development to get a more reliable product out there. It's far better to push something out there, and add a few features later, fix the bugs later and above all get the user base to do the real-world testing. Software manufacturers put lots of checks and traps in the software to keep the majority of the system working the majority of the time: so your car will carry on working, even though the window keeps going down; the TV will carry on working even if the Netflix app doesn't always start etc. The great thing about this attitude is that people expect it. Why are we no longer surprised when we ring a call centre and they say that sorry, their system is slow (or down)? We are so used to things crashing, being unavailable for a bit, or simply not working at all that it really isn't even remotely unexpected any more. It gives manufacturers licence to keep everything in beta for example.
Unfortunately, this doesn't translate well into safety-critical systems such as, well, aeroplanes.
That's why space missions, which, let's face it, would cost billions if they failed (oh, and the human life thing), use ancient processors and simpler software. It is easier to prove that they work.
-
Wednesday 27th March 2019 10:22 GMT Dave K
Adding features
Part of the issue here is that marketing doesn't care about reliability. Look at Office 365 for example. Can I think of many useful features to add? Not really. Can I think of any bugs, flaws and irritations to fix? Hell yes!
If MS were to say "We're not implementing any new features in 2019, instead we're focussing on fixing bugs, crashes and general irritations" I'd be delighted! Instead, MS (like everyone) decides to focus on questionable gimmicks when quality improvements would have a much bigger impact on user satisfaction.
-
-
-
Wednesday 27th March 2019 16:08 GMT Anonymous Coward
Re: Adding features
> "I would rate current MS software as high quality. It's used by billions every day and mostly just works."
No, that qualifies as "good enough", which is pretty much what Microsoft is after - VERY few organisations are interested in paying a lot more for software that has a deep QA cycle. As a matter of fact, Microsoft has pretty much externalised the costs of beta testing - there is a reason most IT people don't install an MS product until it has had at least one Service Pack applied, and they've been doing that roughly since WfW 3.11. It just got a tad out of hand with Windows Vista.
Anyone who tells me that MS software is high quality is either an MS sales person, has been listening to them too much or has in general low standards to begin with.
-
Wednesday 27th March 2019 13:23 GMT Daedalus
Re: Adding features
Yes, but most of the features people actually use - I do not include cool UI stuff in "features" here - have been around for a long time, and used to run in much less memory on much less powerful processors. "Progress" in software mostly comes from adding kyewl stuff or "improving the user experience (it says here)".
-
Wednesday 27th March 2019 14:53 GMT vtcodger
Re: Adding features
> "I would rate current MS software as high quality"
Ehrrr ... no. That's acceptable quality ... if your standards aren't especially high.
I submit that you would not be overly comfortable if your local nuclear power plant, MRI machine, or commuter flight were running on top of Excel or Word.
-
Friday 29th March 2019 08:56 GMT Dave K
Re: Adding features
Don't get me wrong, I like Office 365 from a features point of view, but it is buggy and annoying at times.
Random examples from the top of my head:
*Pull down a filter box in Excel and start typing to filter? The caret doesn't move to the text field by default.
*Copy a cell in Excel, paste it somewhere and change the colour, Excel will reward you by clearing the clipboard.
*Try to close Excel with 100+ cells in the clipboard, Excel will warn you about "lots of data in memory", even though it's only about 5k and my PC has 8GB of RAM.
* Try to view a mailbox in Outlook that isn't cached? Watch whilst Outlook locks up solid for 20 seconds.
* Same with opening large Excel files, just watch as every instance of Excel hangs for ages.
* When I try and export a sheet from an Excel file as a PDF, if I try to save direct to Sharepoint, I get an "Access denied" error. I have to save into Temp on my PC then move it in Windows Explorer to Sharepoint.
* Odd times when I click a meeting request in Outlook, Outlook crashes and has to be restarted.
* Skype for Business sometimes fails to reconnect when my PC comes out of sleep (sticks at logging in for eternity), so I have to cancel login (thus losing my conversations) then try logging in again for it to work.
* Try to open an Excel file from Sharepoint and specifically select the "Edit" option? Excel will still open in read-only mode; you have to click the button on the banner to enable saving.
etc. etc. etc.
-
-
-
Wednesday 27th March 2019 10:26 GMT jpo234
This article misses a lot of nuances:
* People can and do develop software without formal training and there is little that can be done to change this
* PEBCAK: problem exists between chair and keyboard, e.g. the user didn't bother to learn how to use the software
* interaction between software that was written independently (the cookie example: is the bug in the browser or the web app?)
* bad specifications
-
Wednesday 27th March 2019 10:49 GMT Charles 9
Isn't that the reason for the push to make software more intuitive, meaning it shouldn't require training to use? I will admit some limits will kick in eventually (I mean, you still have to be taught how to use a hammer), but people aren't that patient anymore. You tell them RTFM, they reply, "No time! We have to DIE and the deadline's looming!"
-
Wednesday 27th March 2019 17:55 GMT jake
"you still have to be taught how to use a hammer"
Not really true. Stick an eighteen(ish) month old child in front of a "captive nail" kiddie's workbench and hand them the included plastic hammer. Most will soon be bashing the nails in all by themselves, with the hammer oriented in the correct direction. Many will actually figure out how to turn the thing so they can start over, again all by themselves. Don't just take my word for it, try it. I've won bar bets with this one ...
-
-
-
Thursday 28th March 2019 12:18 GMT Charles 9
Would you believe some of the people I'm talking about are actually knurd? Because they're the Designated Drivers for the rest of their group and hate the taste of alcohol (I'm a teetotaler myself because I can taste it when nigh anything has ethanol in it--I once won a bet being able to taste a vodka jello shot)? No, I've personally seen people when told to look left, look right, when told to stop, go faster, and when told to turn around, do a full 360. Every single day.
-
-
Saturday 30th March 2019 13:08 GMT Doctor Syntax
Last weekend I deliberately used the side of a lump hammer. Why? I was driving in a new stake for an apple tree. The stake was rather long and it was difficult to reach the top; initially, the few inches between the side and the face made all the difference between a square blow and a glancing blow.
-
-
-
-
-
Wednesday 3rd April 2019 07:12 GMT jake
Re: To be fair ...
I hear you ... it's true that finesse has its place. But part of the fun of rebuilding a kitchen (for example) is taking the old one down to wall-studs and the subfloor in as short a time as possible, preferably as loudly as possible. It's good for the human psyche to get the aggression out in a controlled setting :-)
-
Wednesday 3rd April 2019 19:44 GMT Charles 9
Re: To be fair ...
When my kitchen was done, the aggression vent was achieved by yanking out the chunks by hand with help from a few strategic cuts or bashes in the centers of the panels. Was also a great time to run some data lines from the attic through the upper floor down to the exposed downstairs and from there through some stud holes and out to an exterior conduit. Thanks to that, I was able to retrofit wired Ethernet downstairs.
-
-
-
-
-
-
-
-
-
Wednesday 27th March 2019 10:59 GMT Doctor Syntax
"People can and do develop software without formal training and there is little that can be done to change this"
Some people without formal training develop good software, some people with formal training develop execrable software. The critical thing is that software is put into a satisfactory state before release and that "satisfactory" is a high hurdle to cross.
"the user didn't bother to learn how to use the software"
Or the S/W wasn't sufficiently intuitive, the developer decided to depart from existing norms of user interfaces to "differentiate" themselves, the S/W was poorly/not at all documented, etc. There are lots of things to lay at the vendor's door.
"interaction between software that was written independently (the cookie example: is the bug in the browser or the web app?)"
Products A, B, C and D work fine in the browser. Product E has to have the cache cleared. Is the problem with the browser or E?
"bad specifications"
Who writes the specifications? For commercial software, usually the vendor.
-
Thursday 28th March 2019 17:13 GMT Stork
Bad specifications
This!
I was at Maersk Data testing applications for Maersk Line; there were always several on the team who had a past on the other side, and often developers and managers who had been working with the area for at least ten years. And as the owner was the same, there was also a common understanding that we should keep the boxes moving.
Sometimes you also need a project manager who can say: "No, we have not finished testing yet. Do you sign here to accept these unresolved issues plus what is found in production?"
I had one!
-
-
Wednesday 27th March 2019 10:51 GMT Anonymous Coward
Utterly fatuous article
As the last 40 years of the software business has proved again and again, there is a Gresham's Law of software quality: companies that ship badly buggy software will succeed, and companies that sell well-written, stable software will fail.
The reason is pretty simple - the software market is, in microeconomics terms, the classic ill-informed market. The average buyer has little or no independent metric or expertise for distinguishing quality between products. And when some bug reveals itself, the vast majority of buyers have no expertise to distinguish between user error and software defects. The default is to assume that they, the user, did something wrong. Not that, as in most cases I have seen, the software is riddled with bugs.
There is also another basic law of economics in play. Writing and shipping crap buggy software is cheap. Writing and shipping very high quality software is very expensive. Multiply the cost of a typical desktop application development team by at least 5 to 10 to go from one that will ship any old crap out the door that is halfway stable to one that ships a fully tested, robust application.
I have seen a lot of very good software development companies go out of business over the decades because they tried to write quality code. They took professional pride in their work. Whereas the companies that walked off with hundreds of billions in profits, like Microsoft, have not written one single high quality product since their days in a motel room in Albuquerque. Anyone who has seen MS codebases for shipping products will have seen all the evidence needed for this statement.
It's just a fact of life. Unless you are writing software for a very heavily regulated business sector like avionics, writing any old crap and shipping it is a proven way to business success. Spending your time and putting in the immense effort to create a quality product is a proven path to business failure. Except in a few small niche markets. Where the good guys might actually succeed. For a while. But I would not bet on it.
-
Wednesday 27th March 2019 11:02 GMT Charles 9
Re: Utterly fatuous article
The TL;DR version: in most fields, quality really isn't that important. In many cases, First In Wins by saturating the market before any competition can get a toehold.
Only in niche industries where quality is key to the industry itself (such as those directly with lives on the line like medical and avionics, or those where most things only get one chance such as aerospace) will you find quality as a selling point (because in those industries, it's hard to get a second chance).
-
Wednesday 27th March 2019 11:20 GMT Doctor Syntax
Re: Utterly fatuous article
"In many case First In Wins by saturating the market before any competition can get a toehold."
So how do you explain the likes of MySpace vs Facebook? Don't ask me what made the latter more appealing to the mass market than the former - I have no use for either. But the late-comer doesn't necessarily lose out. There must be some differentiator that appeals to the public.
Quality can be that differentiator. Back in the late '60s & the '70s, at least in the UK, quality of construction of houses and the domestic equipment that went into them was pretty bad but then improved. Maybe the answer is that when everybody has finished the race to the bottom, quality becomes the only differentiator and we can actually start a race to the top.
-
Wednesday 27th March 2019 11:50 GMT Anonymous Coward
"So how do you explain the likes of MySpace vs Facebook? "
I wouldn't call the user-facing part of sites like MySpace or Facebook "software" - they have a very basic UI and very basic functionality. And users may never see server-side bugs unless they are really huge ones (like MySpace losing all its songs). And being "free" means most users accept lower quality.
FB became a fashion for a while; it was cunningly promoted - with many media outlets promoting it a lot - and I'm sure that wasn't out of naivety.
Anyway, web applications themselves got the public used to lower quality software. Being built on many layers of bad designs and code (why do cookies exist - and moreover, why are they implemented so badly?), often by cheap developers using ill-designed languages, they set software development back by decades.
-
Wednesday 27th March 2019 23:45 GMT Doctor Syntax
Re: "So how do you explain the likes of MySpace vs Facebook? "
The point isn't what happens server side or whatever the UI is like. The point I was making is that not being first into a market doesn't necessarily preclude success. That being the case it's not necessarily a bad thing to take a bit longer and come up with a better product and that applies irrespective of whether the product is a service, a piece of software or some humongous machine.
-
Wednesday 3rd April 2019 04:48 GMT Charles 9
Re: "So how do you explain the likes of MySpace vs Facebook? "
That's why I said MOST. You don't see much competition to Amazon or Netflix, do you? Often, the first in, if allowed to mature enough, can create barriers to entry too high for anyone else to get a toehold in the market. MySpace is probably an exception because it didn't mature quickly enough, while Google wasn't the first in re: search engines, but it was probably first in re: finding other ways to harvest and exploit data. But we can pretty much agree that these markets are much like utilities and tend towards natural monopolies, and once you have a winner in such a market, it's hard to dislodge.
-
-
-
-
-
-
Thursday 28th March 2019 12:22 GMT Charles 9
Re: Utterly fatuous article
Then why has Boeing been sweating bullets for the past few weeks? Avionics is one industry where software errors can have immediate and serious consequences. Put it this way: Patch Tuesday rolled around at about the same time and rarely made the headlines, but Boeing did.
-
-
Tuesday 2nd April 2019 08:09 GMT pauldee
Re: Utterly fatuous article
How about this though:
Writing and shipping crap buggy software is cheap. Writing and shipping very high quality software is very expensive.
Maintaining high quality software is expensive. Maintaining crap buggy software is really, really expensive.
I honestly believe writing crap code is a bad investment. I'm not just saying it makes me sad and doesn't fulfil me as a developer - it actually costs the company money.
-
Wednesday 3rd April 2019 04:51 GMT Charles 9
Re: Utterly fatuous article
"Maintaining crap buggy software is really, really expensive."
This is the problem. You can't prove this is the case. Savvy businesses can find ways to externalize these kinds of maintenance costs, find cheaper ways out (say, it can be cheaper to lawyer their way out), or otherwise make it Somebody Else's Problem. Especially transnationals, who can be slippery enough to dodge laws.
-
-
Wednesday 27th March 2019 11:05 GMT revilo
We live in a good time of software
I have used software and written programs (though tiny ones) myself for 35 years. The fact is that software has in general become much more reliable. General critique is always cheap; the fact is that building good and reliable software is always going to need effort and time. In the Boeing case, there was also partly time pressure due to competition from Airbus: things were rushed through which should have been tested more. The general critique of "crap software" is totally unjustified. We have never been in a better time where software is concerned. Phones do not crash any more; I have never seen one crash. My computers have been running for years without a reboot. I cannot remember my desktop or laptop ever failing either, except when I brought it down with some impossible computing task or the hardware failed. Almost all applications today are pretty reliable (if chosen carefully). Yes, as we are so spoiled now, we also feel every failure much harder. But let's not forget how it was, when configuring a system needed a lot of time, applications routinely crashed, operating systems bugged you constantly or failed after an update or took control, and using software was always also a fight to avoid the bugs. We have choice today and can fortunately avoid any nonsense if we choose to.
-
Wednesday 27th March 2019 11:29 GMT Doctor Syntax
Re: We live in a good time of software
"Almost all applications today are pretty reliable (if chosen carefully)."
So almost all carefully chosen applications are pretty reliable? What about all those which were eliminated when you made the choice?
A secondary problem is that when you've made your careful choice the next iteration of the product may well introduce a whole lot of changes dropping a few features you needed and turning the quality to shit.
-
Wednesday 27th March 2019 15:06 GMT Daedalus
Re: We live in a good time of software
The problem is software getting into areas where it isn't really needed but can fail dangerously. I used to keep my Sparc 1 running all the time because even logging in was a pain, and I didn't want unnecessary fsck's from powering up. I keep my work laptop running 24/7 because company policy, but I reboot at least once a week. Likewise my personal laptop.
I have seen an iPhone freeze up - my only regret is that I forgot about the two-finger salute and we went to the phone store to watch their guy do it.
But this isn't about Sparc's or laptops or iPhones. It's about chips with everything. It's about chip designers who produce incomprehensible documentation for their hardware and support software (I do not name NXP). It's about dealing with a glitchy I2C when manglement wants features features features. It's about Kanban projects where nobody seems to be in charge but everybody has a mission, just not the one to produce a viable product. It's about designer droids who think that embedded systems come with the same capabilities as high-end laptops (but said droids can't spell or proofread to save their lives). It's about people who talk faster than they think, except when they have to talk about reality. It's about manglers who come into phone conferences utterly unprepared, without even bits of paper to refer to. It's about clients who lambast your software because they found an Android app online that produces results that contradict yours.
-
Wednesday 27th March 2019 16:48 GMT doublelayer
Re: We live in a good time of software
But let's be honest. A lot of terrible software exists today, and we have a lot of contact with it. I don't deny that. But a lot of crap software existed ten years ago, twenty years ago, and thirty years ago. At least some of that isn't really the case now. I remember earlyish Linux distributions. They were pretty bad. The kernel was still pretty good, but it had a lot more panics and oops events. You had creaky desktops that would fail and require reconfiguration. Now, we have a more stable OS and a bunch of desktops that, while not being to everyone's taste, at least break a lot less. Ethernet connections required a proprietary driver that might not run and a lot of configuration on pretty much every OS. We may now have a problem once in a while with a WiFi driver, but we can be relatively sure that an ethernet connection will work for installing packages, and the WiFi driver is usually a one-time fix.
We've learned how not to break things at such a low level. Unfortunately, we moved up to a new level and a lot of things on that level are really broken. I'm not saying we should just accept that, but we should also temper our nostalgia with a healthy dose of pain. My suggestion: everyone run up a Windows 95 virtual machine. Only give it eight megabytes of memory. Try to use it and nothing else for a day. Then switch it out for a 2000-era Linux distro. Don't let it update. Try that one for a day as well. Assuming you didn't do anything too complex, run it for one more day, which it will probably survive. Then go back to whatever your normal system is and remember that every time you're angry at it, you at least don't have the things you saw with the VMs.
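For anyone tempted to try the experiment, a minimal sketch of the QEMU invocation for such a guest. This assumes `qemu-system-i386` is installed and that you already have a pre-installed `win95.img` disk image; both the binary name and the image path are assumptions for illustration, not something from the post. The sketch echoes the command rather than exec'ing it, so you can eyeball the flags first (and so it runs even without an image to hand):

```shell
# Assemble the QEMU flags for an 8 MB Windows 95 guest.
# win95.img is a hypothetical, pre-installed disk image -- supply your own.
CMD="qemu-system-i386 -m 8 -hda win95.img -vga cirrus -no-reboot"

# Print the invocation; run it directly once the image exists.
echo "$CMD"
```

The `-m 8` flag is the point of the exercise: eight megabytes of RAM was a perfectly ordinary desktop in 1995, which is what makes the comparison with a modern system so instructive.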
-
Wednesday 27th March 2019 23:54 GMT Doctor Syntax
Re: We live in a good time of software
"and a bunch of desktops that, while not being to everyone's taste, at least break a lot less."
I'd disagree with that. My current KDE desktop has several misfeatures over the previous version and a couple of failure modes which make it considerably less robust. That previous version had one dropped feature relative to its predecessor which makes for less convenient operation. From 3.x to 5.x it's gone downhill quite badly.
OTOH I agree that early Linux distros weren't that good. It was a long while before I switched from SCO to Linux. I reckon that if SCO had made a bit of an attempt to make a better price/quality offering Linux would never have got off the ground. That, of course, is before it switched tactics to litigation once it had missed the boat.
-
-
-
Wednesday 27th March 2019 18:21 GMT anthonyhegedus
Re: We live in a good time of software
“Phones do not crash any more, I have never seen one crash. My computers are running for years without reboot”
I beg your pardon? What on earth are you wittering on about? That’s the most ridiculously false couple of sentences I’ve seen in a long while!
-
-
Wednesday 27th March 2019 11:09 GMT Anonymous Coward
My place, which isn't short of a bob or two...
... still hasn't come up with a toolchain, still hasn't coughed up for decent static code analysis software, and has no interest in addressing the huge technical debt they have.
Some people just go with it and use vi, others tear their hair out trying to get a local IDE set up and working with servers running ancient gcc, gdb, and firewall rules which work against you.
They don't particularly want to automate stuff as it might mean fewer hours to charge end clients.
There's only money because it's a market with a lot of money, few competitors and a lot of regulation which limits innovation. If it weren't for that, this place would have sunk years ago.
-
Wednesday 27th March 2019 20:53 GMT JohnFen
Re: My place, which isn't short of a bob or two...
"They don't particularly want to automate stuff as it might mean fewer hours to charge end clients."
Ugh, I hope that's not their reasoning. I'd prefer to chalk it up to the wrong sort of laziness. I think the right sort of laziness on this point is epitomized by the saying "if it's worth doing twice, it's worth automating".
-
-
Wednesday 27th March 2019 11:11 GMT Anonymous Coward
I love the ubiquitous Marketing attack
Having worked on the dark side, Marketing, and on the shiny side, Engineering - both Hardware & Software, the whole problem is effectively insoluble. Marketing exists to draw attention to the company and the products they make. Engineering creates the products by meeting the specifications that are given it by Marketing. Marketing is supposed to be listening to the prospective customers and all too frequently keeps Engineering in the dark. The fly in the ointment is Sales. When your metrics say sell to get a paycheck, you sell, you sell anything you think will get you that paycheck. Sales is totally coin operated.
When Engineering is allowed to attend an event at which real live customers are present, and if they are allowed and willing to speak with real live customers, frequently great products ensue. Products that make the company a "Brand" for Marketing, great quantities of high value orders for Sales, and maybe an attaboy for Engineering.
Of course the previous sentences are strictly a romantic fabrication because most of Engineering knows that real live customers are terminally stupid, and Marketing is at or past their two drink minimum most of the time, while Sales couldn't sell Gold for the price of Copper.
As I said previously, the problem is insoluble.
-
-
Wednesday 27th March 2019 19:17 GMT fidodogbreath
Re: I love the ubiquitous Marketing attack
And management sets the development budget. If marketing specs state that the software must be of "high quality" but management only budgets for "good enough," the budget wins. You can finesse specs; finessing the budget will get the dev manager fired.
-
-
Wednesday 27th March 2019 11:29 GMT SVV
Bug fixing and machine learning
This is a superbly made point. In traditional systems (which are all basically strictly procedural) the awareness of the presence of the bug and the process of tracking it down and removing it is a well known process. Hence the original analogy of finding the moth that got caught in a circuit.
With more advanced machine learning and adaptive algorithms, the new analogy gets into the realm of chaos theory, and it's more like trying to stop a hurricane by tracking down the original butterfly that flapped its wings in a forest somewhere.
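The butterfly point can be made concrete with the logistic map, a textbook chaotic system (illustrative only, not tied to any particular ML system): two runs whose starting states differ by one part in a million end up macroscopically different, which is exactly why hunting down the "original butterfly" in an adaptive system is hopeless.

```python
# Logistic map at r = 4.0, the classic chaotic regime: x stays in (0, 1)
# but tiny differences in the starting state are amplified exponentially.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.300000, 0.300001   # initial states differing by one part in a million
max_gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The 1e-6 perturbation has grown to macroscopic scale.
print(max_gap)
```

A bug report from such a system tells you where the trajectories ended up, not which microscopic perturbation set them on different paths.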
-
Wednesday 27th March 2019 17:39 GMT Mike 16
Re: Bug fixing and machine learning
I'm not sure that having an AI capable of explaining its actions will be an unalloyed good. When its explanation for why it hit a pedestrian is something like "He was a Ginger", how do you plan on fixing it? We've had millennia to work on that problem with the previous meat-sack implementation, with little to no progress (some would say retrograde) so far.
As for quality in general, as I've said before, in the past I worked on embedded systems that would count as Capital Equipment. "Fit for purpose" was the standard, and the customer did not care if the issue was hardware or software, it was on the manufacturer to eat the cost of the fix, and possibly pay some part of the loss of revenue to the customer.
In the Ginger-hating autopilot case, how would one expect to squeeze the money for getting a better training set out of developers hidden under 7 layers of shell company?
-
-
Wednesday 27th March 2019 11:37 GMT anthonyhegedus
Sage
Look at Sage accounts - if you dare! Everyone uses it, not because it's any good, but because everyone's accountant, suppliers and customers use it. A bigger heap of disjointed, buggy, flogged-to-death-but-still-bad and crappy code I could not conceive. It's HUGE. And it's based on very old code from the 80s, with bits added on over the decades. The only reason it works is because there's Sage maintenance available, and almost everybody uses it. Basically Sage software keeps working because there are dozens or hundreds of support staff helping the users when the database goes wrong every other month.
We support several small companies who use it, and it's their biggest problem. The updates never work first time, for example. We support accountants who use it to support their customers: they spend literally tens of hours a month on the phone to Sage support.
There's no incentive to make the code work better, because they make plenty of money from support contracts, and people keep buying the new version every year.
-
Wednesday 27th March 2019 12:32 GMT deshepherd
Re: Sage
Think the same applies to all the major banks' computing systems .... lots of decades-old systems patched to add new features/cope with merging other banks after takeovers/etc, which it seems only persist because of the very public examples of what happens when anyone tries to make any significant upgrade.
-
-
Wednesday 27th March 2019 11:37 GMT Doctor Syntax
"But it's also possible to treat the standards as a box-ticking exercise."
I'd put the finger on ISO 9000 as the source of many ills. It was, in essence, a standard for box-ticks. However mediocre the product, providing the mediocrity was achieved consistently the process by which it was achieved met the required standard.
-
Wednesday 27th March 2019 12:38 GMT big_D
Situation
We need to get ourselves into the situation where normal practice is for consumers who suffer bugs to report them to the developers, whose normal practice is to diagnose and fix them.
We need to get into the situation where users/customers accept "proper" pricing. You can rant on about quality all you like, but if the customer isn't willing to pay for quality, they won't get it.
We have had a couple of decades now, where prices have been pushed to the limit, there is no more working margin on many products. The supplier can only survive through sheer volume. The food industry is a prime example, meat, eggs etc. have been pushed to such a low level by the discounters, like Lidl and Aldi, that the farmers are often on the poverty line.
They get a contract from a discounter for 10 times what they currently produce, but the margin is okay. They invest in new stock and new processing machinery to cope with the increased volume. Then, when the initial contract period is over, the discounter comes back and offers a quarter or half of what they are currently paying, take-it-or-leave-it. The farmer can't actually turn a profit at the reduced price, but they still have to pay off the loans for the additional equipment and stock. They either have to take the contract and build up more debt or declare bankruptcy.
The online shops are doing the same for other areas, they push prices down to the point where suppliers can only continue to make the product if they cut corners to meet those new prices. The consumer gets used to getting everything cheap, but complains that nothing lasts like it used to.
I replaced my electric toothbrush a couple of years ago; the old model's battery had held up for over a decade, but it was down to needing to be on the charger all the time, instead of recharging every 2 - 3 weeks. I got the replacement model, and it cost a little less than I had paid for the previous model, but the build quality was not a patch on the old one. Even some of the comfort features were missing, like the battery LED - it was still there, but instead of going from green to yellow to red to flashing red, it went from green to flashing green after 2 days' use and stays at flashing green until the battery is empty 2 weeks later! But a green LED probably saves a couple of cents on the build price, compared to a multi-colour one.
Today, nobody is looking to make a product that lasts a couple of decades and to build up good sales of quality, expensive products based on reputation; instead they build the product to last just longer than the legal guarantee period and keep developing new products with new gimmicks that nobody needs, in the hope that when the old version breaks, you will come back to the same tat vendor to buy the newer model.
Software is the same story. When I first started, software cost a small fortune, but it was generally reliable. Now it is dirt cheap, but often gets near daily updates to fix problems, because proper testing was cut out of the "quality" chain, because it costs money.
Cheap flights anyone? Anyone really surprised that so many discount airlines are currently going bust? Nobody is willing to pay what it really costs to get from A to B, because we have become used to dirt cheap deals.
-
Friday 29th March 2019 18:34 GMT Charles 9
Re: Situation
"Cheap flights anyone? Anyone really surprised that so many discount airlines are currently going bust? Nobody is willing to pay what it really costs to get from A to B, because we have become used to dirt cheap deals."
Because any more expensive and people wouldn't fly anymore, you think? This may well be an untenable market where the P and Q graphs don't actually meet. Meaning the airlines are caught in a Morton's Fork. Keep going and get run into the ground or throw up your hands and just close now?
-
-
Wednesday 27th March 2019 12:39 GMT Anonymous Coward
We're powerless
I agree with most, if not all, of the views put forward in the article, but I can't see a way that we can influence the situation. If we could report bugs, we know that the reports will likely be ignored (and in any event, user error reporting is rather like witness statements - you have as many versions of the crime as there are witnesses).
The only thing I can see that would halt this decline in software (and system) quality in its tracks would be for the powers that be to change laws to make the suppliers (and particularly the decision-makers higher up that stop engineers doing a quality job) legally liable for the consequences of failure - and I mean actual people actually being prosecuted.
Until that becomes a trend, I don't see any way out of this.
-
Wednesday 27th March 2019 13:12 GMT johnnyblaze
Exactly the reason why companies like Microsoft are delivering software 'as a service'. There's no need to find/squash bugs anymore - unless they're showstoppers. Push whatever cr*p out the door you like, then worry about it later. When you've got nearly 1B beta testers, and are collecting reams of telemetry data from most of them, what could go wrong!
-
Wednesday 27th March 2019 13:36 GMT Daedalus
Forget not the bozocalypse
Too much technology, too few technologists.
Remember when driving a car was a skill not a horror movie? These random glitches are going to get worse because more and more of the basic functions of the car are being controlled by microchips. Meanwhile the number of people who can create and debug this stuff will stay about the same or even get smaller. When 1% of the people born each year have the brains to do this stuff, and the aging out numbers equal or exceed the numbers coming in at the young end, you get a situation of impending collapse. At some point obsolescence and hardware failures will exceed our ability to deal with them.
Don't be fooled by "coding for all", "anyone can do this". Go into a software or IT dept. and you pretty soon discover that a good fraction of the people there create more problems than they solve. And those are the people who took the courses, passed the exams, certifications etc. They're only there because their manglers are totally incapable of distinguishing horses from donkeys.
-
Thursday 28th March 2019 00:08 GMT Doctor Syntax
Re: Forget not the bozocalypse
"They're only there because their manglers are totally incapable of distinguishing horses from donkeys."
In a lot of cases the ones who were good got promoted to management, irrespective of whether they had the requisite skills for it. The result is that we have management who weren't appointed for their management skills, which are in consequence pretty average, overseeing work at which they'd be very good but which is being performed by a mixture of those who aren't good enough to be promoted and those too new to be judged.
(There's also an argument that some of the useless ones got promoted to management in the hope that they'd be less able to do damage there.)
-
-
Wednesday 27th March 2019 13:37 GMT arthoss
the demand for software is too high to have time to properly do it
Yes, there is much to do. And what do you do when companies are busy trying to churn out new shinier software (that admittedly sells better) instead of investing in the existing one...
Anyway, I just came here to contradict "Software developers know how to create and deploy software for which extremely low bug counts are guaranteed." Most developers are barely adequate, so I'm not sure they know how to create software with a low number of bugs.
-
Wednesday 27th March 2019 13:58 GMT yoganmahew
Re: the demand for software is too high to have time to properly do it
"Most developers are barely adequate"
This!
Not only are they barely adequate, but they really don't give a monkey's about the quality of the product. As long as QA signs it off, they move on to the next shiny thing. Technology-led is idiot-led flavour of the month; anything to avoid doing the hard work of tracking down and fixing bugs. Let's refactor! Or change platform!
Even when QA doesn't sign off, the shirkers' charter (agile) lets them put the defect in the backlog, either for some other charlie or for it to be abandoned as technical debt as they run out of time and budget.
-
-
Wednesday 27th March 2019 14:19 GMT heyrick
My new Audi A5 has a passenger window which goes down at unexpected times,
Take it back, demand a refund.
If they can't get something like a window right (and appearing to need some convoluted way of "fixing" it), how can you be sure that the engine management isn't going to freak out when you're on the motorway? Or better yet, cars like Audi have numerous airbags that include complex mechanisms to determine when to fire and how much inflation to apply. Given you can't test this, and you REALLY need it when you need it, can you trust such a system on a car that can't even manage to keep a window closed?
-
Wednesday 27th March 2019 14:59 GMT T. F. M. Reader
The cost of writing good SW is not incremental
Every time I hear a statement like "we can't afford to do it right" - and I've heard it uncountable times during my career, from all sorts of bigwigs in all sorts of companies - I know it's a lie. This kind of statement invariably masks the real state of affairs: "we cannot do it right". Or, at least, "we cannot do it right within any feasible timeframe and/or budgetary limit."
It is never "that extra ounce of expense or effort" as someone put it here in this forum. If you can do it right it will not take more time to do it. It will probably take you less time because, frankly, it will be hard for you to force yourself to think about what you need to do in wrong ways. You will also develop rather quickly and efficiently, unlike someone who is not "skilled in the art".
What really happens in the "we can't afford" cases is that the dev team simply do not think right in the first place. This leads to bad design that a) causes bugs (usually in design) and b) makes those bugs very expensive to find and correct.
In such a situation it is never a case of "give us an extra month and everything will indeed be of the highest quality". That will never happen. It is not a case of the developers being intellectually incapable of learning how to do things right. But they can't learn it on a week-long SW engineering course and there is no silver bullet like a new shiny IDE. Quite a few of them may "get it" after a couple of years of experience in some other organization and with good mentoring, but that's infeasible. And these guys are here and now - firing them and hiring really good people (who will be expensive) will also not help incrementally - those good people will a) not know how the current system works in the first place, b) will want to fix the current bad design that will not be an incremental effort. The fix will be a significant investment in parallel to maintaining the current line of products - and then switching the existing customers over will also be a pain. This is not impossible, and I've seen cases of such re-design, but such cases are few and far between and happen only in organizations that are both rich enough and forward-looking enough.
It does not help that the beancounters don't have a "savings" column next to "revenue" and "expenses" in their spreadsheets.
-
Wednesday 27th March 2019 16:29 GMT Anonymous Coward
Plenty of excellent comments here...but....
....no one mentioned "agile".
*
Once upon a time we had some sort of formal "requirements document". I know all the drawbacks about this....but at least there was something in place against which to write an exhaustive test plan.
*
Then we got to the idea that we WOULDN'T BOTHER WITH initial requirements. In their place there would be some ever-changing set of "user stories" comprising the "product backlog". And this means that testing can only be done one "user story" at a time. What happens a year later when "user story 999" conflicts with "user story 99"?.......no one has the foggiest idea.
*
And each "user story" now gets deployed ("continuous development", "devops").....and users in production get to find out that there might just be an issue. And this happens because the developers -- even if they wanted one!!! -- don't have a comprehensive test plan.
*
Of course the "product manager" is a representative of the user community though this process....so anything that goes wrong in production MUST BE THE USER'S FAULT. Brilliant.......no one at all is responsible for quality.....not least because without a comprehensive test plan, there's NO DEFINITION of quality.
*
Oh dear!
-
Thursday 28th March 2019 11:07 GMT Primus Secundus Tertius
Re: Plenty of excellent comments here...but....
I retired before 'agile' became the buzzword of the day. But I suspect it made little difference really: most 'design documents' were just box-ticking tokens for the management plan, repeating the user's requirements without discussing HOW to implement them.
That was fine in the 1950s, when computing was just automating arithmetic on a large scale using FORTRAN, but it has been totally inadequate for supporting any kind of business.
-
Wednesday 27th March 2019 16:37 GMT JulieM
And this is why I will never pay for software
There is a reason why I have never paid for a piece of PC software in my life, and in all probability will go to the grave without ever having done so.
When you pay for software, you have neither the right to modify it to suit your requirements, nor the ability in extremis to take that right by force. You are entirely beholden to the supplier's whims and caprices. They can charge you more money if you want to add more users. And at any time, whenever they decide, they can render it obsolete by altering the formats in which data are saved; so anything saved out of a newer version becomes incompatible with your version.
When you use Open Source software, you -- or any competent programmer -- can study its operation and adapt it to your own requirements. You can make as many copies as you like, and the only limit on the number of users is imposed by your hardware. Upgrading to a newer version need not cost any money. If data formats are changed incompatibly and for some reason it is not possible to upgrade to a newer version of the software, or upgrading would require other software to be upgraded, it is possible to write a separate program to translate between "new" and "old" formats.
Until paid-for software comes with the same freedom to study and adapt it (even without the freedom to share it) as Open Source software, it really isn't a hard decision to make.
-
Wednesday 27th March 2019 22:57 GMT aaaa
Re: And this is why I will never pay for software
> Until paid-for software comes with the same freedom to study and adapt it (even
> without the freedom to share it) as Open Source software, it really isn't a hard
> decision to make.
Absolutely yes.
The problem is not paying for software. It's the T&C's (as many other previous posts pointed out).
Yours was the first post I saw point out that the fundamental problem with the T&C's is access to source code, so that if it's not commercially viable for the vendor to fix your problem - you can fix it yourself.
I run a company that sells software. All our software includes source code. Back in the day this was always done - a small company selling software to a large company could expect the customer to require a copy of the code kept 'in escrow' in case the small vendor disappeared. We got around that by simply supplying the source with the commercial binaries, and a licence making clear that the source is copyrighted by us and they can't resell the software or create derivative works. I'm not sure how many customers use the source; I've had a few reach out to point out missing headers that we forgot to include, so some clearly do check it. I only know of one customer who has ripped us off - but that was just by using 3000 copies of the binary when they were licensed for 10; no evidence they even tried to re-compile the source to do it.
I use a lot of open source software - and pay for all of it. I either donate, or if the vendor has a 'commercial' partnership arrangement I use it. Plus of course I submit bug fixes, donate the time of my dev team to work on code (because it helps us in the long run), etc.
-
Thursday 28th March 2019 00:24 GMT Doctor Syntax
Re: And this is why I will never pay for software
"When you use Open Source software, you -- or any competent programmer -- can study its operation and adapt it to your own requirements"
There are a few big caveats here which severely limit this in practice.
One is the need for expertise in the appropriate area. If you don't have it, you can seriously screw up, especially if what you do gets pushed upstream. Remember the Debian random number fiasco. You may have the required expertise in one or two areas across the functionality of a richly featured distro, but you're unlikely to have it across the lot.
Another, similar, is the multiplication of languages in use.
A third is that life's just too short. I regularly had to hack each WINE release to remove a deliberate misfeature until the requirement went away. I'd hate to repeat that across many other packages.
But finally, this entire approach is essentially elitist. Most people acquire software to do a job, not to study it. There's no good reason to expect that users should have the knowledge to do any of these things. Relatively few people have the skills required, and most don't have the time, inclination or maybe aptitude to acquire them. They expect software to just work and that, really, is a reasonable expectation, and unless there's a big "not yet ready for production use" sticker on it, that's what they should be provided with, irrespective of whether it comes from closed or open source, commercial or free as in beer.
-
-
Wednesday 27th March 2019 16:51 GMT Anonymous Coward
@JulieM
Quote: "...I have never paid for a piece of PC software.."
Sorry, but I simply don't believe this....every MS-DOS or Windows computer you ever bought meant that a portion of the purchase price went to Redmond, Washington, USA to pay for software.
Every Apple computer you ever bought included software costs.
......or have you only ever bought "barebones" boxes with no software at all? If this is the case, I applaud your devotion!
-
Wednesday 27th March 2019 16:59 GMT Gene Cash
> have you only ever bought "barebones" boxes with no software at all?
That's what I've done, since the machines I want don't exist (an i7-3770 with watercooling for silence) or were eye-wateringly expensive. I've had to accept that if I want a new PC, I have to build it myself. And then slap Linux on it.
-
-
Wednesday 27th March 2019 16:56 GMT JohnFen
I could not agree more
I've been watching with increasing dread and despair as my industry, one I've been very active in for 30 years or so, produces software whose quality is clearly decreasing every year. When I get on my soapbox about this, graybeards tend to agree, but (significantly, I think) the younger developers simply don't see the problem or actively believe the reverse is true.
There needs to be an industry-wide recognition that the way things are going is unsustainable, and we need to look at and fix the methodologies that have led to this deplorable state.
-
Wednesday 27th March 2019 17:44 GMT JulieM
Re: I could not agree more
Maybe start with a new law that Source Code is not secret from End Users, guaranteeing users access to the Source Code and the rights to study and adapt the software they use, even if not the right to share it. Not giving out the Source Code certainly has never stopped anyone from making pirate copies; and if anybody rips off your Source Code for use in their product, it will be obvious when you read the Source Code they are now obliged to show you.
When you know that other people are going to be looking at your code, you take a bit of extra care with it. It took a significant amount of effort to get OpenOffice.org even to build on any system that did not have both address and data buses 32 bits wide, because of some basic schoolkid errors that the original authors thought nobody would ever see.
-
Wednesday 27th March 2019 18:44 GMT JohnFen
Re: I could not agree more
"When you know that other people are going to be looking at your code, you take a bit of extra care with it."
Working this way has been something that I developed as a habit a very long time ago. Even with my projects that aren't open source, the odds are very high that other people will be looking at my code -- even if the "other people" is future me. Future me doesn't know nor care about what present me is thinking, so the code that present me writes has to be inherently clear to future me.
-
Wednesday 27th March 2019 19:19 GMT Anonymous Coward
Re: I could not agree more
But that could have unintended consequences, such as all software developers in the affected area packing up and leaving because they don't want to Give Information To The Enemy. What happens when suddenly there are no more takers because being forced to publish their source is too high a price?
-
-
-
-
Wednesday 3rd April 2019 05:01 GMT Charles 9
Re: I could not agree more
And I'm serious (utterly serious) that you can't count on that. If the P and Q graphs don't meet (in this case, because the compliance cost pushes P too far to the left), you end up with a market that can't be met, where it doesn't make sense to even go in. You can never rule out that possibility.
-
-
-
-
Thursday 28th March 2019 10:21 GMT JulieM
Re: I could not agree more
Such a law, as I am envisaging it, would explicitly ban vendors who refuse to release the Source Code from selling their software. If the government wanted to play hardball, they would also annul the relevant copyrights in the now-unsaleable software -- they give them, and they can take them away. So the vendors might think they were taking their ball and going home ..... But everyone would still have copies of their ball. And a whole new Open Source project could start out of the decompiled remains of the original, formerly proprietary code.
Besides which, "giving information to The Enemy" is not really a hardship if everyone has to do it.
-
Thursday 28th March 2019 12:30 GMT Charles 9
Re: I could not agree more
"Such a law, as I am envisaging it, would explicitly ban vendors who refuse to release the Source Code from selling their software."
Right, and if they refuse to release their source, there is nothing for anyone, not even the government, to look at. Even if the governments nullify copyrights, they're doing so on ghosts because they'd rather not play. How do you read how to do things when everyone takes the only copies of their trade secrets (which they refused to publish) with them? That's one of the reasons behind patents and so on: to convince people to open up rather than keep everything close to the vest.
-
-
-
Thursday 28th March 2019 00:28 GMT Doctor Syntax
Re: I could not agree more
"When you know that other people are going to be looking at your code, you take a bit of extra care with it."
Sadly, that's not always the case. I've seen some real stinkers from open but proprietary code. The best you can do in those cases is educate the vendor.
-
-
Saturday 30th March 2019 15:29 GMT Doctor Syntax
Re: I could not agree more
Too true.
I spent a whole afternoon going through vendor's code to work out why a weekly invoice run had interrupted my pub lunch two Fridays running. It was causing the database engine to use more & more memory until it ran out of its allocation. I eventually drilled down to repeated statements that asked the engine to allocate memory without freeing it, buried in deeply nested loops. I showed them how to allocate it once and then reuse it.
A while later I had a gig with that vendor to do the on-site install of a part-custom set-up for a big customer. When we got round to testing I found it was going to take over 24 hours to load each day's data. It looked very odd: a single database engine thread seemed to be doing all the work. It transpired that this was even worse than the previous case. They were just running in-line SQL statements to load each line of data rather than a PREPAREd statement or anything more sophisticated. All the SQL parsing was done by the thread handling the connection to the application. Once parsed, the job was distributed over multiple threads as expected, but the load on each of these was negligible compared to the thread needlessly parsing the same SQL.
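The difference between re-parsing in-line SQL per row and a prepared, parameterised statement is easy to demonstrate. A minimal sketch using Python's sqlite3 module (the table name and data are made up for illustration; the vendor's system would have been a different engine, but the principle is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (customer TEXT, amount REAL)")

rows = [("acme", 12.5), ("globex", 99.0), ("initech", 7.25)]

# Anti-pattern: build a fresh SQL string per row, so the engine
# must parse (and plan) essentially the same statement every time.
for customer, amount in rows:
    conn.execute(f"INSERT INTO invoices VALUES ('{customer}', {amount})")

# Better: one parameterised statement, parsed once and re-executed
# with new bind values for each row.
conn.executemany("INSERT INTO invoices VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM invoices").fetchone()[0]
print(count)  # 6: three rows loaded each way
```

On a toy data set the difference is invisible, but at the scale described above the single parsing thread becomes the bottleneck while the worker threads sit idle.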
-
-
-
-
-
Wednesday 27th March 2019 18:15 GMT J27
I fully agree with this article. Being rushed into pushing features out the door with minimal or even no testing is the norm these days. Continuous deployment, devops and all those new buzzwords make it much worse, because the user is pretty much always using a beta. But as for it being more expensive? In the long run you can cut down the time spent chasing bugs if you spend more time making sure your software works at the beginning; the main issue is the push to get things out the door faster. This is mainly due to pressure from people who know nothing about software, like sales departments, and from direct customer requests.
I often feel guilty about having to work like this, but there isn't much I can do. I'm constantly pushing, but it's not up to me.
-
Wednesday 27th March 2019 18:32 GMT Boris the Cockroach
I think
The solution would be this: when training up new coders, give them a basic task such as a flight control module that keeps the plane flying straight and level and reacts well to any side wind.
Then the coders have to get aboard that aircraft and fly in it over say 1/2 mile at an altitude of 1000 feet.
Then that experience will stick with them and you'll get better code..... also it sorts out the crap coders who really shouldn't be doing the job.
I write code for a living ... someone has to tell those robots what to do.... if I get it right all is good... if I get it wrong the part is fired out of the machine, through the safety window and hits the wall 50 feet away still spinning.
much more fun than coding a cranky widget that may or may not respond if 3 other widgets are pressed while focus is on a 4th
-
Wednesday 27th March 2019 19:21 GMT Charles 9
Re: I think
Sounds pretty expensive to me when you start wrecking lots of planes. Not to mention the Wrongful Death suits for sticking them into such a risky situation, a good lawyer would easily find a way around the "It's their fault to begin with" charge, as they'll say the "thrown out of the nest" technique tends to end up with a lot of corpses below with no guarantee you'll get any survivors.
-
Thursday 28th March 2019 08:24 GMT Mike 137
Re: I think
Thanks Boris, but the real problem we have to solve is the general acceptance of "coders" rather than "programmers".
In normal engineering practice we go from concept to design to implementation (three stages).
Software is largely developed in two stages (concept, implementation), leaving out the rigour and snag-finding provided by the design stage. This is the equivalent of a town planner engaging a team of bricklayers (however competent they may be) to build a new office block without employing an architect. Agile and dev-ops only exacerbate this well-established practice.
Fundamentally, a coder knows a language, the knobs and levers of a dev environment and its library but otherwise generally relies on "crowd sourced" knowledge of implementation, whereas a programmer understands the principles of converting required functionality into an algorithm and an algorithm into a robust implementation in code (independent of language and dev system). They may also preferentially know one language and/or dev system, but that's a side issue.
If you need evidence of the difference in results, see https://net.cs.uni-bonn.de/fileadmin/user_upload/naiakshi/Naiakshina_Password_Study.pdf
This is not a matter of "training" or whether there are penalties for poor performance, but of whether software development qualifies as an engineering discipline with all its implied education and validation of competence, and as software permeates ever more deeply into almost all branches of engineering the distinction is becoming critical.
For an exhaustive analysis of the problem from 2016 see https://www.nist.gov/document/integratedinfosecrfiresponsepdf
-
Thursday 28th March 2019 12:35 GMT Charles 9
Re: I think
"This is not a matter of "training" or whether there are penalties for poor performance, but of whether software development qualifies as an engineering discipline with all its implied education and validation of competence, and as software permeates ever more deeply into almost all branches of engineering the distinction is becoming critical."
I think it's more a matter of compliance and costs: both in money and especially time. When everyone wants unicorns yesterday because The Competition is breathing down their necks and the pace gets faster and faster, there comes a point when humanity simply cannot keep up yet they get harangued regardless, and anyone who pipes up gets discarded. Who around here has gotten rewarded for NOT releasing a new version, given no one believes in perfection (aka getting it right the first time)?
-
Thursday 28th March 2019 18:36 GMT Boris the Cockroach
Re: I think
Quote:
In normal engineering practice we go from concept to design to implementation (three stages).
Software is largely developed in two stages: concept, implementation,
Speaking as someone who's closely involved with 'proper' engineering and has qualifications in software engineering too, you don't have to explain the correct way of doing engineering in general, because both disciplines need the same validation... or you end up with the cockups I've dealt with (a 1" dia precision drive shaft running in a 2" diameter hole on the mounting plate.... or idiots who think that just because you can draw it on a CAD system it can be turned into metal....)
Software engineering needs the same 3 stages: design, building and validation, just as a brand new hydraulic actuator does. It's engineering: if you want it right there are no shortcuts, no easy ways, but as with all engineering you are faced with the same question when making decisions about the product.
You can have any 2 of the following 3:
Quality
Price
Delivery
-
-
Saturday 30th March 2019 15:41 GMT Doctor Syntax
Re: I think
The reason it's the same across the board is because manglements aren't taught the basics Boris explained.
Back in the days when I was still a not particularly humble employee we had one of those programmes where individuals from across the business were gathered together in groups to be "motivated" by senior managers. The senior manager in my group was going on about quality being delivered quickly and cheaply. Not being particularly humble at speaking up, I had no compunction about explaining the Iron Triangle to him in front of the group - all the rest probably knew it anyway. Even if, as it happened, he was the director of my part of the corporation. After all, I couldn't leave my oppos with the impression that that was the standard of knowledge there.
-
-
-
-
-
Wednesday 27th March 2019 23:08 GMT John Savard
Solutions that Work
I think pretty well everyone knows that the quality of commercial software products is not what it should be.
The trouble is, though, that consumers can tell what it does and how much it costs, but as far as how bug-ridden it is, it's much harder to tell.
And therefore, in the current market, there aren't even choices. There aren't suppliers of software known for their high reliability records that one could turn to.
Suppose IBM were to resurrect OS/2 and the Lotus/WordPerfect suite after doing thorough work to make them far less buggy than either the Windows or Macintosh alternatives. Would they actually manage to sell any?
Maybe there's a chance. Maybe they could start by selling them to offices that are sick and tired of being at risk for ransomware and the like, and who don't need more than a basic software suite. Years later, perhaps game companies would start making OS/2 versions of their games, so that consumers could consider it as a viable alternative to Windows.
But IBM got burned once with OS/2, and there are plenty of good reasons why this would seem like a hopeless waste of money to try. Who else would dare?
-
Thursday 28th March 2019 05:47 GMT SNAFUology
SNAFU - The industry was founded on unreliable software
Too right, but don't hold your breath.
"issue of software reliability" "the industry has been sleepwalking into a tacit acceptance of unreliable software"
The industry was founded on unreliable software
Microsoft Windows was IBM compatible, yet you needed a graphics accelerator, more memory, a bigger hard drive... etc. to work with any reasonable efficiency.
Some developers did amazing things to get their products to work, although it often came to nothing as other parts of their software could not overcome the SNAFUs.
"Sony Bravia TV takes 10-20 seconds to respond to a key-press on its remote controller"
I can vouch for that - & Bravia EPG in Aus, I can make a cup of coffee/tea and be back before it updates.
Cookies? Arrrg, too right - Paypal takes your money then refuses to give a receipt, telling you to go back, set cookies, then do it again - charging you for another item, when all it had to do was give you a receipt & exit. To complain you need a 3-part application and a note from your mother.
-
Thursday 28th March 2019 08:33 GMT Potemkine!
Not only more expensive, but also late
First, I doubt it would be possible to eradicate all bugs from software that is getting more and more complex.
But let's suppose it were.
Following the 80/20 rule, eradicating the last 20% of bugs will cost 80% of the whole time needed for the project. Making bug-free software will be costly, and it will also take a long time. The product may then arrive too late on the market; other solutions could be available before that. It's sometimes better to have a not-so-perfect tool than no tool at all.
It's all about context: if the software risk analysis shows lives could be at stake, then the debug process should be as complete as possible. If not, I think software issues may be an acceptable trade-off, provided the software meets a need anyway.
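The 80/20 claim above has a stark per-bug consequence worth spelling out. A back-of-envelope sketch (the bug count and effort units are entirely hypothetical, chosen only to make the arithmetic visible):

```python
# Hypothetical project: 100 effort-units of debugging, 1000 bugs total.
total_effort = 100.0
bugs = 1000

easy_bugs = 800                      # the first 80% of bugs...
hard_bugs = bugs - easy_bugs         # ...and the stubborn last 20%
easy_cost = 0.2 * total_effort       # soak up only 20% of the effort
hard_cost = 0.8 * total_effort       # while the rest takes 80%

per_easy = easy_cost / easy_bugs     # effort-units per easy bug
per_hard = hard_cost / hard_bugs     # effort-units per hard bug
print(per_hard / per_easy)           # ~16: each remaining bug is 16x dearer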
-
Thursday 28th March 2019 09:57 GMT Adam 1
Another take on this problem was pointed out by "uncle Bob" in a lecture I saw but am now too lazy to find the link.
The number of people that you would loosely define as "computer programmers" has roughly doubled every five years since the 1960's. Or to put it in a more frightening way. About half the code warriors involved in every piece of software you might buy today have less than 5 years experience. Many haven't yet been burned by the shortcuts they think they can get away with, and many in that bracket aren't yet at the levels where they can push back against the PHBs demanding dangerous processes (or more usually lack thereof)
-
Thursday 28th March 2019 10:39 GMT Anonymous Coward
Productivity
In many software development workplaces * take the extra time to design, implement and est your software carefully (manulaly and adding automated tests ) and you get into hassle for lack of productivity due to the extra time taken as philosophy is just get new cahnges out of the door quickly.
* There are (unfortunately) many without proper testing and QA regimes where main aim is to crank out code as rapidly as possible
-
Thursday 28th March 2019 21:07 GMT John Savard
Also...
Add new features, and you can slap a new release number on it, and charge money to upgrade to it.
Fix bugs, and people expect to get that for free. (Because they expect the bugs shouldn't have been there in the first place, which is right.)
So businesses figure adding features is a profit center, while fixing bugs is a cost center.
And there's no way, it seems, to punish companies by not buying their software if it has (too many) bugs.
-
Friday 29th March 2019 10:08 GMT johnrobyclayton
Idiocracy
I remember a scene in the movie Idiocracy.
It was in the triage section of the emergency department of a hospital.
There was a girl in front of a full graphical patient classification interface.
It had pictures of red marks on a picture of a body and she would select which one that most matched the problem described.
It had all the complexity of the classic game operation without the requirements for good hand eye co-ordination. The icon pictures were rather large.
I have worked in support for many point of sale companies.
I have seen the evolution of this type of software on a daily basis. I have heard comments from the owners of more than one business that they wanted software that someone who was not very good at reading to be able to use.
Users get stupider and stupider because they are allowed to by companies that want to sell to the stupidest people they can find because these are the ones that will pay the most for software no matter how many issues it has.
The demands on software are increasing every day not just by the increasing complexity of our operating environment but by the requirements that stupider and stupider people need to be able to use it safely and profitably.
The problems of allocating blame for software that fails to meet requirements is a difficult one because it is the responsibility of everyone that contributes to its development, design, testing, marketing, use, selling, purchasing, training, legislating, securing, compliance, etc etc etc ad infinitum.
-
Saturday 30th March 2019 09:24 GMT Anonymous Coward
Re: Ideocracy
@johnrobyclayton
Quote: "...The problems of allocating blame for software that fails to meet requirements..."
*
1. A LONG while ago, J Edwards Deming pointed out that "blame" is not a useful concept. His point was that, in most circumstances, once a problem has been solved almost no one is interested in "blame". In fact, introducing the concept of "blame" BEFORE THE PROBLEM IS SOLVED almost certainly gets in the way of a solution!!
*
2. Then there's the concept of "requirements". You may not have noticed that since the adoption of fashionable processes such as "agile" and "scrum", the concept of "requirements" has been replaced with the idea of an ever moving "product backlog" comprising fragments called "user stories". This fashionable method of software development has resulted in the COMPLETE inability to develop or deploy a comprehesive test plan for a software product. So your idea of something that "fails to meet requirements" is so 1990's!!!
*
Oh dear!
-
Saturday 30th March 2019 09:52 GMT Anonymous Coward
"...fails to meet requirements...."
...and then there's the fact that fashion has also removed the words "bug" and "defect" from the dictionary.
*
Today it's fashionable to talk about "technical debt"!!!
*
Today no one ever writes bad code or caused a problem in testing or deployment...."technical debt"....suitably bland and completely anonymous!!
-
-
Sunday 31st March 2019 15:31 GMT livefree
Functioned as designed
Your larger point is well taken, and I agree. But in the specific case of the 737 Max software, we will find that it is a "garbage-in-garbage-out" problem, the roots of which go way, way back, before "software updates" were even a thing. In this case, the algorithm is flawed.
In every motion control system, there are actuators and sensors. For example, "move the control surface" and then "what is my new angle of attack?" In the 737 Max software, you have an inverse (divergent, open-loop) problem, that is a single-sensor reading causing an unrelated actuator to engage. This problem will be fixed by using a redundant sensor, that is, either a second angle-of-attack sensor, or an accelerometer / gyro / wind speed / air flow sensor to confirm the initial sensor. AND THEN make the decision to change a control surface. Normally this "redundant sensor" would be the pilot, but this video game programmer who wrote the 737 Max software has optimized this critical step out of his algorithm. In other words, an actuator has been commanded based on a single sensor reading, and faulty logic of the software designer. If this sensor is defective, the logic fails.
The software probably functioned as designed. (It always does.)
-
Sunday 31st March 2019 19:55 GMT Colin Bain
Walk away
We had a Doctor who did not take blood pressure at the visit, even though we had high blood pressure and one of us had kidney disease. On the exit interview the enticement to stay was that they were adding extra services. Our responses was that if you can't get the basics right then why would we stay. They then asked if we wanted to see the doc. They looked at the calendar and suggested one 3 weeks hence. At that point we said point made and never darkened their doorstep. Why this matters is that software is the same. If it doesn't work, repeat business is lost and so are the future recommendations and reputations.