Shortage? What shortage?
What happened to all those Indian hordes of Java programmers that have been flooding this country for the last 30 years or so? Does it mean it's been all for nothing?!
Shoddy software cost the US an estimated $2.08tr in 2020, according to the Consortium for Information & Software Quality (CISQ). That's down slightly from a revised 2018 total of $2.1tr but still isn't anything to brag about. In its 2020 report, The Cost of Poor Software Quality in the US, the Massachusetts-based standards …
Most of them were next to useless and couldn't code their way to a 'Hello World' app in a month of Sundays without referring to the internet first, especially Stack Overflow.
We blocked that site for a month and our so-called 'consultants' from India did virtually no coding. Not having the ability to cut and paste code (that often didn't work) really stymied them.
I'm sure that there are many other examples.
I had one "expert" come over who was meant to consult on deploying a new solution. All he did was cut and paste large swathes of vendor documentation (and I mean I was able to put in an entire paragraph and get an *exact* match, word for word).
Let's just say he didn't stick around too long....
Why is the situation so grim? The report argues there's an IT talent shortage, a claim others have made as well.
Perhaps rather than complaining about an artificial skill shortage it might be time to take a really good look at methodologies as well, and reassess whether trying to shoehorn agile into every single project was such a great idea. Who knows, it might not be the best approach to every single application/project.
Too many people trying to push overly aggressive changes in too short a time.
When working on large projects any changes have to be carefully thought out. Agile is not so useful when you are working on a mature system. You cannot just plug in half a screen or table and hope it will work.
China Joe Biden will bow down to the tech companies that helped him steal the election.
He'll open up the H-1B flood gates, bringing in hordes of incompetents, and also demand to retrain the unemployed baristas to become 10-week wonders.
The truth is that we need good software engineers. But it takes 4-5 years of training.
As part of our interview process we have a test that involves some fairly simple coding questions that must be answered without looking anything up. You would not believe the number of applicants who either walk out (these are what I'd call scammers) or fail miserably (these I'd call lacking in any noticeable skill). And the quality of applicants doesn't increase for senior positions; often the applicants for senior positions are worse than the junior ones.
Just been let go by a US bank. They have lost all respect for any IT skills/aptitude and have the attitude that anyone can code, as long as it's all API based or whatever weed-enhanced fad has been dreamt up in California this week to sell more books. All passwords being replaced with trusted authentication, for example, even where it's blindingly obvious that it leaves all your data sources open for abuse.
It is blindingly obvious that the US bank you just left is ripe for a Post-COBOL Apocalypse event. The kind that happens when obvious and relatively easy to understand COBOL code is abandoned for "sexier" languages, known to coders as, "Job Security". [Posting anonymously because, among other negative traits, brogrammers don't take criticism well.]
Although I agree that a lot of people could be retrained to code, 'retraining' in most politicians' minds is to put a coal-miner in front of a computer and tell him to 'work smarter, not harder'.
In their mind, 'IT is easy': they're using their smartphone without much of an issue, so how much harder could it be to write code?!
The reason why so much money is wasted in IT is that the 'Dunning-Kruger' effect is strong in a lot of managers.
This is why HP flourished when the original founders were still involved, and the finest quality product they put out today is printer ink.
As a friend put it, "Computer programming is the most difficult branch of applied mathematics."
No. Most people really cannot be trained up to that level. In fact, most people currently employed as programmers cannot. What they can do is produce a lot of code that mostly works in the two or three cases that they thought about. When the bug reports come back, they make the entire thing more complex (introducing new bugs) trying to close it.
Getting this stuff right is HARD. Just like most people cannot be "trained up" to be civil engineers or aerospace engineers, they cannot be trained up to be software engineers. And that is where the shortage lies.
> No. Most people really cannot be trained up to that level. In fact, most people currently employed as programmers cannot. What they can do is produce a lot of code that mostly works
Completely agree. Hence my use of the term 'coders'. There's a huge difference between being able to code and doing software engineering.
Just the representation of 'hackers' in popular media shows the infantile perception that most people have of IT. That lack of knowledge is the main reason for the wasted $2 trillion/yr.
Combine that with managers who can't think much further ahead than the next financial quarter and it's no wonder IT is in the state it's in.
In a software firm you need a core group of programmers or software engineers to create the program logic and flow and build in the security etc., then you need a myriad of coders to translate this logic into the myriad of languages used across the myriad of platforms.
Manglement, in their effort to cut costs, asked why we were paying the core group so much, why not just pay everyone the same? So quality did as it does and left the sinking ship. This is why we see fewer and fewer innovative features, and more tweaks to current code, building up more and more technical debt....
If you pay me enough I'm willing to learn any language that exists, COBOL isn't even particularly complicated it's just very very unproductive. I'd argue that any truly proficient developer would be able to pick up nearly any language fairly quickly with the right level of motivation.
While the article qualifies the usual assertion of a developer shortage by noting a shortage of good developers, I simply cannot see the difficulty.
We all know the definition of a good developer, it's someone who delivers on time and on budget. Accomplishing this goal merely requires a calendar and a calculator, both of which are included with Microsoft Windows and all the other (admittedly minor) players in the desktop and server space.
The CASE tool renaissance of the late 1980s taught us that the act of writing code is so trivial a task that it can be automated. The Y2K crisis taught us that anyone can engage in the act of writing code having augmented their keyboarding skills with a "Learn language in n [hours|days|weeks]" book.
Disabusing decision makers of the above load of equine excrement is going to be difficult. In the middle of the last decade of the previous century I recited to my boss that old saw about "cheap, fast, or efficient - pick any two." He replied that he'd heard that one, and he wanted all three. I took another offer soon after. And no lesson was learned.
The mindset that software developers are resources to be not just used, but used up, is pervasive. That software development and project management are orthogonal remains unknown to those who design org charts and compensation plans.
This is not a problem that gets fixed by adding more developers, no matter how good they are.
The core new clear problem to see is always the same and is something which cannot be fixed, but that doesn't stop deaf, dumb and blind and intellectually challenged systems administrations from tasking future developers and systems administrations with the same impossible and increasingly rapidly eventually inescapable and personally identifying self-destructive task ...... the ever more complicated defence of the indefensible that constantly inspires and feeds ever more stealthy and increasingly damaging and deadly attacks.
The simplest of complex solutions is easy to see and share ....... Stop trying to fix something impossible to fix and just simply decline to continue to defend the indefensible and inequitable and iniquitous.
IT aint FCUKing Rocket Science ...... Greater Common Sense.
What this needs is an acronym.
From the article: the report calls for better software defect detection and remediation of identified vulnerabilities
B.S.D.D.n.R.O.I.V (ok maybe not)
But I usually solve these *kinds* of problems through Super High Intensity Testing.
And that's an acronym that's easy to remember!
the answer is not detecting the defects and fixing them
the answer is not making the defects in the first place.
this is why you need programmers and engineers, not coders.
Here's an acronym for the author of the piece:
Forget Usual Coding Knowledge Or Finding Faults, Treat With Adept Talent
I do not consider myself a developer, nor a coder, but a programmer. I will use a library if I know what it does inside, outside, topside...every side in every conceivable case I can muster. I will test it thoroughly as I test my own code. I am costly in both time and money, but strive to produce as correct code as I can; aka quality.
Many, many years ago, I was contacted to produce a payroll system for a company which had expanded from a single tax jurisdiction to multiple. They wanted the thing done in a month; they could not wait. I replied I would require no less than three months, with one or two in contingency, until I could analyse the requirements. I did not get the job. Fast forward five years and I found out the "coder" they hired instead of me had delivered in two months instead of the agreed one. And they were still working out bugs and had dozens of workarounds to process payroll, including pre- and post-processing with a spreadsheet. They are now out of business.
The problem is unwillingness to pay for suitable quality. This goes far beyond software in IT; IT security is afflicted too. It also does not help that the current trend in software is change for change's sake, and the abominations we are seeing are horrendous. There are losses due to the abandonment of well-reasoned user interface elements. How much time (money) is wasted when a form rejects input on submit because the input had been formatted in a common form, only instructing the user of the expected format after the fact? Not really a rhetorical question; I just do not have the resources to make that determination. I can write a routine to transform input in common forms into the desired form for processing; I do not consider it to be time consuming or challenging. There are so many examples of poor programming, I could write an encyclopedia on the topic.
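The input-transformation routine the commenter describes really is cheap to write. A minimal sketch in Python, assuming a hypothetical 10-digit national phone-number field (the function name and the length rule are illustrative, not from any particular form):

```python
import re

def normalize_phone(raw: str) -> str:
    """Collapse common phone-number formats to bare digits."""
    digits = re.sub(r"\D", "", raw)   # drop everything that isn't a digit
    if len(digits) != 10:             # assume a 10-digit national number
        raise ValueError(f"unrecognised phone number: {raw!r}")
    return digits

# All of these common forms normalize to the same canonical value,
# so none of them needs to be bounced back to the user on submit:
for form in ("(555) 123-4567", "555.123.4567", "555 123 4567", "5551234567"):
    assert normalize_phone(form) == "5551234567"
```

A few lines like this on the server side beat an error message telling the user to re-type what they already entered correctly.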
I agree training more developers is not the fix. The problem is deeper and requires a more complicated mitigation and is not likely to be accepted by those who make the decisions.
Referring to notes on Sainsbury's online shopping account, (1) the password must have big and little letters, numbers, and NO spaces, (2) I didn't know that when I went about setting it up, and (3) I've never yet used it.
Another service apparently doesn't work with a Yahoo e-mail account - I mean, yes, I know, but why shouldn't I. Nowhere does it say, while creating an account, that it won't work. This includes the account user authentication e-mail. Result, no usable account and a big waste of time.
Despite the shortage of developers, agencies will reject those of a certain age. Past that age, stay where you are (if you can) because getting a new role will be tough. Of course, rejection for reasons of age is illegal, so it's because "the company wants people to be there long term".
The mystery is why an industry filled with educated people decided to outsource hiring to the recruitment industry, an industry filled with people that are smart in a different way.
> The mystery is why an industry filled with educated people decided to outsource hiring to the recruitment industry
They’re the same reasons why a cohort filled with educated people decided to outsource finding a mate to the matchmaking industry — viz the convenience of someone else doing the drudgery of weeding out clearly unsuitable potential partners, and the belief that a specialist in making matches can find an ideal partner for someone more readily than that selfsame someone could.
The clients of the recruitment industry want their developers “just so” — possessing every technical and personal skill that is needed for a position so that they can “hit the ground running“ on the first day of work. However, those clients are generally not willing to either (1) offer enough to tempt fully qualified but happily employed developers away from their current positions, or (2) invest enough into mostly qualified but frustratingly unemployed developers to bring them up to speed to the clients’ particular skill requirements. As a result, that industry filled with educated people is perpetually bemoaning the lack of availability of suitable developers — which does nothing to help unemployed developers, be they in chromotrichopause or no. Until that industry realizes that in one way or another they’ll need to use more from those hoards of venture capital to get the employees that they want, the situation won’t change, much to everyone’s detriment.
As an aged migrant to IT with non-IT STEM degrees (early 50s when I moved) I landed a couple of jobs. One was to gain experience (the pay sucked but the experience was needed). The next job has much better pay and long-term stability (I switched industries). The real issue is not that there are no skilled programmers available, or people with enough general technical skills to transition to IT; it is that many employers think if you're over 30/35 you are washed up and untrainable. But what they throw away is experience, and the wisdom experience teaches.
I haven't had that problem at all --- I'm in my late fifties and get a steady stream of contacts from recruiters.
Or maybe the "certain age" you're thinking of is somewhat older than mine?
Around here there seems to be plenty of demand for experienced developers.
"Of course, rejection for reasons of age is illegal so it's because "the company wants people to be there long term"."
The interesting thing is that most younger workers are going to job hop. Older workers aren't as interested in moving house every six months and may have family nearby they don't want to move away from.
Companies have created this monster themselves. For many people, the only way to get a raise or move up the ladder is to change jobs. That will be their first move if they see the company bringing in outside people over them rather than promoting from within.
You cannot write decent code if you do not understand the context. As in, the business, the workflow, what the code should actually do.
How many coders (or anyone involved with coding) ever have that type of knowledge?
Bugger all is the answer. So, this naturally leads to the result that all you get is an “Agile” serving of bad spaghetti.
Yes. But that treats coding as a tool rather than as a specialized profession. I've come away from a few organizations that hid all their software devs in a separate department. (With its own highly paid management structure.) And insisted that anything involving 'code' could not be touched by their rank and file engineers.
Code is more like a tool. I expect any competent mechanic to be able to handle wrenches, screwdrivers (and most importantly) large hammers. I expect that their knowledge extends to diesel engines, washing machines or clocks. And they know when to select the proper tool as needed. I'd hesitate to hire a mechanic whose field of expertise was 'pipe wrench'. But that's what we do for software.
Sure, there's tons of code out there that "works" today and will be upgraded next week ... maybe. Don't worry if the upgrade has a bug, because we can update it later. Hardware is built the same way too: phones work great, then the battery dies and you need to buy a new phone. We're told that keyless cars are more secure and easier to use, but they get hacked or just driven off because the driver left the key fob in the car at the gas station. Nobody buys CDs and records anymore because it's easier to stream music; this is killing new bands and artists but it's making the streaming companies rich, and the phone companies are building new phones all the time ... everyone's making money from unreliable products.
Look at the Irish backstop and the great Brexit agreement that achieved everything ... written the same way that applications are written: achieving a solution today that will need an update ... this is normal, isn't it?
Yes of course. The Brexit agreement has been developed in an Agile Fashion. Now of course it will require endless additional releases, without any time spent on the technical debt that will be introduced as we go. So yes it is normal...
Knowing how to implement something isn't the same as knowing how to design it. The two skills might look the same to the untutored eye, but just as knowing how to throw a plank over a ditch doesn't make you a bridge builder, knowing how to code doesn't make you a programmer. Coding is an implementation skill; it's essential, just like being able to read and write. But just as being able to write doesn't automatically make you a great author, being able to code doesn't make you a programmer.
It doesn't help that a lot of coding is done inside frameworks that hide most detail from the coder. This is great from a cost-savings/productivity perspective, but it has serious limitations should the requirements of the job fall outside the capabilities of the framework. Here experienced programmers will know just where the edge is -- how far you can go inside the framework before things fall apart -- and either adjust the design to fit inside the framework or build a new structure that accommodates the requirements of the task.
One of the major skills where I work isn't the coding; often that is pretty straightforward. It's properly interpreting the requirements, and knowing when to get clarification of the requirements. The translation of the requirements into working code is the real skill. A skill which is broadly language independent (obviously you need to know the language in use on site).
It's not just IT. It's not just corporations. It's society.
Consumers, in general, are unwilling to pay more or wait longer for a quality product.
Take burgers. It is easy to find a better burger than McDonald's serves but it costs more and takes longer to prepare. Hence most people opt for McDonald's or other fast-food chain.
Or take Cyberpunk 2077. The company had to meet a holiday release deadline or they would lose sales and they couldn't raise the price because, again, they would lose sales.
How many of us would pay extra for a formerly free and open source product if it meant that the developer would be paid full-time and could hire additional full-time developers.
> How many of us would pay extra for a formerly free and open source product if it meant that the developer would be paid full-time and could hire additional full-time developers.
Sure, absolutely, but only if I kept getting the source code under the same terms, i.e. being able to modify and enhance without paying some kind of subscription fee in perpetuity even if the developer(s) stopped developing.
It's Leo all over again.
When Leo came out, they'd develop a business system by understanding how it worked and make a bespoke system.
Naturally businesses caught onto this and started to develop systems that were bog standard and quoted low prices that caused the client to accept software that wasn't made for them and their business suffered.
History is doomed to repeat itself if you ignore it....
The icon is appropriate.
Many many years ago, I wrote code. A specification was written for us by somebody who sort of understood the problem, but nothing about the application of that to a computer. That was supposed to be our job, to take the spec and make it code.
Only, it wasn't. Our job, it transpires, was to take the spec extremely literally and make it code. An example I remember was a set of values that were multiplied by powers of two. I forget the reason, index offset maybe? But it was basically a calculation that the sixteen bit processor could do as a simple arithmetic shift. But no. The spec said that the value had to be multiplied, so a costly multiplication routine needed to be invoked. But we couldn't write one because we already had an even costlier multiplication routine that worked with fixed point numbers. So, yes, we took a value and converted it to fixed point and multiplied it and converted it back. Hundreds of machine cycles for something that ought to have been one instruction.
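The shift-versus-multiply equivalence in that anecdote is easy to demonstrate; a quick sketch (in Python, purely for illustration -- the original would have been one 16-bit machine instruction):

```python
# Multiplying by a power of two and an arithmetic left shift are the
# same operation; the spec's "costly multiplication" could have been
# a single shift instruction on the target processor.
def mul_pow2(value: int, power: int) -> int:
    return value << power          # equivalent to value * 2**power

for v in (0, 1, 7, 1234, -5):
    assert mul_pow2(v, 4) == v * 16
```

Any optimizing compiler will make this substitution automatically today, which only underlines how absurd the fixed-point round trip was.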
Why? Because the spec said so and some asshole manager wasn't going to accept any deviation from the spec. NONE.
The project was full of stuff like that, and arguments between the programmers (I was just a junior keyboard monkey) and the manglement were common. And since the higher ups understood their sycophants and not us, we were overruled every single time. I left before the project was anywhere near finished (and already late) and it was an utter piece of shit. Bloated, slow, horrific. Nobody wanted to put their name to it. But, alas, it completely followed the spec. The spec written by somebody barely competent to use a pocket calculator.
Suffice to say, I took a completely different career path. Pays less than a developer, sure, but less stress and I got to meet some interesting people. Best discussion I ever had with somebody was working as a carer in a nursing home with this excited old lady jumping up and down about as much as possible in a wheelchair. Why? Official secrets was up, she helped bust the Enigma and had been waiting a lifetime to talk about it. And she did. Utterly fascinating how it worked and the methods used to reverse engineer something just by looking at the encoded messages.
I don't regret my decision, but I do regret that once upon a time people were valued for their skills. Writing code, running an effective ward (in a hospital), being a blacksmith... and somewhere along the line we all accepted these total and utter losers to come along, call themselves fancy titles, tell us all what to do. They don't have a bloody clue. I bet you could randomly fire at least half the management from any company and when the shock settles down, you'd realise that they weren't actually that necessary and maybe just maybe the workers would be more effective by not having people that don't understand their job telling them how to do their job, and of course (especially in high concentration jobs like programming) not having to stop everything mid morning for yet another stupid fucking "update" meeting where everybody is brought together to say where they are different to yesterday, or the day before that. It doesn't help the programmers, it doesn't help the other workers, it's only done in order to make the management feel like they are important.
They are the reason we're all screwed. People worship the guy with the big desk and the company car.
And the people that can actually do the work? Diminishing, mostly due to the actions of the aforementioned parasites.
I was in the role of A/P, and came to be the only one that understood a critical part of a large system. The large system is being "mainframe offloaded to cloud", so now I'm a cloud architect setting the requirements for a bullshit-bingo outsourced vendor to go and rewrite the system in a Java framework on a cloud vendor for the lowest possible cost to a fixed timescale. Oh, and exactly matching function...
What can possibly go wrong? What could possibly be a more depressing job vista?
> I bet you could randomly fire at least half the management from any company and when the shock settles down, you'd realise that they weren't actually that necessary...
At one former employer, it was nauseating to see just how many people in the company org chart were directors---directors with nobody reporting to them. When a big chunk of the IT team were laid off during an ill-considered outsourcing arrangement (it got several higher-ups, including the CIO, a personalized escort out the door by security), the jobs of those directors-with-no-reports were untouched. I'm aware of a couple of them that, ten years later, are still there---directing nobody in particular.
(It's worse than you think, because the act of multiplication in a processor involves shifting. Depending on the processor, compiler and circumstances you might even get shifting substituted automatically by the compiler's optimizer. But having a dictatorial know-nothing in charge ... just leave ... not worth the fight, since failure is the only option.)
> Why? Official secrets was up, she helped bust the Enigma and had been waiting a lifetime to talk about it. And she did. Utterly fascinating how it worked and the methods used to reverse engineer something just by looking at the encoded messages.
Please tell me you were able to record those conversations?
Right. Too often the design is not what people want to buy, it's what salesmen want to sell.
Case in point: all the cruft on a "multi-function printer" aka a copier. At least they kept the Big Green Button and the Little Red Button.
Recent example: a medical steriliser with a GUI. User configurable parameters: zero. Lots of cruft for basically just opening the door, putting in the item, closing the door, and pressing "Go".
Java, a smattering of C/C++, Basic (hah) and Z80 assembly, also Heidenhain TNC, and Fanuc ISO code and Mastercam (spit).
In my job, you have to do it right; there's no second chance with "just run the code and hope", that way lies much insanity (and a shedload of very expensive machinery lunching itself). We have to test, and test again, and then make sure we've even specified what tool holding to use (it gets fun fitting a 40 mm dia tool down a 30 mm hole at 10,000 rpm... and yes, we had someone do that).
But that's the correct way to program/design our work; there's no short cut, no publish-and-hope, and it always boils down to how much the company is willing to spend on skilled staff...
I'm not a coder, developer or software engineer - probably just naive!
But is it down to the internet and the idea of "ship it, and send out the fixes later", whereas in the olden days it was more difficult to send out patches via floppies, and firmware was on ROM, not EEPROM, so it had to work first time?
There is some truth in it. After all Microsoft is a prime example.
There should be no issue with taking on trainees or junior devs, but the problem sometimes is that they are dumped in at the deep end instead of working with an experienced developer and learning all the ins and outs of the system and the development software. You CAN train people well; you just have to commit the time and resources to it. A five-day course in C# is no good to anyone on its own.
To be fair, though, the average modern operating system won't fit in a 16K ROM, or a hundred pages of fanfold paper.
It's just a shame that modern fast connectivity has so quickly been translated into a "push it out the door and fix the problems later" method of software development.
Way back when I was a programmer/analyst we were developing financial software and it *had* to be accurate.
These days the investment statements that my wife gets from her bank don't add up. For example the statement will show Total Dividends $27.52 and the first dividend (of many) shown is $45.60. I called the bank last Spring to complain about it and they gave some rambling bullshit answer. They obviously didn't have a clue what they were talking about and the Total Divs line disappeared the following month.
Well guess what just reappeared on the latest statements? Yup, Total Dividends line is back and it still doesn't add up.
The bank involved is CIBC (in Canada).
Hmmm..... just had a slightly evil idea.... At tax time I'll report her dividend income according to her bank statements. When Revenue Canada sends a nasty letter about unreported income I'll play dumb and send them copies of the bank statements.
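Reconciling a statement total is a one-liner, which makes the bank's failure all the more striking. A sketch with hypothetical line items (the real statement's figures beyond the two quoted aren't known):

```python
# Hypothetical dividend line items; only the first ($45.60) and the
# printed total ($27.52) come from the comment above.
dividends = [45.60, 12.30, 8.75]
stated_total = 27.52                      # total as printed on the statement

computed_total = round(sum(dividends), 2)
assert computed_total == 66.65
assert computed_total != stated_total     # the statement doesn't add up
```

Since the first line item alone already exceeds the printed total, no choice of remaining items could make the statement consistent.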
I would like to thank El Reg and all the supporting commentards replying on this thread for the generous mountain of valuable information and priceless intelligence so freely shared. To know that there be others out there not so stuck in rut and dumb in a world as to be virtually impotent and practically useless must surely be cause for great celebration that hope springs eternal and will always deliver the goods as be needed rather than as may have been planned.
And have you noticed how, whenever some really interesting and engaging novel tales appear here on El Reg, there can be solutions discussed in conversations that are able to shake and stir up markets to their very core.
Commentards of a certain vintage will remember the Mad comic of another age, which had a certain crazy disruptive and subversive edge to it, and which was not necessarily to everyone's taste, and so be it here at times I am pleased to say, albeit it being considerably more sophisticated nowadays and much more likely then to be imagined as a stealthy Phrack on steroids and ACID and talking a lot of unavoidable, temporarily inconvenient sense to systems in administration and systems that should really suffer being put into administration given the madness and mayhem, chaos and conflict they cause.
However, as is always the case in such matters of progress, YMMV.
... is that the kids who graduated Uni/College and got into the corporate computer and networking world back when computers started becoming ubiquitous on desktops all over the corporate world are now roughly in their mid 50s.
Note this is managers, users, coders, programmers, systems folks, everyone.
They started commercial computer work with Windows 2.x and DOS 4.0 (or thereabouts), and have become conditioned to the Redmond Way ... In their minds (and the generations following) it's supposed to be shoddy code, it's supposed to not be secure, it's supposed to break at the least convenient time, it will crash at random, updates will make things worse, over time it gets bigger and worse, if you turn it off and back on again it might fix it (maybe; try it again) ... these are all enshrined in the corporate attitude.
So why bother building clean, elegant code that just works when the underlying OS doesn't support such a concept? There is no point.
Those of us who started coding in the 60s or earlier are just left shaking our heads. Can you imagine what the reaction in Corporate America would have been if DEC or Burroughs or Sperry or IBM had made just one release that was as buggy as the code that is run as a matter of course on modern computers? Or worse, the drek in "the cloud"? The company's stock would have tanked, they would never have been trusted again, heads would have rolled ... ugly wouldn't even begin to describe it.
But these days? Navigating through crap, buggy, crash-prone bullshit has become business as usual. Because THAT'S HOW COMPUTERS ARE SUPPOSED TO WORK! Ask any manager. Or coder under 50. (Thankfully there are still a few real programmers out there in each generation.)
I have no answers. I'm not sure there are any.
"Can you imagine what the reaction in Corporate America would have been if DEC or Burroughs or Sperry or IBM had made just one release that was as buggy as the code that is run as a matter of course on modern computers?"
I don't need to imagine although the country was UK and the offender ICL.
The release was a package of semi-compiled modules (think library modules) for FORTRAN that wouldn't link. In that case it lasted about half a day - just long enough to be detected given the turn-round times of punched card jobs. How long would it take to do the equivalent roll-back today?
If I remember correctly (and I'm certain someone will jump in and let me know if my memory is on the fritz), that was a compiler flag issue. Even today that would be an easy fix. Or should be, anyway. See Slackware-current's change log, search for "revert" for one way of handling that kind of thing that works well.
To fix what I was talking about would be a complete re-write, from scratch, of the core OS used by almost all businesses today. And then a generation (or three) to un-learn bad habits and reflexes.
Microsoft almost had it with Win2K ... then Marketing took over completely.
Consultants have always sucked. Back in '82 I had to clean up after two geniuses who couldn't even code a FIFO. Actually, at the time, neither could I, but I learned really fast, unlike them. Fortunately the office had a real genius on staff who introduced me to the world of automata. The rest of the staff were a zoo, including the manager and his conspicuously busty assistant.
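For anyone wondering what those two geniuses couldn't manage: a FIFO is a few lines of C. Here's a minimal fixed-size ring-buffer sketch (the names and the capacity are mine, purely illustrative; the power-of-two mask trick is one common way to do it, not the only way):

```c
#include <stdbool.h>

#define FIFO_CAP 8u  /* illustrative; must be a power of two for the mask trick */

typedef struct {
    int buf[FIFO_CAP];
    unsigned head;   /* free-running count of writes */
    unsigned tail;   /* free-running count of reads  */
} fifo_t;

/* Returns false when the queue is full. */
static bool fifo_push(fifo_t *f, int v) {
    if (f->head - f->tail == FIFO_CAP) return false;
    f->buf[f->head++ & (FIFO_CAP - 1)] = v;
    return true;
}

/* Returns false when the queue is empty. */
static bool fifo_pop(fifo_t *f, int *out) {
    if (f->head == f->tail) return false;
    *out = f->buf[f->tail++ & (FIFO_CAP - 1)];
    return true;
}
```

Unsigned wraparound keeps `head - tail` correct even after the counters overflow, which is why the capacity has to divide a power of two.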
That sums it up nicely.
The Boeing 737 Max debacle illustrates that nicely. Even there, where Boeing had outsourced the software job to the cheapest possible contractor, the devs they ended up with still questioned, "Er, is this really a good idea?". Reputedly the contracting company made sure to ask this by email, and made especially sure to preserve Boeing's reply in the affirmative.
Possibly the most valuable email chain ever.
The software that was written was seemingly written exactly to spec. The cost savings were Boeing's; the results came from a corporate structure that made it impossible for engineering shortfalls to become apparent inside Boeing.
The MBAs had a successful divide-and-conquer approach to controlling engineers, and it cost hundreds of lives and billions of shareholders' money, and Boeing has just used another $2bn of shareholders' money to make sure none of them end up in court answering awkward questions.
"I wouldn't trust them to program my video recorder."
This is interesting on multiple levels:
1. It shows that they see there is a problem, and the scale of it.
2. They don't realise that video recorders haven't existed for over a decade.
3. There is nothing happening to prevent it happening again tomorrow.
I look back at 1MHz CPUs and the amazing things that were done on them by clever coding - BBC Micro Elite is a good example - or by people who understood bit shifting rather than complex multiplications. Then I look at modern coding that requires massive containers with a bunch of frameworks and GB of RAM to do simple things, because the developers don't understand anything underneath the framework they are working in, but "it's how things are done now". I bet that 95% of those developers can't explain what that actually means or why, even at one level down in the platform stack. How many could interpret a stack dump or patch the binary directly?
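To make the "bit shifting rather than complex multiplications" point concrete: 8-bit CPUs of that era had no multiply instruction, so a multiply by a constant was decomposed into shifts and adds. A sketch in C of the classic times-ten case (the function name is mine):

```c
#include <stdint.h>

/* x * 10 == x*8 + x*2 == (x << 3) + (x << 1):
 * two shifts and an add, in place of the multiply
 * instruction a 6502-class CPU simply didn't have. */
static uint16_t mul10(uint16_t x) {
    return (uint16_t)((x << 3) + (x << 1));
}
```

A compiler does this for you today, which is rather the point: the knowledge moved down the stack and out of most developers' heads.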
There should be a qualification that shows that people understand the hardware and software stack and how to best use it to get the best outcome - even in the days of cloud.
I wonder what would happen if modern "developers" were put on 1970's / 1980's (or even 1990's) systems and told to write a stock control application (or a spaceship flight control program). You only have a 1MHz CPU, 128K of RAM and a 20MB hard disk. Users are connected via RS232 terminals of a variety of types and in a variety of locations, and you can't claim FrAgile development methodologies and you can't access the Internet to research things. The spaceship spec would be far lower.
Who were the real programmers, and who were the apes claiming to have typed the works of Shakespeare?
"I wonder what would happen if modern "developers" were put on 1970's / 1980's (or even 1990's) systems and told to write a stock control application (or a spaceship flight control program). You only have a 1MHz CPU, 128K of RAM and a 20MB hard disk. Users are connected via RS232 terminals of a variety of types and in a variety of locations, and you can't claim FrAgile development methodologies and you can't access the Internet to research things. The spaceship spec would be far lower."
Seems a pretty easy challenge. I'll do the low-level RS232 comms section first, since that's what I'm good at, heh.
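For what it's worth, the POSIX descendant of that low-level RS232 setup is only a handful of termios calls. A sketch, assuming an already-open serial fd and the common 9600 8N1 raw-mode settings (the function name and the chosen settings are mine):

```c
#include <termios.h>

/* Configure an already-open serial fd for raw 9600 8N1.
 * Returns 0 on success, -1 on error. */
static int serial_setup(int fd) {
    struct termios t;
    if (tcgetattr(fd, &t) < 0) return -1;
    cfmakeraw(&t);                 /* no echo, no line editing, no CR/LF translation */
    cfsetispeed(&t, B9600);
    cfsetospeed(&t, B9600);
    t.c_cflag &= ~(tcflag_t)(CSIZE | PARENB);   /* 8 data bits, no parity */
    t.c_cflag |= CS8 | CLOCAL | CREAD;          /* ignore modem lines, enable rx */
    return tcsetattr(fd, TCSANOW, &t);
}
```

The 1970s version would have been poking UART registers directly, of course, but the parameters being set - baud rate, word length, parity - are exactly the same.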
Though I consider your idea a valid one, I think it would be better if the application under development were motion control of large bits of machinery, or critical flight or nuclear plant control stuff.
Write bad code here and you will kill people - that would teach 'coders' to do stuff right - and then make manglement personally liable for any bad code that gets through QC as well.
Might make people sit up and pay attention, but then it all falls back to cost: bid to build a website, sub it out to the cheapest coders you can find, pocket the money and move on.
Of course it isn't really cheaper, because it takes far longer and more coders to make a decent program, compared to one decent developer who wants paying more.
By the time the 'cheap' code is delivered late, buggy and not fit for use, the hiring boss has already hit his cost-saving targets and banked his bonus, and he's long gone before the wheels come off.
That's why Execs don't stick around more than 2-3 years.
That's why Execs don't stick around more than 2-3 years
Not just Execs. It's all the way down to the folks that have "Digital" in their title or department who start a new project with the latest flavour of the month language/framework and are gone before completion and before it goes over budget and past deadline.
“How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?” The Sign of Four
If manglement has been warbling about a skills shortage for the past three or more decades, then the skill they are most lacking in is management. Which means that manglement is getting paid under false pretences. Unfortunately they're judge, jury and executioner on this question, which is why they never get any closer to solving this "skills shortage".
Luckily - for them - "problem solving" isn't in their job description. Perhaps "problem causing" could be added to their official job description, in the interest of truth in advertising?
The cost of shoddy software isn't a US-only thing; it's a global thing. I should think that saying shite code 'costs the territory' should actually read 'costs the citizens of the territory', as the costs of the crap code, remedial updates (if any!) and the expansion of organisational processes to work around the bugs etc. are always passed downwards by the government or business commissioning the code. El Reg is littered with stories of crap/late/abandoned projects.
Also, an individual's 'code' is usually part of a larger system or component - and even if their bit of 'code' is perfect, if the other bits or the overall architecture are poorly dimensioned/designed the final thing will be shite.
All too often, people start to design stuff and start coding without an understanding of what they are actually trying to achieve. Taking time up front saves time and money later on - but taking time is not popular these days, and very few people seem to take pride in what they do.
"There is a shortage of good IT workers" - This is a half truth. The full truth is "There is a shortage of good IT workers at the pitiful wages we are offering". The result is that highly intelligent and creative people who would make good developers go into professions like financial engineering where their work is actually rewarded. Not a surprise, really.
- software treated as projects that "resources" are assigned to briefly, instead of having a team of people that build and maintain them (part/full time)
- software treated as projects where success is measured in dollars spent and deadlines met, rather than working software that can create a lot of value
- software architected by people who don't think in systems
As long as bean counters are our overlords, it will stay the same, or get worse. It doesn't matter to them if there are higher costs later. What does matter is right now. It's shitty, but it doesn't look expensive at first sight. That's all that matters.
There was a time, long ago, when companies were led by entrepreneurs, with a vision and a capacity to see beyond the end of their shoes. Now we have managers, whose only goal is to satisfy shareholders at the present time. They don't care if it means destroying the company in five years, they will have left with a comfortable pay check long before that.
They don't care if it means destroying the company in five years, they will have left with a comfortable pay check long before that.
Even sooner if they can engineer a buyout by a bigger, more stupid company, a la the infamous Autonomy sale.
I mention this because the company I work for, having been spun out of another company after a buyout and now owned by.....well, anyway, they've started going on a buyout binge instead of trying to address the real problems at the coalface.
The role of management in software quality is effectively to keep it down --- to push developers into getting features out of the door as fast as possible (or somewhat faster) rather than spending the time to get things right.
Agile really doesn't help either, for anything beyond adding fields to web forms. I left my previous employer primarily because, having been bought up by a larger company, they started insisting that all tasks must be estimated in two-day chunks, even the ones that can't be because they involve exploratory work and you only know that they're likely to be large and full of gotchas.
they started insisting that all tasks must be estimated in two-day chunks, even the ones that can't be because they involve exploratory work and you only know that they're likely to be large and full of gotchas
Not sure I've seen that bit of the Agile manifesto.
The chat network touted as an alternative to a certain bird-related one. ;-)
Parler apparently got owned through unprotected API endpoints, along with insufficient security internally to segregate what you should and should not see. 70 TB of user data (including 'deleted' media, posts, etc), all lifted (and no doubt handed to the FBI after that little incident at Capitol Hill).
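The reported failure mode - endpoints that served whatever was asked for, with deletion implemented as a flag rather than actual removal - is the classic insecure-direct-object-reference pattern. A toy sketch of the difference (all names and the owner-only rule are mine, purely illustrative):

```c
#include <stddef.h>
#include <string.h>
#include <stdbool.h>

typedef struct {
    const char *owner;
    bool deleted;     /* 'deleted' was reportedly just a flag; data stayed on disk */
} post_t;

/* Broken: serves any post to any caller, including 'deleted' ones -
 * the shape of hole the scrapers reportedly walked through. */
static const post_t *fetch_post_broken(const post_t *p) {
    return p;
}

/* Checked: refuse deleted posts and posts the caller isn't
 * authorised to see (toy rule here: owners only). */
static const post_t *fetch_post_checked(const post_t *p, const char *caller) {
    if (p->deleted) return NULL;
    if (strcmp(p->owner, caller) != 0) return NULL;
    return p;
}
```

The fix is boring per-request authorisation, which is exactly the kind of unglamorous work that gets cut when shipping fast is the only metric.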
The major excuse for many companies outsourcing and wanting to import workers is there aren't domestic workers for the jobs. The truth is there are plenty and could be plenty more willing to go into programming as a career if it paid what it used to. There is no incentive to take on a bunch of student loans to get a poorly paying job with limited opportunities for advancement. Especially if you are rather good at it.