Software innovation just isn't what it used to be, and Moxie Marlinspike blames Agile

There's a rot at the heart of modern software development that's destroying innovation, and infosec legend Moxie Marlinspike believes he knows exactly what's to blame: Agile development. Marlinspike opened the second day of Black Hat with a talk that was ostensibly supposed to be a fireside chat with Black Hat founder Jeff …

  1. Anonymous Coward

    Misleading headline?

    Sounds like he's blaming over-abstraction, and happened to mention Agile for some reason.

    I don't think Agile mandates anything about abstraction.

    1. JohnSheeran

      Re: Misleading headline?

      What we are seeing, I believe, is Agile done extremely poorly, without the benefit of all its components. In the corporate world, agile dev is a joke.

    2. Andrew Hodgkinson

      Re: Misleading headline?

      It's never Agile's fault, because you're always doing Agile wrong if you say anything bad about it.

      Given, then, the number of possible ways to do Agile wrong, the very fragility of its implementation is ultimately a glaring and fatal flaw.

      1. Anonymous Coward

        Re: Misleading headline?

        Agile = Dogma Agile = Dogma Agile = Dogma Agile = Dogma Agile = Dogma

      2. Michael Wojcik Silver badge

        Re: Misleading headline?

        Substitute "developing software" for "Agile" in your argument and you have a claim that is just as correct, and just as useless.

  2. may_i Silver badge

    He's right about black boxes and over-abstraction though

    I find a lot of the M$ C# development patterns totally impenetrable, web development in particular.

    How anything works is hidden behind so many levels of abstraction and hidden code that you can't see and don't know how it works that doing anything with this stuff amounts to little more than making magic incantations in a language you don't understand and hoping you don't summon a major demon.

    1. cornetman Silver badge

      Re: He's right about black boxes and over-abstraction though

      I find that *some* level of abstraction is good and getting all the related code together actually helps to understand what is going on under the hood. I like the basic ideas of object-oriented programming to organise and structure code.

      However in some code that I see, code is hidden behind layers and layers of filler and when trying to root out bugs, you often have a devil of a job finding wherever the real code is hiding. Java is a beast for this and it seems like layers of "interfaces" make code discovery very difficult. Case in point, if you ever try to use something like Eclipse to take you to a function definition, it nearly always takes you to a generic interface definition and not the real code. I can see the design idea here but it makes code navigation nearly impossible.

      1. Alan Mackenzie

        Re: He's right about black boxes and over-abstraction though

        > I like the basic ideas of object-oriented programming to organise and structure code....

        seems to lead inevitably to

        > you often have a devil of a job finding wherever the real code is hiding.

        In short, OO programming leads to difficult to debug software. As Martin Fowler put it in his book "Refactoring", which assumes OO is the standard way to program: you no longer pass arguments to the functions which need them. Instead, you keep data in _objects_, which have _methods_ for extracting data, sometimes recursively, so that you pass these objects to methods which somehow have ways of getting to the needed data. He admitted this openly, without any discussion of the difficulty in debugging this causes.

        Some OO languages (C#, I'm looking at you) prevent individual data items being passed by reference, thus forcing an entire objectful of data to be passed into a method rather than a pointer to the single item which is to receive a new value. This greatly hinders analysis of what data is used where, particularly when some data item gets a rogue value.

        I thought I liked the basic ideas of OO programming too, until I saw what they did to debuggability.

        1. doublelayer Silver badge

          Re: He's right about black boxes and over-abstraction though

          "Some OO languages (C#, I'm looking at you) prevent individual data items being passed by reference, thus forcing an entire objectful of data to be passed into a method rather than a pointer to the single item which is to receive a new value."

          There are several problems with this description. First, it's not C# or any language doing that. It's the writer of the object labeling a member private or protected or not exposing it directly. If it's your library, you can change it quickly without breaking compatibility. If it's not your library, it leads directly to the second problem, which is that this is done on purpose for a reason. It's done because it lets the writer of the library restrict their code to defined behavior. If they don't want you manipulating an internal value without going through the setter, it probably means their setter is doing something or it might in the future, and if it does your direct manipulation might break stuff.

          This is the same reason why, even though I can, I don't write to internal data of other parts of a program or dependency. Maybe it works now, but there's no guarantee that it will continue to work when something changes and if it doesn't, the failure is likely to be annoying to debug, unlikely to show up first in testing, and potentially damaging to the user's data or environment. It also makes debugging harder because, if you put something invalid there, the problem will not be detected when you do it. It will only show up when some later operation tries to work with it. Debugging will require tracing back from where it broke to where you inserted the invalid data.
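          A minimal C++ sketch of that reasoning (the class and names are invented for illustration; the same idea applies to C# properties or Java setters):

```cpp
#include <stdexcept>

// Hypothetical library class: the balance is private on purpose, so every
// change has to go through deposit(), which enforces the class's invariants.
class Account {
public:
    void deposit(long cents) {
        if (cents <= 0) throw std::invalid_argument("deposit must be positive");
        balance_ += cents;  // the setter can validate, log, notify observers...
    }
    long balance() const { return balance_; }
private:
    long balance_ = 0;      // no way for callers to poke an invalid value in
};
```

          A caller who could write `balance_` directly could plant a negative balance that only blows up much later, which is exactly the hard-to-trace failure described above.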

        2. cornetman Silver badge

          Re: He's right about black boxes and over-abstraction though

          > In short, OO programming leads to difficult to debug software.

          I guess that when I said "basic ideas" about object oriented programming, I meant that I like to aggregate related code together into class member functions to keep it logically together and increase discoverability. You can do this in C using function pointers and by simply grouping related functionality by source file.

          Where this really starts to come unstuck for me in terms of discoverability is polymorphism, a massively overused concept which is useful in some very limited cases. I have seen Java code, though, where the default approach is to make "everything" an interface even when it serves no useful purpose.
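          The "everything is an interface" pattern looks like this in C++ terms (hypothetical names); with only one implementation, the extra layer buys nothing except an extra hop for anyone navigating the code:

```cpp
#include <string>

// An interface despite there being exactly one implementation: a reader
// following a call to greet() lands here first, not at the real code.
struct GreeterInterface {
    virtual ~GreeterInterface() = default;
    virtual std::string greet(const std::string& name) const = 0;
};

// The only implementation there is, or likely ever will be.
struct GreeterImpl : GreeterInterface {
    std::string greet(const std::string& name) const override {
        return "Hello, " + name;
    }
};
```

          "Jump to definition" on `greet()` through a `GreeterInterface` reference takes you to the pure virtual declaration, which is the Eclipse behaviour complained about above.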

        3. Dan 55 Silver badge

          Re: He's right about black boxes and over-abstraction though

          In C++, an object is essentially a structure of member variables plus, for polymorphic classes, a hidden pointer to a table of the class's virtual functions, so if inspecting one is difficult, that's down to the debugger.
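          That layout can be observed directly (a sketch; exact sizes are implementation-defined, but on common ABIs the difference is one hidden pointer):

```cpp
struct Plain {            // no virtual functions: just the data members
    int a;
    int b;
};

struct Poly {             // same data, plus one virtual function
    int a;
    int b;
    virtual void f() {}   // forces the compiler to add a hidden vtable pointer
};
```

          On a typical 64-bit compiler `sizeof(Plain)` is 8 while `sizeof(Poly)` is 16: the extra bytes are the vtable pointer the debugger has to chase to find the code.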

          1. Alan Mackenzie

            Re: He's right about black boxes and over-abstraction though

            Difficulties are _not_ caused by debuggers. They're caused by fragmented source code - source code where things you need to look at, rather than being in one place, are separated into several, or lots of different places.

            Object oriented programming is very good at causing this fragmentation. OO programmers think they have done a good job by "hiding" abstractions. This aggravates the difficulty in debugging, because debugging necessitates boring into these abstractions and understanding their implementation in detail. If a class is implemented by inheriting from a super-class, possibly on several levels, that is severe fragmentation when it comes to debugging.

            You may say that the super-classes are bug free, due to them being properly designed and tested. I would reply that there's no such thing as a bug free piece of software, only ones whose bugs have not yet been found.

            There's a tendency nowadays to emphasise the ease of reading or writing source code. This is misplaced - the emphasis ought to be on ease of _debugging_ the source code. OO methods don't help, here.

            1. Dan 55 Silver badge

              Re: He's right about black boxes and over-abstraction though

              Ok, if you're comparing large monolithic functions vs smaller functions distributed across the codebase, I don't think OO makes this any better or worse, rather OO aligns with a certain coding style anyway.

              In my pure C programs I tend to separate the code into small specific tasks where possible, and in a way which means they can be reused where necessary. That way, once I'm pretty sure they work and know the values they return are correct, I can skip over them in the parent stack frame. Or I can call a function with test parameters to check that the values are what I expect them to be.

              I find that easier than debugging a large monolithic function and from that style it's just a small step to C++ anyway.

              Now if you're talking about Java programs and debugging them... that way lies madness.
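              The decomposition style described a couple of paragraphs up can be sketched like this (C++ here for consistency with the thread, though the same shape works in plain C; the function is a made-up example):

```cpp
// Small, specific, reusable task: clamp a value into [lo, hi].
// Once verified with test parameters, callers can treat it as a black box
// and step over it in the debugger.
int clamp_int(int v, int lo, int hi) {
    if (v < lo) return lo;
    if (v > hi) return hi;
    return v;
}
```

          Calling it with known inputs and checking the results is exactly the "test parameters" verification described above.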

        4. Madre O'Fender

          Re: He's right about black boxes and over-abstraction though

          I understand the problem, but I'd put the blame not on OO, but on OPC - Other People's Code. That's the main cause of undebuggability IMHO.

        5. F. Frederick Skitty Silver badge

          Re: He's right about black boxes and over-abstraction though

          > In short, OO programming leads to difficult to debug software.

          Then you're doing OO badly. Not that I can blame you, since a lot of the literature is heavy on the details of features like inheritance, but light on the best practices.

          My preferred programming style employs objects that are immutable, with the builder pattern and defensive copying of things like mutable data structures (lists, etc). Those data objects are "dumb" with little to no business logic. The business logic is all in service classes. Those data objects rarely use inheritance, and the service classes the same. The occasional abstract class if there is shared logic that can't be abstracted into a simple singleton with no state.

          Too much OO related literature - particularly that written by Stroustrup - stuffs data and the logic that operates on it into the same class. As it's often trying to teach you the minutiae of a particular language, it tends to encourage the idea that inheritance is essential and that "dumb" data objects are somehow bad practice. Java went through this painful stage about twenty years ago, with people like Rod Johnson advocating then discouraging objects with data and business logic combined.

          Inheritance is not something that most classes need, and it should be made clear that outside of frameworks or standard class libraries it's to be avoided where possible. Lowering the dependencies between classes is a good thing, lowering the cognitive load when figuring out what code is doing. An exception to this is interfaces or pure virtual classes that describe generalised behaviour in terms of a clear API.
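          The style described above (immutable "dumb" data object, builder, defensive copying of mutable members, no inheritance) might look like this in C++; the names are invented for illustration:

```cpp
#include <string>
#include <vector>

// Immutable "dumb" data object: all state is fixed at construction,
// no setters, no business logic.
class Order {
public:
    class Builder;

    const std::string& customer() const { return customer_; }
    std::vector<int> itemIds() const { return item_ids_; }  // defensive copy out

private:
    Order(std::string customer, std::vector<int> ids)
        : customer_(std::move(customer)), item_ids_(std::move(ids)) {}

    std::string customer_;
    std::vector<int> item_ids_;
};

// The builder accumulates state mutably, then hands over an immutable Order.
class Order::Builder {
public:
    Builder& customer(std::string c) { customer_ = std::move(c); return *this; }
    Builder& addItem(int id) { item_ids_.push_back(id); return *this; }
    Order build() const { return Order(customer_, item_ids_); }  // copies in

private:
    std::string customer_;
    std::vector<int> item_ids_;
};
```

          Because `itemIds()` returns a copy, a caller who mutates the returned vector cannot corrupt the Order behind its back.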

          1. Alan Mackenzie

            Re: He's right about black boxes and over-abstraction though

            > Then you're doing OO badly.

            That seems to be a "no true Scotsman" defence of OO. I was talking about the OO that I encounter in existing programs I need to debug and/or modify. I would agree with you, though, that OO can be helpful as long as its essential features aren't overused.

            > Those data objects are "dumb" with little to no business logic. The business logic is all in service classes. Those data objects rarely use inheritance, and the service classes the same.

            What I, as a C programmer, would term data and functions.

            > Inheritance is not something that most classes need, and it should be made clear that ... it's to be avoided where possible.

            Couldn't agree more. Just as the goto statement should be avoided where possible in C programs. Though for advanced users, there are legitimate uses both for goto and inheritance, as you point out in the bit I elided.

            I think we're in violent agreement regarding most of these things. It's worth pointing out that Paul Graham, the Lisp hacker, once wrote he had never once used the OO constructs in Common Lisp, despite these being full-featured. OO is a _choice_, not a necessity.

        6. LybsterRoy Silver badge

          Re: He's right about black boxes and over-abstraction though

          If you then add event-driven into that, you have my vision of hell for IT staff.

          1. Pseudonymous Howard

            Re: He's right about black boxes and over-abstraction though

            Somehow it sounds like that for some people here over-abstraction starts with anything more complex than "Hello world!".

    2. captain veg Silver badge

      Re: He's right about black boxes and over-abstraction though

      > How anything works is hidden behind so many levels of abstraction and hidden code that you can't see and don't know how it works that doing anything with this stuff amounts to little more than making magic incantations in a language you don't understand and hoping you don't summon a major demon.

      Yes.

      And what's worse is that periodically Microsoft changes the spellbook and airbrushes the old one as "legacy", meaning that all the impenetrable cut-and-paste potions you can find online no longer apply. And so almost all (barely) functioning applications are basically round-shaped code samples hammered into square-shaped business problems.

      -A.

    3. Anonymous Coward

      Re: He's right about black boxes and over-abstraction though

      You stated that some developers are behind so many layers that they "can't see". I believe you meant "can't C".

  3. pip25

    Tradeoffs

    You have an idea for a service, or found a market gap you want to fill with your own solution. Obviously you don't want to release some unstable crap, but time to market is still of critical importance. Unsurprisingly, in such cases you won't be reaching for some kind of bottom-up, close-to-bare-metal solution. Your software will be much slower than what it could be, but that's the price you pay for getting something done as quickly as possible.

    Not all projects are like this. But few of them lack deadlines altogether. Without the abstractions and "black boxes" that speed up development, these projects would not be better or worse, they would simply not exist.

    1. James Anderson Silver badge

      Re: Tradeoffs

      But developers are pushed to use frameworks which may make development a bit quicker but make debugging and tuning a nightmare.

      Who knows what really goes on inside ORM generated code?

      1. Rich 2 Silver badge

        Re: Tradeoffs

        “But developers are pushed to use frameworks…”

        I have written embedded stuff, multi-platform distributed PC apps, some pretty large and complicated web apps, etc etc, and I have never used a “framework”.

        People use frameworks because they are either lazy or because they need to go back to school. Coding OpenGL from scratch is a nightmare, but STILL doable and you will learn a lot.

        1. doublelayer Silver badge

          Re: Tradeoffs

          This probably depends on your definition of framework, but people use frameworks because they want something that is likely to work and they don't want to spend a long time implementing and maintaining it. We do it all the time with operating systems. I could write something without using Linux's system calls. I know how to do it. They taught me how to, and made me do several, when I studied operating systems. If I'm working on something embedded, I will. If I'm not, I won't.

          I won't because I want my program to run later on, so if the system call gets improved, I will get the benefit. I also don't want to waste time I could be spending getting new things to work redoing something that's been done before, probably better than I will do it, because they've got years of experience finding and fixing bugs in it.

          If I'm writing another platform, library, framework, or whatever else this is, then of course I'll reimplement it, because I'm trying to make something better than they have. If I'm trying to build something that needs such a platform, library, or framework, then I won't rewrite it unless theirs is not good enough.

          There are lots of good reasons not to use a dependency, but ascribing laziness to those who do is a ridiculously broad stereotype that does no good to anyone. Marlinspike's comments were aimed, as he says, at those who cannot implement such a thing, not those who don't. A large part of knowing how to do something properly is knowing if it makes sense to do that thing.

          1. Pascal Monett Silver badge

            Agreed

            Now tell me : since when has it made sense to download external, unverified code to a production server ?

            1. doublelayer Silver badge

              Re: Agreed

              Now tell me: when did I recommend doing so?

              You don't have to do that to use a framework. There are lots of ways to manage dependencies that don't clone a repo on invocation and just roll with it.

            2. Claptrap314 Silver badge

              Re: Agreed

              You talking about the OS? A printer driver? What?

              No one verifies all of the code running on a server, and no one has for decades.

              1. captain veg Silver badge

                Re: Agreed

                I haven't encountered anyone that downloads an OS off some random site on the internet. I suppose it's slightly more plausible for a printer driver. But JavaScript libraries* are commonly fetched off CDNs for various more-or-less bogus reasons. Fingers crossed, eh?

                -A.

                * I write my own library code. Try it.

                1. Anonymous Coward

                  Re: Agreed

                  "* I write my own library code. Try it."

                  Good man.

            3. Anonymous Coward

              Re: Agreed

              @Pascal: Fuckin' A right. But that sort of auto-inclusion/auto-download is the mode du jour.

          2. jotheberlock

            Re: Tradeoffs

            If you literally mean 'system call', no, you can't, not in a Linux userspace program. At the very least you'll have to call exit() ( 'mov rax, 60' then 'syscall' on x86-64 ) or execution will continue past the end of your program and you'll segfault. And if that's the only syscall you're doing your only interaction with the outside world is the exit code.

            You could in theory write your own kernel I suppose but there's a few more implications there than 'what if they improve the syscall'.
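            Assuming Linux and glibc's syscall() wrapper rather than hand-written assembly, the point can be sketched like this (`raw_exit_status` is a made-up helper; we fork so the parent survives to read the exit code):

```cpp
#include <sys/syscall.h>
#include <sys/wait.h>
#include <unistd.h>

// Make the raw exit system call ourselves instead of letting libc do it.
// Run it in a forked child so the calling process can inspect the result.
int raw_exit_status(int code) {
    pid_t pid = fork();
    if (pid == 0) {
        syscall(SYS_exit, code);  // never returns; the child ends here
    }
    int status = 0;
    waitpid(pid, &status, 0);
    return WEXITSTATUS(status);
}
```

            Without that final syscall (however it is spelled), execution would run off the end of the program, as described above.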

          3. martinusher Silver badge

            Re: Tradeoffs

            One small snag with overdoing frameworks is that everything starts to look and work in exactly the same way. "Fine", you say, "It simplifies the user experience, eases maintenance and so on". True, but when the interface is missing useful functionality (because it doesn't fit into the template, I suppose) and other, ostensibly quite different products exhibit the same deficiencies, then you've got to ask yourself what's going on.

            My guess is that the machine's designed to the capabilities of the interface rather than the interface reflecting the capabilities of the machine.

        2. captain veg Silver badge

          Re: Tradeoffs

          What bugs me is that often learning the "framework" is a bigger task than learning the underlying language. People do it for the benefit of their CV and never really understand what their code is actually doing.

          Showing my age here, it used to drive me crazy encountering so-called devs who believed that jQuery was somehow magically able to do stuff not achievable in plain JavaScript.

          -A.

          1. Anonymous Coward

            Re: Tradeoffs

            @captain veg

            Indeed. It is the same with many CMS and theme editors. In the end, you become an expert on the system and not on the task you initially set out to do.

            jQuery Mobile v1 in about 2009/2010 was the best framework I ever used. All went to hell after that. Helped write a JSP tag framework for an old beast of an XML monster called WURFL ... spit!

            Agile does have its uses but must not be controlled by project managers (the roadies of IT). Split control between dev/production and marketing. Project managers report to marketing and stay well away from Production.

            If anyone from California with a tummy, beard and low physical appeal brandishes an 'Agile for Dummies' book at you, kick him (it is never a woman) right in the goolies.

          2. Orv Silver badge

            Re: Tradeoffs

            I knew jQuery did the same stuff you could do in plain JavaScript because I learned it in plain JavaScript first. I still used jQuery, though, because a) it was faster to write, and b) it accounted for a lot of browser quirks that I would never have had time to find and write polyfills for myself.

        3. An_Old_Dog Silver badge

          Re: Tradeoffs

          "People use frameworks because ..." One reason programmers use frameworks is "stupid people stuff," e.g., management bought a pile of stuff and orders staff to use it, "to see a return on our investment."

          Fortunately I've never been in the "you will use this framework" boat, but I've been quite a few others. New HellDesk ticketing system. System Development Life Cycle[1] and a bunch of commercial software to implement it. ITIL[2]. New tape backup hardware/software that would not reliably locate/restore "backed up" files. I was pushing to have our source put under version control, but nope, that's too hard (management believed), and a waste of programmers' time.

          [1] SDLC founders on the rocks of a single problem: ignorant and/or insufficiently-imaginative people don't bring in all the stakeholders necessary to create a not-failed project. But hey, management sails on in their cloud of grand disconnection, looking at the pretty "dashboard" displays.

          [2] Fortunately, implementation of this ground to a halt and was quietly dust-binned after a month and a half. Maybe it's somehow useful, to somebody ... somewhere.

  4. JoeCool Silver badge

    Has "Agile" become synonymous with "Development management" ?

    I can agree on the issue of Silos, but there's no non-coincidental link to Agile.

    For the Nth time, the Agile Manifesto never stated "do stupid, self-limiting things"; those come from other causes.

    1. StillBill

      Re: Has "Agile" become synonymous with "Development management" ?

      Yes,

      A bunch of consultants have shown up at director level and above and spouted "you must be agile or be a dinosaur". So these people say "we must be agile", then go and find a bunch of poorly self-trained "scrum masters" amongst their internal ranks and say "make it so". You end up with people with a poor understanding of project management running the henhouse, and no good ever results.

      Agile and scrum (somehow these things are always linked in very strange ways) are just interesting ways of getting management off your back long enough (usually 2 weeks) so you can actually get some work done

    2. abend0c4 Silver badge

      Re: Has "Agile" become synonymous with "Development management" ?

      To be fair, the Agile Manifesto doesn't say much about anything, except in the vaguest of terms.

      It's as difficult to blame it for any specific failures as it is to credit it with any particular success.

      The biggest difficulty with it, in my view, is that it's the work of software developers, putting themselves front and centre. I think what Marlinspike might be getting at is that software developers are not necessarily the founts of wisdom they may believe and it can be difficult to challenge them when they're seen as the linchpin. Also, perhaps, the typical Agile approach is to start with coding and then refine it later: that doesn't really encourage thought and understanding of the problem before you start - or indeed later, provided the code appears to "work".

      1. Falmari Silver badge

        Re: Has "Agile" become synonymous with "Development management" ?

        @abend0c4 "The biggest difficulty with it, in my view, is that it's the work of software developers, putting themselves front and centre."

        Don't try and blame software developers for the 'Agile Manifesto', they're not the ones responsible for it. ;)

        It's obvious the manifesto is not the work of software developers. At 4 lines long you can't even call it a manifesto; anyway, the 4 points are so vague in real terms that they are just unquantifiable twaddle. Yes, there are only 4 lines; the other 5 lines (3 above, 2 below) are really only comments, so there is more comment than points.

        The 'Agile Manifesto' was written mainly by consultants, I don't think any of the 17 authors at the time could really be called software developers, certainly not as their primary employed role. Read https://agilemanifesto.org/history.html and see what you think.

        "On February 11-13, 2001, at The Lodge at Snowbird ski resort in the Wasatch mountains of Utah, seventeen people met to talk, ski, relax, and try to find common ground—and of course, to eat. What emerged was the Agile ‘Software Development’ Manifesto. Representatives from Extreme Programming, SCRUM, DSDM, Adaptive Software Development, Crystal, Feature-Driven Development, Pragmatic Programming, and others sympathetic to the need for an alternative to documentation driven, heavyweight software development processes convened..."

    3. Anonymous Coward

      Re: Has "Agile" become synonymous with "Development management" ?

      agile is for writing code... like a monkey or a plow-mule with blinders-on.

      Development is what agile calls pathfinding, and happens outside the agile-grinder.

      1. yoganmahew

        Re: Has "Agile" become synonymous with "Development management" ?

        In most enterprises, there is only the agile grinder. 26 2-week sprints in a year. Or 52 1-week ones. Or... anything that completely fills the year up with 100% productivity.

        1. Orv Silver badge

          Re: Has "Agile" become synonymous with "Development management" ?

          Yup. I see so many job postings now with "Agile" either in the requirements or outright in the job title. It's become a cult of its own. Reminds me of Six Sigma that way, where the rules became not just a way to improve processes but an end in themselves.

          1. spiketoo

            Re: Has "Agile" become synonymous with "Development management" ?

            Been a PM for 40+ years - one of those greybeards, tho I prefer "seasoned professional".

            I confess I'm not sure what Agile is. It always looked to me like Waterfall in iterations. Kinda reminds me of UML, where it was decided to use stick figures to explain requirements, to try and get around the language barrier of English. Agile at its intro had to be better because it was new and all.

            IMHO, the issue has always been the delivery vs the deliverable, but that's PM speak for you. The process of delivering has become so commoditized, with Jira and Confluence and now Dustin's Asana looking for a piece. Since I've basically been responsible for big iron in financial institutions, these apps remind me of RPG code (not role playing), where you literally had to fill in the boxes in a predetermined way - at least as best I can recall. That's what project delivery has become - we have to use these Atlassian apps because, alas, everybody else does.

            As always, YMMV.

  5. chololennon

    "Agile teams end up siloed"

    "...agile teams end up siloed, working separately from each other, and without much visibility into what other teams are doing", he argued.

    I tend to disagree with him... I spent 40 years developing software: what he describes I saw with waterfall methodologies, not with agile. Agile has its own problems, but this certainly isn't one of them.

    1. Brewster's Angle Grinder Silver badge

      Re: "Agile teams end up siloed"

      Which probably means all the corporate flaws Agile was pushing back against have now been transplanted to Agile, and that the underlying problem is that corporations tend to encourage programming in this way. Maybe a few used the opportunity to reassess the situation and created a better culture.

  6. JamesTGrant Bronze badge

    Before The Internet, it seemed that there was an understanding that producing functional software took quite a long time. I remember reading from books and ring binders, really learning and studying - down to the physical layer.

    Now I generally smash out ‘looks about right’ code from ChatGPT with tweaks. Close to full stack - if I think how long even today’s efforts would have taken back then I feel stressed just thinking about it!

    These days - ‘good enough’ is perfect. Get it working, fast - pass the tests, get it approved, move on. Feels like taking the foot off will result in failure. Maybe that’s just my insecurity.

    1. tsprad

      Your first sentence nailed it. When I first heard of the metric "lines of code per day" (in 1975) I was flabbergasted and appalled, and the numbers being used in those days were around 10 to 100. I think it was Dennis Ritchie who remarked that one of his most productive days was when he deleted about a thousand lines of code.

      No mere mortal will ever be able to comprehend millions of lines of code.

      1. Bill Gray

        > I think it was Dennis Ritchie who remarked that one of his most productive days was when he deleted about a thousand lines of code.

        This. Few things beat the joyful feeling of realizing that some mass of horrendous spaghetti code can be revised into a few succinct lines.

  7. Bluck Mutter

    I started my programming career in 1977 and retired in 2019.

    My comment is not specifically about agile but for old blokes like me, you could have a meaningful career knowing one language (COBOL,C) or maybe two (C then C++) or three (C then C++ then Java)... noting the serial rather than parallel nature of language acquirement.

    We did have interlopers like Pascal come along (and then recede), but the "issue" I see today is that languages/frameworks are breeding like rabbits, which fragments the gene pool and forces devs to have wide but not deep knowledge of many languages/frameworks.

    Check out a dev job requirements list today and you need to be an expert in 10, 15, 20 languages/frameworks/devops tools

    That to me, more than agile ***, is a major part of any issues today... this layering of dev stacks where the dev potentially only has a passing knowledge of many.

    I personally moved into many different app types/languages during my career but that was by choice and each self made move meant I could immerse myself in my new "calling" and become competent in the new chosen language/app space.

    I would hate to be a dev today, being forced to learn new "stuff" poorly to keep a job/get a new job/stay relevant.

    Bluck

    *** Although I like to use Fragile and Scum for Agile and Scrum.

    1. jimbo60

      Not retired yet, but a similar path (C, C++, Javascript, Python for scripting). But I've also needed to learn and use source control systems, build systems (Makefiles), IDEs, and so on, and would not want to work without them.

      Frameworks are another tool in the tool box. They have their place and uses when they are the right tool for the job, but they are not always the right tool.

  8. harrys Bronze badge

    Agile is a brilliant idea.... in a perfect, digital, ordered world; shame reality is analogue :)

    Still, it can be implemented well.... ironically, only when the head honcho has the power/ability to crack the whip when needed

    Even then... said person will eventually leave/move on.... then the whole thing starts crumbling away

  9. Anonymous Coward
    Anonymous Coward

    T-Cubed

    It is impenetrable because it's now a "Tottering Tower of Turds".

    Everything has to be internet-bluetooth-javascript-server-side-subscription etc etc etc.

    Much of it has very low actual dollar value - often the business case (dream) for it is actually to shift the workload off a company's books and onto the users' time, and not actually make anything better or more productive overall.

    Since it's not really worth doing, it has to be done fast and cheap.

    Since it is sold with a great list of buzzwords, it has to be done using a T^3 of other (more or less random) tools and services, that the developer has little understanding of, and no control over.

    It is simply guaranteed to be unstable, as it has too many moving parts which are held together with duct tape, spit, and hope, and no bolts.

  10. froggreatest

    Beautiful words, but

    I like the man and he inspires me. My life choices and the move towards improving the wand skills are partially influenced by him.

    However, I want to stress that nothing will change, because the majority does not care about the quality of software or its security when buying it. The incentives effectively reward building features on top of each other. Nobody needs an engineer/developer who understands how packets traverse networks, how memory works, or how cryptography aids in authentication. Everyone just needs a React dev who can put a button in their UI, or a dev who can add another API endpoint, or one who can render responses from OpenAI in the view.

    Microsoft started prioritising security internally and dealing with the technical debt after recent failures. But if there are no failures in six months, who says this practice will be maintained? The same applies to other companies. The future is bleak.

  11. cdegroot

    Nonsense.

    What has happened is that we went from a handful of artisanal craftsmen, who had to make their own tools, to millions of tool users and it has led to the extreme success of "software eating the world".

    Of course, if you buy a DeWalt cordless set at Home Depot you have a different relationship with that tool than if you carefully constructed a bow drill, having to take into account exactly the one purpose the drill is to be used for, learning about the material you need to drill into, the quality of the cutting materials at your disposal, etc. The DeWalt, you pop in a drill bit and plunge it into whatever, and now you have your hole and move on.

    Moxie then contradicts himself by noting that these artisans still exist and make useful contributions.

    I'd say, move on, nothing to report here. We are an industry now. Film at 11.

  12. nautica Silver badge
    Boffin

    "People who are more than casually interested in computers should have at least some idea of what the underlying hardware is like. Otherwise the programs they write will be pretty weird."

    --Donald Knuth

    1. tsprad

      Who could downvote Donald Knuth?

      Kids these days! Furrfu.

  13. JRStern Bronze badge

    This isn't about Agile at all

    It's about modern development but "Agile" doesn't cover the technology or architecture or toolset, and that's what these complaints seem to be about.

    Many of the points are valid but they aren't about Agile. It's "the stack" of open-source web oriented components.

    I've been at this since never mind, but I wasn't in a web-scale SOTA app until about 2016, and I was pretty much appalled at the environment. The stack of tools is accompanied now by just countless little widgets downloaded from wherever, just trying to list them on my resume takes three lines in nine-point type, LOL.

    And just the Microsoft Entity Framework is OMG. Back in the last millennium Microsoft missed the boat on mainstream OOP so now they're trying to make up for it with the most verbose, over-abstracted, ... well never mind, I'm sure most know it better than me. And the web pages were hand-crafted with a separate stack of a bazillion tools and hooks. Others may use more advanced tools, I hope, maybe I hit a bad place at a bad time, but look at the garbage on any major web page today, when it goes crazy on your home system.

    There's technological pain out there, but Agile is another topic entirely.

  14. Pete Sdev Bronze badge

    Legends...

    infosec legend Moxie Marlinspike

    Many, many moons ago I wrote an E2E, system-neutral encryption protocol for instant messaging. It was called the Whisper Protocol. I also wrote a PoC Jabber client (WhisperIM) in Java. The symbol and icon was a raven, being familiar with Norse mythology and living mostly in Sweden at the time.

    I got in touch with the OTR team offering to collaborate as they were working on the same problem at the time. The rather snooty reply was that I could implement OTR in Java for them. Incidentally, their protocol (v1) had a MitM weakness.

    Years later we had Whisper Systems. Great minds and all that...

  15. JLV Silver badge
    Facepalm

    What this brings to mind...

    Not bothering to learn the tech you are using underneath.

    (too) Much ADO about OOP.

    ORMs come to mind here, with how developers can't be bothered to learn SQL - because it is "so hard" - while spending countless hours figuring out how to express query edge cases in said ORM.

    Change your ORM? Learn it all over again, while SQL is, for the most part, quite portable.
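    To make that trade-off concrete, here is a minimal, self-contained Python sketch using only the stdlib `sqlite3` module. The `Query` class is a toy stand-in for an ORM-style query builder, not any real library: the plain SQL string would run largely unchanged on most databases, while the builder version is phrased in a vocabulary specific to whichever ORM you happen to use.

    ```python
    import sqlite3

    # The portable part: plain SQL, which travels between databases
    # with at most minor dialect tweaks.
    SQL = "SELECT name FROM users WHERE age > ? ORDER BY name"

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
    conn.executemany("INSERT INTO users VALUES (?, ?)",
                     [("alice", 42), ("bob", 17), ("carol", 35)])

    rows = [r[0] for r in conn.execute(SQL, (18,))]
    print(rows)  # ['alice', 'carol']

    # A toy ORM-style builder (purely illustrative): the same question
    # must now be re-expressed in this builder's own vocabulary, and a
    # different ORM would have a different vocabulary again.
    class Query:
        def __init__(self, table):
            self.table, self.filters, self.order = table, [], None

        def filter(self, clause, value):
            self.filters.append((clause, value))
            return self

        def order_by(self, col):
            self.order = col
            return self

        def all(self, conn):
            # Under the hood it just generates SQL anyway.
            sql = f"SELECT name FROM {self.table}"
            if self.filters:
                sql += " WHERE " + " AND ".join(c for c, _ in self.filters)
            if self.order:
                sql += f" ORDER BY {self.order}"
            params = tuple(v for _, v in self.filters)
            return [r[0] for r in conn.execute(sql, params)]

    rows2 = Query("users").filter("age > ?", 18).order_by("name").all(conn)
    assert rows2 == rows  # same result, ORM-specific spelling
    ```

    The point of the sketch: the knowledge invested in the SQL string transfers to the next project and the next database; the knowledge invested in the builder's method names does not.
    
    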

  16. Locomotion69 Bronze badge

    Agile requires commitment from all parties, and an active attitude regarding their role.

    The customer role is especially important for success - but a nightmare for every project manager, as it cannot be planned properly - PMs do not "rule" the customer.

    Doing Agile right is difficult - sometimes impossible. I believe that in software engineering today the terms "iterative" and "agile" should be distinguished. Most developments I see these days are "iterative" developments, not Agile (according to the Agile manifesto).

    Blaming Agile for project failure where actually an iterative approach was used proves the point.

  17. Anonymous Coward
    Anonymous Coward

    Blame........

    A very long time ago, a wise man (W. Edwards Deming, an expert on "process") commented that "Blame is a useless concept".

    He said this because he believed that once a problem had been solved, most folk would have forgotten about "blame".

    Now take a look at this article, and the comments...........most of the debate is about finding something (or someone) to blame.

    My question: Can anyone articulate (in a few sentences) EXACTLY what the problem is?

    1. doublelayer Silver badge

      Re: Blame........

      I'm inclined to agree with you in many areas, but I think your main point still gets it wrong. There is indeed a lot of complaining, including on things that are not worth complaining about. I've pushed back on several of them already. While I haven't talked much about Agile here, and I tend not to like it, it too seems mostly irrelevant to most of the things that we're discussing.

      However, this is not something where we can just articulate what the problem is, because the topic in which we'd try to find the central problem is "what is wrong with software in general, worldwide, and why". There are lots of problems in that area and almost none of them apply to everyone. When you have a question as vague as that, answers are less likely to work because most of them won't apply well to all software writing. Those that do will probably be worse, because the only way you can claim to comment on all programmers is to have ridiculously broad stereotypes (E.G. young people are terrible because not one of them learned anything other than JavaScript, old people are terrible because not one of them can write in anything other than VAX assembly). No simple statement of a problem, no matter how accurate, will make that the problem.

  18. Michael Wojcik Silver badge

    Non-expert pontificates about area he doesn't understand, news at 11

    Add Moxie Marlinspike to the list of people who don't understand Agile but talk shit about it nonetheless.

    Of course, Marlinspike isn't magically an authority on everything. He's had some good ideas (the Cryptography Doom Principle, the Whisper protocol) and done a bunch of good work in security in general and cryptography in particular. I respect him in the areas where he has expertise.

    He's also had some terrible ones, like adding cryptocurrency functionality to Signal. Add this one to the latter list.

    Abstraction has been a fundamental and necessary aspect of software development since the very beginning. AUTOCODER (and thence COBOL) and FORTRAN were created precisely to implement an abstraction level. Software was being developed using abstractions before computers existed — see Rojas, "The First Computer Program" (CACM 67.6), for a discussion of the abstractions Babbage used when developing programs for the never-completed Analytical Engine. It's surpassingly historically naive to suggest abstraction in software development is in any way related to Agile.

    And no one, no one, develops software without relying on abstractions. It would be cognitively impossible. Pretty much no one even understands the whole stack. There are many examples in places like the LKML or Dan Luu's blog of how the vast majority of software developers don't understand low-level details; conversely, the developers who do know about, say, cache coherence generally have very little experience with the requirements of designing public APIs or usability research.

    1. Anonymous Coward
      Anonymous Coward

      Re: Non-expert pontificates about area he doesn't understand, news at 11

      @Michael_Wojcik

      Quote: "...people who don't understand Agile..."

      OK.....you think that Moxie Marlinspike does not understand "agile".

      OK....."abstraction" has been with us for a while.

      The rant tells us about what YOU know.....

      ......but please give us some idea of your problem with Moxie, or with abstraction, or with agile........

  19. mili

    Agile is not a mantra it is a tool

    Those who do not understand the difference between a mantra and a tool will never understand why their agile development did not work out.
