In the early days, GDS were instrumental in pushing government departments to adopt new ways of delivering software. There was real potential to banish the endless cycle of EDS-style failures that were constantly reported in the computer press.
However, after the Exemplar programme, GDS were largely de-fanged. While most switched-on government departments recognise the value of modern development methods, some are still held back by legacy PRINCE2 internal constraints and misunderstandings about what agile development (for example) actually means. For gov.uk services, the GDS assessment mechanism still works, because it pushes suppliers to develop software in the right way, focussing on end-user needs rather than a department's domain-knowledge-driven approach (I remember once stumbling across a Scrum board where every single ticket started with, "As a Product Owner, I want the user to...").
As for GDS's internal capability, I can't assess that directly, although Verify was an unmitigated failure, largely because it focused on the technology rather than the user. As stewards of the development process, GDS are hard to fault, but perhaps they lack the internal talent to develop services according to their own guidelines. The fact is, government doesn't pay much, so the best people work for consultancies.
GDS have been, and continue to be, advocates of good working practices. They just need the right person in charge and the authority to carry out their mandate. At the moment they have neither.