I resemble that remark
Don't think I ever used a single-kHz CPU. Did start out on an IBM 1401 at just under 90 kHz (with 16 kB of memory, a term that had only just been invented by 1965). We only had one terminal, but there was what would now be called a word-processing system available that could handle something like a dozen concurrent users, keeping most of each document on disk (1 or 2 MB). An early (first?) compiler-compiler was written on/for a 1401, as was a FORTRAN compiler.
Besides grumping, I wrote this comment to mention that COBOL was the first language I ever saw (and still one of very few) that made a clear distinction between external and internal representations of data. The compiler took care of what in more modern languages is all too often left to complex, ad-hoc, buggy marshalling code, or, more likely, to vomiting C-like structs (Pascal records) onto a disk or wire and expecting everything to keep working when CPU or OS changes invalidate the "well, it works on my system" assumptions.
It's not COBOL (or 360 assembly) that is the missing skill. It is knowing what resource and operational constraints are, how to plan for updates (and recovery), and how to badger upper manglement into giving a damn about their commitments to customers.
That said, a friend accidentally let her employer know she had done some 360 assembly back in the day and found herself maintaining a library whose documentation should have been shelved next to the Necronomicon.