Re: I know my career hasn't been typical...
As for your second point, "generalising the schema" is de facto denormalization and logically ends with something like Redis.
Not necessarily.
Take, for instance, some process-control situation. You're going to take data out of the database, pass it to some machine, take the results of automated inspection, and then either pass the job on to the next machine or send it into some rework loop. You can hard-code all these rules. Alternatively, you can have tables which define the rules, so that your code becomes a rules engine. You want to modify the process? Change the rules. You want to reuse the whole thing in a different process? Set up a new database: same schema, different rules.
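To make that concrete, here's a minimal sketch of the idea in Python with SQLite. The table and column names are my own invention, not anything from a real system: each row says "at this step, given this inspection result, route the job there". The code never changes when the process does; only the rows do.

```python
import sqlite3

# Hypothetical routing table: (current step, inspection result) -> next step.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE routing_rules (
    step TEXT, result TEXT, next_step TEXT)""")
conn.executemany(
    "INSERT INTO routing_rules VALUES (?, ?, ?)",
    [("drill",  "pass", "polish"),
     ("drill",  "fail", "rework"),
     ("rework", "pass", "polish"),
     ("polish", "pass", "ship")])

def next_step(step, result):
    # The code is a generic rules engine; the process itself lives in the data.
    row = conn.execute(
        "SELECT next_step FROM routing_rules WHERE step = ? AND result = ?",
        (step, result)).fetchone()
    return row[0] if row else None

print(next_step("drill", "fail"))   # rework
```

Reusing the whole thing for a different process is then just a different set of rows in the same schema.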
Another instance: you have a block of data which describes the job. You never handle individual parts of it; you just bung it out to the machine that does the job. You could "normalise" it - and I've seen this done against my advice - but, in fact, you're breaking the normalisation rules, because it is, really, a single entity. The right way is to treat it as a blob: store it as a single entity, output it as a single entity. I watched the spec of the job change, and more and more tables were added, more and more code written to split the data up, store it and reassemble it.
The version I did had even more complex demands than changing the spec of a single product over time: it started with multiple products with different specs, to which many more were added over time. No problem - the database didn't need to change, as all it saw, and all it needed to see, was a single column to hold the blob in a simple table to keep track of it. Another line of business with a broadly similar requirement? No problem: a few tweaks of code to handle some different T&Cs and we're good to go. We were being agile, but I don't think we were necessarily Agile.
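The blob approach above can be sketched in a few lines. Again, the schema and names are hypothetical, and I've used JSON as the serialisation just for illustration: one simple table, one blob column, and the database never needs to know the internal structure of the job, so products with wildly different specs share the same schema.

```python
import json
import sqlite3

# Hypothetical single-table schema: the spec is an opaque blob to the database.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE jobs (
    job_id INTEGER PRIMARY KEY, product TEXT, spec BLOB)""")

def store_job(product, spec):
    # Serialise the whole job as one entity; no splitting across tables.
    cur = conn.execute("INSERT INTO jobs (product, spec) VALUES (?, ?)",
                       (product, json.dumps(spec).encode()))
    return cur.lastrowid

def fetch_job(job_id):
    # Reassembly is just deserialisation - no joins, no schema change
    # when the spec of a product changes or a new product appears.
    product, blob = conn.execute(
        "SELECT product, spec FROM jobs WHERE job_id = ?",
        (job_id,)).fetchone()
    return product, json.loads(blob)

# Two products with completely different specs, same schema throughout.
a = store_job("widget", {"holes": 4, "finish": "matte"})
b = store_job("gadget", {"firmware": "1.2", "tests": ["boot", "rf"]})
```

Adding a new product, or changing a spec, touches the code that builds and consumes the blob, but never the database.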