Fun with iteration
I've had similar experiences a few times with poorly optimised websites.
Once in another lifetime, I was working as a liaison for a customer who was using our system across dozens of countries. Performance was always a major complaint, but the dev team would just shrug and maybe fiddle with some of the data queries.
Eventually, I fired up my web browser and took a look at the network traffic, only to find that some of the pages were tens of megabytes, most of it whitespace, all being sent uncompressed over relatively low-bandwidth lines. This was over a decade ago, and office broadband speeds were generally limited.
Turned out that the pages mostly contained lists. Said lists could contain hundreds or thousands of items, and were generated via templates. A single instance of a template with (say) 200 characters of whitespace wasn't an issue. Multiply that by several thousand items and sooner or later, you're talking about real data...
This was duly flagged to the dev teams, who cleaned up the templates and enabled mod_gzip on the servers; between the two, page sizes were generally reduced by over 95%. And practically overnight, complaints about performance dropped!
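The arithmetic is easy to sanity-check. Here's a rough sketch (the 200 characters of whitespace per item and the item count are illustrative figures, not the actual templates) showing how per-item template whitespace balloons a page, and how well gzip squashes that kind of repetitive content:

```python
import gzip

# Hypothetical template fragment: each list item drags along ~200
# characters of leftover indentation/whitespace from the template.
item_template = " " * 200 + "<li>Item {n}</li>\n"

# A page listing a few thousand items, as in the anecdote.
page = "<ul>\n" + "".join(item_template.format(n=i) for i in range(5000)) + "</ul>\n"

raw_size = len(page.encode("utf-8"))
gzipped_size = len(gzip.compress(page.encode("utf-8")))

print(f"raw:     {raw_size:,} bytes")   # over a megabyte, mostly spaces
print(f"gzipped: {gzipped_size:,} bytes")
print(f"ratio:   {gzipped_size / raw_size:.1%}")
```

Running something like this shows the uncompressed page passing the megabyte mark while the gzipped version lands at a few percent of that, which is consistent with the >95% reduction we saw once the templates were trimmed and mod_gzip was switched on.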
I never got around to figuring out how much time/money my little discovery had saved, but when you have thousands of users across the globe, even a minute or two per user per day quickly adds up...