"the database in question had nearly 100 tables, none of which were linked"
I've dealt with a product of that scale, though it had a decent schema and worked well. The only issue was that, running on Informix, it had been written in the pre-cost-based-optimiser days and the vendor wished to stay there, so they forbade customers from running UPDATE STATISTICS. The SQL was written to hint the query path to the engine. I recognised the technique because I also go back to the pre-cost-based-optimiser days. Step outside the limits of what could be done with that approach and the performance was a disaster.
When I arrived there were a few home-written reports that they couldn't run: if allowed to complete they'd have taken well over 24 hours. In my second week, or maybe even late in my first, I took a look at them and found that the generated query plan took the worst possible route, index-searching almost the entirety of the small tables and joining the large tables to them with sequential scans. Temp tables to the rescue.
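For anyone who hasn't used the trick: the idea is to materialise the selective part of the query into a temp table first, so the engine has no choice but to start from the small intermediate result instead of picking a bad join order on its own. A minimal sketch of the pattern, using SQLite from Python with a made-up status/orders schema (table and column names are purely illustrative, not from the actual product):

```python
import sqlite3

# Hypothetical schema: a small lookup table and a larger fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE status (id INTEGER PRIMARY KEY, label TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "status_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO status VALUES (?, ?)",
                [(1, "open"), (2, "closed")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, 1 + i % 2, float(i)) for i in range(1, 1001)])

# Single query: the optimiser picks the join order itself.
direct = cur.execute(
    "SELECT COUNT(*) FROM orders o "
    "JOIN status s ON o.status_id = s.id "
    "WHERE s.label = 'open'").fetchone()[0]

# Temp-table rewrite: materialise the selective rows first, then
# join against that small intermediate result. Same answer, but
# the first step's output is now fixed before the big join runs.
cur.execute("CREATE TEMP TABLE open_status AS "
            "SELECT id FROM status WHERE label = 'open'")
via_temp = cur.execute(
    "SELECT COUNT(*) FROM orders o "
    "JOIN open_status t ON o.status_id = t.id").fetchone()[0]

print(direct, via_temp)  # both counts agree
```

SQLite's planner is modern enough that you won't see the pathology here; the point is only the shape of the rewrite, breaking one query the planner mishandles into stages whose intermediate sizes you control.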
Eventually the vendor decided that, as the cost-based optimiser had been around for over a decade and probably worked, they'd allow it.