There are some very good technical articles explaining why Flash traditionally sucked on OS X; mostly it came down to the browser, the plugin APIs, and the lack of hardware acceleration the plugin had to work with.
Potted summary - on Windows the Flash plugin usually runs in its own native window (though it can also run windowless in some cases), which means it can render whenever it likes without begging the browser for a repaint. It can also tell when it is hidden or visible and throttle repaints accordingly. On OS X all plugins are windowless and at the mercy of the browser for rendering, so if the plugin needs 30 repaints a second it has to ask the browser 30 times. Multiply that by 3 or 4 plugins across 3 or 4 tabs and it's easy to see how things bog down. On top of that, until the Core Animation drawing model arrived, the plugin couldn't get hardware acceleration for its compositing, so video performance really suffered.
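To make that repaint dance concrete, here is a rough sketch of what a Mac plugin has to do under NPAPI - not Adobe's actual code, just the bare calls from npapi.h as I understand them, with the plugin boilerplate simplified: negotiate a drawing model with the browser, and then ask the browser for repaints by invalidating a rect rather than painting on its own schedule.

    /* Minimal sketch of Mac NPAPI drawing-model negotiation.
       Assumes the usual plugin scaffolding (NP_Initialize etc.) exists. */
    #include <stdint.h>
    #include "npapi.h"
    #include "npfunctions.h"

    extern NPNetscapeFuncs *browser;   /* browser function table saved in NP_Initialize */

    NPError NPP_New(NPMIMEType type, NPP instance, uint16_t mode,
                    int16_t argc, char *argn[], char *argv[],
                    NPSavedData *saved)
    {
        /* Ask whether the browser supports the Core Animation model.
           If so the plugin can hand back a CALayer and get GPU-composited
           output; otherwise it falls back to CoreGraphics and must draw
           into whatever context the browser passes in draw events. */
        NPBool supportsCA = 0;
        browser->getvalue(instance, NPNVsupportsCoreAnimationBool, &supportsCA);

        NPDrawingModel model = supportsCA ? NPDrawingModelCoreAnimation
                                          : NPDrawingModelCoreGraphics;
        browser->setvalue(instance, NPPVpluginDrawingModel,
                          (void *)(uintptr_t)model);

        /* Mac plugins don't own a native window, so they also negotiate
           the Cocoa event model and receive events from the browser. */
        NPBool supportsCocoa = 0;
        browser->getvalue(instance, NPNVsupportsCocoaBool, &supportsCocoa);
        if (supportsCocoa)
            browser->setvalue(instance, NPPVpluginEventModel,
                              (void *)(uintptr_t)NPEventModelCocoa);

        return NPERR_NO_ERROR;
    }

    /* In the CoreGraphics model the plugin can't just repaint itself;
       it invalidates a rectangle and waits for the browser to send a
       draw event. Animating at 30fps means doing this 30 times a second. */
    static void RequestRepaint(NPP instance, uint16_t width, uint16_t height)
    {
        NPRect dirty = { 0, 0, height, width };   /* top, left, bottom, right */
        browser->invalidaterect(instance, &dirty);
    }

The point of the sketch is the last function: the windowless plugin never gets to blit on its own, it can only nag the browser, which is exactly where the overhead piles up.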
So yes, Flash sucked, and if we were to apportion blame then Apple would take its fair share. Note that I said traditionally above: most of these issues have since been resolved by Adobe engineers working in conjunction with Apple engineers, which is why Flash performance has improved so dramatically.
All of which is by the by. Flash has been shown to work more than adequately on a range of phone operating systems both before and after iOS. The real reason it was kept off iOS was protectionism: if Flash (specifically AIR) ran on iOS, it would let Adobe hive off its own OS-independent ecosystem. Users would visit Adobe's app store to run an app, and it wouldn't matter if they switched to Android because the same runtime would be there too.
Google seems to have taken the more pragmatic view that most people use whatever their phone supplies out of the box (the power of the default), so there is little point in actively preventing the small percentage of users who want to install some other runtime or browser from doing so. That just antagonizes people, and probably scares off more prospective users than it gains by locking them in.