I believe he means broadcast in aggregate.
With broadcast there is no bandwidth increase based on the number of customers, i.e. 1,000 customers have the same bandwidth requirement as 100,000 to watch a single show at the same time. The US ATSC HDTV spec has a peak of ~19 Mb/s (which you never actually hit, but that's the spec), so ignoring requirements for repeaters, etc. you have a constant peak requirement of ~19 Mb/s. With Netflix, etc. (anything that isn't using multicast to send the data), everybody gets their own stream at their own point in time; you aren't tapping into a constant feed. Minimum bandwidth for an HD Netflix stream is around 6 Mb/s (it can go higher; lower is generally SD).
So for 1,000 users watching the same HD show simultaneously:
Broadcast = ~19 Mb/s peak
Netflix = 6,000 Mb/s minimum (1,000 × 6 Mb/s)
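To make the scaling explicit, here's a minimal Python sketch of the same arithmetic, using the rough bitrates above (the names and figures are illustrative, not measured):

    # Rough model: broadcast bandwidth scales with the number of *shows*,
    # unicast bandwidth scales with the number of *viewers*.
    BROADCAST_MBPS = 19.0  # ~ATSC HDTV peak per channel
    UNICAST_MBPS = 6.0     # rough minimum for an HD Netflix stream

    def broadcast_bandwidth(num_shows: int) -> float:
        """Total Mb/s to broadcast num_shows channels, regardless of audience."""
        return num_shows * BROADCAST_MBPS

    def unicast_bandwidth(num_viewers: int) -> float:
        """Total Mb/s to send every viewer their own stream."""
        return num_viewers * UNICAST_MBPS

    print(broadcast_bandwidth(1))    # 19.0 Mb/s, same for 1,000 or 100,000 viewers
    print(unicast_bandwidth(1000))   # 6000.0 Mb/s, grows linearly with viewers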
To put it another way: Comcast has 16.3 million cable customers. If 50% of them are watching something at the same time, that's 8,150,000 × 6 Mb/s = 48,900,000 Mb/s, or ~46.6 terabit, of bandwidth.
Assuming the same 50% of Comcast customers are watching something, and let's say they are evenly spread across 200 different shows simultaneously: at the ~19 Mb/s peak per show, broadcast would require at most 200 × 19 Mb/s = 3,800 Mb/s, or ~0.0036 terabit, of bandwidth, no matter how many people are tuned into each show.
3,800 Mb / 48,900,000 Mb ≈ 0.008% == the fraction of bandwidth the ~19 Mb/s broadcast streams require compared to 6 Mb/s non-broadcast streams for the same number of customers.
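Same model at Comcast scale, as a self-contained snippet (the 50% and 200-show figures are the assumptions above):

    UNICAST_MBPS = 6.0     # per-viewer HD stream
    BROADCAST_MBPS = 19.0  # per-channel ATSC peak

    viewers = 16_300_000 // 2  # 50% of Comcast's 16.3M customers watching at once
    shows = 200                # viewers assumed evenly spread across 200 shows

    unicast = viewers * UNICAST_MBPS    # 48,900,000 Mb/s (~46.6 terabit)
    broadcast = shows * BROADCAST_MBPS  # 3,800 Mb/s (~0.0036 terabit)
    print(f"{broadcast / unicast:.3%}")  # ~0.008%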
I'd say that the current method of "on demand" HD streaming isn't sustainable right now for everybody (maybe that will change in a few years, but it won't be cheap), but if there is a switch to multicast or an equivalent, it could be.
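The IP-layer plumbing for that already exists, for what it's worth. Here's a minimal sketch of a receiver joining an IP multicast group with Python's standard socket module (the group address and port are arbitrary examples):

    import socket
    import struct

    GROUP, PORT = "239.1.1.1", 5004  # arbitrary example group/port

    # One sender transmits a single stream to the group address; any number
    # of receivers can join without adding load on the sender's uplink.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Tell the kernel (and routers, via IGMP) that we want this group's traffic.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        data, addr = sock.recvfrom(65536)
        print(f"received {len(data)} bytes from {addr}")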