US customers won't be able to use this for 20+ years
As some have said, 25 to 50 Mbps+ for one channel. Given the ridiculous state of internet speeds in the US and the astronomical prices we pay for a fraction of the bandwidth most other countries get, we're a looong way off from having the pipes to handle even one channel like this. Japan, South Korea, most of Europe, Australia, and probably plenty of developing nations will be able to handle these speeds far sooner than the US, given everything the cable companies do to prevent any sort of decent internet speed. Between the MPAA (and others) and the piracy fight, US internet is in such shambles that each person would practically have to pay for their own fiber run to the house.

I'm betting on true 4G, or possibly even 5G, surpassing landline speeds in a few years, though that's not much better: with the unbelievable 2 GB per month caps all the carriers are putting on, it's pointless to even get a 4G or 5G device. Who wants to run out of data after an hour of HD streaming? I don't understand how the country that invented a lot of this stuff is by far one of the most backward at letting its customers actually enjoy it. I pay $200 a month (USD) for a 20 Mbps connection, and the 4G on my phone is faster most of the time (I'm thankfully grandfathered into the last "unlimited" plan, until Verizon decides to go back on its word and kill that, supposedly sometime soon). A friend of mine in Japan pays the equivalent of $22 a month for 100 Mbps to the house with unlimited data, and I think I read South Korea is even cheaper.
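To put that 2 GB cap in perspective, here's a rough back-of-the-envelope sketch. The bitrates are my assumptions (about 5 Mbps for a typical HD stream, 25 and 50 Mbps for the low and high end quoted for one of these channels), not anything official:

```python
# Rough estimate of how long a monthly data cap lasts at a given video bitrate.
# Bitrate figures are assumptions: ~5 Mbps for typical HD streaming,
# 25 and 50 Mbps for the low/high end quoted for one of these channels.

CAP_GB = 2  # monthly cap in gigabytes

def minutes_until_cap(bitrate_mbps: float, cap_gb: float = CAP_GB) -> float:
    """Minutes of continuous streaming before the cap is exhausted."""
    cap_megabits = cap_gb * 8 * 1000  # GB -> megabits (decimal units, the way carriers count)
    return cap_megabits / bitrate_mbps / 60

for label, mbps in [("HD stream (~5 Mbps)", 5), ("low end (25 Mbps)", 25), ("high end (50 Mbps)", 50)]:
    print(f"{label}: ~{minutes_until_cap(mbps):.0f} minutes on a {CAP_GB} GB cap")
```

That works out to roughly 53 minutes of plain HD, so "an hour of HD" is about right, and barely ten minutes at the 25 Mbps end of this format.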
Besides all that, it's going to be years before the standard is finalized and put into place. 2014 is when they may settle on a rough idea of the final spec, and it'll be a good couple of years after that before we see it implemented and actually used. And besides the bandwidth issues, you have CPU issues: present-day PCs might be fast enough if the encoder/decoder were multi-threaded and used all the cores. What they really need to do is put a dedicated decoder chip into computers, like the ones in Blu-ray players, so decoding is offloaded from the main CPU/GPU, but that probably won't happen.
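For a rough sense of the decode workload, here's a quick comparison of raw pixel throughput. The resolutions and frame rates are assumptions for illustration (8K at 60 fps versus 1080p at 60 fps), not the final spec:

```python
# Back-of-the-envelope comparison of the raw pixel throughput a decoder has to handle.
# Resolutions/frame rates are assumptions for illustration (8K60 vs 1080p60).

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

hd_1080p60 = pixels_per_second(1920, 1080, 60)   # what current CPUs decode comfortably
uhd_8k60   = pixels_per_second(7680, 4320, 60)   # one channel of this format

print(f"1080p60: {hd_1080p60 / 1e6:.0f} Mpixels/s")
print(f"8K60:    {uhd_8k60 / 1e9:.2f} Gpixels/s  ({uhd_8k60 / hd_1080p60:.0f}x the 1080p60 workload)")
```

Roughly 16 times the pixel rate of 1080p60, which is exactly why a single core won't cut it and why either a fully multi-threaded decoder or a dedicated decoder chip makes sense.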
Interesting that there's no mention of the audio format. 4K/8K is supposed to usher in the 22.2 surround sound format as well. 4K is what most digital movies are shot on; 8K is typically for IMAX (which is actually usually around 5K or so). 22.2 surround is just nuts: speakers below you, above you, to the sides, behind, and several in front. I'm not sure any home could accommodate the 22.2 spec short of building a custom home theater room. I just finally put my 7.1 system together and it sounds really good; I can't imagine how much better 22 speakers would make it, apart from the rare movie scene where something passes under me and I can pinpoint the pan of the sound as it goes by.
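Just for scale, a rough sketch of what an uncompressed 22.2 mix would take, assuming it's carried as 24 discrete PCM channels (22 full-range plus 2 LFE) at 48 kHz / 24-bit. The sample rate and bit depth are my assumptions, and a real broadcast would compress this:

```python
# Rough uncompressed bandwidth for a 22.2 mix, assuming 24 discrete PCM channels
# (22 full-range + 2 LFE) at 48 kHz / 24-bit. A real broadcast would compress this.

channels    = 22 + 2      # full-range + LFE
sample_rate = 48_000      # Hz (assumption)
bit_depth   = 24          # bits per sample (assumption)

bitrate_bps = channels * sample_rate * bit_depth
print(f"Uncompressed 22.2 PCM: ~{bitrate_bps / 1e6:.1f} Mbps")   # ~27.6 Mbps
```

Call it ~28 Mbps raw, meaning the audio alone would eat more than my entire 20 Mbps connection before any compression is applied.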