"The more you tighten your grasp, the more systems will slip through your fingers." -- Princess Leia
One more reason for users to cut the cable and go to streaming services. The broadcasters are just hastening their own irrelevance.
Jessica Rosenworcel, a commissioner at America's broadcast watchdog the FCC, has criticized a proposed set of TV standards as a "household tax," due to its lack of backwards compatibility. Addressing a conference of Catholic Bishops in Washington DC this week (we have no idea why either), Rosenworcel complained [PDF] that the …
How does that make any difference? You still pay the same content companies, the only possible savings is avoiding renting the cable boxes (which you can do already if you buy a Tivo and rent a cable card)
People who think streaming is a panacea are going to be in for a rude awakening in a few years. The streaming market is already fragmenting: Disney is pulling all its content from Netflix in a couple of years to start its own streaming service, CBS is trying to leverage Star Trek to push theirs, and if they succeed no doubt the other networks will do the same. Before long you'll end up paying more to watch the same stuff because of all the different subscriptions required. The only people streaming will help are those who don't really care what they watch - they can subscribe to Netflix, watch something, be happy, and not care about all the stuff they can't get on it.
Yep they are definitely serious.
We are on Sky Fibre unlimited in the UK - no usage caps or traffic management. I can download at full speed 24/7 for a month with no drop in service if I needed to.
Check http://www.sky.com/shop/terms-conditions/broadband/network-management-policy/ and scroll to the section for "Sky Broadband Unlimited, Sky Broadband Unlimited Pro, Sky Fibre Unlimited, Sky Fibre Max, Sky Fibre Unlimited Pro, Sky Ultra Fibre Optic products*"
The only listed drop in service is for external network issues or faults.
It's not even expensive really.
"The broadcasters are just hastening their own irrelevance."
by behaving like MICRO-SHAFT???
I already have to 'rent' set top boxes, even for a 3 year old LCD TV. It has HDMI output (and composite for older TVs). But the remote control is a piece of CRAP, and 3 or 4 buttons practically don't work after only a couple of years of usage. TV remotes last 10 times as long. But "the cable" went to DIGITAL a while back. So I _have_ to have it.
So I expect "more of the same" then, a special box to decode whatever the hell they broadcast at you. Nothing different except the details. and the increased monthly bill. And not wanting to see anything on 3/4 of the available channels. And too many FEELING commercials injected into the content.
[I had to edit the topic, it was too long with the 'Re:' prepended]
If you're referring to the transition from analogue to Freeview, I think that was done quite well.
Freeview was around for a long time before they finally switched off the analogue signal. And a basic Freeview box was pretty cheap (I think there were even some help to buy schemes for the disadvantaged). Plus undeniably it was a big improvement.
There are people using 1930s TVs with a Freeview box. Not bad for backward compatibility (OK, they're using a scan converter too...).
The trouble with having done that is that the reasons to upgrade beyond that become significantly less compelling to the end users. Freeview is still Freeview, which is excellent, plus they've managed to sneak in a couple of HD channels. That's all been handled reasonably well.
And of course what they're doing in America is the equivalent of turning off Freeview altogether and starting from scratch. Doing that here would result in the Daily Mail exploding in indignation...
"If you're referring to the transition from analogue to Freeview, I think that was done quite well."
Yup. My reaction was to wonder if they're still on analogue over there. We've actually gone through two transitions in post-war TV broadcasting: VHF/405 lines to UHF/625 and analogue to digital. Both were handled with sufficiently long transition periods so that anyone actually forced to buy a new TV must have been using a pretty ancient one on its last legs: my last analogue CRT set used with an STB had one colour channel die.
No, the US turned off the analogue signals with the original ATSC transition last decade. Like your plan, there was a subsidy program to encourage people to buy ATSC tuner boxes, for those who didn't want to give up their TVs. The market has since moved on, as ATSC-capable TVs are now ubiquitous and inexpensive (think a little over $100 inexpensive).
The same issue would arise here if Ofcom mandated that all DVB-T and DAB broadcasts be switched off less than 20 years from the analogue changeover date (DVB-T2 was already out before then, and analogue radio isn't going away any time soon)
Sales of DVB-T (only) or DAB (only) receivers have been nonexistent for a long time, but there are so many installed and in use that there would be consumer uproar.
In this case ATSC3 isn't even released, but the proposed changes would render existing ATSC sets useless in a short period - and given the dearth of USA terrestrial broadcast channels(*) it gives cablecos who already have a defacto monopoly an opportunity to further lock in customers by forcing yet another expensive decoder box to be installed and rented.
(*) A large chunk of the few remaining NTSC terrestrial broadcasters simply switched off their transmitters entirely on ATSC changeover day, because the vast majority of viewers are on cable thanks to NTSC's wild colour shifts when any kind of signal multipathing (ghosting) happens(**)
(**) PAL fixes this by alternating the phase of the colour subcarrier on alternate lines, so that colour shifts on any line cancel out the ones above/below and the overall picture displays correctly at normal viewing distances. You can see the shifting in each line if you look closely enough. The other changes between NTSC and PAL are extremely minor(***)
(***) The basic colour tech is identical apart from the phase inversion. Using subcarrier suppression is a transmitter power-saving tweak, and the change in subcarrier frequency from 3.58 to 4.43MHz is a direct result of the difference in line numbers and frame rate between NTSC and CCIR mono broadcasts(****). The same multipliers are used in both.
(****) Some countries (France, Russia, a few others) messed around with CCIR sync pulse polarity, and in some extreme cases inverted the luminance levels (some of Eastern Europe) or used non-standard audio carrier separation (UK), either to keep out foreign broadcasters(*****) or to lock in local manufacturing, but these changes are all relatively easy to dynamically accommodate even in 1960s-70s era sets. There are even a few wild-ass countries which used PAL-M (PAL-encoded 525 line 60Hz) or NTSC-4.43 (NTSC-encoded 625 line 50Hz)
(*****) And then there's SECAM, which is also CCIR-based, with the same tweaks as above, also used by protectionist countries (France and Russia in particular), and subject to even more region-lockin variants aimed at keeping foreign broadcasts out (which even in repressive states resulted in consumers 'acquiring' decoders for the other systems). Aren't you glad most of the world has settled on DVB-T/T2/S/S2 and that ATSC is really only used by a couple of markets? (USA and some of its puppet states^W^Wneighbours/allies) (Yes, there's ISDB, but that market is tiny and fragmented into islands of incompatibility. And you thought TV was TV was TV... :)
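The subcarrier arithmetic mentioned a couple of footnotes up can be checked in a few lines. This is a sketch using the standard multipliers (227.5 times the line frequency for NTSC; 1135/4 plus 1/625 for PAL), which are textbook values rather than anything stated in the thread:

```python
# Both colour subcarriers sit at a (near-)half-integer multiple of the
# line frequency, so chroma energy interleaves with the luma spectrum.
ntsc_line_hz = 4_500_000 / 286          # ~15734.266 Hz (525 lines, 29.97 fps)
pal_line_hz = 625 * 25                  # 15625 Hz (625 lines, 25 fps)

ntsc_fsc = 227.5 * ntsc_line_hz                 # 3.579545... MHz
pal_fsc = (1135 / 4 + 1 / 625) * pal_line_hz    # 4.43361875 MHz

print(f"NTSC subcarrier: {ntsc_fsc / 1e6:.6f} MHz")
print(f"PAL subcarrier:  {pal_fsc / 1e6:.8f} MHz")
```

The same half-line-offset trick drops out of both standards; only the line count and frame rate differ, which is why the same multipliers yield 3.58 and 4.43MHz.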
"The same issue would arise here if Ofcom mandated that all DVB-T and DAB broadcasts were switched off in less than 20 years "
Well, in fact some changes did make some boxes useless well before that timeframe! The wedge-shaped Pace DTVA-T box was one such thing I had to throw out, and there were many more modules consigned to the dustbin or hopefully recycled!
See https://www.radioandtelly.co.uk/freeviewobsolete.html for a list
"Well, in fact some changes did make some boxes useless well before that timeframe! "
Yes, I'm well aware of that, having had one such box go titsup - but that failure was a direct result of the manufacturer making assumptions that a field size would never change despite there being provision in the standard for it to do just that, not a change in the technology.
A bunch of other boxes (SetPal based) went titsup when the size of the program index pages increased and this was down to the exact same cause.
Reality says that we replace our set-top boxes around once a decade, if not more often. The number of 1980s-era TV sets still in use is minuscule, let alone anything older, and even my late-2000s LCD "HD-ready" set is on its last legs already (the built-in DVD player packed up years ago and the CCFL lamps are going pink)
> If you're referring to the transition from analogue to Freeview, I think that was done quite well.
Some aspects were rather poor, IMHO. Starting in the North of the country meant that the (on average) poorer towns paid the expensive 'early adopter' prices for the kit. By the time the rollout reached the (on average) more affluent South, 4 years later, the kit was half the price or less.
For me personally, I'm on the Hannington transmitter but this was on 1/4 power for 4 years because of "the risk of interference to reception in Guildford". Quite why one of the richest towns in the country needed mollycoddling I'm not sure. In the end I went to Freesat because it was simpler than dicking about trying to understand whether it was my aerial or the transmitter.
> Some aspects were rather poor, IMHO. Starting in the North of the country meant that the (on average) poorer towns paid the expensive 'early adopter' prices for the kit. By the time the rollout reached the (on average) more affluent South, 4 years later, the kit was half the price or less.
I live about as far south (south west) as it is possible to get. My local transmitter - Caradon Hill - was switched over to digital in Aug/Sept 2009 and was the last transmitter in the SW to be switched. This was a couple of months before Winter Hill (Manchester and area) and about 2 years before Emley Moor (Yorkshire). The North East of England was one of the last areas to undergo switchover, after London, with Northern Ireland being the very last.
Not exactly much of a north/south divide at work. Much of the south west isn't known for its affluence...
To be honest, by the time digital switch over actually rolled around, Freeview kit was already cheap and had most of the bugs ironed out. It was a full 11 years after it first launched, after all.
I suspect that "new" video display boxen (aka TVs) will attempt to be compatible with new standards, but you never know. In my house, I still have (count 'em) 5 NTSC-only TVs. They still work quite nicely with the TiVo box that emits proper signals. Yes, I do have a bunch of "adapters" (with enough $40 coupons you can get quite a few) and a single W I D E screen video display box for watching sporting events at times (it also makes a great display for a Raspberry Pi).
Someone should have designed the ATSC standard to last a bit longer. NTSC lasted over 60 years in one form or another, and served us quite well. One thing I learned is that we humans can interpolate quite a bit in the visual field, and while some things need lots of resolution (computer monitors seem to be high on the list), entertainment TV got by quite well at 480p resolution for quite a while!
So, life goes on and another standard goes obsolete. (*SIGH*)
Someone should have designed the ATSC standard to last a bit longer.
I think the problem here (and it's not specific to ATSC, I could also mention DAB and DVB-T) is that until the mid 1980s, analogue was all we had (or at least all that was practical) and there was very little you could do to improve analogue TV without also consuming oodles more bandwidth. Japan had Hi-Vision while Europe had PALplus and D2 MAC, none of which really took hold.
From the 1980s onward, the mandatory SCART socket on televisions began to make people realise that their ordinary TVs were capable of extremely good pictures - some home computers could send RGB to a SCART socket, as could some video games consoles, and of course, eventually, DVD players. I have a theory that one reason DVD took so long to get going was that left-pondians didn't have the advantage of an RGB connection via SCART. With US TVs mostly only having composite or s-video connections, the picture quality improvement of DVD over VHS and particularly Laserdisc wasn't as apparent as it was to us Europeans. I know some US TVs had "component" inputs, which would have done the trick, but few DVD players had component outputs I think.
Where was I?
Oh yes, the difference now is that since digital processing of video has become relatively trivial, it's also trivial to keep making it better. A few years after one standard is set (say, MPEG1 layer 2 for audio as used by DAB) another one comes along which offers either higher quality for the same bitrate or the same quality in fewer bits, or a lower decoding burden meaning it runs better on low-power devices, or all three at once. The same is true of transmission standards, as exemplified by the differences between DVB-T and DVB-T2.
Somebody pointed out the well-managed transition from analogue terrestrial broadcasting in the UK to digital, but they failed to point out that there is a digital-to-digital transition under way as we speak. In some ways this is similar to the ATSC to ATSC-3 transition, but the difference is that DVB-T forces broadcasters to work together (effectively, many producers share one transmitter and thus one method of transmission) while ATSC was set up specifically to allow individual broadcasters to maintain sole control of their own transmissions.
In the last very few years, streaming has become a practical delivery method too, and this also alters the landscape. If traditional broadcasters are not to wither, they need to adapt, and adopting new transmission methods, particularly if they enable easier integration with net-connected services, could be useful.
Alongside the improvements in technology of course has come a vast reduction in the cost of receiving equipment. Even back in the early 1980s, a normal (for the UK) size colour TV probably cost in the region of a week's wages for most middle-class people. These days, when you can buy a connected, full HD TV for under £200 - even a newly qualified teacher can earn that in a couple of days - the TV has turned from a "consumer durable" expected to last perhaps 10 years alongside the 'fridge and the oven into a commodity item and manufacturers are able to produce them at such low prices partly because they expect repeat business every 3 to 5 years.
That's my 2p anyway, sorry if I'm late into this argument!
Oh, you also said
entertainment TV got by quite well at 480p resolution for quite a while
Firstly, it was 480i - there is a big difference between interlaced and progressive scanning and secondly, those of us in 50Hz countries actually had a few more lines of resolution (for home-grown programming anyway) at 576i.
M.
"I know some US TVs had "component" inputs, which would have done the trick, but few DVD players had component outputs I think."
I think the problem was the other way around. Component outputs became ubiquitous first (easier to do on the player end with the right board and chip designs), but as people held out their old CRT TVs that may have only had composite input (or nothing but the RF antenna input if they were really old or cheap), adoption was pretty slow until the PlayStation 2 provided another way in (come for the games which didn't need ultra-high-quality TVs, stay for the movies).
"Firstly, it was 480i - there is a big difference between interlaced and progressive scanning and secondly, those of us in 50Hz countries actually had a few more lines of resolution (for home-grown programming anyway) at 576i."
NTSC was lower resolution, higher rate. PAL was the opposite.
"those of us in 50Hz countries actually had a few more lines of resolution (for home-grown programming anyway) at 576i."
Except we didn't, because PAL (and SECAM) has half the vertical colour resolution of NTSC, so what we saw was effectively 288i-and-a-bit despite the nominal 625 line frame (576 visible).
The startling improvement in RGB video quality from computers and DVD players over PAL was due to the change from that effective resolution to a true 576i
Re Alan Brown:
"Except we didn't, because PAL (and SECAM) has half the vertical colour resolution of NTSC, so what we saw was effectively 288i-and-a-bit despite the nominal 625 line frame (576 visible)."
Actually even that's not quite true - that did not apply to the luminance (brightness) signal at all, which is what gives those systems their higher resolution. The colour itself is literally smeared on top of that, at a very low, smudge-like resolution.
The horizontal chrominance bandwidth of analogue colour was also greatly restricted, to about 1MHz, due to the colour difference signals being placed on a subcarrier at 4.43MHz (PAL colour); the luminance channel ONLY provided the fine detail. The vertical reduction in colour resolution for PAL-D and SECAM wasn't a problem - the notion being that if it could be reduced horizontally (as it always was), it could also be similarly reduced vertically.
It should be noted that cheap, simple PAL receivers with no phase errors and a correctly adjusted decoder wouldn't suffer from decreased vertical colour resolution at all compared with NTSC!
It was just delay-line averaging of the chrominance signal after the alternate-line phase inversion of R-Y which reduced the resolution on more advanced decoders; phase errors then translated into changes in colour saturation rather than hue. Perhaps modern techniques could even correct any phase errors without such delay-line averaging.
It should also be remembered that digital colour television ALSO reduces the colour resolution in order to preserve bandwidth, and the only thing running at (almost, because digital TV is lossy compression) full bandwidth is the luminance content. All this simply exploits the eye's inability to discriminate high-resolution colour, horizontally or vertically. I'm sure one could easily sum up the luminance signal from an RGB source, and use it to re-create the R-Y, B-Y and G-Y colour difference signals to see what kind of colour definition is transmitted on digital TV - in my opinion standard definition colour resolution on DVB TV often seems worse than PAL ever was!
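That digital chroma reduction can be sketched in a few lines of numpy. This is a toy 4:2:0 example (the sampling structure used by MPEG-2 broadcast chains) applied to random data, using the standard BT.601 luma weights; it's an illustration of the principle, not a model of any particular decoder:

```python
import numpy as np

# 4:2:0 chroma subsampling: luma Y is kept at full resolution, while
# the colour-difference planes are decimated 2x both horizontally and
# vertically - directly analogous to PAL's reduced chroma bandwidth.
rng = np.random.default_rng(0)
rgb = rng.random((480, 720, 3))          # a hypothetical SD frame

# ITU-R BT.601 luma weights
y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
cb = rgb[..., 2] - y                     # B - Y colour difference
cr = rgb[..., 0] - y                     # R - Y colour difference

# Average each 2x2 block of the chroma planes down to quarter area
cb420 = cb.reshape(240, 2, 360, 2).mean(axis=(1, 3))
cr420 = cr.reshape(240, 2, 360, 2).mean(axis=(1, 3))

print(y.shape, cb420.shape)              # full-res luma, quarter-res chroma
```

A decoder upsamples those quarter-resolution chroma planes back to frame size before display, which is exactly why fine colour detail looks soft even when the luma is sharp.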
> I have a theory that one reason DVD took so long to get going was that left-pondians didn't have the advantage of an RGB connection via SCART.
Y-C component (S-Video) input was fairly common on North American televisions by the mid-90s. While not as good as RGB signaling over SCART, it was good enough for the televisions of the era when viewing DVD movies. I'd argue that cost was the initial barrier to adoption of DVD.
Where S-Video was noticeably inferior was with game consoles and home computers. The colorspace and chroma bandwidth limitations were more of a hindrance with true RGB/I sources.
That's mainly because the standard was way ahead of video technology of the day; it wasn't until the late 80s that televisions could even show off the full fidelity of the standards. Admittedly, for its time, both NTSC and PAL were good technology that used an enormous amount of bandwidth to make up for their simplicity. Raw NTSC is about 50-100MB/s, depending on how accurate you want color to be, meaning that you could store a whole 1.5-3 minutes of raw video on a DVD-9. It took a LONG time to outgrow that, but once HD showed up, that was that.
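A quick back-of-envelope on those numbers, assuming an uncompressed 8-bit RGB digital capture of the active NTSC picture only (the comment's higher 50-100MB/s figures presumably include blanking intervals and deeper colour sampling):

```python
# Rough sanity check of "raw NTSC fills a DVD-9 in minutes".
width, height, fps = 720, 480, 30000 / 1001    # SD active area, 29.97 fps
bytes_per_pixel = 3                            # 8-bit R, G, B
rate = width * height * fps * bytes_per_pixel  # bytes per second
dvd9 = 8.5e9                                   # dual-layer DVD-9 capacity

print(f"Data rate: {rate / 1e6:.1f} MB/s")
print(f"A DVD-9 holds {dvd9 / rate / 60:.1f} minutes")
```

Even at this conservative ~31MB/s, a dual-layer disc holds under five minutes of raw video, so the order of magnitude in the comment stands.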
On the other hand, there's now lots of investment in continually improving the state of the art, and where ATSC could meet the needs of HD easily, it's again not going to work for 4K or HDR/deep color. This changeover is as much consumer-driven as industry-driven.
It's not like ATSC 1 barely came into being and now it's time to toss it, it's over 20 years old as well (though the H.264 extension is only 10 years old). By the time the new standard is ratified and anyone starts broadcasting with it, we're probably looking at another decade at least. There's only so much future-proofing you can put into digital technology with fancy algorithms, since it still has to be cheap enough to purchase early on.
"It's not like ATSC 1 barely came into being and now it's time to toss it, it's over 20 years old as well "
The issue is not the age of the existing digital standard, it's the time taken since the last time that people were forced to upgrade their sets or settop boxes on pain of them no longer working.
> The issue is not the age of the existing digital standard, it's the time taken since the last time that people were forced to upgrade their sets or settop boxes on pain of them no longer working.
Like I said, what's the point? By the time the standard is hashed out, ratified, implemented, and finally cut over, you're looking at a minimum of another decade, maybe even two. But thanks for ignoring that.
This change may lead me to dropping my TV subscription depending on the cost of replacing the TVs. I have one aging 42" LCD TV and new(ish) 55" cheapo. It is entirely possible that when those fail I won't replace them. I'll be Internet only then. I don't expect the savings will be huge though. As more people go Internet only the price will go up.
It's just a subsidy for the electronics industry. It rewards South Korea for letting us further militarize the area. As you point out, it's not like any large number of people care about 3D. While 4K isn't quite DOA, there's not a lot of interest, either.
They got a bump eliminating analog TV. Guess they're looking for another.
In fact, manufacturers were seeing flat sales after the big switch to ATSC, mandated by the government. The industry tried to goose sales back then with 3DTV; it failed, and they're trying again now with 4K. I guess it isn't working, as they've got the government to mandate the initiatives in such a way as to make all of our TVs obsolete.
4K DOA? Haha I actually intentionally downscale 4K content. I don't want to look at people under a microscope. 4K is great for car chases, but it's horrible when you see how bad your favorite actress's skin looks when displayed as a close up on a 65" screen from 2 meters. 4K is absolutely horrible.
And I saw a few 3D movies and I actually stopped going to the movie theater because of them. I'd rather watch a film on Oculus Rift if I want it huge. In fact, it costs about the same to Oculus as it does to go to the movies and have snacks a few times a year.
4K DOA? Haha I actually intentionally downscale 4K content. I don't want to look at people under a microscope. 4K is great for car chases
Personally, I'd rather spend the bandwidth on true 100Hz progressive scanning than on upping the resolution. High frame rate video takes a little getting used to, but it makes a much bigger difference to fast-action sequences than does a few more pixels.
M.
"High frame rate video takes a little getting used to, but it makes a much bigger difference to fast-action sequences than does a few more pixels."
But which occurs more often? Fast or slow motion? 60Hz is usually sufficient for when it's needed such as sportscasts (thus why ESPN prefers 720p), but when things get slow (and I strongly suspect slow is the norm), we start to pay attention to more details, and there resolution has the edge.
"While 4K isn't quite DOA, there's not a lot of interest, either."
In both cases (4k and 3D), interest follows content availability and broadcasters/program makers have _never_ been particularly interested in making 3D content - it's both an utter cow to edit and requires cinematography techniques which are completely alien to anyone trained in the last 60 years (you can't emphasise something by focus, it has to be by lighting, which is even more work to get right and why most 3D productions which aren't just 3D showcases only use a very limited depth of field)
Part of the problem with 4k uptake is that most people who are interested also know that it's an interim standard on the path to 8k, and that 99.9% of the existing material is 1080p at best and so compressed at broadcast that in a lot of cases it's visually no better than 720p, and often only slightly better than 576i. That in turn means people buying 4k equipment frequently feel they've been duped when comparing their actual viewing experience with what was on the showroom floor, or end up using it to watch non-broadcast media (4k computer desktops on a 44-inch HDR OLED are _nice_)
3D is likely DOA for most as it generally does very little for most broadcasts. Also, it is known to make some people sick.
4K strikes me as not terribly useful for most people even if they have equipment and a 4K signal. Part of the problem is the physiology of the human eye and one's ability to focus. Also, I suspect the higher resolution would be pointless at the distances many are from the boob tube when they are watching.
Whether 4k actually gets you better quality is a question that can be resolved quite easily.
If you want to see where 4k works then I suggest you view the opening credits to stranger things season 1 on full HD and 4k.
Once you have done that then you will see the difference.
Indeed, as eyesight deteriorates with age, the benefit of 4K is grossly overrated for many people.
Low RES is (mostly) OK.
We don't watch much telly, so we have small-screen TVs, and what we do watch is recorded on the PVR and watched some time (often weeks) later. On a smallish telly there's not really much advantage in recording HD compared to SD (when the HD option is available), so we only record SD, as the space saved more than outweighs the minimal quality advantage on our low-end TV hardware.
Is 3D really still regarded as a selling point?
3D was a "nice to have" when it didn't add more than a few quid to the cost of a TV. Passive systems had their problems but they worked really well and in particular the glasses were cheap (and compatible with RealD cinemas).
We have such a TV at home, and a reasonable selection of films. The main problem we find is that you have to "watch" a 3D programme - it's impossible to have it on and do something else at the same time.
As far as I'm aware there isn't a single manufacturer offering a 3D TV in the UK domestic market at the moment, so I really don't know what we'll do when our TV dies. Perhaps by then it'll be back in fashion.
3D seems to be hanging on in cinemas, the problem there being that they charge too much extra. People might be willing to spend it for a big action movie, but 3D adds relatively little to a RomCom.
At work we show occasional films to the public. We have a licence which allows us to do so, so long as we don't charge. Some of these are 3D and while people don't seem to be put off by a 3D film, unless it's a special event they don't seem to go out of their way to attend our 3D screenings.
We are in the middle of a system upgrade at the moment. Our existing passive 2-projector system is being replaced by a 1-projector system. The polarising filter for this system retails at around £4,000 ex VAT.
M.
Alan Brown:
"And thus we satisfy the maxim that it's pornography which has been the real driver of consumer adoption of video standards over the last 50 years."
Hmmm... I don't ever remember that anyone could easily get any porn on my 50 year old dual standard Bush CTV 167 25" colour TV when it was new, nor for quite a long time afterwards! One could now of course, if one is prepared to put up with the intermittent frame jitter it is troubled with! ;-)
"I don't ever remember that anyone could easily get any porn on my 50 year old dual standard Bush CTV 167 25" colour TV when it was new"
You can thank Mary Whitehouse for that. Early 60s UK TV was quite racy before she and her mob got involved.
In the case of _broadcast_ standards, you get what the government of the area dictates. In the case of _consumer_ standards, you get what has the most purchased content.
Martin an goff:
"
Passive systems had their problems but they worked really well and in particular the glasses were cheap (and compatible with RealD cinemas).
"
The problem with 3D on domestic television, is that it simply wasn't good enough and gave the whole thing a bad name. Especially with the frame based active system requiring battery operated glasses.
Recently I bought a 55" 4K OLED TV with passive 3D, and the 3D performance on that thing is simply stunning; for 1080p content you get full resolution. Watching Avatar looks exactly as it did in the cinema, but smaller. There may be hope for 3D (for enthusiasts), but only on passive 4K systems.
I also have two Samsung 1080p LED screens (active 3D) and my kids have a Toshiba 1080p screen (passive 3D) and the Toshiba is by far the best, but still nothing like the 4K screen. The Samsungs, good as they are otherwise, are a complete waste of time for 3D content. Horrible in fact.
... and good stereo speakers, nothing else counts. If you have less, it will still be sufficient for the usual American TV production standards. Shenzhen will have a set-top box for $35. Where is the problem?
No, I have no idea what the new (US) standard entails and no intention of reading it, I think my comment has some merit despite that.
My folks, in their 70s/80s, want a TV with TV shows on it. The fact that their device, which they chose without intervention, will show all sorts of internet content and handle internet TV too is irrelevant: it's all too complicated. When I put something from YouTube on their telly it is met with "what channel is that?"... every.time.
My nephew and niece, < 8, are hardly aware of the TV; it's just another screen, and it's the one where touch doesn't work, to boot.
Me, ~40... just don't care for all the contracts. I can get my stuff streaming or torrented on the big screen or a tablet.
The cable companies are here to see out the older generations. The only driver for cable is sport, and even that is covered by IPTV companies now. Why do I need either cable or a new TV?
I do have a subscription capability, though I have yet to watch anything via its 'services'. I thought that the 'entertainment' was the hunt for something to watch. After about 20 minutes or what feels like a lifetime I give up.
I get regular e-mails inviting me to see the latest totally non-interesting offerings, so really, what is the point? I simply PVR anything of possible future interest and watch whatever I can find to pass the time, when I need a time filler. This has the huge advantage of allowing all the adverts for total crap to be bypassed. As for subscribing to another advert channel with periodic inserts of non-advertisements - that is for birdbrains.
I am told that there is also something called radio or wireless (cable co clowns please note), no 'k', but it often has programmes
"When i put something on youtube on their tele it is met with "what channel is that".... every.time."
I'm tempted to call BS on this. I'm in my 70s. I don't know any of my various older friends and relatives who would be in the least fazed by this although personally I wouldn't have an internet connected TV in the house; something I can control goes between the net and the TV.
I'm tempted to call BS on this. I'm in my 70s.
To be fair, you're obviously slightly unusual as you are reading El Reg.
I don't know many of my various 70+ year-old friends and relatives who wouldn't be fazed by this. I do know one or two who would manage, but they have the eminently sensible attitude that if it doesn't "just work", it's rarely worth doing(*).
Switch TV on, select BBC1, watch Countryfile. Works every time (unless the Magpie's playing see-saw on the aerial again).
Switch TV on, find the menu option that allows you to run the YouTube app or the iPlayer app or whatever, wait an age while it connects (if it connects at all), tediously type something into the search box, hope that what you want comes up in the first page, select it to play, wait another age while the thing starts loading and buffering and then, insult to injury, two thirds of the way through the programme the blasted thing stops working and it's too much hassle to reload it and find where you were in order to catch the end.
My parents have a STB that allows them to scroll "backwards" through TV listings. Sounds fantastic, but they still have the "wait while it connects and buffers" problem and also the problem that not all broadcasters are on the system, and not all programmes are on the system.
Until the thing is as quick and reliable as Ceefax used to be (and Ceefax wasn't particularly quick), it's all sort of "meh" to them.
My mother-in-law, on the other hand, won't even have an internet connection in the house, not even if it meant she could make a video call to her BSL-using daughter, or her distant grandchildren.
:-)
M.
(*)I have to admit here that I do have many older acquaintances, whom I only know through a mailing list, who are quite thoroughly tech-savvy
"I don't know many of my various 70+ year-old friends and relatives who wouldn't be fazed by this."
My father, for one. I only found out recently that he was _scared_ of computers in the 1970s but gritted his teeth and took ~7-11yo me to various university workshops because he could see where things were going, then put up with computers in the house through the 1980s. He can just about make a skype call to me (after a decade of encouragement), but switching away from that to anything else is impossible.
On the other hand, as an early ISP (back in the early 1990s, before the telcos parked their tanks on everyone's lawn) I noticed that most of my enthusiastic technology adopters were 65+ year-old retirees.
"My mother-in-law, on the other hand, won't even have an internet connection in the house"
If she's getting on she'd pretty well need one now if she lived round here. The local doctors have stopped taking requests for repeat prescriptions by phone so it's either online or a personal visit and, although it's less than 3 miles, that's a change of buses if you don't drive.
What people forget is that Intel introduced the first generations of microprocessors in the '70s, rapidly followed by the 6800 & 6502, and the IBM PC* and Macs came out in the early '80s. Those of us who were in our 20s & 30s then, 60s & 70s now, were the generation who recognised the potential of this stuff as working tools. They were around for half or more of our working lives. It's almost as difficult for us to remember a time without them as it must be for you to imagine it.
* Before they effectively hijacked the term, everything was a PC - Apples, PETs, Trash-80s, various S100 beasts, the lot.
My parents are 84 and 81, they own an Android smartphone, two tablets, two Kindles and a desktop PC. They regularly watch streamed content without any help from me; I'm 250 miles away. Their service of choice would be BBC iPlayer though, I can't imagine them finding ANYTHING worth watching on YouTube since I can barely find anything other than game or movie trailers myself.
So not all old timers are ignorant luddites.
Personalized content means the TV people own your identity and can do what they like with it, all the way to Google and Facebook and back. An OTA broadcast won't be allowed on equipment you thought you owned because you paid for it, not until they verify your personal details, which are not your own any longer.
Wanna bet? Wait until (1) they get whispernets installed, so they don't need to piggyback on you, (2) they become standard issue so ANY appliance you get in future will have them, like it or not, and (3) they're on suicide circuits so that you can't break them without breaking the appliance (oh, and that'll be considered tampering, so no warranty for you).
> My television is never getting connected to the internet.
I don't connect mine either. It seems as if most television manufacturers stop producing firmware updates after two or three years. I have no faith that they will remain secure. I have almost no faith that they were ever secure in the first place.
What scares me about ATSC3 is that there will be a large push to have televisions connected to the internet for authentication, interaction, and personalized advertisements. Yet I've heard almost nothing about data privacy, data retention, encryption, and firmware quality.
I dread the day that I have to run antivirus software on my TV or have to jailbreak it to run privacy and ad blocking add-ons.
In SOVIET RUSSIA [read: modern USA], TV watches YOU!
how many set top boxes will contain a barely visible camera hole that uses various "patented by Micro-shaft" (I kid you not) technologies to determine HOW MANY PEOPLE are watching, maybe even attempt to guess at sex, age, etc., and SLURP all of that info to market commercials DIRECTLY SELECTED FOR YOU, etc. etc. and even figure out when you go to take a whizz or grab something from the fridge...
forcing a NEW round of "smart" TVs and set top boxes may just be another way of slipping THIS in. yeah. Think of it as political Astro Glide...
It's always been different: PAL vs NTSC (probably why Japan tried to get HD ASAP). As for why, no idea, but wouldn't it bring the same problem this article points out? DVB-T2 is not American, so it would have zero backwards compatibility with anything the Americans have, requiring a brand new set-top box.
As for why, no idea
My impression - apart from the "not invented here" aspect - was that DVB (and DAB) requires that broadcasters give up their individual transmitters and operate, or buy space on, a shared data pipe, known as a multiplex. Each multiplex delivers a fixed amount of data to the receiver, and that data is split up between a number of channels, some of which might be television, some audio-only (radio), some data-only and so-on. It means that small broadcasters are able to enter markets they were previously barred from because all they have to do is buy a sufficient amount of data from the owner of the multiplex.
In contrast, ATSC is designed to be used by a single broadcaster, perhaps with a few subsidiary stations.
But the US terrestrial broadcast market has always been very different to the European market, particularly with regard to the lack of single country-wide broadcasters, and the sheer amount of thinly-populated space. US cable is different again.
If you like, if DVB (and DAB) is a socialist solution, ATSC is the capitalist answer.
M.
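The multiplex idea described above is easy to sketch: a fixed-capacity data pipe whose payload is divided among whatever services the operator sells space to. The capacity and per-service figures below are illustrative assumptions, not real DVB-T parameters:

```python
# Illustrative sketch of a DVB-style multiplex: one fixed payload
# rate shared among television, radio and data services.
# All figures here are made-up examples, not spec values.
MUX_CAPACITY_MBPS = 24.1  # assumed multiplex payload, Mbit/s

services = {
    "TV channel A (SD)": 2.5,
    "TV channel B (SD)": 2.5,
    "TV channel C (HD)": 8.0,
    "Radio station":     0.192,
    "Data carousel":     0.5,
}

used = sum(services.values())
assert used <= MUX_CAPACITY_MBPS, "services exceed multiplex capacity"
print(f"Used {used:.3f} of {MUX_CAPACITY_MBPS} Mbit/s; "
      f"{MUX_CAPACITY_MBPS - used:.3f} Mbit/s left to sell")
```

This is the sense in which a small broadcaster can "buy a sufficient amount of data": the operator just adds another entry to the allocation, as long as the total stays under the pipe's capacity.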
The first generation of HDTV tuners went into the trash a long time ago. They were some DVD chips and a pile of open source junk found on the Internet. They ran hot, crashed constantly, and had compatibility issues.
The second generation tuners were popular with plasma TVs. Plasma was the only tech at the time that could show 1920x1080 without weird motion artifacts. Today those TVs aren't as bright as they used to be and some people might be tired of their 300W to 900W power consumption heating up the room. Or the dithering flicker. Or the power supply hum. They won't be around in 5+ years when ATSC 1.0 goes away.
New TVs that are most likely to be in use during the conversion should have a spare HDMI port for an external tuner dongle. Might as well get started on ATSC 3.0.
I gave up on standard telly years ago. Separating the shite from the flushed wedding ring isn't for me. The 4 or 5 shows that I like can be had without a massive cable or satellite bill showing up every month. I could afford a much better holiday every year by ditching the subscription.
"apart from games there is no compelling reason to buy into the VR Hype."
But, like 3D, it'll be back again in cycles as each new generation thinks it's "The Next Big Thing". The only downside is that the ever decreasing attention spans of what seems like a majority of the population means the cycles are getting shorter.
If all they wanted to do was add support for 4K they could have added that as an extension to ATSC 1.0 and remained backwards compatible, but they changed its physical layer to use OFDM (i.e. like LTE) instead of 8VSB because it has far better multipath rejection (and works while in motion, because some hopeful fools think people want to watch broadcast TV in moving vehicles on their phones)
Multipath rejection is important because it will allow a TV station to broadcast on the same frequency from multiple towers, instead of currently where you need to broadcast on different frequencies and use PSIP to make it look like the same station. Making it compatible with LTE is important because instead of having just one or a handful of giant towers, they could have one big tower and then a lot of little transmitters on LTE towers to cover areas the big tower can't reach due to terrain. That will be far cheaper than running a bunch of translator towers in mountainous areas out west.
Not having an ATSC 3.0 tuner built into TVs is irrelevant, you'll be able to buy a tuner the size of a pack of cards with an HDMI pigtail on one end, and a coax port and USB port (for power) on the other for $50. They ought to quit mandating TVs have tuners at all (if they don't in the US you can't sell them as televisions, they have to be sold as displays) since the ATSC 1.0 tuner will become less and less important and the QAM tuner is mostly useless as more and more cable systems encrypt even their standard definition signals over the next few years.
Rather than hold back progress, if the FCC was concerned they could subsidize purchase of ATSC 3.0 tuners for people below a certain income level via rebates or something. Given how much money the FCC collected in the recent 600 MHz auction, this would cost only a few percent of those billions.
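The single-frequency-network benefit claimed above comes from OFDM's guard interval: echoes, including signals from other transmitters on the same frequency, that arrive within the guard interval don't cause destructive inter-symbol interference. A back-of-envelope sketch (the 224 µs guard figure is an illustrative assumption, not taken from the ATSC 3.0 spec):

```python
# Rough SFN geometry: an echo whose extra path length is short enough
# to arrive within the guard interval is harmless, so the guard
# interval bounds how far apart co-channel transmitters can usefully be.
C = 299_792_458  # speed of light, m/s

def max_sfn_path_difference_km(guard_interval_us: float) -> float:
    """Largest extra path length (km) an echo can travel and still
    land inside the guard interval."""
    return C * guard_interval_us * 1e-6 / 1000

# e.g. a 224 us guard interval tolerates roughly a 67 km path difference
print(f"{max_sfn_path_difference_km(224):.1f} km")
```

That ~67 km scale is why a big main tower plus small co-channel fill-in transmitters on existing cell sites is plausible, where 8VSB's single-transmitter design forced separate frequencies plus PSIP trickery.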
Excuse me while I ROFLMAO! Being sorta involved when both DVB and ATSC were being developed, I never understood why the Yanks were so stuck on 8VSB. All the tests then in all sorts of environments showed that OFDM was better. Hence most of the rest of the world went with DVB or variants (Japan, Brazil).
They ought to quit mandating TVs have tuners at all (if they don't in the US you can't sell them as televisions, they have to be sold as displays)
The only thing wrong with that is that they'll want to charge more for displays. The only thing I want the screen in the living room to do is act as a display for the selection of boxes connected to it. Anything off-air comes in via the PVR, anything off-net via the Pi.
"The only thing wrong with that is that they'll want to charge more for displays."
S/want to/already do/
Seriously, you can buy a 4k DVB-T2 TV with displayports cheaper than the displayport-only tunerless version (LG, Samsung, various others)
In fact, if you look at the research both in the lab & in the field, noise & multipath rejection of 8VSB is in most cases on a par with (within a couple of dB) the 64QAM used in DVB-T, and slightly better than that found in 256QAM in DVB-T2. And, despite early concerns, they managed to retrofit SFN operation to it fairly successfully; although not used much in the US due to the market structure there, the few SFNs they do have are comparable to the largest DVB-T SFNs elsewhere.
If they really wanted to improve those factors that much, they would've adopted something similar to the ISDB-T used in Japan, etc. Better noise & multipath rejection, more flexible SFN operation, a hierarchical structure allowing for low-power ESB/EW receivers, etc, etc.
In reality, it seems ATSC 1.0's near-complete lack of extendability doomed it.
Lack of extendability was probably a feature to prevent creep. They would rather have the forced transition to ATSC3 than for things to creep and people to complain about extension support. They probably figured things would move on (and they did from MPEG-2 to MPEG-4 to AVC to HEVC) and that it would be better at some point so draw a line.
This seems to mean that you need an internet connection for the TV to work. The internet is coming to be regarded in the same way as electricity and water: an essential domestic service.
If you have to have internet and a screen, the bit you don't need is a tuner. So TV is doomed. Once the older generation find their TV does not work, they will learn how to use IPTV.
This "old people can't do internet/PCs" stuff is twaddle. Computers may have been young people's stuff once upon a time, but the internet is now 35 years old, so people who were adults when it started are now over 55. My mum was a computer programmer in her 40s, and is now 91.
"Stupid people need help" is definitely the case, nothing new there, and "Samsung keep updating the Smart TV UI, and now we can't find Youtube" is also a problem - easily solved by not buying Samsung again.
People will soon realise a new TV is NOT the answer they are looking for.
"People will soon realise a new TV is NOT the answer they are looking for."
Wanna bet? People still want turnkey solutions. Turn it on, punch a number, watch. End of. Unless you can do simpler (and to do that, you'd have to go psychic), that interface is still going to be the choice of people who insist on KISS.
"Except we all know that if the ads aren't let through then we won't get to see what comes over the air. Technogeeks might be able to get around it but it will soon become the usual game of cat and mouse."
Bet you the decryption key will be buried within the actual data stream of the ad, probably in various pieces with the key piece only at the very end, meaning the ONLY way you'll be able to watch your content is by seeing the ad through to the end. Then, the content itself will slip key plot points right at the beginning, so you'll need to be paying attention from the point the ad ends or you miss most of the plot.
@ John Smith 19
If cable companies love this so does he.
Looks like he's keeping his head down at the moment...
(Caution - link to external content - wired.com!)
https://www.wired.com/story/fcc-chair-ajit-pais-silence-on-trump-tweets-speaks-volumes
or stuck up somewhere in Washington or New York
Since the switch to the ATSC standard and digital TV in the US and in Canada, the range of the OTA signal is significantly less, and the picture is either perfect or blocky at times. I wonder if the new ATSC 3 standard will improve this? I doubt it. I've gone to streaming and find it to be fine with a Roku. I can subscribe to the channels I want and can get local news. Cable is outrageously expensive. I would rather have free OTA TV but, living in a rural area, there is no signal. So no new TVs here. Digital TV isn't all it was meant to be.
Since the switch to the ATSC standard and digital TV in the US and in Canada, the range of the OTA signal is significantly less and also the picture is either perfect or blocky at times
That's odd, because one of the arguments for choosing 8VSB in the first place over the COFDM that everyone else uses was that it (apparently) works better in the fringe reception areas. One selling point was that you could get the same coverage as with analogue but using 25% of the transmitter power.
One problem is that the single-station nature of the US market means fewer actual transmitters. Most of the rest of the world is replete with smaller repeater or fill-in transmitters, and one of the reasons this works is because the cost is shared.
I have no idea if this will work, but (Google Streetview) this (at the back of the car park) is a tiny repeater station serving no more than a hundred houses at Van Terrace.
In the UK Freeview (terrestrial) and Freesat (satellite) are utterly subscription-free (unless you count needing a TV licence in the first place) and where Freeview doesn't cover, Freesat does for the price of a 40cm or 60cm dish and a receiver. Some TVs have both DVB-T (Freeview) and DVB-S (Freesat) receivers built-in. Subscription services are available if you're that way inclined (sport and movies, mainly).
Apart from the waste-of-space low-bitrate "+1" channels, digital TV is working pretty well here.
M.
The new (since ATSC switchover) definition of "fringe area" is apparently "50 km from city center". Yes, I can get OTA from over 200km, often, but only in some directions, and lost quite a few local stations at the switchover. Who could have guessed that hills could attenuate signal?
I also have to wonder if the new-new standard will fix the often horrendous skew between video and audio. I first noticed it on an analog TV, during the dual-standard period. Figured out that while the local-transmitter -> TV path was analog, the networks had taken to digital transmission from network to local station. I am still baffled why, 90(?) years after Western Electric introduced sound-on-film, the boffins at Digital R' Us can't keep sound within a few seconds of picture. Some films seem to have been badly dubbed from English to English (Both cable-QAM and OTA, BTW, so it's presumably a problem in constructing the stream)
"Better compression" will also presumably mean more situations where a line across the screen and a tick of sound (analog) get turned into muted audio and a picture in the witness protection program for several seconds (at a crucial plot point, of course).
Yeah, I'm old. Old enough to remember when CATV was becoming a thing, and political battles about actual communities versus private industry featured arguments about how the private path would lead to vast amounts of quality programming and no commercials, at a nominal fee. How's that working out?
> The new (since ATSC switchover) definition of "fringe area" is apparently "50 km from city center".
Who would have guessed that switching from mostly VHF frequencies to mostly UHF frequencies would have had an impact on signal range?
My grandparents used to be able to pick up six analog VHF stations from over 125Km away. After the switch, it dropped to only two VHF-Hi stations (11 & 13), and only intermittently. Not a single UHF station made it that far.
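The frequency effect described above shows up even in the idealised free-space path loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44. The 200 MHz and 600 MHz figures below are my own rough stand-ins for a VHF-high channel and a typical post-switchover UHF channel; real terrain and foliage make the gap worse still:

```python
import math

# Free-space path loss in dB for distance d (km) and frequency f (MHz).
# This ignores terrain, which in practice penalises UHF even more.
def fspl_db(d_km: float, f_mhz: float) -> float:
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

d = 125  # km, roughly the reception distance mentioned above
vhf = fspl_db(d, 200)  # assumed VHF-high channel (e.g. ch. 11-13 band)
uhf = fspl_db(d, 600)  # assumed mid-UHF channel
print(f"VHF: {vhf:.1f} dB, UHF: {uhf:.1f} dB, gap: {uhf - vhf:.1f} dB")
```

The gap is 20·log10(600/200) ≈ 9.5 dB at any distance, which is a big chunk of link budget to give up when you were already at the fringe.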
Since the switch to the ATSC standard and digital TV in the US and in Canada, the range of the OTA signal is significantly less and also the picture is either perfect or blocky at times.
<rant>
One thing to realise about TV in Canada (and possibly the US) is that the objective of the broadcaster is not always to provide OTA reception. Putting up a transmitter gets you on "basic cable" in that area even if the reception is crap. Hence here in Ottawa some transmitters are in cheap rather than good locations and, since the transmitters aren't all co-located, you need an antenna rotator and a tower if you want to get all the OTA stations.
</rant>
Fortunately there is the internet.... although the speed sucks in rural areas.
What's so obsolete about it? It's not like you can measure things in degrees Celsius that you can't in degrees Fahrenheit. If you wanted to get REALLY serious, you should be pushing for Kelvin instead.
After all, Brits are still using miles, stone, and Roman rail gauges. If they can't go all-in, why should we?
I do not think it is cable companies that want this so much.
It is OTA broadcasters that want to compete with cable companies.
The USofA was one of the first nations to finalize digital broadcast television standards, so they used some fairly ancient technology specifications to do so. Those nations that converted much later got to use much more efficient and advanced standards.
180 million households in the USofA sometime after 2021. $100 coupon per box. 5 boxes per household. Add in an equal amount of administrative overhead. $1000 per household. $180 billion dollars. Where are we going to get that? Reduction to two boxes per household? Still $72 billion dollars.
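That arithmetic checks out as stated (taking the quoted 180 million household figure and the "equal administrative overhead" doubling at face value):

```python
# Quick check of the coupon-cost arithmetic quoted above.
households = 180_000_000   # figure quoted in the comment above
coupon = 100               # dollars per converter box
admin_multiplier = 2       # "an equal amount of administrative overhead"

def total_cost(boxes_per_household: int) -> int:
    return households * coupon * boxes_per_household * admin_multiplier

print(total_cost(5))  # the $180bn figure
print(total_cost(2))  # the $72bn figure
```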
... to hold on to their model of locked-in viewers and content.
It won't take long before people realise you can buy any TV once and plug in a PC in a set-top form factor to get all the TV you want, with no more forced upgrades, artificial limitations or ridiculous charges to achieve the same quality people on the internet get for free.
Steps to implement on my ancient plasma TV:
1. Cancel AT&T U-verse (yes, I'm one of the holdouts who actually like it**)
2. Sign up for high(er)-speed DSL service on the SAME LINE. See, I only have a choice for home internet between AT&T (twisted pair) and Comcast (coax). Had Comcast once; not going back for ANY reason. Pretty sure either company will make me pay dearly for internet-only service, given what I'm paying now as PART of the U-verse.
3. Sign up for various streams: Netflix, Hulu.
4. Oh, and buy an antenna for the TV -- never needed one since I got it; had U-verse since April 2008.
So, maybe I'd break even on the monthly costs, and maybe I won't. I figure it works now, so why "fix" it? If I do it later and need a new TV/display due to ATSC 3.0, GOOD -- I can retire the old plasma!
** The neighborhood wiring is from 1993-94 and in poor shape. The techs have been improving the junctions at various neighbors' houses between me and the main digital head-node, but when they do -- including when houses sell and new neighbors move in and AT&T comes out for new service -- my line goes full-dark for a few hours. Wouldn't be so bad if the DVR would still play independently, but alas. And the equipment mysteriously degrades over time: I've gone through 3-4 DVRs and 4-5 modems/gateways/Wi-Fi & Ethernet hubs. But overall it works, and actually getting better.
(In that same timeframe, I've had only ONE tiny Vonage VoIP box. There was a second, prior unit which suffered a lightning spike that passed THROUGH Comcast's modem/Ethernet hub without affecting the hub but frying the Vonage box AND my old laptop's Ethernet port.)