I don't sense any urgency at all, either from broadcasters or viewers. The only urgency seems to be from manufacturers, who want us all to buy new tellies every three years.
The European TV industry's uneasy shift to Ultra HD came under the spotlight at an industry confab organised by Astra satellite owner SES last week. [Image: Monty Python foot descending on a UHD TV. Caption: One step at a time: UHD TV isn't on every broadcaster's To Do list.] Although after listening to TV manufacturers, researchers, market analysts and …
Virtually all of the UK-based HD channels are broadcast in 1080i. Nobody in the UK really bothered with 720p, as it wasn't much of a step up in quality from 575i. However, 720p is very popular in the US, as they were lumbered with 480i, so many US imports shown in HD in the UK today are upscaled 720p copies.
"Nobody in the UK really bothered with 720p, as it wasn't much of a step up in quality from 575i."
Maybe that holds true for broadcasters but as an end user with a 720p plasma in the living room who gets to see the difference between SD and 720p daily, I can guarantee you the 720p looks much better. I would buy a 1080p TV but I'd have difficulty justifying it to the wife until the plasma blows up. I've tried encouraging the cat to piss down the back of it (the telly, not the wife) but no luck so far.
@jaywin - you're incorrect in your assertion that 1080i is much better than 720p. Those letters are important, not just the headline numbers. An interlaced signal has reduced vertical resolution compared with a progressive one, so 1080i does not have 1.5 times the vertical resolution of 720p. Moreover, interlaced signals are harder to code than progressive, leading to more artefacts for a given compression ratio. Even worse, flat-panel displays cannot display interlaced pictures natively, unlike a CRT, with the result that you get de-interlacing artefacts, only mitigated by complicated processing which has other side-effects.
Interlace was a clever 1930s idea to reduce large-area flicker on CRTs displaying a low frame rate, and has no place in the digital era. It was commercial pressures that favoured 1080i in a VHS/Betacam-type scenario.
Also, 1080i is lower bandwidth than 720p: just 540 lines shown in each field, against the full 720 lines of every progressive frame.
I read many years ago that it was the reduced bandwidth the providers liked, as they didn't need to rent so many frequencies from the satellite providers.
720p had no traction in Europe either as acquisition format or deliverable.
Indeed not. When you have had PAL and SECAM for several decades already, 720p doesn't bring much to the table, especially if you get a bit carried away with compression. Some EU countries even made use of PAL+ to deliver widescreen over analogue, although in the UK I think only C4 did that, and probably only for films.
They pioneered HD by monopolising a lot of the content, then charging a fecking fortune for the box and the service. Satellite is a reasonable method for delivering TV content, but no better or worse than cable or broadcast IP in most situations (obviously some people will be better served by a particular provider in some locations). Download-on-demand / streaming UHD is not likely to cause the sky to fall in: not much content, coupled with slow uptake of sets and improving compression, should mean that connections evolve fast enough to keep up for the majority of folks. Plus, the fall-back from UHD to 1080 shouldn't be drastic, so marginal connections won't be terrible. This sounds like marketing bollocks, so essentially just marketing.
At least on Freesat you can get BBC4 HD, which is just about the only channel with anything worth watching. (Still loads of repeats, to the extent they now have to put 'New' in the title so you can spot the original stuff.) I don't know why they even bothered with SD, it seems to be worse than the analogue channels DTV replaced. Big step backwards!
Freeview SD is quite capable of producing a decent picture - and indeed at one time did, except on the squashed 544*576 services. However, at least to my eyes, the pictures have become softer and softer as the compression has become "more efficient" to the extent it looks rather like paint on soup.
These days you're lucky if you can even see the turd, never mind the reflections from its surface.
Over-compressed 4K is still a better image than a sharp, compressed 1080p image (by the numbers). It's the extreme ends of the scale, pushed to meet costs/profit margins, that do the real harm.
The good quality content (as many others have posted here) can benefit from the extra clarity and definition (see nature programs etc which show really good detail and crisp images).
The local cable monopoly here (Rogers aka Robers) like to call their service "Digital Quality" and then imply that digital is perfect. Given the choice of keeping the current quality or adding another channel, they always go for more channels. A big sporting event might look like 720p, but the less popular stuff looks more like VCD.
I don't disagree with the content being largely crap. But there is so much content that, if I were to watch all of the 1%, I wouldn't have much of a life left.
So, for practical purposes there is enough quality TV for me. Just starting on Breaking Bad (55 episodes to go!), GoT season 4, House of Cards 3 coming up, at least 3-4 BBC series worth considering. More choices than time. Those may not be your shows, no, but most people with some form of good taste need not watch Idol, Kardashians or Glee.
Can't say I care overmuch about UHD, but I'd rather watch a good show in 1080p than 720, if available on both. When UHD 50"s are <$1300 and there is abundant content I'll see if I want to upgrade.
Soon every TV you can buy will be 4K. It'll make absolutely no difference to you if you're going to bung 1080p into it, but it'll be there. I'm pricing up a new TV for the back room, and it'll be a 4K one, simply because the TVs with all the features I do want are now all 4K.
Don't need it.
I still don't even have HD, and I don't think it would help.
When I'm watching TV, my most common thought is "There's too much CRAP on this thing."
Crap in HD (or UltraHD) is still crap.
Every time this comes up, I'm reminded of Bruce Springsteen's "57 Channels (And Nothin' On)".
Up the quality of the content, then we can talk about better picture quality.
On top of all of that, many of the shows I do watch would be WORSE in HD (or UltraHD.) CSI (in all its variations) and other detective/police shows show too much gore as is. Higher resolution gore would make me have to quit watching such programs entirely.
PS: If you (broadcasters) want to use more bandwidth, try broadcasting in multiple languages - that'd be of more use than ultraHD. The shows here in Germany are often American programs that have been dubbed in German. I'd like to be able to switch them back to the original audio - but they either broadcast just in German, or in multiple German editions. Just gimme the original language ffs.
You're unlikely to get the original English soundtrack, I'm afraid. A lot of German channels have historically been broadcast in the clear on satellite. And, unlike the UK, where we have restricted beams that are supposed to minimise overspill, that's not generally the case.
Certainly last time I tried it, you could point a dish at 19.2 from the UK and get all sorts of European stuff, broadcast in the clear, including dubbed versions of US shows.
And while that's possible, you're very unlikely to get the English soundtrack - it would make those broadcasts appealing to many, many more people across the continent. A German broadcaster can buy, say, CSI dubbed into German, and will pay less for that right than if they were - effectively - broadcasting the original version on a pan-European beam.
If, say, the Czechs were able to watch a German broadcast and simply select the English soundtrack, then their broadcasters would feel less need to license the same show dubbed into their own language. And heaven forbid the makers should miss out on some of their cash!
SD? I don't see why you'd want that, or sound. I just draw flip cartoons in the corner of ex library books. Keeps me amused and I'm guaranteed to like the content (who could argue with classics like bouncing titties or the ever popular hula girl). Plus I can do it backwards and upside down and get 4x the content.
>> The shows here in Germany are often American programs that have been dubbed in German. I'd like to be able to switch them back to the original audio - but they either broadcast just in German, or in multiple German editions. Just gimme the original language ffs. <<
I did a student exchange UK to Germany in 1986 and I was amazed that the cable TV they had could do exactly that - English and German soundtracks, switchable at the press of a remote control button. Good to see after all this time and massive advancements in tech that they've forgotten how to do this...
So long as broadcast requires any cooperation with the people from TV Licensing (who keep insisting on placing offensive adverts on BBC Radio 4), who seem to be the only vendor in the world who thinks they are entitled to repeatedly phone me at work to question why I have chosen not to partake of their service, you can count me out. I buy my content on my own terms.
I don't have a lot of experience watching 4K content, but for the home setting, from my experience it's useless. On a normal consumer-sized TV most people would be hard pressed to tell the difference between 720 and 1080 content from their couch, let alone 4K. It's a waste of useful bandwidth imho.
Indeed. Best case for decent eyesight is about 1 arcminute. At 3m from the screen (you won't be much nearer a TV, even in a cramped British living room) that is around 0.9mm per pixel, or a 1080 screen around a metre high. At an aspect ratio of 16:9, that is a diagonal of 1.9m.
So you need a TV with a minimum 75" screen to see pixels at 1080 / 16:9 / 3m viewing distance. In fact, I watch a 720 picture on a 100" diagonal on a projector that (2nd hand) cost less than 100 quid from eBay and I can't see the pixels when I'm watching my rPi playing TV (although I can make them out on the OpenElec screen when I'm choosing files).
Contrast ratio, colour gamut and frame rate are all far more important for image quality. The only reason to welcome 4K resolution is for computer displays.
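That back-of-the-envelope geometry is easy to check. A minimal Python sketch, assuming 1-arcminute acuity and a 16:9 panel (illustrative figures, not a vision model):

```python
import math

def resolvable_pixel_mm(distance_m, acuity_arcmin=1.0):
    """Smallest pixel pitch (mm) a viewer can just resolve at this distance."""
    return distance_m * 1000 * math.tan(math.radians(acuity_arcmin / 60))

pitch = resolvable_pixel_mm(3.0)            # ~0.87 mm per pixel at 3 m
height_m = 1080 * pitch / 1000              # screen height where 1080 rows just resolve
diag_m = height_m * math.hypot(16, 9) / 9   # 16:9 diagonal from that height
diag_inches = diag_m / 0.0254               # ~75": any smaller and 1080p pixels vanish
```

Which reproduces the figures above: roughly 0.9 mm per pixel, a metre-high screen, and a diagonal just over 75 inches.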
TV, oh yeah, that's the thing in the lounge hooked up to a console that we use for watching stuff that's streamed or downloaded right?
I remember when you used to use them for watching stuff that was beamed to everyone, but you were stuck with whatever the broadcaster thought you wanted to watch, when they wanted you to watch it. Glad that's not around any more.
Not sure what the downvotes are for.
UHD's last best hope is as a TV/console/PC screen in the lounge. A projector is probably better, but might be too much fuss. As has been said, there's no point having hi-res when the compression pixelates the screen with any movement. PCs, on the other hand, tend to do things right, but even then the graphics cards to drive PC games at those resolutions are not in Joe Public territory yet.
Still, it might be like HD is now - there's not much content, but may as well get the better screen if the cost differential isn't too much.
There's no point. Under normal viewing conditions, most people can't tell 720p from 1080p, so 4K provides no benefit at all. It's like the 96kHz 24-bit recordings promoted to audiophiles: current technology matches sensory acuity, so any further fidelity is wasted. Humans are not bats, and nor are they eagles.
The only people who care about 4k are gadget-geeks who value the latest tech for its own sake and manufacturers desperate to kick off a new upgrade cycle now that the HD upgrade stream is drying up.
People can't tell the difference between 720p and 1080p (2.25 times the pixels), so something with nine times the pixels of 720p is pointless?
Surely logic would suggest that it's because most people can't tell the difference between the two "HD" resolutions* that 4K is needed to really boost resolution to make a visible improvement?
* I'd question the claim about identifying 720p from 1080p - I know I can, it just doesn't look as crisp.
"Surely logic would suggest that it's because most people can't tell the difference between the two "HD" resolutions* that 4K is needed to really boost resolution to make a visible improvement?"
No. It's to do with the fact human eyes are limited in the amount of detail they can detect at a certain distance. For any given screen size there is a particular distance where the eye can begin to tell the difference in detail between different resolutions.
This page will help explain it: http://carltonbale.com/1080p-does-matter/
As an example, if you want your eyes to be able to pick up enough detail to tell the difference between 1080p and 4K on a 50" TV, you'd have to be sat less than 7 feet away from it. Most people (in the UK at least) will find the ratio of distance vs screen size doesn't work for their living room, hence 4K is pretty pointless as a mainstream home technology.
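The 7-foot figure can be reproduced from the same acuity assumption. A rough sketch (1-arcminute eye, 16:9 screen; illustrative only):

```python
import math

ARCMIN = math.radians(1 / 60)   # ~1 arcminute of visual acuity

def max_useful_distance_ft(diag_in, rows, aspect=(16, 9)):
    """Farthest viewing distance (feet) at which this many pixel rows are
    still individually resolvable; sit closer than this and more rows help."""
    height_in = diag_in * aspect[1] / math.hypot(*aspect)   # screen height
    pitch_in = height_in / rows                             # pixel pitch
    return pitch_in / math.tan(ARCMIN) / 12

d = max_useful_distance_ft(50, 1080)   # ~6.5 ft: beyond this, 4K adds nothing on a 50" set
```

Past roughly six and a half feet the eye can no longer separate 1080p's rows on a 50" screen, so the extra rows of 4K are invisible.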
4K enthusiasts can argue all they want with anecdotal evidence that they could tell the difference at 20 feet away. They really can't, it's human biology.
4K enthusiasts can argue all they want with anecdotal evidence that they could tell the difference at 20 feet away. They really can't, it's human biology.
Your point notwithstanding, people *will* see a difference between 720, 1080, and 4K transmissions. And it's nothing to do with the display, for the reasons you quote.
There is a bit budget for any transmission stream. The more compression you use, the less bandwidth required - but, for any given codec, the lower the quality of the image. And so it is with broadcast - the higher resolutions are simply given more bandwidth. So the difference you will see is that the bigger-number stream has less obvious compression artefacts.
There were rumours that at least one broadcaster turned up the quantisation when HD was coming in so that the HD streams would look visibly better. I couldn't comment, obviously...
As someone who is only allowed to buy one TV every 10 years, I will certainly be buying into 4K quite soon, and I'm fairly sure in about 6-7 years time I'll be "researching" the next big thing because the tech has moved on again, and again, and again... (Unfortunately I'm too good at researching TVs, last 2 sets have easily lived beyond 9 years!!)
UHD will require at least twice the bitrate of HD*, so you'll get at best half as many channels in the same spectrum. When surveyed, users want more content over better quality. It's how the Freeview channels get away with 544x576, a frame size designed for 4:3 content rather than 16:9 (keep the horizontal sampling that gives a 16:9 picture 720 pixels, and a 4:3 picture needs only ~540 of them, hence 544).
* Assumption - that HD continues to use H.264/AVC while UHD uses H.265/HEVC. UHD has four times as many pixels as HD and the bitrate tends to scale linearly with number of pixels. It's hoped that HEVC will eventually achieve a 2:1 improvement in compression over AVC for approximately the same visual quality. It will probably take many years, though - right now, the result is little better than AVC.
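The footnote's arithmetic, spelled out (the 8 Mbit/s HD figure is a made-up illustration, and the 2:1 HEVC gain is the hoped-for target rather than a measured result):

```python
hd_pixels = 1920 * 1080
uhd_pixels = 3840 * 2160
pixel_ratio = uhd_pixels / hd_pixels           # 4.0: four times as many pixels

hevc_gain = 2.0                                # hoped-for HEVC-vs-AVC efficiency
hd_mbps = 8.0                                  # illustrative AVC HD channel bitrate
uhd_mbps = hd_mbps * pixel_ratio / hevc_gain   # still twice the HD bitrate
channels_lost = pixel_ratio / hevc_gain        # each UHD channel displaces ~2 HD ones
```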
I think you will find users want better content, rather than more content or quality. Sadly this is misunderstood to mean there should be more channels of utter pish, rather than the available revenue being spent on fewer channels with worthwhile content.
Also WTF is it that broadcasters/ISPs will spend billions on sports coverage and not nearly as much on creating worthwhile programs in other areas (arts, drama, comedy, science/education, etc)?
I applaud this just because it means we are starting to see 4k monitors at tolerable prices.
For PC use having a big 30-40" monitor in 4k would be great as the resolution would be usefully delivering the equivalent of 4 * 15-20" HD monitors but without the division and physical arrangement problems. Great for all sorts of things beyond speciality video!
Been thinking about this for a while, even though I have a perfectly decent 1080p 3D set. I checked out the offerings at my local purveyor and was impressed by the upscaling capabilities and the native demo offerings. I have purchased the set on the basis of the 14-day no-quibble return policy, but also the smart functionality was really good. LG with responsive WebOS, quad-core driven, will allow me to watch Netflix and YouTube content without turning on the Xbox. Must save myself some money in electricity. And was only £200 more expensive than the best HD set.
was impressed by the upscaling capabilities
So you've paid £200 more than the "best' version of something you already have, just so that you can watch content in the same resolution, but upscaled to fill more pixels?
I have this really excellent bridge here, very cheap, and it almost reaches the other side of the river, too...
What a ridiculous overhyped statement! It is impossible for 4K to be a bigger advance than HD, because as you move up in resolution there are obviously diminishing returns. 8K may be better than 4K, but would be an even smaller jump than HD to 4K....nevertheless, I'm sure this clown will be hyping 8K once 4K TVs are fully commoditized.
I am feeling pretty jaundiced about technical 'progress' in TV since colour so, in a way, he could be right. In any case I don't think there is much to choose between the contenders.
My preference though would go to the change from analogue to digital techniques, not the variety of digital technique.
Also, it is just a stupid statement full stop. As we become capable of squeezing more pixels into an LED display, matching those additional pixels with more from the source rather than an upscale is at best an evolution rather than a revolution.
I mean if you want tangible improvements, look at increasing the effective frame rates of the broadcast.
To be fair, when industry people talk about UHD, they generally are meaning more than just pixels, and the roadmap from groups like the DVB for future phases of UHD is very much about things such as increased frame rate, wider colour gamut and high dynamic range.
And, having seen all of those things side by side back at IBC then I certainly wouldn't claim it as a stupid statement. Combining these technologies really does provide a startling improvement in picture quality, compared to the leap up to HD.
Of course, most of those demos at IBC are carefully controlled, and very few of them are actually on your typical broadcaster's stream that's been compressed to buggery in order to maximise shareholder value, so whether what we'll eventually see in our homes for UHD Phase 2/3 is anywhere near as good is certainly up for debate.
But UHD is not just about 4K. If it's done properly, it is about all those other aspects too. The fact that they are not all fully standardised yet is, in my view, a good argument for not buying a 4K screen just yet. Wait until you know that both screen and connectivity have been adequately specced for the finalised UHD standards. To do otherwise may be a little like buying an early 'HD' TV that turns out to have a weird resolution, no H.264 decoder, and only a DVI connector on the back.
Most people don't care what resolution it's broadcast in. They have a 4K telly so think they are watching 4K. Like how people thought an HD telly magically made all broadcasts HD...
Which is where all the "I can't tell the difference" bollocks comes from. My old man, nicely into his retirement years, can easily spot the difference between HD football and not on his TV, and will buy blu-ray over DVD if available for the higher quality. He's the exception because he knows roughly what's going on, so he knows to match the source to the screen.
If the source isn't up to the screen, then it isn't worth it. Also, as others have said, 4K makes little difference to TV media in the living room on even fairly large screens.
Video games, on the other hand? Bring it on, bring it on, bring it on.
I'd like to say that no one watches TV for the quality of the picture, but I actually know someone who does*. I don't see much point in a 4K TV; in fact I barely see much point in 1080p. The only (legitimate) source I can think of that can deliver the sort of bandwidth necessary to produce a picture that pushes a 1080p display to its limit is Blu-ray, and even that isn't completely true for action-rich scenes. If you look at "HD" content from any streaming source, satellite, or over the air, the quality is so low you might as well be watching 720p or less on some of the highly compressed channels. Unless they can invent some more bandwidth (or cut out the rubbish) there's no point in 4K.
* He's wasted a ton of money buying old films on Blu-ray and getting 4K equipment etc etc, but it's clear the Blu-ray version of most of the old films is just the same as the DVD scan. Possibly with a static side-by-side comparison you could see a difference, but once things are moving it all looks the same.
TV pictures are good enough. Anything that requires greater resolution needs to be watched on a PC with zoom/ff/rewind etc.
I'm not sure what advantage is to be gained from watching cartoons in 4K, but if you are spending your time noticing the address on the letter on the mantelpiece then it's time we spent 1/10th of what is spent on TV research on script writing and country-based canned laughter timing.
But, when we see what people are happy to watch on phones and tablets, and more importantly what they are watching, I'd say that for 90% of viewers TVs are 'Job Done'.
I've seen it all before. TV broadcasters vied with each other to launch shiny new 1080p channels, then in short order (after the novelty had worn off) they one by one decided that they could make more money by using the bandwidth for multiple 720p channels.
Bandwidth isn't free. It must be paid for by capturing as many eyeballs as possible. I expect history to repeat itself.
If you want 4K, you can get it right now by playing games on your PC (hardware permitting).
1080i is less bandwidth than 720p, which is why 1080i is used for HD.
1080i sends 540-line fields, each interlaced with the next, so at 30 frames per second that is 60 fields of 540 lines, weaving into 30 full frames.
720p sends 720 lines in every frame, so at 30 frames per second that is 30 complete frames of 720 lines each.
I'm always amazed at how many people don't understand the difference between p & i.
1080p is the current gold standard and requires roughly twice the bandwidth of 1080i - effectively 2 HD channels - and the providers would rather eke out 2 paying HD channels than 1 for the same money.
Incidentally, in the UK SD is 576i, which is 288 lines per field; by that reckoning 1 HD channel at 1080i is ~2 SD channels, and 1080p is ~3.75 SD channels.
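Counting only active lines per frame interval, as above (this deliberately ignores horizontal resolution, field-rate differences, and codec efficiency, so it's a rough proxy at best):

```python
# Active lines delivered per frame interval: one 540-line field for 1080i,
# one 288-line field for 576i, and full frames for the progressive formats.
lines = {"576i": 288, "720p": 720, "1080i": 540, "1080p": 1080}

# Channel-count ratios relative to SD, per this line-counting argument.
vs_sd = {fmt: n / lines["576i"] for fmt, n in lines.items()}
# vs_sd["1080i"] is 1.875 (call it ~2 SD channels); vs_sd["1080p"] is 3.75
```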
I've a second-hand 19in TV and a second hand 19in monitor. The former is utterly adequate to display such content as I watch on TV, which isn't a great deal these days to be honest, due to a dearth of new quality content. Seeing the news in higher resolution doesn't wow me, and neither does stereo video. Holographic displays - yes, that would impress. Until that's sorted out, it's more good content that I want, as drivel is still drivel in higher resolution.
We've an ultracheap 22" 1080p capable TV (but too old for FreeView HD built in) so we got TalkTalk's cheapest YouView option*. The HD channels are much better than SD, especially for sports. However, I chanced upon the Graham Norton show the other night while channel hopping and it looked AWFUL in HD: the set design and lighting were utterly crap.
* yeah, it's a bit crap if you don't pay for extra channels, but hey, 50p extra per month for a year, then another couple of quid for 6 months, seemed cheaper than a Freeview HD box; and the YouView at least lets you watch iPlayer without using your laptop.
However, I chanced upon the Graham Norton show the other night while channel hopping and it looked AWFUL in HD: the set design and lighting were utterly crap.
Many years ago, back before he moved to the BBC, I went to see one of his shows live. It was very entertaining, but in person the set looked very cheap and tacky. I think it was designed to look good on a standard CRT TV which was still prevalent back then. I suspect the effective video bandwidth of a digital signal is much higher - in theory you could encode adjacent pixels as white and black, whereas an analogue signal was much more restricted (test card with frequency bars, anyone?).
Obviously the set designers need to be updated to deal with modern technology.
I suspect the effective video bandwidth of a digital signal is much higher - in theory you could encode adjacent pixels as white and black, whereas an analogue signal was much more restricted (test card with frequency bars, anyone?).
Actually, it's quite the opposite.
Digital TV is compressed - even prior to transmission, the signal is invariably quantised in the frequency domain, so you will lose all that detail.
Analogue baseband material does not have that problem, and only loses effective bandwidth at broadcast due to noise (per the Shannon-Hartley theorem). Given a clear transmission channel - which was certainly possible with UK terrestrial broadcast - the displayed picture could closely match that baseband image. The effective SNR of a compressed digital signal will necessarily be much lower, unless you're using a very noisy channel (poor antenna, far from source, reflection interference, you name it...)
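The frequency-domain point can be demonstrated with a toy 8-sample DCT. As a crude stand-in for real codec quantisation, this simply drops the four highest-frequency coefficients: a smooth ramp survives almost intact, while an alternating black/white pattern (the "adjacent pixels white and black" case above) collapses towards grey. A sketch only, not how any real encoder is tuned:

```python
import math

N = 8  # one row of an 8-sample block; orthonormal DCT-II / DCT-III pair

def dct(x):
    """Orthonormal DCT-II of an 8-sample row."""
    return [
        (math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N))
        * sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) for n in range(N))
        for k in range(N)
    ]

def idct(X):
    """Orthonormal DCT-III (inverse of the above)."""
    return [
        sum(
            (math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N))
            * X[k] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
            for k in range(N)
        )
        for n in range(N)
    ]

def lowpass_rms_error(x, keep=4):
    """Drop all but the `keep` lowest-frequency coefficients; measure the damage."""
    X = dct(x)
    X = X[:keep] + [0.0] * (N - keep)
    y = idct(X)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / N)

ramp = [0, 36, 73, 109, 146, 182, 219, 255]    # smooth gradient: energy at low frequencies
checks = [255, 0, 255, 0, 255, 0, 255, 0]      # pixel-level detail: energy at high frequencies

err_ramp = lowpass_rms_error(ramp)       # small: the gradient barely changes
err_checks = lowpass_rms_error(checks)   # large: the fine detail is simply gone
```

The ramp comes back with an RMS error of a few grey levels; the checkerboard row loses most of its AC energy and reconstructs as near-uniform grey.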
“photochromic Reactor Light spectacles”
I think you possibly mean “Reactolite™”
Also, they only react to UV light; I hope 4K TVs aren't going to be spitting out enough UV to make them react. If they are, I'm pretty sure we'll actually get a tan off of them.