Quantity not quality
Broadcasters are only interested in quantity not quality. Look at the number of low bit-rate mono DAB stations.
Only 6 per cent of broadband homes are "moderately" or "highly likely" to buy a 4K TV, and 83 per cent of consumers are completely unfamiliar with the term Ultra HD. These are the major findings from a new report from The Diffusion …
Or compare BBC on Freeview to the first days of OnDigital. I am sure BBC1 (in SD) looked better in 1999 than today. Back then BBC looked very DVD-like compared to the over-compressed mess we now have.
Or early BBC HD on Freesat to BBC1 HD and BBC2 HD now on Freesat.
All worse in the name of cramming more channels in.
Or compare BBC on Freeview to the first days of OnDigital. I am sure BBC1 (in SD) looked better in 1999 than today.
I suspect you're looking at it on a better and larger telly than in 1999 so any artefacts introduced by compression are more obvious. I find the artefacts are more of a problem than the lower resolution when comparing SD and HD.
Actually it was the same TV for most of that time, and the picture was already poorer before DSO. About a year before DSO I went HD, and my HDTV is good at upscaling.
A 32" Wega IDTV, so well able to get the best from the transmission, and in those early days it got a much better picture, higher bit rates etc.
Upscaled SD looks fine on my HDTV, just not as good as HD, especially back when the BBC used about twice the bit rate they use now.
"Or early BBC HD on Freesat to BBC1 HD and BBC2 HD now on Freesat."
A TV channel occupies a frequency with a bunch of other channels which together form something called a mux. The mux is a transport stream with the audio, video, and data of all these channels mixed together. The channels in the mux all share a fixed bandwidth, and software/hardware attempts to compress each channel in real time to make the best use of this space. Programmes in the channels might be hinted, and channels might be weighted for quality.
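To make that concrete, here's a toy sketch in Python of how a statistical multiplexer might divide a fixed mux bitrate between channels - the channel weights, complexity figures and capacity are invented for illustration, not real broadcaster numbers:

```python
# Toy statistical multiplexer: shares a fixed mux bitrate between channels.
# Channel weights, complexity figures and capacity are invented for illustration.

MUX_CAPACITY_KBPS = 40_000  # total payload of the transport stream

channels = {
    # name: (quality weight, current scene complexity estimate)
    "BBC One HD": (1.5, 0.9),   # weighted up, busy sports scene
    "BBC Two HD": (1.5, 0.3),   # weighted up, static studio shot
    "News":       (1.0, 0.5),
    "Shopping":   (0.5, 0.2),   # weighted down for quality
}

def allocate(channels, capacity_kbps):
    """Split capacity in proportion to weight x current complexity."""
    demand = {name: w * c for name, (w, c) in channels.items()}
    total = sum(demand.values())
    return {name: round(capacity_kbps * d / total) for name, d in demand.items()}

if __name__ == "__main__":
    for name, kbps in allocate(channels, MUX_CAPACITY_KBPS).items():
        print(f"{name:12s} {kbps:6d} kbps")
```

The point being: a channel's quality at any instant depends on what everything else in the mux is doing, not just on its own content.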
Channels can move from one mux to another as space is freed up. Channels on Sky / Freesat are constantly moving around. The BBC tends to keep its channels in the same mux but it still moves them. Until a few years back space was very constrained but new Astra satellites have been launched to supply more capacity.
The BBC is also known to have switched hardware encoder because they were dissatisfied with the performance of their encoding at the bitrate.
So anyway, even assuming there was a difference (and you'd have to have stream captures from then and now to say for sure), there are other reasons that quality has changed. Personally I think the quality of Freesat on BBC channels is generally excellent.
I only watch Freeview when I'm in a hotel or something, but in SD it's always falling to bits under heavy load. In part that's due to using MPEG-2 and in part due to more channels being squashed into fewer muxes. Freeview HD implements DVB-T2, which is more space efficient, and HD channels use AVC. So picture quality has the potential to improve substantially over what it was.
Or early BBC HD on Freesat to BBC1 HD and BBC2 HD now on Freesat.
IIRC they've dropped the resolution from 1920x1080 square pixels to 1440x1080 rectangular pixels, plus dropped the bit-rate from north of 20Mb/s down to about half that.
The excuse was parity with Freeview HD, with the BBC saying it was not going to favour one particular platform on quality.
I bought a couple of 50-inch 1080p LCD TVs for Cdn$386 [!!!] each (US$360 equivalent, plus some taxes to fund our 'free' health care), including shipping [!!!]. Brand name is Changhong, or Chonghang, or something. Actually what happened is that I bought one, nervously, and when it arrived it was diverted by Her Indoors to a place of pride above the mantel. We thought the picture was good enough that I immediately ordered another one for the original destination. The picture has poor black levels, and it's only an 8-bit panel. Otherwise - LOL - picture quality is fine. But seriously, one can watch the usual HD helicopter-landscape shows and the images are drop-dead gorgeous. I've seen $1000+ TVs that were not as good. And the power consumption of this TV is extremely low, off the bottom end of the Energuide/EnergyStar scale. And it has speakers that are loud enough for day-to-day use.
Anyway - on to the subject at hand: 4K TVs.
The same website via auction site had a similar 4K 50-inch TV, brand name Seiki, on (at times) for $600 with (at times) trivially cheap shipping (like $5). They're back up to $780 right now, but the message is clear...
The Chinese are here. TVs are going to get cheap-as-chips. And they'll have the Black Levels sorted out by next year.
What benefits?
1) Unless you are sitting about 2m from an 84" screen (4m from 160") or larger, there is little point in more resolution.
2) Where is the content?
3) Hardly any broadcast or streaming service delivers full quality at the existing 1920 x 1080
4) Who wants an 84" screen in an ordinary room? A motorised screen and motorised zoom on a projector makes more sense above 40" to 56" for most homes.
Most HD content made for TV is still framed for SD, usually far too close in framing.
"1) Unless you are sitting about 2m from a 84" screen (4m from 160") or larger there is little point in more resolution."
Speak for your(visually challenged)self, I am lucky that I have vision within the normal range for humans and so can see a difference at much greater distance than that. Perhaps you actually need to a) get your eyes tested, or b) actually go see a 4K screen showing 4K content.
The BBC recommended viewing distance for UHD 1 ("4K"), for maximum visual resolution, is 1.5x screen height. For ordinary HD it is 3x screen height. An 84" 16:9 TV has a height of ~41", or 1.04m so you'll get best quality from a 4K 84" screen at 1.5m, and beyond 3m you'll see no more benefit from 4K than from ordinary HD.
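For anyone who wants to check that arithmetic, a quick Python sketch (the 84-inch figure is just the example above; 0.0254 converts inches to metres):

```python
import math

def screen_height_m(diagonal_inches, aspect=(16, 9)):
    """Height of a panel, in metres, from its diagonal and aspect ratio."""
    w, h = aspect
    height_in = diagonal_inches * h / math.hypot(w, h)
    return height_in * 0.0254

d = 84  # example size from the discussion above
h = screen_height_m(d)
print(f'{d}" screen height: {h:.2f} m')
print(f"UHD-1 ('4K') distance, 1.5 x height: {1.5 * h:.2f} m")
print(f"HD distance, 3 x height:             {3.0 * h:.2f} m")
```

That gives a height of roughly 1.05m, so about 1.6m for the 1.5x rule and about 3.1m for the 3x rule, in line with the figures above.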
"The BBC recommended viewing distance for UHD 1 ("4K"), for maximum visual resolution, is 1.5x screen height."
And the BBC isn't biased in having this position at all, no sir, no way is it concerned about limiting the cost of bandwidth required to carry UHD signals and justifying the level of compression it will use, like it did for HD.
Actually the calculation is based on being able to differentiate between two vertical lines separated by a single pixel and is pretty much correct in that respect. However, I don't tend to watch many TV programmes or movies that comprise only vertical and horizontal lines. It requires a much higher resolution to accurately reproduce a 30° inclined, or curved edge, without visible anti-aliasing.
(FYI, this is also why smartphone screens have gone beyond Apple's 'retina' limit which was based on the same miscalculation.)
Speaking as someone who actually has an 80" TV (non-4k) in the lounge, anyone sitting closer than about 5ft from the screen will have much more serious problems than differentiating pixels. Preferred viewing distance is 10-14 ft. You can however comfortably separate pixels @1920/1080 from about 2-3ft away with good eyesight.
First things first, you *need* HD or blu-ray content to make the most of it - low res imagery looks horrible as the pixels are blown up to massive proportions. Particularly noticeable when streaming poor youtube content. The difference between SD and HD content from Sky is orders of magnitude, but there just aren't that many good HD channels.
Secondly, gaming when closer than about 6-8ft away (which fills your vision) is pretty much a recipe for motion sickness. The brain has issues coping with that much stuff changing that quickly in your peripheral vision. Just moving a mouse around up close is a headache. I end up doing something else while the controller batteries recharge.
The problem is though, now we all have (relatively) good quality TVs, what do the production companies do? "Ooh, look, we can blur the whole screen except for the lips of the person who is speaking, let's do that all the time.", or "Ooh, look we can use a depth of field 1mm deep and have someone whose face is side on with their nose in focus and their ear out of focus, let's do that all the time". Or "Let's shoot everything against a bright window/light with lens flare if possible, and no fill in lighting so the faces are grainy, if you can make them out at all"
These effects are good, and have a place, but I feel sorry for the set dressers/costumiers who take the time to make a scene authentic and then all you see in the final piece is a blurry background that could have been anything, and don't get me started on "fight" scenes/car chases that consist of people doing not very much at all, while the camera is waggled around violently - it might be cheap, but it is no match for a properly choreographed and well shot scene.
Then there is the sound - Sound effects and music really really loud - speech really really quiet and with the treble turned down so it is hard to make out, often both simultaneously. Oh, and don't forget to get the actors to mumble and whisper incoherently, especially when their faces are turned away so you can't lip read either.
Sadly, no amount of decent telly fixes those, but if they transmitted stuff that demonstrated how good things could be, rather than transmitting blurry crap that looks equally bad on anything, they may have a chance in pushing the tech. Sorry, unintentional rant over. Mine is the one you can hardly see, hanging in front of that window with the sun streaming through it.
@Badvok
You're almost definitely wrong, and let me explain why, as this is simple physics/biology.
What is normal range? Let's say "good vision" is 20/20 (some people have better, say 20/10), but let's run with "good". 20/20 vision is a visual acuity of about 60 pixels per degree of vision. This means that at a distance of 1.5x the height your total view is 37 degrees; on a 2k (1080p) screen that's 30 pixels per degree - i.e. even below-average eyesight can see the pixels. On a 4k screen that's 45 pixels per degree, and unless you have "good" (above average) eyesight you probably won't be able to pick out the pixels.
So..... if you look at a 1080p (2k) screen from a distance of twice the height you end up with about 60 pixels per degree; in other words, 20/20 vision cannot pick out the pixels. For a 4k screen that's 90 pixels per degree - even 20/10 vision would in reality struggle to identify a difference, as it's on the limit for 20/10 vision (the best vision ever measured is around 20/8).
60 pixels per degree is a theoretical maximum for 20/20 vision; more correctly, that is if each pixel is a contrast, i.e. could you identify a line one pixel wide. But films don't consist of one-pixel lines, they're more likely to be "moving pictures", so the ability to identify a static pixel doesn't really mean much in practice.
Both physics and I agree that you're wrong, either that or you have the vision of a hawk (20/2).
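For anyone who wants to check the numbers behind this, the formula is just vertical pixels divided by the vertical angle the screen subtends - a rough sketch, assuming a flat screen viewed on-axis:

```python
import math

def pixels_per_degree(vertical_pixels, distance_in_screen_heights):
    """Angular resolution of a flat screen viewed on-axis from d x height away."""
    fov_deg = math.degrees(2 * math.atan(0.5 / distance_in_screen_heights))
    return vertical_pixels / fov_deg

for d in (1.5, 2.0, 3.0):
    hd = pixels_per_degree(1080, d)
    uhd = pixels_per_degree(2160, d)
    print(f"{d} x height: 1080p ~{hd:4.0f} ppd, 2160p ~{uhd:4.0f} ppd")

# Taking ~60 ppd as the usual 20/20 figure: 1080p reaches it around 3 x height,
# 2160p around 1.5 x height, which lines up with the BBC figures quoted earlier.
```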
@"No, I will not fix your computer"
"You're almost definitely wrong, and let me explain why, as this is simple physics/biology."
Ah, yes the old, "can't see the pixels argument" - who on earth wants to see individual pixels? It is the visual effect that gets better with higher definition. The real world is made up of curves and angles and these appear better/clearer/sharper if the display doesn't need to anti-alias them. Higher definition means less anti-aliasing = better clarity.
However, if you really are one of those people who likes to look at all the pretty individual pixels then your argument is sound.
@Badvok
>>Ah, yes the old, "can't see the pixels argument"
Hmmm... maybe I should have explained more simply, let me summarise, then feel free to go back and re-read the technical bits;
As soon as you can't make out the individual pixels, any further increase in resolution has no benefit;
For 20/20 vision (good vision), looking at a 1080p/2K screen at any distance over 2x the height of the screen you can't identify individual pixels, so if you're any further away from the screen than twice its height it is physically impossible to tell whether it's a 2k or a 4k screen.
So, for a PC monitor where the distances are closer, a 4K screen might be practical; for a home TV, unless you're really close, or it's really big, 4K could be pointless. Note: Sony (who produce a lot of digital cinema equipment) have done several studies on this in relation to digital cinema, but the principles are the same (and with very big screens viewed at significant distances it's a little easier to understand).
@El Reg - why not do an article on this? With pictures and everything? 4k is a bit Emperor's new clothes for home video (bragging rights aside).
As soon as you can't make out the individual pixels, any further increase in resolution has no benefit;
Just because you can't see the individual pixels doesn't mean they have no benefit. If you are too far away to see an individual pixel on a screen or dot in a print, you end up perceiving the average result of those pixels/dots. A higher resolution results in an average that is more accurate to the original source.
>>Just because you can't see the individual pixels doesn't mean they have no benefit. If you are too far away to see an individual pixel on a screen or dot in a print, you end up perceiving the average result of those pixels/dots. A higher resolution results in an average that is more accurate to the original source.
Yes and no. The interpolation/antialias effect you're describing is very real. However, for this implied averaging to be relevant you must be able to achieve an effect not possible with the pixels on their own; as pixel size itself is no longer relevant (as discussed above), that leaves only colour and intensity. With a 24-bit (True Colour) palette that's about 16 million colours, and given normal humans can "only" identify approximately 10 million colours, it's actually irrelevant whether the screen is more accurate - normal humans can't tell the difference between 24-bit (true colour) and anything higher (i.e. deep colour).
I say "normal humans" because people (women and other people with two X chromosomes) who have an extra cone (tetrachromacy) can see more colours (yellow-orange as I recall), but that's quite rare. And given that it's an extra cone (not just better fidelity), it wouldn't be an even colourspace, so interpolating for a tetrachromat wouldn't be the same; i.e. the normal 24-bit (8x8x8) couldn't simply be boosted to 27-bit (9x9x9), as that just boosts fidelity on the traditional cones - you'd actually need to add a channel for the extra cone (8x8x8x8), which means you'd need to film in four colours and carry that extra channel per pixel.
So, yes, absolutely - an increase in pixels to make up for low bit depth can be useful, but not really relevant for 24-bit displays.
Personally, I don't see that my personal experience of a film is going to be much improved, if at all, by tiny perceptual differences like that unless I have a dedicated home cinema room. Even then, I'd gladly swap some sort of techno-fix for quality of writing, acting, plotting and screenplay. Measuring film quality by pixels strikes me as a strange game.
It's reckoned that at 2m or so, you need a 55 inch screen or more to perceive any difference in 4K. As I sit 2.6m away that makes for something more like 75 inches. Far too big for my room. Really something for those with dedicated home cinemas.
nb. the "streamers" ought to think about upgrading their HD bandwidths. If they want to use 10mbps+, then that's probably going to make a bigger qualitative difference to the "viewing experience" than heavily compressed 4K.
On the colour depth, with current 1080 TVs, many have deep colour or wide gamut, but the source, even in blu ray is not up to it. When setting up mine all the advice I read said to turn off the deep colour options because it was a waste, if not a harm. Blu ray players often handle deep colour, so do the TVs, but apparently most, if not all discs do not have the encoding. There's no advantage as the extra colour bit depth is not there. Deep colour settings on non deep sources can look a bit odd. Something you may notice with a deep colour monitor used on a computer and just go browsing the web. Colours too heavily saturated is often the case.
There's no advantage as the extra colour bit depth is not there.
You are correct that broadcast television and Blu-ray's codecs only support 8 bits per channel, i.e. 24-bit colour.
Where extra colour bit depth is useful is if your screen is upscaling material, such as SD up to native HD resolution, as the extra colours of 30-bit or 32-bit mean smoother scaling (it can tween colours) and thus less banding and similar artefacts.
This obviously doesn't work if, say, you use your Sky HD box to upscale SD channels to 1080i in 24-bit before it sends them to the display, which is why picture-quality fans have the Sky HD box set to AUTO output so the screen can better scale (with more colours) SD to HD.
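A tiny illustration of the 'tweening' point (values picked arbitrarily): interpolating between two adjacent 8-bit values lands between the steps the source can express, and only a deeper output can keep that in-between value.

```python
# Why more output bits help when scaling: upscaling interpolates between
# neighbouring pixel values, and the in-between results need finer steps
# than the 8-bit source had. Values are arbitrary, just for illustration.
src = [100, 101]                      # two adjacent 8-bit pixel values
halfway = (src[0] + src[1]) / 2       # 100.5 after interpolation

print(f"8-bit output : {round(halfway)}")             # forced back to 100 or 101
print(f"10-bit output: {round(halfway * 4)} / 1023")  # 402, a genuine in-between step
```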
If one pixel has 256 levels of intensity (8 bits), and you have a cluster of 4 pixels that you can control individually, doesn't that only give you 1024 levels of intensity (= 10 bits)?
10 bits per channel. Multiply by three for red, green and blue channels.
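That back-of-the-envelope sum checks out - four 8-bit pixels give 4 x 255 + 1 = 1021 distinct average levels, roughly 10 bits per channel:

```python
# Distinct average intensities achievable by a 2x2 cluster of 8-bit pixels.
# Each pixel holds 0..255, so the cluster sum ranges 0..1020.
levels_per_pixel = 256
cluster = 4

distinct_sums = cluster * (levels_per_pixel - 1) + 1   # 1021
print(f"distinct cluster levels: {distinct_sums} (~{distinct_sums.bit_length()} bits per channel)")
print(f"with three channels (R, G, B): ~{3 * distinct_sums.bit_length()} bits total")
```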
This is exactly the same argument that happened when SD was being supplanted by HD.
At the time, we had a really nice 53-inch SD TV capable of awesome picture quality (within the limits of 480i). Our satellite receiver would receive SD and HD signals, and down convert the HD to 480i for viewing on our big SD TV.
Here's the thing. The HD transmissions, even when down-converted to 480i SD, *always* looked better than the SD transmissions. Even on a 480i SD TV.
Same thing with 4K. We now have an Apple TV gadget connected to our 50-inch 1080p TV. We can go onto YouTube and search on "4K" videos. They always look better than the normal HD videos, even when down-converted (presumably at YouTube) to 1080p.
Conclusions: You don't need to have a 4K TV to reap the benefits of videos made in 4K. There's more to video quality than just the pixel count at the display. I wonder if the people making 4K videos are simply going to more beautiful locations?
pretty much this
HD is already in a very good spot, so much so that it becomes hard to justify the price difference between a good 'regular' HD set and the cheapest 4K one.
Unless they come down in price 4K isn't going to go anywhere
I agree on the price bit.
It's one of those "if it's for free, I'd not turn it down" things.
4K looks very nice, and gives a softer, clearer image, because as a matter of fact the TV has more to actually work with to create an image.
But is the return on the investment there; is the result worth the extra cost? It's almost there.
The thing with any supposed improvement in TVs is that you rarely see it demoed in the place you'd actually be likely to use it, i.e. your front room, making any improvements very difficult to gauge as you are so familiar with your current system in this particular context, viewing distance etc, and the mind tends to edit out inadequacies over time; it looks 'good enough'.
My TV until 3 years ago was a 50 quid Argos 14in CRT job that must have been pretty much the last model sold in the UK that had a CRT. The tube was actually slightly squint, but after a couple of years the mind had edited this out and probably done a bit of image enhancement to boot, so the whole thing seemed quite acceptable. I eventually persuaded my girlfriend a new 32 inch LED was a good idea by the simple ruse of buying one, begging for forbearance and insisting she wait a week before decapitating me for the loss of what at first appeared to be half the living room space. By the end of the week, the new TV had shrunk to normal proportions and she admitted - as did I - to being rather ashamed we'd put up with the other for so long.
So anything higher resolution is unlikely to get much of a shout until the current set packs up or the price has dropped to the point that replacing it is a virtual no-brainer. And even then it won't be of the kind of size touted in the article, because we really have reached the physical limit of what the room will take, and the smaller models would put the proper viewing distance needed to get the benefit too close to be comfortable.
Additionally, as someone points out further up the page, the amount of compression for most UK broadcast TV takes a lot of the shine off the quality in any case. Anything not on one of the main channels tends to have the look of a bad webpage JPG circa 2001, with sound to match.
"So anything higher resolution is unlikely to get much of a shout until the current set packs up or the price has dropped to the point that replacing it is a virtual no brainer."
Yup, my parents had an ancient 17" CRT that they'd had second hand off my grandmother.
That packed up the week before Christmas last year so an expedition to a suitable retailer was arranged with my little brother (on account of him being in at the time) and a 50" LG Smart TV duly appeared.
In fairness, at that size, the HD Freeview channels are noticeably better than their SD counterparts. It's also nice to have all the iplayer/Netflix/Lovefilm-AmazonFilm stuff embedded. Nothing I couldn't set up with an RPi but my parents much prefer just driving it all off one remote rather than having a Wireless keyboard, etc and introducing them to Raspbian, etc. And clicking a button is simpler than jacking in their laptop to an HDMI, setting up the right screen type (mirroring/extended desktop/etc - always defaults to the one you don't want at that moment!), having to wake it up when it goes into hibernation halfway through the film, etc.
Suffice to say, that TV ain't going anywhere until it physically dies. Not for 4K, not for nothing.
All they need now is something decent to watch...
Absolutely. If 4k takes off, I'll probably pick up a second hand one in 4 or 5 years time for buttons.
Got a 4 year old 50" Panasonic 1080p telly last year for €300 for example. Apart from having a rather chunky bezel, it's great.
On the other hand, the people I saw marching out of ASDA a few years back with cheap 40" LCD TVs labelled as HD Ready will see an enormous difference no matter what they buy. On closer inspection, those no-name HD Ready TVs were 640x480 panels (I kid you not!!)
"Unless they come down in price 4K isn't going to go anywhere."
Nope. Unlike every other consumer electronics kit in history, 4K TVs are *never* going down in price. Well, except maybe for example that reasonably nice 50-inch 4K TV I saw being temporarily offered for Cdn$600 with nearly free shipping. Good reviews too. Too bad I had just stocked up on 1080p sets. But other than that, the 4K TV prices will *never* go down. Well, that's assuming that the Chinese don't start making such TVs, ...as they've already started. So, nope, never going to happen.
Lower power consumption, built-in fast wi-fi and a good range of ports. These are the sort of things I'd be looking for in a new TV. Would not pay one penny for 4K over HD. (I was going to put something in about the quality of the screens themselves but hope this will improve no matter what)
How long before we see a 4K 3D TV?
You mean volumetric. I suspect true 3D TV will first appear by borrowing a trick from the CRT days: rapid refresh. The main obstacle to getting a volumetric display done with a spinning LED plane is the refresh rate. To achieve a 30Hz volumetric display with 360 voxels circumferential resolution, the planar elements need to be able to refresh themselves at least 5400 times per second (to cover a 180-degree sweep in 1/30 of a second).
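The arithmetic behind that 5400 figure, assuming a double-sided plane so half a turn paints the whole volume:

```python
# Rough refresh-rate estimate for a swept-plane volumetric display.
# Assumes a double-sided plane, so a 180-degree sweep paints the whole
# volume once; numbers are the ones quoted above.
volume_rate_hz = 30          # how often the whole volume is redrawn
angular_positions = 360      # circumferential voxel resolution
sweep_fraction = 0.5         # 180-degree sweep per volume refresh

refreshes_per_second = volume_rate_hz * angular_positions * sweep_fraction
print(f"plane must refresh ~{refreshes_per_second:.0f} times per second")  # 5400
```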
@ This Side Up
"No, so-called "3D" TVs are merely binoccular. Move your head and the image doesn't change. You'd need holographic tv for real 3D."
Binocular is "real 3D". It's not as if you're going to get up of the sofa and walk around to see the back of the newsreader anyway, is it?
My 80-year-old father bought a 3D TV a couple of years ago. Shortly afterwards I read about software for creating 3D photos, and he started off creating 3D photos, and then bought a Sony Bloggie3D camera, which, while not a particularly good camera, allows him to create 3D video as well as 3D photos.
When you move your head left to right while viewing those 3D photos, you very definitely see items in the background "move" in the opposite direction. It's a quite convincing effect, though obviously limited to
Given the paucity of 3D content available, the TV makers may have missed a trick by not hyping the availability of 3D photography, and making fairly basic dual-lens cameras available - they don't need to be multi-megapixel devices, anything over 1920x1080 is wasted anyway.
Static stereoscopes are nothing new. I once viewed a topographical photograph using an old-fashioned stereoscope. Both implements were in the neighborhood of 50 years old. Stereo photography still exists, but it's more of a specialty field since you need both the stereo camera and some form of stereoscope. TVs are not well-suited for this because of the flicker (which would still exist for still photography).
I believe the next hype-level inbound is '8k'.
The hype is that at 8k your eye cannot tell the difference any more.
For me, I just think 8k has nice natural-extension-upwards ring to it.
Curved screens are a weird TV hype. 'Here's a screen where you *cannot* see all the action from all your furniture'. Really??
...I'm more interested in the quality of the programming rather than the number of shiny dots in front of my eyes. Cease the race to the bottom and deliver good shows (be that drama, documentary or whatever). Even the likes of Horizon are now little more than dumbed-down vacuous bullshit; hell, Nat Geo has "Ancient Nazi Alien Ghosts" or such cobblers on it as serious programming. Pathetic.
When I can find an HD TV that is *JUST* a TV, with no walled-garden, spying-on-your-network "SMART" TV wank, I'll buy it. Or a SMART TV where I can install a new OS and make it actually SMART and serving me, rather than the OEM.
"You could just not plug it into the network?"
So I should pay for crap I don't use? No. How's about they just don't put the crap in there? I have no issue with the TV showing me local network content; it's the reporting back to big brother and only letting me consume specific on-line services that I don't want.
"So I should pay for crap I don't use?"
No. Someone else pays for the crap you don't use. Once you've designed and tested a universal telly, the cost of taking out features and re-testing exceeds the savings of doing so. It's the same reason Intel put instructions on their chips that most programs can't even see, let alone want.
I think there is a bit of user fatigue with tech becoming obsolete and "needing" to be replaced regularly. A lot of households (me included) still haven't even gone full hd yet. I still have a 720p tv and will only replace it when it breaks - and I'm a tech enthusiast.
I think a full hd tv, possibly with 3D, will do me for a replacement unless 4K sets are at a comparable price. But then my TV is 7 years old and my previous one from the same manufacturer lasted me 15 years so maybe it'll be 8K, or 16K, or direct brain connection by then.
You can see this weariness in other tech sectors too, there is no longer the same drive to upgrade your smartphone every 12-18 months that there once was.
quote: "But on a 4k set it glistens more realistically, and you can see every hair on the flies' legs."
HDTV channels like Sky are broadcast at 720p, so unless you actually have a 4k source you really won't see a difference between a 1080p set and a 4k set. That's why nobody is buying 4k sets, there is literally no need because none of the source tech (Sky boxes, BluRays etc.) is set up for more than 1080p.
I'd consider buying a 4k display for my computer (games would look good in 2160p, although I'd probably need to go SLI to keep a decent framerate), but as a TV set it just isn't necessary yet.
Yes! And a fair bit of user fatigue with the price gouging too.
Our primary set does the full 1080 while the 720 has been relegated to the rarely used basement (it has DVI inputs which tells you how old AND short lived the model was). But given how much cable wants to charge for "full" HD, most of the stuff we watch is still in the old size format. If I can't get "full" HD at a reasonable price from the cable company now, why in all the realms of Hades would I buy a 4K unit?
Forget this whole idea of a "TV" whether 'smart' or dumb - what I would love is a decently-priced mid-size (28-inch max) 4K *monitor* that I could then connect to the local content-source of my choice (which will rarely-if-ever include 'broadcast media').
Personally, I don't want big screens - they're obtrusive and take up space which some of us would rather have occupied by that rather-more-traditional viewing technology known as "windows" - through which I can watch the outdoor wildlife in real time without any cheesy soundtrack or narration.
[Last night there were three fox-cubs playing who-dares-get-closest-to-the-hose].
> would rather have occupied by that rather-more-traditional viewing technology known as "windows" - through which I can watch the outdoor wildlife in real time without any cheesy soundtrack or narration.
Unfortunately the view out my window doesn't include spectacular helicopter shots of the Australian outback, the Fjords of Norway, or even the more interesting Bronze age archeological sites from rural Britain.
They look dandy at 1080i on the 10 foot projection screen though.
Just bought a Samsung 40HU6900 40 inch TV for work - I do a lot of CAD, so a big monitor is very handy. Post World Cup prices are plummeting - at release in early May it was £1000, now it's £639. We bought from John Lewis for £729 inc 5 year guarantee (we have a ton of first-generation 2560x1600 panels with faults where they overheat and fail: worth paying a bit extra to avoid early adopter risks like this)
It's a nice display - there are two main issues. One is that the graphics cards haven't caught up: the TV has HDMI 2.0, but there's no hardware that outputs that. The alternative is DisplayPort 1.2, which uses a hack called multi-stream transport (MST) to pretend that it's two displays to the GPU. The Samsung doesn't have DisplayPort, so I'm on 3840x2160 30Hz at 4:2:x on my late 2013 Retina MacBook Pro 15". When a suitable HDMI 2.0 GPU comes out I'll use that on my Linux box. The chroma downsampling is slightly annoying, but it's OK. I'm not bothered by 30Hz as I don't game. This was about the only affordable 40" panel I found: the alternatives were a variety of 28" TN models, which I suspect all have the same panel inside. A 28" 16:9 UHD panel would have been worse than the 30" 16:10 2560x1600 panel I previously had, hence the reason to go for 40". It also has 4 HDMI inputs, while the previous one only had 1x dual-link DVI, so I had 3 monitors on my desk for different machines.
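For what it's worth, the arithmetic behind why pre-2.0 HDMI forces that 30Hz / chroma-subsampling compromise is simple enough - pixel payload only, ignoring blanking and line-coding overhead:

```python
# Back-of-the-envelope link bandwidth for UHD, pixel payload only
# (ignores blanking intervals and line-coding overhead).
def payload_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

print(f"3840x2160 @60Hz, 4:4:4 24bpp: {payload_gbps(3840, 2160, 60, 24):.1f} Gbit/s")
print(f"3840x2160 @30Hz, 4:4:4 24bpp: {payload_gbps(3840, 2160, 30, 24):.1f} Gbit/s")
print(f"3840x2160 @60Hz, 4:2:0 12bpp: {payload_gbps(3840, 2160, 60, 12):.1f} Gbit/s")
# Older HDMI (pre-2.0) tops out below the ~12 Gbit/s the first line needs,
# hence the 30 Hz / chroma-subsampled compromises mentioned above.
```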
The other issue is that, being a TV, it's laden with crapware. What TF is 'football mode' and why TF would I want it? There are also tons of 'smart' (arse) features that I don't want, like a web browser and apps. However, by not connecting the TV to the internet most of these mercifully don't work. But more annoying are the 'picture improvement' features, which just serve to mangle the picture. I think I've turned most of these off now - the worst was something called 'Motion Plus', which was a special 'blur all scrolling or typing' feature. The most important feature, the 'Source' button on the remote control to select input, works OK - a few more clicks than the usual monitor button, but I'll survive.
The other useful feature would have been picture-in-picture, but that only works if one source is TV. It's also slightly reflective (less than my Mac, but more than its predecessor), and doesn't have an adjustable stand. If this annoys me sufficiently I may find another VESA-mount stand. I put it on the power monitor: depending on backlight brightness it takes a fairly constant 70-150W, dropping to 50W in 'where's my signal gone' mode and 450mW in standby. Poking about with other picture settings didn't change the power numbers.
So, in summary I'm about 80-90% happy. For the money it's a decent monitor, but be prepared to turn lots of stuff off to make it usable.
For most consumers, it's all about the size of the TV, not its quality. Indeed, it may be a truism to say 'the smaller the house, the bigger the television'. Where incomes are low, picture size is much more important than picture quality.
Manufacturers have improved colour and contrast. Resolution is improving slowly. To make the Ultra HD experience truly outstanding both the refresh and frame rates need increasing. We are not seeing much improvement in these areas, so far.
The 'blurring' of moving objects detracts from the viewing experience. Until this is addressed, the increase in take up of 4K (and 8K) television is unlikely to be rapid.
Frame rate is not a manufacturer problem, it's a content/broadcaster problem.
As a PC user, I prefer higher FPS. It's purely a "what your eyes are used to" thing, no different than moving from monochrome sets to colour sets.
While it's true that it looks "strange", I'd put that down to filming techniques needing to catch up, as a higher FPS needs a different way to film (as a side note, HD needed them to stabilise the camera more; colour meant you needed to make sure nothing was washed out/over-tinted).
There is no such thing as a new, good, big, cheap TV; you have to drop at least one.
Some people go new big cheap.
Careful buyers go new good cheap or good big cheap.
Enthusiasts go new good big.
I would not buy a TV which was not good in the picture department, hence my TV was not cheap, mine is not the biggest but big enough for that film effect, great for films and games, also decent stuff off TV.
Would I swap for a cheap 55" TV?
Never.
Yeah, but from what I read on El Reg the other day, we control what you see too. I'm not sure why you put up with it. You ought to be able to fund your own productions. But I guess things have gone downhill since The Bard was in his heyday. But hey, that's your choice not mine.
But neither is 4K on pay-TV, as we’ve yet to see a 4K channel appear on satellite or cable.
There are barely any Full HD 1080p broadcasts, at least in the UK. As far as I'm aware Freeview has none, and Sky HD doesn't seem to do any Full HD channels; to my knowledge all the HD channels are 720p. I have a Full HD TV and the only time that gets used to its fullest is the occasional Blu-ray. Even in the World Cup, where TV tech gets pushed, there were only 3 games broadcast in 4K and none of those were available in the UK.
It's not just the price, it's pointless buying a TV that is 2 generations better than what is being broadcast, Netflix excepted. I think the TV manufacturers may have to realise that we just don't want to upgrade our TVs like we do our phones and they have overused "the next big thing in TV" to the point that the public just aren't interested in more TV technology. The marketing equivalent of crying wolf, we've had 'Widescreen', 'Digital', 'HD', 'Full HD', '3D' (people tried 3D but didn't care about that either) - most people have just got HDTV or Full HD TVs and have no intention of throwing them straight away.
most people have just got HDTV or Full HD TVs
Most people assume that because their TV says "HD", everything they watch is in HD. They have so little clue about picture quality that sales of UHD will be purely based on pub bragging rights, and they're still too expensive for that.
Sky in the UK delivers 1080i60 (at least that's what my telly and Wikipedia say). That's an interlaced 1920x1080 image at 60 fields a second, so two successive fields make up a full 1920x1080 image, effectively halving the frame rate (most televisions do some form of de-interlacing on such an image by combining the 'odd' and 'even' lines into a single frame, and actually displaying it at half the field rate).
This means that in most cases, provided that the original was shot at 30 frames per second (and most made-for-TV programmes are), there should be no effective difference between 1080i and 1080p (1080p will transmit two identical frames, 1080i will construct a single frame from two adjacent fields). Of course, any material shot at the full 60 frames per second will suffer de-interlacing artefacts when transmitted at 1080i.
You can also get quantization errors if the original was shot at 24, 25 or some other number of frames a second. There will be some of this type of error whenever the original frame rate does not match the display rate.
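To illustrate what the telly is doing, here's a minimal 'weave' de-interlace sketch in Python - a simplification, since real sets use motion-adaptive de-interlacing rather than a plain weave:

```python
# Minimal 'weave' de-interlace: combine an odd and an even field into one frame.
# Real de-interlacers are motion-adaptive; this just shows the basic idea.

def weave(top_field, bottom_field):
    """Interleave two fields (lists of scanlines) into a full frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # lines 0, 2, 4, ...
        frame.append(bottom_line)  # lines 1, 3, 5, ...
    return frame

# Toy 4-line 'image': two 2-line fields carrying the even and odd scanlines.
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']
```

With static content the weave reconstructs the full frame perfectly; the artefacts mentioned above appear when the two fields were captured at different moments and something moved between them.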
@Peter Gathercole - Sky in the UK delivers 1080i60 (at least that's what my telly and Wikipedia says)
Ah, but is that the Sky box output to the TV connection delivering 1080i60? What is the broadcast resolution of the channel being carried within the Sky box output? See what the telly says when there is an SD channel being displayed; I'll bet it still says it is getting 1080i60 and doesn't change regardless of the channel viewed - it's just a high quality signal carrying a picture of much lower quality.
So the question is - of the HD channels on Sky HD, how many carry a 1080 picture?
All UK broadcast HD is transmitted at 1080i50. 1080p25 content can be accommodated within the same broadcast stream.
99% of UK content is produced at 50 interlaced fields, or 25 frames per second (depending on artistic choice). Only things intended primarily for transmission abroad are produced at 60/30.
Most films (24fps) are sped up, reducing the running time and altering the audio pitch, rather than introducing quantization errors. American (60Hz) programming can either be slowed down, have the frames interpolated, or a combination of both, which is one reason American stuff looks worse on broadcast TV than the Blu-ray release when seen on the same television.
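The 24 -> 25 fps speed-up is easy to put numbers on (the two-hour running time is just an example):

```python
import math

# The 24 -> 25 fps 'PAL speed-up' in numbers: film shown about 4% fast.
src_fps, dst_fps = 24, 25
ratio = dst_fps / src_fps

runtime_min = 120  # e.g. a two-hour film
print(f"running time: {runtime_min} min -> {runtime_min / ratio:.1f} min")
print(f"audio pitch shift (if uncorrected): {12 * math.log2(ratio):.2f} semitones")
```

So a two-hour film loses nearly five minutes and the audio rises by roughly two-thirds of a semitone unless it is pitch-corrected.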
You couldn't slow 60fps down to 50fps without the result being unwatchable. It's always converted, and with modern standards convertors you'll not see any difference when only the frame rate is being adapted, the underlying MPEG stream isn't exactly a simple succession of 30/60 fps anyway. Interpolation really only shows up problems when you're trying to handle changes in the number of lines, like 525->625
The BBC some time ago (at least) transmitted BBC 1 HD with an output that swapped between interlaced and progressive at the GOP boundaries depending on which they were getting a better compression from.
If I'd had enough posts approved, I may have pointed you to this page: http://www.bbc.co.uk/blogs/legacy/researchanddevelopment/2011/04/software-upgrade-for-bbc-hd-on.shtml
The BBC some time ago (at least) transmitted BBC 1 HD with an output that swapped between interlaced and progressive at the GOP boundaries depending on which they were getting a better compression from.
This has now been rolled out to all services on the PSB3 Freeview HD multiplex (BBC One/Two/Three HD, ITV HD, 4hd) and I believe it is also used on COM7 (CBeebies/BBC Four, Channel 4+1, 4seven, Al Jazeera HD).
I don't think it is used on satellite - changing the interlacing mode on a GOP basis was not part of the Freesat or Sky specifications. Doing this caused a problem on early Freeview HD units, and in some cases TVs using external Freeview HD boxes (it depended whether the box passes through the 1080p25 GOPs or converts to 1080i50). There tended to be brief switches to black and audio glitches on mode switches - annoying but bearable on programme transitions, not acceptable when it could switch more than once per second (a GOP is usually shorter than 25 frames)
If you take a step back and look at it, the reason most people bought even 720p TVs has nothing to do with better picture. They changed their TVs because the Feds/Parliament changed the broadcast rules. If you didn't upgrade, your tv was pretty close to useless.
A 35mm film frame is approximately the same resolution as a 4K monitor, therefore unless we adopt 4K we aren't yet seeing films/movies as they were projected in the cinema. I for one WANT to see my films as they were projected in the cinema (minus 30 minutes of godawful adverts and popcorn-munching, phone-using, chatting fools) and therefore, when a suitable disc player is available, I WILL be heading towards the TV shop!
Don't forget having the reels out of order!
I once watched a real piece of sci-fi schlock on DVD. It was a direct from VHS transfer. And when they'd made the VHS transfer they'd gotten the reels out of order. At first you didn't quite notice the abrupt scene shift. It only became evident when it jumped back to the first bad break. Although, the video scroll lines were intermittently evident throughout the entire film. I wish I could remember the name of it, if only to warn others away from it.
The quality of the content matters far more than the quality of the picture. I am still perfectly happy with my standard definition TV, but would like to see more money being invested in high quality programming, particularly the arts and sciences, rather than on glossy pictures full of cheap nonsense.
@jzlondon
I disagree. Sitting down to watch a film and getting that cinema feeling really does matter. I struggle with the Samsung 8000 at full 1080p from Blu-ray. It just looks so plastic. I've fiddled with everything [ahem] on that screen, but it just doesn't look right.
In fact the 720p Pioneer kuro on standard DVD is way more pleasurable.
So it's not the number of pixels, it's what you do with them, as the vicar said.
HDTV became a standard thanks to Blu-ray, and the Sony Playstation3 gave it a huge sales boost worldwide.
Sony managers this time decided to steal money and release a Playstation4 fraud which is not the real Playstation4... In fact the real Playstation4 prototype is still in Sony labs, and it's known to feature a 16-core Cell 2 CPU that IBM built for Sony, as well as a 500GB holographic disc unit....
If only Sony managers had released that top-notch product instead of the ultra-slow AMD Jaguar APU fraud along with Microsoft... then there would have been no issue releasing high-quality 100Mbps H.265-encoded 8K movies on 500GB discs for The-True-Playstation4 users.
So either Sony releases that prototype renamed as Playstation5 immediately and stops with the current fraud, or there is not much hope for UltraHD to invade the market...
> HDTV became a standard thanks to the Blu-Ray and the Sony Playstation3 gave a huge sales boost worldwide.
I rather felt that HDTV became accepted as a standard because it became available at about the time when flat-screen TVs became affordable. The move to flatscreens was a significant benefit to a lot of people: it saved tons of space in the living room and made bigger tellies practical. Digital and HD just happened to be happening at the same time and got a free ride on its coat-tails.
BTW, Sony's version of HDTV was 1920x1200 not the spec we ended up with which is lousy 1080p. Sure looked good in 1990 when they lent me their demo system for a show stand. Doesn't that make it pre-date the original PS?
Main TV in our house is 1080p, and it does have a BluRay player connected over HDMI.
Of course most content is DVD, or MP4 over DLNA even through the DVD player.
But that's definitely the "second" source - the Primary source is a NowTV box - 720p output.
Can I tell the difference - yes. 720p -> 1080p is a step. But for the convenience I'll take 720p any day of the week.
Why would I be looking to replace my TV? I already rarely use the resolution it's capable of, let alone buying another TV whose full resolution I'd never use.
If/when it dies then I'll look at what is on the market, but there would have to be something VERY special to make me change before then.
I'd rather like a multiHDMI monitor with decent speakers. That would attract a premium spend from me, rather than "SMART" apps which I know will never see an update, and which I'll likely never use.
Most of the channels on my Time Warner Cable service were not up to the 1080 TV I used to view them. When they were HD the transmission format was so effed up that you end up fiddling with the screen format to get rid of various black bars, which of course are different on every bloody channel. Just switched to Directv and this problem is no more. However, I see no use for 4K TVs until the broadcasters can get their act together and kick the cable companies in the nuts about how they relay their material, and even then I don't think it will be a market booster, just a niche for videophiles.
Less breathy than usual but still characterised by stating the bleeding obvious while leaving out important qualifiers.
The report cited refers to US households and these are not the same as the rest of the world: for one thing average broadband speeds in Europe are > 16 Mb/s
4k will succeed only if the programming becomes available. This will be possible for films, many of which have been produced digitally in similar resolutions for years now. But films aren't enough. In order to create significant consumer demand, sports programming will have to adopt the new format. AFAIK Sony was trialling 4k at the world cup. Presumably the results are being analysed to see whether studios are prepared to make the necessary kind of investment (cameras, studios, multiplexes, satellites, etc.) to offer it.
Inasmuch as the technology for the chain has not yet been finalised (HEVC and VP9 are both still in development) it's a little early to expect a major shift yet.
But the screens are a by-product of the ever higher pixel density of our handheld devices and as such will enter our homes in the replacement cycle as our current generation of tellies will probably need replacing sooner than the last - our first colour telly lasted around 25 years, the second around 10.
I'm gonna let you in on a little US secret, so don't tell anybody else about this, ok?
Broadband speeds in the US mean fuck all when it comes to TV programming. Whether you're with Comcast, Verizon, Time-Warner or just about any other cable company, they've wired fiber to your house, or at least close enough to your house that the short cable run to your house won't seriously affect the available bandwidth. All they have to do is turn up the spigot. And since the cable stream is treated differently for billing purposes and they control the whole cable experience, they can turn it up or down as they please.
The reasons for slow TV sales are:
- most people who want one have bought a new TV relatively recently, and don't believe the endless hype about newer, supposedly better technologies
- the Register published a technical article a while back, clearly demonstrating that higher pixel resolution has little effect on perceived image quality, unless displaying mainly static images. This is due to the way the human eye/brain combination interprets moving objects. Frame rates are much more significant, so for any given bandwidth, less pixel density and more frames per second will always deliver a subjectively better picture. 4K is nothing but a marketing gimmick intended to appeal to the ignorant, not that that in itself is proof it won't work.
- fewer people are watching TV anyway, hence they don't want to buy one. Content is increasingly poor, and no amount of extra bling will polish the turd.
Yes, we need 96fps progressive; then film at 24 or digital cinema at 48 can be interpolated. We need to migrate to 96fps instead of the mad mix of 24p, 25i, 30i, 25p, 50p and 60p at present.
Going to 96fps will give a huge improvement. Also, a progressive higher-frame-rate source means that 3840 x 2160 interpolation from 1920 x 1080 will look really good, especially if the camera was 3840 x 2160 @ 96fps progressive and down-sampled with anti-aliasing to 1920 x 1080.
UHD streaming or transmission at 25i 50Hz, 30i 60Hz, 50p or 60p is stupid.
The TV manufacturers want a repeat of the 'flat panel' effect. It won't happen.
For most people, TVs are a long-term purchase. Provided they still work, they would not normally consider replacing them.
Flat-panel TVs, once they became cheap enough, shifted the paradigm. People replaced perfectly functional CRT TVs, not particularly because the picture was better, but because flat-panel TVs occupy much less space than a CRT. Couple that with a significantly reduced power consumption for LCD TVs at a time when people were being made energy aware, and the CRTs went down to the recycling centres by the truckload. That enabled people to reclaim space in their living rooms so that the TV was no longer the major piece of furniture it had been, feel good about reducing their energy footprint and, by the way, have 'better' pictures (although I still know people who prefer high scan rate CRT TVs over flat-panels).
This was reflected in how fast CRT TVs disappeared from the shops once flat-panel TVs got to within spitting distance of the price of CRTs. And often, it was not the high-cost TVs that generated the profits. It was the wholesale replacement of hundreds of thousands (millions?) of TVs with low-to-midrange price tags that earned the money.
We won't see this happening again unless there is some overwhelming technology leap that provides a must-have feature. 3D and 4K are not that, and I can't really see anything on the horizon that would. Maybe a virtual floating screen so that you don't even need to dedicate wall space, but I doubt that is within current technology.
Planned obsolescence is the manufacturers' best bet to keep TV sales ticking over (maybe that is why they use such damned poor Chinese capacitors - the single most common cause of TV failure), but I'm sure if it was revealed that this was a deliberate policy, the consumer groups would be up in arms!
I'd say that gets you about half way there. What finished it was switching from NTSC/PAL to HDTV (or equivalents thereof for your particular region).
In my case, the first flat panel I bought was an HDTV-ready set at 720 because I was looking for a larger living room screen. HDTV was still in the early draft stages at committee and HDMI hadn't been invented yet. Then the standards were mandated. At which point we still had a CRT style TV in the basement. About a year after that I went shopping for the largest one I could afford and which would still fit in my car. I looked at the stuff available in the stores and bought one that was at 1080 over 720 because the display picture quality was better. At this point, I have no reason to upgrade beyond the 1080. In fact, I OUGHT to replace the 720 because 1) it's a bit of pain to add stuff to since it doesn't have HDMI ports, 2) it has some sort of factory defect for which a recall has been issued. But it hasn't been high on my To Do list. Right now paying off one more credit card and then saving the down payment for a new car are at the top of my list (driving a 13 year old car at this point. Growing up I never remember my dad keeping a car more than 5 years).
Having ponied up for an HD TV I was shocked at how widely the quality varied on a single channel. It's more than the expected differences in source material and seems to involve different degrees of compression at different times of day. A bit of a con if you ask me.
4K will be another con like HD, unless 4K also includes a minimum bit rate.
HD simply defines the pixel dimensions. If the bit rate is not good enough, it breaks up into a blocky mess. However, because this blocky mess is still 1920x1080 it's still called HD.
HD can look terrible, and often does on Freesat, as soon as there is movement. Without a high enough bitrate HD is woeful, and I fail to see how they can deliver 4K through the same infrastructure that fails to give decent HD.
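One crude way to see the problem is bits per pixel - bitrate divided by pixels delivered per second. A sketch with illustrative bitrates, not measured ones:

```python
# Crude 'bits per pixel' comparison; the bitrates below are illustrative only.
def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1e6 / (width * height * fps)

examples = [
    ("Generous HD",            18, 1920, 1080, 25),
    ("Starved HD",              6, 1920, 1080, 25),
    ("UHD at HD-ish bitrate",  18, 3840, 2160, 25),
]
for name, mbps, w, h, fps in examples:
    print(f"{name:22s} {bits_per_pixel(mbps, w, h, fps):.2f} bits/pixel")
```

The last line is the worry: push four times the pixels through a similar pipe and each pixel gets a fraction of the bits, codec improvements notwithstanding.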
I have nothing against progress in screen technology, but I would rather have seen the focus being on improved contrast ratios and more accurate colour reproduction along with improved scaling and IVTC.
Higher frame rates and resolution are nice, but there's no point putting the cart before the horse.
Why did my Samsung 1080p monitor with a tuner cost me 400 $currencies and the same 1080p Samsung TV cost 2000 $currencies back then?
Why is history repeating itself, with 4k monitors costing 400 $currencies now and 4k TVs costing 4000 $currencies?
If you want a 4k TV, look first in the COMPUTER MONITORS section. Perhaps they will slip in some model that doesn't cost an arm and a leg and does everything a TV does. My cable company provides HDMI-cabled set-top boxes, which means you don't even really need a TV as such; any monitor with speakers does the trick.
Of course there will have to be 4K content for the sets to be of any particular interest. And apparently an enhanced Blu-Ray format that would permit 4K content to be distributed economically is not yet available.
But as soon as that happens, 4K sets should become of interest as soon as people feel they can afford one. Otherwise, they will be primarily of use to doctors viewing digitally-stored X-Rays.
...has circuitry that inverts the volume ratio of spoken dialog to background music* and sound effects, and automatically knows to leave it off for older material and on for new stuff, but knows about exceptions to the new stuff such as USA's "Burn Notice".
Bring out one of those and we'll start re-arranging the furniture and taking a crowbar to the wallet.
*especially when it's a cut from an album they want to sell you.
I remain totally unconvinced by all and any discussion of the "demand" for this (and frequently any other) new technology. It seems to me that companies within the media technologies industry have been desperately flailing around for the last few years looking for any new thing they can manage to hype enough to revive their flagging revenue streams. Blu-ray; HD; 3D; 4K; curved screens (yeah, right.... getting a little desperate there, aren't we, guys?) - lots of things they can do, nothing that the market as a whole seems to particularly want or need. And certainly nothing in the way of new tech that the market is going to avidly embrace while the vast percentage of available content doesn't support it.
I just bought a 50-inch Samsung HU8550 for $1800 (after $700 instant rebate) to replace the ancient 42-inch analog 720p plasma set that came with my new house (burn-in and crazy pixels galore). This unit is certified Netflix 4K compatible, but I fully expect the primary 4K content will be from projecting photos. A 4K TV has 8 megapixels and is ideal for that purpose. The price is about double what an equivalent 1080p unit costs today, and most likely the price will fall down to the current level within a year or two, at which point no one will buy a 1080p model, just like no one buys SD or 720p today.
The key is to buy a set with HDMI 2.0, HDCP 2.2 and HEVC/H.265, which only became available in 2014. To qualify for the UHDTV 4K label, TVs also need 10-bit color, which is not yet widespread.
Consoles. Consoles have been, and will remain, miles behind PCs for proper high-resolution gaming.
If the PS4 and Xbox had been up to the job of using a 'current gen' 4k TV, or even 3D, demand for newer TVs would have been through the roof.
I'll be getting a 4k TV when I redo my living room later this year, hooked up to a gaming rig. But for normal TV viewing? I can't see the point.
This was the issue when HD TVs first approached affordable. I bought my first in 2006 and it was 4 years before I had any decent HD content displaying on it, by which point TVs were cheaper and better quality.
I bought my newest HD TV last year for nearly half what I paid for my first ever one. It is better in all ways and has Freeview HD. It has taken 7 years since my original HD TV for me to become fully content with HD viewing.
I did see a Samsung 4K demo on a curved TV and I could see the difference. It looked spectacular. Judging by the first HD 1080 demos though, I know it will be at least another 4 or 5 years before we can get anything approaching 'demo quality', either in terms of cost (people won't be as willing to swallow the jump up in price from Blu-ray to 4K video as they were from DVD to Blu-ray) or broadcast source (I won't pay Sky or Virgin to get 1 or 2 4K channels). When decent 4K panels are sub-£800, 4K is available on Freeview and 4K video disc players cost less than £100 to buy, I'll consider it.
I was in the Brisbane Sony Centre the other day picking up my SRS X9 and Sony's latest and greatest 4K TVs were out on display showing 4K content from the world cup.
Firstly more pixels does look nicer. Whether you are standing close up or further away.
But what I thought could really improve things and make the TV more like a window onto reality would be a much higher frame rate. The detail was really good on the footballers at 4K but the movement was still flickery.
I work at a major distributor in the US and the 4k sets are selling VERY nicely. (Curved sets, not so much) This is happening, right now. People ARE buying them, as in most cases, the 4k sets are selling for what a top quality 1080p set would have sold for last year. As far as content, we are seeing Netflix streaming 4k content RIGHT NOW, including their top watched shows, "Breaking Bad" and "House of Cards". YouTube also has TONS of 4k native content, including documentaries, "indie" films, movie clips and trailers and more.
So, poll all you want, nay-say all you want, but these ARE selling...like it or not.
So the poll, from a reputable company with multiple data points, is wrong from the perspective of your single data point, with a vested interest in not being laden with stock you cannot sell without massive discounting. Hmmm.

If 'one' is buying a quality TV now and the price is not too different, one might as well buy 4K as a means of future-proofing. Just as not so long ago one might as well have bought 3D (still not much content outside of Blu-ray). However, at a screen size practical for many (most) homes one is buying pixels one cannot see (although in a chain of MTF processes one will still perceive some improvement). As everyone is pointing out, currently getting half-decent 1080 broadcast content is problematic, let alone 4K. By the time 4K content is abundant, the 4K TV you buy now will be poor in comparison to then-current models. Just as the LG "SMART" HDTV I bought 3 years ago is getting dumber by the second as its ability to support newer or reconfigured services diminishes (for which Roku sells a cheap fix).

Which basically sums up as: right now there is no compelling reason to buy a 4K TV, but don't let me stop you; personally I suggest saving your money and going to the pictures instead.