Music still sounds better in some ways/cases on (good) speakers than on (good) headphones.
Similarly, watching video on a (good) large screen is better, in some ways, than on a (good) small one.
Samsung has said that it intends to focus on UHD TVs using LCD panels augmented by quantum dot technology, instead of pushing OLED as a commercial replacement for LCD. This is the kind of decision that might mean that OLED never takes off in the larger form factors. Kim Hyun-Seok, the head of Samsung’s TV business, told …
As you say - SOME cases. But not to the extent that most people bother buying dedicated HiFi equipment any more; they just use their phone or PC. TVs are still in the "everyone has one" phase, like HiFi was 20 years ago.
I'm not so sure TVs are on the way out but it's certainly a possibility.
I'm not so sure they're not.
I (and an increasing number of acquaintances) use a number of devices for video viewing, and they all interface with the TV in the media room (remember when it was simply the living room?), bedroom, etc.
The TV is simply used as a video terminal, with audio piped to the nearest convenient sound system. The only reason the various TV units haven't been replaced with flat-panel monitors is that nobody seems to make 55, 65 or 80 inch monitors at anywhere near the price of a TV set.
I may be unusual, in that we haven't had an aerial or cable point for at least 10 years (OK, we DO have a Foxtel IQ HD box, but it's wired into the network like everything else, so if I wanted to watch Big Brother live on my Android tablet, I could, but then I'd have to kill myself), but I do see a definite trend away from using the TV as anything except 'another display device'.
A programme which proves beyond doubt that actual real people sit around with a glass of wine and watch TV together. It's a social glue that sticks families and friends together, and it will never be replaced by 5 people sitting around watching X Factor on their individual screens.
Even if this does come to pass it will not be the equivalent of the current experience and we will be the poorer for it - which is why I don't think it will happen.
What is this "bulk cheap batch" you think Apple is getting? Hopefully you don't mean for the Apple Watch....that's a small screen and its sales figures will be dwarfed when you add up all the Android phones using OLED screens.
Unless Apple is going to finally introduce the fabled TV, and it will be OLED, and they sell several million of them, Apple isn't going to be moving the needle in the OLED market. In fact, they've stayed well away, due to its issues with brightness, color accuracy and manufacturing consistency.
We've been promised OLED TVs for bloody ages now, always 1-2 years away (much like fusion, only in decades). Instead, we got "LED" TVs that are actually boring LCDs backlit by LEDs - which are SO much better than classic LCDs (no they aren't) - and IPS LCDs - which are SO much better than classic LCDs (only compared to LCDs...). If they give up on OLEDs and pull the same fast one with QD technology, ending up offering "QD-backlit LCDs" instead, I'm just gonna flip the fuck out...
For sure, it's very hard to get excited by terms like "QD-backlit LCDs", but they are only marketing shorthand for tangible benefits that can be expressed in numbers, such as:
I'd be interested in the technologies that can provide a huge dynamic range in the displayed brightness - but to get the benefit then the content would have to be shot, processed and delivered on HDR kit.
For now, just reflect on how cheap room-filling televisions are these days. The communal experience is still valid, be it for a good movie, co-op video games or a thought- and discussion-provoking documentary.
'apart from anything else the latter is always going to be easier to sell to Currys punters on the basis 4K is a bigger number than 1080.'
I actually witnessed this in Currys at the weekend - an elderly couple in buying a 55" TV for their 10-year-old grandson for Christmas ('it's what he wants'), being told to go the 4K route as 'loads' of programmes are already being broadcast in that format... in fact, it's the 'HD standard these days'.
When Grampa - who could not see any difference between the 1080 and 4K demo pictures - questioned this, the not-quite-hip-enough-for-Apple-Store assistant told him it was probably due to his glasses (!) and that the higher resolution was better to have, as 'more is better'.
OLED life is poor. They are not "real" LEDs in the sense used in massive auditorium panels, but a combination of phosphors and screen-printed, diode-like electroluminescent dots.
Also, the colour isn't as good as real LEDs (only Sony has one, and only as a demo). RGB LED backlights and improved filters mean that high-end LCDs are as good as, but cheaper than, OLED.
OLED life is getting better all the time and is now good enough for most of us: we used to have tellies for > 10 years but I suspect that the norm is now about 5 years. You have to be prepared to tweak anything to get the colours okay.
RGB backlights and filters increase the complexity (massively so for 4k) and cost of LCD with the hope that scale will reduce this over time.
OLED scalability is based on the printable dream. If that ever happens then it will overnight become cheaper than LCD. Obviously, at the moment Samsung can't get good yields on large panels and is concentrating on screens for phones and tablets.
So you prefer LEDs being filtered through an LCD crystal? And a colour filter? (The LEDs are white, or as near as possible. TVs are nothing like a jumbotron display; they do not use individual LEDs per pixel. The LEDs are merely a different light source and replace the old CCFL tubes.)
I don't think you know much about TV tech..
The main issues with OLEDs have been lifespan and differential ageing (blue dies a lot quicker than the others).
This has got a lot better.
LCD TV is just terrible: they suffer viewing-angle colour shift and crap contrast (they use loads of tricks to try to fix this, but they're all rubbish), grey scales are rubbish and show banding, and there are numerous other problems.
Now they have phased out plasma, the choice without OLED is shit TV pictures (yes, I know the programmes are too!)...
That's just plain plasma-bias BS. Greyscale is always reference-grade on the top LCD models. What other numerous problems? Green blobs? Orange tints? Glaring image retention? Noisy fans? Loud power clicks? A dull picture unless the curtains are closed? Plasmas in general are not daytime TVs.
Cheap plasmas or cheap LCDs aren't good, but the flagship 4K Sonys & Samsungs etc. are stupendous TVs. BUT FOR THE RECORD, BOTH PLASMA & LCD HAVE ISSUES & have done for years.
Sony captured deep blacks on LCD TVs years ago, when their W-series LCDs left plasma trailing.
The king of the hill was the Pioneer LX or KRP KURO 60"s, & might still be for PQ alone :-)
I really want OLED to work. It fits as the next great tech. The piece has a very interesting conclusion, and it could very well be the case that OLED *IS* the future of media content - but, as it concludes, on the smaller screen.
Good piece, good interesting conclusion, at least from a discussion viewpoint. It'll also be interesting to see if an AppleTV sporting an OLED display, if that ever happens, could awaken the OLED market again. Could this be why LG are staying in the market - they are committed to making displays for Apple and hope to piggyback off their success in the OLED display market? Hope so; would like to see the tech succeed.
Video gamers spend more time in front of "a TV" than most TV viewers. In this market, the immersiveness of a bigger display panel will beat any colour or contrast benefits of a tablet.
I wouldn't put much store by colour reproduction as a deal-breaker either: bear in mind that 8% of males (and 0.5% of females) have colour-deficient vision - about 40% of these aren't even aware that they have such a condition until tested.
About cinema and television.
Why would people troop off down to the flicks to watch a big screen, when they could all have small personal TVs at home?
Sure, cinema attendances declined for a while and people changed their viewing habits, but the essential big screen experience didn't disappear. It just got augmented by other formats.
You can rinse and repeat for many other "new" technologies.
It would be interesting to get figures on how many hours are spent watching films in cinemas per week, compared with TV. I don't know if the data is available, but you could, for instance, find a 1-cinema town and ask how many tickets they sell per week, then multiply by 2 (roughly the length of a film in hours, to get viewing hours). Scale up based on town population vs national population and compare to official TV viewing figures.
My total guess is cinema viewing is tiny in comparison, like 2% of TV watching 'man hours'. Anyone able to do some guesstimates or provide data?
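For what it's worth, here's that back-of-envelope calculation as a tiny Python sketch. Every input number below is a placeholder assumption of mine (ticket sales, town and national populations, TV hours), not real data - swap in whatever figures you can find:

# Back-of-envelope estimate of cinema vs TV viewing hours, as suggested above.
# All input numbers are placeholder assumptions, not real data.

TICKETS_PER_WEEK_ONE_CINEMA_TOWN = 2000   # assumed weekly ticket sales in a 1-cinema town
HOURS_PER_FILM = 2                        # rough average feature length
TOWN_POPULATION = 30_000                  # assumed population of that town
NATIONAL_POPULATION = 65_000_000          # roughly the UK
TV_HOURS_PER_PERSON_PER_WEEK = 25         # assumed average TV viewing per person

# Scale the one-town cinema figure up to the national population.
cinema_hours_national = (TICKETS_PER_WEEK_ONE_CINEMA_TOWN * HOURS_PER_FILM
                         * NATIONAL_POPULATION / TOWN_POPULATION)
tv_hours_national = TV_HOURS_PER_PERSON_PER_WEEK * NATIONAL_POPULATION

print(f"Cinema: {cinema_hours_national:,.0f} viewer-hours/week")
print(f"TV:     {tv_hours_national:,.0f} viewer-hours/week")
print(f"Cinema as % of TV: {100 * cinema_hours_national / tv_hours_national:.2f}%")

With those made-up inputs it comes out well under 1% - in the same "tiny" ballpark as the guess above, but the point is the method, not the numbers.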
What will be interesting to see is, if TV migrates to tablets and computers (as it has in our house - we have no communal TV at all), will the overall amount of TV viewed drop?
I suspect a lot of TV is there just because its there, not because people are actually interested in it.
I've seen this both on TV and in cinemas - the camera pans quickly and you can't really see clearly what's happening, like it's not really that smooth.
I don't understand how that happens on a modern TV or projector - is it actually part of the recording, the display technology, or my eyes/brain? It's certainly worse in some cases - F1 on NowTV streaming for instance - but considering I see it in big budget movies in the cinema I am starting to wonder if it's me?!
The answer is that cinema features are filmed at twenty-four frames per second. That's enough for the illusion of movement, but not for smooth panning. Cinematographers and directors go to great lengths to distract the viewer from this phenomenon when tracking their subjects.
The reason you're seeing it on the F1 Now feeds is different, and could be a display-rate mismatch between the source stream (50fps if it's a British broadcaster) and your tablet's display (60fps, as most LCD displays are) - you'll get a kind of "6:5 pulldown": six displayed frames fed by 5 frames of input material, so one is doubled. The streaming server could also be dropping alternate frames to save bandwidth, thus reducing the time resolution to 25 images per second.
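To make that 6:5 pattern concrete, here's a tiny Python sketch. The function name and the nearest-frame repeat policy are my own illustration of the idea, not how any particular TV or tablet actually does it:

# Minimal sketch of how a 50 fps source maps onto a 60 fps display:
# every 5 source frames must fill 6 display refreshes, so one of them
# is shown twice - that doubled frame is the periodic stutter on pans.

SOURCE_FPS = 50
DISPLAY_FPS = 60

def source_frame_for_refresh(refresh_index: int) -> int:
    """Which source frame is shown on a given display refresh (nearest-frame repeat)."""
    return int(refresh_index * SOURCE_FPS / DISPLAY_FPS)

# Look at the first 12 display refreshes (two 6:5 cycles).
shown = [source_frame_for_refresh(i) for i in range(12)]
print(shown)   # [0, 0, 1, 2, 3, 4, 5, 5, 6, 7, 8, 9] - frames 0 and 5 are displayed twice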
The rest of this is long and a little rambly, so you can stop reading here.
This low frame rate is a legacy of the technology that was available in the 1930s. At that time, the available mechanisms could not pull a new frame into the camera any faster without tearing or slipping. (24 frames a second isn't fast, but remember that the new frame is pulled up into position in the tiny fraction of a second that the projector's shutter closes, so the mechanism needs to be much faster).
Raising the frame-rate of cinema presentations is possible (has been since the early 1970s), but experiments with the viewing public showed that audiences do not like the effect of high frame rate cinema: it looks less "real" than the slower rate, and the reason has nothing to do with technology:
Traditionally, lower-budget TV drama was either broadcast live or captured on videotape, because video is far, far cheaper than film production (not least because videotape is reusable and takes can be reviewed instantly). Video did have one advantage over film, however, which is that it has a time resolution of 50 or 60 fields-per-second, which makes motion, and especially panning, much smoother. Higher-budget TV shows were still filmed on 16mm or 35mm cinema stock, at 24fps, because this allowed exterior and interior shooting (cheap drama was studio-bound; video cameras were too at first) and it got around the issues of selling your programming to a station with an incompatible video system. A desirable side effect is that these telecine presentations looked like "a real film" rather than "a TV show".
But, thanks to the historic use of video for cheap TV drama, a high frame rate is now almost indelibly associated in the audience's mind with low-budget videotaped TV shows or live events. Simply displaying a film feature at a high frame rate will suggest the same low-budget, "unrealistic" experience to cinema viewers, or will make it seem like they're watching a live broadcast - by making it look like the actors are "right there", it also makes it look more like they're "just actors", thereby destroying some of the suspension of disbelief.
The recent release of Peter Jackson's "The Hobbit" was offered at both 48 and 24 fps, but audiences responded that the 24 fps showing was "grander" and more "epic" than the 48. The 48fps was reported, by contrast, as being "like watching TV" and "fake". The 48 fps presentation was either dropped for its sequel, or very few cinema owners took it on.
If you've got a TV with motion interpolation, as most modern LCDs do, watch a BluRay source of a big blockbuster film first at its native 24fps, and then with all of the motion gubbins turned on. The latter might look more "real" but it also looks less "cinematic".
And the same effect from the opposite side: pretty much all TV productions are now filmed digitally, and broadcast at 50/60 fps, but big-budget drama is either shot natively at 25 or 30 fps or is converted down to a 25/24 frame-rate in post-production to achieve a "filmic" look.
Like it or not, 24 frames per second is considered a sign of "quality" by the viewing public.
The NowTV is actually on my (reasonably decent but not amazing) plasma TV but maybe that's irrelevant?
I was aware of the 24fps thing, but I thought other things mattered, e.g. TVs boasting of 600Hz (or whatever), 5ms response, etc.
Is it really the case that everyone sees that nasty judder even on top-end TVs then? I guess I thought all that fancy processing in a £2k tv compared to a £500 one magically fixed such things!
And - thanks for the great answer.
The problem with the frame interpolation currently abused on HDTVs is that it tends to be too aggressive and is applied to the entire movie. What would be better is for it to intelligently sense panning movement and only apply interpolation then (see the sketch below).
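Here's a toy Python/NumPy sketch of that "only interpolate when panning" idea, assuming greyscale frames as float arrays. The function names, thresholds and the brute-force horizontal-shift search are all made up for illustration; a real TV motion engine does proper block or optical-flow motion estimation per region:

import numpy as np

PAN_THRESHOLD_PX = 4     # assumed: below this shift, leave the material alone
MAX_SHIFT_PX = 32        # assumed search range for the global motion estimate

def estimate_pan(prev: np.ndarray, curr: np.ndarray) -> int:
    """Brute-force estimate of a purely horizontal global shift, in pixels.
    (np.roll wraps around at the edges - good enough for a sketch.)"""
    best_shift, best_err = 0, np.inf
    for s in range(-MAX_SHIFT_PX, MAX_SHIFT_PX + 1):
        err = np.mean(np.abs(np.roll(curr, -s, axis=1) - prev))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def maybe_interpolate(prev: np.ndarray, curr: np.ndarray):
    """Return a synthetic half-way frame only if the scene looks like a pan."""
    shift = estimate_pan(prev, curr)
    if abs(shift) < PAN_THRESHOLD_PX:
        return None                       # static or complex motion: don't touch it
    # Shift each source frame half-way towards the other and blend them.
    half = shift // 2
    mid = 0.5 * np.roll(prev, half, axis=1) + 0.5 * np.roll(curr, -(shift - half), axis=1)
    return mid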
My own thought is that, now that cinema presentation is mostly digital, there's no need to stick with the same frame-rate all through the feature. Certain types of action, particularly close combat, would definitely benefit from a higher frame rate, but that doesn't mean that the rest of the feature should be similarly high-rate. (Cinema projectors project each 24fps frame multiple times as it is, because this minimises the perception of flickering)
I just wonder if it's possible to do this change of display-rate in a way that won't provoke jarring changes in a viewer's perception.
[late to the game]
Nice backgrounder on the history, but...
No mention about digital TV using compression, especially TV of the broadcast kind frequently using **too much** compression (so they can get more channels of tat in the same RF bandwidth)? Even if it's not 'excessive' compression, the quality of the displayed picture at home is subject to the vagaries of the decompression implementation in the TV in question. Not all implementations are equal. Some even have bugs!
See also: key frames, with all the detail explicitly present, and intermediate frames, where the content is interpolated by whatever chip is doing the heavy(ish) lifting.
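For anyone unfamiliar with the key-frame idea, here's a toy, lossless Python sketch of the split between full frames and difference frames. It's only loosely MPEG-like and entirely illustrative (the GOP length and names are mine); a real codec also quantises the deltas (that's where the lossy artefacts come from) and motion-compensates them:

import numpy as np

KEY_FRAME_INTERVAL = 12   # assumed "group of pictures" length

def encode(frames):
    stream, prev = [], None
    for i, frame in enumerate(frames):
        if i % KEY_FRAME_INTERVAL == 0:
            stream.append(("key", frame.copy()))     # full detail explicitly present
        else:
            stream.append(("delta", frame - prev))   # only what changed since the last frame
        prev = frame
    return stream

def decode(stream):
    frames, prev = [], None
    for kind, data in stream:
        frame = data if kind == "key" else prev + data
        frames.append(frame)
        prev = frame
    return frames

# Round-trip check on random "video": lossless here, unlike a real broadcast codec.
frames = [np.random.rand(4, 4) for _ in range(30)]
assert all(np.allclose(a, b) for a, b in zip(frames, decode(encode(frames))))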
It's all a swizz anyway. If there's enough bandwidth in the system to cope with high quality pictures of fast moving action, there's no need for compression. If limited bandwidth means there's a need for compression, especially of the lossy sort, there's always going to be loss of detail, (de)compression artefacts, or both.
Ye cannae change the laws of physics.