Re: I will wait for a while
"Currently LG's OLED TV's have *HUGE* input lag due to the processing." "Better smooth motion algorithms (I still see ghosting around moving objects if you set it too strong)"
Who can explain to me in simple terms *why* the TV is doing any significant processing of the input signal, and why there are user-specifiable settings that change how it is processed (other than the usual brightness, contrast, etc.)? Don't answer yet; read right to the end first, thank you.
Is it purely to do with compression/decompression algorithms/standards (the algorithms, not the implementations), and perhaps to do with (too much? poor-quality?) compression of the input signal?
Is the signal now so massively compressed that you sometimes need (say) 5 frames of picture buffered up in the receiver before you can display the earliest of those 5 frames, which inevitably leads to (say) 5 frames of lag? 5 frames at 25fps is 200ms. That's a lot.
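The arithmetic behind that, in case anyone wants to play with the numbers (a back-of-envelope sketch of my own, nothing to do with any particular decoder or set):

```python
# Sketch (my assumption, not any broadcaster's actual figures):
# if the decoder has to buffer N frames before it can show the first one
# (e.g. because of frame reordering), the minimum added delay is N / fps.

def buffering_lag_ms(buffered_frames: int, fps: float) -> float:
    """Minimum display lag introduced by buffering N frames at a given rate."""
    return buffered_frames / fps * 1000.0

print(buffering_lag_ms(5, 25))   # 5 frames at 25 fps -> 200.0 ms
print(buffering_lag_ms(5, 50))   # same buffer at 50 fps -> 100.0 ms
```

Note the lag scales with frame *count*, not bitrate, so a higher frame rate halves the penalty for the same buffer depth.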
Common poor implementations of compression/decompression also seem to lead to the joys of having the audio and video out of sync by just enough to be really irritating.
And the joys of picture artefacts with no way of knowing whether the broadcast signal or the set you're watching has the problem.
I can see how UHD might possibly have some attractions in the right circumstances given a suitably high quality relatively uncompressed source. Compress it to hell and back with carp software in the middle and where's the point?
Fifteen years ago I used to understand a bit about DSP technology, picture encoding/decoding, motion estimation, keyframes/intermediate frames, etc., so I had some grasp of the fundamentals. Then I changed jobs and stopped paying attention.
I don't see how we've ended up where we are today, with pictures (and sound?) so overcompressed that the (broadcast) system is so far from "high quality" (as opposed to merely high definition) that it's almost not fit for purpose. And I don't see how any practical IPTV world can avoid the same issue.
Example: watching Doctor Who on Freeview HD BBC TV a couple of years ago on my then-new Sony HDTV, and realising that the large-area picture artefacts were almost as bad as on my 1993 Voyager CD-ROM of A Hard Day's Night. E.g. quantisation issues in large areas of smoothly changing colour, leading to "stepped rings" around a bright patch of light on a relatively dark wall, reminiscent of too-low-bit-depth encoding. Tolerable (expected?) in 1993. But in 2013? Sorry about the carp description but hopefully it makes some sense.
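For anyone who hasn't seen the effect, here's a toy illustration of my own (nothing to do with the actual broadcast codec): a smooth brightness falloff away from a light source, quantised to too few levels, collapses into discrete bands, which on a wall show up as those contour-line "stepped rings".

```python
# Toy sketch of banding: quantise a smooth 0..1 gradient to a handful
# of discrete levels and watch neighbouring samples snap to the same step.

def quantise(value: float, levels: int) -> float:
    """Snap a 0..1 brightness value to one of `levels` discrete steps."""
    return round(value * (levels - 1)) / (levels - 1)

# Brightness sampled along a line moving away from the bright patch.
smooth = [1.0 - d / 10 for d in range(11)]      # 1.0, 0.9, ... 0.0
banded = [quantise(v, 4) for v in smooth]       # only 4 levels survive

print(smooth)
print(banded)   # runs of identical values = visible bands on screen
```

Eleven distinct input values end up as just four output values, so large smooth areas turn into flat patches with hard edges between them, exactly the "rings around a bright spot" look.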
All input gratefully received.