
Does that mean that large monitor screens for PCs will also become commodity items - with 2160 pixel panels the standard?
When you can buy a 4K telly at the supermarket, along with your bangers and mash, you know Ultra HD has entered the mainstream. Retail giant Asda is now stocking the Polaroid-branded P55D600, a 55-inch Ultra HD screen for just £699. If you want to really push the boat out, you can heft the 65-inch version into your wonky-wheeled …
You don't have to dream. Just buy one of these TVs, get a display adapter that will do the right level of HDMI, and plug it in. If you don't like HDMI, the TV will probably have component video and maybe DVI as well, judging by the back of the TV under review.
Whether it would be any good as a monitor is a moot point, but it would be an interesting exercise.
I still say that sitting that close to a 40"+ monitor would be an uncomfortable experience, but I thought that back in the early 1980s the first time I saw a 20" black and white monitor on a Sun 2/50 after working on 12" VDUs, and look where we are now!
Be careful - I took the dive and bought a Panasonic 48" 4K for my office. Over HDMI on normal graphics cards you can only get a 30Hz refresh at 4K unless you have an HDMI 2.0-compatible graphics card.
It worked fine at 1080p, but that defeats the point.
This makes everything jerky as hell, even mouse movement, and most TVs don't have the DisplayPort input that would be needed.
My only option was to stump up for a GTX 980, currently the only card with HDMI 2.0 out at 60Hz.
Monitors and TVs are two different beasts in this respect with regard to inputs for PC.
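The 30Hz ceiling described in the posts above falls straight out of the HDMI link budget. A rough back-of-envelope sketch (the 1.11 blanking overhead is an assumption for reduced-blanking timings; the official figures are ~297MHz for 4K@30 and ~594MHz for 4K@60, against HDMI 1.4's 340MHz TMDS limit):

```python
# Rough pixel-clock estimate for a video mode, to show why HDMI 1.4
# (340 MHz TMDS ceiling) manages 4K at 30Hz but not at 60Hz.
# The 1.11 blanking factor is an assumed reduced-blanking overhead.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.11):
    return width * height * refresh_hz * blanking / 1e6

HDMI_1_4_LIMIT_MHZ = 340

for hz in (30, 60):
    clock = approx_pixel_clock_mhz(3840, 2160, hz)
    verdict = "fits HDMI 1.4" if clock <= HDMI_1_4_LIMIT_MHZ else "needs HDMI 2.0"
    print(f"4K@{hz}Hz: ~{clock:.0f} MHz - {verdict}")
```

Hence the commenter's choice: either drop to 30Hz over HDMI 1.4, or buy one of the few HDMI 2.0 cards of the day.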
I'm typing this looking at a 28" 4K monitor. I can (just!) detect the difference from my previous 26" 1920x1200 display (or so I tell myself). But (of course) I'm looking at it from 0.5m, rather closer than I normally sit from my telly! So I reckon that with a 56" screen you could probably see a difference with 4K if you sat 1m away. As most of us sit at least 2m from the gogglebox, that would imply a 112" screen, which would rather dwarf most living rooms (and, at current prices, break most bank balances - or, at least, marriages).
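That scaling argument (double the viewing distance needs double the diagonal for the same perceived detail) can be sanity-checked with a little angular arithmetic. This is just a sketch of the commenter's reasoning, assuming 16:9 panels:

```python
import math

def pixels_per_degree(diagonal_in, h_pixels, v_pixels, distance_m):
    """How many pixels span one degree of the viewer's field of vision."""
    aspect = h_pixels / v_pixels
    # Width of the panel in metres, from the diagonal and aspect ratio.
    width_m = diagonal_in * aspect / math.hypot(1, aspect) * 0.0254
    # Angle subtended by a single pixel at the given distance.
    pixel_angle_deg = math.degrees(2 * math.atan(width_m / h_pixels / (2 * distance_m)))
    return 1 / pixel_angle_deg

# Scaling size and distance together keeps the angular density constant:
for diag, dist in ((28, 0.5), (56, 1.0), (112, 2.0)):
    print(f'{diag}" 4K at {dist} m: {pixels_per_degree(diag, 3840, 2160, dist):.0f} ppd')
```

All three combinations land on the same pixels-per-degree figure, which is exactly the 28"-at-0.5m → 56"-at-1m → 112"-at-2m progression in the post above.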
The wry comment I was trying to make is that if you are relying on the manufacture of TV panels to make UHD monitors available, you would have to accept the size of the panel as well.
I cannot really see any TV manufacturer making a UHD TV smaller than 32", and looking at the story, 40" was the smallest TV referenced. That's where 40" came from.
This time, I was not making any comment about whether UHD was really going to increase your computer experience (although I have in the past).
Well it probably will take up less desk space than my current 2x23 inch setup... but you get _twice_ the screen space!
Unfortunately display makers will probably make UHD computer monitors in 23 inch and smaller. Pretty much the only use for that is when you want to make vector fonts look passable on a screen.
"I cannot imagine making the desk space for such a beast!"
We adjust to any increase in size or speed very soon - and then can't believe we ever used something as slow/small as the previous one.
My nominal 21" CRT monitor in about 2000 cost about £700 and was very bulky and very, very heavy. Replacing it with an LCD was disappointing - not the same contrast and clarity. However - modern monitors have mostly sorted the visual problems - and weigh very little.
I even started taking wide-screen monitors into the office as a "test" for spreadsheet work - as the 4:3 company issue ones were like looking through a keyhole.
I currently use a 1920x1200 27" - but find it time consuming to access the overlapped multiple windows in a run I do once a week. Going to a second wide screen would not solve the problem of screen height. A screen that is larger both vertically and horizontally would do the job nicely.
I have a 30" 4k monitor - picture quality excellent, no ghosting. Major downside - little software is written to handle high DPI and becomes ususably small. Windows 10 is a bit better and many system apps are now high DPI ware, but the new fallback for apps which are not is little more than crappy pixel scaling and looks terribly fuzzy and worse than just running a 1080p monitor..
For the time begin I would stick with 1440p or 2560x1600 if you want 16:10; it's a happy medium between gaining screen real-estate and not being headache inducing ....
It was a black day for me when my 21" CRT gave up on me. Samsung were nice enough about it and sent me a replacement monitor, but they'd pretty much ceased with CRTs by then and said it had to be a flat panel.
Definitely agree with the premise that 27" is about right when you're sitting .5m from it. Tried sitting right up to a 40" to play PlayStation games and it doesn't work - too much of the screen is outside your peripheral vision, so you keep having to look around.
Re. "...40"+ monitor..."
BSA Tau = Five minutes.
From plugging it in, saying Wow!, being amazed at the huge screen size... ...to thinking it's normal. "Could be just a bit larger actually..."
5 minutes.
I've had this happen so many times, with formerly 'huge' TVs to formerly 'enormous' 27" monitors, that I've noticed it and timed it.
5 minutes.
"Yes, but only if you want a 40"+ monitor"
My current monitor wall is larger than that (4 x 17" 1280x1024 displays)
Replacing that with something that has slightly fewer pixels overall is justifiable simply on the grounds of not having to mess around with Xinerama, and it'll have twice the height. Yes, it'd need a pricier video card to drive at more than 30Hz, but it's still a viable option overall, and I can retire the 4-headed Quadro that Nvidia's given up supporting.
As for desk space: just mount it on a pole or something. I do that already - there are a few cheap-o monitor-arm suppliers kicking around so this isn't a £300 cost - more like £50
I've had a 42" Panasonic plasma for about 5 years now and have yet to see anything within the "consumer" price range that would make upgrading worth the investment.
When this one gives up the ghost I'll probably get a second hand Series 60 to tide me over until OLED makes a dent on the scene but even then I'd be quite happy with a 1080p panel; My eyes aren't nearly good enough to make out any extra detail at all on "smaller" 4K sets from typical viewing distance.
Maybe in a few years time seeing a demo clip of Pacific Rim 2 in the wider colour gamut afforded by 4K will change my mind but until then I think I'll stick with the Luddite position.
"These LED based screens still have an inferior picture to a decent Plasma."
It's only a backlight - and plasma chews more juice than an equivalent CRT.
OLEDs are nice, but I'm not convinced that the blue will last more than a few years.
What looks very promising in terms of efficiency and gamut is mono LCD with quantum dots behind.
I have 2 37" monitors (barely) on my desk as well as a 47" TV connected as a third monitor over on my left in a corner. As a sports photographer I'd like to think that all this screen estate is absolutely essential for my work editing photos and videos. I'd like to but I'm too busy playing world of warcraft on my MASSIVE screens :)
WTF is the 1st Rule?
Multiple screen real estate is REQUIRED to handle video/picture manipulation, do our jobs efficiently and with great purpose, to further our pointy headed bosses' careers!
What is this gaming you speak of? Surely it's against company policy and no IT professional would EVER waste company time with such fluff....
With the eye on entertainment these are aimed at, it is often overlooked that the 55" are exactly the same as four 1080p monitors. They really make your spreadsheets pop, it's enough desktop space for developers to have all their tools open, you can work on photos with a better eye for details and the big picture.
If you will remember when we went to larger screens and higher resolutions years ago, going back felt like building a ship in a bottle, or watching a game through a knothole in the fence. So too this.
"it is often overlooked that the 55" are exactly the same as four 1080p monitors"
Sorry, but that makes no sense.
You're comparing physical size to resolution. You need to specify both.
A 55" 4K TV is much smaller physically than the screen space of four 1080p 70" monitors.
A 55" 4K TV would have the same screen space as four 27.5" 1080p monitors, is that what you meant?
Great, so now they'll start overcompressing the shit out of their HD channels, to make 4K look worth having, just as they did for SD when HD arrived.
Is there anything on TV anywhere in the world where seeing it in UHD in an average living room would actually make a difference to the viewer?
> Is there anything on TV anywhere in the world where seeing it in UHD in an average living room would actually make a difference to the viewer?
It's more pixels, you fool - stop talking heresy. The pixels are what counts, not whether you can tell the difference.
Next you'll be moaning about having a 4K screen in a smartphone form factor, which is obviously even better.
Is there anything on TV anywhere in the world where seeing it in UHD in an average living room would actually make a difference to the viewer?
Almost certainly - overcompressing the shit out of their HD channels to make 4K look worth having will be enough for 90% of their subscribers to start going with 4K instead.
To be honest, I'm still not that convinced of the benefits of HD in some contexts. IIRC there was quite a thing about football matches being in HD, why? As long as you can see the players and the ball clearly, do you need to be able to identify individual blades of grass?
Movies in particular benefit though, and will likely benefit all the more so from 4K, and at least (unlike 3D) it'll apply across the whole movie rather than being used for the benefit of a 2 minute scene.
Movies in particular benefit though
Meh. HD doesn't make the plot, dialogue, acting, or directing any better. Personally, I don't much care about the rest.
We have an HD TV (couldn't really get anything else when the old one died), and maybe a hundred HD channels. We rarely watch them because the SD equivalents come earlier in the online program guide, so we never scroll as far as the HDs. When I do happen to see something in HD, the increased resolution doesn't do anything for me.
I remember when we (my family and I) used to timeshift programs using VHS VCRs recording in EP mode. That didn't hurt our enjoyment. Indeed, I find I'm less interested in television these days, with my 500 channels and on-demand and Netflix and HD and DVR and blah blah blah.
Now if you'll excuse me, I have to go yell at those kids.
HD was pretty much a waste of time for any sensibly sized TV. HD is a little sharper and brighter, but not so much that it's worth paying the extra for. Ever notice that the Freeview EPG relegates the HD versions of channels to the hinterlands? It's almost certain that everyone is merrily watching the SD broadcasts on their HD tellies, and they don't care.
UHD will surely be a big hit with the mugs who think their TV needs to be the size of a wall, though - I'm sure the SD Freeview signals will look great when blown up to 50 or more inches.
"As long as you can see the players and the ball clearly, do you need to be able to identify individual blades of grass?"
If you ever watch any of the old classic games from the 70's you will notice that you actually see more of the players than you do with the current 'shot from a passing satellite' view that seems to be the norm.
Most Freeview HD compliant kit that I've played with prompts if the same programme is available in HD.
Our cable provider pushed through an over-the-wire box update a few weeks back that does this, among other things. Yet another obnoxious intrusion when I'm trying to watch something. They're just pushing me closer and closer to dropping them entirely.
And we have WOW, which has been rated the least-despised cable company in the US by people responding to Consumer Reports surveys. I can only imagine what things would be like with, say, Comcast.
> HD was pretty much a waste of time for any sensibly sized TV. HD is a little sharper
I did a back-to-back comparison of Earth's Natural Wonders broadcast on BBC SD & HD (1080p), watching from ~9' on a 49" screen, and it's not even remotely close - there was significantly more detail on the HD broadcast.
> UHD will surely be a big hit with the mugs who think their TV needs to be the size of a wall
I recently replaced my (ancient) 32" CRT TV. Given that any LCD TV was going to be further away (i.e. against the wall, rather than having a 2' deep cabinet), it would have taken a ~42" TV to occupy the same field of view when sat at the same point. I originally thought that a ~40" TV was too big and a ~50" TV was silly. I was not right.
Ultimately, the TV industry is pushing 4K - with the exception of Sony, all of the major manufacturers' 1080p units have a noticeably worse image (when displaying 1080p content) than their 4K units (IMO & IME).
> I'm sure the SD Freeview signals will look great when blown up to 50 or more inches
It doesn't. Watching from ~9', some channels look "OK", but some look utterly awful. DVDs fare a bit better, although they still look a tad soft, especially when compared with Blu-rays.
"[Freeview] Watching from ~9', some channels look "OK", but some look utterly awful"
Is this a surprise, given the excessive compression apparently used on some of the advert-delivery networks? 57 channels and almost nothing with a watchable picture, never mind nothing on.
Genuine question: is UK (Freesat) satellite any better in compression terms?
It's funny, because we're not even fully in an HD world yet. The TV broadcasts are mostly only 720, rather than 1080, and yet we're upping the ante even more. Cricket is better in HD, and so I'd imagine tennis and golf are too. I'm not sure it really matters for slower-moving sports with bigger equipment, like footie.
There's a brief mention in the article of HDR. And I think that could be a big leap forward with 4K. I've never even bothered getting a Bluray player, DVD is enough for me - I don't believe there's enough of a difference to be worth shelling out so much extra per disc.
I've also got very poor eyesight though, so my opinion is of less value in judging what other people are seeing. Although given how poorly set up many people's tellies are, I think it's safe to say that they're not really caring.
But I do struggle in those films and shows where the director wants to shoot everything in moody darkness. I've even had to resort to changing the picture settings, if turning off the lights doesn't help. And HDR might be excellent for that.
Otherwise, it'll have to wait until my TV dies. At which point I'll get whatever's sensibly priced. If we aren't on to 16K by then...
Although the smart stuff can bugger off. If I want smarts, I'll get something not coded by gibbons, which is what the TV companies seem to use to write their user interfaces. At the moment that's a Chromecast, so the UI is my tablet or PC.
I used to think the same as you with Blu Ray and DVD, and even when I eventually did buy a BR player I was a little disappointed that the movies didn't look too different - I certainly didn't see any great picture quality revolution like you see on the demos in-store.
That was, until I watched a DVD after exclusively watching movies on BR for a couple of weeks. Back-to-back the difference is easily visible to me, and on the whole BR discs aren't much more expensive now than DVD was 5 years ago. DVD is cheaper still now, of course.
I have to say though, I do enjoy a well-encoded 3D movie very much on my passive 3D screen (I'm very sensitive to flicker). I realise I'm in a minority here, but I've always had a penchant for stereoscopic 3D tricks. Jurassic Park 3D is surprisingly good, you could almost imagine it was shot for a 3D presentation. I was hoping that the new LG 4K screens (with passive 3D again) with their double resolution would be able to play 3D 1080p at full 1080p per eye instead of the half-vertical res I get right now, but apparently they don't work like that. Shame. Still, resolution is slightly less important (since your brain merges the two images) in 3D presentation than flicker or crosstalk, both of which are great on the passive system.
Personally, I'm looking forward to a 4K screen with passive 3D that works well and allows 3D 1080p content to be displayed perfectly. In order to replace my not-very-old 1080p set, it will need to also have HDR; an upscaling option which turns each 1080p pixel into four 4K pixels would be nice for gaming too. Fingers crossed something will appear in a year or two when the HDR specs are final.
It's great that you can get something so cheap, even if you do have to put on your slippers and go to Asda. But the limitations are going to be the things that niggle. Perhaps not now, but a couple of years down the line.
When you decide you really do want Netflix in 4k for their latest new series, for instance, and the built-in app not only doesn't do it now, but likely won't have an upgrade. And you realise that there's only the one 4k capable HDMI port, which might be fine right now when you just want to plug in the BT box, but become rather limiting once there are more sources. Just like early flat panels had the composite, and SCART and S-Video connectors which were great when they first came out, but before long the single DVI or HDMI was no longer sufficient.
I really hope that, if we do get some sort of UHD labelling that it's a bit clearer than the old 'HD Ready' stuff, and much more future-proof.
It would be great, for example, if people like Netflix, as well as letting their software be built into sets like this, would come up with a clear and simple branding so that you can see at a glance if it has a 'Netflix' logo or a 'Netflix4k' logo, which will tell you if it's going to give you 4K, regardless of the screen resolution.
And a spec for UHD should probably also include things like a minimum number of 4K sockets (or a standard way of highlighting how many), and clarity on whether or not, for example, the HDMI-ARC port that most people will use to link to their AV gear is 4K compatible or not.
So, sure, you can buy one right now. But in my view, it's still sensible to wait.
Well, any labelling system needs to be clear about what exactly it means, for both consumers and others.
The biggest issue with HD Ready was that, unless people did their research beforehand, they often assumed it meant one thing (ready and capable of receiving HD broadcasts) when it really meant another (capable of displaying stuff from an HD source plugged into it). The (separate, and later) HDTV spec meant there was a receiver/decoder inside.
The original spec didn't mention resolution (not such an issue with 4k), and was a bit vague on connectivity, too.
Given that there are likely to be both Phase 1 and Phase 2 specs for UHD, and that some of those may include HDR, and higher frame rates, and that older versions of HDMI won't support 4K, let alone those extra things, it would be good to have some clear marking so that consumers aren't confused.
"it would be good to have some clear marking so that consumers aren't confused."
Some cynical people (who, *me*?) might suspect the original "HD Ready" / "Full HD" labelling was intentionally confusing and misleading, given that a lot of those early flat screens the industry wanted to shift were only 720p.
If the current wave of sets don't support supposedly forthcoming UHD features such as HDR, it'll be a convenient excuse to sell people a replacement for their "outdated first-generation" UHD set in three years time. But I'm sure there's no vested interest in the manufacturers not making this clear to early adopters. ;-/
It's great that you can get something so cheap, even if you do have to put on your slippers and go to Asda.
From observation, the majority of TV sets sold by supermarkets are high on specification, low on quality. Asda or Tesco are almost the last place on earth I'd look to buy tech goods (albeit marginally ahead of CurrysDixonsCrapphonewhorehouse).
OK, they are just following the rest of the TV makers, it seems.
It seems that trying to get a 4K TV with a FreeSat tuner built in is next to impossible at the moment.
I won't subscribe to Sky, VM or BT and I really don't want another STB.
My current 7yr old LG 1080p TV is on the way out so I was looking to upgrade to a 4K tv in the near future.
I would really like to know why TVs with FreeSat are so scarce these days. Perhaps Rupert M has told the makers not to be so silly and to let him supply all the Sat TV signals to the TVs?
It's not that the DVB-S/S2 is different, it's the EPG data that's the problem. In most countries, where satellite has simply been a matter of pointing your dish at the sky and tuning in, EPG data is something that's picked up from the tables on each transponder, and you don't have things like specific channel numbers.
In the UK, other than for enthusiasts, satellite has always been sold as a package, with an operator (like Sky or Freesat) bundling up channels, providing platform services like an EPG, and subscription management.
That requires specific things to be done, especially if the operator wants to have 'advanced' functions like series link. As a consequence, the software for a platform-based receiver is a lot more complex than for a more open one. Perhaps if things like Sky Italia and Sky Deutschland spread their iron tentacles across the continent, and open satellite becomes less common, we'll see satellite sets with modular software that can be adapted to lots of platforms easily, but at the moment I expect the makers simply don't see much return for the effort of creating special software for one country.
What package(s) has/have FreeSat? I am sure that there are a few readers that would be interested?
What part of 'Free' don't you understand?
OK, there are some encrypted channels if you are into pron etc., but for the rest of us? The 100+ channels are more than enough.
I guess that I'm an enthusiast in that I eschew VM, Sky and BT Ad-laden packages.
Nah, I'm from Yorkshire so it is in the blood that I'm a tightwad when it comes to money.
Freesat itself is a package, albeit one of FTA channels with no subscription, in that it's provided as a complete proposition, with a list of participating channels, their numbers, and a unified programme guide.
That is what both BSB and Sky did too, and is the model of the current BSkyB and of operations like Sky Italia.
It's at odds with the view of satellite on the continent, which is much more of satellite as a distribution method. You'd point your dish in the right direction, get ProSieben, Sat 1 and so on, and set your box up how you wanted, and the amount of EPG data you get will vary, depending on what you're watching.
In the UK, thanks to the early introduction of packaged systems, that way of using satellite is very much an enthusiast approach. Our broadcasters like the idea of a packaged platform, because it gives them things like a fixed and known EPG position that they can include in adverts.
But to make that work, the TVs need the right software. The Sony ones linked to here have, as far as I can see, the requisite DVB-S/S2 tuners to pick up the channels, but not the software to provide the EPG services that make up the rest of the platform. So, for instance, while you will get the channels you want, the numbers will be all over the place, and it's unlikely you'll get much more than 'now/next' info on any particular channel.
I don't want a broken computer to use as a TV*, don't want such a huge TV either (even if I could afford one), and as the optical characteristics of my Mk1 eyeballs are slowly deteriorating with time, I can't see any sense in buying monitors with even higher resolution than the ones I have.
(Boy! Take this email to the residence of Mr L Reg sharpish and there's sixpence in it for you if you're back within the half-hour!)
*IMO a multimedia linux box'd do a better job, and be cheaper, too.
Not that I'm going to get one now, but I hope the uptake in 4K tellies will push the providers to sell 4K content over them thar Intartubes - at 55Mbps.
I want that to happen because, as of now, there is no way the existing infrastructure can deliver that kind of bandwidth to individual homes. Therefore, people will have their 4K tellies and will not be able to watch the 4K content which, of course, they will feel entitled to. That means the pressure will be enormous to actually deliver on that 55Mbps requirement.
And that means that all this 4K nonsense is going to improve my Internet bandwidth. Since I watch TV on satellite, all those yummy Mbps will be for me.
So go for it.
Therefore, people will have their 4K tellies and will not be able to watch the 4K content which, of course, they will feel entitled to. That means the pressure will be enormous to actually deliver on that 55Mbps requirement.
More likely the content providers will auto-adjust the resolution to suit the bandwidth available to the customer, so most people will end up watching in 'ordinary' HD or less while convinced they're watching UHD "coz it's a 4K TV, innit?"
I did thanks.
Bought myself a HiSense 65" 4K screen from Amazon about 6 months ago; cost £875. Best toy I've bought in a long time. The screen is far more vibrant than my old 55" LG.
The only downside to a 4K TV is it will only operate at 30Hz (a limitation of HDMI 1.4a), so it's not really great for games, but adjust the resolution to 2560x1440 and you're away at 60Hz. I've got a gaming PC rigged to mine with a GTX 970; games like The Witcher 3 look and play amazingly!
I honestly don't see how people like Samsung justify asking for £2,000+ for similar screens!
Basic Internet round these parts is a solid 5MB/s (49Mbps), with options of 10MB/s (100Mbps) or even 20MB/s (200Mbps), if you can afford it. I'll likely be moving up to the 100Mbps line sometime in late 2016, after my current line contract expires.* And these are all VDSL speeds. Had I wanted to, I could go back to the Cable Co. (Liberty Global) and get upwards of twice the fastest VDSL - 40MB/s (400Mbps) - right this instant, for ~90-ish Euros a month. But I find my 50Mbps quite fast enough at the moment.
*If not an even faster one by then...
Excellent, I would buy one for my living room pc
...except I can't plug it in.
No TVs with displayport 1.2
No video cards with HDMI 2
Seriously - PCs are the common devices which can easily generate 4K pictures and video (4K streaming video being the only other option), everyone has one, and you can't connect them to a device whose selling point is being able to display 4K pictures and video?
TVs with DisplayPort! All those photo editors and architects, video editors and gamers would be able to plug big, high-detail screens into the PC they use every day, instead of being restricted to 28", or to 30Hz.
4K has 3840 pixels across the screen. Panning an image across the screen at one pixel per displayed frame at 50Hz will take 77 seconds. Anything moving on the screen at a rate faster than one minute per screen width is just a blur, with multiple adjacent pixels changing on every frame, and all those in-between pixels may as well not be there.
It is OK for stills, although at normal viewing distances few people will be able to notice the difference from 1080p - my old eyes can't; I don't generally bother going past 720p for video.
I guess TV is going through the same more-pixels-is-better bollocks that cameras went through, despite the source (and, in the case of cameras, the lens) not being able to support those pixels.
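The panning figure above is straightforward to reproduce (76.8 seconds rounds to the quoted 77), and the same arithmetic shows how much detail a realistic pan throws away:

```python
h_pixels = 3840   # 4K horizontal resolution
fps = 50          # display refresh rate

# Panning at one pixel per frame: time to cross the whole screen width.
print(h_pixels / fps)        # 76.8 seconds

# A more realistic 5-second full-width pan skips this many pixels every
# frame, so the in-between detail never appears on screen at all.
print(h_pixels / (5 * fps))  # 15.36 pixels per frame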
" Panning an image across the screen one pixel per displayed frame at 50Hz will take 77 seconds."
Which is why frame rate is as important for TV/movie work as resolution. There's a solid push to take UHD to 144 or 320Hz in the long-term - or move away from rasterisation altogether.
UHD is good for certain types of documentary and for providing a distraction during boring bits of films/sport (e.g. counting the strands in the tennis net). When the content is sufficiently gripping, the resolution can go to hell. Something like Fritz Lang's 'M' is still a great viewing experience on an aged 405-line CRT. 'Big Brother' is shit even in whatever technology can manage in 100 years time.
the first thing the reviewer mentions is how crap the smart connectivity is.
who gives a fuck? why is it important? im buying a screen, not a computer
whats the screen like? that should be the first thing addressed.
its a symptom of the kids these days.....does it have a widget? yeah>? great lets get one! wait a minute, its actually shit at what its supposed to do....
This.
But more, even if the "smart" aspects of it are really good no-one cares (or should care) because they won't be in a couple of years time when some new streaming service has come along. As you say, picture quality is the important thing but if you do want a smart setup then what matters are things like:
- Does it have enough of the right connectors
- Does it have separate Power On and Power Off remote commands
- Does it accept HDMI-CEC commands.
Personally, I find the additional improvement in sound quality worth paying the extra for with Blu-ray. Also, new release Blu-ray discs are now not much more expensive than the DVD equivalent, so what's the point in buying the DVD? IWGMC to protect myself from the downvote storm.
Where is the evidence that typical viewers (rather than technical enthusiasts) care about picture quality? They have never been much troubled by gross distortion in geometry, in luminance or colour rendering. None speak ruefully of 405-line TV.
Apart from the introduction of colour in the 1960s, of Teletext in the 1970s and recently of flat displays, the average viewer might not even have noticed. Other "improvements", like digital transmission ("Oh, it's digital, so it must be better") have been driven by advantage to the broadcasters, not to the user and by a desperate hope that the technology has not reached a plateau, where users no longer buy when a shinier version is available and showing off a TV is as naff as showing off a pocket calculator.
"Where is the evidence that typical viewers (rather than technical enthusiasts) care about picture quality?"
Too true. Ofcom stats state that the majority of people, even those with HD TVs, don't actually watch the HD channels that are available to them. Which is somewhat contradictory to the forecast that 4K TV sales will increase by 147% in the UK over the next year, but is probably true. Bet they won't watch that either.
I'm a bit baffled why you'd buy a 4K TV at the moment - there simply isn't enough 4K content (or indeed high enough broadband speeds to cater for it if it's Net-streamed). Heck there's not even that much 1080p content either unless you have a large Blu-Ray collection.
For me, a 4K TV should be as dumb as possible (will there *ever* be a non-smart 4K TV sold?), with as many 4K-capable inputs as possible (to attach whatever 4K external devices turn up, including a PC with a beefy graphics card).
Heck, there's even little point in putting a TV tuner in a 4K TV at the moment :-) Mind you, 4K DVRs are going to need to start at 4TB HDDs and go upwards...this assumes that 4K bandwidth will actually available OTA (maybe with satellite, but I doubt it for terrestrial). That's something I never see mentioned in 4K TV reviews - will there ever be a "Freeview UHD" or "Freesat UHD"?
But, will fibre come anywhere close to being as practical as satellite delivery? I doubt it.
Really?! About Eighteen (or so..), months ago I might have agreed with you, that this Internet TV thing was just a silly fad. A pipe dream if you like. I can remember dabbling with this "tech" all the way back to the ends of the last millennium (~1999-2000), watching live streams of independent broadcasters on the Web via primitive ADSL. The Content was mostly primitive, by todays standards, and really felt more like an alpha test to see what could be done, and if anyone could actually find it. Much to my chagrin I totally blanked out in during that whole Five year window where the local Radio Station would stream live Sporting events over the Internet. Before those governing bodies discovered that there was Gold in them there Hills, and put a stop to that practice.
Taking that rather cynical view of IPTV as a whole, you'd be right to make the above point. The problem is that IPTV has been morphing into the next delivery method, at least here on the continent, since about 2008 I'd say.
But if you really want to read the tea leaves, just look at the latest Disney stock price. That decline is down to much the same reasons as above. Why pay for sub-par ESPN to watch whatever they care to broadcast at you, when you could spend the same amount on MLB.tv, follow your team around for the entire 162-game season, and get the radio-only part of At Bat free with that MLB.tv sub?!
It's almost a shame that DT or BT are gonna end up being the ones to make IPTV a thing, in essence becoming the one thing I despise above all else: a cable co. But this will be the final outcome at some point. The only question is whether net neutrality will be enough to keep that side of the net at bay, should we still prefer our live TV over the air, geo-sync satellite, or cable. It's perhaps why I refuse to use such a service, either internet via a cable co. or TV from an ISP: I don't want to see monoliths like the US cable companies suddenly cropping up round me.
"But, will fibre come anywhere close to being as practical as satellite delivery? I doubt it."
It's happening now, mate. I use IPTV over fibre to watch HD sport and the quality's stunning.
"I don't want to see monoliths like the US cable companies suddenly cropping up round me."
No, nor do I.
The things I am waiting for:
'Flat' OLED displays to become affordable and for them to work out the kinks (Like uniformity, endurance and lag). Samsung are reported to be getting back into the race. I personally was hoping that Panasonic would get into OLED. Even if they use LG's panels. No sign yet though.
HDR and high bit depth displays (I think one guy on the Value Electronics HDTV shootout said that displays need to go to 16-bit or even higher to make full use of HDR).
HDMI 2.0 with HDCP 2.2, or newer, on ALL ports (same for video cards).
I'll mention lag again because it is very important for gaming. Currently LG's OLED TVs have *HUGE* input lag due to the processing.
Better smooth-motion algorithms (I still see ghosting around moving objects if you set it too strong. Also, full 1080p/4K pixel motion resolution only seems to come when you enable interpolation, even at its lowest setting).
Support for high refresh rates. 4K needs 120/144Hz.
CONTENT! CONTENT! CONTENT!
Oh, and The Hobbit at 48fps? NEVER!!! Basically, the Blu-ray Disc Association did not put it in their spec.
There are too many compromises for me right now. Personally I am waiting for that recently reported operation that puts new lenses in your eyes (apparently giving you superhumanly clear vision) to become common, so I can get rid of my thick spectacles and actually appreciate any of this.
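For what it's worth, the high-refresh-rate wish above runs straight into link bandwidth. A rough sum, counting active pixels only (real links also carry blanking intervals, so actual requirements are a bit higher), shows why 4K at 120Hz needs something beyond HDMI 2.0:

```python
# Back-of-envelope: why 4K at 120 Hz outruns an HDMI 2.0 link.
# Active pixels only (blanking would add more); 8-bit 4:4:4 assumed.
def link_rate_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

HDMI2_PAYLOAD_GBPS = 14.4    # HDMI 2.0's 18 Gbit/s minus 8b/10b coding overhead

for fps in (60, 120):
    rate = link_rate_gbps(3840, 2160, 24, fps)
    verdict = "fits" if rate <= HDMI2_PAYLOAD_GBPS else "does not fit"
    print(f"4K @ {fps} Hz: {rate:.1f} Gbit/s -> {verdict} in HDMI 2.0")
```

4K60 at 8-bit 4:4:4 just squeezes into HDMI 2.0's payload; 4K120 needs roughly double, which is why 120/144Hz panels lean on DisplayPort or newer HDMI revisions.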
"Currently LG's OLED TV's have *HUGE* input lag due to the processing." "Better smooth motion algorithms (I still see ghosting around moving objects if you set it too strong)"
Who can explain to me in simple terms *why* the TV is doing any significant processing of the input signal, and why there are user specifiable settings that change how it is processed (other than the usual brightness contrast etc). Don't answer yet, read right to the end first thank you.
Is it purely to do with compression/decompression algorithms and standards (algorithms, not implementations), and perhaps with too much, or too poor quality, compression of the input signal?
Is the signal now so massively compressed that sometimes you need (say) 5 frames of picture buffered up in the receiver before you display the earliest of the (say) 5 frames, which is always and inevitably going to lead to (say) 5 frames of lag? 5 frames at 25fps is 200ms. That's a lot.
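The five-frames figure is hypothetical, of course, but the arithmetic behind it is trivial to check:

```python
# Sketch of the sum above: display lag from buffering N frames before
# the earliest one can be shown. The 5-frame figure is hypothetical.
def buffer_lag_ms(frames_buffered, fps):
    return frames_buffered * 1000.0 / fps

print(buffer_lag_ms(5, 25))   # 5 frames at 25 fps -> 200.0 ms
```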
Common poor implementations of compression/decompression also seem to lead to the joys of having the audio and video out of sync by just enough to be really irritating.
And the joys of picture artefacts with no way of knowing whether the broadcast signal or the set you're watching has the problem.
I can see how UHD might possibly have some attractions in the right circumstances given a suitably high quality relatively uncompressed source. Compress it to hell and back with carp software in the middle and where's the point?
Fifteen years ago I used to understand a bit about DSP technology, picture encoding/decoding, motion estimation, keyframes/intermediate frames etc. So I got some knowledge of the fundamentals. Then I changed jobs and stopped paying attention.
I don't see how we've ended up where we are today, with pictures (and sound?) so overcompressed that the (broadcast) system is so far from "high quality" (rather than high definition) that it's almost not fit for purpose. And I don't see how any practical IPtv world can avoid the same issue.
Example: watching Doctor Who on Freeview HD BBC TV a couple of years ago on my then-new Sony HDTV, and realising that the large-area picture artefacts were almost as bad as on my 1993 Voyager CD-ROM of A Hard Day's Night [1]. E.g. quantisation issues in large areas of smoothly changing colour, leading to "stepped rings" around a bright patch of light on a relatively dark wall, reminiscent of too-low-bit-depth encoding. Tolerable (expected?) in 1993. But in 2013? Sorry about the carp description but hopefully it makes some sense.
All input gratefully received.
[1] http://www.ew.com/article/1993/05/14/hard-days-night
"And you can bet on even bigger numbers to come, as first generation flatscreen adopters prepare to re-enter the market, as part of the traditional (replacement) cycle of life."
They might wait! The *early* early adopters bought panels that supported composite, VGA, and DVI, then got the royal screw job when "they" decided, for rights-restriction purposes, that most HD devices would only actually output HD via HDMI. They then had to buy quite expensive adapters to use the panel properly. I've heard of plenty of these people deciding "Hell no, I'm not buying a 4K panel just to be screwed again"; they plan to wait quite a while to make sure HDMI 2.0 with HDCP 2.2 is REALLY all they need, and that "they" won't just decide in 6 months "Well, actually you need HDMI 3" or something.
My PC for the last 7 years (XP) has had two 22" monitors running at 1680 x 1050. That gives about 90 DPI.
Now it was time for a new PC, and to make it seem worthwhile I decided on a 4K TV as the monitor. I opted for an LG 49UF770K. So this is a 49" screen running at 3840 x 2160, which also works out at about 90 DPI. So there's no need to scale away from the native resolution.
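A quick check of the pixel-density sums (diagonal pixel count divided by diagonal inches); the exact figure lands nearer 90 DPI than 96, but the point stands - the two screens match, so no scaling is needed:

```python
# Pixel density = diagonal resolution in pixels / diagonal size in inches.
import math

def dpi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'22" 1680x1050: {dpi(1680, 1050, 22):.0f} DPI')
print(f'49" 3840x2160: {dpi(3840, 2160, 49):.0f} DPI')
# Both land around 90 DPI, so windows stay the same physical size.
```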
I combined this with a GeForce GTX 980 graphics card. The connection is via an HDMI 2.0 cable into HDMI 1 on the TV (important): this is the only HDMI input capable of full 4:4:4 chroma (i.e. no subsampling). The manufacturers of TVs do not always spec this feature, or may call it some fancy name. If the TV cannot do it then you will get a duff picture. The graphics card and TV run together at 60Hz.
It's all complete madness really, with the equivalent of four 22" monitors, but fun at the moment.
Windows 10's new shortcuts to resize and position windows have become very useful.
That was my choice recently. I went for a 55" LG OLED curved job.
Couple of reasons. 1) Content. No easily available content to view. By easily available I mean I switch on my TV and either power up my PS4, Blu-ray player or Sky box. 2) Viewing distance.
This second point is the one most people miss when buying a new TV. I looked on loads of technical forums and websites and did numerous calculations, which showed I'd need a 102" 4K TV to be able to see the difference between 4K and "regular" Blu-ray at the distance I sit from the TV. I've got a reasonably large living room, so where I sit is at least 3m from the screen. Even then it wasn't far enough away.
As for gaming on it? I've attached my PS4 to it and it looks so much better than either my Samsung S7 or even my very old Pioneer 436SX plasma that I bought back in 2006.
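A minimal sketch of that sort of viewing-distance calculation, using the common rule of thumb that 20/20 vision resolves about one arcminute per pixel. Stricter acuity assumptions (which some calculators use) push the answer up towards the 102" quoted above:

```python
# Smallest 16:9 screen at which 1080p pixels become individually visible
# from a given distance - i.e. where 4K starts to pay off. Assumes the
# common one-arcminute-per-pixel rule for 20/20 vision.
import math

ARCMIN = math.radians(1 / 60)

def min_diagonal_for_4k_benefit(distance_m, h_px=1920, v_px=1080):
    distance_in = distance_m / 0.0254
    pixel_pitch = distance_in * math.tan(ARCMIN)      # resolvable pixel size
    return pixel_pitch * math.hypot(h_px, v_px)       # scale up to diagonal

print(f'{min_diagonal_for_4k_benefit(3.0):.0f}" diagonal needed at 3 m')
```

Even on the lenient one-arcminute rule you need a 75"+ screen at 3m before 4K does anything for you, which is the nub of the whole thread.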