Energy?
I wonder how much more energy these things take? Presumably at least double the power for what sounds like a minute increase in quality.
Rich.
Almost every TV maker exhibiting at the IFA consumer electronics show in Berlin last week used the event to launch LCD TVs capable of a picture refresh rate of 100 images a second. You couldn't move for bright-eyed booth staff there to demo 100Hz technology and persuade us to replace our LCD TVs already.
I always assumed that LCD TVs updated at 60 or 75Hz, like LCD monitors do.
After visiting a large retailer this weekend and seeing the wall full of large flat HD TVs, I can appreciate the 'try before you buy' approach; the variability in quality, all from the same source, was amazing.
I used to work for a company that produced systems to do the sort of real-time picture motion analysis needed to display a picture at a different rate than it was originated in. A phenomenal amount of work was needed to achieve about 98% accuracy.
Without an accurate idea of what is happening in the picture things go badly wrong.
Sure, it is theoretically possible to work out that in frame 1 the football is at position A and in frame 2 it is at position B, so in your new frame 1.5 you place it halfway between A and B, but it is not so simple! Firstly, if the motion is not linear (say it is not a football, but something moving in a circle), the result will look very strange, because the object ends up moving in a zig-zag!
Big problems also occur when you try to identify the same item in each frame. Say it is not football but snooker: a red ball moves from A to B, but the ball at point C in frame 2 looks more like the ball at A in frame 1 than the ball at point B does, even though B is actually the correct one. Instead of drawing frame 1.5 with a ball halfway between A and B, you put it halfway between A and C!
The problem of handling the awful images you get from many real-life video sources is way beyond the ability of circuitry that can be built into a consumer TV, no matter how much the sales drones simper over carefully selected special cases!
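For what it's worth, here's a toy Python sketch of the matching pitfall described above. Everything in it (the object dictionaries, the colour-only similarity measure) is invented purely for illustration and is nothing like a real motion estimator; it just shows how naively matching "the most similar-looking thing" puts the interpolated ball on the way to C instead of B.

def appearance_difference(a, b):
    # Crude similarity metric: difference in colour only (purely hypothetical).
    return abs(a["colour"] - b["colour"])

def interpolate_position(objects_frame1, objects_frame2):
    """For each object in frame 1, find the most similar-looking object in
    frame 2 and place the interpolated object halfway between the two."""
    midpoints = []
    for obj1 in objects_frame1:
        # Picks the frame-2 candidate that LOOKS most like obj1,
        # not necessarily the one it actually moved to.
        best = min(objects_frame2,
                   key=lambda obj2: appearance_difference(obj1, obj2))
        midpoints.append(((obj1["x"] + best["x"]) / 2,
                          (obj1["y"] + best["y"]) / 2))
    return midpoints

# The red ball moves from A to B, but a different red ball at C looks more
# like it, so frame 1.5 ends up with the ball halfway towards C instead.
frame1 = [{"x": 0, "y": 0, "colour": 10}]            # ball at A
frame2 = [{"x": 50, "y": 0, "colour": 14},           # the same ball, now at B
          {"x": 0, "y": 40, "colour": 10}]           # a different red ball at C
print(interpolate_position(frame1, frame2))          # [(0.0, 20.0)], i.e. towards C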
"Take a look at the screen in a shop, running real TV programmes, first and then make up your mind."
I was wandering around a telly shop the other day looking at big LCDs and doing just that. I got down to the back and there were four large screens, all from different manufacturers, that just kicked the doodahs off everything else in the place for picture quality. They were the plasmas.......
TeeCee
Now, I can't afford to buy a new X3000 or X3500 Sony Bravia 1080p display, so I will probably end up buying a Sony W3000 1080p one, either 40" or 46".
How come Sony put 100Hz on the V3000 720p display and not on the more expensive W3000? To get 100Hz on a new 1080p Sony LCD display I'd need to spend a lot of bucks on an X3000 or X3500 model, which I just can't afford; they are too expensive.
Instead of the pure marketing hype of 24Hz/24fps support, they should have included 100Hz in the W3000 model as well, which is way better than the 24Hz progressive marketing hype stuff.
24Hz/24fps support is just marketing hype because no one in a theatre is watching movies at just 24Hz nowadays; professional projectors at least double the frame rate to 48Hz progressive, and always with motion compensation algorithms engaged (3:2 pulldown and the like is a thing of the past; modern implementations use motion compensation algorithms to recalculate proper motion vectors in the time domain, you know...).
I thought the Hz business with LCDs was not very relevant, which is why most are 60Hz and it makes no difference (certainly no flicker)?
Given the source for HD material is 24 or 25fps, or 50 or 60 fields (depending on progressive or interlaced source), and that LCDs all display progressively anyway, I don't really get 100Hz on an LCD.
Or is this just a marketing-fluff way of saying they have ultra-fast refresh rates, which is something entirely different really?
Comparing it with 100Hz CRTs would be unfair, as 100Hz on CRTs has always been pretty nasty in implementation.
Films don't need to be recorded at anything higher than 24fps because that's the optimum speed for your brain. At 24fps things appear to move at natural speeds; use more fps and the images appear jerky and everything moves too slowly.
More fps is total marketing smack, created to sell video cards and push shoddy TV designs. Point of fact: the newfangled TVs still aren't up to the quality standards of old high-end CRTs. I go shopping about every three months for one of the "new" TVs, but so far the only advantage I see is bragging rights with your buddies.
Films are done at 24fps because that's how fast your eyes/brain are going to be seeing it.
Now with computer games, one frame represents a single, 0-length instant in time so there's a sharp jump from one frame to the next. So 24 frames per second in a computer game looks horribly choppy and we expect games to run 50 to 100 frames per second to look good.
Film, however, has exposure time, so one frame actually represents a longer slice of time. This causes *motion blur*, which is why film looks so much smoother even though we rarely see it above 30 frames per second. Ditto for digital cameras that have exposure time.
So basically, going past 30 fps or so on film isn't done because it isn't needed for it to look good.
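A rough sketch of that exposure-time point, assuming nothing more than NumPy and a made-up one-line "renderer": averaging several instantaneous sub-frames across the open-shutter interval smears a moving dot, which is the motion blur film gets for free and a game's zero-length frames don't.

import numpy as np

def render_instant(t, width=100):
    # A toy "game-style" frame: a single bright dot whose position depends on time.
    frame = np.zeros(width)
    frame[int(t * 90) % width] = 1.0
    return frame

def film_frame(t, fps=24, shutter_fraction=0.5, subsamples=8):
    # Average sub-frames across the open-shutter interval: this is the blur.
    exposure = shutter_fraction / fps
    times = np.linspace(t, t + exposure, subsamples)
    return np.mean([render_instant(s) for s in times], axis=0)

sharp = render_instant(0.2)    # crisp dot: an instantaneous game-style frame
blurred = film_frame(0.2)      # smeared dot: what a film frame records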
Seems like they're tinkering with problems that just aren't there. I wouldn't dream of buying an LCD until they can a) manage to show black rather than dark blue and b) show more than 16 colours.
Spot on, "Too True" - I did a little test in Crummys the other week, looked at all of the large screens in turn to see if I could guess what tech each one used. Got about 90% correct. Plasmas are far better than LCD in general, but ones with full HD resolution are too expensive.
To do a proper test, one would need to, at the very least, play around with the controls on shop floor TVs to back off the brightness/contrast/saturation to sane levels, before attempting to use the fancier controls that they all have.
By this time next year, we'll have SEDs to play with as well...
For my eyes, the major issue with 50Hz CRTs was their incessant and annoying flicker. I seem to be super-sensitive to this: I always found it difficult to watch old-stylee 50Hz CRT TVs, and whenever I sit at someone else's PC with a CRT monitor, I can't bear it if the refresh rate is anything less than 70Hz.
Flicker is never a problem with LCDs. Even if the refresh rate is only 50Hz, a lit pixel does not have enough time to go dark (I suspect it never goes dark at all) between frames to make the flicker visible.
So once the flicker problem is solved, the next biggest problem for me is black levels. I've never seen an LCD TV which ever goes darker than a dullish grey, no matter how it's set up (strangely, even when it's switched off...).
So forget about hyping refresh rates. The biggest sales-draw for me is black levels. And, at the moment, that means plasma. The second a 1080p plasma comes out at a reasonable price, I'm off down the shops....
- Andrew
... to see a decent picture. Too many sets default to +5 brightness, +5 colour on reset. This is to make them stand out in the store, amongst all the other sets. Presumably this is because (like spam) it works. Too many people buy the brightest one.
Sony make (made) a couple of 26in LCDs (Bravia KDL-26S2010, 26S2030). The case was identical between the two, and when both were set to the same settings, the picture was identical. However, press the "default settings" button and one went all bright and garish, the other less so. Why? One was intended for "hi-fi" stores, the other for Currys/Comet/Argos.
"Take a look at the screen in a shop, running real TV programmes, first and then make up your mind."
I was passing a bank of such screens in my local 24 hour Tesco late one night last week. Happened that they'd turned the sound off on most of the TVs, except for one that was tuned to a digital radio commercial station.
Just as I passed (and remember, this was Tesco) it blurted out those immortal words: Sainsburys! Try something new today!
Even more curious, it's not just the TV section that churns out commercial radio at night. Shelf stackers, for some reason, can only work to loud radio accompaniment. So at night commercial radio plays loud and clear adverts for leading competitors on speakers throughout the store.
I wonder if any of their head office staff have ever taken a trip round their stores at night to notice how much they try to divert their customers to competitors at off peak times.
Some people above getting confused between frame rates and refresh rates.
http://en.wikipedia.org/wiki/Refresh_rate
Some DVD player software has frame interpolation for playing movies, which makes horizontal panning look significantly smoother. Unfortunately it also gives me motion sickness :-(
If theaters show films at 24fps, at this so-called perfect rate for your eyes and brain, then it's too slow for me. When there are large, simple objects moving fast on screen I can see the pop between position changes on almost all display devices (except the laggy ones, but we all know the downside of those). The 100Hz is probably just BS marketing for higher refresh rates.
Doubling the refresh rate on CRT TVs is definitely useful for fighting flicker, as each phosphor emits light in a burst, unlike LCDs, which emit a steady glow.
Having two less significant bursts per frame lessens the amount of flicker.
I think the only reason to do it on flat-panel screens is either another marketing ploy or because the 'in-betweening' processes are now very good. Doubtful!
I remember my dad arguing with a salesman who told him a television refreshed at 100MHz.
Also he had an argument with a man in a bike shop who said that one of those training machines you attach your bike to generated no heat.
IMAX has patented most frame rates higher than 24fps, and they are a big obstacle to anyone (other than IMAX) going out and building a film camera that records high fps as its normal speed for playback. IMAX films are usually projected at 72fps (especially stereo), allowing the extra information of stereo imaging while still getting a stable image on a bigger screen. They also often use 60mm plates and run the film sideways instead of vertically through the projector.
The amount of recorded motion blur depends on the shutter angle; a 360-degree shutter angle would be a constantly open shutter (only possible on CG images, otherwise the film wouldn't have time to move to the next frame). Productions like Band of Brothers and Saving Private Ryan used a very narrow shutter angle to cut back the motion blur in any one frame, introducing a stutter to the images as an artistic effect. The explosions didn't streak like other fast-moving images would, but seemed to have much more detail as there was little blur.
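The shutter-angle arithmetic is simple enough to write down: exposure time per frame is (angle/360) of the frame interval, so a narrow angle means a much shorter exposure and far less blur per frame. A quick sketch, numbers only:

def exposure_time(fps, shutter_angle_degrees):
    # Fraction of the frame interval during which the shutter is open.
    return (shutter_angle_degrees / 360.0) / fps

print(exposure_time(24, 180))   # ~0.0208 s: the conventional film look
print(exposure_time(24, 45))    # ~0.0052 s: much less motion blur, more stutter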
Reasonable motion estimation and image interpolation hardware has been around for decades; the old Sony Digi-Beta tape decks had a rudimentary image interpolation that was good enough to watch slowed-down video with. For the inter-frame interpolation of a 100Hz display you don't actually need very high-quality interpolation to achieve the desired effect. It's only if you're retiming with very large time differences that you'd need a longer temporal sample than two frames and something nice like per-pixel optical flow analysis.
It's also possible to add motion blur to images using this interpolation method, so it could be used for smoothing and softening as well as just sharpening.
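As a very hedged sketch of the per-pixel approach mentioned above, here's what halving a dense flow field looks like in practice, using OpenCV's Farneback flow purely as an example tool (not what any deck or TV actually runs) and ignoring occlusions, which is exactly where the "previously hidden items" problems come from:

import cv2
import numpy as np

def interpolate_halfway(frame_a, frame_b):
    """Warp frame_a roughly halfway towards frame_b using dense optical flow.
    Crude: one flow field, no occlusion handling, so newly revealed areas smear."""
    grey_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    grey_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(grey_a, grey_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = grey_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Backward-warp approximation: sample frame_a half a motion vector "behind"
    # each output pixel, treating the flow as if it were defined on the new frame.
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)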
Practically speaking, is the ghosting effect what happens when I see "static texture" within a frame that's only moving slightly? I thought that was compression in the source; I had no idea it was the TV. I see it all the time and it's very distracting.
@24fps: You are right; motion estimation/compensation and the necessary interpolation algorithms have to work in time and space faster than real time, and are usually loaded onto very fast DSPs (single or even multiple) to do the job properly and with the needed accuracy.
The "static texture" effect you are referring to could be caused by bad or buggy algorithms in the firmware of many displays sold by various manufacturers, or by too little DSP power to achieve real-time results at the highest precision, which can cause spatial/temporal artefacts, the so-called aliasing effect. Or it could be a badly windowed set of filters in either the time or space domain (obtaining high-precision windowing for filters is a hard task, and even harder if raw computational power is limited by production costs, i.e. if cheaper DSPs get used).
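On the windowing point, here's a minimal illustration (Hann window chosen purely as an example) of why the window matters: a truncated sinc interpolation filter rings unless the truncation is smoothed, and that ringing is one source of the artefacts described.

import numpy as np

def sinc_filter(num_taps=21, cutoff=0.25, window="hann"):
    # Truncated ideal low-pass (sinc), optionally tamed with a Hann window.
    n = np.arange(num_taps) - (num_taps - 1) / 2
    taps = np.sinc(2 * cutoff * n)
    if window == "hann":
        taps *= np.hanning(num_taps)      # smooth the truncation: less ringing
    return taps / taps.sum()              # unity gain at DC

rectangular = sinc_filter(window=None)    # crude truncation: rings/aliases more
windowed = sinc_filter(window="hann")     # better behaved for the same tap count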
I'm puzzled by the current trend to 5ms (or faster) response times of LCD monitors. I've always thought that the slower response was a great feature, since it eliminates flicker.
Admittedly, I don't play computer games, and I can live with slight blurring of DVDs - if it means that when I'm reading/writing text, there is no flicker at all.
Otherwise, we could go back to 60Hz CRTs!
Refresh rate != Response time.
The whole deal is different on LCD cf. CRT.
On a CRT, the 100Hz was a major boost because the pixel was "lit" only when the electron beam swept across it. From that moment on that pixel gets steadily darker till it goes completely black. So your eye perceives a much more steady picture if each pixel is lit 100 times per second, compared to only 50.
On an LCD, each pixel stays at *whatever colour it was last set to* indefinitely - until it is asked to change colour. Changing colour may mean going darker, or lighter.
The "smearing" that LCDs subject you to is more about the "response time" of the pixels. I.e. from the moment they are asked to go black, till when they *actually* go black is a fixed time (usually quoted in ms).
So an LCD tv with a faster pixel response time will *automatically* reduce the horrible smearing that the early sets produced. No 100Hz tricks required.
I think this is just a marketing thing. My bet is that these TVs are not doing anything clever at all, (as someone pointed out, actually accurately interpolating intervening frames is a mammoth task) just running with short pixel response times so that any changes on-screen take place quickly without smearing.
NOT to be confused with 100Hz technology as used on CRTs.
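A toy model of that difference, with completely made-up time constants, just to show the shape of it: the CRT pixel is a repeating flash that decays between refreshes (hence flicker, hence the benefit of 100Hz), while the LCD pixel just slews towards its last requested level and then holds it.

import numpy as np

def crt_brightness(t, refresh_hz=50, decay_s=0.004):
    # Brightness decays after each refresh flash, then jumps back up.
    time_since_flash = t % (1.0 / refresh_hz)
    return np.exp(-time_since_flash / decay_s)

def lcd_brightness(t, target=1.0, response_s=0.016):
    # Pixel slews towards its requested level, limited by response time,
    # then simply holds it: no flicker, but slow changes smear.
    return target * (1 - np.exp(-t / response_s))

t = np.linspace(0, 0.1, 1000)
crt = crt_brightness(t)   # sawtooth-like flicker at the refresh rate
lcd = lcd_brightness(t)   # smooth rise to a steady level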
I think we can all agree that FPS has nothing to do with flicker.
Raheim Sherbedgia, Joe Cooper,
Thanks for your replies. Unfortunately, what you said is just not true, at least not for me anyway. I have a lot of video material in both 720p24 and 720p60 (some of it the same footage) – WHAT A DIFFERENCE!!!!
60FPS is comparatively like looking through a real window; it is sooo much more believable (although it could possibly make some people sick). Returning to 24FPS makes the video seem jerky; it looks awful once you've sampled the higher rates. My cinema experiences have never been the same since.
Also, the picture seems less detailed at lower FPS, which is not a good feature for an HD movie, and there's a good explanation for that: your eye tracks linearly something that is moving in steps, so the subject 'saw-tooths' back and forth across your retina. Higher FPS reduces the step size, so the subject appears less blurred and you can make out more detail.
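The step-size point is easy to put numbers on. Assuming, say, an object crossing a 1920-pixel-wide frame in two seconds:

def step_size_pixels(frame_width=1920, crossing_time_s=2.0, fps=24):
    # How far the object jumps between consecutive frames.
    return frame_width / (crossing_time_s * fps)

print(step_size_pixels(fps=24))   # 40 pixels per frame
print(step_size_pixels(fps=60))   # 16 pixels per frame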
I play my PC games at 60FPS, smooth as silk. I've also seen games running at 30FPS on a 60Hz display, and I can easily see that the half-rate setup has less motion detail.
I've also interpolated some video material using Dynapel video interpolation software (just as the TVs in the article do); the results weren't great (problems with previously hidden items coming into view), but the resulting video stream was so much smoother than the original. I preferred it.
Ask yourself this: if you can't perceive extra detail above 24FPS, why do broadcasters (especially for live events) still use the 'lower quality' interlaced format?
To re-iterate what another poster said: don’t knock it until you’ve tried it!
Did you notice how ‘filmised’ the later series of Red Dwarf was? A great many did!
Adding motion blur to video is a cheat and kinda defeats the object of HD.
Good on James Cameron – I say bring it on!
I think the previous poster is right about pixel response rates.
Too many LCD TVs still blur moving objects too much for me to consider them, although I saw a Panasonic one in a shop recently that looked quite a bit better than the others in this respect. My dad has bought himself an absolute shocker of a cheapo 32" LCD. It manages to burn out highlights AND turn shadows to black at the same time. I did my best with the settings, but there was nothing I could do about people's faces turning to mush when they moved and then snapping back into sharp detail when they stayed still.
Every year the chipset makers try to justify their latest super kit as better than the last. They had a lot of old IP sitting around that they used on CRTs, so they dug it out, stuck it on last year's LCD chipset and flogged it to the ODMs. Now at IFA we see the latest kit... well done El Reg for saying it how it is.
(Okay, most of this is conjecture, but I used to work for an LCD chipset maker...)
1) It's fairly obvious to me that perception of movement depends on how far the image of a thing moves, physically, per frame. That's why a talking head in a movie theater can appear perfectly smooth but panning across a landscape can appear horribly choppy. (Yet smooth again on a TV.) I can't imagine that there's a magic "optimal" number of FPS that makes everything seem smooth, regardless of how big it's projected and what's going on in the frames, unless that number is in the hundreds.
2) Nothing especially sophisticated has to be done to display intermediate frames for MPEG encoded video. Most frames are described as deltas from the previous frames, with certain rectangles being moved and transformed in certain ways. Just divide these transformations by two and you get double the framerate. It'll all look like a bunch of moving rectangles but that's what MPEG video looks like anyway.
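Here's the idea in point 2 reduced to a toy, with block IDs and vectors invented purely for illustration; as another poster explains further down, real MPEG motion vectors can't actually be reused this naively.

def halfway_frame(block_deltas):
    """block_deltas: list of (block_id, dx, dy) moves since the previous frame.
    Halving every move gives a notional in-between frame."""
    return [(block_id, dx / 2, dy / 2) for block_id, dx, dy in block_deltas]

frame_deltas = [("sky", 0, 0), ("car", 16, 0), ("ball", 8, -4)]
print(halfway_frame(frame_deltas))
# [('sky', 0.0, 0.0), ('car', 8.0, 0.0), ('ball', 4.0, -2.0)]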
... the Teleophiles
People claiming that the higher frame rate/refresh rate of Brand X makes a noticeable difference sound suspiciously like the audiophile arguments that have bored me for years. As with audio (as Apple et al found out), someone will figure out that what the great unwashed (and with regard to TV that includes me) want is 'just good enough'.
Now maybe some kind company would broadcast something worth watching on these super-TVs. Or find a way of making all those YouTube vids watchable :)
I am quite sensitive to flicker, like others above I get a headache looking at a PC monitor running at the default 60Hz.
I have what was once a state-of-the-art Sony 32" Wega flat 100Hz TV, model FX60. I found that I had to turn off the clever 100Hz processing because fast-moving sports like tennis looked very jerky; especially with digital TV, it magnified the MPEG artefacts very badly.
The "noise reduction" was also a curse. It still runs at a 100Hz refresh, but when set to dumb mode it doesn't have any flicker at all, and quality is pretty good.
I'm holding out before replacing it: plasmas suffer screen burn (though not as badly as older ones) and the good LCD TVs are still £1500+!
@Paul: the jerky picture you are getting on your old Sony Wega is simply down to the fact that the motion compensation algorithms had to be kept pretty basic and at quite low precision in order to run on cheap DSPs, offloading chipsets and usually MIPS-based CPUs (or all of them in a single package, like the now-common Mediatek chipsets found in DVD/DivX players), simply due to production costs. The better the algorithms and the higher the precision, the more powerful the processor/DSP needed, and at the time those 100Hz CRTs were developed, 16-bit DSPs were still pretty common and only just being replaced by 32-bit ones.
There are two main factors in the quality of a 100Hz implementation: the quality of the algorithms used per se, and the quality of their implementation in the firmware running on the CPUs/DSPs/chipsets on the TV's internal boards that process and display the signal. More complex results need faster and more expensive hardware, and that's the point. It's also the simple reason why some TFT displays have an excellent picture and a 100Hz feature that works perfectly, while others are just not up to the task.
Of course, it would help if manufacturers like Sony started putting firmware updates freely on their websites and delivered bug fixes to the public, since modern plasma and TFT sets could easily be flashed through a USB port, if they have one, or even through HDMI if they were designed for it. It's just a question of why manufacturers don't deliver as they should, and prefer having frustrated customers who have to deal with programming bugs in their expensive TV displays.
LCD is cheaper to make... but harder to make good.
In Full HD (that's the pointless 1080p for the cockswingers amongst us) LCD can be impressive, but next to a plasma there is such a pronounced difference it's untrue; plasma can portray a degree of naturalness (is that a word?) that LCD cannot manage. LCD always looks vivid and bright, which makes it eye-catching (and, after a period of time, eye-straining), so if you walk into a TV store your eyes are usually drawn to the over-saturated, bright LCDs. Spend the time watching and you'll realise a plasma is far better; as above, at least it can display black as black!!!
Go and look at a Pioneer 4280D plasma. Sure, it's £1500, but you won't find a home TV that offers anything like the quality, detail, depth of colour....
24 fps has been standard for 35mm film since the start. Increasing this would be a mechanical nightmare. 24 times a second, the film must be moved while the shutter/gate is closed (which must be as quick as possible to avoid missing motion) and then brought to a dead stop before the gate opens again. This puts enormous strain on the sprockets and the holes in the film. This wouldn't be so bad if it just needed uprated cameras, but the same thing happens in the movie projectors too. They would all need replacing with beefier models, the films will have to be printed on stronger plastic, and all the splicing tables would need to be bigger (a 4 reel film becomes an 8 reeler etc.).
There's no harm in shooting at double speed, though, if it's all going through digital intermediates (as most movies are now), but the 24fps print could possibly show double images where the shutter was closed.
To the person who said you can interpolate MPEG easily:
Nope, absolutely not true. MPEG motion vectors apply to square chunks of the display and rely on additional picture information being sent to correct errors caused where moving objects don't line up with these squares. Also, the motion vectors don't have to bear any relation to the actual motion in the scene - they are chosen to give the best compression. Whilst most of the time they will track real motion, there are a significant number of cases where they won't.
Now, getting back to the 100Hz issue:
There are two distinct approaches to 100Hz. Some sets may interpolate motion; others simply insert darkened frames to introduce a level of flicker. LCD stutter is caused by the eye lingering on each frame rather than tracking smoothly to follow motion; adding flicker solves this problem by removing the image before the eye 'settles' on a particular frame.
With the flicker approach, you end up with a smoothness of motion similar to that seen on a 50Hz CRT. As with the CRT, 50Hz sources look nice and smooth, whereas film sources judder, just as they do at the cinema. The big downer, of course, is that you have perceptible flicker, which is one of the things LCDs were supposed to have got rid of.
Interpolation will smooth all sources; however, it's impossible to interpolate all material and there will always be a significant number of cases where the extra frames are "wrong", producing unpleasant visual effects.
What's best? In a world where TV producers would forget their stupid obsession with "filmising" everything, the flicker method will look fantastic. Unfortunately, there's an awful lot of 25Hz material being broadcast, so you don't really get much benefit in most cases. It'd still be my choice, however, as 100Hz interpolation artefacts are simply horrible.
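For the curious, the two approaches boil down to something like this sketch, with frames reduced to plain numbers and the "interpolation" shown as a simple average where a real set attempts motion compensation (with the artefact risk described above):

def dark_frame_insertion(frames_50hz, dark_level=0.2):
    # Alternate each source frame with a darkened copy: adds flicker, which
    # stops the eye settling on a held frame.
    out = []
    for frame in frames_50hz:
        out.extend([frame, frame * dark_level])
    return out

def interpolated_100hz(frames_50hz):
    # Insert a synthesised in-between frame; a plain average stands in here
    # for the motion-compensated picture a real set would try to build.
    out = []
    for a, b in zip(frames_50hz, frames_50hz[1:]):
        out.extend([a, (a + b) / 2])
    out.append(frames_50hz[-1])
    return out

print(dark_frame_insertion([1.0, 0.8, 0.6]))   # roughly [1.0, 0.2, 0.8, 0.16, 0.6, 0.12]
print(interpolated_100hz([1.0, 0.8, 0.6]))     # roughly [1.0, 0.9, 0.8, 0.7, 0.6]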
Nonsense to the people who say 24fps is optimum. It nearly always takes me a few minutes to get used to the horribly slow frame rate at the cinema. I use a 21" trinitron at a rock-solid 85Hz all day at work. I'm not hyper-sensitive, although I do notice 60Hz CRT computer screens.
Anyway, I always thought 100Hz TVs just refreshed the same image faster, like 24fps video inside a window on a computer running at a much faster refresh rate. If they are doing temporal interpolation I am very disappointed, no matter how clever they are. It means you really can't trust what you see on TV to be the truth! Speed up the sources instead.
And on the 'filmified' Red Dwarf: it looked appalling. It even made the lame jokes worse. A bit like Hyperdrive. And Dr Who. None of which benefitted in any way from being 'filmified'. I think they look worse for it and we'll look back at this trend like we look back at the horrendously over-lit colour TV of the late 70s and early 80s.
Further to the problems highlighted with direct MPEG interpolation: how do you get that information into a TV? Unless it has an inbuilt DVD player and cable/satellite decoder, there's no way it can be done. All it can do is interpolate the video stream as it comes in.
As for processing power: I use the Elecard Moonlight player to 'double frame rate' convert my 1080i30 sources to effectively 1080p60, and all I can say is wow!! However, it needs all of my (fairly modern) PC's processing power to do it in real time, and that's just filling in the alternate lines. Imagine what full frame interpolation will need; I sincerely doubt a realtime frame interpolator squeezed into a TV will be of good quality, certainly not at a reasonable price anyway. Granted, specialised hardware could do it, but that won't be cheap either.
Let’s cut out the middleman – let’s process video at 60FPS.
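For anyone wondering what "just filling in the alternate lines" amounts to, here's a rough NumPy sketch of simple bob deinterlacing (ignoring the half-line field offset and all the adaptive cleverness real players use): each field of a 1080i frame becomes a full-height progressive frame, so 30 interlaced frames a second come out as 60 progressive ones.

import numpy as np

def bob_deinterlace(interlaced_frame):
    """interlaced_frame: 2-D array whose even rows are the top field and odd
    rows the bottom field. Returns two full-height progressive frames."""
    full_height = interlaced_frame.shape[0]
    top_field = interlaced_frame[0::2]
    bottom_field = interlaced_frame[1::2]

    def stretch(field):
        # Linearly interpolate the field back up to full height.
        src_rows = np.linspace(0, field.shape[0] - 1, full_height)
        lower = np.floor(src_rows).astype(int)
        upper = np.minimum(lower + 1, field.shape[0] - 1)
        frac = (src_rows - lower)[:, None]
        return field[lower] * (1 - frac) + field[upper] * frac

    return stretch(top_field), stretch(bottom_field)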
Matthew Smith,
I thought movie cameras and cinema projectors were going all-digital? If so, all the mechanical shutter problems will be eliminated.
Motion picture film used to be very slow; trying to squeeze through more frames a second might not have allowed enough light to fall on the film to properly expose the image. I assume film technology has moved on, but unfortunately there is still this legacy issue to deal with. Some of these digital picture analysers (see Snell & Wilcox) cost in the tens of thousands of pounds, and there is no way even Sony are putting that kind of power into a £2000 TV set. What can they do when the original transfer to digital was done badly in the first place?
For years they have been advertising HDTV sets in Taiwan and China (where I live) for you to watch your horribly compressed NTSC MPEG-1 cable TV on. Hilarious. An ITT tube set from 1978 gives a much more appropriate picture quality for this sort of junk-quality source material. (My folks in the UK still have such a set, and as a PAL TV it gives a much better picture than is necessary for the NTSC cable broadcasts I watch in Taiwan!)