back to article The future of cinema and TV: It’s game over for the hi-res hype

Digital video guru and author of The MPEG Handbook, John Watkinson, examines the next generation of TV and film and reveals it shouldn't be anything like what we're being sold today. The next time you watch TV or go to the movies, bear in mind that you are not actually going to see any moving pictures at all. The movement is an …


This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    Generally I've been in favour of this for a long time - upping the frame rate rather than paying attention to resolution, that is.

    Going to HDTV made a lot of sense - even with my poor eyesight I could see the pixels - but UHDTV seems a little excessive for little gain. I'd much rather see an increase in framerate than an increase in resolution. Heck, that was all the buzz for a short while: a number of TVs output 300fps or higher using the same frame-doubling methods used in movies. If we could see a push for higher framerates backed by a large industry body, and mirrored by Hollywood, it'd probably be far better for us than the current increase in resolution.

    in conclusion 120fps > uhdtv

    1. Test Man
      Thumb Up

      The thing is, technically of course upping the frame rate would make the picture much much better in comparison to upping the pixel count, but would the masses accept it?

      We've already seen things where the technically better solution is generally rejected, as recently as the Xbox One digital sharing feature that, amongst others, was dropped. There are also some detractors who don't like the look of the 48fps version of The Hobbit.

      So I'm not sure if people are ready to accept a 120fps UHDTV film, for example. Still, it might make live broadcasts look much more vivid. And it might make those silly 100Hz interpolation features on some current HDTVs obsolete (hate them - the picture always looks naff and it ruins "the feel" of footage that was shot on film).

      1. Dave 126 Silver badge

        I haven't seen The Hobbit at 48 fps, but I have noticed strobing in the cinema in most of the big action 'event' movies of the last ten years... especially in big action tracking shots, such as in Star Wars III.

        Ridley Scott seemed to have embraced the strobing in the opening battle scene of Gladiator, to give the viewer an impression of how the participants in the battle might feel disorientated.

        Good article.

        1. bonkers
          Thumb Up


          The article makes a good case for higher framerates and I agree totally. However, we already have a solution that will halve the pixel count and double the framerate - it's our old friend interlacing. A 1080i (note i, not p) screen is what we need within the existing frameworks, then move to higher non-interlaced framerates.

          Interlacing is a really good method, but only if the fields are shot at 48/50/60fps, as with a video camera - it's rubbish if the source is 24fps film, since the second field is simply delivered late. Interlacing with, say, 2 x 540 lines (1080i) conveniently bridges the dilemma of motion versus detail, since on static shots it's indistinguishable from 1080p.

          It is justifiably unpopular in the codec world because you don't know whether the two fields are part of a single exposed image or two separate exposures, so there is no easy way to benefit from the second field - shot at a different time - when trying to convert from i to p. Also, converting 1080p to 1080i looks blurry and is a waste of time, since the temporal information is missing.
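
          To make the i-to-p problem concrete, here's a rough Python sketch (mine, with made-up helper names, not anything from the codecs themselves) of "weave" deinterlacing: interleaving the two fields reconstructs a static shot perfectly, but if the fields came from two different exposure times you get the combing artefact that deinterlacers have to guess their way around.

```python
import numpy as np

def split_fields(frame_a, frame_b):
    """Field 1 = even lines of exposure A, field 2 = odd lines of exposure B."""
    return frame_a[0::2], frame_b[1::2]

def weave(field1, field2):
    """Interleave the two fields back into a full-height frame."""
    h = field1.shape[0] + field2.shape[0]
    frame = np.empty((h,) + field1.shape[1:], dtype=field1.dtype)
    frame[0::2] = field1
    frame[1::2] = field2
    return frame

# Static scene: both exposures identical -> weave reconstructs it perfectly.
static = np.arange(8 * 4).reshape(8, 4)
f1, f2 = split_fields(static, static)
assert np.array_equal(weave(f1, f2), static)

# Moving scene: second exposure shifted one pixel -> the woven frame matches
# neither exposure, and adjacent lines disagree (the combing artefact).
moving = np.roll(static, 1, axis=1)
f1, f2 = split_fields(static, moving)
combed = weave(f1, f2)
assert not np.array_equal(combed, static) and not np.array_equal(combed, moving)
```

          The point being: from the woven frame alone you can't tell which case you're in, which is exactly the ambiguity described above.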

          1. bonkers

            Re: interlacing

            Downvoted twice - any reason?

            I can't see why; it's a balanced presentation of the case for and against interlacing. Interlacing is an easy, existing method of increasing framerate at the expense of "specmanship" resolution, but there are issues with exactly what information it holds that make it difficult to know how best to upscale or interpolate the video.

            1. Jim 59

              Re: interlacing

              Oh dear. It looks like we have a rampant down-voter in here. Many folks have been downvoted for no reason, e.g. Test Man, Bonkers, me, Dave 126, karlp and several ACs so far. It seems that if you mention a number in your post, the troll will down-vote you that many times. Funny, huh?

              El Reg forums have hosted high-quality discussion and comment over the years, and have become almost as amusing and informative as the magazine itself. I hope it doesn't fall to the trolls. It could end up like YouTube comments, which are so bad that most of them are NSFW.

              1. .stu
                Thumb Down

                Re: interlacing

                I downvoted you because you are making too much of a fuss about it.

          2. Charles 9

            Re: interlacing

            There's a BIG problem with interlacing today that becomes apparent with high degrees of motion: tearing. Basically, if the motion is too fast, when the interlaced picture is reconstituted we can actually perceive the different fields.

            It's for this reason some networks in the US broadcast sports (infamous for high amounts of motion) in 720p instead of 1080i.

          3. ridley

            Re: interlacing

            One problem with interlaced is when a moving image is displayed on a large screen the difference between the images becomes visible as a herringbone pattern around the edges of the moving object.

        2. Jim 59

          The above comment was down voted why ?

          Why was Dave 126 down voted there ? Somebody having a bad day?

          1. Rampant Spaniel

            oh joy, so we have to shoot at 1/240 instead of 1/48 :-) Less need for NDs, but it will hurt in low light.

          2. Jim 59

            Re: The above comment was down voted why ?

            Phantom down-voter got to my post within 12 minutes. Impressive. Timing this one...

        3. Anonymous Coward
          Anonymous Coward



          That's why he got a downvote from me.

          1. Tubs

            Re: "disorientated"


            Disorientated is currently correct in English.

            Disoriented is the American English variant.

            However, when the soon-to-be Americans left our soil for lands free of imposed religion, we - in old English - also used the word "disoriented". It is we, the English, who are at odds for changing the word, while the Americans retained the original.

            I blame the French!

            1. Steve I

              Re: "disorientated"

              Unless you're Asian, then it's "Disorientaled"...

            2. Uffish

              Re: "disorientated"

              Kudos for your erudition. I just checked the online version of Johnson's dictionary - he has "disorientated" so it's good enough for me.

          2. Robert Baker

            Re: "disorientated"

            What's wrong with disorientateteted?

            Seriously, "disorientated" means that you've lost your orientation; "disoriented" means that you've gone off Chinese food, or no longer support Leyton Orient FC.

            Still, dumb downvotes are no strangers to these boards; I've been downvoted for pointing out the (unpopular but true) fact that copyright violation is now a crime, as if voting against this fact will somehow make it false. It won't; piracy is indeed now a crime, and no number of votes against this statement will cause those laws to repeal themselves.

            1. veti Silver badge

              Re: "disorientated"

              To a serious pedant, 'disorientate' means 'to turn away from the east'. To lose one's sense of relative position or direction would be 'disorient'.

              However, you need to be a real, table-chewing-level pedant to actually care about that. 'Disorientate' has been sanctioned by over 200 years of usage and, I think, every serious dictionary. The only reason to continue to hate it at this point is if you're the kind of person who gets really, really worked up about redundant syllables, and in that case "utilise" would be a much better target for automatic downvoting (it's "use", dammit).

      2. Greg J Preece

        We've already seen things where technically the better solution is generally rejected, even as recent as the Xbox One digital sharing feature that, amongst others, was dropped.

        ....and then turned out to apparently not be nearly as nice as MS made it out to be. But I'm sure their glorified demo mode was worth it for all the inconveniences it brought.

        No-one rejected digital sharing. They rejected the excessive bullshit the console heaped on the consumer, and when they did, MS threw a tantrum and took digital sharing away as punishment. There is literally no reason at all why they couldn't have both.

        1. karlp

          ...... and you turned out to be wrong in your correction.

          The digital sharing thing wasn't what some people made it out to be, but it was very cool, and was not, to be clear, a demo of any type.

          While MS didn't respond as some may have hoped, the reality is that this topic is complicated and involves both technical and human (read: Contracts and agreements) angles.

          While not perfect, MS seemed to have the best concept of a digital ecosystem I have yet seen. I and my friends preordered multiple consoles because of it. We are still debating whether we want them anymore.

          We desperately hope they bring them back, and have reached out to a number of senior MS execs to express that. Hopefully we will see a return at some point in the future, or at the very least a better compromise.

          Karl P

      3. JEDIDIAH

        Confused Lemming.

        > We've already seen things where technically the better solution

        You are confusing technical superiority with personal preference.

      4. Daniel B.

        Oh you had to ruin your argument.

        We've already seen things where technically the better solution is generally rejected, even as recent as the Xbox One digital sharing feature that, amongst others, was dropped.

        You had to choose the worst example? We've got x86 vs. RISC, 30/60fps vs. 24fps, full dynamic range audio vs. Dynamic Range Compressed loudness, watching SDTV content in 4:3 vs. "stretch-o-vision" grossness, and you choose the draconian DRM thing as an example?

        There's also some detractors that don't like the look of the 48fps The Hobbit film.

        This is a much better example. I've heard a lot that >24fps movies look too weird to the human eye. Others say they don't like HD because you can see all the imperfections on the actor's skin or whatever. I'd rather have better image/motion resolution, and yes we should be doing real 60fps filming by now!

    2. Naughtyhorse
      Thumb Down


      Are you talking about pixels or compression artifacts, or is your eyesight so bad you need to sit with your nose pressed against the screen?

      Oddly enough, the people who invented TV _DID_ think long and hard about human vision, and did investigate it. I think maybe there is a difference between research not being done and the author not being aware of the research that was done. The system developed in Europe {insert your own joke about Never Twice the Same Colour - that one was broke straight out of the box} considered cone angles, persistence of vision and typical eye acuity, and also engineered an analogue standard that worked with 1950s technology and lasted roughly 50 years - flexible enough to accommodate colour, NICAM stereo, teletext and Macrovision, hell even SMPSUs, without ANY revision to existing equipment. (Of course, existing equipment could _use_ colour, NICAM etc. etc.)

      That said, I do agree with the basic principle of 'enough with all the dots already, can't you make the ones we have move smoother', although I had tended to ascribe this to the smoke-and-mirrors stunt of delta compression algorithms all going tits-up when every frame is significantly different from the last.

      PS: you are aware that a light flickering at 50Hz appears, to almost everyone, as a constant unvarying light? So a 120Hz(p) screen starts to sound a lot like Monster Cables bullshit. Bigger number, so it must be better... now where have I heard that recently?

      1. Danny 14

        Re: orly

        But a lightbulb IS a constant stream of light; it isn't being chopped 50 times per second, it is merely powered by an oscillating sine wave 50 times a second. If it were a digital light switching on and off 50 times a second (shuttered rather than refreshed) with no lag, you would probably notice something.

        Cinefilm was exactly that, presentations of shuttered stills relying on your eye to maintain the previous scene.

        1. h3

          Re: orly

          Now, interestingly, when it comes to lightbulbs, the new energy-efficient ones give me migraines, but the old tungsten ones didn't at all. (I have tested it as well as I can.) I think the light is different. (I am somewhat alright if they are covered, but if they are not I get migraines.)

          1. Robert Forsyth

            Re: orly

            Tungsten bulbs still vary in intensity, but they do not go completely dark.

            Fluorescent tubes flicker in your side vision, but stop flickering when you look directly at them.

            Tungsten's spectrum is smooth from red to blue, whereas others tend to have gaps and be more blue.

            In the eye, the quick acting rods are more sensitive to yellow.

            1. Nick Ryan Silver badge

              Re: orly

              Just to add to the flickering (light) topic here... human eyes have considerably more movement sensors on the periphery than in the centre, interestingly balanced by having almost no colour sensors in the periphery, where we see in monochrome and the brain fills in the detail with what it remembers (or guesses from experience).

              As a result, many household bulbs don't flicker when looked at directly, but look (!) at them from the corner of your eye and you'll see the flicker. This flicker can also be seen when the light is reflecting off a surface. It's one of the (many) causes behind offices fitted with fluorescent bulbs giving staff headaches. Interestingly, the flicker is also one of the reasons these bulbs often come in pairs (or more): this not only gives fail-over in the event of tube failure but reduces the impact of the flicker by masking it with neighbouring tubes.

              Incandescent bulbs also flicker due to the power supply frequency, but the effect is negligible because they operate by heating an element, and this element does not cool enough between cycles for the flicker to be noticeable. However, you can make the flicker visible if you use a high-output bulb and reduce its output to minimal using a dimmer switch.

        2. This post has been deleted by its author

        3. Matthew 3

          Re: orly

          If you'd like to see the effect of alternating current on a filament bulb take a look at The Slow Mo guys' video. Well worth a look.

        4. Naughtyhorse

          Re: orly

          fluorescent isn't constant

          and NO, you wouldn't notice something (in fact you don't notice something... much - that's part of why fluorescent light is unpleasant), which is kinda why I said as much.

        5. Anonymous Coward
          Anonymous Coward

          Re: orly

          A lightbulb powered at 50Hz actually flickers at 100Hz, because the polarity of the voltage does not matter; put a diode in series with the bulb and you will see flickering at 50Hz.

          The 100Hz flicker from a 60W lightbulb is about 8%, so it is close to a constant source of light.
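
          A quick sanity check of the 100Hz figure (my sketch, not the poster's numbers): a filament's light output roughly tracks instantaneous power, which for a 50Hz sine goes as sin^2 - so it peaks 100 times a second.

```python
import math

# Instantaneous power of a 50 Hz sine is proportional to sin^2(2*pi*50*t),
# so the brightness ripple repeats at twice the mains frequency.
f_mains = 50.0
samples = 10_000          # one second of samples
power = [math.sin(2 * math.pi * f_mains * i / samples) ** 2
         for i in range(samples)]

# Count the local maxima of the power waveform over that one second.
peaks = sum(1 for i in range(1, samples - 1)
            if power[i - 1] < power[i] >= power[i + 1])
print(peaks)  # 100 -> the bulb's ripple is at 100 Hz, not 50 Hz
```

          Put a diode in series and you keep only alternate half-cycles, which is why the flicker then drops back to 50Hz.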

      2. Vic

        Re: orly

        > a light flickering at 50 Hz appears, to almost everyone, as a constant unvarying light

        That depends on how you look at it.

        Human vision is essentially made up of two parts - the high-resolution, full-colour bit in the centre of your vision (the "foveal view"), and the low-resolution, monochrome bit[1] that surrounds it.

        The monochrome sensors are *much* more sensitive and *much* faster to react; they may well see a 50Hz[2] LED as flickering, but once you turn your eyeballs to look at the light, you will see it as steady...


        [1] Most of what you can currently see is actually viewed in monochrome, with your eyes & brain filling in the colour. There's an excellent party trick you can show people - get someone to look straight ahead, and slowly bring into their view a white card with a vivid red dot on it. Ask them not to look at the card, but to tell you what colour the dot is. It's black. As you move the card towards the centre of your subject's vision, the dot will suddenly change to red - and you can then back it out along the same path and it remains red.

        [2] I can't remember the cutoff frequencies, but a 50Hz TV will flicker visibly when you're not looking at it, and stabilise when you do. I sometimes wonder if that means that content on 50Hz broadcasts needs to be more enthralling, so that you never look away :-)

  2. Pete 2 Silver badge

    It's not what you see that counts

    > ever higher pixel counts, an approach that disregards how we actually see moving images

    ... it's what you can sell.

    Let's face it, while some people sit close enough to their screens to make high definition worthwhile, most don't. Just as most people don't watch TV or media players in a perfectly darkened room, so specifications for contrast ratios are irrelevant. Likewise for pretty much any other specification-led consumer product: cars, hi-fi, computers, cameras, phones - the list goes on.

    The problem is that you can't say or show yer average consumer a "better" product and just have them see/hear/smell/taste/feel that it's better. If you're lucky they might just recognise that it's different, though since change is often unwelcome or even plain bad <cough>3D</cough>, that's a double edged sword.

    No, it's far better to bamboozle them with figures - occasionally even relevant figures - of real or imagined origin to "prove" that your product is better than the other guy's. Luckily, science and technology education is so mind-numbingly bad that only a tiny fraction of the population has any chance of knowing what a specification means, and almost nobody, anywhere, ever has any chance of validating those specifications (though I did work for an electronics company a long time ago - one of the engineers took his stereo amp back to the shop after running bench tests with our mil-spec test gear).

    Tech specs aren't the only way of persuading the public, though. While some credulous techies might be taken in by the babble that accompanies them - often promoted by magazines in their reviews - most people just get hostile when confronted with pages of meaningless numbers. For them the solution is just to wrap up the goodies in an attractive enclosure - or failing that, a shiny cardboard box will do.

    1. Anonymous Coward 101

      Re: It's not what you see that counts

      "... it's what you can sell."

      See the drivel in Hi-Fi magazines for examples.

      1. Anonymous Coward
        Anonymous Coward

        Re: It's not what you see that counts

        Not to single out Monster Cables, but I'd like to single out Monster Cables.

        1. Anonymous Coward
          Anonymous Coward

          Re: It's not what you see that counts

          "Not to single out Monster Cables, but I'd like to single out Monster Cables."

          You know you really want some oxygen free gold hyper core mark 2, special edition with sheathing of pure unicorn intestine.

          1. Rampant Spaniel

            Unicorn intestine sheathing? That's so last month - you need the pixie love juice version with sheathing woven from Gandalf's shower plug hair. Still cheaper per gram than HP ink. Then again, it's cheaper per gram to take water into space than to make ink, apparently.

            1. Irongut

              Oh, I really need one of those cables to connect an iPod to one of those vibrate-the-table speakers that MTV flog.

              It'll sound just... awful as always.

          2. Donald Becker

            Re: It's not what you see that counts

            Make certain that it's transparent unicorn intestine, so that you get a magnified view of the stranded wire within. Of course it has to be a prime number of strands, to avoid generating harmonics that muddy the width and height of field.

        2. Darryl

          Re: It's not what you see that counts

          And, the A/V world's equivalent - Monster HDMI Cables.

          1. Anonymous Coward
            Anonymous Coward

            Re: It's not what you see that counts

            My favourite is gold plated TOS (optical) connectors.

            1. Montreal Sean

              Re: It's not what you see that counts

              But but the gold TOS-link connectors are so shiny!

            2. Squander Two


              I think everyone I know is now quite bored with my ranting about using analogue tech on digital systems. People stream some audio from a server in the US but think the conductive quality of the two or three metres of cable in their own house matters.

              1. Rampant Spaniel

                Re: Gold.

                I don't know, it sounds more like your wife wouldn't let you buy the expensive cables ;-)

                As for UHDTV, having seen real 4k (and UHDTV shouldn't be much different from 4k) side by side with 1080p on decent monitors, I'm convinced enough to lay out the cash for a monitor. As for a UHDTV TV, I'm waiting to see how the content delivery turns out. YMMV and that's fine :-) We all only have to justify it to ourselves (and our wives). A higher frame rate and higher res would be nice, and any bump in monitor res is very much welcomed by this tog!

              2. Vic

                Re: Gold.

                > but think the conductive quality of the two or three metres of cable in their own house matters.

                For home kit, you're absolutely right.

                For professional kit, it's a little different - if you're setting up a PA, it needs to work. The couple of quid extra you pay for gold-flashed connectors is nothing compared to the hassle of having a dodgy lead in a live show. Think of it as insurance :-)


                1. Naughtyhorse

                  Re: Gold.

                  I'll call bollocks on that!

                  I've almost never known a lead to go because of shitty contact between the plug and socket - it's always a break in the wire. Save your money, buy decent, robust-looking connectors... and blow the rest on a soldering iron and a decent pair of box-jointed side cutters :-)

                  1. Vic

                    Re: Gold.

                    > it's always a break in the wire.

                    Not when you're plugging / unplugging every night, it isn't. Connectors are important.


            3. SteveK

              Re: It's not what you see that counts

              "My favourite is gold plated TOS (optical) connectors."

              I thought you were joking.

              I *hoped* you were joking.

              I now see you weren't.

              I feel unclean.

            4. Steve I

              Gold plated TOS (optical) connectors.

              Please tell me you made those up...

              1. Steve I

                Re: Gold plated TOS (optical) connectors.

                "Please tell me you made those up..."

                Dear god, you didn't...

          2. h3

            Re: It's not what you see that counts

            At least with speaker wire there is a theoretical argument that solid silver should be better than whatever the absolute cheapest 99p stuff uses.

            (I use pretty cheap, reasonably thick stranded copper speaker wire - I think it cost about a tenner in total.)

            HDMI or any thing else digital is just a blatant lie with no basis whatsoever in anything.

        3. Naughtyhorse

          Re: It's not what you see that counts MONSTER CABLES ROOL

          I always always always always use monster cables to hang up my work suit in the wardrobe overnight, and it NEVER has creases in it in the morning!

          I don't know _what_ you are talking about! They are marvellous, and so inexpensive too! Infinitely superior to the crappy old wire coat hangers I used to use. My clothes seem somehow... crisper in the mornings

        4. BongoJoe

          Re: It's not what you see that counts

          I'll stick to Van Damme cabling and Neutrik plugs. If it's good enough for Trevor Horn's studios then it's good enough for me.

        5. Anonymous Coward
          Anonymous Coward

          Re: It's not what you see that counts

          Halfords sell 30A gold fuses for your car which provide "optimum performance with maximum power flow, noise elimination & pure signal transfer."

          Frankly, that bullshit makes Monster Cables look positively mundane.

          1. Rampant Spaniel

            Re: It's not what you see that counts

            Geez, it's simple. Gold is a very dense metal, and the microgravity caused by this dense torus of gold at the connector acts to focus the photons, resulting in less scatter. It says so on the box, and we all know marketing coke hounds are all post-doc physicists moonlighting because they secretly like turtlenecks and those weird little rectangular spectacles.

      2. Anonymous Coward
        Anonymous Coward

        Re: It's not what you see that counts

        Depends what you mean by HiFi magazines.

        Use your favourite search engine to seek out the works of P.W. Belt.


      3. h3

        Re: It's not what you see that counts

        Have you ever heard one of the decent Meridian Systems ? (They sound divine).

        Stuff being made really well and costing loads annoys me less than stuff like earphones being engineered to last only 18 months (actually designed to break at around that point). At least, according to the IET magazine, that is done - certainly by HTC / Monster.

        (I don't think Technics/Panasonic do that but I might be wrong).

        1. jason 7

          Re: It's not what you see that counts

          I love my Meridian gear.


          Because it's pretty much impervious to tweaking. It just works as it should out of the box. You can just get on with listening to the music.

          I used to hate it when a person would post so proudly that they had bought a new £2000 Naim CD player only then to be shot down by being told he had to buy the optional £500 power supply upgrade "to make it sound right!" Charming! Why not make it sound right to begin with?

          Talk about treating customers with contempt. But enough of the loyalists swallowed it.

          And as mentioned earlier, Van Damme cable is all that's needed really. If it's good enough for a lot of recording studios then it's good enough at £2 a metre for my hi-fi too.

    2. Kristian Walsh Silver badge

      Re: It's not what you see that counts

      I admire the author's work in video compression, but I think he's being unfair on the benefits of higher resolutions. Higher framerate suits some types of material but not others; ditto higher resolution.

      Shooting a film with the kind of set detail, costumes and production of Anna Karenina at 48fps would add nothing to its visual appeal, but viewing it at 4k rather than 2k or SD certainly would. The scenes move slowly, and your eye has time to take in the opulent settings, which are a major part of the experience.

      On the other hand, would 4k or higher really make Fast and Furious 6 much better as an experience? A fast-moving, quickly-cut film like that would certainly benefit from higher motion fidelity than 24fps offers, though.

    3. Vic

      Re: It's not what you see that counts

      > while some people sit close enough to their screens to make high definition worthwhile, most don't

      The "worthwhile" nature of these resolutions comes about as screen sizes grow. We've got a load of 4K kit here, and it looks *beautiful*. On an very big telly.

      I still cannot for the life of me understand why people want to stream HDTV over 3G/4G to a mobile phone. But that's what people pay for...


  3. Steve Todd

    You don't have to use simple frame discarding

    To get from 48fps to 24fps, you can use the same sort of motion compensation algorithms used in image encoding to blur background objects in motion. Indeed, one of the tricks of digital encoding is to drop detail from these parts (just take a look at a footie match on HD Sky: pin-sharp blades of grass resolve into green blur when the camera moves - but then Sky over-compress their content).
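
    As an illustration only (my sketch, not the motion-compensated version described above): the crudest 48-to-24fps conversion averages each pair of frames rather than discarding one, which approximates the motion blur a 1/24s shutter would have captured. Real converters blur along motion vectors instead of per-pixel.

```python
import numpy as np

def downconvert_48_to_24(frames):
    """frames: (n, h, w) array with n even; average each consecutive pair."""
    pairs = frames.reshape(frames.shape[0] // 2, 2, *frames.shape[1:])
    return pairs.mean(axis=1)

# A one-pixel "object" moving right by one pixel per 48fps frame:
frames = np.zeros((4, 1, 6))
for n in range(4):
    frames[n, 0, n] = 1.0

out = downconvert_48_to_24(frames)
print(out.shape)   # (2, 1, 6)
# Each output frame smears the object across two pixels at 0.5 each -
# crude motion blur, rather than the jerkiness of a discarded frame.
```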

    1. Neil Barnes Silver badge

      Re: You don't have to use simple frame discarding

      >> but then Sky over-compress their content...

      *Everybody* overcompresses their content. But then, Joe Public never sees the original studio quality, so he doesn't know what the compressor has decided he doesn't need to see; he doesn't know what he's missing.

      Some compress worse than others; the two driving factors are the bandwidth currently available and the choice of compressor manufacturer - not all coders are the same.

      A canonical example is the static snooker table shot: the stat-mux decides it needs very little bandwidth - after all, nothing's moving and you can see every grain of chalk dust on the table - and then the score is superimposed and gets covered in JPEG fingerprints.

      1. Dave 126 Silver badge

        Re: You don't have to use simple frame discarding

        >But then, Joe Public never sees the original studio quality, so he doesn't know what the compressor has decided he doesn't need to see; he doesn't know what he's missing.

        C'mon, give Joe Public a break - he will often have seen the same film by broadcast, video-on-demand, DVD and Blu-ray, and probably does notice the difference.

        My favourite test? Watching a BBC nature documentary on iPlayer - every time a flock of birds takes off from some exotic estuary, the screen becomes a mosaic of squares.

        1. Neil Barnes Silver badge

          Re: You don't have to use simple frame discarding

          Fair enough, but each of those is a compressed video stream. I'd be very upset if he can't see the difference at different bit rates and with different coding rules - though on the other hand, he was quite happy for years with VHS 'quality'...

          What Mr Public as a rule has *not* seen is the video as it leaves the editing suite, as uncompressed bits at around 3Gb/s. Certainly, some of those bits can't be seen (in the same way that some of that sound can't be heard) - that's the way perceptual compression encoding works. But there's an awful lot of picture that just disappears into the bit bucket.

          Unfortunately I'm not in the video trade these days, so I can't write with much authority - but I wonder how much improvement is possible on compressed video by decreasing the image resolution and increasing the frame rate? It's certainly something I'd like to see.

      2. Ian 55

        Re: You don't have to use simple frame discarding

        Cynical me thinks it was one reason for losing analogue TV - it didn't do this sort of thing.

      3. illiad

        Re: You don't have to use simple frame discarding

        you haven't seen how badly sky compress the *normal* SD stuff, so they can spare bandwidth for their HD...

        I was watching an old (and good) episode of ST:TNG, and the face of the person in the background was so 'blurred' I could not see the features!! And I do remember that a year ago this was not so!!

        1. Vic

          Re: You don't have to use simple frame discarding

          > you haven't seen how badly sky compress the *normal* SD stuff,

          A vicious rumour from way back when said that the extra compression wasn't to "make room" for the HD content, it was to give it a reason to exist...

          That's second-hand, though. I haven't even been to Sky in a decade.


  4. Yet Another Commentard

    Is this low framerate why...

    Bodies of water/spray often look so rubbish on DVDs?

    It seems often the spray has moved and changed "too far" between frames and seems jumpy and disjointed.

    Or is that a compression artifact?

    1. Vladimir Plouzhnikov

      Re: Is this low framerate why...

      "Or is that a compression artifact?"


      1. Yet Another Commentard

        Re: Is this low framerate why...


    2. Albert Stienstra

      Re: Is this low framerate why...

      Wrong. It is artefact:

      artefact < Latin phrase *arte factum*, "[something] made with skill". The other word - I will not write it - is an auditory pickup with subsequent wrong spelling, broadcast and validated by Google using their "most votes win" algorithm.

      1. Daniel B.

        Re: Is this low framerate why...

        It is written artifact in the US. The other spelling makes my brain coredump as it looks like someone sliced an o out of the Spanish spelling. If you're really going to go into Language Pedant mode, I'd likely note that the British spelling of check as 'cheque' is also a Spanish word. Maybe the colonials were able to preserve some words better being further away from Spain?

    3. Charles 9

      Re: Is this low framerate why...

      It's both. The water spray defeats the motion estimation used to compress the video and as a result you get the quality loss that you describe.

      The big reason consumers can't get the full value of high-definition video is that each medium has to keep to a bandwidth budget: a fundamental limitation of the delivery systems. In the case of the DVD, it couldn't deliver a stream faster than 8Mbit/sec (due to the fact the original DVD drives topped out at 9Mbit/sec: 8Mbit allowed a margin for mechanical error). BluRay streams are limited to 54Mbit/sec (to stay within the 72Mbit/sec transfer limit of BD 2X). HDTV broadcast channels in the US are limited to 19Mbit/sec due to the frequency allotment set by the FCC (a 6MHz band), and so on.

      1. jason 7

        Re: Is this low framerate why...

        Another factor: if you check a lot of DVD movies from the past 5 years, many of them rarely come in at over 5GB.

        A lot will fit on a single-layer DVD. Very few use the full 8-9GB of space available.

        Many of the earlier DVDs in my collection would come in at around 7GB+.

        I don't know why. Maybe to reduce the quality so Blu-ray would appeal more?

        1. Charles 9

          Re: Is this low framerate why...

          The primary reason many movies kept to 5GB was to avoid going dual-layer. Dual-layer discs are more expensive to press, have lower tolerances for defects, and introduce a bit of a pause in the playback as the pickup laser switches focus between the two layers.

          BD drives IIRC are more sophisticated and incorporate a buffering system that allows for layer jumps to occur in the background (the buffer used to cover the transition until the drive catches up). Also, BD discs are more durable and the demand for high-quality 1080p film footage is greater. 25GB (BD single layer) only allows for about an hour of video at the BD specification limit of 54Mbit/sec. As you put extras into the disc, that limit starts dropping. To do something longer, you have to either trim the bandwidth or use the second layer.
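Charles 9's back-of-envelope figure checks out. A minimal sketch of the arithmetic (decimal units, since optical-disc capacities are quoted in GB = 10^9 bytes; the function name is just for illustration):

```python
# Rough playtime for a disc at a constant video bitrate.
def playtime_minutes(capacity_gb: float, bitrate_mbit: float) -> float:
    """Minutes of video that fit on `capacity_gb` GB at `bitrate_mbit` Mbit/s."""
    capacity_bits = capacity_gb * 1e9 * 8
    return capacity_bits / (bitrate_mbit * 1e6) / 60

# Single-layer BD (25 GB) at the 54 Mbit/s spec ceiling: roughly an hour.
print(round(playtime_minutes(25, 54)))   # ~62 minutes

# Single-layer DVD (4.7 GB) at the ~8 Mbit/s practical limit.
print(round(playtime_minutes(4.7, 8)))   # ~78 minutes
```

Real discs do better than this, of course, because the encoder only hits the peak bitrate on difficult scenes.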

  5. Richard Wharram

    Frame rate obsession

    So now it's OK to resume my frame rate obsession like when I used to play Quake and UT all night?

    1. Evil Auditor Silver badge

      Re: Frame rate obsession

      Who cares about your frame rate obsession. The real question here is why don't you still play Quake and UT all night?

      1. Anonymous Coward

        Re: Frame rate obsession

        If he's like me, it's because he has lost the code to unlock the 2nd+ levels.

        1. Anonymous Coward
          Anonymous Coward

          Re: Frame rate obsession

          Or maybe he's just found the latest Counter Strike offers more fun.

        2. Irongut

          Re: Frame rate obsession

          theodore check out Quake Live. It's free and you don't need codes to unlock anything. Unfortunately it just does deathmatch, I miss single player Quake / Quake 2.

      2. Richard Wharram

        Re: Frame rate obsession

        "Who cares about your frame rate obsession. The real question here is why don't you still play Quake and UT all night?"

        Wife. Kids. Aging reactions :(

    2. mark 63 Silver badge
      Thumb Up

      used to play Quake and UT all night?

      well if you actually resume playing Quake and UT all night,

      I'd imagine with a ten year jump forward in hardware you'll be getting some pretty respectable frame rates

      1. Anonymous Coward

        Re: used to play Quake and UT all night?

        "I'd imagine with a ten year jump forward in hardware you'll be getting some pretty respectable frame rates"

        Albeit on a modern display it will look like Minecraft.

        Talking of old, blocky things, and iD software, who round here remembers being impressed by Doom back in '93? And in fact being genuinely scared whilst playing it?

        1. Greg J Preece

          Re: used to play Quake and UT all night?

          Talking of old, blocky things, and iD software, who round here remembers being impressed by Doom back in '93? And in fact being genuinely scared whilst playing it?

          I genuinely thought it was rubbish. I never liked Doom, at all.

          Now Heretic, on the other hand...

          1. Robert Baker

            Re: used to play Quake and UT all night?

            I genuinely thought it was rubbish. I never liked Doom, at all.

            What!?!? What can I say to that...


            ...ah yes, that's the word I was thinking of.

        2. Naughtyhorse

          And in fact being genuinely scared whilst playing it?

          first time i played quake i think i lost about 2 stone, and all of it was brown :-)

          them was the days!

        3. P. Lee

          Re: used to play Quake and UT all night?

          >And in fact being genuinely scared whilst playing it?

          Doom + Aliens mod. =:o

          1. gd47

            Re: used to play Quake and UT all night?

            Does anyone ever forget their first encounter with a shambler?

            Incidentally, I still have Quake 3 on my system (although I haven't played it for many years).

            Just out of interest I've just run the time demo...789.3 fps!

            1. Anonymous Coward
              Anonymous Coward

              Re: used to play Quake and UT all night?

              "Does anyone ever forget their first encounter with a shambler?"

              Nope. But try and convince the kids of today that it was scary, and they'll not believe you. Cue "when I were a lad, we lived in a shoebox".

        4. Adam 1

          Re: used to play Quake and UT all night?


          1. M Gale

            Re: used to play Quake and UT all night?


            And while we're at it, up down left right a+start.

            Oh hang on, wrong game.

        5. Daniel B.

          Re: used to play Quake and UT all night? @Ledswinger

          Talking of old, blocky things, and iD software, who round here remembers being impressed by Doom back in '93? And in fact being genuinely scared whilst playing it?

          Phobos Lab. Dark. Spooky soundtrack. GAAAH WHAT IS THAT PINK THING! *fire rocket launcher at point-blank range* *die due to splash damage*

          Yes, I'd say that game was the first one that genuinely scared me back in the day! I also managed to pass both Doom1 and Doom2 w/o cheat codes, found them on the 'net 'till a year after I had finished both of 'em.

  6. knarf

    American television runs at 60 fields per second, whereas in the old world we get along with 50.

    The reason for this is that AC power in the UK is 50Hz while in the US it's 60Hz, and that's why the frame rates differ. It was the cost saving of NOT producing a cycle generator in each TV, nothing to do with picture quality: the mains supply gave a free 50/60Hz reference.

    1. Paul Shirley

      Really? So which 3rd of the population get to use the correct phase that matches the broadcaster?

      The usual justification is simpler: poor power regulation in early sets meant you didn't want a 10Hz beat frequency causing a very visible ripple on screen. Those of us old enough will remember watching waves of distortion moving slowly up or down the screen as the transmission and power drifted in and out of sync on crappy TVs ;)
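For the curious, the crawling hum bar follows the beat between the display's field rate and the local mains frequency - a minimal illustration (the numbers are illustrative, not from any standard):

```python
# The visible hum bar drifts at the beat frequency between field rate and mains.
def beat_hz(field_rate: float, mains_hz: float) -> float:
    """Beat frequency (Hz) between the scan's field rate and the mains supply."""
    return abs(field_rate - mains_hz)

print(beat_hz(50.0, 50.0))   # 0 Hz: bar is stationary on a mains-locked set
print(beat_hz(60.0, 50.0))   # 10 Hz: the very visible ripple described above
print(beat_hz(50.0, 49.95))  # 0.05 Hz: bar takes ~20 s to crawl one full cycle
```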

      1. NXM Silver badge

        This is true, and a result of a cheapie power supply in the TV that dipped near the zero crossing point. It caused an irritating dark band that moved very slowly up or down due to a slight frequency mismatch. What made it worse was that high value capacitors were pretty pricey, the circuitry had a low ripple rejection ratio, and the capacitors tended to dry out over time and lose their capacitance.

        The author's assertion that interlacing caused judder isn't right though. On old analogue transmissions, it's true that interlacing halves the line count (and so halves the bandwidth required while keeping the same frame rate), but the undisplayed lines weren't thrown away. Since it was analogue, the 'flying dot' in the camera was immediately displayed on the receiver, so the motion captured could change not only between the interlaced fields, but even between the top and bottom of the picture as the dot made its way down. That's why analogue transmissions were much smoother.

        These days it's really obvious that digital TV cameras capture the entire scene in an instant, for each frame, so what you get is clearly a series of stills rather than a fluid motion. But what would you rather have? In the old days we got uncertain colours with really low colour bandwidth, bleed, sports-jacket syndrome, FM noise causing snow, and farcically complex video recorders.

      2. Jim 59


        Good point but regarding phase, beats will only happen if the phase difference changes. I think.

      3. Naughtyhorse


        The oldest tv I worked on had a series resonant LC network to generate EHT. (gave about 24kV plus or minus another 50kV i reckon!)

        Once I worked out what was happening in there I tripped the bench supply, put it all back together and told the customer it was a write-off :-S

        Back in the days i thought 50kV was a lot!

      4. Colin Ritchie

        Ah, waves of distortion lapping round your eyeballs and the valves keeping the room nice and warm on a cold winter's night. Pass the cocoa love, the Old Grey Whistle Test will be on in a minute.

      5. Danny 14

        "Those of us old enough will remember watching waves of distortion moving slowing up or down screen as the transmission and power drifted in and out of sync on crappy TVs ;)"

        Ah yes with a manual sync knob on my old PYE set ;-)

    2. David Black

      and flicker too

      Saying that the 25fps etc. was nothing to do with vision isn't quite true: the frequency was set based on the perception of flicker in an incandescent bulb, 25Hz being about the point where most people find the flicker imperceptible (sure, there are some who argue they can see flicker up to 100Hz, but hey, there are audiophiles that get upset by the sound of distant bats apparently!). The 25Hz obviously went to 50Hz when we jumped to AC, and bingo.

      A great article, very interesting. I know that there are other coding techniques using models etc. that actually "interpret" what is being viewed and then render that. These have the potential to make the whole framerate/resolution discussion a little redundant. If you're in the mood for a follow-up piece, I'd love to know more.

      1. Tom 7

        Re: and flicker too

        Not really relevant to watching 'HD' and stuff, but my peripheral vision can pick up flicker up to 70Hz or so. When I'm reading a book or typing shit into my laptop it can be irritating, and an excuse to shout at the kids.

        If you have human shaped objects moving across the screen your peripheral vision will pick out the individual images rather than seeing them as moving - so even at 50hz someone appearing at the left, centre and right of the screen will look more of a blur face on but three noticeably human images in your peripheral vision.

        I will continue to use my peripheral vision a lot until most script writers are denied the oxygen of publicity.

      2. Vic

        Re: and flicker too

        > other coding techniques using models etc. that actually "inerpret" what is being viewed

        H.264 has various modes that do that. I don't believe any broadcasters are currently using those modes, though.

        > These have the potential to make the whole framerate/resolution discussion a little redundant

        No they don't. They *might* cause the mean bitrate to be reduced, but you still have the huge requirement at synthetic edges, even if it's no longer a true I-frame. That either means you need headroom (which costs money), or you get fields of artefacts at scene changes...


    3. Mage Silver badge


      Very early on - in the late 1940s or the 1950s - the lock to mains was abandoned. But studio lighting was one of the original issues. Some early TVs then had a moving hum bar.

      The point wasn't to use the 50Hz or 60Hz directly but to make the hum bar stationary.

      The explanation about interlace and dynamic movement is pretty good, but there was investigation and comparison of progressive vs interlaced in the 1930s. Interlaced 25fps 50Hz is better than 25fps progressive especially if there is no simple large item moving quickly the eye can track. If the eye can't track, then our dynamic resolution is poor.

      So interlace did give some improvement. But there is no doubt that 1366 x 768 (already a native panel resolution on HD Ready TVs) with 75 fps progressive would be better than UHD at lower rates and probably better than 1920 x 1080 progressive. There is little point to higher resolution as onboard upscaling can allow an image the size of the wall.

      Also, if the source is twice the resolution and frame rate (150fps) and is downsampled/antialiased to 1366 x 768 at 75fps progressive, the quality is better.
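As a sanity check on the raw numbers (a sketch of data rates only, not a claim about perceived quality - note that 1366 x 768 at 75fps actually carries fewer raw pixels per second than 1080/50p, so the argument above rests on temporal resolution, not throughput):

```python
# Raw, pre-compression pixel throughput for the formats being compared.
def pixels_per_second(w: int, h: int, fps: int) -> int:
    """Uncompressed pixel rate for a given resolution and frame rate."""
    return w * h * fps

formats = {
    "1366x768 @ 75p":  pixels_per_second(1366, 768, 75),
    "1920x1080 @ 50p": pixels_per_second(1920, 1080, 50),
    "3840x2160 @ 25p": pixels_per_second(3840, 2160, 25),  # 'UHD at lower rates'
}
for name, pps in formats.items():
    print(f"{name}: {pps / 1e6:.1f} Mpixel/s")
```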

      As for so called 3D (stereoscopic)? Pointless.

      1. Kristian Walsh Silver badge

        Re: Mains

        Beating from filament bulbs isn't a factor, as there's very little luminance variation across the AC supply cycle: the filament of a domestic tungsten bulb takes a half second to cool to darkness; studio lamps ran even hotter.

        However, noisy loads on the receiver's mains supply could add spikes that interfered with the incoming signal. Spike resistance is the reason why B&W television uses negative modulation (zero-level signal = peak brightness): the spikes would appear as dark spots within a light background, and halation and/or poor beam focus on the display tube would cover them up.

        But interference from heavy motors in the home (a/c units particularly in many parts of the US) would produce a more regular interference pattern, at a 60Hz frequency. If the scan frequency and AC frequency were far apart, this darkening of the image could flicker over the field, which is far more distracting than a darker band that slowly crawls up or down the image.

        On refresh rate, there's one problem with high-refresh for cinema... it's not cinematic. Viewers report the video-like refresh rates of 48fps as being "cheap", because normally, only video-originated (and so, cheaper) productions display this trait. Conversely, TV dramas are shot on digital video, but at 24/25/30 fps progressive to achieve a more "filmic" look. Although in one instance, when a music awards ceremony was broadcast live at 720p/30fps, viewers complained that it wasn't live, because mentally they associated the low frame rate with drama.

        Sure, 24fps isn't very realistic... but neither was monochrome vision, or for that matter giant gorillas hanging out of the Empire State Building, long-lost lovers meeting by chance in Morocco, or farm hands rescuing princesses en route to destroying the greatest threat to freedom ever constructed. If you notice the frame rate, someone has failed in their job.

  7. Richard Ball

    film vs tv

    There is a big difference between watching stuff that was made in a TV studio on a TV, and watching a movie, shot at 24fps on a TV. In the case of TV-studio material, it is footage that was truly created at 50 pictures per second, but with each picture having a lower vertical resolution than the display device. I'm not saying that when you stick them on the screen everything is magic and perfect, but it means that the motion experience for the viewer is very different, and it is much, much less bad than when you're watching a film-movie on TV.

    The TV interlacing system was an engineering compromise - and I think a useful one at the time - not a con. TV wasn't originally created to show cinema movies.

    1. Haku

      Re: film vs tv

      The Hobbit.

      When it was announced that the film was to be shot and shown at 48fps there was practically a mass outcry from people (the vast majority of whom had never seen anything above 24fps on a cinema-sized screen) saying it was going to have a "soap opera effect" due to the doubling of the framerate.

      I have to admit I was a little worried that the higher framerate would have a detrimental effect to the film but I was actually pleasantly surprised and thoroughly enjoyed the film at the higher framerate on the large screen.

      Think about it, when you're watching a 24fps film on a humungus screen then your brain has to fill in more 'gaps' when there's motion than it does when you're watching a 24fps film on a small (sub 30") screen, so doubling the framerate on a humungus screen makes you feel more immersed because your brain doesn't have to fill in so much when there's motion.

      1. Paul Shirley

        Re: "your brain has to fill in more 'gaps'

        I think it's simpler. In real life you don't see noticeable motion blur in most circumstances, sight effectively goes blank while your eyes are in motion - we evolved to discard that motion blur and see sharply. Using motion blur to hide frame rate problems is artificial and film looks artificial because of it. Fake but artful.

        Higher rates coupled with reduced shutter times are closer to real vision but lose the artistic effect we've all been trained to accept. When film makers deliberately do it on 24fps material it's jarring, but you're probably right about it being easier to mentally process at suitable fps. Certainly feels that way in games: turning off motion blur at 60fps is usually an improvement... and sometimes the difference between achieving 30fps or 60fps!

      2. phuzz Silver badge

        Re: film vs tv

        I did see a comment on a US website from someone complaining that the 48Hz version of the Hobbit "made it look like the BBC", I assume it was meant as an insult.

      3. Jolyon Smith

        Re: film vs tv

        If you need to "think about it" in order to decide that it works or it doesn't then it quite simply hasn't worked.

        Either you watch 48 fps and you immediately appreciate the difference and feel it is an improvement, or it feels false and artificial and far from being "immersive" pulls you out of the illusion and is a constant reminder that you are watching the results of a pixel wrangling server farm in a data centre in Wellington/Hollywood.

        Or you don't even notice the difference.

        All I know is that far more people that I know who saw the 48fps Hobbit fell into the "No difference" or "Hated it" group than in the "Wow it was great" category.

        For all that your "thinking about it" might have led you to conclude, or whatever theories of human vision analysis might suggest we should prefer, simple reality presents a stark contradiction.

        To which the common response is "We only prefer 24fps because it's what we're used to".

        No. That is an argument for explaining people's ability to spot the difference and even describe it ("it looked like a video/computer game, not a movie"). It does not explain the fact that people simply do not like it.

      4. Naughtyhorse

        Re: film vs tv

        it's your eye that fills in the gaps not your brain

  8. richard?

    Easy for hardware

    Also increasing the frame rate vs resolution would be easy to implement in TV hardware - sets already show far more than 25/30 fps, and I happily send 60 fps over HDMI from my PC.

    The biggest cost for these new formats is the increased resolution; if they forget that I'd love to get an improved experience on my perfectly adequate current HD TV.

    You'd never get the telly makers to go for it, but imagine if one of the big broadcasters picked it up and sold it as an upgrade requiring no new hardware or just a new STB not a new TV.

    1. monkeyfish

      Re: Easy for hardware

      Yep, TV manufacturers would much rather the new standard was higher resolution. That way they can sell you a whole new TV again. Higher frame-rates with no new hardware? Don't be ridiculous.

  9. thefutureboy

    What a great article

    Easy enough for a n00b like me to understand, cheers!

  10. Anonymous Coward
    Anonymous Coward

    Can't wait for UHDTV

    Why? Bugger all to do with the TVs or programs, but the hope that finally we will get more than the rubbish-for-2013-era 1080 lines on every (OK, almost every, and certainly all affordable) monitor & laptop.

    1. Darryl

      Re: Can't wait for UHDTV

      Yes, but the problem will still remain that you'll pay a hell of a lot more $/£/€ per line of resolution on that TV than you'd pay on a monitor.

  11. Cliff

    Good article, but does the punter care?

    The bulk of the public will not understand nuances, will accept what they're marketed, and watch crappy soap operas. Offer them two TVs, one with 2300 doodahs and the other with 2200 doodahs, and the 2300 model is clearly the better one for the money. Every other measure is ignored in favour of the doodahs, as well we know. Fact is, the average user isn't trained to see the difference between a good and a bad picture. They don't shudder when they see gradient banding; they don't race to throw up when they see mosquitoes on subtitles. As long as they can follow the story, most don't care that much either.

    Worse still is the guy who plays 300kbps MP3s on an iPod or similar, the one who demands to rip his DVD at 1080p to play on his tablet. People go for the numbers, and don't even know how to appraise the quality!

    1. Joe Gurman

      Re: Good article, but does the punter care?

      Re: "1080p on his tablet" – works just fine on me iPad. In fact, the higher res looks better there than on any low-dynamic range HDTV display I've yet seen.

      1. JEDIDIAH

        Re: Good article, but does the punter care?

        > works just fine on me iPad

        Apple products are pants when it comes to what subset of the h264 standard they support. If you are interested in any other video encoding or different container formats then you might as well forget about it.

        Then there's all of the other stuff from the original media besides the video.

        1. Danny 14

          Re: Good article, but does the punter care?

          ripping DVDs "properly" to play on a larger screen size is sometimes a good idea if the object you are using to play them has a shit upscale routine.

        2. Tom 38

          Re: Good article, but does the punter care?

          Apple products are pants when it comes to what subset of the h264 standard they support.

          Nonsense, they are in fact extremely clear about what H264 profiles each device supports. I've never seen one claim support for a feature that it doesn't support. What it doesn't do...

          If you are interested in any other video encoding or different container formats then you might as well forget about it.

          …is support arbitrary containers that they don't care about. You won't find good (any) mkv support on an ipad, but re-mux to mp4 and it will work perfectly.

          If you care about what a particular device supports, the tech spec page lists it, eg the ipad mini supports "H.264 video up to 1080p, 30 frames per second, High Profile level 4.1 with AAC-LC audio up to 160 Kbps, 48kHz, stereo audio in .m4v, .mp4 and .mov file formats;"

    2. Marcelo Rodrigues

      Re: Good article, but does the punter care?

      "People go for the numbers, and don't even know how to appraise the quality!"

      Up to a point, yes. No doubt people will buy the TV with the most "X" (whatever it is). But they can see the difference all right: I have seen the reactions.

      I have a 42" Panasonic plasma. It isn't the best of all (of course), but it is fairly good, and better than most, here in Brazil. One thing I always hear is "your TV has a much better image than mine!" - from people who know nothing about it, just watching something random.

      So, yes. People will buy numbers and marketing - but (sadly) they can see the difference when it's rubbed in their noses. By then, though, it will be too late: they will already have bought into the drivel.

    3. Vic

      Re: Good article, but does the punter care?

      > Fact is the average user isn't trained to see the difference between a good and bad picture.

      When Sky Digital first launched, I walked into a TV shifter shop. Dixons, Currys, one of them.

      The salesman was making a huge fuss about how clear the picture was, how much better it was than I had ever seen. I had to tell him to stop lying to me.

      He was initially affronted, but calmed down somewhat when I explained to him that I knew what digital TV looked like, and it certainly wasn't supposed to have a very clear VHS hum bar in the middle of the screen...


  12. Spiracle


    As I recall, 24fps was settled on by Warner Bros/Vitaphone as an economic compromise between running the film strip fast enough to get acceptable optical sound response but not so fast as to increase their print and distribution costs too dramatically (higher fps means more celluloid transported round the country and thicker stock to minimise breakages).

    They set it at literally the slowest fps that they could get away with and, like the apocryphal Roman chariot that set rail gauges, it stuck.

    1. Not That Andrew

      Re: 24fps

      Which rail gauge? Standard, Metre, Cape/Colonial/Japanese, Russian, Indian, Iberian, Three Foot or Bosnian?

    2. Kristian Walsh Silver badge

      Re: 24fps

      I remember a documentary about the development of IMAX that included the history of this. The arguments against faster film were primarily that running fast increases risk of damage - the studios at the time were always looking for something to increase the realism of their productions, and a faster frame-rate would have been the next best thing to full colour.

      It also noted that the film companies were doing viewer experiments at the time, and these determined that the minimum frame rate was around 22 fps for smooth motion perception, and that beyond 70 fps there was no noticeable improvement in motion perception.

      However, 24fps, or more accurately 18 inches per second, was as fast as 1930s projection equipment could run reliably without the risk of mangling or damaging its film. The film strip moves only during a very, very short time (the shutter blackout period), and needs to move completely into place with no slippage or mis-registration. The time that the frame was displayed for gave the mechanism a chance to settle and self-calibrate before the next frame had to be pulled into place. Increasing the framerate meant increasing the chance of the mechanism jamming or losing synchronisation entirely.
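The 18 inches per second figure falls straight out of the 35mm format: four perforations per frame at a nominal 3/16" perforation pitch (the exact standard pitch is about 0.187", which is why 18 in/s is a slight round-up):

```python
# Arithmetic behind "24 fps = 18 inches per second" for 35mm film.
PERFS_PER_FRAME = 4        # standard 35mm pulldown is four perforations
INCHES_PER_PERF = 0.1875   # nominal 3/16"; the exact standard pitch is ~0.187"
FPS = 24

inches_per_second = FPS * PERFS_PER_FRAME * INCHES_PER_PERF
print(inches_per_second)   # 18.0
```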

      It's interesting that IMAX projectors don't use traditional sprocket driven projection. Instead they create a "rolling loop" in the film that flicks the next frame into place because even by the early 1970s, a sprocket driven system couldn't shift the larger film frame into place fast enough during the shutter blackout period.

  13. Whitter

    It's not just pixel size and refresh rate

    Don't forget colour resolution while we're at it.

    1. Vic

      Re: It's not just pixel size and refresh rate

      > Don't forget colour resolution while we're at it.

      Actually, chroma resolution really isn't that important. Even contribution encoding - where you are encoding video from, say, an OB to feed into the broadcaster's network - typically subsamples to 4:2:2. Broadcast video is generally lower - often 4:2:0. And all that happens before the compressor, which drops resolution again.

      We're actually not very sensitive to chroma changes; luma is where it matters.
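To put rough numbers on that, here's what the common Y'CbCr sampling structures Vic mentions cost per 1920x1080 frame at 8 bits per sample (a sketch of the sample bookkeeping, not any encoder's actual memory layout):

```python
# Bytes per frame for common Y'CbCr chroma subsampling schemes.
def frame_bytes(w: int, h: int, scheme: str) -> int:
    """Uncompressed frame size in bytes at 8 bits per sample."""
    luma = w * h                          # Y' is always full resolution
    if scheme == "4:4:4":
        chroma = 2 * w * h                # Cb and Cr at full resolution
    elif scheme == "4:2:2":
        chroma = 2 * (w // 2) * h         # half horizontal chroma resolution
    elif scheme == "4:2:0":
        chroma = 2 * (w // 2) * (h // 2)  # half in both directions
    else:
        raise ValueError(scheme)
    return luma + chroma

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"{s}: {frame_bytes(1920, 1080, s) / 2**20:.1f} MiB/frame")
```

Going from 4:4:4 to 4:2:0 halves the frame size before the compressor even starts, which is why broadcasters take the chroma saving first.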


  14. James Chaldecott


    Whenever I see an article mentioning frame-rate I remember a documentary I once saw about the history of IMAX. That mentioned that one of the people involved had previously experimented with higher frame rates, but the idea had failed due to the cost of supplying new projectors to cinemas.

    Up to now I'd never managed to find out any more details, but I just managed to find it.

    The man in question was Douglas Trumbull, and the process (and company) were called Showscan. It looks like he's come back to the idea for use in the digital domain.

    The page that lead me to him was this very interesting High Frame-Rate Cinema article:

    I notice that James Cameron agrees with Mr Watkinson: "Cameron went so far as to say that a move to 4K imagery was meaningless as long as the frame rate stayed 24."

    1. James Hughes 1

      Re: Showscan

      I saw Doug Trumbull at Siggraph 94 where he talked about the IMAX stuff, and also the Back to the Future ride at Universal Studios. It was an interesting presentation.

      1. Monty Burns

        Re: Showscan

        "Back to the Future ride at Universal studios"

        Which is now, and has been for a few years, The Simpsons. I think I preferred the Back to the Future version.

    2. Florida1920

      Re: Showscan

      Douglas Trumbull directed special effects for Kubrick's "2001: A Space Odyssey," and directed "Silent Running," among other accomplishments.

      Thanks to Mr. John Watkinson and El Reg for a very interesting (to me) article!

  15. Craig Foster

    Pirate content solves some of these problems

    You'd be surprised how good a 720i broadcast re-encoded to 720p and played on a media player capable of 720P@60Hz with motion-interpolation can improve the picture.

    Still can't compensate for the scenes with confetti or a flock of seagulls. Actually, nothing can improve the look of A Flock of Seagulls...

  16. Anonymous Coward
    Anonymous Coward

    Havn't pc games known this since the beginning of time? Hence why the bottom range of pc monitors is 60fps and the top is 120fps, and games are always trying to get machines that maintain 60fps at a minimum....

    1. conan

      I think the 120Hz monitors are there for 3D glasses, where you can blank out each eye alternately and still maintain 60Hz for each eye.

      But you're right - I have no problem with growing pixel counts. When I'm gaming I have vsync on in order to limit my PC to the framerate of the monitor (60Hz), there is no interlacing, every frame drawn is different from the previous one. If I want a higher framerate I just go and buy a 120Hz monitor and my PC will keep me happy. Plus a higher pixel count means I can sit closer to my monitor and have a wider field of view (oculus rift?), which again the game will allow me to adjust.
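      The refresh rates mentioned above translate directly into render budgets; a trivial sketch (the helper name is illustrative, not from any real engine):

      ```python
      def frame_budget_ms(refresh_hz):
          """Milliseconds the renderer has per frame with vsync locked to the display."""
          return 1000.0 / refresh_hz

      # 60 Hz leaves ~16.7 ms per frame; 120 Hz halves that to ~8.3 ms,
      # which is also what lets active-shutter 3D give each eye a 60 Hz stream.
      budget_60 = frame_budget_ms(60)
      budget_120 = frame_budget_ms(120)
      ```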

      Film always comes up as the driver for this kind of tech despite the games industry being much bigger. It's no surprise that the smaller industry now has the inferior tech.

      1. Paul Shirley

        A while back Gamasutra (I think) had a long discussion about the use & abuse of motion blur in gaming. It was surprising how few understood why it makes sense on a 30fps console (the same reasons film does it) but not on a 60/75/100fps PC game.

        Till now it's not been a real choice for developers though, with current gen consoles frequently only able to support 30fps without heroic efforts. Hopefully the next gen will change things.

        ...and I miss my 160Hz CRT ;(

        1. Anonymous Coward
          Anonymous Coward

          I think my brain rebelled against the letter r while writing that.

    2. Jolyon Smith

      PC games have a different problem to address...

      You need high frame rates on PC/video games because in those cases there is no "motion blur" to provide the hints to the visual system about rates of movement for objects in the frame.

      That's why HFR is important for video games.

      The only reason it is important for movies is that it makes post-processing the CGI cheaper - you don't need time-expensive algorithms and processing to "create the motion blur" necessary to make fast moving CG graphics integrate seamlessly with the optically captured images.

      This is also why, when you then skimp on those motion artefacts in your post processing, the result is something that looks like a video game. If your movie consists primarily of CGI elements, it is rather predictable that it will look like other CG media.

      But don't make the mistake of thinking that HFR is intended to improve the experience for the audience. It is merely a way to cut costs for the movie maker.

      1. Danny 14

        Re: PC games have a different problem to address...

        I don't miss my 21" 120Hz CRT. Hernia-inducing if you needed to move it to LAN parties.

  17. Haku


    On the subject of video and framerates, I've been ripping my Red Dwarf DVDs to MKV, but as the first seasons were interlaced it means having to deinterlace them or else they won't look very good on a PC.

    Simple deinterlacers will take the interlaced 25fps footage and turn it into progressive 25fps by blending the two fields together (worst way) or discarding one field and scaling the remaining field (better but loses resolution), both of which produce 'ok' looking footage but aren't a patch on the original interlaced footage played on a CRT.
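    The two naive approaches described above are easy to sketch; a minimal illustration in Python/NumPy, with hypothetical function names and toy field arrays (real deinterlacers are far more sophisticated):

    ```python
    import numpy as np

    def weave(top_field, bottom_field):
        # Interleave the two fields back into one frame: fine for static
        # scenes, but moving edges show "combing" because the two fields
        # were captured 1/50 s apart.
        h, w = top_field.shape
        frame = np.empty((2 * h, w), dtype=top_field.dtype)
        frame[0::2] = top_field
        frame[1::2] = bottom_field
        return frame

    def blend(top_field, bottom_field):
        # Average the two fields (the "worst way"): motion becomes ghosting.
        return (top_field + bottom_field) / 2.0

    def bob(field):
        # Discard one field and line-double the other: better, but half the
        # vertical resolution is gone.
        return np.repeat(field, 2, axis=0)
    ```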

    However, if you know where to look and how to do some simple AviSynth scripting, you can get some seriously good quality deinterlacing happening. Check out this 14MB, 1.5-minute Red Dwarf clip, 720x540 running at 50fps:

    1. Dr Wadd

      Re: Deinterlacing

      Why not leave the ripped video as interlaced and allow the playback device to deal with any deinterlacing issues? I've been ripping all my DVDs with Handbrake, for interlaced material I tell Handbrake to set the interlaced flag in the output file, RaspBMC picks that up and handles the deinterlacing for me.

      1. Haku

        Re: Deinterlacing

        I've tried manually adding the :interlaced flag into the advanced tab in Handbrake to encode the video straight from the ripped VOB files but the resulting 25fps interlaced mkv has a lower image quality and isn't as smooth as the deinterlaced 50fps progressive version.

      2. Out2Lunch

        Re: Deinterlacing

        Handbrake worked for me, but I have been using AppGeeker DVD Ripper for the past 8 months; I found that Handbrake only processes common DVD sources that do not contain any kind of copy protection.

        Great piece of software that keeps me happy ever after.

  18. Pointer2null

    The original interlaced design was also chosen because we simply didn't have the bandwidth to transmit all 625 lines (UK PAL) in each frame. Transmitting 312½ lines per field and interlacing effectively cut the bandwidth required in half. (Your good old VHS machine cut this in half again!)
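    A back-of-the-envelope sketch of that halving, using the standard UK 625-line figures (variable names are illustrative):

    ```python
    # Line rates for UK 625-line PAL. The figures are the real System I ones;
    # the variable names are just for illustration.
    lines_per_frame = 625
    pictures_per_second = 50        # locked to the 50 Hz mains frequency

    # Progressive at 50 full frames/s would need every line on every pass:
    progressive_rate = lines_per_frame * pictures_per_second   # 31,250 lines/s

    # Interlace sends one 312.5-line field per pass instead, halving the rate
    # to the familiar 15,625 lines/s (the 15.625 kHz line frequency):
    interlaced_rate = progressive_rate // 2
    ```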

    1. Vladimir Plouzhnikov

      AFAIK the original reason for interlacing was that the phosphors in TV screens of the time had very short persistence, and by the time the beam was painting the lower part of the screen the top part was already beginning to fade. Interlacing cut the time for painting the screen in half and also spread the fading more evenly by combing the lines.

      That it also proved, in fact, to be a form of analogue compression helped, of course.

  19. Anonymous Coward
    Anonymous Coward


    The interlace on old TVs was partly a bandwidth constraint. If the lines were shown sequentially then the persistence of the screen phosphor had to be longer - with a longer flyback gap between frames for the decay.

  20. Ed 13

    Frame rates

    The flicker at the silent-film rate of 18fps is just as likely to trigger photosensitive epilepsy as at 24fps. The general sensitive range is 16 to 25Hz.

    I'm still slightly unsure about the enthusiasm for higher def projection (tv or cinema). Cinemas are getting smaller and my TV is about the same size as it's always been. I put up with VHS for over twenty years!

    It's not the picture itself that counts but what's in the picture.

  21. pear


    I think you fall into an uncanny valley; quite frankly it looks weird, even if it does look sharper etc.

  22. Cybergit

    Thanks very much, that was fascinating!

    What a great article - from the opening remarks about 24Fps being a result of muffled sound not a desire to improve picture quality, I was hooked.

    Glad to hear that some solid science rather than marketing hype is being applied to this problem.

    Thanks for improving my coffee break!

  23. Anonymous Coward
    Anonymous Coward


    Great Article, nice read.

  24. AJames

    Early hardware issues

    Good article, and it's time some more thought was put into future video standards.

    Historically there were some hardware issues behind those early decisions on interlacing and the power-line-frequency frame rate in addition to the ones mentioned. The key point about interlacing is that it reduces the bandwidth required by half, which was a significant factor in designing the early TV electronics and creating the broadcast channels. It was still a factor in the more recent switch to digital channels.

    As for the decision to use the powerline frequency, it's true that one reason was to prevent visible strobing, but the other was that it was just easy and cheap to use the powerline frequency as a reference because it's so reliable and tightly controlled. Accurate crystal oscillators weren't always so cheap as they are today.

    1. Anonymous Coward
      Anonymous Coward

      Re: Early hardware issues

      "[...] but the other was that it was just easy and cheap to use the powerline frequency as a reference because it's so reliable and tightly controlled."

      This raises several questions. In the UK there were still some DC mains supplies until 1946. The mains AC frequency is accurate over the long term, for clocks etc., but in the short term it varies depending on power grid load. There will also be phase differences between different legs of three-phase supplies.

      In the early 1940s quartz crystal controlled clocks were large devices. A learned tome on electrical time-keeping said they would only ever be laboratory devices.

  25. Anonymous Coward
    Anonymous Coward

    Compression... Video.... 'Audio' next....?

    More excellent stuff from John. More Mr Reg please! Following on from what the eye sees or doesn't see... I'd like to know what the ear hears or doesn't... Perhaps John could address that in a future digital versus analogue article with respect to music and film surround? i.e. What Bitrates / Sample rates sound best versus what is sold to us? + Is there truly a difference listening to classics on vinyl versus CD?

  26. Beachrider

    What about screen size...

    I have a large screen TV (60") with 720p, 1080i and 1080p programming available. I sit 12 feet from the screen. The audio comes from pretty-good speakers, too.

    This is a very different experience than when I see similar programming on an iPad. I wear glasses for both (reading glasses for the latter). I know that some people watch a lot of programming on iTouch or various smartphones. It just isn't the same, at all.

    The resolution is needed to produce crisp images. Sound isn't the major discussion here, but it is very important for relaxed viewing.

    It is just so waaaay better than the 1980s with analog TV and VHS!

  27. This post has been deleted by its author

  28. Charlie van Becelaere


    I say balderdash!

    I remember reading a Superman comic back in the 1960s that explained that the human retina would retain an image for about 1/24 of a second. (Superman was scanning a film by passing it in front of his eyes at the appropriate rate.)

    If you want to call Superman a liar, you're either more foolish or more courageous than I!

    I'll get my coat - it's the one without a pocket full of Kryptonite.

  29. Stevie


    "Fact is, the rates used in television were chosen to be the same as the frequency of the local electricity supply because of fears that different rates might interfere or beat with electric light."

    Well, that and the more likely fact that making that choice makes the engineering of the timebase/flyback circuitry easier, cheaper and more compact. If you want to increase the frequency of the scan you'll need an oscillator capable of producing the desired frequency and of remaining stable over a range of temperatures. We can do that today, but back in the dawn of TV when these decisions were being made?

    I also suspect the choice of interlacing was more to do with the overall perceived distortion of picture than any nefarious agenda. Consider: If you paint a picture line by line of something that is moving, by the same reasoning used in the article the bottom of the picture is out of registration with the top of whatever is being imaged. A figure moving across the screen from right to left will be perceptibly leaning backwards (and be blurred and distorted). By interlacing you distribute the effect over the height of the image, getting more distortion but less perceived "lean" as it were.

    I take issue with the statement that 48fps movies are generally approved of by audiences. The last movie released in both 48fps and 24fps versions that I took any note of was "The Hobbit", and people complained that the higher frame rate "looked wrong" or "made everything look cartoonish" in comparison to the slower frame rate version.

    I've no doubt the industry is a behemoth that has a very high inertia, and I've no doubt there's a move afoot to make everyone think they need to throw out their home theater system for the Next Best Thing (though there's always a saturation point at which people will ask "why the f*ck am I buying all this stuff over again?"). I just don't see that there's any great need for a panic over poor picture quality when one considers what is being shown on the screens today.

  30. Michael Habel

    60FPS Vs. 50FPS

    Gee, could this be down to the fact that the US runs its mains at 120V @ 60Hz.

    While most of Europe runs on 240V @ 50Hz?

    Nagh I didn't think so either...

    1. safa208

      Re: 60FPS Vs. 50FPS

      "60FPS Vs. 50FPS

      Gee could this be down to the fact that the US runs it Mains at 120v @ 60Hz.

      While most of Europe runs in 240v @ 50Hz?

      Nagh I didn't think so either..."

      Yep, for TV that is so. On a CRT you have magnetic deflection of the beam, so any external field that enters the tube will affect the picture. In the TV set itself you may have inductors and transformers that will emit a leakage field at the same frequency as the mains (and harmonics thereof). You also have other external devices with transformers that can influence the picture. This can of course be fixed by proper screening of the CRT, but that is expensive.

      So if the frame rate is the same as that of the disturbing field, you will have a static image distortion, which is much less annoying than a moving one.

  31. Bob H

    I've been trying to bring up the subject of 300Hz in the industry for years (I work in CE), but few people seem to want to standardise beyond what the marketing department want to sell. UHDTV is a total research c*ck-waving exercise that has gotten out of hand because the IDTV manufacturers have realised that 3D was a washout and they need to fill the gap in their order books. Most of the 4k TVs that will be sold in the next two years will be useless because they have insufficient frame rate and/or poor frame-rate conversion.

    I saw one of those fantastically cheap UHDTVs that are being sold in the US just recently; the motion reproduction looked terrible compared to more expensive models, which are in turn just needlessly expensive.

    One thing more I must say: more John Watkinson! I still have his books on my shelf and I consider him the A/V equivalent of Andrew S. Tanenbaum.

  32. CaveatVenditor

    Shurely shome mishtake . .

    . . . or have I mis-remembered schoolboy fiziks lessons?

    In optical diagrams (and in the real world) light travels from the source to the eye, not the other way round.

    Check out the diagrams in Fig.1 on the first page. The arrows are pointing the wrong way.

  33. Tinker Tailor Soldier

    Analogue media?

    This article is a little revisionist in that it fails to detail WHY some of these standards were chosen.

    1. Film shutter speed was chosen because early emulsions react more slowly and you need to have the shutter open for longer periods of time to capture an image. Also, you physically need more film to lug around for the same movie.

    2. Interlacing was chosen to minimize flickering on televisions. Phosphors decay, so if you just scan them vertically you get a pretty annoying flashing/strobing experience. By writing to alternate lines, you help mitigate this.

    When there is enough existing footage that is interlaced or at a lower frame rate, the standards pretty much have to have a way to represent it (for a while), since this reduces the cost of transcoding. Then marketers take over, and you get silliness like recording interlaced on modern devices.

    On the subject of interlace, good video decoders don't do just bob and/or comb de-interlacing; they do a certain amount of motion tracking between scan lines to prevent irritating vertical edge effects.

    I just hate it when people neglect to mention the reasonably good engineering decisions that get in the way of their pet projects.

  34. Anonymous Coward
    Anonymous Coward

    "Silent movies ran at 18fps"... Wrong. They ran at 16fps until Kodak decided to reduce flicker by upping their 8mm home movie camera's frame rates. It also sold more film of course and the rest of the industry followed their lead. If such an inaccuracy appears in the first couple of paras, why should we trust what's written in the rest of the article?

    1. Anonymous Coward
      Anonymous Coward

      I noticed several other glaring inaccuracies. For example, the video standard in the United States for decades prior to the relatively recent conversion to HD in over-the-air broadcasts (and the standard still used by many cable companies) is NTSC. The NTSC frame rate is 29.97 fps with 525 scan lines - not that you would know that by reading the article.
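      For the record, the odd 29.97 figure falls out of a simple ratio; a quick sketch using Python's standard fractions module:

      ```python
      from fractions import Fraction

      # The colour NTSC standard slowed the original 30 fps by exactly 1000/1001
      # to keep the chroma subcarrier's beat with the sound carrier out of sight.
      ntsc_fps = Fraction(30) * Fraction(1000, 1001)   # exactly 30000/1001

      # 525 lines per frame gives the ~15.734 kHz horizontal line frequency:
      line_rate_hz = ntsc_fps * 525
      ```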

      Someone needs to find better sources than Wikipedia before writing a technical article, especially when writing about historical standards.

      1. Bob H

        Perhaps you should read the article authors Amazon listing?

    2. Francis Boyle

      Standard 8 ran at 16 fps. Super 8 ran at 18 fps (when silent - the sound version ran at 24 fps). But pre talkies 35mm ran at pretty much whatever speed the camera operator (and the projectionist) cared to crank it at. I'm simplifying of course - if you want the detail read Kevin Brownlow on the matter.

  35. Anonymous Coward
    Anonymous Coward

    organic transcoding

    I remember as a wee laddie at polytech watching a telly (it had to be repaired each week as the cheap paper-phenolic PCB kept cracking lands, probably why the original owner skipped it); the picture was mostly snow, but enough useful pixels came through that it was good enough to watch. The sound survived better than the picture, which was good because it required some concentration to reconstruct the picture, and after a while the brain can't process much more.

    Now everyone is lazy and wants a superlative picture and sound handed to them.

    1. cordwainer 1

      Re: organic transcoding

      Which one of Monty Python's four Yorkshiremen are you?

  36. BigScaryTiger

    Bring back Betamax!

  37. epsilon

    Frame rate v frame size

    Really enjoyed the great article !

    At least as far as broadcast TV is concerned, the other topic that particularly concerns me but wasn't really considered in the article was reduced picture quality caused by over-compression.

    I know this is one that has been discussed at length over the years but is still relevant. It is still a significant problem with SD TV, perhaps less so with current HD TV but may again rear its ugly head as we try to squeeze yet more information into limited bandwidth. If we want good quality pictures, we need to ensure that the broadcasters/regulators do not continue trying to fit the proverbial quart into a pint pot.

    So, it seems what we need is decent resolution, better frame rates and *sufficient bandwidth* to carry the signals.

  38. Jim Wilkinson
    Thumb Up


    Interesting comment on interlace. For the record, it was a neat attempt at the time to halve the bandwidth by deleting information the eye could not see so well. Crude, but effective. There's no excuse now with far better methods to get the transmission/recording bandwidth reduced which, for the most part, keep the important parts of the picture intact. In an ideal world, we'd shoot the original material at obscenely high picture rates and resolutions, then use the best techniques to get the bandwidth down to the available transmission rate or recording capacity for optimal reception at the eyeballs. We're not there yet, but we've now at least got away from the crude analogue compression techniques.

    Nice article BTW.

  39. Jolyon Smith

    A great article with just one point that I'd take issue with...

    HDTV was - and is - a significant and dramatic improvement over SDTV... on a big enough screen.

    The author neglects to address the dramatic increase in the size of screens (in the domestic setting) that coincided with the roll-out of HDTV around the world. Not so much in the US, but they already had big-ass back-projection CRT 60"+ sets when we Brits were still squinting at 26" tubes. But LCD and plasma screens made big-screen luxury practical even in the rabbit-hutch houses of Blighty.

    And once you get to/above 40" the static resolution of SDTV does become intrusive. Maybe more so in slow, relatively static images, but then not all TV is high-action sports, so it isn't unreasonable to devote at least a little effort to making sure that those sedate, static images will look good, when the eye ISN'T tracking a fast-moving subject across the field.

    We have also seen significant improvements in codecs of both broadcast and distributed media. Bluray delivers HD images, but it also delivers images with greater dynamic range - true blacks and far less intrusive colour and contrast stepping and blocking (to the point of being non-existent to most people).

    So I don't think it's fair to say that the industry hasn't acknowledged that static resolution isn't the only goal, and neither is it quite accurate to say that static resolution isn't at all important - as I say, as a medium, film and TV have to allow for the fact that some images will be (at least relatively) static, so you can't just ignore that aspect completely.


      Re: A great article with just one point that I'd take issue with...

      > Not so much in the US, but they already had big-ass back-projection CRT 60"+ sets when we brits were still squinting at 26" tubes

      My first big screen TV was one of those. HDTV sets are a dramatic improvement over those beasts.

    2. M Gale

      Re: A great article with just one point that I'd take issue with...

      I consider HDTV to be the point when shite over-compressed digital television finally caught up with broadcast-quality SD from 30 years ago.

      Seriously, I know people who subscribe to HD channels and watch them with an old CRT, because the resultant picture isn't full of MPEG artefacts.

      1. James Hughes 1

        Re: A great article with just one point that I'd take issue with... @m Gale

        I'm confused. How is their CRT getting rid of the MPEG artefacts, or have I misread your post? It still has to display what it's given, and it's given an image with MPEG artefacts.

        1. M Gale

          Re: A great article with just one point that I'd take issue with... @m Gale

          I mean they are watching HD channels on an old SD CRT screen, because it gives a better picture than the shitty blockfest that is standard-def "digital television".

          Basically, HDTV finally fixed what digital television broke in a horrific manner and then tried to proclaim as "better".

  40. Anonymous Coward
    Anonymous Coward

    Fine point about motion, but what about static pictures?

    Not everything on-screen is in motion all the time, so resolution still counts for something.

  41. OrsonX

    HD Tennis

    Looks like a computer game! As everything is in sharp focus!

    SD tennis looks more "real" to me...., or perhaps it's just my myopic eyes.

    Also, the article author said: "that HDTV programme you are watching with 1080 lines is interlaced".

    Surely not the case if it says: 1080p

    1. M Gale

      Re: HD Tennis

      Your telly might be 1080p, and some of the media you might watch through it might be 1080p, but the majority of broadcast media is still 1080i or lower.

  42. Anonymous Coward
    Anonymous Coward

    This article was very interesting and no doubt well researched. But it all falls down on this line: "Some movies are being shot at raised frame rates, generally to audience approval."

    In fact the 48fps version of The Hobbit was consistently visually distracting, and for most critics and this cinema viewer a failure. That The Hobbit made lots of money in its 24fps, 3D and HFR versions doesn't equal audience approval.

    Scientifically the 48fps film should look and move "better" than the 24fps version but on the evidence shown thus far doesn't. Is there some human element going beyond tradition to account for this?

    1. M Gale

      "Is there some human element going beyond tradition to account for this?"

      Probably not.

      Try just showing the film and not telling people it runs at 48fps. See what the approval rating is then, particularly amongst people who said that 48fps was crap when they were told what it was.

  43. triceratops triceps

    soon enough

    not images will be broadcast, but a set of instructions from which our TV sets will generate the moving pictures.

    1. Rimpel

      Re: soon enough

      That's essentially what mpeg compression is - a set of instructions on how to recreate a moving image!

  44. Anonymous Coward
    Anonymous Coward


    The author seems to have only the most basic understanding of the neuroscience underlying human vision. I'd be surprised if more than 1% of the population can detect the flaws in visual representation pinpointed by the author because of the amazing ability of the nervous system to filter out "noise", even of the types pinpointed by the author.

    With so many unsubstantiated assertions, some references would be nice.

    1. Anonymous Coward
      Anonymous Coward

      Re: hmmmm

      Try reading some of the author's books, or even just Googling his background, and then see what you think about his "basic understanding". A short article on the Reg is inevitably only going to be able to scratch the surface.

  45. StanM

    HFR Kool-Aid is the best!

    I completely agree with the assessment that motion compensation is the next logical frontier. However, Hollywood is a bureaucracy - 60fps and above entails massive, expensive hardware changes. I would like to believe that HFR will prevail, preferably at the 120fps level. I have an FRUCing projector and already I cannot abide 24fps video when the upconverting feature is disabled; regular movie speed looks cheap to me. I love the crispness of HFR, even when 24fps movies are upconverted by my projector, which is really all I have until The Hobbit HFR is released on video. Unfortunately Blu-ray doesn't support 48fps 1080p, so I'm a little curious about how they are going to do that.

    1. M Gale

      Re: HFR Kool-Aid is the best!

      I have an FRUCing projector

      Naughty boy, mind your language!

      And wash that mouth out before I do it for you!

  46. William Higinbotham

    Vision Study

    My problem is that I personally cannot follow moving objects very well. I think it comes from closing my eyes when something came at me fast (reflexes) when I was young. How the eyes and brain work together is still being studied. An article I saw was "The Myth of Persistence of Vision Revisited," Journal of Film and Video, Vol. 45, No. 1 (Spring 1993), but I could not understand it. Here are the notes; the article itself is copyrighted.


    [1] A sampling of recent texts that perpetuate the notion of persistence of vision are:

    • Steven Bernstein, The Technique of Film Production (Boston: Focal Press, 1988) 3 ("retention of image").

    • Thomas W. Bohn and Richard L. Stromgren, Light and Shadows, 3rd ed. (Mountain View, CA: Mayfield Publishing Co., 1987) 6.

    • Steven E. Browne, Film Video Terms and Concepts (Boston: Focal Press, 1992)

    • David A. Cook, A History of Narrative Film, 2nd ed. (N.Y.: W.W. Norton & Co., 1990) 1.

    • Louis Gianetti and Scott Eyeman, Flashback: A Brief History of Film (Englewood Cliffs, NJ: Prentice-Hall, 1991) 2.

    • Gorham Kindem, The Moving Image (London: Scott Foresman, 1987) 16.

    • Bruce Kawin, How Movies Work (Berkeley: University of California Press, 1992) 48.

    • Lynne S. Gross and Larry W. Ward, Electronic Moviemaking (Belmont, CA: Wadsworth Publishing Co., 1991) 81.

    • Gerald Mast and Marshall Cohen, A Short History of the Movies, 4th ed. (NY: Macmillan, 1986) 9-11, 28.

    • James Monaco, How to Read a Film (N.Y.: Oxford University Press, 1981) 2.

    • Edward Pincus and Steven Ascher, The Filmmaker's Handbook (N.Y.: New American Library, 1984) 2.

    [2] For an explanation of the "phi phenomenon" see Lloyd Kaufman, Sight and Mind: An Introduction to Visual Perception (NY: Oxford University Press, 1974) 368.

    [3] Portions of this historical survey were presented in Joseph and Barbara Anderson, "Motion Perception in Motion Pictures," in Teresa DeLauretis and Stephen Heath (eds.), The Cinematic Apparatus (New York: St. Martin's Press, 1980): 76-95.

    [4] See, for example, Andre Bazin, Qu'est-ce que le cinema? 4 vols. (Paris: Cerf, 1958-62); trans. (selection) What is Cinema? 2 vols. (Berkeley: University of California Press, 1967 and 1971); Georges Sadoul, Histoire generale du cinema (Paris: Denoel, 1948); and Georges Potonniee, Les Origines du cinematographe (Paris: P. Montel, 1928).

    [5] Joseph A. Plateau, as quoted in Georges Sadoul, Histoire generale du cinema, Vol. 1, p. 25 (in translation): "If several objects, differing gradually from one another in form and position, are shown successively before the eye at very short and sufficiently close intervals, the impressions they produce on the retina will join together without merging, and one will believe one is seeing a single object gradually changing in form and..."

    [6] For discussion of work done by Rudiger von der Heydt and Esther Peterhans on the response of cells in V1 and V2 to illusory contours, see Zeki 76.

    [7] In the motion picture (a series of rapidly presented, closely spaced images), the duration of each image (34.72 ms with two interruptions of 6.95 ms each), the interval between images or interstimulus interval (6.95 ms), and the spatial displacement from one frame to the next (generally less than 15' of visual arc) fall well within the parameters of short-range apparent motion.

  47. Anonymous Coward
    Thumb Up

    Explains a lot...

    Finally I know why watching pictures at the cinema always appear blurred to me...

  48. Jerome
    Thumb Up

    Brilliant article

    I finally understood why my home-made 720p60 movies looked higher resolution than my 1080p30 movies.


  49. Spoonsinger

    4000 pixels across the screen at only 50/60Hz?

    Shirely the fact that the screen dimensions would only be circa 63x63 pixels would be the problem?

    1. Spoonsinger

      Re: 4000 pixels across the screen at only 50/60Hz?

      Oops, misread... Carry on.

  50. Anonymous Coward
    Anonymous Coward

    If you want more of the science and current thinking on this...

    There is a published white paper from the BBC on this, with more work going on, as outlined in a recent blog post following an international workshop on the topic:

    25 frames per second (fps) is one matter for film, an artistic medium where motion is carefully controlled.

    Sports coverage, for example, where the motion is unscripted, is certainly going to need a higher frame rate and shorter shutter in the camera to ensure that the Dynamic Resolution matches any increase in Static Resolution.
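    A rough sketch of why shutter time matters for dynamic resolution. The helper and the numbers below are purely illustrative, not from the BBC paper:

    ```python
    def motion_smear_px(screen_width_px, crossing_time_s, shutter_s):
        """Smear, in pixels, for an object crossing the full screen width
        while the shutter is open for shutter_s each frame."""
        speed_px_per_s = screen_width_px / crossing_time_s
        return speed_px_per_s * shutter_s

    # A player crossing a 3840-pixel UHD screen in two seconds:
    # at 25 fps with a 1/50 s (180-degree) shutter the smear is ~38 px,
    # so the extra static resolution is simply blurred away; at 100 fps
    # with a 1/200 s shutter it drops to ~10 px.
    smear_25fps = motion_smear_px(3840, 2.0, 1 / 50)
    smear_100fps = motion_smear_px(3840, 2.0, 1 / 200)
    ```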

  51. Anonymous Coward
    Anonymous Coward

    Standard 8 film

    A Standard 8 film still frame always looked significantly less sharp than the perceived effect of the moving film.

    While delving into history: old 405 line TVs sometimes had a switch inside the case labelled "spot wobble". Apparently the idea was to diminish the visible gaps between the interlaced lines.

  52. Jim84

    Why not have both higher resolutions and higher frame rates? If we are going to build improved data networks that can handle one or the other, why not bite the bullet and make it good enough to support both?

  53. Piro Silver badge

    You can have both. It was born years ago.


    15/70 film ran horizontally at 48fps.

    Although that's kind of an ancient behemoth, and will need ungodly amounts of film.

    1. M Gale

      Re: You can have both. It was born years ago.

      Quarter of a metric ton per platter, apparently.

      Or 59.5238 Jubs in the preferred measurement unit for this esteemed organ.

  54. John 62

    'film look'

    Modern films look awful. Action films depend on fast action, but because the frame rate is so low you can't work out what's happening. I think the cinematographers of today grew up using camcorders and didn't care about frame-rate.

    But older films, particularly The Blues Brothers*, use the limitations of the format to their advantage. The thing that people think they like about 'film look', in my opinion at least, is that the lower frame-rate forces things to move slower, making everything a bit more thoughtful and giving the viewer time for everything they see to sink in. That and cultural bias. People just expect things to be a certain way after they've been conditioned to think that way.

    I came late to the HDTV/PS3 era of gaming and one thing that was a revelation to me was seeing Guitar Hero, of all things, at 50fps on a big TV. Everything was so sharp!

    * In my opinion, probably the best cinematography of all time.

  55. Dick Emery

    The answer lies in...

    ...variable framerates.

    High framerates for action where there is plenty of light; low framerates for low light (usually less action goes on in darker scenes, for the most part).

  56. stu 4
    Thumb Up


    It's strange that we are now in a situation where consumers can record video at a far higher quality than broadcast, both in resolution (bit rate per frame) and temporally.

    I shoot all my video in 1080p60 these days when I can (one of my pet annoyances is that Sony still restricts their cameras to 'PAL' in the UK, i.e. 50fps, even though the word 'PAL' is meaningless now).

    Watching 60p footage on a laptop or HDTV far exceeds anything broadcast.

    Personally I'm hoping that these are just more nails in the coffin of broadcast TV. So broadcast TV can't do 1080p60 at 28Mb/s? Who cares: our computers can, our compression formats (H.264) can, so stuff broadcast and just download and play.

  57. Dick Pountain
    Thumb Up

    Great article

    This is truly great tech writing with a serious point. Reminds me why I keep coming to read El Reg (I sometimes forget when the Google-hating and climate-denying are too strong).

    1. Andrew Orlowski (Written by Reg staff)

      Re: Great article

      I can say with some certainty that nobody at El Reg denies the earth has a climate.

  58. Anonymous Coward
    Anonymous Coward

    I'm wary about spending on early UHDTVs when they hit the stores, after the overrated (IMHO) 3DTV revolution.

    i.e. it sounds good, but is it? Really? This article opened my eyes (heh) to thinking about more than mere resolution.

  59. Willy Wonka

    Excellent article... but is there no chance of something remarkable in the next ten years?

    This article, imho, reflects The Register at its best. Concise information presented clearly by an obviously world-class expert. The absence of the usual convoluted droog-language ornamentation is appreciated :)

    The article leaves me with the question of what possible next technology, optical, electronic, etc., could achieve, within ten years, some qualitative improvement in the perceived experience of watching film/video with "live" human interaction and motion.

    thanks, Bill

  60. Anonymous Coward
    Anonymous Coward

    Not all films involve movement

    This is an interesting, thought-provoking article, but it's a bit one-sided: it seems to assume that films = lots and lots of movement, but very often this isn't true. Kubrick's work was often full of lingering static or slow-moving shots; costume dramas (or anything involving vivid textures) tend to emphasise static high-resolution shots; dialogue-based films are very often static, etc.

    Even in a blockbuster action film like Jaws, arguably its most powerful scene is where Quint sits down and tells his war story: he doesn't move, no one else moves, the camera doesn't really move either. High frame rate would be wasted in such a scene; resolution would be much more important, so you could see the nuances of the actor's facial expressions.

    The one-size-fits-all tone of this article is a bit odd. Don't we need good-looking static stuff just as much as good-looking movement? Isn't cinematography always going to be a balancing act, considering the diversity of material produced for cinema/television/the internet?

  61. Rob Moss

    Not entirely true

    I'm not sure whether the author has tried watching any of Wimbledon in 480i as compared to 1080i, but the difference is most certainly worth having. Sure, the motion compensation gets horribly confused every now and then causing the ball to jump around for no particular reason, but at least in 1080i I can actually see the ball.

  62. chiller

    Holy shit ...

    .... link

    1. Nick Ryan Silver badge

      Re: Holy shit ...

      Hahahaha.. That's almost as utterly messed up as the Monster Cables website.

      So far the gem of the site has to be the "13 amp - High Performance Hi-Fi Fuses"... from only £34.94.

      And the inevitable techno-babble bullshit: "High performance fuses will protect the circuit but the integrety of the supply line is vastly upgraded, Like a good power cable allow 50+ hours to bed in."

      I'm also going to have to have serious words with my "bad" power cables that have a very nasty habit of obeying the laws of physics (and sense) and tend to work straight away without requiring 50+ hours to start to work properly.

  63. Vic

    I'm probably not the first person to say this...

    ...But the author has made so many mistakes in this piece, it's simply garbage.

    I'm going to pick on just one topic - motion estimation. The author claims this is to deal with eye tracking.

    Utter, utter nonsense. Motion estimation is used to get a best-estimate of what the camera is doing so that temporal compression (aka inter-frame compression) can be used. This essentially says "this frame is just like the last one, except shifted a bit". You then only need encode the delta between that predicted frame and what you really want, and you have an efficient use of your bandwidth. This is basic MPEG-2 stuff...
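
    In code terms, that block matching boils down to something like the toy full-search sketch below (illustrative only, and not the author's or any specific encoder's implementation; real MPEG-2 encoders use half-pel precision and much smarter search strategies, and the block and search sizes here are arbitrary choices):

```python
def best_motion_vector(prev, cur, by, bx, bsize=8, search=4):
    """Full-search block matching: find the (dy, dx) offset into `prev`
    whose bsize x bsize block best matches the block at (by, bx) in
    `cur`, by minimising the sum of absolute differences (SAD).
    The encoder then only needs to transmit the vector plus the
    residual (the delta between prediction and reality)."""
    h, w = len(prev), len(prev[0])
    best, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bsize > h or x + bsize > w:
                continue  # candidate block falls outside the frame
            sad = sum(abs(cur[by + r][bx + c] - prev[y + r][x + c])
                      for r in range(bsize) for c in range(bsize))
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best

# A bright square, then the same scene panned 2 pixels to the right.
prev = [[200 if 4 <= r < 12 and 4 <= c < 12 else 0 for c in range(16)]
        for r in range(16)]
cur = [[200 if 4 <= r < 12 and 6 <= c < 14 else 0 for c in range(16)]
       for r in range(16)]
print(best_motion_vector(prev, cur, 6, 6))
# prints (0, -2): the block's content came from 2 pixels to the left
# in the previous frame, so only a tiny residual needs encoding
```

    The point being: the vector search is entirely about bit-rate efficiency, not about modelling eye tracking.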

    I might have a rant about the author's incorrect description of interlacing later - both the reason it is used and the fact that HD1080 is perfectly able to support progressive scan. But I'll read through some of the comments first to see how many others have beaten me to it.

    Disclosure: I've worked in digital TV for ~20 years now.


  64. mic check

    "...Utter, utter nonsense"

    Agreed, per basic MPEG-2 coding 101. I also (like an earlier post) noted how the figure/diagram depicts light emitting from the viewer's eyeball. This reminds me of diagrams showing light emerging from a camera, implying it was sucked in from the user's display.


  65. Egtverchi

    Another Crucial Aspect of the Human Eye

    Previous posts have touched on the difference between peripheral and central vision as regards flicker, and color, but there is a resolution issue of which most people - and all marketers - seem to be unaware. Detail vision occupies a tiny area, only about the size of an American dollar coin held at arm's length. Everything else is peripheral vision, which becomes progressively worse the farther out you move. Try this experiment: turn to a news channel with a "crawl" along the bottom; fix your eyes on the left side of the crawl; and WITHOUT moving your eyes, try to read the right side - or even the middle. You can't.

    This has huge implications for the role of screen size. If you get a two metre UHDTV, the resolution will look fabulous - in the tiny area covered by your detail vision (and, as pointed out in this article, on a still image) - but the entire rest of the picture is viewed with peripheral vision, which is substantially below VHS tape! So in fact a smaller screen, while it may not dazzle the corners of your eye, actually gives you much better functional resolution, because more of the image will be viewed with the center of your eye.
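
    Rough arithmetic behind this claim, as a back-of-envelope sketch (the ~2 degree figure for foveal detail vision is the commonly quoted value; the 3 m viewing distance is an assumption):

```python
import math

# What fraction of the picture falls inside the fovea's ~2-degree
# cone of detail vision, for a 1 m vs a 2 m wide screen?
fovea_deg = 2.0     # commonly quoted angular size of detail vision
distance_m = 3.0    # assumed viewing distance
fovea_width_m = 2 * distance_m * math.tan(math.radians(fovea_deg / 2))

for screen_w in (1.0, 2.0):
    frac = (fovea_width_m / screen_w) ** 2   # rough area fraction
    print(f"{screen_w} m screen: fovea covers ~{frac:.1%} of the picture")
```

    At 3 m the foveal patch is only about 10 cm across, so roughly 1% of a 1 m screen's area is seen in detail at any instant, and a quarter of that on a 2 m screen; the rest is peripheral vision.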

    This isn't to say that all screens should be small, of course. I'm simply pointing out that if the filmmaker is capable of composing his or her images with interesting detail throughout the entire field, with a room-filling TV screen you will actually be missing more than you are seeing.


      Re: Another Crucial Aspect of the Human Eye

      Not totally correct, as the eye scans the fovea over objects so that all detail is then reconstructed in the brain.

      A 2000mm screen at 4k x 2k is a whole different experience from 1920x1080 at 2000mm. It is much more immersive, as detail wherever we look convinces us that things are real.

  66. Al, PFM

    The powerline frequency wasn't used as a time base for analog TV; the purpose of having the vertical sweep rate the same was, as mentioned by others, to keep 50 or 60 Hz power supply ripple stationary on the screen.

    Analog TVs had (nominally sawtooth) oscillators controlling the vertical and horizontal sweep of the electron beam over the face of the CRT. The video signal contained a "blacker than black" sync pulse at the end of each horizontal line and at the end of each of the interlaced half-frames (fields). Most commonly the vertical sweep oscillator would be set to run slightly lower than the field frequency when no signal was being received, but with a video signal each vertical sync pulse would trigger the oscillator to begin another sawtooth and sweep the beam down the face of the picture tube. The horizontal oscillator was more commonly configured as a multivibrator locked to the horizontal sync pulses by a two-diode phase detector.

    There was no need for an ultra-stable timebase, since the triggering of the vertical and horizontal sweep was a function of the broadcast signal itself. All that was necessary was to get the oscillators reasonably close to the correct frequency so they could lock to the sync pulses -- this is what the vertical hold and horizontal hold controls were for. I must be very old, because nobody remembers this!

    In the U.S. at least, the vertical field frequency was originally 60 Hz but was reduced to a fraction of a Hz lower when color was introduced.

  67. mic check

    "Saccadic eye movements", i.e. how we linearly parse image details and in turn sequentially assemble a composite image. HVS dynamic range, or simultaneous contrast, per the human eye/brain resolves ~100-150:1 max. Through these saccadic eye movements we are able to perceive a white rabbit scurrying atop pure white snow. Edge details are resolved by the eye dithering over an object's edge or boundary. It is not a revelation or news that image reproduction is merely illusion; what is paramount is that "it works" near perfectly.

    Our current 2K format is sublime enough for a near-35mm film experience. Until 1080p @ 60 Hz is realized, there is not much downside to the 19.39 Mb/s MPEG-2 @ 1920 x 1080i format. The current technical and fundamental problem is multiplexing more than one DTV channel. Until we technically cut bait with the MPEG-2 codec altogether, the best one can do is 1080i.

    Also the rumor 1920 x 1080i = 1920 x 540P is lacking empirically.



      Also the rumor 1920 x 1080i = 1920 x 540P is lacking empirically.

      Depends how you look at it.

      If I shoot 1080i at 50 fields per second, each of those fields, when displayed progressively, is going to have a vertical resolution of 540 lines. That is all the vertical information which is recorded per field. If I play this back so that the viewer sees both fields at the same time (typical television), we are asking the brain to do some pretty weird things (the brain is good at this). First it has to blend the two fields together, then interpret the sequence of still images as motion. The problem with recording interlace (not psf) is that the second field is acquired after the first field but presented at essentially the same time. Any, and I mean any, motion is going to cause our brains to blur the image so that we can put it back together in our heads.

      I just don't understand why modern TVs do not present interlaced recordings as progressive with double the frame rate. Try it out on your computer: subtle, but visible. Motion-vector sampling for the uprez, instead of just linear interpolation, does make a difference.


    Shoot 1080 50i and you get the temporal resolution. Present at 1080 50p using some uprez algorithm. Easy to do if you're watching on a computer. Why won't your TV do this?
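
    What the poster describes, 50i presented as 50p, is essentially "bob" deinterlacing: each field is stretched to full height and shown as its own frame. A toy line-doubling sketch (a real TV would interpolate between lines, ideally motion-adaptively, rather than simply repeating them):

```python
def bob_deinterlace(frame):
    """Split one interlaced frame (even lines = field 1, odd lines =
    field 2) into two full-height progressive frames by repeating each
    field line: 25 interlaced frames in, 50 progressive frames out."""
    top = frame[0::2]      # field 1: even-numbered lines
    bottom = frame[1::2]   # field 2: odd-numbered lines
    out = []
    for field in (top, bottom):
        full = []
        for line in field:
            full.append(line)   # simple line doubling; a real TV would
            full.append(line)   # interpolate, ideally motion-adaptively
        out.append(full)
    return out

frame = [[0], [1], [2], [3]]   # 4-line frame, lines tagged 0-3
f1, f2 = bob_deinterlace(frame)
print(f1)   # [[0], [0], [2], [2]] -- field 1 doubled to full height
print(f2)   # [[1], [1], [3], [3]] -- field 2 doubled to full height
```

    Each output frame carries only its field's 540 lines of real vertical detail (for 1080i), but the full 50 Hz temporal resolution is preserved.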

This topic is closed for new posts.