Manual Settings
Why does it need to identify the frame rate? Can't it just have a setting to manually tell it to use 50Hz?
Google disappointed fans of its priced-like-a-pizza Chromecast TV dongle this week, when it said it won't be fixing an annoying video quirk that has some European customers' eyes twitching. The Chromecast, which lets users stream online content to their TVs while using their PCs and mobile devices as controllers, first went on …
It costs £30. Live with it. Most folk would spend £30 on a Friday night family pig-out down at Domino's pizza and not bat an eyelid, and it's all going down the toilet a few hours later. Suddenly when their super-cheap TV dongle isn't absolutely perfect it's the end of the world.
Meh :-)
If you buy it from Currys - or any other 'brick and mortar' store - you are protected by the Sale of Goods Act - it must be fit for the purpose for which it is sold. So it can just be returned if it doesn't play back video properly.
If you order it online, you are additionally protected by the Distance Selling Regulations - and can send it back within a fortnight for no reason at all.
Maybe you should read the article again, a bit more carefully (the issue isn't the TV).
Although it does raise the question: when you use Netflix et al on a regular TV, does the TV know to switch to the right frequency to avoid the same problem? And how do you even find out what FPS iPlayer, Netflix, Amazon, etc. use for their streaming content in the first place?
@JDX
Netflix has segregated data silos for content for different markets, to the point where stuff you have watched in the GB region does not show as having been watched when you log on in the USA region.
I imagine this means they can resolve the regional variation in hertz problem at source.
@AC
"Whose TV doesn't do 60Hz these days?"
You are misunderstanding the point. If the Chromecast is outputting at 60hz and the TV doesn't do 60hz - i.e. is only 50hz - then it just won't work. No signal.
So this only affects TVs that can show 60hz, because if yours can't then you get nothing. (Said in best Gene Wilder-plays-Willy-Wonka rage voice.)
US content is 60p to match their TVs. The 60p content is sent to the Chromecast which outputs it at 60hz to a TV that displays it at 60hz. When someone in the UK watches US content on a Chromecast they get the same - 60p at 60hz, judder-free.
The problem occurs with EUROPEAN content, which is 50p. The Chromecast converts this to 60hz for output and the TV faithfully displays this content at 60hz. BUT, the conversion from 50 to 60 by the Chromecast introduces judder. The judder comes from the CC at 60hz and the TV shows the juddery, 60hz signal exactly as it gets it.
As the (European) TV could handle the native 50p content, if the Chromecast just output it without processing the results would be as expected.
This would be necessary if one was viewing 50p content on a US 60hz TV that wouldn't be able to display the content natively but is a pain on a UK TV that can display the content natively.
The Chromecast MUST be detecting the source material or else it wouldn't be able to output it as 60hz. What it is refusing to detect is the TV's supported frame rates and matching this up with the content to determine the best processing or lack thereof.
What is required to automate this is not overly complex. But if they are unwilling to do that work, all they need is a very simple setting letting the user select the output rate manually. All they have to do is provide a warning if the TV doesn't support the chosen rate.
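(A minimal sketch of the kind of selection logic being described here, with hypothetical names and nothing to do with the Chromecast's actual firmware: match the source frame rate against the refresh rates the TV advertises and only convert when there's no even match.)

```python
# Hypothetical sketch of the selection logic described above, not Chromecast
# internals: prefer an output refresh rate the TV advertises (via EDID) that is
# an even multiple of the source frame rate, so no 50-to-60 conversion is needed.
def choose_refresh_rate(source_fps, tv_rates_hz, fallback_hz=60.0):
    """Return the lowest advertised rate that is an integer multiple of the source fps."""
    matches = [hz for hz in sorted(tv_rates_hz)
               if abs(hz / source_fps - round(hz / source_fps)) < 0.01]
    return matches[0] if matches else fallback_hz

# A typical UK TV advertises both families of rates:
print(choose_refresh_rate(25.0, [50.0, 60.0]))     # 50.0 -> native, no judder
print(choose_refresh_rate(23.976, [50.0, 60.0]))   # 60.0 -> no even match, fall back (3:2 territory)
```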
But VLC is GPL (not sharing is stealing), meaning they'd have to give back their changes.
Google love the Apache (sharing is not stealing) licence; where you're allowed to view, modify and distribute the Source Code, but they aren't actually obliged to give it to you in order for you to be able to exercise these rights in practice. You can legitimately call an Apache licenced product Open Source, even if you only ever make it available in binary form.
Perhaps they meant 'it's hard to know the frame rates the TV will accept'?
I'm not hugely familiar with HDMI but isn't there some data coming back from the display listing resolutions and frame rates available?
(And 24 fps films are traditionally shown at 25 fps on PAL (and other 25 frames per second) systems - on 30 fps systems they do a dot and carry one method that repeats one frame in five. From memory, traditional telecine machines - and please remember it's at least thirty years since I last worked on one - could interpolate optically.)
I think Google are complaining about the incoming video stream. As others have pointed out, this doesn't seem to be a problem for other players, and I suspect it isn't actually a problem for the Chromecast either: if it simply chose 60fps for everything without detecting the source rate, it wouldn't be adding frames to pad a 50fps source up to 60fps, it would be playing the 50fps stream at 60fps and therefore at a noticeably faster speed!
> I'm not hugely familiar with HDMI but isn't there some data coming back from the display listing resolutions and frame rates available?
Yes, there is. This also applies to DVI. I have had cause to examine it quite closely in my work with using Raspberry Pis as video players (oh, there's a £30 device that can sort itself out with regard to frame rates).
http://elinux.org/RPiconfig#Video
(Particularly the bit underneath the long list where it shows how to get the Pi to read the data out of the connected device)
Essentially it is possible to query a device connected via HDMI or DVI and it will reply with a list of "CEA" and/or "DMT" modes which it supports. It will also flag up which of those is its native mode. CEA modes are only usually found on devices designed as televisions as they are oriented towards common video standards, while DMT modes are found on both TVs (usually) and computer monitors.
I believe a similar facility exists for devices connected by VGA cable (cf the fact that your OS will give you a list of resolutions supported by your VGA-connected monitor or projector), but as the Pi doesn't output VGA I can't be specific.
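(A minimal sketch of that query, assuming the stock tvservice utility described on the elinux page above is available; the output parsing is illustrative only.)

```python
# Minimal sketch, assuming the Raspberry Pi's stock tvservice utility described
# on the elinux page above is installed. It asks the attached HDMI/DVI device
# for the CEA and DMT modes it supports; the parsing is illustrative only.
import subprocess

def supported_modes(group):
    """Return the mode lines the display advertises for 'CEA' or 'DMT'."""
    out = subprocess.run(["tvservice", "-m", group],
                         capture_output=True, text=True, check=True).stdout
    # Mode lines look roughly like "mode 31: 1920x1080 @ 50Hz 16:9 ..."
    return [line.strip() for line in out.splitlines() if "Hz" in line]

for group in ("CEA", "DMT"):
    print(group, supported_modes(group))
```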
Of course the Pi is even more flexible than that:
M.
> I'm not hugely familiar with HDMI but isn't there some data coming back from the display listing resolutions and frame rates available?
According to wikipedia it's v1.3 of the EDID standard. It's admittedly not perfect, but it's definitely there for them to use.
Some people don't. I was in a TV shop years ago when 100Hz was new and a couple came in asking which were the 100Hz TVs. The salesman went to find out, and I just pointed out the ones I could see. "You know a lot about TVs?" they asked. "No - I can just see which ones aren't flickering." I did wonder why, if they couldn't, they wanted a 100Hz TV.
Judder, audio and video artefacts are similar. They drive some people wild, and others barely notice. But -- it's still completely unacceptable for Google to manufacture an international product that only works properly in the USA. It doesn't cost less in the UK because it is less suitable - you'd have a good argument for taking it back as unfit for purpose (IANALBIPOOTI).
This is NOTHING to do with being American or where you live, just nonsense. It works no better in the USA than it does here.
The problem is with the source, not the device. If somebody in the USA chooses to watch a 25fps encoded episode of Doctor Who, they will surely get the same issue.
In reality, a massive majority of content is fully compatible. US TV dominates broadcasting.
"US TV dominates broadcasting."
Accepting this as true - for the sake of this discussion - the problem is not where the content was created but where it is broadcast/distributed from. An American movie, shot at 24p, is broadcast by American broadcasters at 60hz so that it works properly with American TVs.
If that 24p content is broadcast from a UK broadcaster then it is sent as 50hz so that it works properly with UK TVs.
Thus, if you get a 24p movie from a US source then it comes to your Chromecast from the streaming service at 60hz and is displayed by the TV at 60hz and all is good. If the 24p movie is from a UK source then it comes to your Chromecast from the UK streaming service at 50hz, where it is then converted to 60hz before being sent to the TV for display at 60hz.
It's the conversion by the Chromecast from 50hz to 60hz that causes the problem.
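(To make the judder mechanism concrete, here is an illustrative calculation, not anything Chromecast-specific: when a source is padded up to a 60hz display by repeating frames, the number of refreshes each frame is shown for follows a cadence, and it's the uneven cadences that look juddery.)

```python
# Illustrative only: how many display refreshes each source frame gets when a
# source is padded to a 60hz panel by simply repeating frames (no interpolation).
def repeat_cadence(source_fps, display_hz=60, frames=10):
    """Refresh count for each of the first `frames` source frames."""
    return [((i + 1) * display_hz) // source_fps - (i * display_hz) // source_fps
            for i in range(frames)]

print(repeat_cadence(24))  # [2, 3, 2, 3, ...]    the familiar, regular 3:2 pulldown
print(repeat_cadence(25))  # [2, 2, 3, 2, 3, ...] uneven cadence = the visible judder
print(repeat_cadence(50))  # [1, 1, 1, 1, 2, ...] every fifth frame shown twice
```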
I don't like 3:2 pulldown.
24 frames/second film on 60Hz: my last 4:3 TV could take 60Hz RGB, but not NTSC. Never tested the WS CRT with NTSC, just went RGB.
Prefer the BluRay way of letting an HDTV handle 24FPS
As to not noticing, maybe on computer monitors they are happy.
But 50Hz uploaded to Youtube gets converted quite dodgily to 60Hz.
I own 3 Chromecasts and can confidently say this is a complete non-issue for me; never noticed it once, and I'm hypersensitive to this stuff.
Analysing closer, it must be because I simply never really watch any 25fps content on it. Pretty much all the video I watch is US or blu-ray sourced. You won't notice judder on YouTube, for example.
The majority of my video collection is on DVDs encoded at 25fps, but the odd few are 24fps and some are 29.97fps, and I have never seen an issue with 24fps, 25fps, 29.97fps playback on any of my Chromecasts.
I completely agree it's a non-issue. People really need to stop whining about imaginary problems with a $30 device.
48 FPS now
The problem was how TVs were made, and lighting, in the late 1930s when EMI, RCA and Germany started electronic TV. It had to be the same as the local mains frequency. They used interlace as a clever 50% (temporal) analogue data compression scheme. Static detail is essentially twice the quality of fast-moving detail, but we can't see as much detail in fast movement.
Digital should have been 48 fps progressive from the start with 96 fps as an option. But the cheapskates wanted to be able to sell off spectrum to Mobile and save money on Satellite. Hence we have the HUGE number of broadcast and disc digital frame rates, interlace and progressive, various anamorphic formats and many resolutions.
The problem isn't cinema which is moving to 48 fps UHD, progressive only, with some projectors doing 96fps interpolation, but TV.
The resolution problem with TV is being solved by moving up from 1920 x 1080, but the issues of interlace (a pig to interpolate to higher resolutions or frame rates) and of the many frame rates, all too low, remain.
You really need to stop propagating the high fps usefulness myth.
The most important thing to separate is the distinction between the frame rate and the flicker rate - the two are not the same thing.
Relevant quote from wikipedia:
"Persistence of vision is still the accepted term for this phenomenon in the realm of cinema history and theory. In the early days of film innovation, it was scientifically determined that a frame rate of less than 16 frames per second (frame/s) caused the mind to see flashing images. Audiences still interpret motion at rates as low as ten frames per second or slower (as in a flipbook), but the flicker caused by the shutter of a film projector is distracting below the 16-frame threshold."
24fps is plenty to display non-jittering motion, but 24Hz flicker is very obvious. This is why most film projectors have 2 bladed shutters (48Hz) or even 3 bladed shutters (72Hz).
More info here:
http://en.wikipedia.org/wiki/Persistence_of_vision
When it comes to modern TFT displays there is no flicker per se, so the refresh rate becomes a lot less relevant.
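(The frame rate versus flicker rate distinction above, as a trivial worked example; the figures are just the standard 24fps projector case.)

```python
# Just a worked illustration of the standard 24fps projector case: the shutter
# multiplies the flash (flicker) rate, not the motion (frame) rate.
frame_rate = 24                    # distinct images per second
for blades in (1, 2, 3):
    print(f"{blades}-bladed shutter: motion {frame_rate} fps, flicker {frame_rate * blades} Hz")
# 1 blade -> 24 Hz (obvious flicker), 2 -> 48 Hz, 3 -> 72 Hz
```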
Peripheral vision is more affected. You certainly don't need more than 48fps capture. Then distribution could be 24 fps or 48fps, but needs to be PROGRESSIVE.
Cinema has used x2 frame rate for display for a long time (mechanically, now electronic and triple DLP projector). Doing this with interlaced source is a problem. Only film source can be deinterlaced perfectly, which must be done to effectively interpolate for a higher frame rate display or higher resolution for a giant panel.
There is no need to TRANSMIT or distribute 96fps 3840 x 2160 etc. If the source is 48fps UHD, anti-aliased to 1920 x 1080 @ 48fps, PROGRESSIVE, then displays can do excellent up-scaling and, if required, frame interpolation. Interlace, OTOH, gives best quality left unchanged.
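(A small sketch of the "only film source deinterlaces perfectly" point, assuming numpy: both fields of a film-sourced frame come from the same instant, so weaving them back together reconstructs the progressive frame exactly, whereas fields from a true interlaced video camera are 1/50s apart and weaving those produces combing.)

```python
# Sketch of the point above, assuming numpy. Both fields of a film-sourced
# frame were exposed at the same instant, so "weaving" them back together
# recovers the progressive frame exactly; fields from a true interlaced video
# camera are 1/50 s apart, and weaving those produces combing instead.
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two fields (each height/2 x width) into one progressive frame."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame

# A film frame split into its two fields weaves back to the identical frame:
original = np.arange(8 * 4).reshape(8, 4)
assert np.array_equal(weave(original[0::2], original[1::2]), original)
```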
Most digital content isn't interlaced to begin with. Interlace is a disappearing remnant of the analogue age of CRTs. For example, DVD content isn't interlaced, what you get on a 50Hz display is each frame of the 25fps displayed twice.
No content is interlaced at source (film is not interlaced, contrary to what you imply by mentioning de-interlacing it). The whole interlacing issue only arises when recording from broadcast TV source. If it's analogue, the definition will typically be poor by today's standards anyway. If it's digital and the broadcaster is using interlaced format to squeeze more bandwidth out of their cable network, they are almost certainly also over-compressing the video to the point where it looks horribly blocky anyway.
> No content is interlaced at source (film is not interlaced, contrary to what you imply by mentioning de-interlacing it). The whole interlacing issue only arises when recording from broadcast TV source. If it's analogue, the definition will typically be poor by today's standards anyway. If it's digital and the broadcaster is using interlaced format to squeeze more bandwidth out of their cable network, they are almost certainly also over-compressing the video to the point where it looks horribly blocky anyway.
Many (all?) European broadcasters doing live events come straight out of the back of the camera in 1080i, keeping it interlaced throughout the entire broadcast chain, and long before there's any compression done.
Film 35 mm is still very much in use by great filmmakers for its artistic qualities, and compatibility with such an important medium cannot be trashed for the sake of novelty.
Really, what are all the millions of films and documentaries made at 24fps going to play on?
24 needs to stay.
It's not just about blur or persistence. Not just.
Different combinations of frame rate and shutter/refresh rate produce different artistic effects. Whatever the original reasons, 24 frames at 48hz is what we are generally used to and anything faster looks strange and home-movie like.
That said, different combinations can be employed - judiciously - to achieve certain effects.
The simple fact is that 24p @ 48hz is not the same as 48p - something that the Hobbit movies have shown.
It is largely personal preference, which is why said movie was released in conventional 24p as well!
The point is that movies are an art form and things like frame rate and refresh rate are like brush strokes, and serve to provide a texture and feel to a movie. We instinctively feel that higher frame rate movies are 'too realistic'. Perhaps that may one day be seen as desirable but, as a visual art form, neither option is, objectively, better, which is why some film makers mix and match in their shooting to achieve the correct effect at the correct moment.
The Chromecast was used purely to get streaming going. I bet they will discontinue the Chromecast and push the Android TV set-top box as its replacement, resolving all the issues that could have been resolved with regional firmware chosen by location and, as you say, downloaded and set up at first boot.
It's mis-sold in Tesco with little explanation.
I'll stick to using a hidden £2 HDMI cable spliced to 2 x screened Cat5e running to the Media PC in another room. It also has an IR remote sensor at the TV over Cat5e. The USB hub at the TV simply uses 2 x max length USB cables and a USB repeater in the middle.
Also use laptops, phones, tablets with TV.
The Chromecast needs a Laptop/Tablet anyway.
My first Amazon Fire TV went crazy bonkers, wrestling to handshake with my telly to find a happy resolution and refresh rate. I had to send it back.
Amazon sent me a new one that was better but it still falls off the wagon sometimes. Out of the box it's 1080p/60Hz and my TV sometimes is happy to play ball, sometimes it's not.
In the settings you can choose all combinations of resolution and frame rate. Can't understand why the Chromecast won't. Since I bought an AFTV the Chromecast hasn't had a look-in in our house. Simple reason - remote control.
The same issue was raised with the Now TV box when it was launched. However, only a minority of people seem to have complained - but I suppose the content is targeted at a mass market that's apparently happy to watch stretched, garish and flickering pictures provided the screen is impressively large.
To be fair, I hadn't seen any juddery images, although I do have a nice TV which does iron out a lot of crap (24Hz compatible TV, AMP and BD player have removed any jerky BD videos for example).
However, since Android Lollipop landed, my previously working (on 4.4.4) "Cast Screen" functionality has gone walkies. YouTube and other apps work, but I (and others, judging by the forums) can't cast screen anymore...
Way to go backwards!
It's not just the Chromecast that exhibits this issue!
If you watch Amazon Instant Video or BBC iPlayer on a smart TV from one of the many major TV manufacturers (e.g. Sony, LG, Samsung etc.) the apps play out video at 59.94hz, which works fine for US TV shows, and 23.976fps movies pull down to 59.94hz quite well (this technique only gives a small amount of judder but has worked well on US TV for years).
....but if you watch a UK / European TV show via the iPlayer or Amazon Instant Video app, the tell-tale sign that things are broken is the end scrolling credits, which look hideous because the source frame rate is 25hz progressive (for 50hz interlaced sources they would have de-interlaced prior to encoding). If you then add the TV's built-in motion compensation algorithm (e.g. Motionflow etc.) this can actually make things even worse!
This problem also exhibits itself when watching through a web browser using Flash, as PC & laptop LCD monitors refresh at 60hz unless in full screen mode with the refresh rate deliberately changed to a multiple of 25hz in the video card settings (and even then this doesn't always work, because a lot of video cards can't maintain that exact frequency so end up dropping frames to ensure the audio stays in sync).
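(Rough arithmetic for that frame-dropping point; the 25.02hz figure is just an illustrative guess at a card that only roughly holds 25hz.)

```python
# Rough arithmetic for the frame-dropping point above. If the panel's real
# refresh rate differs slightly from the content rate, the player has to drop
# (or repeat) a frame every so often to keep the audio in sync. The 25.02hz
# figure is just an illustrative guess at a card that only roughly holds 25hz.
def seconds_between_corrections(content_fps, display_hz):
    """Time until the display has drifted a whole frame away from the content."""
    return 1.0 / abs(display_hz - content_fps)

print(seconds_between_corrections(59.94, 60.0))   # ~16.7 s: a frame dropped every 1000 refreshes
print(seconds_between_corrections(25.0, 25.02))   # ~50 s
```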
The interesting thing is that nobody in the wider media seems concerned that all of European TV is inherently broken for streaming media on all manner of PCs, laptops, phones, tablets etc.
Usually I like Google, and I sort of like/hate Amazon too, basically because they're companies that push the boundaries, innovate, and actually deserve to do well. However, it's arrogant shit like this that really pisses me off. The reason I mention Amazon is because their fourth-gen Kindle will not work on WiFi channels outside the US's 1-11 range. I want to use Ch.13 though because there's a shit load of other WiFi signals where I live, but of all my devices with WiFi, and I have a lot, ONLY the stupid fucking Kindle was designed with apparently no care in the world for anyone outside the US.
This whole multiple refresh rate thing has been unnecessary for years. The reason for it was to eliminate noise and other issues by matching the refresh rate to the local power frequency. This was only an issue with analog TV though and digital TVs aren't susceptible to the issues this was meant to address. Just create a worldwide standard, problem solved.
It's not like resampling the frame rate is any kind of exotic technology. Simple anti-aliased blending gets rid of the harsh judder. Some really fancy TVs will even analyze motion to produce intermediate frames by morphing. An extra or upgraded chip in the Chromecast should fix it.
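(A minimal sketch, assuming numpy, of the simple blending idea: each 60hz output frame is a weighted mix of the two nearest 50fps source frames, which trades the harsh repeat-frame judder for a little motion blur. This is the generic technique, not the Chromecast's actual pipeline.)

```python
# Minimal sketch of the blending idea, assuming numpy; this is the generic
# technique, not the Chromecast's actual pipeline. Each 60hz output frame is a
# weighted average of the two nearest 50fps source frames, trading the harsh
# repeat-frame judder for a little motion blur.
import numpy as np

def blend_resample(frames_50, out_fps=60, in_fps=50):
    out = []
    n_out = len(frames_50) * out_fps // in_fps
    for i in range(n_out):
        t = i * in_fps / out_fps                 # position in source time
        a = int(t)
        b = min(a + 1, len(frames_50) - 1)
        w = t - a                                # weight of the later frame
        out.append(((1 - w) * frames_50[a] + w * frames_50[b]).astype(frames_50[0].dtype))
    return out

# Five 50fps frames become six 60hz frames, the in-between ones blended:
frames = [np.full((2, 2), v, dtype=np.float32) for v in range(5)]
print([float(f.mean()) for f in blend_resample(frames)])  # [0.0, 0.83, 1.67, 2.5, 3.33, 4.0]
```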
.... but something crazy like 60000/1001 Hz, because when they started with colour TV (called color TV there) they found out their chroma sub-carrier was interfering with their audio sub-carrier. Instead of moving the audio, they simply changed the frame rate... which actually makes monochrome and colour TV in the US completely incompatible if you go by the specs. It also means that a show produced at a monochrome station will play slower at a TV station that has already switched to colour... and of course computers here and there use 60 Hz straight.
And of course they use this weird scheme where they cut off part of their chroma sub-carrier by bandwidth-limiting their colour difference signals in weird ways... which gives them the ability to squeeze their image into 4.2 MHz... while PAL can be limited to 4.33 MHz without having to resort to such a low subcarrier frequency and weird trickery.
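(A quick check of the 60000/1001 arithmetic mentioned above and the "plays slower" claim.)

```python
# Just checking the figures mentioned above: colour NTSC dropped the field rate
# by a factor of 1000/1001, so material timed for exactly 60 fields/s runs
# about 0.1% slow on a colour-era chain.
mono_field_rate = 60.0
colour_field_rate = 60000 / 1001
print(colour_field_rate)                                # 59.9400599...
print(100 * (1 - colour_field_rate / mono_field_rate))  # ~0.0999 (% slower)
```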
So far my experience with most Google software has not been good. It is released buggy when in actual fact it is, at best, a very early Beta.
It's up to us mugs who use it to sort out the problems. I have basically given up with most Google software as I want to do real work and not sort out their bugs.
What really frightens me is the Google self-drive car. If it's anything like their other software, there's going to be millions of dead people on our roads.