@ Giles Jones
Hz myth? It looks like you've fallen for the Hz myth.
You've said off your own bat that a 50Hz refresh "looks flickery" on a TV, which in the same sentence implies the eye can detect more than 50Hz; it has to, for 50Hz to appear flickery in the first place.
The human eye starts to perceive a sequence of frames as clear, fluid motion at about 25-30 fps, but it is certainly not limited to 50 frames per second (a 50Hz refresh in this case). That is a widespread, ill-conceived myth born from the fact that the film industry set minimum frame rates for its motion pictures.
It really depends what you're looking at, but remember the input to your eyes is a continuous stream of light being translated into images by the brain. A good explanation of differing stimulus situations can be found here:
http://www.100fps.com/how_many_frames_can_humans_see.htm
but be under no illusion at all: the eye can differentiate light from dark quite easily at 200 fps, and the article even recommends 500 fps to capture all the information about anything that flickers, blinks or moves. The most important factor is being able to understand and make detailed sense of what you're looking at, which is why lower frame rates are more suitable for detailed streams.
If you can't notice a difference between 50 fps and 100 fps in a game of Quake, you're either kidding yourself or you've got very poor eyes indeed. You might not interpret the difference as a lack of flicker, but you should certainly notice the motion becoming smoother. At the end of the day, 200Hz TVs are highly beneficial over 50 or 100Hz ones, end of.
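To put some rough numbers on it (simple arithmetic of my own, not figures from the linked article): each doubling of the refresh rate halves the time a single frame sits on screen, so anything that flickers, blinks or moves faster than that interval gets missed or smeared:

```python
# Per-frame display time at common refresh rates.
# Illustrative arithmetic only: interval_ms = 1000 / refresh_hz.
for hz in (50, 100, 200, 500):
    interval_ms = 1000 / hz
    print(f"{hz:>3} Hz -> each frame held for {interval_ms:.1f} ms")
```

So a 50Hz screen holds every frame for 20 ms, while a 200Hz screen holds it for only 5 ms, which is why fast motion looks noticeably smoother even if you'd never describe the difference as "flicker".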