How to make DisplayPort an instant success...
Bypass HDCP
Market watcher In-Stat reckons DisplayPort, the new PC-centric monitor connection system, will trounce DVI to feature in 600m shipped products in 2012. Earlier this year, In-Stat forecast that DVI will be included in just 3m devices by 2011, down from 112m in 2007. In the main, that's because vendors will opt instead to fit …
How does PC *AND* multimedia (TV/HD DVD/console) connectivity compare to just PC connectivity?
DVI is just right, I think. I'm not a fan of HDMI, but at least it's cross-compatible, and both have integrated audio.
I think the winner has already been decided.
Paris, because everyone else does.
DisplayPort is pointless. It has about the same bandwidth (360MHz) as dual-link DVI (usually at least 330MHz, with no official upper limit) and single-link HDMI 1.3 (340MHz), and much less than HDMI 1.3 type B (680MHz) - although the HDMI consortium shot themselves in the foot by not making the connectors interchangeable like single- and dual-link DVI. Even going from 330MHz to 360MHz doesn't start allowing WQUXGA displays to be driven down a single cable at full refresh, and we can already drive 2560x1600 panels quite nicely with dual-link DVI.

There's some ability to combine a high clock rate with high colour depth (which HDMI can do, but DVI needs to waste the second link for high colour), but for as long as desktops are only 24bpp, the only benefit of higher colour depth is a more accurate look-up table - something Eizo and NEC have been doing better by putting it in their monitors for years.

It's not appreciably smaller than the type A and B HDMI connectors, although admittedly it's less prone to falling out, but it's huge compared with HDMI type C. On non-portables, DVI connectors are by far the most robust.
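To put those clock numbers in context, here's a rough back-of-envelope sketch (assuming a CVT-RB-style blanking overhead of about 12% - real mode timings vary) of the pixel clock needed to drive those panels at 60Hz:

```python
# Back-of-envelope pixel clock for a display mode at 60Hz, assuming a
# CVT-RB-style blanking overhead of roughly 12% (real timings vary).
def pixel_clock_mhz(h_pixels, v_pixels, refresh_hz=60, blanking=0.12):
    return h_pixels * v_pixels * refresh_hz * (1 + blanking) / 1e6

for name, (h, v) in {
    "2560x1600 (dual-link DVI territory)": (2560, 1600),
    "3840x2400 (WQUXGA)": (3840, 2400),
}.items():
    print(f"{name}: ~{pixel_clock_mhz(h, v):.0f} MHz")
```

On those assumptions, 2560x1600 lands around 275MHz - comfortably inside dual-link DVI - while WQUXGA needs roughly 620MHz, beyond 360MHz DisplayPort and single-link HDMI 1.3 alike.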
DisplayPort will add complexity and cost (cards and monitors will need to support multiple standards for years in order to work with the installed base), possibly cut down on connectivity (going from 2xDVI-I to DVI-I + DisplayPort means I lose a CRT) and confuse the hell out of everyone (just because it's DisplayPort doesn't mean it can run at full speed). The patent/licence issues aren't resolved, and it's got HDCP DRM anyway because the industry decided they didn't want DPCP. We've already got monitors-cum-TVs with S-Video, composite, SCART, VGA, DVI and HDMI connectors. Adding another won't help matters.
DisplayPort does, indeed, drop the "backward compatibility baggage" (although so does DVI-D, without causing such an upheaval). Given the choice between a standard that supports extra functionality and one that doesn't, and given that the older standard(s) will be sticking around anyway, I don't feel the need to move to the latest big thing. Sadly, it sounds like I'll be paying for it whether I want it or not. I blame Dell for feeling the need to be leaders in some kind of technology, whether it's useful or not (now, did the 3008WFP disappear because of DP issues?)
*Mutter*.
VGA, SCART, S-Video and hell, even composite (just the yellow plug) I see being used day in, day out. Hell, I still use S-Video on my 28" Sony Wega CRT TV: an HTPC with MediaPortal and an Nvidia graphics card looks fine to me. Maybe when HD content becomes available on Freeview - but even then, 1024x768 is available with standard S-Video, so maybe when this set dies and I can get something for the 250 quid I paid for this one.
/coat, with "I'm not an HD mug" on it
Can't they make some sort of breakout cable/box that will software-switch between the physical graphics output chips in the PC, and have a standard interface? That way they could just stuff PCs with every flavour available, and it would only come down to what the connected equipment carries and the types of header cable people buy.
Man... I hope you mean you use S-Video for PC-to-TV connections, not for any DVD-to-TV connections. If so, get yourself an RGB SCART cable and see a massive quality difference :) I have my old 32" Sony Wega in the bedroom now; I'm all HDMI in the lounge.
To me, HDMI does what DisplayPort can do and then some - another pointless introduction in the PC world!
Can't wait. No HDCP? Well, I doubt the graphics cards that ship with it will be able to display Blu-ray content, then.
Oh, and S-Video does not support 1024x768. It supports SD PAL and NTSC (I guess SECAM too, if you want to get arsey), so that's 576 or 480 vertical interlaced lines.
Why Paris? Well, the poor dear isn't getting out much these days...
Unless you're planning on having a bionic eye or two fitted, all this super-duper-mega-ultra high resolution is pretty pointless, for my money. You just adapt to what you're watching, and your brain filters out all the stuff it doesn't need. Unless you are scrutinising a portion of the picture, there are hardly any benefits to be had: the big picture (so to say) looks as good at 720p as it does at 1080p on a 42" telly!
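As a rough sanity check on that claim, here's a pixels-per-degree estimate; the ~3m viewing distance is an assumption, and ~60 px/degree is a common rule of thumb for one-arc-minute (20/20) acuity:

```python
import math

# Pixels per visual degree for a 16:9 screen: a rough check of whether
# the eye (~60 px/degree at one-arc-minute acuity) can resolve the
# difference between 720p and 1080p at a given distance.
def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    degrees_subtended = math.degrees(2 * math.atan(width_in / 2 / distance_in))
    return h_pixels / degrees_subtended

for h in (1280, 1920):  # 720p vs 1080p horizontal resolutions
    print(f"{h}px wide: ~{pixels_per_degree(h, 42, 118):.0f} px/degree at ~3m")
```

On those assumed numbers, even 720p comes out above the 60 px/degree rule of thumb at 3m, which is consistent with the claim; sit closer and the gap between the two reopens.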
An optical video connector, like the way you already connect audio components, is the next standard that will matter.
DVI and HDMI are already sagging with age and bad design. Bandwidth limits are already a problem for high-end users. The physical cabling is also much more expensive than simple plastic fiber.
Better yet, make a TV with a 10 Gigabit Ethernet port - the same speed as DisplayPort - that shows up via UPnP or Bonjour as a multimedia target. You would never run out of inputs, and the cabling limit would be enough to reach almost anywhere in your house.
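As a sketch of how such a network display might announce itself, here's a minimal SSDP (UPnP discovery) probe. The MediaRenderer service type is standard UPnP; a TV acting as a raw video sink over 10GbE is the commenter's idea, not a shipping product:

```python
import socket

# Minimal SSDP (UPnP discovery) probe: multicast an M-SEARCH and print
# the first response line from any MediaRenderer devices on the LAN.
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: urn:schemas-upnp-org:device:MediaRenderer:1",
    "", "",
]).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH, ("239.255.255.250", 1900))
try:
    while True:
        data, addr = sock.recvfrom(1024)
        print(addr, data.splitlines()[0])  # e.g. b'HTTP/1.1 200 OK'
except socket.timeout:
    pass  # no more responders
```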
Yes, but it had much larger cables! And large, properly screened cables are a problem... on the back of a 42" Plasma screen... that you don't move... or look behind...
Just for my information, and later gaming pleasure (all 0fps of it), which monitors support 4096x3072?
Also, why can't they just go the whole hog and stick 3G HD-SDI connectors (through a copper-to-fiber converter) on TVs? It's fiber optics, so it uses lasers and hence is cool, and it's interference-proof, with smaller cables (even with armouring) and longer cable runs possible, and it can handle ridiculous resolutions. AND it would mean that the cable manufacturers would make a killing in the custom-cables market, as fiber is reputed to be a pain to shorten and re-terminate.
VGA and DVI are both big and chunky. They stay in because there's room for big arse screws to hold them. They don't break easily because there's enough strength in the connectors to withstand most bumping and knocking.
DisplayPort's connector alone is crap. It's too small, which means it will break, has fewer options for holding it in, etc. It's not a good thing for such a connector to be a tiny one. One bump and it's out, and possibly snapped.
DVI and HDMI can co-exist nicely, and that's about all we need.
"the big picture (so to say) looks as good at 720p as it does at 1080p on a 42" telly!" - wtf? what shite 42" TVs have you been watching? my lovely thoshiba 42" upscales from 720p to 1080p pretty well but i can see a definite difference between 720p and 1080p sources - this is more apparent on ps3 games - everything is that little bit sharper and nicer to look at.
"i hope you mean you use s-video for pc to Tv connections not for any dvd to tv etc connections. if so get yourself an RGB scart and see a massive quality difference"
Not so much. The encoding on a standard DVD is YUV, i.e. component, with the UV part substantially reduced in bandwidth during encoding, since the human eye is much less sensitive to colour than to brightness. S-Video (Y/C) just encodes the UV into a single C(hrominance) signal with very little additional loss.
The only real difference between linking with RGB versus S-Video is where the conversion to RGB is done, in the TV or in the DVD player, and the respective quality of those circuits.
Of course, if you've spent £200 on a gold-plated RGB SCART cable, you'll certainly want to believe there's a difference, but... :)
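For anyone curious what that conversion actually involves, here's a minimal sketch of the full-range BT.601 YCbCr-to-RGB maths that either the TV or the player performs (studio-swing scaling and chroma upsampling are deliberately omitted):

```python
# Minimal full-range BT.601 YCbCr -> RGB conversion, as done in either
# the TV or the DVD player; 8-bit values, chroma centred on 128.
def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return tuple(max(0, min(255, round(c))) for c in (r, g, b))

print(ycbcr_to_rgb(128, 128, 128))  # neutral mid grey: (128, 128, 128)
```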
"The only real difference between linking with RGB versus S-Video is where the conversion to RGB is done, in the TV or in the DVD player, and the respective quality of those circuits."
I would suspect that it is the difference in those circuits that makes the difference, though people might not know it - so some people might see a difference and attribute it to the RGB cable. In my own previous setup, using a Sony Trinitron TV, it certainly made a huge and visible difference - and I only had cheap cables connected... But then, most of the equipment I used might not have been of particularly great quality overall. Still - when RGB was used, the picture was better...
The same thing goes for HD - it has a lot to do with the particular circuits used and the actual implementation. I have seen many HD TVs with a very poor quality picture. Poor quality in many pixels... is not necessarily better than good quality in few(er) pixels...