Any port in a storm: the display tech battle

Take a look at the back of a typical HDTV, and you'll almost certainly see at least one Scart connector; a handful of RCA jacks to feed in stereo audio, composite video and component video; and at least one High-Definition Multimedia Interface (HDMI) port... You might even see a VGA or DVI connector to allow …

COMMENTS

This topic is closed for new posts.
  1. wayne

    Debacle

    As I remember, the reason for UDI was so that companies could get out of paying a relatively small license fee for the HDMI interface and content protection scheme. Now it looks like they will have to equip monitors and cards with two ports anyway, license the content protection scheme anyway, and pass the bill on to the customer. There is little reason for anything other than HDMI, except maybe a cut-down version for computers and laptop LCD panels under the HDMI standard. Kill off DisplayPort and UDI.

    I wish somebody made a universal IO interface that could also stream to monitors, HDTVs and hard drives, so we could have just one sort of IO port that anything could be plugged into, inside or outside our computers. USB 2.0 is long overdue for an upgrade to USB 3.0. Adapted HDMI 1.3, 10GigE, PCI-E 2.0, external SATA and the like would be interesting candidates.

    Firewire was clearly superior except for price, but then the price difference fell, and fell further with USB 2.0. Firewire's design meant a lot less processor time being tied up, and better timing. As USB 2.0 approaches its maximum data rate these become substantial factors; imagine the system having to service six high-data-rate USB 2.0 devices. USB caused a major side track in the industry, had slow penetration, and delayed, and headed off, the arrival of the authentic alternative, Firewire. We are all the worse off as users for it. Is this now being repeated with these video standards and DisplayPort?

  2. Nick Ryan Silver badge

    But does the average end PC user care?

    The main reason that most PC end-users don't care about "new fangled" connectors is that they already have PCs and monitors that work and despite the urges and best efforts of the hardware and software industry, they're largely "happy" with what they have.

    Most PC end-users I've come across don't know (or even care) what DVI is, and most of their PCs and monitors are connected using VGA connectors. Why would they care about new connectors unless they have direct compatibility with what they currently have? Even more absurd is that where monitors and PC graphics cards both have DVI connectors, they're still using the VGA connector (via a DVI-A > VGA adaptor) because that's what they're used to.

    So to sum it all up, if it's not easily compatible with existing connectors, forget it. The average user won't care for it and the average tech-jockey who gets lumbered with supporting friends and family will just recommend equipment that has the backwards-compatible technology (this is especially true when upgrading piecemeal or when users buy a monitor-less system and just re-use their old monitor, for example).

  3. Anonymous Coward
    Anonymous Coward

    I agree

    I certainly agree. The whole business of Microsoft not reading the FireWire standard properly, so that FireWire 800 support was messed up, destroyed FireWire in a way.

    Firewire has the advantage of not needing a main host, so two devices can talk to each other without involving the PC. No need for hubs, and any device can add power to the bus and allow others to draw power. USB was just a way of avoiding a license fee, and it is a shame. I believe there is work on FireWire 1600.

    I also agree: this is a mess. Why not design a graphics port which is the same for all devices? Since we are now running digitally, it doesn't matter! No special TV-out on graphics cards or anything. If HDMI can handle DVI resolutions then go for it, just remove the encryption. It would also mean computers could play Blu-ray, although I don't agree with HDCP anyway.

  4. JJ

    What's the point?

    Simplicity, robustness, compatibility and low cost are the criteria that should always - and usually do - define what standards win out.

    Case in point - Firewire vs. USB.

    Firewire may well be faster, and have some technical advantages, but ultimately very few applications actually require the features and performance it offers - chains of disks and video cameras are pretty much it. The connectors aren't complex, but they're not the cheapest design either.

    USB provides a cheap, simple interface which is 'good enough' for the majority of purposes it's used for. Implementations of both host and device controllers are easily and cheaply available. And all sorts of devices are available that can connect via USB that would never have dreamt of going the Firewire route. And the connectors are simple and robust, and come in a variety of formats to suit specific applications. At the lowest level you can even get away with a raw PCB edge connector. It may well not be the best technical solution, but then again the winners usually aren't.

    As far as video interfaces are concerned, VGA still has the advantages of using a simple, robust connector format, and being compatible with any signal that can be fed through the cable. There isn't any licensing fee. There aren't 1.0, 2.0 or other flavours, merely whether the equipment being connected can support the signal timings, and you'll always find something that works even hooking old equipment to something brand-new. And the available bandwidth is huge, so no real timing or resolution limits. If ultimate image quality and performance is required, the analog interfaces still win out.

    The digital interfaces have all suffered from a few common problems, possibly because they're all variations of the same thing: bandwidth is inadequate, crippling the supported resolutions for the majority of implementations. The designs are inevitably proprietary, so there is always a cost involved in implementation. New versions of the interfaces are always being proposed to work around shortcomings - compatibility is an afterthought. And the most recent trend is towards tiny, fragile connectors that offer no real advantage apart from being cosmetically better - DVI is at least a proper 'industrial' type connection which won't kick loose or break off, the others are pathetically flimsy.

    The only real advantage of the move to digital interfaces has been the removal of the D/A and A/D converter stages, and that wasn't a particular problem anyway. All that seems to have happened is that the move to digital has introduced a lot of problems that didn't exist before, and as the encrypted, consumer-electronics, toytown design trend continues I can't see things particularly improving.

    So for now I'll happily admit to staying in the analog domain as much as possible, as the various digital options really don't seem to have much advantage to offer.

  5. DrFix

    To USB or not USB?

    USB is indeed simple, but for the love of all that's holy, why such a stupid connector that you're constantly having to flip because you can't tell which way is "up"? Oh, it may be stamped on the cord itself, but the devices you plug your memory stick into are another matter. There is no consistency. That's an annoyance I wish would disappear. FireWire 400 at least made it clear with how it was keyed, but FW800 uses yet another type of connector.... AAaaaarrrrghhh!!!!

    VGA connectors are tried and true, DVI for some mysterious reason doesn't get the respect it deserves, and along come HDMI and more.... Sheesh! I'm not even going to go near Blu-Ray or any other formats until the dust settles. There just isn't enough motivation, nor mad money, to get caught up in this cross-fire of dueling formats.

  6. Pascal Monett Silver badge

    Connectors

    As a consumer, all I see here is yet another connector specification.

    Frankly, I'm fed up with all this hackneyed hot air concerning something that could have been a lot simpler. I understand that technology changes, but given that the basic signals don't, why do I have to be confused by all these different models?

    HDMI, DVI, DPCP, good heavens people, get your game together!

    Let me be clear on one thing: as said above, VGA works fine. And if I want top quality, then RCA jacks are the solution.

    The rest is just there for the media industry to lock me down with arbitrary restrictions that have nothing to do with my rights and everything to do with their monopoly. I don't want anything to do with it.

  7. Chris Beach

    HDMI not for Pro's

    I thought that professional installers didn't like HDMI at all: the lack of a secure connection means loose connections are a problem, and that shouldn't exist on a modern connector.

  8. M Gale

    It's new therefore it's good?

    I agree with Nick Ryan and DrFix. At the moment I have a 19" CRT monitor that provides me with excellent picture quality at 1600x1200, and is passable at 2048x1536. It far surpasses any flat screen technology I've seen yet, and connects via ye olde VGA port. HDCP is just a good reason for me not to buy something, and I really can't be bothered with high-def disk formats until they reduce to the price that DVD drives are at now.

    Some people might call me a Luddite. I'd prefer to think that I'm not stupid enough to pay for something that will not give me one iota of noticeable performance increase.

  9. De Zeurkous

    Bleh

    Why don't they just develop a SCSI Display Command Set, employ IEEE1394 as a PHY, and shut up?

  10. De Zeurkous

    RE: To USB or not USB?

    Memory stick? I presume you mean a flash memory card in general (and not that awful piece of crap by Sony)? If that is the case, use CF. It's compatible with some important (in this field, that is) standards (PCMCIA-ATA, to be precise) and can be plugged into many devices with a simple wiring adaptor.

    USB isn't simple at all. It's a horrendous, overcomplicated kludge. Also, don't treat IEEE1394 as a separate interface standard; it's a PHY for SCSI, which was designed for flexibility, and should be treated as such. Anyway, a simple wiring adaptor can convert between the 4-pin and 6-pin connectors.

    VGA connectors do suck: pins always break and the signal quality is horrible. DVI doesn't get much respect because it doesn't deserve it. It's a quick-and-dirty signalling system using techniques ages old, which makes it a very worthy substitute for RGB/HV&DDC signals.

    See also my previous comment, if it gets through.

    Purely optical disks do suck. They always break and get scratched horribly during normal usage. I'd rather see MO technology developed more intensively. Personally, I prefer transporting not-for-archival data via IP-over-S800 (way cheaper than 1000Base-TX and approximately the same effective performance) instead, though...

  11. A J Stiles

    Time for Governments to Intervene

    Time for Government intervention, methinks! The EU could get together, pick one standard, annul any IP protection relating to it and mandate NO encryption. This should then become the dominant connector in the EU, simply because it can be implemented without paying royalties.

    Would the SCART connector ever have become popular if setmakers had had to pay a royalty on it?

  12. Anonymous Coward
    Anonymous Coward

    Title

    JJ, virtually anything USB could do, FireWire could do (admittedly a stripped-down USB 1.1-like mode might have helped for keyboards etc). But we are stuck with USB now, so if USB 3.0 could replace VGA and SATA it would be an easy no-brainer for users to plug and play. Plus the polling of simple USB devices greatly strained systems in the past.

  13. Richard Silver badge

    USB connectors...

    The really irritating thing here is that the USB spec does actually state exactly which way up the connector should go - the USB logo should be stamped on the top of the connector.

    But a lot of manufacturers ignore it - in fact, a lot of manufacturers ignore much of the spec, which means that there is a lot of fairly dodgy USB equipment out there.

    When it comes to the USB/FireWire debate:

    USB devices are extremely simple. This makes device (and hub) software and hardware easy to implement, which in turn means that there are many very cheap and very useful USB devices available - it's a near-perfect replacement for that previous cheap'n'cheerful connection - RS232.

    This cheapness means that anybody can make USB things - there are many off-the-shelf chips that just 'do' USB, both hubs and devices.

    USB hosts are more complex, but that's ok because people don't mind paying lots of money for a PC that has USB, because it's a PC.

    FireWire on the other hand, is very complex. FireWire devices can also act as hosts, and they need complex, almost network-like software and hardware.

    This is expensive, and means that only expensive kit can have it.

    This means that for the high-end HDD and video functionality, FireWire wins.

    But for the ubiquitous, cheap, random 'stuff' that people find useful, USB is perfect.

  14. Anonymous Coward
    Anonymous Coward

    Optical

    Surely with all the issues surrounding bandwidth and cable length, fibre-optic cable is the sensible way forward for any standard to have any longevity. Speaking as someone who has struggled with the issues of moving assorted video standards around homes and businesses over the years, it strikes me that copper just won't cut it. Many of us use TOSlink cables to move digital audio, and in my experience they have been the easiest to handle and most reliable cables.

    Or does the industry like copper because it's easier to justify "premium" cable products for audio- and video-philes to drool over, where they can make some margin back in an increasingly commoditised market?

  15. Andrew Garrard

    Backward steps

    Firstly, a factual correction: the type B connector is part of the HDMI 1.0 spec, not just 1.3, and it's pretty much equivalent to dual-link DVI-D. Almost nobody used it, just like it took years for dual-link DVI to appear on consumer graphics cards. HDMI 1.3's bandwidth increase uses the same connector as before, but increases the maximum bandwidth down the (single) link from 165MHz (DVI single-link maximum, HDMI 1.2 and below) to a 340MHz pixel clock equivalent. HDMI 1.3 retains support for the type B connector, so, if anyone uses it, you'd get roughly the same bandwidth as the upper end of DisplayPort.
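
    To put rough numbers on those pixel-clock limits, here is a quick sketch assuming 24-bit colour and a nominal 20 per cent blanking overhead - both illustrative figures rather than anything taken from a spec:

        # Rough pixel-clock estimates for a few display modes, assuming a
        # nominal 20% blanking overhead (reduced-blanking modes need less,
        # classic CRT timings need more). Illustrative only.
        SINGLE_LINK_DVI_MHZ = 165   # DVI single link, HDMI 1.2 and below
        HDMI_1_3_SINGLE_MHZ = 340   # HDMI 1.3 single link
        BLANKING_OVERHEAD = 1.20    # assumed; varies with the timing standard

        def pixel_clock_mhz(h, v, refresh_hz):
            """Approximate pixel clock in MHz for an h x v mode at refresh_hz."""
            return h * v * refresh_hz * BLANKING_OVERHEAD / 1e6

        for h, v, hz in [(1600, 1200, 60), (1920, 1080, 60),
                         (1920, 1200, 60), (2560, 1600, 60)]:
            clk = pixel_clock_mhz(h, v, hz)
            print(f"{h}x{v}@{hz}Hz needs ~{clk:.0f}MHz: "
                  f"single-link DVI {'ok' if clk <= SINGLE_LINK_DVI_MHZ else 'no'}, "
                  f"HDMI 1.3 single link {'ok' if clk <= HDMI_1_3_SINGLE_MHZ else 'no'}")

    On those assumed figures a 2560x1600 panel lands well above the 165MHz single-link ceiling but below HDMI 1.3's 340MHz, which is exactly the gap being described.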

    Begin rant.

    The display industry doesn't seem to be concentrating too hard on actually improving things with each update.

    First, we had a VGA cable. People got quite good at producing decent RAMDACs, and they gave a pretty good image at up to 2048x1536 at 85Hz on a cheapish 19" CRT. Certainly 1600x1200 at 75Hz was common.

    Then DVI got launched. Most people implemented only single-link (reserving dual-link for workstation cards), and, lo, these gave better images for driving the $1000 17" LCDs that were available at the time. The CRT-based consumers were rightly unimpressed, and the standard took years to find acceptance - and it's done so now only because millions of office workers decided that a 1280x1024 19" screen with pixels the size of bricks is an improvement over a 19" UXGA CRT. Digital signalling is a good thing, but introducing the technology at a less capable point than the analogue equivalent is a hard thing to sell. For evidence, note that Matrox's dual/triple-head2go products are mostly analogue-based, because most laptops still aren't providing dual-link DVI bandwidth.

    However, DVI did two things right: 1) it made the connector capable of carrying analogue data as well, so people didn't have to throw their monitors out (almost no cards are DVI-D only), and 2) it made the dual-link mode backwards-compatible. It's also a reasonably sturdy connector, although I prefer the LVDS connector on the SGI 1600SW.

    Then came HDMI. In spite of it being possible to run the HDMI protocol (which is DVI with extra bits added) through a DVI connector with an adaptor, consumers were told a graphics card with an HDMI connector on it is an "improvement". This is "improvement" as in "doesn't support VGA, and doesn't support dual-link DVI, but is otherwise identical" (i.e. it's a significant reduction in functionality). I find the decision to run the audio over the video cable to be a little dubious, but it uses the same signal wires as DVI (the audio "islands" are sent during blanking periods) so, other than saving a few cents, there's little to have been gained by switching connector.

    HDMI tried to keep equivalency with DVI by the type B connector, but because it's not a superset of type A, and because everyone *used* type A, nobody put a type B connector on "just in case" in the way that the dual-link DVI connector got added (where there was effectively no downside).

    Giving up, apparently, on the type B connector as a generally-accepted item, HDMI 1.3 adds more bandwidth for either higher colour depth or higher resolution - both of which are specified as part of dual-link DVI (that is, there was a standard way of doing this with the type B connector anyway). The spec says that HDMI signals will (now) switch to dual-link only at some unspecified frequency above 340MHz, whereas DVI signals will continue to go dual-link at 165MHz. Note that upping the frequency while basically obsoleting the type B connector (the standard can't be bothered to say when it should be used, except for DVI data) means that huge numbers of graphics cards with dual-link DVI outputs and plenty of bandwidth for doing so can't be used to drive HDMI 1.3 devices at full resolution/bandwidth - they'll be limited to 165MHz. This mess might have been avoided if some attempt to promote the type B connector, rather than giving up on it, had happened - but the two connectors should never have been incompatible in the first place. Allowing for higher bandwidths per link would encourage higher dual-link DVI support, with backwards compatibility, from consumer graphics cards - and the DVI spec places no upper limit on performance under dual-link anyway.

    DisplayPort intends to replace both HDMI (depending on who you ask) and DVI, and also LVDS (for internal connectors). Is it a "better" standard than HDMI? Probably, from an objective viewpoint (in the way that FireWire is "better" than USB). Does it actually offer the consumer anything more? Probably not: the bandwidth is of the same order as a type B HDMI 1.3 connector, the ability to drive multiple displays is limited and not such an issue for the average user (multiple HDMI cables are possibly more flexible anyway), and the much-vaunted fibre-optic link possibilities are available for DVI and HDMI from third parties such as Gefen (at a cost). DisplayPort appears to be an attempt by Vesa to re-establish themselves, and if it succeeds, it appears that it will be only through corporate politics.

    UDI seems to be sinking without trace, and we can be thankful that yet another standard may not be foisted on us.

    I can see no reason why DVI, the oldest digital standard being discussed, is inferior to any of the alternatives - other than the minute cost of the connector. As JJ commented, the other connectors are more flimsy and likely to be damaged (although I approve of the HDMI 1.3 mini-HDMI connector, which I accept has its place in portable media devices); I have no problems with the three SCART connectors on the back of my television (they'd be easier to plug in if they were even bigger!) and I doubt the size of the HDMI connector makes much difference to the average 60" plasma screen.

    A change in connectors (now that DVI is finally becoming established), of course, forces the consumers into an upgrade cycle. You can guess who that benefits, and it's not the consumer. The display industry - especially in home electronics - has a long history of not being able to settle on one standard (I've used all of UHF, composite, S-Video, component, 5-pin DIN, SCART, VGA, LVDS, LFH-60 and DVI in my time, and could have added Firewire, CAT-5 and HDMI, not to mention the number of display standards and the marketing departments muddying the "what is 1080p" question). It's time the media stood up for the consumer for once.

    Whatever can be argued for any of these connectors, I'm in no doubt that the best thing for the consumer is that there be *one* connector, and that any upgrades should be backwards compatible (which hopefully guarantees that they really *are* upgrades). It's not acceptable to agree to disagree and then let the consumers fight it out - the inconvenience to the consumer vastly outweighs the inconvenience to the companies who couldn't agree between so many ways of sending essentially the same data down a cable. This rebounds on the companies; HD adoption would, I'm sure, have been faster if people hadn't been waiting out the HD-DVD/Blu-Ray war, the HDCP disagreements, the digital TV/EDTV/1366x768/1080i/1080p30/1080p60/overscanning mess, etc. By refusing to compromise, and refusing to treat the consumer with respect, everyone loses; the sooner this is accepted, the nearer to a utopia of easy-to-use high-quality displays we can get.

    But I'm not bitter. (Incidentally, I have a desk full of four CRTs and an LVDS 1600SW at work, and a CRT + a quad-DVI T221 at home. Neither HDMI nor DisplayPort have any appeal for me, which will cause a problem when the next graphics card I buy drops dual-link DVI-I support.)

    --

    Fluppeteer

  16. Stu

    HDMI. Thats it!

    Now that HDMI is already very well established, and the new 1.3 spec allows for greater resolutions, colour depths AND audio support, the race is already won IMHO.

    These other connectors anger me; it's like we're getting back into another VHS/Beta war over video cabling!!! I'd gladly pay the VERY small premium for HDMI just so I don't have to worry about buying gfx cards and monitors and TVs with the right connectivity. It's not difficult, I know, but it's just stupid having multiple 'standards'.

    By definition, a 'standard' should mean - ONE, 1, uno, un, ein!!!

    HDMI all the way for UNIVERSAL graphics connectivity, including that for connectivity between laptops and external monitors / TVs.

    Just think of the benefits.

  17. De Zeurkous

    RE: Optical

    Optical fibre is the only thing that's going to survive in the long term (except for subspace communications, that is :^). <SCSI zealot> Luckily, both IEEE1394 and Fibre Channel (duh) support optical links...</SCSI zealot>

    Anyway, the reason is probably that a lot of investment has gone into copper technology, and the consumer industry (always lagging behind severely and producing unbalanced equipment) won't be quick to switch to new technologies. That's why DVI was a non-solution and these are non-solutions. Most consumer-grade displays (/especially/ LCD) won't even show the difference between a well-generated RGB/HV signal transmitted through a decent cable and the worst DVI signals...

  18. Andrew Garrard

    Re: USB connectors

    I've already had a long rant, but wanted to chime in on the USB debate, specifically about the "way up" of the cables.

    Yes, they're (usually) stamped, and you can work it out by peering in the end of the cable anyway. This helps not a jot when you're reaching under the cosmetic flap on the front of your computer (thanks, Dell) and have to remember that, even if you *could* feel which way up it is from the outside of the connector (thanks, USB consortium) it has to go in upside-down (thanks, Dell) and at a 45 degree angle (thanks, Dell) in a socket that feels a lot like the air gaps next to it (thanks, Dell).

    It's possible to design connectors so that you can feel which way up they are. The same applies to sockets. It's amazing how many connectors are designed by people looking closely at them in a CAD drawing, irrespective of how they have to be used by someone fumbling blindly down the back of a desk/television. (Even with SCART, I can feel which way up the cable is, but can never work out the socket even if I run my thumb nail around the rim.) About the only connectors that I can give credit for this are the power sockets (kettle lead and UK 3-pin mains). It used to be possible to connect PS2 and (especially) AT keyboards blindly by rotating the plug until the socket accepted it, but that doesn't work for USB, which I usually end up plugging into a spare ethernet port.

    Not that any of the proposed display standards are particularly better than others for this. (I can tell which way up an SVGA socket is by running my finger around it, but often get it wrong; I can tell a DVI plug by the analogue pins, but can't feel the slots in the socket easily. 9-pin VGA, like 9-pin serial, is easier to identify, which is another backward step.)

    It's almost an argument for wireless connectivity, if I didn't have such strong feelings about reducing the available bandwidth of everyone within 100 yards.

    --

    Fluppeteer

  19. De Zeurkous

    RE: Backward steps

    While I agree with the spirit of your rant, in my opinion, you're thinking way too low-level. Every time we design another way of transmitting roughly the same signals we're wasting time. There is an excellent interface standard that's already being used for A/V: SCSI.

    ``Probably, from an objective viewpoint (in the way that FireWire is "better" than USB). Does it actually offer the consumer anything more?''

    First, FireWire is a subset of the IEEE1394 standard, which is a PHY for SCSI. Second, IEEE1394 supports a load of link types, from 100Mbit/s on copper to 3.2Gbit/s on optical fibre. Third, it's in active development and expected to replace Ethernet, Fibre Channel (usable as a SCSI PHY) and SAS (another SCSI PHY) in the future; it has already obsoleted the popular SCSI Parallel Interface.

    Also, since it's SCSI, it's possible to work without a host bus. For example: if a digital camera supports the SBC (SCSI Block Command Set) and/or RBC (SCSI Reduced Block Command Set) sub-standards and knows a bit about common file systems, it can write directly to an attached hard drive. If a SCSI Display Command Set is developed, the same displays could be used for DVD players (yuck, but a consumer favorite it seems), signal receivers, digital cameras (still or motion), Personal Computers (IBM-compatible and otherwise), and UNIX workstations. It would unify the display market.

  20. De Zeurkous

    RE: HDMI. Thats it!

    I smell a sense of tunnel vision here. Speaking as an engineer, a hacker and a SCSI zealot, I can assure you that there are far better solutions than HDMI, ones you won't have to swap in a few years for a newly hyped technology with little improvement.

  21. De Zeurkous

    RE: USB connectors...

    ``This is expensive, and means that only expensive kit can have it.''

    Are you under the impression that mass-produced hardware and firmware (developed from an open standard) with negligible licensing costs are more expensive than a lot of incompatible standard products, each of which must be designed, maintained, produced and marketed separately?

  22. Dillon Pyron

    VGA vs HD products

    Let's see. I have to hook my laptop up to a 21" widescreen LCD monitor. 1280x1024 works fine. I have to hook it up to a projector. 800x600 is the limit.

    My cable DVR has an HDMI output that isn't supported yet (but it does have an eSATA connection, and I have a 500GB eSATA drive). My DVD player has an HDMI output. But my TV only supports one HDMI input, and only provides four channels of audio out. So what are my options? The TV has three channels of YCbCr. Guess what I choose. And my A/V receiver has coax and optical digital input for audio.

  23. De Zeurkous

    RE: Re: USB connectors

    Personally, I couldn't agree more. However, remember that Dell is firmly within the consumer segment; they have no incentive to improve anything except to keep the price down and the profit up. The industry adoption of IEEE1394 (which uses connectors ``inspired'' by the link connectors on the Nintendo Game Boy series, BTW) has the potential to crack open another part in that segment.

  24. Tom

    Why do we need this?

    Video is pretty simple, actually. Three colors and two syncs (or one if you combine them). If you need to be fancy, add in a "back channel" to describe the box you connect to. It has all been done before. Sun used the DB13W3 connector that had three coax lines for the colors, and a bunch of pins for other things. They had it working almost 20 years ago. Everyone else since then has really stirred up the pot and just made a mess. The problem is that everyone wants their "own" connector even though it doesn't add much. Sure, some digital ports have come up, but are they REALLY necessary? The display IS analog unless it is a simple on/off monochrome monitor.

    If all else fails, go to 5 BNC connectors (Red, Green, Blue, H Sync, V sync), and add the RCA jacks for audio. Anything else is silly window dressing!

  26. De Zeurkous

    RE: Why do we need this?

    Yup, 13W3 pwnage 8) </Sun zealot>

    On a bit more serious note, we need digitally transmitted video for the following reasons:

    1) It does not have a theoretical quality loss at any level;

    2) Converting D/D is a hell of a lot easier than converting A/D.

    Now, if the manufacturers have at least a bit of a clue and implement it properly, the following would also apply:

    3) We would never have to care about the opinion of devices on interface standards;

    4) Transmitting to and from non-display devices (recorders and the like) becomes trivial;

    5) The cost and difficulty of producing and handling video would both drop dramatically.

  27. Rick

    HDMI is the way, and that includes PCs...

    There are now graphics cards that can combine audio from the S/PDIF pins on a compatible motherboard into their HDMI output, fed by a short internal cable to the graphics card.

    I have just upgraded my media computer to this.

    It means that the TV signal (DVB MPEG-2) does not have to be converted at any point.

    see the article on Tom's Hardware:

    http://tomshardware.co.uk/2007/05/04/we_build_4_diy_hdmi_uk/

  28. Andy Bright

    But nothing has either UDI or DisplayPort, so why does either matter?

    Computer monitors only have DVI and VGA, TVs HDMI, DVI and VGA.

    There is no display device that supports either of the other two - at least nothing on the western side of the Atlantic - and until there is neither is at all relevant.

    Are they a good idea? I'm sure they are, as long as someone actually makes a graphics card or monitor that has one of the ports.

    Until then it seems the only answer is to not watch HDCP movies on computers - unless you want to watch them in 16 million shades of black or in a resolution lower than good old fashioned DVD.

  29. Andrew Garrard

    Why we should care

    I'm not suggesting that we can't gain from a digital standard - there's a clear advantage to using a digital connection over an analogue one, I'm just pointing out that single-link DVI (or even dual link) wasn't universally better than the VGA technology available at the time, and that this harmed its market penetration. I'm not against progress, just against the introduction of standards that take backwards steps because the high end of the old standard was considered irrelevant - future-proofing has a way of redefining the "high end". As an example, UDI appears to max out at 36 bits per pixel (although I'm not sure I'm reading an authoritative source); dual-link DVI can handle 48bpp, 8 bits per channel per link. The Canon 1DMk3 has a 14-bit image sensor; if this technology gets into video cameras, UDI requires downsampling where DVI wouldn't.
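
    To make that bit-depth arithmetic concrete, using only the figures quoted above (the 36bpp UDI ceiling being my tentative reading rather than a confirmed number):

        # Bits-per-pixel arithmetic using only the figures quoted above.
        CHANNELS = 3                            # R, G, B
        dual_link_dvi_bpp = 2 * CHANNELS * 8    # 2 links x 3 channels x 8 bits = 48bpp
        udi_quoted_bpp = 36                     # tentative reading of the UDI spec
        sensor_bpp = CHANNELS * 14              # a 14-bit-per-channel source as RGB = 42bpp

        print(f"dual-link DVI: {dual_link_dvi_bpp}bpp, UDI (as quoted): {udi_quoted_bpp}bpp")
        print(f"14-bit/channel source needs {sensor_bpp}bpp: "
              f"fits dual-link DVI: {sensor_bpp <= dual_link_dvi_bpp}, "
              f"fits UDI as quoted: {sensor_bpp <= udi_quoted_bpp}")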

    The DVI digital signal is a direct equivalent of the analogue one (regardless of whether an analogue monitor would cope with the signal, in the case of reduced blanking). This both makes it easy to combine analogue and digital output (the graphics card can throw the same pixel data at a common output component and have the data transmitted in both forms, rather than needing to scan the frame buffer twice) and makes it relatively easy to convert between video connectors. While I'm a fan of the SCSI protocol (and various networking protocols that have proper error correction), even once sufficient bandwidth has been routed to the display (3.2Gbit/s is quite a way under single-link DVI) there's still a need for a frame buffer to reconstruct the image; not a big problem in a monitor (remember to triple-buffer everything) but expensive in a display format adaptor. As I've said of DisplayPort, would it be a nicer spec? Yes, from my point of view. Would it gain us enough to be worth replacing the existing video stream approach? Personally, I doubt it.
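
    For the bandwidth comparison in brackets, the sum is simple enough - taking single-link DVI as three 8-bit channels at a 165MHz pixel clock, and 3.2Gbit/s as the optical IEEE1394 rate mentioned earlier in the thread:

        # Payload comparison behind "3.2Gbit/s is quite a way under single-link DVI".
        dvi_single_link_gbps = 165e6 * 3 * 8 / 1e9   # 3 channels x 8 bits at a 165MHz pixel clock
        ieee1394_optical_gbps = 3.2                  # the optical IEEE1394 figure quoted earlier
        print(f"single-link DVI pixel payload: ~{dvi_single_link_gbps:.2f} Gbit/s")
        print(f"IEEE1394 optical (as quoted):  {ieee1394_optical_gbps:.1f} Gbit/s")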

    It's true that neither UDI nor DisplayPort have any current market penetration; the concern is that, once they do, one of six things happens:

    1) The new monitor you want is DisplayPort only, and you have to upgrade your graphics card/the new graphics card you want is DisplayPort only, and you have to upgrade your monitor, otherwise they won't talk to each other.

    2) Devices gain yet another port on the back, which adds to the cost and space but essentially gains us nothing.

    3) People use an unnecessarily expensive DVI/HDMI/DisplayPort adaptor.

    4) The ports are multi-mode, which limits them to single-link support (AFAICT).

    5) Graphics cards start being able to run all the standards down a set of pins, and we end up with an octopus dangling off the back of the computer (see some VIVO solutions).

    6) Other connectors start to go missing. As someone with lots of CRTs, this lacks appeal.

    It does not appear to be the case that DisplayPort gains us anything (except being marginally easier to plug in without seeing the socket, allegedly); in return, it possibly forces an upgrade cycle and definitely causes unnecessary incompatibility and confusion. And this is if all devices talk to each other perfectly (because that worked so well with HDMI and DVI). It seems unlikely that a new connector will make anything cheap, because there'll be years of backwards compatibility requiring *both* connectors.

    Other than a few companies, I don't see who DisplayPort helps. Having re-read a presentation on the subject, the bandwidth appears to be slightly greater than HDMI 1.3 single-link, slightly *more* greater than dual-link DVI (assuming dual-link DVI is 330MPix/s, which is not actually a limit but the expected minimum because, for dual-link, one link should be capable of at least 165MPix/s), but substantially less than HDMI 1.3 down a type B connector. It is not, for example, enough to drive a WQUXGA T221 at full refresh on its own (whereas a dual-link + single-link DVI connector *is*), and the "next smallest" common display size (the WQXGA 30" panels and QSXGA medical panels) are well-catered for by existing DVI. Even if increasing bandwidth is not a significant aim for DisplayPort, it seems that a greater step should be taken in this direction.

    All this assumes that four DisplayPort lanes are available. Although cables have to support this (kudos for avoiding the "it's a dual-link DVI cable" "so why are there pins missing?" debacle) there's no requirement that devices themselves do. I'll be interested to see how long after devices get the "DisplayPort" tick box it takes for the number of lanes to be labelled, because I bet - just as dual-link DVI took a while to appear - the first batch of products will be castrated to the common range of monitors. One lane can do 1080i; two lanes can do 1080p, and I'll be pleasantly surprised if four lanes are considered necessary just for the happy minority with decent resolution displays; that said, I'm a cynic (can you tell?)
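
    As a sanity check on those lane counts, here is the sort of back-of-the-envelope sum I have in mind, assuming the early DisplayPort per-lane rate of 2.7Gbit/s with 8b/10b coding, 24-bit colour and a nominal 20 per cent timing overhead - all assumptions for illustration rather than figures from the presentation:

        # Lane-count arithmetic under the assumptions stated above.
        import math

        LANE_PAYLOAD_GBPS = 2.7 * 8 / 10     # ~2.16 Gbit/s usable per lane after 8b/10b
        BPP = 24                             # assumed 24-bit colour
        OVERHEAD = 1.20                      # assumed timing/blanking overhead

        def lanes_needed(h, v, refresh_hz, interlaced=False):
            pixels_per_second = h * v * refresh_hz * (0.5 if interlaced else 1.0)
            gbps = pixels_per_second * BPP * OVERHEAD / 1e9
            return math.ceil(gbps / LANE_PAYLOAD_GBPS), gbps

        for name, args in [("1080i60", (1920, 1080, 60, True)),
                           ("1080p60", (1920, 1080, 60, False)),
                           ("2560x1600@60", (2560, 1600, 60, False))]:
            lanes, gbps = lanes_needed(*args)
            print(f"{name}: ~{gbps:.1f} Gbit/s -> {lanes} lane(s)")

    On those assumed figures 1080i fits in one lane, 1080p needs two, and only the higher-resolution panels push all four - which is why I expect the first products to skimp on lanes.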

    On the plus side, (new) larger monitors are, I believe, obliged to have HDCP support in the US now. I suspect that higher resolution displays will be limited to HDCP over DVI for a while (when nVidia don't break dual-link support), but the HDMI type B connector might turn up eventually, you never know. (For so long as most protected content fits in a single link, if you don't mind your 30" screen running at 1280x800 and you didn't want to view in a window then single link support might not get replaced.)

    I can't see how adding another standard to the mess improves matters. If the industry would concentrate on making an existing standard dominant, and maybe improving it in a backwards-compatible way, I'd have more sympathy.

    Turns out I had more ranting left in me. :-)

    --

    Fluppeteer

  30. Colin Bull

    Happy birthday ?

    In my domain VGA is still king, and if my memory serves me well it was launched 20 years ago next week. 20 years is not bad for a de facto standard. And it was invented in the UK.

  31. De Zeurkous

    RE: Why we should care

    While I pretty much agree with your rant, I'd like to point a few things out:

    1) As I indicated, IEEE1394 is still in active development. A boost from the video industry joining the fray would probably speed up that development substantially, easily leading to the transfer rates needed.

    2) Framebuffers are standard, inexpensive equipment; then again, why shouldn't we interpret the image chunks as draw commands on low-end displays?

    3) If we implement the SDC-over-1394 solution, we won't need format adaptors for much longer; simple SCSI PHY converters would be enough for backward compatibility.

    As for the final paragraph of your rant, I disagree. There should be one more standard that is not only _A_ standard, but _THE_ standard, as well. That's what really matters in both the mid and the long term.

    `` that said, I'm a cynic (can you tell?)''

    Duh -- I recognize a fellow in that art when presented with one :)

  32. A J Stiles

    Why they really don't like fibre

    The reason the industry doesn't like fibre optics is that there's no way to monetise it.

    Once upon a time, people invented things. But those who didn't have the patience to keep trying things in the hope of finding something that worked got jealous, and took over the system. Modern business practice is to find something that people already do for free or low cost, and then work out how to charge them money -- or more money -- for it. Usually by waving a bogus patent claim in people's faces. This works in the USA, thanks in no small part to the American system of allowing lawyers to demand payment whilst a case is still ongoing. (Another method which sometimes works well is by insisting that a popular commodity be paid for in US dollars, in order to skim a small amount off every transaction, and invading any country that threatens to start selling it by the Euro instead.)

    Now, those pesky laws of nature say you can couple a signal into a piece of fibre-optic without any expensive proprietary connectors: all you have to do is cut the end cleanly with a single chop from a very sharp kitchen knife, and hold it in place with Rizla papers and Blu-tack. This actually works surprisingly well over short distances and for equipment which is generally considered furniture and so not moved about much. Send SCSI commands serially over fibre-optics, and you've suddenly got an open standard that nobody can make money from.

    And therein lies the problem; because, without government intervention, such an open standard is never going to be popular with the big established players. They want their own proprietary standards (so you can't just use a brand X recorder with a brand Y TV), or at least a common proprietary standard that saves them from having to compete on merit by allowing them to close ranks and keep young upstarts out of the game.

    All of which is ignoring the fact that we have *already* had for years a royalty-free standard connector that supports RGB+Csync or Composite Video (with graceful degradation if only one device is RGB-capable), stereo audio (plenty good enough if you're using TV speakers; if you want multi-channel, you really should be using a dedicated amplifier with its own fibre-optic input) and data communication. RGB is what CRTs and LCDs use natively, and so provides better picture quality than either SVHS or YPrPb. It ought to be possible just to extend the SCART standard to deal with higher sync rates, using the data channel to indicate what the display supports and falling back to 15kHz in the worst case.

    SCART (with higher scan rates) and VGA should also be reasonably compatible: the only problems are the different signal levels, impedances and sync formats (Csync vs. separate Hsync and Vsync), but expect a single-chip solution to emerge as soon as there is a need for it. Yes, SCART is analogue; but since the signal from the SCART socket goes, as near as d**n it is to swearing, straight to the CRT and speakers, that oughtn't to be a problem.

    If we really do need digital signals (for, say, recording from a receiver without an integrated HDD -- as if anyone will make them that way in future -- or transferring from a fixed device to a mobile one in a way consistent with the fair dealing provisions of copyright law), fibre-optic is the logical way to go.

  33. Andrew Garrard

    I'd love to agree with you, but...

    De Zeurkous - I approve in principle of a SCSI/1394-based protocol, but the re-tooling to produce this from the existing displays is significant. Likewise, I approve of a subset of displays running with display commands (in the manner of X terminals), but it gets difficult to do this when the whole display is being updated - short of lossy video compression, the amount of work required to produce an arbitrary image on the screen (e.g. during the playing of a game) easily exceeds the memory requirements of sending the image in its raw format. For a point-to-point protocol, there's little benefit in reducing the bandwidth some of the time at the cost of complicating the protocol; nothing will be using the spare bandwidth. Obviously the situation is different if the display is streamed over a shared network.

    Something supporting higher total resolutions/colour depth and multiple displays, better connection distances, better connectors and a more flexible protocol *would* be a good thing, but the trick is to achieve some of these without making any of the others worse than we've got already - and *enough* better (or future-proof) that the consumer both has a benefit to upgrading and evidence that they won't need to do so again immediately after. If this isn't the case, the benefits of a new connector don't outweigh the costs of switching. I don't say that we should stay with HDMI (or DVI) forever, just that we should wait until the replacement is worthwhile - and I don't think that's true of DisplayPort. I'll reserve judgement on a 1394-video connector (not the existing compressed scheme) until someone comes up with a detailed proposal, but - much though I like the idea of a more elegant standard - I won't advocate it unless there's actually an end-user benefit. Matching the capabilities of DVI with a "cleaner" protocol isn't enough.

    A J - I sympathise (especially about the patent rant), but I really think a digital connection is a good idea. There are real problems with running a CRT through a switch or longish cable at much over UXGA resolution, and VGA inputs on LCDs have to do a lot of work to convert the signal back to digital (before, admittedly, making it analogue again). SCART isn't universal (at least in the US), and I have to admit that - nice though it is on a TV - it's a bit bulky for the back of a PC (I've no idea what its signal quality tolerances are). I think it's too early to throw away analogue completely, and certainly too early to throw away a signal that can be converted to analogue easily (as DVI-D and HDMI can), but keeping digital from the frame buffer to the display does seem more practical. There are reasons for going with YCrCb (better use of the bandwidth), even if it requires some conversion at both ends, but for computer displays I suspect we're mostly talking RGB for *any* standard these days.

    A fibre-optic standard would be nicer, and I'd hoped that DisplayPort might go down this route (or UDI, for that matter), but having copper *and* fibre is the worst of both worlds. We're back to fibre being useful only for the minority with the need for longer cable runs, just like dual-link DVI is useful only for the minority with high resolution/colour displays, and not making it the default will result in incompatibility, confusion, and unnecessarily high prices. Again.

    It appears that the display industry is too busy trying to work its way through the standards messes that it makes for itself to learn not to do it again. I'd like to think that the consumers could put their foot down at some point, but I suspect they'll be too confused by now to have a chance - all that's happening is that everyone's holding off buying *anything*.

    Cheery, innit?

    --

    Fluppeteer
