DisplayPort standards bods school USB standards bods with latest revision

The USB Implementers Forum, USB-IF, rolled out the spec of USB 4 version 2 just last month, as you've probably read in The Register. We noted at the time that the nitty-gritty stuff would debut in time for developer events scheduled for November. Well, it's nearly November, and a more detailed announcement [PDF] is here. (We …

  1. Anonymous Coward

    Another tip: you need a special adapter to go from VGA to HDMI as well.

    Chaining VGA <-> DVI <-> HDMI also won't work.

    1. Yet Another Anonymous coward Silver badge

      DVI does have the option for VGA: it's the four pins around the flat blade connector. It's just that almost nobody implements it

      1. Richard 12 Silver badge

        Anymore

        It used to be very common.

        DVI-* is also a great example of how to handle compatibility.

        A DVI-I to VGA adapter will not physically fit into a DVI-D socket: since the analogue pins wouldn't work, the socket doesn't have the holes for them.

        DVI-D and HDMI 1.2* are actually exactly the same; the only difference is the licence fee.

        (I think it's 1.2. Might be 1.1)
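
        To make "exactly the same" concrete, here's roughly what a passive DVI-D to HDMI adapter wires together (an illustrative sketch in Python; signal names per the two specs, pin numbers omitted):

            # Signal-level view of a passive single-link DVI-D <-> HDMI adapter.
            # Both ends carry the same TMDS signalling, which is why a dumb
            # adapter with no electronics inside works at all.
            DVI_D_TO_HDMI = {
                "TMDS Data 0+/-":  "TMDS Data 0+/-",   # 'blue' channel
                "TMDS Data 1+/-":  "TMDS Data 1+/-",   # 'green' channel
                "TMDS Data 2+/-":  "TMDS Data 2+/-",   # 'red' channel
                "TMDS Clock +/-":  "TMDS Clock +/-",
                "DDC SCL/SDA":     "DDC SCL/SDA",      # EDID / HDCP channel
                "Hot Plug Detect": "Hot Plug Detect",
                "+5V":             "+5V",
            }
            # Nothing is converted; every signal passes straight through.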

        1. Martin an gof Silver badge

          Re: Anymore

          DVI-D and HDMI 1.2* are actually exactly the same; the only difference is the licence fee.

          and the fact that you don't get audio over a DVI connection, whereas you can with HDMI.

          M.

          1. ADRM
            Boffin

            Re: Anymore

            I disagree. I have a Dell T3400 with a Gigabyte 9800 1GB connected to a Samsung 40" TV via a DVI-to-HDMI cable, and it does indeed pass audio. There is even a connector on the graphics card to connect to the sound card's digital audio out. This is a ten-year-old PC running Windows 10 for CD, DVD and Blu-ray playback, but it sends audio over HDMI to the TV for "wife mode", when I want to play a DVD but don't want to fool with the receiver. Why do we need so many remotes? Many video cards do produce audio from the DVI output, but the cable needs to be fully wired.

          2. Ideasource Bronze badge

            Re: Anymore

            Yes you can

            They are electrically the same exact signal.

            Slipping audio into the bit stream is no big deal as long as it meets protocol.

            I have video cards for my computer that have DVI outputs with audio. It makes sense for the more robust connector to be used on the computer, because passive adapters are easily replaced, being super cheap to manufacture.

            It really was just a license on a physical shape. It's the same pinout.

            I used to have a TV that had DVI and supported audio over DVI.

            By avoiding the HDMI licensing fee customers got a superior connection at a lower price.

            It very much appealed to hardware nerds. We're always plugging and unplugging our connectors. The HDMI connector gets loose way too quickly and has no screws to really secure it.

            1. 42656e4d203239 Silver badge
              WTF?

              Re: Anymore

              >>By avoiding the HDMI licensing fee customers got a superior connection at a lower price.

              Ahhhhh hahahahahahahahahahahah (and breathe)

              Granted, physically the DP connector is more robust and latches. However, guess which sort of PC here invariably objects to connecting to monitors (of various sorts, including, somewhat bizarrely, DP monitors)? The DP ones... I have no such bother with the HDMI- or VGA-equipped PCs. Yeah, possibly not down to the PC end, but still, they are Dell PCs and (IIRC) Dell were a (the?) driving force behind DP.

              1. Anonymous Coward

                Re: Anymore

                That's odd. User of multiple Dell laptops and monitors here, all connected via DisplayPort with no issues :/

      2. captain veg Silver badge

        Yes, but that doesn't help you if the output device doesn't do analogue.

        -A.

    2. Solviva

      DVI ports can carry both analogue on the bladed pins and digital on the pinny pins. You can then take VGA from a DVI port, and HDMI from DVI, but HDMI and VGA are two rather different things, hence you need an extra adapter to go between HDMI and VGA.

      1. Jamesit

        VGA is analogue and HDMI is digital.

  2. Yet Another Anonymous coward Silver badge

    HDMI to DVI works both ways

    Mostly. For computer graphics with no audio, HDMI and DVI are equivalent (except you don't have to pay the HDMI licence fee for DVI).

    For video, HDMI can do more high-bit-depth modes, and of course has sound.

  3. DS999 Silver badge

    Can you just punt on the "what kind of USB-C cable is this?" question

    By using Thunderbolt cables instead? Since Thunderbolt uses the same connector and supports the highest speeds, wouldn't they be a superset of all that USB-C nonsense? Or is there a gotcha in there? I'm guessing there's a gotcha in there, because there's no way the USB standards people want to make ANYTHING easy!

    1. doublelayer Silver badge

      Re: Can you just punt on the "what kind of USB-C cable is this?" question

      No. Thunderbolt requires support on both ends. If you use a Thunderbolt cable and one of the sides doesn't support Thunderbolt, then you will have a problem. At least some cables will have a fallback USB connection, but I think that's optional, and the fallback wires may well be slower than what your USB ports could otherwise manage. It might not work at all, and if it does, you won't be guaranteed any speed advantage for doing it.
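
      A toy model of the logic (not the real PHY negotiation, just the shape of it): a link can only come up in a mode that host, device and cable all share.

          # Toy sketch: each part advertises a set of modes it supports.
          SPEED_GBPS = {"usb2": 0.48, "usb3_gen2": 10, "usb3_2x2": 20, "tb3": 40}

          def link_mode(host, device, cable):
              """Best mode host, device AND cable all support, else None."""
              common = host & device & cable
              return max(common, key=SPEED_GBPS.__getitem__) if common else None

          # Thunderbolt cable whose fallback wires only do USB 2, between two
          # 20Gbps-capable USB ends: you get USB 2; the cable is the bottleneck.
          print(link_mode({"usb2", "usb3_2x2"}, {"usb2", "usb3_2x2"},
                          {"usb2", "tb3"}))                         # -> usb2

          # Thunderbolt-only peripheral on a host with no Thunderbolt: no link.
          print(link_mode({"usb2", "usb3_gen2"}, {"tb3"}, {"usb2", "tb3"}))  # -> None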

      1. What? Me worry?

        Re: Can you just punt on the "what kind of USB-C cable is this?" question

        Interesting here (at least for my needs) is that Apple's Thunderbolt cable is compatible with their USB-C chargers (https://support.apple.com/en-us/HT208368). Their USB-C cable, though, won't provide data above USB 2.0 speeds (or carry video).

        Now I just need to figure out how to get my iPod (FW800) to USB-C... It worked in the past with a FW800>FW400 Thunderbolt adapter. Can I go one adapter further, or will that be a bridge too far?

  4. Old Used Programmer

    About HDMI to DP....

    I'll agree with the point that HDMI-to-DP adapters aren't cheap. I have a couple of them, and they were on the order of $30 each. They're also hard to find, mostly because when you try to look them up you get vast numbers of DP-to-HDMI adapters and very few HDMI-to-DP. What I'd love to see would be decent ones that work both ways. Likewise for conversion between VGA and HDMI.

    And, yes, my HDMI-to-DP adapters are for Pis connected to original PackedPixel monitors that only have DP. However, since PackedPixel went over to HDMI (at least as an option) and then proceeded to go out of business, they're not a long-term plan. Fortunately, there are now more and more companies putting portable monitors on the market that use HDMI input: basically laptop displays packaged with a power supply and a cover, for a lot less money.

    1. Martin an gof Silver badge

      Re: About HDMI to DP....

      What I'd love to see would be decent ones

      There is plenty of decent video conversion kit out there, but you do end up paying for it, for example Kramer AV and Blackmagic.

      Those tend to concentrate on broadcast formats, so Displayport and DVI are thin on the ground. Canford have been around since I was in primary school and have a range of video converters from various manufacturers, for example Muxlab. I dare say there's other stuff out there.

      I use a lot of Pis, but with large monitors and projectors which feature multiple inputs, so connection is no problem. The biggest issue I had recently was the need to use a Pi to play video on an honest-to-goodness old black-and-white portable TV. As the Pi has a composite video output (you try finding one of those on a graphics card these days), all I had to do was plug it into an RF modulator. It even gave the visitors the genuine experience of knocking the large tuning dial (no buttons on this TV) and being met with snow and white noise. Surprising how many adult staff couldn't work out how to retune the set!

      One of the reasons some converters might be more expensive is if they include the ability to scale the input to suit the output. For example, it's no good expecting to convert 4K HDMI to VGA without downsizing it a bit. Converting to DisplayPort probably requires some quite hefty processing too: while most formats (e.g. HDMI) send a continuous stream of pixel data, DisplayPort is a packetised system, so that video data can be sent along the same wires as network or audio or, well, whatever I suppose.
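
      The back-of-an-envelope numbers bear that out (rough arithmetic only: uncompressed RGB, ignoring blanking and encoding overhead):

          def raw_video_gbps(width, height, hz, bits_per_pixel=24):
              """Uncompressed pixel data rate, ignoring blanking/encoding."""
              return width * height * hz * bits_per_pixel / 1e9

          print(raw_video_gbps(3840, 2160, 60))   # ~11.9 Gbit/s for 4K60
          print(raw_video_gbps(1920, 1080, 60))   # ~3.0 Gbit/s for 1080p60
          # A VGA DAC topping out somewhere around a 400MHz pixel clock has
          # no hope of drawing 4K60; the converter must scale the image down.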

      M.

  5. heyrick Silver badge

    Are they trying to kill off Thunderbolt?

    One would generally expect older slower kit to work on newer faster incarnations of the same interface.

    Like USB, for instance. Once you've navigated the maze of different interface types and managed to electrically connect the two, a crappy USB 1.1 MIDI adapter (the sort that shits itself if you play two chords at once) will work on a USB 3 port. I also have an old computer with 10BASE-T plugged into the ADSL box along with 100BASE-TX equipment, and it just works (though it's obviously notably slow to that one device).

    1. Solviva

      Re: Are they trying to kill off Thunderbolt?

      Whilst Ethernet negotiates itself from low to very high, USB 1/2 and USB 3+ are actually separate buses. You could have a port that supports only USB 3+ and not USB 1/2.
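
      They're physically separate wires in the connector, too. A rough grouping of the USB-C pins (a sketch, not the full pinout):

          # USB 2 and USB 3+ ride on different wires within one USB-C plug.
          USB_C_PIN_GROUPS = {
              "usb2_bus": ["D+", "D-"],                   # the old USB 1/2 pair
              "usb3_bus": ["TX1+/-", "RX1+/-",
                           "TX2+/-", "RX2+/-"],           # SuperSpeed pairs
              "power":    ["VBUS", "GND"],
              "config":   ["CC1", "CC2"],                 # orientation/role
              "sideband": ["SBU1", "SBU2"],               # e.g. DP aux in alt mode
          }
          # A port can wire up one bus and not the other, which is how you
          # end up with a USB-C port that only talks USB 3+, or only USB 2.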

      Things get even muddier when using adapters. I have a Samsung S8 that seems to have a not uncommon problem whereby it detects liquid if charging with a C-A cable and refuses to charge. Works perfectly with a C-C cable.

      Techmoan recently posted a video where he found that some MiniDisc recorders (USB-A devices) refused to work with some software when plugged into an A port, but worked fine over an A-to-C cable plugged into a USB-C port (still using the USB-A bus on the host side).

  6. gotes

    Clear as mud

    I guess I'll just carry on using my existing gear then. That said, I have one of those Plugable 4K laptop docks, and the ethernet adaptor is pretty useless with three monitors hogging the bandwidth.

  7. Terry 6 Silver badge

    I may be wrong

    I know nothing about this stuff beyond articles like this, which I don't understand. BUT the whole USB thing sounds like a total muddle of incomprehensible and unrecognisable inconsistency.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: I may be wrong

      [Author here]

      Exactly, yes. :-(

      USB 3 and above are complicated. Then there is, in addition, the bonus complexity of carrying Thunderbolt and DisplayPort over the same connection with the same sockets... but not necessarily the same cables. Although they might be.

      Then if that was not bad enough, add in that there are multiple different versions of Thunderbolt... and some are not backwards compatible with others, even though they are carried over the same connector...

      And then, if that were not bad enough, DisplayPort *and* HDMI can be carried over the same connector *as well*. And both of *them* have different versions too.

      So, as I said: it's a mess.

      And while VESA is trying to impose sense on DisplayPort, the USB group... aren't.

      1. John Brown (no body) Silver badge

        Re: I may be wrong

        It's almost as bad as SCSI :-)

    2. Anonymous Coward

      Re: I may be wrong

      And USB was meant to make things easier. No messing about with serial or parallel ports and cables for your printer. No plugging in proprietary interface cards with jumpers or switches to set IRQ, I/O or memory addresses for your scanner.

      Just plug in the USB cable and it's all plug and play they said....

  8. captain veg Silver badge

    USB 4 version 2

    Aargh!

    This is like OS/2 version 3, but worse. Which version is it, exactly?

    USB 4 version 2.

    Is that version 4.2? Or what?

    -A.

  9. Sandy Scott

    Not quite true

    "You can connect any HDMI display to a Displayport connector..." - Not exactly- it needs to be a dual mode DisplayPort https://en.m.wikipedia.org/wiki/DisplayPort#DisplayPort_dual-mode_(DP++).

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Not quite true

      [Author here]

      OK, thank you for the clarification.

  10. Solviva

    Back in the day, God (the USB Forum) created USB, which had two flavours: Low Speed and Full Speed. Both were USB 1, and woe betide you if you got a USB 1 peripheral/hub that wasn't the flavour you expected. Then came USB 2, using the same cables/ports but with a new colloquial name, High Speed. Then came USB 3 with a same-looking USB-A connector (more pins, typically blue), along with USB-C ports.

    Then came a plethora of things using the USB-C port but offering a multitude of different capabilities: USB 3, USB4, Thunderbolt... Not necessarily a bad thing, but if you have a port supporting one of these, it's not instantly clear what subset of features your port has. Some laptops only allow charging on one of the USB-C ports, just to confuse things more. If your device has multiple USB-C ports, why should they not all be identical?

    To me it seems a mess. I mean, it's nice that you can use the same USB-C port for charging the device as well as charging devices from it, and for Thunderbolt (if the port supports it), but the naming conventions just seem wrong. Not that I have any better ideas though :)
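
    For the record, the zoo as a crude lookup table (nominal top speeds per the published specs):

        USB_TOP_SPEED_GBPS = {
            "USB 1.x Low Speed":        0.0015,
            "USB 1.x Full Speed":       0.012,
            "USB 2.0 High Speed":       0.48,
            "USB 3.2 Gen 1 (was 3.0)":  5,
            "USB 3.2 Gen 2 (was 3.1)":  10,
            "USB 3.2 Gen 2x2":          20,
            "USB4":                     40,
            "USB4 version 2":           80,
            "Thunderbolt 3 / 4":        40,
        }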

  11. Anonymous Coward

    Connecting DP

    > You can connect any HDMI display to a Displayport connector,

    Hey, I'd be happy if you could connect the DisplayPort connector on your PC to the DisplayPort connector on your monitor and have a better than one-in-four chance they'd agree to talk to each other. If you are lucky enough to find a working pair, for heaven's sake don't load any updates to the video drivers, or anything could happen.

    As for DP -> HDMI adapters, well, they might work if you're lucky; then again, maybe not, or maybe not at the right resolution. They might be stable, or they might flicker in and out. They might connect OK, run for a minute, and then either stop, or cut out for several seconds and then become stable again.

    Hell, it would be nice if there were some way to know which DP PCs and monitors would talk to each other before you actually take delivery.

    1. Richard 12 Silver badge

      Re: Connecting DP

      The problem with DP is the labelling.

      A DP++ socket means it does DP and HDMI.

      But that also means a DP++ to HDMI adapter is simply a passthrough, and is not as good as a DP to HDMI adapter. Despite the ++.

      On top of that, many GPUs have multiple DP++ outputs but can only do HDMI on a subset of them at a time. So it works just fine until you plug in one more adapter than the card supports; then nobody knows what happens, because it's not supported and usually not tested either.

      And unplugging any of them makes the others work again. Highly confusing.
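
      A toy model of that failure mode (purely illustrative; real GPUs differ in the details): think of the card as having more DP++ outputs than HDMI-capable clock sources.

          def connect(monitors, tmds_sources=2):
              """Toy: native DP sinks always drive; each passive DP++->HDMI
              adapter consumes one of a small pool of TMDS sources."""
              in_use = 0
              for kind in monitors:
                  if kind == "dp":
                      print("DP monitor: OK")
                  elif in_use < tmds_sources:
                      in_use += 1
                      print("HDMI via DP++ adapter: OK")
                  else:
                      print("HDMI via DP++ adapter: no TMDS source left, blank")

          connect(["hdmi", "dp", "hdmi", "hdmi"])  # the third adapter loses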

      1. DialTone

        Re: Connecting DP

        > The problem with DP is the labelling.

        This is also a problem with HDMI displays, I believe. Just because a device is advertised as HDMI 2.1, for example, doesn't necessarily mean it supports Variable Refresh Rate (VRR), High Frame Rate (HFR) or any of the other features specifically added in the HDMI 2.1 spec; it can be essentially just HDMI 2.0 in all but name, and per the spec, that name must be advertised as HDMI 2.1.

        Finding out which of the specific sub-features are supported can be challenging at best.
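
        About the only way to check is to pull the display's EDID apart and look for the HDMI Forum vendor block, where the HDMI 2.x extras are declared. A rough sketch (assumes a raw EDID blob, e.g. read from /sys/class/drm/*/edid on Linux; byte layout per CTA-861, and a real parser needs more care than this):

            HDMI_FORUM_OUI = bytes([0xD8, 0x5D, 0xC4])  # stored LSB-first in EDID

            def has_hdmi_forum_vsdb(edid: bytes) -> bool:
                """Scan CTA-861 extension blocks for the HDMI Forum VSDB."""
                for base in range(128, len(edid), 128):      # extension blocks
                    block = edid[base:base + 128]
                    if len(block) < 4 or block[0] != 0x02:   # CTA-861 tag
                        continue
                    dtd_start, i = block[2], 4               # data blocks live here
                    while i < dtd_start:
                        tag, length = block[i] >> 5, block[i] & 0x1F
                        if tag == 3 and block[i + 1:i + 4] == HDMI_FORUM_OUI:
                            return True   # VRR/ALLM etc. flags follow the OUI
                        i += 1 + length
                return False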

  12. Kev99 Silver badge

    We had about a dozen Dell PCs where I worked. Before we upgraded from the 2014 models (with a whopping 4GB RAM and 125GB HDD), HDMI ports were nowhere to be found; only DisplayPorts. Which was okay until monitors needed to be replaced, and the ones we found only had HDMI ports. I wonder if the industry will ever decide to stick with a SINGLE video port. You know, kinda like USB (original), CD/DVD, Blu-ray, RAM, etc.

    1. Yet Another Anonymous coward Silver badge

      Displayport is cheaper for the PC maker and is generally better for computer graphics stuff.

      Displayport-in to HDMI-out adapters are cheap and reasonably good, unless you want 4K@60Hz, or you need long HDMI cable runs, or it's a full moon.

      1. Diogenes

        Pencil case of adapters

        I am a supply teacher* and never know what ports will be on the laptop** I'm issued for the day, nor the connectors on any particular projector in any classroom, so I carry a pencil case full of adapters, sometimes including different brands of the same one. I have an HDMI-to-VGA adapter that will work on all but a handful of devices, but the adapter which works on those will only work for about 50% of the others. And don't get me started on USB-C to whatever.

        * day to day casual.

        ** As a supply teacher I get the oldest in the school; permies get the best (no complaints). And no, I will not take my own in, as I do not want to put MS Intune on it.

    2. Liam Proven (Written by Reg staff) Silver badge

      [Author here]

      As far as I can tell, a set of dead basic DP→HDMI cables, at around £6 each, would have sorted this out. That was the point of my bootnote. :-)

    3. eldakka
      Boffin

      > I wonder if the industry will ever decide to stick with a SINGLE video port. You know, kinda like USB (original),

      which was competing with FireWire for a while: you could have peripherals that used USB, and some that used FireWire. Competing, incompatible standards. USB eventually won.

      > CD/DVD,

      which competed with DAT, LaserDisc, MiniDisc, until CD and DVD won out.

      > Blu-ray,

      had HD DVD as a competitor for a while, until Blu-ray won, mainly due to being a Sony standard that shipped in the PS3, which put a Blu-ray player in many households (plus a game console) for less than many standalone Blu-ray players cost at the time.

      > RAM, etc.

      RDRAM? In the SDRAM days, RDRAM and SDRAM competed. Then came DDR SDRAM, which was startlingly similar to RDRAM; so much so that Rambus, who invented RDRAM, sued many companies over DDR RAM (all 'DDR' RAM is SDRAM; the 'SD' part is just left out). In fact, DDR RAM was so similar to RDRAM that many companies settled with Rambus and agreed to pay it royalties on their DDR RAM products. Plus there are still other RAM types, LR-DIMMs for example, although those are limited to the server space. Oh, and Optane memory too (RIP).

      There has never really been a standard or common anything in computing that didn't have to fight to become 'the' standard against other things.

    4. SloppyJesse
      Joke

      > I wonder if the industry will ever decide to stick with a SINGLE video port. You know, kinda like USB (original), CD/DVD, Blu-ray, RAM, etc.

      You're right, there are too many different options. What we need is a standard...

      https://xkcd.com/927/

  13. AdamWill

    big problem: trademarks

    Someone on an Ars Technica thread about this pointed out a factor that explains a lot of it: https://arstechnica.com/gadgets/2022/10/thunderbolts-next-spec-triples-bandwidth-to-120gbps-with-a-catch/?comments=1&post=41317569#comment-41317569

    The term "USB" is not trademarked. That means nobody actually has any leverage to "enforce" USB standards in the way Thunderbolt(TM), Lightning(TM) and DisplayPort(TM) standards can be enforced. If you want to say your product is a Thunderbolt(TM), Lightning(TM) or DisplayPort(TM) device, you'd better meet the requirements or else the trademark holder can sue your product off the market. This is not the case for "USB" products. This is a large reason why all these apparently-odd names and decisions about how everything "USB 3.0" is now "USB 3.2" or whatever keep getting made: the policy has to be very permissive, because there's really no stick to use.

    This also explains why "USB4" is a thing, and why it's "USB4" not "USB 4": it's "USB4" because it's "USB4(TM)". The USB-IF realized that not trademarking USB was a big mistake, and they *have* trademarked USB4: https://usb.org/sites/default/files/usb4_language_product_and_packaging_guidelines_final__0.pdf . This also explains the apparently ridiculous "USB4 version 2": if they called it USB4.1 or USB5, they would have to keep taking out more trademarks.

    As a bonus, this also kinda explains some of the old "SuperSpeed" shenanigans: they took out trademarks on the SuperSpeed logos to try to work around not having a trademark on "USB". That didn't work very well, though, because nobody really cared about the "SuperSpeed" branding.

    Anyway, this should, hopefully, give them more leverage to have clearer and stricter policies in the vein of Thunderbolt from now on.
