Just in case you were expecting 10Gbps, Wi-Fi 6 hits 700Mbps in real-world download tests

The long-awaited future of super-fast wireless is here, with the Wireless Broadband Alliance (WBA) claiming speeds of 700Mbps in a real-world environment using the Wi-Fi 6 standard. In a trial that the WBA says is “the first of its kind in the world,” carried out in the “hostile” environment of the Mettis Aerospace facility in …

  1. Blockchain commentard
    Boffin

    So, all their testing is effectively just streaming. What's it like in the real world, such as opening web pages, gaming etc., which needs to continually establish connections and transfer just small amounts of data?

    1. Pascal Monett Silver badge

      Well if you're streaming at up to 700Mbps then surely browsing can't be all that bad, and I don't see any impact on gaming. These days you're connected to a server, so in effect it's just like streaming.

      1. Bronek Kozicki

        Streaming is only minimally affected by latency; games on the other hand ... 6ms might not hurt you if you are lucky, but it is a significant enough proportion of human reflexes to have some impact.

        1. Loyal Commenter Silver badge

          The human eye can detect changes at around 20 Hz. Your peripheral vision is a little more sensitive to movement, say 25 Hz if you're lucky. Most monitors have a refresh rate of 60 Hz; if you have some fancy gaming one, and have a graphics card that can drive it at that rate, you may get 120 Hz. A latency of 6ms corresponds to a delay of 1/166th of a second, i.e. 166 Hz. I can pretty much guarantee you won't be able to detect that delay with your eyes, or indeed any of your senses, no matter how much of a committed gamer you are.
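
          (That arithmetic is easy to check. A quick sketch in plain Python, using the refresh rates mentioned above:)

          ```python
          # Frame interval at common refresh rates, compared with a 6 ms latency.
          latency_ms = 6.0
          for hz in (60, 120, 166):
              frame_ms = 1000.0 / hz
              print(f"{hz:>3} Hz: {frame_ms:5.2f} ms/frame; 6 ms = {latency_ms / frame_ms:.2f} frames")
          print(f"6 ms = 1/{1000.0 / latency_ms:.0f} s")
          ```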

          1. Graham Dawson Silver badge

            20Hz is the minimum necessary frame rate for the eye to perceive continuous motion. I do wish people would stop repeating this myth that 25fps is the maximum frame rate we can perceive, when it's obvious from even a cursory examination that the eye doesn't work that way.

            1. Anonymous Coward
              Anonymous Coward

              Agreed. Many of us remember noticing the flicker in 60 Hz TVs/monitors - I'm sure it was even worse across the pond at 50 Hz. When I was able to get a monitor that ran at 100 Hz I found it so much better since I could no longer see the flicker. It isn't really an issue with post-CRT TVs: plasma TVs "flicker" at 480 Hz to 960 Hz, and LED TVs that advertise 120/240 Hz refresh rates do so by cycling their backlight at that rate; neither is visually perceptible to me.

              Of course, like most people, when I was young I could easily hear (and was annoyed by) the 15 kHz flyback transformer in TVs (I could hear if a TV was "on" even if the screen was dark), but after years of DJing in a club in college/grad school I no longer hear frequencies that high. Perhaps the ability to see/sense 60 Hz flicker goes away with age as well...

            2. Graham Dawson Silver badge

              Offering up a slight correction, because someone will (or should) pull me up on this. A purely 24fps film projection would actually have a very noticeable flicker, though the illusion of motion would still exist. One of the tricks projectors use is to open and shut the light gate twice on each frame, creating a 48Hz strobe, which cons the eye into persisting the image longer and blending each frame together. It's kinda sorta a hack on the visual processing that hides saccades (the constant re-orienting of the eye to scan across a scene), which we don't perceive because our brains are pretty fancy at visual processing. But again, this is still the barest minimum necessary to create the illusion of continuous motion without any obvious flicker.

              Active screens work a little differently, for obvious reasons, but they all aim for the same goal of hitting your eye with as many screen refreshes as possible in any given second, because the only realistic limit on how much information our eyes can take in from a screen is technological. We can theoretically perceive visual changes in terms of kHz, though that's rather ignoring the reality of how the eye works. There is no latency, framerate, or response time to measure in the eye, because the eye is not a discrete, quantised sensor, but a set of analog receptors backed up by an immensely powerful visual processing machine.

              tl;dr we don't see the world in frames. They're an abstraction generated to describe a particular technology, which also serve as a pretty good example of the restrictions on thought and understanding that a linguistic or cultural paradigm can create.

              1. A.A.Hamilton

                "...good example of the restrictions on thought and understanding that a linguistic or cultural paradigm can create."

                This statement should be engraved in the minds of every person (especially male ones) with aspirations to be some sort of leader, in business, politics or military activity, before those aspirations get out of control.

            3. Stuart Halliday

              I have to have a minimum 72Hz monitor or a 100Hz TV. I can see flicker if it's 50-60Hz.

              Went to see a Harry Potter film in the cinema, only to discover it was flickering like mad. Seemingly it was supposed to look 'ye olde' style.

          2. Mr Sceptical

            Judder much?

            Err, you might want to check out the multitude of monitor refresh rate tests on reaction times and gaming performance before you go any further. Try the recent Linus Tech Tips video on YouTube, where it was only above 144Hz that the improvement became limited.

            As for saying 20fps = motion, have you tried playing an FPS at that rate?!? It would be more fun poking yourself in the eye with a blunt stick...

            1. Michael Wojcik Silver badge

              Re: Judder much?

              To be fair, I find poking myself in the eye with a blunt stick is more fun than playing an FPS at any refresh rate.

    2. Anonymous Coward
      Anonymous Coward

      Streaming is chosen as a worst case example as it exhibits:

      - constant bandwidth demands on the shared access media (i.e. air)

      - easy to scale to high bandwidth

      - availability of tools to measure packet loss, bandwidth, jitter etc

      Comparing this to web browsing and gaming, they both tend to cope quite well with <10Mbps per device outside of large downloads, which approach the requirements for streaming. Latency tends to be a bigger issue, and decent wifi shouldn't contribute noticeably to latency (i.e. 1-3ms for wifi, <1ms on ethernet for a local LAN). The problem is, most wifi latency comparisons for gaming focus on home wifi setups that are less than ideal due to insufficient coverage (i.e. one wifi AP per house) and distance-optimised settings.
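
      (To put those latency numbers in proportion, a toy sketch; the 30ms WAN round trip here is an illustrative guess, not a measurement:)

      ```python
      # Share of a game's round trip attributable to the wifi hop, using the
      # 1-3 ms wifi figure above and an assumed 30 ms WAN round trip.
      wan_rtt_ms = 30.0
      for wifi_ms in (1.0, 3.0):
          total = wan_rtt_ms + wifi_ms
          print(f"wifi {wifi_ms:.0f} ms of {total:.0f} ms total = {wifi_ms / total:.0%}")
      ```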

      1. JDX Gold badge

        Surely worst-case is live-streaming/video-chat 4K because then you cannot buffer like you would with streaming video.

        1. Anonymous Coward
          Anonymous Coward

          Disable buffering and there's no difference between live-streaming and streaming aside from error-correction.

          For straight streaming, it allows you to easily check for packet loss - the error correction codes in live streaming make it more difficult to compare at low levels of packet loss although I guess that depends on how much telemetry you have.

        2. veti Silver badge

          I guess, but who livestreams at that rate? Just watch your porn pre-recorded, like the rest of us.

  2. kuiash

    If you want to transmit 4K video at 60fps and 10bits per channel you'll need about 15Gbps.
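
    (That figure checks out as a back-of-the-envelope calculation, assuming 3840x2160 and three colour channels, and ignoring blanking overhead:)

    ```python
    # Uncompressed 4K video: width x height x channels x bits-per-channel x fps.
    width, height = 3840, 2160
    channels, bits_per_channel, fps = 3, 10, 60
    bps = width * height * channels * bits_per_channel * fps
    print(f"{bps / 1e9:.1f} Gbps")  # ~14.9 Gbps, i.e. "about 15Gbps"
    ```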

    HDMI 2.1 scales up to 48Gbps.

    If we want to talk about sending compressed streams then that's a different problem space.

    1. defiler

      I think the real world tends to talk about compressed streams. HDMI is a different technology, and uncompressed video is very much a niche space. (And if you have bandwidth demands like that, you're going to go wired anyway.)

      1. Baldrickk

        Wireless VR is kinda desirable though.

        1. Jimmy2Cows Silver badge

          Although, lugging the enormous battery for wireless VR needing this kind of bandwidth to work for more than a few seconds is kinda less desirable.

          And while wireless AR might be desirable, I'm pretty sure you don't want to be wandering around in a fully VR environment, wireless or otherwise.

          1. Baldrickk

            Headsets can run for quite a while on a Lithium Ion/Polymer battery these days. While there isn't a wireless kit for mine, friends with an HTC VIVE Pro get about 5 hrs off a single charge on a 10,000mAh battery pack.

            The Oculus Quest is fully stand-alone, based off a mobile chipset, and works great even outdoors in a field, apparently.

            Obviously, you want to make sure the area you are doing it in is clear.

            The 60Hz connection to the Vive is limited, not by the wireless transmission protocol, but by internally masquerading as a USB 3.0 device on the PCI-e bus, and is thus limited by USB 3.0 speeds...

  3. Khaptain Silver badge

    How will it fare in corporate environments

    Questions :

    How well does it work for VoIP?

    Will it penetrate walls, floors and stairwells better than 2.4GHz or 5GHz?

    What would be the practical, not theoretical, range in order to calculate antenna placement?

    How well does it deal with roaming intra-office, especially with overlapping APs?

    Speed is only good when all other factors work.....

    1. Anonymous Coward
      Anonymous Coward

      Re: How will it fare in corporate environments

      For your questions:

      - How well does it work for VoIP?

      Assuming QoS is deployed and there are adequate APs to avoid congestion (i.e. some overlap between APs), your MOS scores should be similar to wired connections at the same location, all else being equal.

      - Will it penetrate walls, floors and stairwells better than 2.4GHz or 5GHz?

      Only as well as existing 2.4GHz/5GHz solutions - anything that absorbs most or all of the RF will require more APs to avoid not-spots.

      - What would be the practical, not theoretical, range in order to calculate antenna placement?

      This depends heavily on the office design and number of users. If you have a large open square/rectangular office with high user density, you will likely want an AP radio per 15-20 users for VoIP (a back-of-the-envelope sizing sketch follows after these answers). For less regular offices or lower user densities, assume similar, with additional APs for poor coverage areas such as stairwells if you expect users to move between floors.

      - How well does it deal with roaming intra-office, especially with overlapping APs?

      Assuming either 802.1X or shared-key networks and users remaining in a common VLAN (to avoid the client needing to obtain a new DHCP address), roaming should present no issues for modern phones and laptops. If it does cause issues, update drivers/patch the OS - older devices may cause issues.

      - Speed is only good when all other factors work.....

      Yes... I have seen many good wifi solutions ruined by poor end device choices, e.g. an environment that can't use 2.4GHz due to interference from other sources, so the wifi is designed around 5GHz clients to address this, and then IT purchases devices that only operate at 2.4GHz to save some money...
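
      (And the sizing sketch promised above, for the VoIP question. The helper is hypothetical, and no substitute for a proper site survey:)

      ```python
      import math

      # AP radio count from the 15-20 users per radio rule of thumb above.
      def ap_radios(users: int, users_per_radio: int = 15) -> int:
          return math.ceil(users / users_per_radio)

      for users in (40, 120, 300):
          print(f"{users:>3} users -> ~{ap_radios(users)} AP radios "
                "(plus extras for stairwells and other dead spots)")
      ```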


      1. Khaptain Silver badge

        Re: How will it fare in corporate environments

        Lol:

        "Yes...I have seen many good wifi solutions ruined by poor end device choices. i.e. an environment that can't use 2.4GHz due to interference from other sources, wifi is designed around 5GHz clients to address this and then IT purchases devices that only operate at 2.4GHz to save some money..."

        We fell into exactly this problem: our alarm system operates in the 2.4GHz band and was constantly sending out signals that heavily interfered with our WiFi. We only found this out after buying a batch of 2.4GHz-only smartphones. In the testing phase things were acceptable because we were only using one or two phones at a time, but as soon as we deployed 35 phones things just went haywire....

        Final result: the 35 smartphones, Samsung G3s, went back into a cardboard box and are still to be found in the IT office to this day.... We then reduced the number of phones which use WiFi, all of which now support 5GHz, bought a couple of Spectralinks (which have a crap battery life) and voila, a solution that works correctly... (dependent on where you are in the office: lots of small offices, thick reinforced concrete floors and walls, stairwells that act as Faraday cages, lifts that don't like roaming signals etc.). We have all the worst barriers that could exist.

        1. Anonymous Coward
          Anonymous Coward

          Re: How will it fare in corporate environments

          Well, lifts and stairwells are probably to be expected nigh everywhere. Lifts (elevators) are metallic by nature, so when closed they have cage-like properties. As for the stairwells, they're usually built as fire escapes, so they're made of reinforced concrete (extra thick, with steel rebar embedded to boot).

    2. veti Silver badge

      Re: How will it fare in corporate environments

      All of those questions are basically determined by the signal frequency, not the protocol. The answers will be exactly the same as they are for your current version at the same frequency.

  4. sbt
    Coat

    So for 3D on my 4K, it's not 5G, but Wi-Fi 6 I need? Sounds like Seven

    It's getting a bit arbitrary here with these codes (although less confusing than the 802.11 series, I grant you).

    Now that MS is talking about 10X, will Apple complain that they were there first with OS X, or move on quietly since it's all gone to California from the cats?

    Roman numerals in III, II, I...

    Mine's the 3XL. -->

    1. MJB7
      Pint

      Re: So for 3D on my 4K, it's not 5G, but Wi-Fi 6 I need? Sounds like Seven

      With a headline like that, you should apply for a job as a sub-editor at El Reg.

  5. DaemonProcess

    I'm still sceptical. Until they try it in a paper mill, which is notorious for absorbing wireless, or in my house, which has 2-foot-thick internal stone walls, I am reluctant to accept a test in a facility full of reflective surfaces.

    1. GlenP Silver badge

      Agreed, I have to deal with similar internal walls in part of one office building and a heavily reinforced concrete stairwell in another.

      As this is using 2.4GHz and 5GHz signals though I'd expect penetration to be comparable to existing setups (i.e. not good).

    2. Anonymous Coward
      Anonymous Coward

      If you expect one AP with an omnidirectional antenna to cover your site then yes, you will see issues.

      For homes, multiple AP's generally resolve the issues.

      For more complex, RF-hostile environments, directional antennas get you some of the way, depending on where you need it to work (i.e. walkways). And that's assuming the RF doesn't cause you issues with other equipment.

      And that you're not expecting wifi coverage in anything resembling a Faraday cage from devices outside said cage.

      1. Phil Endecott

        > For homes, multiple AP's generally resolve the issues.

        But introduce the problem of getting your devices to hand over from one AP to the other.

        As I understand it, there is still no “official” way to do this. There are some proprietary solutions.

        1. Roland6 Silver badge

          > For homes, multiple AP's generally resolve the issues.

          And if your home has wooden floors, I found mounting the APs in the loft was a simple way of getting good coverage including the patio/garden.

          >But introduce the problem of getting your devices to hand over from one AP to the other.

          >As I understand it, there is still no “official” way to do this. There are some proprietary solutions.

          Well, you only have an (AP) interop problem if you use APs from different vendors... Not had problems with pure Aruba (now HPE) or Draytek installations.

          However, agree it is irritating that as yet there is no IEEE/WiFi Alliance sanctioned interop standard - wouldn't be surprised if Aruba are sitting on some key patents.

    3. JDX Gold badge

      Well yes, the capability of WiFi in a paper mill is directly relevant to most users.

      1. Khaptain Silver badge
        Trollface

        But doesn't everyone have reams and reams of A4 quietly floundering in a 2m high stack no more than 5 metres distant from their desktop ?

    4. Anonymous Coward
      Anonymous Coward

      It's still better than home installations with many competing APs, thin walls and almost everybody on the same channel... plus all the other things using the 2.4GHz band...

    5. big_D Silver badge

      Yes, I have reinforced concrete floors; I am lucky to get 80-120Mbps out of my 802.11ac mesh setup.

      1. Roland6 Silver badge

        >Yes, I have reinforced concrete floors; I am lucky to get 80-120Mbps out of my 802.11ac mesh setup.

        Poorly designed mesh network - just split it into an upstairs and a downstairs mesh with wired backhaul to the router.

    6. Anonymous Coward
      Anonymous Coward

      > or in my house which has 2 foot thick internal stone walls

      Serves you right for living in a castle.[*]

      [*] Unless you're a prisoner in the dungeon in which case, I've heard that The Prisoner of Monte Cristo is a good read. ;-)

      1. Jan 0 Silver badge

        > [*] Unless you're a prisoner in the dungeon in which case, I've heard that The Prisoner of Monte Cristo is a good read. ;-)

        I think he should read the Count of Zenda first; the story will then make more sense

        1. David 132 Silver badge
          Coat

          I think he should read the Count of Zenda first

          Or the sequel that starred Elvis Presley, “Return to Zenda”.

          *joke (C) Russ Abbott circa 1984

  6. Anonymous Coward
    Anonymous Coward

    This just confuses the hell out of me..

    If they are still performing "trials", how are some phones such as the Galaxy S10 and iPhone 11 already being sold with this?

    1. DC1948

      Re: This just confuses the hell out of me..

      And this week I have replaced my home wi-fi with an asus AX6100 mesh with wi-fi 6. Working fine.

      1. Aristotles slow and dimwitted horse

        Re: This just confuses the hell out of me..

        Interested to understand what end devices you have connecting to it that take advantage of the AX spec?

        I have an ASUS AX88 router that I bought primarily for the enhanced VPN support rather than network speeds, but I can't see any USB wi-fi dongles or such like available anyway that I could use to take advantage of it even if I wanted to.

    2. matt 83

      Re: This just confuses the hell out of me..

      Because all the wifi 6 stuff you can buy at the moment is not fully compliant with the spec. If you're buying wifi 6 now, you're hoping that the manufacturer will eventually release a firmware that fully supports it at some point in the future.

      IIRC the most useful feature of wifi 6 is the anti-congestion stuff (OFDMA), which basically no one does right now and might not even be possible with some of the hardware currently on sale.

      Now probably isn't the time to jump on the wifi 6 bandwagon unless you need new wifi gear anyway.

  7. simpfeld

    My Mantra

    Wireless when you have to. Wired when you can.

    Has served me well over the years. My phone needs to be wireless but the TV doesn't!

    1. JDX Gold badge

      Re: My Mantra

      Wired when you need it, whatever is easiest otherwise. I do not see any issues with my TV that are attributable to wireless, for instance, and generally speaking my internet connection is the bottleneck, not my internal network at home.

      1. Anonymous Coward
        Anonymous Coward

        Re: My Mantra

        Not sure why you got downvotes. My home internet connection is 30 Mbps - wifi speed is NOT my issue. My parents' is 8 Mbps, which is the fastest available in their area. They don't really do any device-to-device data transfers, so 802.11g could well be fine for them.

      2. Hyper72

        Re: My Mantra

        I agree. 2.4GHz was awful, but since I changed to 5GHz some 10 years ago my connection has been rock solid. Over WiFi I get >50MB/s file transfers to/from my NAS (the NAS is wired to the router), which is sufficient for my use-case. As I game a lot I'm often looking at latencies, and of the ~120ms ping I get to the game server, across the border far down in the States, only 1-2ms is due to my WiFi.

    2. Anonymous Coward
      Anonymous Coward

      Re: My Mantra

      At work, for fixed infrastructure/devices, it's wired always unless really not practical. At home - whatever; usually wireless everything unless it's within a patch cable length of the router.

      I'm not going to do Cat6 cable runs to my TV when wireless works perfectly well with it.

      1. Adelio

        Re: My Mantra

        Or homeplug!

        My router is upstairs (no WiFi).

        TV etc. are all connected via homeplug, including a WiFi router for phones, laptops and tablets.

        1. RizKat

          Re: My Mantra

          I second the motion !

          I use TP-Link powerline adaptors - never have a problem with them, or in the 2 friends' houses that I set up.

          1. Charles 9

            Re: My Mantra

            I've been leery about them. They add noise to the house mains, plus there have been hacking issues with them. Then there's what happens after a blackout (both physically and logically).

          2. Glen 1

            Re: My Mantra

            Had some TP-Link powerline adaptors, would occasionally lose connection then take 30 secs to reconnect. (1-2 times a day)

            Absolute PITA when streaming video.

            Finally connected the streaming box to wifi, no problems since.

            1. Roland6 Silver badge

              Re: My Mantra

              >Had some TP-Link powerline adaptors, would occasionally lose connection then take 30 secs to reconnect. (1-2 times a day)

              Had a similar problem, although mine kept losing their security settings and dropped out of the network. Solved when I discovered one was running a different firmware version - the network has been rock solid for 2 months now.

    3. deive

      Re: My Mantra

      It is super annoying that there are still no consumer 10G wired motherboards!

      1. phuzz Silver badge

        Re: My Mantra

        Well, the add in cards are still about £100, so it's not hard to guess why manufacturers aren't including them. But on the same tack, why not just get an add-in PCIe card?

        After all, it's useless without another device that also has 10Gb ethernet, and can actually produce enough data to make use of it.

        1. Glen 1

          Re: My Mantra

          "can actually produce enough data to make use of it."

          With 4K video on phones now being a thing, and people juggling multiple VMs in homelabs, frequently moving 10s of GBs of data isn't *that* niche a thing. You are running regular backups, right? Occasionally consolidating the incremental ones?

          You don't need to get anywhere near maxing out a 10Gb connection for it to look more attractive than a 1Gb link. Especially if you spend longer than you would like staring at progress bars.

          That said, I agree with you. If it fills your niche, it's worth spending the money on. $30 per endpoint would be a hell of a lot easier to justify than $100 though... *grumble*

      2. Anonymous Coward
        Anonymous Coward

        Re: My Mantra

        It is super annoying that there are still no consumer 10G wired motherboards!

        You mean like this or this or this?

        To reemphasize @phuzz's point, few people will even have the required storage at home to come close to 5Gbps, let alone 10Gbps. The better upgrade path is to the 802.3bz standard, since it can use Cat5e cabling.

        1. Roland6 Silver badge

          Re: My Mantra

          >since it can use Cat5e cabling.

          Well, out-of-the-box Cat5e cabling is capable of handling 10Gbps, just as long as you don't exceed the total port-to-port distance constraint of 45m and 5 sockets (i.e. including patch and drop cables) and are careful about the wiring of (Cat5e) patch panels.

          This is only really of use if you have a load of Cat5e cable already installed; for new installs, the price and availability of Cat6 make it the sensible choice.

          1. Anonymous Coward
            Anonymous Coward

            Re: My Mantra

            Cat5e isn't rated for 10Gbps though, so caveat emptor if you decide to play with it. I would expect the distance will be significantly shorter than 45m.

            1. Roland6 Silver badge

              Re: My Mantra

              >Cat5e isn't rated for 10Gbps though

              The cable vendors will give a speed/distance rating for their structured (solid core) cables.

              The IEEE 802.3 Standards committees agreed on the 50m and 100m performance hurdles for media listed in the Standard several decades back. So if you ask an installer to deliver an IEEE compliant 10Gbps structured cabling system they won't be including the use of Cat5e.

              >so caveat emptor

              Basically, if your existing installation has short runs then it is worth buying some kit and trialing it before embarking on a full structured cabling upgrade; obviously, the use of Cat6 patch and drop cables could be helpful.

              >I would expect the distance will be significantly shorter than 45m.

              I was surprised how good Cat5e was and how poor Cat6 is, with Cat6 being rated for 10Gbps only at distances up to 55m; although I fully expect Cat5e to be far more susceptible to crosstalk and system noise.

              Personally, when assessing existing Cat5e installations, I use a maximum patch panel to wall port distance of 28m, as this allows for some cable slack, patch cables, drop cables etc. in the actual cable run.
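
              (Those rules of thumb reduce to a trivial triage script; the run lengths below are hypothetical survey data, and the 28m threshold is my own rule of thumb above, not anything from the 802.3 standard:)

              ```python
              # Triage existing Cat5e runs: at or under ~28 m (patch panel to
              # wall port), a 10GBASE-T trial is worth a go before recabling.
              runs_m = {"floor1-east": 22, "floor1-west": 31, "floor2-riser": 44}
              for name, length in runs_m.items():
                  verdict = ("worth trialling 10GBASE-T" if length <= 28
                             else "plan a Cat6/Cat6a upgrade")
                  print(f"{name}: {length} m -> {verdict}")
              ```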

    4. commonsense

      Re: My Mantra

      Check your TV. Mine has a 100Mbps Ethernet interface, which is slower than what the WiFi is capable of.

      1. mark l 2 Silver badge

        Re: My Mantra

        But your TV can get that full 100Mbps over that Ethernet connection, even if you have several other devices plugged into the same router. Whereas on WiFi, if you have multiple other devices connected to the same wireless network, they are each only getting a portion of the bandwidth, which could well mean each device is really only getting 20Mbps or even less.

  8. LeahroyNake

    Don't think I will need it

    Considering that my home connection is only ~40Mbit and I have 802.11n WiFi that can easily cope with the kids watching YouTube all over the house. My home NAS would be lucky if it could saturate the current WiFi connection - but it is hard-wired, as I put it in a sensible place, so it will never be an issue.

    Maybe I could use this in the office, but the only thing that can use all of a 1Gbps Ethernet connection would be the SAN, and there's no bloody way that is going over WiFi. This includes the printers, by the way; even the £50k one is limited by the RIP processing files and not the network.

    I think the question should be: who needs to stream 700Mbit for any work in a factory? Excepting the video editors etc., but if you need that much bandwidth I doubt your workstation is so portable as to preclude one extra cable or a docking station.

    1. Cederic Silver badge

      Re: Don't think I will need it

      I can stream from my home NAS at over 400Mbps on my wifi.

      More bandwidth is always welcome. Always.

  9. Henry Wertz 1 Gold badge

    What wifi6 is

    So, you may ask, what is wifi6 (802.11ax)? Well, the "10Gbps" claims come from taking multiple wide 5GHz channels plus the 2.4GHz channel and adding all those peak speeds together. Yeah...

    Really, in terms of anything new, 802.11ax raises the maximum modulation from 256QAM to 1024QAM; this only speeds things up if you're within feet of the access point. Honestly not exciting: everything above 64QAM requires a good enough signal that these modes are mostly there to have something new on paper, not to really improve speeds.
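
    (The peak gain is easy to quantify, since QAM order maps directly to bits per symbol:)

    ```python
    import math

    # Bits per symbol for each QAM order mentioned above. The 256 -> 1024
    # step is a 25% peak gain, and only at very short range.
    for qam in (64, 256, 1024):
        print(f"{qam:>4}-QAM: {int(math.log2(qam))} bits/symbol")
    print(f"peak gain: {math.log2(1024) / math.log2(256) - 1:.0%}")
    ```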

    What DOES improve things: with 802.11ax the wifi guys decided to REALLY rework how access to the channel is coordinated between the access point and the access point's clients (i.e. phones, tablets, etc.), and even to coordinate some between it and neighboring 802.11ax access points. Note, this is on 5GHz AND 2.4GHz... They're expecting to DOUBLE speeds (on the same channel, with the same number of neighbors etc.) by doing a MUCH better job of avoiding packet collisions and retransmissions, and by taking better advantage of technologies like multi-user MIMO (effectively, if signals are bouncing around so that one antenna on the access point gets a better signal from one device, and another from another device, it can send and receive to both devices at once.)

  10. fidodogbreath

    If you want to stream 4K video without any problems then you really need 25Mbps - which can be hard in real-world environments. [...] However, Wi-Fi 6 will soon offer the realistic ability for multiple devices to simultaneously watch very high definition video off the same network.

    At which point the choke point moves upstream to the ISP.
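
    (For scale, a one-liner against the article's own numbers; an upper bound that ignores contention and protocol overhead:)

    ```python
    # Upper bound on simultaneous 25 Mbps 4K streams over a 700 Mbps link.
    link_mbps, stream_mbps = 700, 25
    print(f"{link_mbps // stream_mbps} streams")  # 28
    ```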
