That's a bit harsh: Lenovo adds 2 new toughened boxen to ThinkEdge edge computing line

Lenovo has updated its edge computing hardware portfolio, adding two rugged compact machines to its ThinkEdge line-up. The ThinkEdge SE30 is a ruggedized bit of kit, intended for harsh industrial environments. Under the hood, there’s a choice of 11th Generation Intel Core i5 vPro chips, with up to 16GB memory and 1TB storage …

  1. Anonymous Coward

    > Edge computing is a paradigm that sees complex computational tasks – like data analysis or cloud gaming – performed as close to the “user” as possible. By running these tasks closer to the source or consumer, rather than in a vast air-cooled Amazon data centre, you reduce latency.

    So basically it's a hip and trendy new name for the way computing has been done for the last half century?

    1. willyslick

      No, actually not. What's new is that the data processing happens close to the source where the data is collected, and is acted upon locally by the application, rather than all data being sent back for processing to a centralized, environmentally-controlled data center which could be a long way away (which requires bandwidth and injects latency). An edge device can be located on a lamppost, in the middle of a highway, or on a factory floor; it can be off-grid, use solar or wind power, and has different security requirements due to its exposed location. As far as I know, that's not the way we have been doing it for the last half century... but it still has a CPU and is a computer, if that's what you mean.

      1. doublelayer Silver badge

        It's somewhere between those views. It's not the way we've been doing things for the past 50 years, because 50 years ago was 1971, when everything happened in the data center. But it is what we've done for much of the past two decades, and a lot of the article's text is weird.

        "Edge computing is a paradigm that sees complex computational tasks [...] performed as close to the “user” as possible."

        Nothing wrong there. The most complex tasks have long been done at the data center. Searching a big database or things like that usually happened there. What are the things they're discussing?

        "like data analysis or cloud gaming"

        This, though, is a little funny. Data analysis can be done in a DC if it's intense enough, but it's not unusual to perform the analysis on a typical computer. Cloud gaming, while done on cloud machines now, came after gaming on users' computers, because the cloud couldn't handle things fast enough to return the frames to the user before they became obsolete. That is nothing new.

        If we want something even funnier, check this:

        "The 5G network spec has been touted as a major driver behind this, particularly in the spaces of industrial and agricultural IoT, thanks to its lower latencies when compared to previous generation mobile standards."

        This is so far off the mark. 5G is faster, meaning there is less time (assuming you have 5G and it's working) to get your data from the collection point to the server. That means the latency problem of waiting for the cloud is reduced. 5G actually reduces the case for edge computing, since you don't have to wait as long. Edge computing is useful when you don't want to wait at all, so you collect the data here and process it here too.

        Since the 90s, we've had the ability to take a computer out to places where data was collected. Not only that, but through much of that time, the ability to send that data from the point of collection to a larger computer was a weak point. Mobile connections were not as widespread and much slower. So a lot of the use cases for that collection either waited a long time for slow uploads to a data center or processed the data locally. Only as networks improved did we start doing more of that on remote servers. Edge is not a new concept. It's a new name for "the computer is in the collector box".

        1. Red Ted

          Out on the edge

          It is as ever a term that has been coined and has then become mis-applied.

          My understanding is to think, for example, of a computer attached to several microphones, listening to the noises a machine makes as it operates. If it is in a remote location, you could use edge computing to do the Fourier analysis locally and then send only the main frequencies and amplitudes over the network to the data centre, which then uses its big database of all the different noises the machine makes to schedule maintenance, etc. That comes at far less network cost than if you just blindly uploaded all the data streams from all the microphones.
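          The microphone example above can be sketched in a few lines (a minimal illustration using NumPy; the sample rate, peak count, and simulated machine noise are all invented for the demo, not taken from any real device):

          ```python
          import numpy as np

          def summarize_audio(samples, sample_rate, top_n=5):
              """Reduce a raw audio frame to its strongest frequency components.

              Returns a list of (frequency_hz, amplitude) pairs -- the only
              data that needs to leave the edge device."""
              spectrum = np.fft.rfft(samples)
              freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
              amplitudes = np.abs(spectrum) / len(samples)
              # Indices of the top_n strongest bins, largest first
              top = np.argsort(amplitudes)[::-1][:top_n]
              return [(float(freqs[i]), float(amplitudes[i])) for i in top]

          # Simulate one second of machine noise: a 120 Hz hum plus a quieter 1.8 kHz whine
          rate = 8000
          t = np.arange(rate) / rate
          frame = np.sin(2 * np.pi * 120 * t) + 0.4 * np.sin(2 * np.pi * 1800 * t)

          peaks = summarize_audio(frame, rate)
          ```

          Instead of shipping 8,000 raw samples per second per microphone back over the network, the box on the factory floor sends a handful of (frequency, amplitude) pairs, which is the whole point of doing the maths at the edge.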

  2. Anonymous Coward


    If you want to collect data from IoT devices, why wouldn't you process the data in a datacenter (or the cloud, if you want to be like the cool kids)? Yes, this device is great for industrial uses, but IoT doesn't seem to be a great fit, unless this is the Internet gateway for non-Internet-connected devices. If so, that's not really IoT, that's just a remote management gateway.

    1. Dave 126

      Re: Confused

      It's not your fault you're confused: the definition of IoT varies. In industrial contexts, IoT retains more of its original meaning: that physical objects be given a unique identifier, thus making them addressable. It was said to be "[like the] Internet, [but for] Things".

      Think of boxes with RF tags on a shelf in a warehouse. Each RF tag is unique, so each box is addressable. However, they are not connected to a LAN or WAN, and might only be read as they pass along the 'Goods Out' conveyor belt.

      'IoT' has also been co-opted to mean 'cheap, poorly secured tat that is unnecessarily connected to a remote server', which is a different concept.

    2. doublelayer Silver badge

      Re: Confused

      IoT has never meant anything. It can mean things that aren't actually connected to the internet, even through a gateway, and it can mean completely normal computers connected to the internet. It's a term which means almost anything the speaker wants it to mean. So long as there's a processor somewhere that is involved, it counts. Therefore, it's been used to envelop lots of other categories that are clearer. Home automation is what consumer-style IoT normally means. Industrial automation is IoT when applied to a factory or agricultural user, at least most of the time. Sometimes it just means mechanical surveying or monitoring. Just like "edge computing" means "the computer is near where it's used", "IoT" means "there's something technical that probably doesn't need a person there all the time".

  3. richdin

    Global climate change, not.

    According to the scaremongers among us - wouldn't it behoove them to worry less about freezing and look out for higher temps?

    1. doublelayer Silver badge

      Re: Global climate change, not.

      Lots of industrial places need extreme cold temperatures. Pharmaceutical and frozen food warehouses need to be very cold. So do some factories. Also, there are large chunks of the world where it's frequently very cold, warming notwithstanding. A warehouse in most of Canada or Scandinavia is going to be cold in winter even as the planet warms; a few degrees of heating can cause major damage to the biosphere but it's not going to make winter into summer.
