Work in the cloud...
... and not just any cloud, but a Chinese cloud, transmitted over a home-grown protocol that probably has a back door required by the Chinese government.
I'm sure there will be some takers for this, but not me.
Chinese telecom equipment maker ZTE has announced what it claims is the first "cloud laptop" – an Android-powered device that consumes just five watts and links to its cloud desktop-as-a-service. Announced this week at the partially state-owned company's 2022 Cloud Network Ecosystem Summit, the machine – model W600D – …
-> it's hard to imagine
It's not hard to imagine. A friend of yours could involve himself, whether accidentally or on purpose, with somebody or some group that some government somewhere thinks is bad. Earlier, your friend had an innocent conversation with you, asking you to add some sweets or biscuits to your shopping list and drop them off at his place.
It turns out later that "sweets" or "biscuits" is a code word used by the "bad" group for bullets, drugs, or some other contraband. Bam! You are now in the loop. You are now a suspect. Judges in their infinite bias in favour of the police, will issue a warrant to search your premises, to look for these "biscuits". You will have a lot of explaining to do. You will be treated as a suspect, and despite the myth about being innocent unless and until proven guilty, you will have to talk your way out of it. Because it turns out the bad guys have a lot of form for using the word "biscuits".
I challenge ZTE to give this machine to an independent reviewer to test whether they can actually use a remote desktop session with 300 ms of latency. I can sort of believe the bandwidth claim, as long as the user isn't watching video. I can eventually be brought to accept the packet loss claim. I'm quite doubtful that they can deal with that level of latency without making the experience very painful. Maybe the tester they had using this was slow at typing, but the mouse needs to update quickly too. If they claim to support that bad a network, they should prove it.
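For context, here is a back-of-envelope sketch of why 300 ms matters; all the non-network numbers below are my own illustrative assumptions, not ZTE's figures:

```python
# Rough estimate of the input-to-display delay a user would perceive
# on a remote desktop session. Only the 300 ms RTT comes from ZTE's
# claim; the other figures are assumed for illustration.

network_rtt_ms = 300        # claimed tolerable round-trip latency
server_render_ms = 10       # assumed server-side compositing time
encode_decode_ms = 20       # assumed video encode + client decode time
display_refresh_ms = 8      # average wait for a 60 Hz refresh (~16.7 / 2)

echo_delay_ms = (network_rtt_ms + server_render_ms
                 + encode_decode_ms + display_refresh_ms)
print(f"Perceived echo per keystroke/mouse move: ~{echo_delay_ms} ms")
```

Interaction generally needs to stay under roughly 100 ms to feel instant, so a cursor trailing the mouse by a third of a second every single update would be hard to miss in any honest review.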
This may be the case, but with so much delivered from the Internet and moving into browsers now, with all the latency and refreshes that involves, ultimately this is where we will end up.
Desktop as a service. Microsoft is already there with the Azure Desktop, it just needs the thin clients.
However much hardcore techies dislike this and see it as a big step backwards, if it is pay-by-month on a subscription and includes an endpoint, then consumers will buy.
99% of consumers simply don't care.
Those who forget about the past are forced to repeat it.
I think I will pass on that one. I did work on an XTerm, twenty-odd years ago. Yes, it was usable. Mostly.
I have to admit that on a recent-ish project we used a Windows terminal server for the development environment. This gave everybody a consistent base. No, it was not perfect. Yes, it mostly did work. Yes, when the connection dropped we had to do other stuff, but that was a) not too often and b) stuff that needed to get done anyway. c) if there is a deadline looming, it causes too much stress...
Yeah. And X11 was used properly in the days of X terminals: mostly protocol messages flowing between the client and server, letting the ddx layer do the rendering. Very little of this Qt-style "I'll do my own rendering into a bitmap and shove it over to the server" crap.
So you didn't need nearly as much network bandwidth. And since the primitives could compress a lot of information and rendering was slow compared to today's hardware, latency was less noticeable, too. If an xterm sent 800 characters in a single XDrawString to a server, it would have a little while before the next message needed to get there.
Had IBM not been forced by the DoJ to divest itself of Service Bureau Corporation, and had the FCC in Computer Inquiry I & II not barred the Bell System from providing integrated data processing services, computing would always have been done by dumbish terminals connected to time-sharing servers, and the detour through on-premises computing would have been avoided.
We now have a 21st century screen that can do nothing on its own
but this one requires an eight core processor (although speed not specified). The last dumb terminal I worked on was entirely built from TTL chips (no microprocessor) and it was highly responsive.
I thought the same, and you could probably get a suitable OS running on that processor if they've provided sufficient internal storage (which I'm sure is soldered in). However, my guess is that this won't cost much less than a normal cheap laptop, and with one of those you don't have to fight to put your own choice of OS on. Unless they heavily subsidize it, the materials will cost almost as much as any low-end Windows or Chrome OS machine, and if you choose the Windows one, booting Linux is usually only ten minutes of effort.
It's a chromebook, isn't it ?
If not, how is it better (or worse) ?
I'm not saying that it's bad that it's a chromebook; it's more interesting that someone other than Google feels able to offer that partway point between local and remote processing. The 8-core processor, for example, implies that it runs a local browser rather than screen-sharing a remote one.
From the sound of it, this will do even less than a Chromebook would without a connection. That's saying a lot. Maybe they just didn't explain the features, but it sounds like it will only have terminal uses and could therefore get away with only having the local OS handle getting online, handing off to the remote machine after that.
Computers empower. Terminals disempower.
This is one step beyond the Google half-a-laptop-for-the-price-of-a-whole-one Chromebook for 'taking back control' from the user.
They may be inevitable/essential for homeworking as companies can bake in higher levels of security (and lower levels of functionality). Essentially, they offer the chance to create a virtual cubicle in an employee's home. You will still have to supply your own spider plant, coffee mug and a photo of your favourite 'TWICE' member to get you through the day.