so, a 'Network Computer'?
1996 called...
https://www.cnet.com/news/oracle-to-debut-500-network-computer-1/
Alibaba has teased a tiny PC replacement that will stream apps from its cloud. While Alibaba has not released many details about the device, it has made its intentions plain: during the device's launch at the company's Apsara conference, execs made constant references to the pain of living on a PC upgrade treadmill and the many …
Oh it certainly could be viable - QuakeOnline has been rocking it for years now and it's not your browser doing the graphics.
It boggles the mind to think that we are indeed headed straight back to the 90s. Apparently the future of IT is everyone on a dumb terminal connected to a CPU in a datacenter. You upgrade by paying a bit more on your monthly plan. I'm guessing they'll want to bill you for the amount of CPU activity and electricity you consume, because otherwise how can it be economically viable?
Well, I will leave the younguns the pleasure of finding out how that works. My data and my activity stay local, thank you very much.
"Maybe today's global network infrastructure is better and faster than it was in 1996?"
It is. So let's put ten thousand staff in front of those at home and see how many of them ever have internet issues. Even the tiniest bit of downtime makes this freeze; a longer outage probably makes it crash outright. At least with local computing, people can continue to write, read content they've already retrieved, and get a lot of work done.

This is especially relevant because, despite their protestations, there isn't much of a harsh computer upgrade cycle anymore. A computer from 2015 usually handles everything the average worker wants to do (even one from 2010 handles most of it), and most businesses have been holding on to devices for longer. The recent buying spree came from places that needed portable machines for working from home, and that's already happened. If someone really needs a lot of remote resources, they'll use the cloud, but most people don't need that, and most who do still want some local computing available while their expensive cloud box does the heavy lifting.
Yes, a computer from 2015 could very well handle everything the average worker wants/needs to do. I said it could because Windows 10 makes sure it will not. Thanks a lot, Microsoft.
My venerable Dell Latitude D630 (15 years old, if you need to ask) with Linux Mint 20 does Internet browsing, email and office documents quite well, though I didn't try throwing videoconferencing at it (I'm a merciful person), and it might even run Windows 7 if drivers were available. My point is that 5 to 10 year old hardware is fine, but every OEM wants you to buy new stuff every year and Microsoft is not at all inclined to refuse to help them. So in conclusion, even with this new solution, expect to be nudged into buying that little thingy again and again. And every time, expect to pay a sizeable amount of money for it. Why? Because they can, that's why.
I'm surprised you're down-voted for stating facts. I can only agree with you; unless your computing needs are large - gaming, video-editing... an old computer will do most of what anyone needs. Mine is 10 years old, also runs Mint, and, incidentally, is fine for conferencing.
Lightweight devices have been the rage for a long time.
I agree with the lock-in, to a point. I suspect cloud pricing for storage would take into account downloads and migrations to other providers.
The larger issue... should be privacy concerns.
So this would be the Acorn Network computer, yes (and running an ARM chip)?
But there is prior art!
NCD had X terminals in about 1987 (which may or may not have had the Display Manager running locally, depending on how they were configured), and it is not too far a stretch to get back to the AT&T Blit in about 1983, although that was neither cheap nor compact.
It would not surprise me to find something from Xerox PARC knocking around at about the same time as well.
Well, I was thinking about something a little more than a dumb (or even slightly intelligent) ASCII or EBCDIC terminal.
All of the examples I quoted allowed for some measure of overlapping windows with multiple sessions and some graphics capability, much as people do nowadays (although the Mac generation appear to like everything running fullscreen).
For a remote access terminal, you could go back to teletypewriters hooked up over a current loop, but those were hardcopy devices. I think CRTs were being adapted as display devices in the 1950s, typified by that which ran Spacewar! on the PDP-1 at MIT in 1961, but they were not very useful for text work.
Yes, this shift back to "big iron" from individual PCs has been tried several times. Don't forget the "dot bombs" of the early 21st century, too. And MS's cloudy version of DevStudio. None of these has done well, against the hopes of their developers (and the carnival and bandwagon appeal of their marketeers).
Long ago it was obvious that "big data" works best if you distribute the processing of it, but centralize the storage of it, no doubt with at least SOME niche exceptions but I'm talking about the general case.
There was also an attempt to do something *LIKE* this with "terminal services", and I haven't seen a whole lot of exploitation of THAT platform, either...
"But, but, but, it hasn't been tried by *US* yet!!!"
What was that definition of insanity again?
They said their remote system runs Windows or Linux. Those who choose Windows can run exactly the same malware their local Windows machines could. Those choosing Linux can also be hit with malware, and it's probably a custom version of Linux, and how many among the general public are going to pick that anyway? This isn't a locked-down OS with extra security features; it's locked-down hardware giving access to the same OS, for which you're charged every month.
Conversely, malware and AV scanning can be run at the hypervisor layer, allowing the back-end provider to detect, prevent, and remediate malware, possibly before it ever hits the virtual PC. On the other hand, if you're doing naughty things on your PC, the provider will probably be able to tell.
It can be run at the hypervisor layer if the customer doesn't mind having their disks scanned at all times, because detection needs to run on any file before the user clicks on it, including a file that has just been downloaded. It also needs to scan memory and do basically everything a local antimalware program does, but on a disk that's in use by another operating system, without messing with that OS (e.g. the malware scanner on the hypervisor and a user-level application in the VM trying to access the same file), without causing performance delays (waiting for file locks to release), without denying the user the ability to correct a false positive (since they don't run the hypervisor), and without vulnerabilities letting someone crash the hypervisor's protection system with a zip bomb or the like. Otherwise, it'll be exactly like normal Windows, and protection will hinge on things running inside the VM. Either way, there will still be malware, there will still be misconfigurations, and where in the world the real computer sits will make little difference.
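For what it's worth, the weaker offline version of hypervisor-side scanning is already doable today with stock tools, and it neatly illustrates the trade-off above: mounting the guest's disk read-only sidesteps the lock-contention problem, but it can only ever be a periodic after-the-fact scan, not real-time protection. A minimal sketch using libguestfs and ClamAV (real tools; the image path and mount point are made-up placeholders):

```shell
# Sketch only: assumes a provider-side host with libguestfs-tools and
# clamav installed, plus read access to the guest's disk image.
# guest.qcow2 and /mnt/guest are hypothetical names.

mkdir -p /mnt/guest

# Mount the guest's filesystems on the host without booting the VM.
# -i auto-detects the OS layout; --ro keeps the mount read-only so we
# never fight the running guest for write locks.
guestmount -a /var/lib/libvirt/images/guest.qcow2 -i --ro /mnt/guest

# Recursively scan the mounted tree; --infected prints only the hits.
clamscan -r --infected /mnt/guest

guestunmount /mnt/guest
```

Note this does nothing for memory-resident malware and, exactly as argued above, any false positive it flags is invisible and uncorrectable from inside the VM.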
"Sure, and personal computers never have problems either."
Here's the difference. I'll play the IT person at a small business with three sites that communicate with one another.
Option 1: Personal computer has a problem:
Phone: Ring ring.
Me: Hello.
Them: We have a machine that's not turning on. It says there's a disk failure.
Me: I see. Where is it?
Them: That site far away from where you are.
Me: Great. Well, this sounds like I'll have to come over to fix it. I can be there this afternoon.
Them: This is important. One of our employees can't work with their computer down.
Me: There's a backup in the closet. If you replace that employee's machine and they log into the backup, things should work. Hopefully they've remembered not to save things to the internal drive.
Option 2: Cloud has a problem:
Phone: Ring ring.
Me: Hello.
Them: Every computer in our office isn't working. Each time we try to turn one on, it says "Network connectivity error: NT929018. The connection to nl83.localarea.cloud.resource.alibabadumbterminals.com could not be completed. Please contact your network administrator to resolve this problem."
Me: Uh-oh. I'm the network admin, so I should be handling this. Which site are you?
Them: That site far away from where you are.
Me: I'll rush over.
Me: Hi. I'm here to test your network.
Me: Your network appears to be working.
Them: None of the computers start. None of us can do any work involving a computer.
Me: I see that. I just mean that the error they're talking about is probably outside this office.
Me: Wait a minute, I need to check something.
Phone: Brrrr brrrr.
Someone else: Hello.
Me: Hi, this is IT calling. I wanted to check if--
Someone: Are you calling about the computers? Can you fix them?
Me: I think I already know the answer, but they're saying to contact the network administrator, right?
Someone: Yes. Nobody here can work. When can you fix it?
Me: I'm at the far away site. They have the same thing. Let me call you back.
Phone: Brrrr brrrr.
IVR: Thank you for calling the network terminal support line. All our representatives are busy. Please stay on the line.
Given it's basically a dumb terminal, if a UK-based person uploads data, there should be some privacy agreement in place between the countries involved (e.g. the not-at-all-Safe Harbour) covering that data. However, if the user creates new data on a remote cloud computer (a commercial product design, for instance), do the same international data agreements apply, or the privacy laws of the host country (or countries) where the data was actually produced and stored?
But I'm sure there will be lawyers to find otherwise.
and maybe a subpoena or two for data and logs that get delivered without your knowledge or consent, like with predatory and/or divorce-related lawsuits, evidentiary "fishing expeditions", etc..
If your data is purely under YOUR control, you can always "object" to discovery. Not so if it's "on the cloud" - those guys will hand it over without a second thought, to a) stay in business and b) stay under the radar of regulatory agencies in general.
[yeah no corporations have EVER gone after regular people with high profile high dollar lawsuits in a predatory way using "evidence" provided by ISPs and content providers in some kind of apparent fishing expedition to "make examples" of people, right???]
"Reg readers may also recall Intel’s PC sticks, very small form-factor PCs designed to be mated to monitors with their HDMI plugs, but also requiring an external power source. PC sticks have found admirers among digital signage providers, but have not set the world on fire."
Either this was a clever and sarcastic reference to the Amazon Fire Stick (which is a little PC plugged into an HDMI socket with an external PSU) or....what?
The semiconductor wars playing out before our very eyes aside, and much as I would prefer, like the next man, to drive a Ferrari instead of a Morris Minor, in principle why should I need a CPU in my pocket with more computing power than NASA had when they put a man on the Moon? The data terminal thingies are what matter to me.
I am sure I do not need to remind punters here that a massive proportion of what we "do" on our phones whether it is spell-checking, map reading, workout tracking etc etc is already done "somewhere else" with some other organisation's cycles.
Privacy? Forget it. It's so last century. (Many will not like that. Neither do I but I happen to like the 21st Century).
Personally I like that Apple is making phones powerful enough that they can do things locally where no one can spy on you.
Even the language translation feature being added in iOS 14, the one Android owners say "we've had that for years" about, is processed locally if you download a language pack. On Google products it's all done in the cloud, and no one should be under any illusion they aren't using it to add to the data profile they maintain on you.
In China where you effectively have no right to privacy, you might as well carry around low power devices and do everything in the cloud because they're going to spy on you regardless.
I have laptops, PCs, and Macs scattered about the house for various purposes -- streaming music, graphics-and-multimedia authoring, wife's yoga vids, yadda, yadda. They're all "old" by most standards, and a few of them are "obsolete", but all of them are fit for purpose -- the purpose I assign to them. (Even a 32-bit Toshiba laptop, which seems to be trying to outlast Stonehenge.) So for my use case, there is no "PC upgrade treadmill".
Of course, none of these older computers runs Windows. Depending on Microsoft for anything is one way to glue your virtual feet to a very real upgrade treadmill. So I don't.
The current device is a what, a thinner-than-thin client? A brush-fire a kilometer-and-a-half from my house knocked out a fiber-optic line and broke my internet connection for several days. (Other people lost much more than that in the fire, including their lives, so I'm not complaining.) Alas, during that time I was only able to use my computers to record and edit music, work on graphic art projects, watch some movies from a thumb drive, consolidate budgeting-and-retirement spreadsheets, and do some writing.
-- Well, the point of the anecdote is obvious.
But hey, many people seem to be OK with ceding control of their devices and their software to corporations. Not so many on this forum, perhaps. Anyway, I doubt that this particular device will get much traction in the marketplace. But I have proven myself an unreliable prophet, so who knows.
It's perfectly possible to use Windows long term without being stuck on an "upgrade treadmill". At work, we have PCs that are up to about 8 years old and do a good job of running Windows 10. But because we have thousands of PCs, we also have a complete infrastructure dedicated to deploying and maintaining them. I primarily use Macs, but have a Windows PC (primarily for gaming, but also for home working). While I recently spent a lot of money rebuilding it from the ground up, the PC I built from the old parts (which are about 4 or 5 years old now) still does a sterling job of running Windows, and a good job of running most games. No real treadmill, either on the macOS or Windows side.
But I agree with your comments about thin clients in the home. Like you, my home setup stays functional (even for some home-working purposes) if my internet connection dies for a few days. Even when the home broadband failed due to a network fault a couple of days ago, I was able to do some home working. I got complaints from the other people in my house, but that would happen anyway.
If I had been using a thin client, I would have not been able to do anything. Even if the connection hadn't failed, there would be miles of cable, and potentially dozens of items of equipment between me and the computer running my software. Any bit of which could fail. Even if that doesn't happen, my cloud service provider could choose to stop my service, or be forced to (maybe by bankruptcy). My computer is a self build job, with parts from multiple manufacturers. Even if something happened that took out every manufacturer, and even my ISP, my computer would still be functional.
I guess the article passes on all the details known, but it would be really interesting to know about the processor... Could it possibly be RISC-V XuanTie?
And if the OS is some flavour of Linux, could something else - something light - be flashed onto it? Could it be used with Nextcloud?